[New Sample] Add cv samples with argmax operators #278
Open
Changes from all commits (54 commits, all authored by Xreki):
05de565 Optimize the check codes of test_compiler.
2db7cb4 Add cv models with more than 6 subgraphs.
f860a94 Add cv models with more than 10 subgraphs.
b9f8d59 Merge branch 'develop' into add_cv_samples_5
0213131 Update graph_hash.
ea531b9 Remove redundant subgraphs.
031ecb4 Fix a typo and add some printing hints.
7a91e19 Implement a function to collect the model's execution stats.
bcf9d5a Add support of get_attr and simplify some codes.
1415926 Fix support of call_method.
dbcadfa Support _native_multi_head_attention.
16a5a6e Merge branch 'develop' into add_cv_samples_5_need_fix
b2073f9 Merge branch 'develop' into collect_info
8161df7 Fix several ops and change to use subprocess for multiple tests.
07558f2 Support another method with make_fx.
256c75f Optimize the dtypes stats.
a3fb5ae Enable to print error messages.
d10dcc3 Fix several problems.
f159a3d Support to rerun the failed cases only.
ee5fd22 Implement method using torch.compile with customized backend.
777f8dd Update the log format.
9f3086a Merge branch 'develop' into collect_info
9cf8a86 Add source and heuristic_tag.
b1eb293 Add timestamp in log.
9623691 Merge branch 'develop' into collect_info
a738b6b Merge branch 'develop' into collect_info
7575957 Remove the make_fx implementation.
e7fd651 Merge branch 'develop' into opt_test_compiler_check
6d9b156 Implement collecting stats for paddle.
39b1806 Refine some codes and fix support for VectorType.
6589e7a Clear codes.
8863e6a Add cv samples for models with more than 20 subgraphs.
7f3cf4c Remove redundant subgraphs.
26b99b6 Add cv samples for models with more than 30 subgraphs.
1eada61 Remove redundant subgraphs.
f0f1426 Merge branch 'add_cv_samples_6' into add_cv_samples_5_need_fix
9468244 Fix test_compiler of paddle and remove collecl_stats.
593d460 Merge branch 'develop' into add_cv_samples_5_need_fix
0ce778f Add paddle implementation.
1a3dc30 Reorganize some codes.
449f38c Merge branch 'develop' into collect_info
5db7117 Reorgnanize codes.
110c7a9 Merge branch 'collect_info' into add_cv_samples_5_need_fix
6a5b8db Support to collect input shapes and use json to dump list and dict.
da0ff51 Merge branch 'collect_info' into add_cv_samples_5_need_fix
71f3316 Merge branch 'develop' into opt_test_compiler_check
00b554c Rename test_compiler_util.py.
4bfb443 Reorganize codes and logs.
29ab5ae Merge branch 'develop' into opt_test_compiler_check
b0ba9c6 Reorganize codes.
371562b Fix seed of dropout.
07e4340 Apply the new tolerance.
8f46c3e Merge branch 'opt_test_compiler_check' into add_cv_samples_5_need_fix
294e1dd Merge branch 'develop' into add_cv_samples_5_need_fix
graph_net/collect_stats_util.py (new file: 100 additions, 0 deletions)
```python
import ast
import json
import importlib.util
import inspect
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class OpStat:
    op_name: str
    op_dtypes: dict[str, int] = field(default_factory=dict)
    count: int = 0

    def update(self, other):
        # Merge another OpStat for the same op: accumulate the total count and
        # the per-dtype counters.
        if isinstance(other, OpStat) and self.op_name == other.op_name:
            self.count += other.count
            for name, count in other.op_dtypes.items():
                self.op_dtypes[name] = self.op_dtypes.get(name, 0) + count


@dataclass
class ModelStats:
    model_path: str
    num_inputs: int = None
    num_params: int = None
    num_outputs: int = None
    num_ops: int = None
    model_size_in_billion: float = None
    input_dtypes: Dict[str, int] = field(default_factory=dict)
    param_dtypes: Dict[str, int] = field(default_factory=dict)
    input_shapes: Dict[str, list] = field(default_factory=dict)
    op_dtypes: Dict[str, int] = field(default_factory=dict)
    ops: Dict[str, int] = field(default_factory=dict)
    source: str = None
    heuristic_tag: str = None


def print_model_stats(stats, log_prompt):
    assert isinstance(stats, ModelStats), f"{type(stats)=}"

    def print_with_log_prompt(key, value):
        print(
            f"{log_prompt} [ModelStats.{key}] model_path:{stats.model_path} {value}",
            flush=True,
        )

    print_with_log_prompt("num_inputs", stats.num_inputs)
    print_with_log_prompt("num_params", stats.num_params)
    print_with_log_prompt("num_outputs", stats.num_outputs)
    print_with_log_prompt("num_ops", stats.num_ops)
    print_with_log_prompt("model_size", f"{stats.model_size_in_billion}B")
    print_with_log_prompt("input_dtypes", json.dumps(stats.input_dtypes))
    print_with_log_prompt("param_dtypes", json.dumps(stats.param_dtypes))
    print_with_log_prompt("input_shapes", json.dumps(stats.input_shapes))
    print_with_log_prompt("op_dtypes", json.dumps(stats.op_dtypes))
    print_with_log_prompt("ops", json.dumps(stats.ops))
    print_with_log_prompt("source", stats.source)
    print_with_log_prompt("heuristic_tag", stats.heuristic_tag)


def load_class_from_file(file_path, class_name):
    # Import file_path as an anonymous module and return the requested class,
    # or None if the class is not defined in that file.
    spec = importlib.util.spec_from_file_location("unnamed", file_path)
    unnamed = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(unnamed)
    model_class = getattr(unnamed, class_name, None)
    return model_class


def get_argument_name_and_types(model_class, func_name):
    # Map every argument name of model_class.<func_name> (except self) to its
    # annotation, or None when the argument is unannotated.
    argument_name2types = {}
    for name, func in inspect.getmembers(model_class, predicate=inspect.isfunction):
        if name == func_name:
            for arg_name, arg in inspect.signature(func).parameters.items():
                if arg_name != "self":
                    argument_name2types[arg_name] = (
                        None
                        if arg.annotation is inspect.Parameter.empty
                        else arg.annotation
                    )
    return argument_name2types


def get_number_of_returns(file_path, class_name, func_name):
    # Parse the file and report how many values the first return statement of
    # class_name.<func_name> yields (0 for a bare return or no return at all).
    with open(file_path, "r") as f:
        source = f.read()

    tree = ast.parse(source)
    for node in tree.body:
        if isinstance(node, ast.ClassDef) and node.name == class_name:
            for f in node.body:
                if isinstance(f, ast.FunctionDef) and f.name == func_name:
                    for stmt in ast.walk(f):
                        if isinstance(stmt, ast.Return):
                            if stmt.value is None:
                                return 0
                            elif isinstance(stmt.value, ast.Tuple):
                                return len(stmt.value.elts)
                            else:
                                return 1
    return 0
```
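For reviewers who want to try the new helpers locally, here is a minimal usage sketch. It is not part of the PR; the sample path, the `GraphModule` class name, and the `forward` method name are assumptions chosen only for illustration, since the actual sample layout is defined elsewhere in the repository.

```python
# Minimal usage sketch for graph_net/collect_stats_util.py.
# Assumptions (not from the PR): the sample lives at samples/resnet_block/model.py,
# defines a class named GraphModule, and its entry point is forward().
from graph_net.collect_stats_util import (
    ModelStats,
    get_argument_name_and_types,
    get_number_of_returns,
    load_class_from_file,
    print_model_stats,
)

model_file = "samples/resnet_block/model.py"  # hypothetical sample path
model_class = load_class_from_file(model_file, "GraphModule")  # assumed class name

# Inspect the (assumed) forward() signature and its return statement.
arg_types = get_argument_name_and_types(model_class, "forward")
num_outputs = get_number_of_returns(model_file, "GraphModule", "forward")

stats = ModelStats(
    model_path=model_file,
    num_inputs=len(arg_types),
    num_outputs=num_outputs,
)
print_model_stats(stats, log_prompt="[collect_stats]")
```

Each field is emitted on its own line in the form `{log_prompt} [ModelStats.<key>] model_path:<model_path> <value>`, with dict-valued fields serialized via `json.dumps`, so the output stays greppable when stats for many models are collected in one run.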