[Refactor] Refactor codebase (open-mmlab#220)
* [WIP] Refactor v2.0 (open-mmlab#163)
* Refactor backend wrapper
* Refactor mmdet.inference
* Fix
* merge
* refactor utils
* Use deployer and deploy_model to manage pipeline
* Resolve comments
* Add a real inference api function
* rename wrappers
* Set execute to private method
* Rename deployer deploy_model
* Refactor task
* remove type hint
* lint
* Resolve comments
* resolve comments
* lint
* docstring
* [Fix]: Fix bugs in details in refactor branch (open-mmlab#192)
* [WIP] Refactor v2.0 (open-mmlab#163)
* Refactor backend wrapper
* Refactor mmdet.inference
* Fix
* merge
* refactor utils
* Use deployer and deploy_model to manage pipeline
* Resolve comments
* Add a real inference api function
* rename wrappers
* Set execute to private method
* Rename deployer deploy_model
* Refactor task
* remove type hint
* lint
* Resolve comments
* resolve comments
* lint
* docstring
* Fix errors
* lint
* resolve comments
* fix bugs
* conflict
* lint and typo
* Resolve comment
* refactor mmseg (open-mmlab#201)
* support mmseg
* fix docstring
* fix docstring
* [Refactor]: Get the count of backend files (open-mmlab#202)
* Fix backend files
* resolve comments
* lint
* Fix ncnn
* [Refactor]: Refactor folders of mmdet (open-mmlab#200)
* Move folders
* lint
* test object detection model
* lint
* reset changes
* fix openvino
* resolve comments
* __init__.py
* Fix path
* [Refactor]: move mmseg (open-mmlab#206)
* [Refactor]: Refactor mmedit (open-mmlab#205)
* feature mmedit
* edit2.0
* edit
* refactor mmedit
* fix __init__.py
* fix __init__
* fix formai
* fix comment
* fix comment
* Fix wrong func_name of ConvFCBBoxHead (open-mmlab#209)
* [Refactor]: Refactor mmdet unit test (open-mmlab#207)
* Move folders
* lint
* test object detection model
* lint
* WIP
* remove print
* finish unit test
* Fix tests
* resolve comments
* Add mask test
* lint
* resolve comments
* Refine cfg file
* Move files
* add files
* Fix path
* [Unittest]: Refine the unit tests in mmdet open-mmlab#214
* [Refactor] refactor mmocr to mmdeploy/codebase (open-mmlab#213)
* refactor mmocr to mmdeploy/codebase
* fix docstring of show_result
* fix docstring of visualize
* refine docstring
* replace print with logging
* refince codes
* resolve comments
* resolve comments
* [Refactor]: mmseg tests (open-mmlab#210)
* refactor mmseg tests
* rename test_codebase
* update
* add model.py
* fix
* [Refactor] Refactor mmcls and the package (open-mmlab#217)
* refactor mmcls
* fix yapf
* fix isort
* refactor-mmcls-package
* fix print to logging
* fix docstrings according to others comments
* fix comments
* fix comments
* fix allentdans comment in pr215
* remove mmocr init
* [Refactor] Refactor mmedit tests (open-mmlab#212)
* feature mmedit
* edit2.0
* edit
* refactor mmedit
* fix __init__.py
* fix __init__
* fix formai
* fix comment
* fix comment
* buff
* edit test and code refactor
* refactor dir
* refactor tests/mmedit
* fix docstring
* add test coverage
* fix lint
* fix comment
* fix comment
* Update typehint (open-mmlab#216)
* update type hint
* update docstring
* update
* remove file
* fix ppl
* Refine get_predefined_partition_cfg
* fix tensorrt version > 8
* move parse_cuda_device_id to device.py
* Fix cascade
* onnx2ncnn docstring

Co-authored-by: Yifan Zhou <[email protected]>
Co-authored-by: RunningLeon <[email protected]>
Co-authored-by: VVsssssk <[email protected]>
Co-authored-by: AllentDan <[email protected]>
Co-authored-by: hanrui1sensetime <[email protected]>
1 parent d742b42 · commit 3a785f1
220 changed files with 6,698 additions and 5,837 deletions; only the first few diffs are reproduced below.
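The file-by-file diffs below all move in the same direction: backend plumbing is consolidated under `mmdeploy.backend.*`, while codebase-specific logic is reached through a task processor built by `mmdeploy.apis.utils.build_task_processor`. A sketch collecting the post-refactor import paths that appear in the diffs reproduced here (modules touched by the commit but not shown below are omitted):

```python
# Post-refactor import locations, as they appear in the diffs below.
from mmdeploy.apis.utils import build_task_processor  # builds the per-codebase task pipeline
from mmdeploy.backend.ncnn import is_available as ncnn_available
from mmdeploy.backend.onnxruntime import is_available as ort_available
from mmdeploy.backend.openvino import is_available as openvino_available
from mmdeploy.backend.ppl import is_available as ppl_available
```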
```diff
@@ -1,75 +1,40 @@
-from typing import Optional, Sequence, Union
+from typing import Any, Sequence, Union

 import mmcv
 import numpy as np
 import torch

-from mmdeploy.utils import (Backend, get_backend, get_codebase,
-                            get_input_shape, get_task_type, load_config)
-from .utils import (create_input, init_backend_model, init_pytorch_model,
-                    run_inference, visualize)
+from mmdeploy.utils import get_input_shape, load_config


 def inference_model(model_cfg: Union[str, mmcv.Config],
                     deploy_cfg: Union[str, mmcv.Config],
-                    model: Union[str, Sequence[str], torch.nn.Module],
-                    img: Union[str, np.ndarray],
-                    device: str,
-                    backend: Optional[Backend] = None,
-                    output_file: Optional[str] = None,
-                    show_result: bool = False):
+                    backend_files: Sequence[str], img: Union[str, np.ndarray],
+                    device: str) -> Any:
     """Run inference with PyTorch or backend model and show results.
     Args:
         model_cfg (str | mmcv.Config): Model config file or Config object.
         deploy_cfg (str | mmcv.Config): Deployment config file or Config
             object.
-        model (str | list[str], torch.nn.Module): Input model or file(s).
+        backend_files (Sequence[str]): Input backend model file(s).
         img (str | np.ndarray): Input image file or numpy array for inference.
         device (str): A string specifying device type.
-        backend (Backend): Specifying backend type, defaults to `None`.
-        output_file (str): Output file to save visualized image, defaults to
-            `None`. Only valid if `show_result` is set to `False`.
-        show_result (bool): Whether to show plotted image in windows, defaults
-            to `False`.
     Returns:
         Any: The inference results
     """
     deploy_cfg, model_cfg = load_config(deploy_cfg, model_cfg)

-    codebase = get_codebase(deploy_cfg)
-    task = get_task_type(deploy_cfg)
-    input_shape = get_input_shape(deploy_cfg)
-    if backend is None:
-        backend = get_backend(deploy_cfg)
-
-    if isinstance(model, str):
-        model = [model]
+    from mmdeploy.apis.utils import build_task_processor
+    task_processor = build_task_processor(model_cfg, deploy_cfg, device)

-    if isinstance(model, (list, tuple)):
-        assert len(model) > 0, 'Model should have at least one element.'
-        assert all([isinstance(m, str) for m in model]), 'All elements in the \
-            list should be str'
+    model = task_processor.init_backend_model(backend_files)

-    if backend == Backend.PYTORCH:
-        model = init_pytorch_model(codebase, model_cfg, model[0], device)
-    else:
-        device_id = -1 if device == 'cpu' else 0
-        model = init_backend_model(
-            model,
-            model_cfg=model_cfg,
-            deploy_cfg=deploy_cfg,
-            device_id=device_id)
-
-    model_inputs, _ = create_input(codebase, task, model_cfg, img, input_shape,
-                                   device)
+    input_shape = get_input_shape(deploy_cfg)
+    model_inputs, _ = task_processor.create_input(img, input_shape)

     with torch.no_grad():
-        result = run_inference(codebase, model_inputs, model)
+        result = task_processor.run_inference(model, model_inputs)

-    visualize(
-        codebase,
-        img,
-        result=result,
-        model=model,
-        output_file=output_file,
-        backend=backend,
-        show_result=show_result)
     return result
```
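After this change the backend and the codebase-specific pipeline are both resolved from `deploy_cfg` through the task processor, and visualization is no longer part of `inference_model` (the `backend`, `output_file` and `show_result` arguments are gone). A minimal usage sketch of the new signature, assuming `inference_model` is re-exported from `mmdeploy.apis` (not shown in this diff) and using placeholder file names:

```python
from mmdeploy.apis import inference_model  # assumed re-export, see note above

# All paths below are placeholders, not files from this commit.
result = inference_model(
    model_cfg='configs/faster_rcnn_r50_fpn_1x_coco.py',
    deploy_cfg='configs/mmdet/detection_onnxruntime_dynamic.py',
    backend_files=['work_dir/end2end.onnx'],
    img='demo.jpg',
    device='cpu')
print(result)
```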
```diff
@@ -1,28 +1,8 @@
-import importlib
-import os.path as osp
-
-from .init_plugins import get_onnx2ncnn_path, get_ops_path
-
-__all__ = ['get_ops_path', 'get_onnx2ncnn_path']
-
-
-def is_available():
-    """Check whether ncnn with extension is installed.
-    Returns:
-        bool: True if ncnn and its extension are installed.
-    """
-    ncnn_ops_path = get_ops_path()
-    if not osp.exists(ncnn_ops_path):
-        return False
-    has_pyncnn = importlib.util.find_spec('ncnn') is not None
-    has_pyncnn_ext = importlib.util.find_spec(
-        'mmdeploy.apis.ncnn.ncnn_ext') is not None
-
-    return has_pyncnn and has_pyncnn_ext
+from mmdeploy.backend.ncnn import is_available
+
+__all__ = ['is_available']

 if is_available():
-    from .ncnn_utils import NCNNWrapper
-
-    __all__ += ['NCNNWrapper']
+    from mmdeploy.backend.ncnn.onnx2ncnn import (onnx2ncnn,
+                                                 get_output_model_file)
+    __all__ += ['onnx2ncnn', 'get_output_model_file']
```
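The onnxruntime, OpenVINO and ppl entry points in the following diffs are reworked the same way: a thin re-export of `is_available` from `mmdeploy.backend.*`, with converter helpers pulled in only when the backend is importable. A sketch of how the guarded ncnn exports might be used; the argument order and return value of the two helpers are assumptions for illustration, since their bodies are not part of this diff:

```python
from mmdeploy.backend.ncnn import is_available

if is_available():
    from mmdeploy.backend.ncnn.onnx2ncnn import (get_output_model_file,
                                                 onnx2ncnn)
    # Assumed behaviour: derive the ncnn .param/.bin paths for an ONNX file,
    # then run the ONNX -> ncnn conversion.
    save_param, save_bin = get_output_model_file('end2end.onnx', 'work_dir')
    onnx2ncnn('end2end.onnx', save_param, save_bin)
else:
    print('ncnn backend not available; skipping conversion.')
```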
```diff
@@ -1,22 +1,3 @@
-import importlib
-import os.path as osp
+from mmdeploy.backend.onnxruntime import is_available

-from .init_plugins import get_ops_path
-
-
-def is_available():
-    """Check whether onnxruntime and its custom ops are installed.
-    Returns:
-        bool: True if onnxruntime package is installed and its
-        custom ops are compiled.
-    """
-    onnxruntime_op_path = get_ops_path()
-    if not osp.exists(onnxruntime_op_path):
-        return False
-    return importlib.util.find_spec('onnxruntime') is not None
-
-
-if is_available():
-    from .onnxruntime_utils import ORTWrapper
-    __all__ = ['get_ops_path', 'ORTWrapper']
+__all__ = ['is_available']
```
```diff
@@ -1,19 +1,11 @@
-import importlib
-
-
-def is_available() -> bool:
-    """Checking if OpenVINO is installed.
-    Returns:
-        bool: True if OpenVINO is installed.
-    """
-    return importlib.util.find_spec('openvino') is not None
+from mmdeploy.backend.openvino import is_available
+
+__all__ = ['is_available']

 if is_available():
-    from .openvino_utils import OpenVINOWrapper, get_input_shape_from_cfg
-    from .onnx2openvino import (onnx2openvino, get_output_model_file)
-    __all__ = [
-        'OpenVINOWrapper', 'onnx2openvino', 'get_output_model_file',
-        'get_input_shape_from_cfg'
+    from mmdeploy.backend.openvino.onnx2openvino \
+        import onnx2openvino, get_output_model_file
+    from .utils import get_input_shape_from_cfg
+    __all__ += [
+        'onnx2openvino', 'get_output_model_file', 'get_input_shape_from_cfg'
     ]
```
```diff
@@ -0,0 +1,22 @@
+from typing import List
+
+import mmcv
+
+
+def get_input_shape_from_cfg(config: mmcv.Config) -> List[int]:
+    """Get the input shape from the model config for OpenVINO Model Optimizer.
+    Args:
+        config (mmcv.Config): Model config.
+    Returns:
+        List[int]: The input shape in [1, 3, H, W] format from config
+            or [1, 3, 800, 1344].
+    """
+    shape = []
+    test_pipeline = config.get('test_pipeline', None)
+    if test_pipeline is not None:
+        img_scale = test_pipeline[1]['img_scale']
+        shape = [1, 3, img_scale[1], img_scale[0]]
+    else:
+        shape = [1, 3, 800, 1344]
+    return shape
```
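Because the whole helper is shown above, its behaviour can be pinned down with a quick example. The config below is a hypothetical stand-in for a real test pipeline, and the import path is assumed to be the OpenVINO api package that re-exports `get_input_shape_from_cfg` in the previous diff:

```python
import mmcv

from mmdeploy.apis.openvino import get_input_shape_from_cfg  # assumed path

# Hypothetical config: test_pipeline[1] carries the (W, H) img_scale.
cfg = mmcv.Config(
    dict(test_pipeline=[
        dict(type='LoadImageFromFile'),
        dict(type='MultiScaleFlipAug', img_scale=(1333, 800)),
    ]))

print(get_input_shape_from_cfg(cfg))                  # [1, 3, 800, 1333]
print(get_input_shape_from_cfg(mmcv.Config(dict())))  # fallback: [1, 3, 800, 1344]
```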
```diff
@@ -1,16 +1,8 @@
-import importlib
-
-
-def is_available():
-    """Check whether ppl is installed.
-    Returns:
-        bool: True if ppl package is installed.
-    """
-    return importlib.util.find_spec('pyppl') is not None
+from mmdeploy.backend.ppl import is_available
+
+__all__ = ['is_available']

 if is_available():
-    from .ppl_utils import PPLWrapper, register_engines
-    from .onnx2ppl import onnx2ppl
-    __all__ = ['register_engines', 'PPLWrapper', 'onnx2ppl']
+    from mmdeploy.backend.ppl import onnx2ppl
+
+    __all__ += ['onnx2ppl']
```