Improve tvmc error message from lazy-loading frontend imports #9074

Merged: 1 commit merged into apache:main on Dec 13, 2021

Conversation

ophirfrish (Contributor) commented

When TVM is installed from the Python package, the frontend framework dependencies such as TensorFlow, PyTorch, and ONNX are not installed by default.
If a user tries to run tvmc with a model whose framework is not installed, they are presented with a raw Python exception in the output.
The aim of this commit is to provide better error messages for failures related to lazy-loading frontend frameworks in tvmc.

cc @areusch @leandron @gromero @Mousius for reviews
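For context, here is a minimal sketch of the lazy-loading pattern this change builds on. The names `lazy_import` and `TVMCImportError` appear in this PR, but the simplified signature, the `Exception` base class, and the `package_name` attribute are assumptions made for illustration (the real `lazy_import` in `frontends.py` also supports options such as `hide_stderr`), not the merged implementation:

```python
import importlib


class TVMCImportError(Exception):
    """Raised when a lazily imported frontend dependency is missing.

    Illustrative stand-in: the real class lives in the tvmc package and
    derives from its own exception hierarchy.
    """

    def __init__(self, package_name):
        self.package_name = package_name
        super().__init__(package_name)


def lazy_import(pkg_name, from_pkg_name=None):
    """Import an optional frontend dependency on first use."""
    try:
        return importlib.import_module(pkg_name, package=from_pkg_name)
    except ImportError:
        # Replace the raw ImportError with an error tvmc can turn into a hint.
        raise TVMCImportError(pkg_name) from None


# Example: a frontend loader would call `torch = lazy_import("torch")`, and the
# tvmc entry point can then tell the user exactly which optional package is missing.
```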

@Mousius (Member) left a comment

Thanks for this change @ophirfrish. Having met these errors many times, it's great to get some guidance from tvmc. I've just a few suggestions 😸

[Inline review comments (outdated, resolved): python/tvm/driver/tvmc/frontends.py, tests/python/driver/tvmc/test_frontends.py]
@leandron (Contributor) left a comment

Generally LGTM - a few improvements to be done in the tests.

[Inline review comments (outdated, resolved): tests/python/driver/tvmc/test_frontends.py]
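Relatedly, a minimal hedged sketch of how a test for the lazy-import error path could look, reusing the illustrative `lazy_import` and `TVMCImportError` from the sketch above; the actual tests in `test_frontends.py` may differ:

```python
import pytest


def test_lazy_import_missing_package():
    # "nonexistent_frontend_package" is assumed not to be installed anywhere,
    # so lazy_import should surface TVMCImportError rather than a raw ImportError.
    with pytest.raises(TVMCImportError):
        lazy_import("nonexistent_frontend_package")
```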
@areusch (Contributor) commented on Oct 3, 2021

@ophirfrish please address the comments and the CI failure (it may just need a re-submit, as CI was flaky over the past week or two).

@ophirfrish (Contributor, Author) commented on Oct 4, 2021 via email

[Inline review comments (outdated, resolved): python/tvm/driver/tvmc/frontends.py, tests/python/driver/tvmc/test_frontends.py]
@areusch (Contributor) left a comment

@ophirfrish thanks! I think @leandron is out for a bit, so I'm reviewing here. In general there are just a couple of cleanups here and there, but also a suggestion if you're up for it.

@leandron (Contributor) left a comment

LGTM

@Mousius (Member) left a comment

Hi @ophirfrish, I checked this out locally and I don't think this is working as intended:

$ tvmc compile resnet18-f37072fd.pth --target=llvm
2021-11-24 12:24:39.653314: W tensorflow/stream_executor/platform/default/dso_loader.cc:60] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory
2021-11-24 12:24:39.653347: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
[12:24:41] /workspaces/tvm-sandbox-micro/src/tvm/src/target/target_kind.cc:164: Warning: Unable to detect CUDA version, default to "-mcpu=sm_20" instead
[12:24:41] /workspaces/tvm-sandbox-micro/src/tvm/src/target/target_kind.cc:190: Warning: Unable to detect ROCm compute arch, default to "-mcpu=gfx900" instead
[12:24:41] /workspaces/tvm-sandbox-micro/src/tvm/src/target/target_kind.cc:204: Warning: Unable to detect ROCm version, assuming >= 3.5
[12:24:41] /workspaces/tvm-sandbox-micro/src/tvm/src/target/target_kind.cc:164: Warning: Unable to detect CUDA version, default to "-mcpu=sm_20" instead
[12:24:41] /workspaces/tvm-sandbox-micro/src/tvm/src/target/target_kind.cc:190: Warning: Unable to detect ROCm compute arch, default to "-mcpu=gfx900" instead
[12:24:41] /workspaces/tvm-sandbox-micro/src/tvm/src/target/target_kind.cc:204: Warning: Unable to detect ROCm version, assuming >= 3.5
Traceback (most recent call last):
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/frontends.py", line 88, in lazy_import
    return importlib.import_module(pkg_name, package=from_pkg_name)
  File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'torch'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/main.py", line 94, in _main
    return args.func(args)
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/compiler.py", line 143, in drive_compile
    tvmc_model = frontends.load_model(args.FILE, args.model_format, args.input_shapes)
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/frontends.py", line 402, in load_model
    mod, params = frontend.load(path, shape_dict, **kwargs)
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/frontends.py", line 251, in load
    torch = lazy_import("torch", hide_stderr=True)
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/frontends.py", line 90, in lazy_import
    raise TVMCImportError({pkg_name})
tvm.driver.tvmc.common.TVMCImportError: {'torch'}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/tvmc", line 33, in <module>
    sys.exit(load_entry_point('tvm', 'console_scripts', 'tvmc')())
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/main.py", line 106, in main
    sys.exit(_main(sys.argv[1:]))
  File "/workspaces/tvm-sandbox-micro/src/tvm/python/tvm/driver/tvmc/main.py", line 97, in _main
    f'Package "{err.message}" is not installed. ' f'Hint: "pip install tlcpack[tvmc]".'
AttributeError: 'TVMCImportError' object has no attribute 'message'

I've added suggestions to solve some of the above and simplify the PR, but it'd be good to consider how we can automate testing of this to avoid this kind of regression.

Otherwise, I think this is a great UX improvement, so ping me again when you've taken a look 😸

[Inline review comments (outdated, resolved): python/tvm/driver/tvmc/frontends.py, python/tvm/driver/tvmc/main.py, tests/python/driver/tvmc/test_frontends.py]
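For reference, the AttributeError in the traceback above has two parts: `lazy_import` raises `TVMCImportError({pkg_name})`, passing a one-element set rather than the package name itself, and `_main` in `main.py` then formats `err.message`, an attribute the exception never defines. Below is a minimal sketch of the fix direction, reusing the illustrative `TVMCImportError` from the earlier sketch; the handler shape, the `parse_args` placeholder, and the exit code are assumptions, not the merged code:

```python
import sys


def _main(argv):
    """Simplified tvmc entry point showing only the import-error handling.

    TVMCImportError refers to the illustrative class from the earlier sketch.
    """
    try:
        args = parse_args(argv)  # placeholder for tvmc's real argument parsing
        return args.func(args)
    except TVMCImportError as err:
        # Read the package name stored on the exception instead of a
        # non-existent `.message` attribute, then point at the optional extras.
        print(
            f'Package "{err.package_name}" is not installed. '
            'Hint: "pip install tlcpack[tvmc]".',
            file=sys.stderr,
        )
        return 1  # illustrative non-zero exit code
```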
@ophirfrish force-pushed the tvmc-msg branch 2 times, most recently from a397284 to 49ccc96 on November 25, 2021 at 07:29
When TVM is installed from the Python package, the frontend framework dependencies such as TensorFlow, PyTorch, and ONNX are not installed by default.
If a user tries to run tvmc with a model whose framework is not installed, they are presented with a raw Python exception in the output.
The aim of this commit is to provide better error messages for failures related to lazy-loading frontend frameworks in tvmc.

Change-Id: Ida52fac4116af392ee436390e14ea02c7090cef0
@leandron (Contributor) left a comment

LGTM. We had a few comments from @Mousius, which seem to be fixed now, so let's merge this one; any new suggestions can be addressed in follow-up PRs.

Thanks @ophirfrish @Mousius @areusch!

@leandron dismissed Mousius's stale review on December 13, 2021 at 11:50

We'll fix any remaining issues in follow-up PRs.

@leandron merged commit 5557b8c into apache:main on Dec 13, 2021
ylc pushed a commit to ylc/tvm that referenced this pull request Jan 7, 2022
yangulei pushed a commit to yangulei/tvm that referenced this pull request Jan 11, 2022
yangulei pushed a commit to yangulei/tvm that referenced this pull request Jan 12, 2022
ylc pushed a commit to ylc/tvm that referenced this pull request Jan 13, 2022
qsqqsqqsq-intellif pushed a commit to qsqqsqqsq-intellif/tvm that referenced this pull request Apr 29, 2022