installation issues for deployment #697

Open · connormeaton opened this issue Mar 15, 2023 · 6 comments

Labels: dependencies (Pull requests that update a dependency file), high-priority

@connormeaton commented Mar 15, 2023
Hello, I am looking to deploy a tsai-trained model for inference. I see you created the tsai.inference module for lightweight inference with load_learner. This looks like it will be helpful. However, I am having some issues building tsai in a fresh environment.

I prefer to use mamba to install and am running the command mamba install -c timeseriesai tsai. This appears to work as expected. However, when I run the inference code from tsai.inference import load_learner in the fresh environment, I get a bunch of dependency errors. I receive messages saying I need IPython, ipykernel, chardet, webrtcvad, etc. The docs state that the standard installation methods I am using install only what is required. Why am I being asked to download all of these other modules when I am not using them? I can install them, but I would prefer not to, so as to keep my production environment as small as possible.

This happens with conda and pip as well.

Please let me know if I am doing something wrong. Thank you!
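
For context, this is roughly the inference code I am trying to run. The model path and input array below are just placeholders, and I am assuming get_X_preds is the right call for batch inference:

```python
import numpy as np
from tsai.inference import load_learner

# Load the learner exported after training (placeholder path)
learn = load_learner("models/clf.pkl")

# Dummy batch with the training shape: (n_samples, n_variables, seq_len) -- placeholder sizes
X_new = np.random.rand(16, 3, 100).astype(np.float32)

# get_X_preds returns probabilities, targets (None here) and decoded class predictions
probs, _, preds = learn.get_X_preds(X_new)
print(preds)
```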

oguiza added the dependencies and high-priority labels on Mar 16, 2023
oguiza added a commit that referenced this issue on Mar 16, 2023

@oguiza (Contributor) commented Mar 16, 2023

Hi @connormeaton, thanks a lot for sharing this.
I agree it doesn't make sense that you need to install additional dependencies to perform inference.
I've investigated this and I believe the issue is resolved now.
Bear in mind that the learner object contains more than the model's weights (i.e. empty datasets, empty dataloaders, item and batch transforms, etc.), so a lot of imports are required. One of them was IPython, which doesn't make sense if you are not using notebooks. I've now reorganized the imports so that it's no longer required when you don't use notebooks. This change is only available on GitHub until I create a new pip/conda release (I'm planning to do it by the end of this month).
So you have 2 options now.

  1. Add IPython to your current environment (recommended approach):
    mamba install -c anaconda ipython
    I've tested it, and adding IPython allowed me to run inference in a fresh environment I created using:
    mamba create -n py38torch113 python=3.8 tsai -c timeseriesai
  2. Pip install the GitHub version of tsai, which doesn't require IPython:
    pip install git+https://github.com/timeseriesAI/tsai.git
    But in this case, it's likely that you'll hit new issues when you try to import your model, due to changes in the code. To avoid them, you'd need to retrain your model in an environment with the same GitHub version.

So I'd recommend testing the 1st approach. And when tsai 0.3.6 becomes available, you'll be able to create a new environment that doesn't require IPython.
Please let me know if this works.

@connormeaton (Author)

Thank you so much @oguiza. This was very helpful.

Unfortunately, I am realizing that fastai and pyinstaller do not mix well for creating a shippable binary. It looks like converting the model to plain PyTorch and then compiling with ONNX may be the best method for compilation and deployment. Do you have any recommendations on using a tsai model for inference in a PyTorch-only environment?

@oguiza (Contributor) commented Mar 21, 2023

I'm not sure what you mean by a "Pytorch-only environment".
In general, my recommendation is to install tsai in a new environment.
Can you create a new conda environment and then install tsai (from PyPI or conda)?

@connormeaton (Author)

Yes, I can do that. I was just curious whether you had experience running a tsai-trained model with PyTorch alone, without calling tsai at all. I'm assuming this could be done because tsai uses PyTorch (I think). The issue is that tsai/fastai do not interact well with pyinstaller. I want to turn my model into a binary and run it on other machines. I know that this can be done with PyTorch and ONNX, so I was just curious if you knew of any options. No worries if not, it's very specific. Thank you for your help!

@oguiza (Contributor) commented Mar 21, 2023

Well, the issue is not so much converting the model to pure PyTorch or ONNX. You could do that when you train the model.
But during training you usually need to apply transforms, and you need to ensure those transforms are applied in the same way during inference. That's why you export a Learner object, which contains not only the trained model but also empty dataloaders (including transforms) that can be used for inference. This is a very common scenario.
But if you trained a model without any transforms, you could simply train it, convert it to ONNX, and use it for inference.
If you are interested in pursuing this approach you may want to take a look at this notebook, where I show how to convert a model to ONNX. At this stage, I'm not sure all models in tsai can be converted, but we plan to do this over the coming weeks.
One additional enhancement would be to add a new learn.to_onnx() method to simplify this task.
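As a very rough sketch of what that conversion could look like for a transform-free model (the input shape, file names, and variable names below are placeholders; learn.model is simply the underlying PyTorch module inside the learner):

```python
import numpy as np
import torch
import onnxruntime as ort

# Grab the plain PyTorch module from the trained learner (no fastai needed afterwards)
model = learn.model.eval().cpu()

# Dummy input with the training shape: (batch, n_variables, seq_len) -- placeholder sizes
dummy = torch.randn(1, 3, 100)

# Export to ONNX, allowing a variable batch dimension
torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["input"], output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)

# Inference with onnxruntime only
sess = ort.InferenceSession("model.onnx")
X_new = np.random.rand(8, 3, 100).astype(np.float32)
logits = sess.run(None, {"input": X_new})[0]
preds = logits.argmax(axis=1)
```

Note that any preprocessing applied during training (scaling, etc.) would still have to be replicated manually before calling the ONNX session.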

@connormeaton (Author)

Thank you so much for the info. I will review the notebook you referenced.

A learn.to_onnx() method would be incredible!
