
[WIP FEATURE] Linux Snap Install #152

Open
RyanMetcalfeInt8 opened this issue Dec 4, 2024 · 17 comments
Comments

@RyanMetcalfeInt8 (Contributor)

Let's coordinate the tasks needed to support Snap installation on Ubuntu (and other distros).

All changes to support this feature will go into snap branch, so please target PRs accordingly. Once it's all working alongside the traditional Windows/Linux install mechanism, we'll merge snap into main.

Some initial details (from @frenchwr), in response to this question: "Would it be possible for you to give a high level description of what the Snap installation will actually do? e.g. Will it invoke install.sh, or simply extract some bundled set of artifacts (like python virtual env.) from a previous installation?"

  • I am not invoking install.sh directly. That was my original aim but it just got too messy. I am, however, using setup.py and requirements.txt to install the necessary Python packages.
  • The snap distributes model_setup.py as an "app" within the snap that can be invoked with something like openvino-ai-plugins-gimp.model-setup. The models are installed to GIMP_OPENVINO_MODELS_PATH, if defined.
  • The ultimate aim is to integrate the openvino-ai-plugins-gimp snap with the gimp snap so that a user can just snap install gimp and have the OpenVINO AI plugin support out-of-the-box.
  • Along these lines, I've designed the openvino-ai-plugins-gimp snap so that when a user runs gimp it will automatically invoke complete_install.setup_python_weights so that gimp_openvino_config.json will get generated (and written to GIMP_OPENVINO_CONFIG_PATH) on each run. In this way we check for device support (GPU, NPU, etc.) at runtime. I realize this will also copy the super resolution and semseg models each run, but since the models are relatively small it should not be a big penalty.
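The environment-variable lookups described above can be sketched as follows. The variable names GIMP_OPENVINO_MODELS_PATH and GIMP_OPENVINO_CONFIG_PATH come from the comment; the fallback defaults are assumptions for illustration only:

```python
import os
from pathlib import Path


def resolve_models_path() -> Path:
    # Honor GIMP_OPENVINO_MODELS_PATH if the snap (or user) defined it;
    # otherwise fall back to a per-user default (the default is assumed here).
    env = os.environ.get("GIMP_OPENVINO_MODELS_PATH")
    if env:
        return Path(env)
    return Path.home() / "openvino-ai-plugins-gimp" / "weights"


def resolve_config_path() -> Path:
    # gimp_openvino_config.json is regenerated on each run so that
    # device support (GPU, NPU, ...) is detected at runtime.
    env = os.environ.get("GIMP_OPENVINO_CONFIG_PATH")
    if env:
        return Path(env) / "gimp_openvino_config.json"
    return Path.home() / "gimp_openvino_config.json"
```

This keeps the snap's confinement-friendly paths configurable while still working in a plain pip-based install where neither variable is set.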

TODO List (This will grow):

  • Device lists (i.e. CPU, GPU, NPU) should be populated at runtime (via openvino.core.available_devices) instead of at install.sh time.
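The runtime device query in the TODO above could look roughly like this; the ImportError fallback to a bare CPU list is an assumption added so the sketch degrades gracefully on machines without OpenVINO installed:

```python
def available_devices() -> list[str]:
    """Query OpenVINO for devices (CPU, GPU, NPU, ...) at runtime
    instead of hard-coding the list at install.sh time."""
    try:
        from openvino import Core  # current API; older releases expose openvino.runtime.Core
        return list(Core().available_devices)
    except ImportError:
        # Conservative fallback (assumed) so the plugin UI can still render.
        return ["CPU"]
```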
@frenchwr (Contributor) commented Dec 4, 2024

Are you all open to updating the logic here for determining the Python interpreter? Some Linux distros have stopped shipping a Python interpreter named python in favor of python3. If you're open to it I can submit a PR against snap for testing.

@RyanMetcalfeInt8 (Contributor, Author)

> Are you all open to updating the logic here for determining the Python interpreter? Some Linux distros have stopped shipping a Python interpreter named python in favor of python3. If you're open to it I can submit a PR against snap for testing.

Yes, I think so. Looking back at that logic, I'm actually a little confused about why we do that (get the path, and then append the right executable name to it). I would think that sys.executable may just give us what we want? Perhaps I'm missing something.

Anyway, yes, definitely open to any improvements here!

@gblong1 (Contributor) commented Dec 4, 2024

> I would think that sys.executable may just give us what we want? Perhaps I'm missing something.

Nope, not missing anything. We should be using sys.executable. This will ensure we are pointing to the same Python interpreter that was used in the virtual environment.
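A minimal sketch of the two points raised here: prefer sys.executable when running inside the target interpreter, and fall back to searching PATH with python3 tried before python (since some distros no longer ship a binary named plain "python"). The function name and fallback order are illustrative, not the project's actual implementation:

```python
import shutil
import sys


def find_python() -> str:
    # Inside the running interpreter (the usual case for the plugin),
    # sys.executable already points at the correct binary.
    if sys.executable:
        return sys.executable
    # Otherwise search PATH, trying python3 first because several
    # distros have dropped the plain "python" name.
    for name in ("python3", "python"):
        path = shutil.which(name)
        if path:
            return path
    raise RuntimeError("No Python interpreter found on PATH")
```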

@gblong1 (Contributor) commented Dec 5, 2024

> Are you all open to updating the logic here for determining the Python interpreter? Some Linux distros have stopped shipping a Python interpreter named python in favor of python3. If you're open to it I can submit a PR against snap for testing.

Hi @frenchwr - do you want to create the PR for this one? If not, I can do it. Thanks!

@frenchwr (Contributor) commented Dec 5, 2024

> Hi @frenchwr - do you want to create the PR for this one? If not, I can do it. Thanks!

@gblong1 You can go for it! If you tag me in the PR I'm happy to test on my side.

@gblong1 (Contributor) commented Dec 5, 2024

@frenchwr PR created: #155

@RyanMetcalfeInt8 (Contributor, Author)

> @frenchwr PR created: #155

I reviewed & merged into snap branch, @frenchwr you can test it from there if you'd like. Thanks!

@frenchwr (Contributor) commented Dec 5, 2024

A few other nice-to-haves if you're open to it:

  • Can we replace these lines and instead just import with something like the following? As it is written now, model_setup.py can only be run from the root directory of this repo.
from gimpopenvino.plugins.openvino_utils.tools.model_manager import ModelManager
  • Could we either wrap this in a try-except or (even better, if we can get away with it) store these files as executable in the repo?
  • Are we sure this line is needed? I believe typing is built into recent versions of Python, and it caused conflicts when it was installed in my snap, so I ended up needing to remove it.

I'm happy to help make these changes and test, but just wanted to get your thoughts first in case I'm missing something.
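The first bullet's point, made runnable: anchor imports to the script's own location rather than the caller's current working directory, so model_setup.py can be launched from anywhere (e.g. by a snap wrapper). The helper name and layout below are illustrative assumptions, not the repo's actual code:

```python
import os
import sys


def add_repo_root_to_path(script_path: str) -> str:
    """Put the directory containing the given script on sys.path so that
    package imports work no matter which directory the user launched from."""
    repo_root = os.path.dirname(os.path.abspath(script_path))
    if repo_root not in sys.path:
        sys.path.insert(0, repo_root)
    return repo_root


# In model_setup.py this would be called with __file__, after which a
# package import works from any cwd, e.g.:
#   add_repo_root_to_path(__file__)
#   from gimpopenvino.plugins.openvino_utils.tools.model_manager import ModelManager
```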

@frenchwr (Contributor) commented Dec 5, 2024

Also, I can create a separate issue for this if you prefer, but whenever I run Stable Diffusion on an NPU (balanced or power-efficiency mode) I see errors like the following:

File "/snap/gimp/x14/usr/lib/x86_64-linux-gnu/gimp/2.99/plug-ins/openvino_utils/tools/openvino_common/models_ov/stable_diffusion_engine.py", line 157, in load_model
    with open(os.path.join(model, f"{model_name}.blob"), "rb") as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/openvino-ai-plugins-gimp/weights/stable-diffusion-ov/stable-diffusion-1.5/square_int8/unet_int8.blob'

I don't see those blob files after downloading the models with model_setup.py. Is there some extra step to convert them to .blob format? I've tested with square_int8 and also square_lcm. Performance mode (with GPU only) runs perfectly.

@gblong1 (Contributor) commented Dec 5, 2024

> I'm happy to help make these changes and test, but just wanted to get your thoughts first in case I'm missing something.

Please feel free to make the changes and submit a PR, thanks!

> I don't see those blob files after downloading the models with model_setup.py. Is there some extra step to convert them to .blob format? I've tested with square_int8 and also square_lcm. Performance mode (with GPU only) runs perfectly.

After the model download completes, the NPU .blobs should get created. For example, see step 3 of the standalone script README.

If that is not happening, let me know. Which platform are you running on?

@frenchwr (Contributor) commented Dec 5, 2024

> After the model download completes, the NPU .blobs should get created. For example, see step 3 of the standalone script README.

Aha - OK, all good now. I didn't realize the NPU runtime libs were needed at the model download stage.

I assume the models are compiled for the NPU at this stage because the compilation is somewhat slow?

@frenchwr (Contributor) commented Dec 5, 2024

> Please feel free to make the changes and submit a PR, thanks!

I'll work on a PR tomorrow. This will get me really close to having the snap functioning without any hacks in the packaging.

@gblong1 (Contributor) commented Dec 5, 2024

> I assume the models are compiled for the NPU at this stage because the compilation is somewhat slow?

Correct. We don't want the user waiting for the models to compile the first time they run a specific feature, so we compile them at model download time, while they are already waiting for the models to download.
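The compile-at-download-time behavior described here amounts to a blob cache: compile once while the user is already waiting, then load the .blob on every later run. A generic sketch of that pattern, where compile_fn stands in for OpenVINO's actual compile/export calls (the function and parameter names are assumptions):

```python
from pathlib import Path
from typing import Callable


def ensure_blob(model_dir: Path, model_name: str,
                compile_fn: Callable[[], bytes]) -> bytes:
    """Return the precompiled NPU blob, compiling it only if missing.

    Compilation happens once, at model-download time, so users don't
    pay the (slow) NPU compile cost the first time they run a feature.
    """
    blob_path = model_dir / f"{model_name}.blob"
    if not blob_path.exists():
        blob_path.write_bytes(compile_fn())  # slow step, done up front
    return blob_path.read_bytes()
```

On later runs, the plugin only hits the fast `read_bytes()` path; this is why the missing-blob FileNotFoundError above appears if the compile step was skipped at download time.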

@frenchwr (Contributor) commented Dec 6, 2024

@RyanMetcalfeInt8 @gblong1 I just opened a PR with the changes I mentioned yesterday: #156. Let me know what you think.

@gblong1 (Contributor) commented Dec 7, 2024

Thanks! We'll test it out and provide any feedback.

@frenchwr (Contributor)

Hello! The snap packaging work is very close to being done on our (Canonical) side. I was hoping to share some details today for how you could test and provide feedback, but unfortunately it's not quite ready.

I'll have something to share in the new year. Happy holidays!

@frenchwr (Contributor) commented Jan 6, 2025

Hi @gblong1 and @RyanMetcalfeInt8,

We have published a version of the GIMP snap (still using version 2.99.16 at this point) in the Snap Store. If you want to try installing it on an Ubuntu 24.04 machine and running some tests that would be valuable feedback for us! Instructions for installing and running can be found in the README here: https://github.com/snapcrafters/gimp/tree/2.99-openvino

Some notes:

  • The NPU driver is on version 1.6.0, where Lunar Lake support is not great (as we discussed in NPU Turbo Bug in Linux #158). We'll be updating to a more recent version (1.10.1) soon, but in the meantime you probably want to limit testing to Meteor Lake.
  • You'll notice that currently you need to install four snaps separately, but this will soon be simplified to a single command like sudo snap install gimp --channel 2.99-openvino/beta. And once GIMP 3.0 support is deemed stable, hopefully it will be as simple as sudo snap install gimp.
  • My test cases for the three OV plugins are very manual, so if you have something more comprehensive on your end, that would be a huge value add for validating the snap packaging work we've done.

Let us know what you think! We hope this will make the GIMP plugin work you've done more accessible for Linux users.
