
INT8 quantization for CoreML models does not work on macOS #10491

Closed
2 tasks done
dokluch opened this issue Dec 13, 2022 · 4 comments
Labels
bug Something isn't working

Comments

dokluch commented Dec 13, 2022

Search before asking

  • I have searched the YOLOv5 issues and found no similar bug report.

YOLOv5 Component

Export

Bug

Passing the --int8 flag to any export.py command on macOS produces the following error:

CoreML: starting export with coremltools 6.1...
Tuple detected at graph output. This will be flattened in the converted model.
Converting PyTorch Frontend ==> MIL Ops: 100%|████████████████████████████████████▉| 607/609 [00:00<00:00, 4497.54 ops/s]
Running MIL Common passes: 100%|███████████████████████████████████████████████████| 39/39 [00:00<00:00, 430.07 passes/s]
Running MIL Clean up passes: 100%|█████████████████████████████████████████████████| 11/11 [00:00<00:00, 248.74 passes/s]
Translating MIL ==> NeuralNetwork Ops: 100%|████████████████████████████████████████| 632/632 [00:00<00:00, 641.14 ops/s]
Quantizing using kmeans_lut quantization
Optimizing Neural Network before Quantization:
Finished optimizing network. Quantizing neural network..
Quantizing layer input.1 of type convolution
CoreML: export failure ❌ 12.7s: scikit-learn package required for k-means quantization

All requirements are installed, and sklearn can be imported in an interactive session.
The --half flag works, and all other export options run properly.

Environment

  • Macbook M1 max
  • Conda environment with YOLOv5 🚀 v7.0-32-g357cde9 Python-3.10.8 torch-1.13.0 CPU

Minimal Reproducible Example

python export.py --weights "weights.pt" --include coreml --int8

Additional

No response

Are you willing to submit a PR?

  • Yes, I'd like to help by submitting a PR!
@dokluch dokluch added the bug Something isn't working label Dec 13, 2022
@dokluch dokluch closed this as completed Dec 13, 2022
@bfialkoff

Is there a solution for this issue? I have the same problem.

@glenn-jocher
Member

@bfialkoff hi there!

I'm sorry to hear that you're experiencing the same issue. Currently, the --int8 flag does not seem to work when using the export command on macOS, even when all requirements are installed properly. We're aware of the issue and are looking into it.

In the meantime, the --half flag may be used as an alternative. If there are any updates or fixes regarding this issue, we will announce them on our GitHub repository. Thank you for bringing this to our attention and for your patience.

If you have any further questions or concerns, please don't hesitate to ask.

Best,
Team Ultralytics


ahbpp commented Dec 1, 2024

Hi there!
It's a bit late, but pip install scikit-learn==1.1.2 worked for me.
It looks like coremltools has strict version requirements for scikit-learn.
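Since the export fails only when coremltools cannot import scikit-learn at quantization time, a pre-flight check can save a wasted export run. This is a hedged sketch, not part of YOLOv5 or coremltools; the helper name sklearn_available is hypothetical, and it only verifies that the package is discoverable by the import system:

```python
# Hypothetical pre-flight check before running export.py with --int8.
# coremltools needs scikit-learn for kmeans_lut quantization; this sketch
# only verifies the package can be located by Python's import machinery.
import importlib.util


def sklearn_available() -> bool:
    """Return True if scikit-learn is importable in this environment."""
    return importlib.util.find_spec("sklearn") is not None


if __name__ == "__main__":
    if sklearn_available():
        print("scikit-learn found; INT8 (kmeans_lut) quantization can be attempted")
    else:
        print("scikit-learn missing; try: pip install scikit-learn==1.1.2")
```

Running this inside the same Conda environment used for export (rather than a different interactive session) also helps rule out an environment mismatch as the cause.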

@pderrenger
Member

Thanks for sharing your solution! Yes, coremltools does have version-specific compatibility requirements for scikit-learn, and installing scikit-learn==1.1.2 should resolve the issue. This will help others facing the same problem. Feel free to let us know if you encounter any further issues!


5 participants