No module named 'ultralytics.yolo'

Hi, I keep encountering this issue. I install it with:

!pip install --no-cache-dir --force-reinstall git+https://github.com/DeGirum/ultralytics_yolov8.git@main

But I couldn’t import YOLO from ultralytics:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-3-132e02f78920> in <cell line: 0>()
----> 1 from ultralytics import YOLO
      2 model = YOLO('yolov8s.pt')
      3 
      4 model.export(format="tflite", simplify=True, imgsz=320, int8=True, export_hw_optimized=True, uint8_io_dtype=True)

8 frames
/usr/local/lib/python3.11/dist-packages/ultralytics/utils/postprocess_utils.py in <module>
      4 
      5 from ultralytics.nn.modules.block import DFL
----> 6 from ultralytics.yolo.utils.tal import dist2bbox, make_anchors
      7 
      8 

ModuleNotFoundError: No module named 'ultralytics.yolo'

---------------------------------------------------------------------------
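A quick way to narrow this down (just a troubleshooting sketch, assuming the error comes from a stale or mixed ultralytics install in the Colab runtime rather than from the fork itself) is to remove any pre-installed ultralytics, reinstall the fork, restart the runtime, and then check which copy is actually imported:

# Troubleshooting sketch (assumption: a stale/mixed ultralytics install, not the fork itself)
!pip uninstall -y ultralytics
!pip install --no-cache-dir git+https://github.com/DeGirum/ultralytics_yolov8.git@main

# After restarting the runtime, confirm which ultralytics package is being imported
import ultralytics
print(ultralytics.__file__)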

I am trying to convert my custom-trained YOLOv8 model for use with a Google Coral Edge TPU, so I need it to use uint8, and I found your repo on GitHub.

Also, is there any way this would be possible with YOLO11?

Hi @josephcy95
Welcome to the DeGirum community. We will give you access to our cloud compiler so that you can directly convert your checkpoints to working models.

Thanks, how do I access that? I already have an account at hub.degirum.com with the same username and email.

Hi @josephcy95
As soon as we enable access to your account, I will let you know.

Thanks, sir! Can’t wait to test it. Does it also work for YOLO11 with uint8?

@josephcy95
Just a heads up: the compiler works for YOLO11 as well, but YOLO11 does not compile for the Edge TPU due to some unsupported layers. If your hardware is an Edge TPU, the most suitable models in terms of speed and accuracy are the YOLOv8 family.

Hi @josephcy95
You should now have permission to compile a model using our cloud compiler. Please create a model zoo first in the Models tab; your compiled model will then appear in that zoo. Please reach out if you need help.

Thanks a lot, will try it out now

Alright, I tried compiling with the image size specified as 320x320, but somehow the compiled TFLite model ends up with a 160x160x12 input.

I was expecting it to be 320x320x3. Did I do something wrong?

Hi @josephcy95
There is nothing wrong. This is a performance optimization. Did you check if the model is running properly? @khatami.mehrdad can explain the details of the optimization.
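One plausible reading of that optimization, going by the shapes alone (an assumption, not a confirmed description of the compiled model): a 2x2 space-to-depth transform turns a 320x320x3 image into exactly 160x160x12, so no pixel data is lost. A minimal NumPy sketch of that reshaping:

import numpy as np

# Hypothetical illustration: fold each 2x2 pixel block into the channel axis.
# The actual block ordering used by the compiled model is not confirmed here.
img = np.zeros((320, 320, 3), dtype=np.uint8)              # H, W, C
h, w, c = img.shape
blocks = img.reshape(h // 2, 2, w // 2, 2, c)              # carve out 2x2 blocks
s2d = blocks.transpose(0, 2, 1, 3, 4).reshape(h // 2, w // 2, 4 * c)
print(s2d.shape)                                           # (160, 160, 12)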

Hi, sorry, I was just about to read it but it was deleted. Can I get some idea of the expected input and output so I can better modify my code to use the model?

Hi @josephcy95
You can follow our PySDK examples to run inference: DeGirum/PySDKExamples: DeGirum PySDK Usage Examples
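For reference, the inference pattern in those examples looks roughly like the sketch below; the zoo URL, token, and model name are placeholders, so take the exact arguments from the linked repo:

import degirum as dg

# Placeholders: use your own hub.degirum.com token, zoo URL, and compiled model name
zoo = dg.connect(dg.CLOUD, "<your_zoo_url>", "<your_token>")
model = zoo.load_model("<your_compiled_yolov8_model>")

result = model("test_image.jpg")   # run inference on an image file
print(result.results)              # list of detections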

Hi @josephcy95
Just checking: are you able to run inference on the compiled model?