ERROR "DegirumException: Model 'yolov8n' is not found in model zoo"

Hi guys. I need help with a problem (summarized below). I tried to create two custom models (one YOLOv5s and one YOLOv8n) based on Luiz_doleron's tutorial (Run custom YOLOv8 Model on RPI5 with Hailo AI Shield - #6 by Luiz_doleron - General - Hailo Community). With both, running my Python script fails with an error saying it couldn't locate my model files.

Regarding my script, the idea is to use a custom model with a single class ('minerio'). With the DeGirum library, I'm running class-count tests within a region. A side question: since I'm using a Raspberry Pi 5 with the AI HAT+ (Hailo-8), would I be able to trigger GPIOs (I'm thinking of using the 'lgpio' library) for external actions, such as switching relays?

In summary, my current script caps the camera FPS (I had to add this because the camera was overloading and crashing my system), and, as mentioned, I'm trying to count classes within a specific region. Later I'd like to trigger GPIOs as soon as my 'minerio' class enters the area. Now, to the problem:

(degirum_env) rasp5@raspberrypi:~/Desktop/degirum_test/yolov8n $ python3 /home/rasp5/hailo_examples/counting_test1.py 
Traceback (most recent call last):
  File "/home/rasp5/hailo_examples/counting_test1.py", line 42, in <module>
    model = dg.load_model(
            ^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/__init__.py", line 220, in load_model
    return zoo.load_model(model_name, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/log.py", line 59, in wrap
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/zoo_manager.py", line 270, in load_model
    model = self._zoo.load_model(model_name)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/log.py", line 59, in wrap
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/_zoo_accessor.py", line 313, in load_model
    model_params = self.model_info(model)
                   ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/log.py", line 59, in wrap
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/_zoo_accessor.py", line 131, in model_info
    raise DegirumException(
degirum.exceptions.DegirumException: Model 'yolov8n' is not found in model zoo '/home/rasp5/Desktop/degirum_test'
(degirum_env) rasp5@raspberrypi:~/Desktop/degirum_test/yolov8n $ 

Using default settings such as:
zoo_url = "degirum/hailo"
model_name = 'yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1'
and any "video_source", my script runs without any problems. When pointing to my own 'zoo_url' and 'model_name', I get the error. Below I focus on the yolov8n case.

My script /home/rasp5/hailo_examples/counting_test1.py:

import degirum as dg, degirum_tools
import cv2
import time

inference_host_address = "@local"

#URLS----------------------------------------------------
zoo_url = "/home/rasp5/Desktop/degirum_test"  # my local model zoo
#zoo_url = "degirum/hailo"


token = ''
device_type = "HAILORT/HAILO8"

#MODELS--------------------------------------------------
model_name = 'yolov8n'
#model_name = 'yolov5s_600'
#model_name = 'yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1'

#VIDEO SOURCE--------------------------------------------
video_source = 0  # use the USB camera
#video_source = '/home/rasp5/Desktop/video_test2.mp4'
#video_source = "/home/rasp5/DeGirum_hailo_examples/assets/Traffic.mp4"

# ZONES--------------------------------------------------
polygon_zones = [
    [[90, 200], [540, 200], [540, 400], [90, 400]],
    #[[90, 200], [540, 200], [540, 400], [90, 400]], #if wish more zones
]

# class config
class_list = ["minerio"]
#class_list = ["car"]
per_class_display = True
window_name = "AI Camera"

# FPS control
desired_fps = 60  # <---
frame_delay = 1 / desired_fps

# load model
model = dg.load_model(
    model_name=model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token,
    overlay_color=[(0, 255, 255)],
    output_class_set=set(class_list)
)

# zone counter
zone_counter = degirum_tools.ZoneCounter(
    polygon_zones,
    class_list=class_list,
    per_class_display=per_class_display,
    triggering_position=degirum_tools.AnchorPoint.CENTER,
    window_name=window_name,
)


degirum_tools.attach_analyzers(model, [zone_counter])

cap = cv2.VideoCapture(video_source)
if not cap.isOpened():
    raise RuntimeError(f"Failed to open video source: {video_source}")

# show results
with degirum_tools.Display(window_name) as display:
    while True:
        ret, frame = cap.read()
        if not ret:
            break

        start_time = time.time()
        inference_result = model.predict(frame)
        display.show(inference_result)
        elapsed_time = time.time() - start_time

        sleep_time = frame_delay - elapsed_time
        if sleep_time > 0:
            time.sleep(sleep_time)

cap.release()
cv2.destroyAllWindows()

At /home/rasp5/Desktop/degirum_test/yolov8n I have:

yolov8n.hef;

yolov8n.json:


{
    "ConfigVersion": 10,
    "DEVICE": [
        {
            "DeviceType": "HAILO8",
            "RuntimeAgent": "HAILORT",
            "SupportedDeviceTypes": "HAILORT/HAILO8",
            "ThreadPackSize": 6
        }
    ],
    "PRE_PROCESS": [
        {
            "InputN": 1,
            "InputH": 640,
            "InputW": 640,
            "InputC": 3,
            "InputQuantEn": true
        }
    ],
    "MODEL_PARAMETERS": [
        {
            "ModelPath": "yolov8n.hef"
        }
    ],
    "POST_PROCESS": [
        {
            "LabelsPath": "yolov8n_minerio.json",
            "OutputNumClasses": 2,
            "OutputPostprocessType": "DetectionYoloV8"
        }
    ]
}

yolov8n_minerio.json:


{
  "detection_threshold": 0.5,
  "max_boxes": 200,
  "labels": [
    "unlabeled",
    "minerio"
  ]
}
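
For reference, the folder layout implied by the paths above (an illustration assembled from this thread, not from DeGirum documentation) is:

```
/home/rasp5/Desktop/degirum_test/        <- zoo_url (local model zoo root)
└── yolov8n/                             <- one folder per model
    ├── yolov8n.json                     <- model config (base name = model_name)
    ├── yolov8n.hef                      <- compiled Hailo binary (ModelPath)
    └── yolov8n_minerio.json             <- labels file (LabelsPath)
```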

Any idea why I'm getting the error? As far as I can tell, the directory and file names are correct, but I couldn't get past this point.

I tried adding this to my .json right after "ConfigVersion": 10:

"Checksum": "926bf34651d94e850361ad272b141a61af0097e64e46f3a7519e7dff84c8f323",

Then i get:

(degirum_env) rasp5@raspberrypi:~/Desktop/degirum_test/yolov8n $ python3 /home/rasp5/hailo_examples/counting_test1.py 
degirum.exceptions.DegirumException: Model 'yolov8n' inference failed: [ERROR]Execution failed
Condition 'input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses' is not met: input_tensor->shape()[ 1 ] is 1, 4 + m_OutputNumClasses is 6
dg_postprocess_detection.cpp: 1612 [DG::DetectionPostprocessYoloV8::inputDataProcessBaseline]
When running model 'yolov8n'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/rasp5/hailo_examples/counting_test1.py", line 75, in <module>
    inference_result = model.predict(frame)
                       ^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/log.py", line 59, in wrap
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/model.py", line 225, in predict
    res = list(self._predict_impl(source))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/model.py", line 1230, in _predict_impl
    raise DegirumException(msg) from saved_exception
degirum.exceptions.DegirumException: Failed to perform model 'yolov8n' inference: Model 'yolov8n' inference failed: [ERROR]Execution failed
Condition 'input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses' is not met: input_tensor->shape()[ 1 ] is 1, 4 + m_OutputNumClasses is 6
dg_postprocess_detection.cpp: 1612 [DG::DetectionPostprocessYoloV8::inputDataProcessBaseline]
When running model 'yolov8n'
(degirum_env) rasp5@raspberrypi:~/Desktop/degirum_test/yolov8n $ 

Hi @RikLawliet
Welcome to the DeGirum community. Thank you for providing such detailed information on your problem.

If you compiled the yolov8n model yourself, it is likely that the hef file uses Hailo's built-in NMS, which needs a different type of postprocessor. You can see how to integrate such models in our guide: Hailo guide 3: Simplifying object detection on a Hailo device using DeGirum PySDK. Briefly, you need to replace DetectionYoloV8 with another postprocessor whose code is provided in the guide.
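
For readers following along: the guide replaces the built-in C++ postprocessor with a Python one referenced from the model JSON. The fragment below is a rough sketch from memory of that guide; the exact field names (in particular "PythonFile" and the script name) are assumptions and should be checked against the guide itself:

```json
"POST_PROCESS": [
    {
        "OutputPostprocessType": "Python",
        "PythonFile": "HailoDetectionYolo.py",
        "LabelsPath": "yolov8n_minerio.json",
        "OutputNumClasses": 2
    }
]
```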

1 Like

Hello, @shashi. Thank you for your reply. I followed the reference guides, but I didn't have any success. I tried changing my "OutputPostprocessType" field to various values, such as 'Detection', 'DetectionYolo', and the default 'DetectionYoloV8'. All failed… Based on the data I sent, do you have any other approaches or suggestions? I still don't understand why the error says "input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses", since the model has only 1 class (2 in my JSON, because "labels": ["unlabeled", "minerio"]).

Hi @RikLawliet
As I mentioned in my previous reply, and as explained in detail in the user guide, if you have a model that uses Hailo's built-in NMS, you need a model JSON similar to the one in the user guide, which specifies a Python postprocessor file. You can download this model for reference: DeGirum AI Hub. If you still cannot get the model to work, please provide us the model hef and labels file and we will prepare the model folder for you with the proper JSON and the Python postprocessor file.

1 Like

Hi @shashi. Right. As mentioned, my customized model was based on the following:

https://hub.degirum.com/degirum/models_hailort/yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1

As far as I can tell, all the files are pretty much the same (although I did tweak the .json labels, unfortunately without success).

You can find my files here (i will delete the link ASAP. Just let me know if you take it already): (deleted)

Hi @RikLawliet
You can delete the link. I downloaded the files.

1 Like

Hi @RikLawliet
You can find your ported model at: DeGirum AI Hub. I tested the model and it ran inference in the cloud without any problems. However, I do not have any images that contain whatever is the custom class you trained the model on. You mentioned that the model has only 1 class but your JSON had 2 classes: unlabeled and minerio. So, you may want to make sure it is set correctly. Please let me know if you run into issues running the model.

I tested it with only one class, but I still get the same error:

(degirum_env) rasp5@raspberrypi:~/Desktop/degirum_test/yolov8n $ python3 /home/rasp5/hailo_examples/counting_test1.py 
degirum.exceptions.DegirumException: Model 'yolov8n' inference failed: [ERROR]Execution failed
Condition 'input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses' is not met: input_tensor->shape()[ 1 ] is 1, 4 + m_OutputNumClasses is 5
dg_postprocess_detection.cpp: 1612 [DG::DetectionPostprocessYoloV8::inputDataProcessBaseline]
When running model 'yolov8n'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/rasp5/hailo_examples/counting_test1.py", line 74, in <module>
    inference_result = model.predict(frame)
                       ^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/log.py", line 59, in wrap
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/model.py", line 225, in predict
    res = list(self._predict_impl(source))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rasp5/hailo_examples/degirum_env/lib/python3.11/site-packages/degirum/model.py", line 1230, in _predict_impl
    raise DegirumException(msg) from saved_exception
degirum.exceptions.DegirumException: Failed to perform model 'yolov8n' inference: Model 'yolov8n' inference failed: [ERROR]Execution failed
Condition 'input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses' is not met: input_tensor->shape()[ 1 ] is 1, 4 + m_OutputNumClasses is 5
dg_postprocess_detection.cpp: 1612 [DG::DetectionPostprocessYoloV8::inputDataProcessBaseline]
When running model 'yolov8n'

Did you modify some info in the .json?

Hi @RikLawliet
Did you download the model assets that I provided? It appears that you did not follow all the instructions in the guide. For example, you need to specify a Python postprocessor and include the Python file in the model folder. Since you did not do this, you keep getting the same error. So, I took the model assets you provided and made sure that everything is configured properly. All you need to do is download the model folder from our AI Hub to your local folder and run it.

1 Like

I’ll do it now and get back to you soon with feedback. Thanks for all the support so far @shashi

1 Like

Wow… I have no idea what it was, but it worked, considering the two classes. I'll compare my files with the ones you sent to find the reason. I'm extremely grateful for your kindness and support, @shashi. Thank you very much indeed. One last question: as I mentioned at the beginning, do you think I can use conditional logic so that, when my class is inside the specified area, it changes the state of a Pi 5 GPIO? I'd appreciate any function or tip that could help me with this later.

For example, you need to specify a Python postprocessor and include the Python file in the model folder. Since you did not do this, you keep getting the same error.

Indeed. I see it now… :sweat_smile:
Thanks again for everything!

1 Like

Hi @RikLawliet
Glad to see it is working. Regarding GPIO, I will loop in our team to see if they can help.

1 Like

@vladk
Can you help @RikLawliet with the GPIO request?

To control GPIO using the lgpio package, first install it:

sudo apt update
sudo apt install python3-lgpio

Then you can control the pin:

import lgpio
import time

CHIP = 0            # usually 0 for Raspberry Pi; on some Raspberry Pi 5 kernels the header pins are on chip 4
GPIO_PIN = 17       # BCM pin 17 (physical pin 11)

# Open the GPIO chip
h = lgpio.gpiochip_open(CHIP)

# Claim the pin as an output, initially low
lgpio.gpio_claim_output(h, GPIO_PIN, 0)

# Drive the pin high (e.g. to energize a relay), then low again
lgpio.gpio_write(h, GPIO_PIN, 1)
time.sleep(1)
lgpio.gpio_write(h, GPIO_PIN, 0)

# Release the pin and close the chip when done
lgpio.gpio_free(h, GPIO_PIN)
lgpio.gpiochip_close(h)
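
Connecting this to the zone-counting script earlier in the thread: below is a minimal sketch (my own illustration, not a DeGirum or lgpio API; the detection dicts with "label" and "bbox" keys are an assumption modeled on typical detection results) of deciding when to drive the pin:

```python
# Sketch: decide whether any "minerio" detection sits inside a polygon zone.
# The detection format ({"label": ..., "bbox": [x1, y1, x2, y2]}) is an
# assumption, not a documented DeGirum structure -- adapt to the real fields.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: True if point (x, y) lies inside the polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal line through y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def zone_has_class(detections, polygon, target_label="minerio"):
    """True if any detection of target_label has its bbox center in the zone."""
    for det in detections:
        if det.get("label") != target_label:
            continue
        x1, y1, x2, y2 = det["bbox"]
        if point_in_polygon((x1 + x2) / 2, (y1 + y2) / 2, polygon):
            return True
    return False
```

Inside the inference loop, this could then drive the relay pin from the snippet above, e.g. `lgpio.gpio_write(h, GPIO_PIN, 1 if zone_has_class(dets, polygon_zones[0]) else 0)`.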