Hi! I’m new to Hailo8 and the model compilation process, running inference via Python on Google Colab. Happy to provide any other debugging information that might help resolve this issue.
Environment

- Platform: DeGirum Cloud (@cloud)
- Model Zoo: sivagnanam_maheshwaran/Hailo8_Deployyment
- Hardware: Hailo8

Issue Description
I’m experiencing inconsistent behavior with two similar OBB (Oriented Bounding Box) models. One model works perfectly while another fails with a tensor mismatch error, despite both being compiled for the same target hardware and having similar naming conventions.
Models Tested

Working Model

- Name: ShipDatset-obb--640x640_quant_hailort_hailo8_1
- Behavior: Inference runs successfully, outputs proper oriented bounding boxes

Failing Model

- Name: HRSID-obb--640x640_quant_hailort_hailo8_2
- Error: Mismatch in the number of box/prob tensors!
Error Details

```
DegirumException: Model 'sivagnanam_maheshwaran/Hailo8_Deployyment/HRSID-obb--640x640_quant_hailort_hailo8_2' inference failed: [ERROR]Incorrect value
Mismatch in the number of box/prob tensors!
dg_postprocess_detection.cpp: 1608 [DG::DetectionPostprocessYoloV8::findPostprocessorInputsOrder]
When running model 'sivagnanam_maheshwaran/Hailo8_Deployyment/HRSID-obb--640x640_quant_hailort_hailo8_2'
```
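For context, my understanding (inferred from the error message, not from DeGirum's source) is that the YOLOv8 detection postprocessor groups the model's output tensors into box tensors and class-probability tensors, one pair per detection scale, and fails if the counts don't line up. A simplified sketch of that kind of consistency check, with all names and shape conventions hypothetical:

```python
def check_box_prob_pairing(output_shapes, num_classes):
    """Hypothetical illustration: split output tensors into box tensors
    (last dim == 64, i.e. 4 coordinates x 16 DFL bins) and probability
    tensors (last dim == num_classes), then require matching counts."""
    box_tensors = [s for s in output_shapes if s[-1] == 64]
    prob_tensors = [s for s in output_shapes if s[-1] == num_classes]
    if len(box_tensors) != len(prob_tensors):
        raise ValueError("Mismatch in the number of box/prob tensors!")
    return len(box_tensors)  # number of detection scales


# A well-formed three-scale head: one box + one prob tensor per scale
ok = [(1, 80, 80, 64), (1, 80, 80, 15),
      (1, 40, 40, 64), (1, 40, 40, 15),
      (1, 20, 20, 64), (1, 20, 20, 15)]
print(check_box_prob_pairing(ok, num_classes=15))  # 3

# A head missing one prob tensor would trip the same kind of error
bad = ok[:-1]
try:
    check_box_prob_pairing(bad, num_classes=15)
except ValueError as e:
    print(e)  # Mismatch in the number of box/prob tensors!
```

If this reading is right, the failing model's HEF may simply expose a different number (or shape) of output tensors than the postprocessor JSON expects, even though both models were compiled with the same settings.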
Reproduction Code

```python
import degirum as dg
import cv2

# This WORKS
working_model = dg.load_model(
    model_name="ShipDatset-obb--640x640_quant_hailort_hailo8_1",
    inference_host_address="@cloud",
    zoo_url="sivagnanam_maheshwaran/Hailo8_Deployyment",
    token="[TOKEN]",
)

# This FAILS
failing_model = dg.load_model(
    model_name="HRSID-obb--640x640_quant_hailort_hailo8_2",
    inference_host_address="@cloud",
    zoo_url="sivagnanam_maheshwaran/Hailo8_Deployyment",
    token="[TOKEN]",
)

img = cv2.imread("test_image.jpg")

# Works fine
res_working = working_model(img)  # ✅ Success

# Throws tensor mismatch error
res_failing = failing_model(img)  # ❌ Fails
```
Both OBB models were compiled in the DeGirum AI Hub cloud compiler using the same settings, so I am confused as to why one of them works while the other does not.
