Short description of my project: detect/track the cutest cat on earth (by the way, my cat).
Longer description: my inference script, based on hailo-rpi5-examples/basic_pipelines/detection.py, sends a notification via Gotify to my Android phone when a cat, a person, or both are detected, and additionally writes the detection to a database. What to track and notify about is governed by an Android app I developed: the selection is written to labels.json and read by my custom detection.py. Then I decided to fine-tune my own model for this purpose.
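For context, the labels.json mechanism boils down to the following (a simplified sketch; the assumption here is that labels.json is a plain JSON list of label strings, and the function names are just for illustration):

```python
import json

def load_selected_labels(path="labels.json"):
    # Selection written by the Android app; assumed format: ["cat", "person"]
    with open(path) as f:
        return set(json.load(f))

def should_report(detected_labels, selected):
    # Only labels the app asked for trigger a notification/database write
    return set(detected_labels) & selected
```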
I fine-tuned YOLO11s.pt on cat and person and uploaded best.pt to the DeGirum Compiler.
I wish I could say more about the Compiler, but all I can say is: it was the task I feared most, and it turned out to be the easiest and fastest part of my project. Well done!
My model works as expected, but because I was using class_ids instead of labels, my cat was a person and I was a bicycle, despite my using the labels-json parameter. I couldn't convince the Hailo pipeline to use my custom label file, nor persuade the cat to go to work instead of me, so I decided to give the DeGirum PySDK a second chance (when I started with Hailo, I couldn't get a stable camera stream with the DeGirum PySDK).
I'm impressed by how much PySDK has improved. I have a stable stream now, and on top of that I'm a person and my cat is a cat. Long story short: I want to switch from the Hailo/GStreamer pipelines to DeGirum PySDK and tools. Now I wonder how best to handle the Apprise/Gotify and database tasks. I found EventDetector and EventNotifier, but I'm unsure how to use them correctly. I found the docs and smart_nvr.ipynb, but failed to adapt them to my needs, which are: if a detection result is in the label file from the Android app (labels.json), then do something, i.e. send a message to Gotify. I don't even know whether I need those classes or whether I should write this myself. I love the idea of gizmos and assume this should go into a gizmo pipeline (or whatever it is called), but I'm unsure about that, too. Any help is highly appreciated.
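To make the goal concrete, this is the kind of logic I have in mind, written without EventDetector/EventNotifier, just degirum_tools.predict_stream plus plain Apprise (only a sketch; the labels.json format, the cooldown value, and the Gotify URL placeholder are my assumptions):

```python
import json
import time

def load_wanted(path="labels.json"):
    # Label selection written by the Android app (assumed: JSON list of strings)
    with open(path) as f:
        return set(json.load(f))

def pick_hits(result_labels, wanted, now, last_sent, cooldown=30.0):
    # Pure helper: which detected labels to notify about, honoring a cooldown
    hits = set(result_labels) & wanted
    return hits if hits and now - last_sent >= cooldown else set()

def run_detection_loop():
    # SDK imports kept inside so the helpers above work without the hardware
    import degirum as dg
    import degirum_tools
    import apprise

    apobj = apprise.Apprise()
    apobj.add("gotify-address here")  # your Gotify URL

    model = dg.load_model(
        model_name="mymodel--640x640_quant_hailort_hailo8_1",
        inference_host_address="@local",
        zoo_url="/MyModels/mymodel--640x640_quant_hailort_hailo8_1",
    )

    last_sent = 0.0
    for result in degirum_tools.predict_stream(model, "rtsp://myserver/input"):
        hits = pick_hits(
            (det["label"] for det in result.results),  # PySDK detection dicts
            load_wanted(), time.time(), last_sent,
        )
        if hits:
            apobj.notify(title="Detection",
                         body="Detected: " + ", ".join(sorted(hits)))
            last_sent = time.time()
```

Is something like this reasonable, or is this exactly what EventDetector/EventNotifier are meant to replace?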
Here is my code, pretty standard, I think:
import degirum as dg
import degirum_tools
import degirum_tools.streams as dgstreams
import apprise

# ==== Setup ====
video_source = "rtsp://myserver/input"

# Load the model
model = dg.load_model(
    model_name="mymodel--640x640_quant_hailort_hailo8_1",
    inference_host_address="@local",
    zoo_url="/MyModels/mymodel--640x640_quant_hailort_hailo8_1",
    overlay_show_probabilities=True,
    overlay_line_width=1,
)

# ==== Setup Gotify ====
apobj = apprise.Apprise()
apobj.add("gotify-address here")
notification_config = {
    "apprise": apobj,  # EventNotifier talks to Apprise
}

# ==== Gizmo rulez ====
source = dgstreams.VideoSourceGizmo(video_source)
# resize = dgstreams.ResizingGizmo(640, 640)
detection = dgstreams.AiSimpleGizmo(model, allow_drop=True)
# display = dgstreams.VideoDisplayGizmo("Detection", allow_drop=True, show_ai_overlay=True, show_fps=True)
streamout = dgstreams.VideoStreamerGizmo("rtsp://myserver/output", show_ai_overlay=True)

dgstreams.Composition(source >> detection >> streamout).start()
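If this belongs in the gizmo pipeline, I imagine a custom pass-through gizmo roughly like the one below, wired in between detection and streamout. This is purely a sketch: I am guessing at the Gizmo base-class constructor arguments, at the tag_inference meta lookup, and at how AiSimpleGizmo attaches its results, so please correct me where I'm wrong:

```python
import json
import time

def notify_labels(result_labels, wanted, now, last_sent, cooldown=30.0):
    # Pure filter: labels worth notifying about, rate-limited by a cooldown
    hits = set(result_labels) & wanted
    return hits if hits and now - last_sent >= cooldown else set()

def make_notify_gizmo(apobj, labels_path="labels.json", cooldown=30.0):
    # Import inside so the helper above is usable without the SDK installed
    import degirum_tools.streams as dgstreams

    class NotifyGizmo(dgstreams.Gizmo):
        """Pass-through gizmo: forwards every frame unchanged and sends a
        Gotify message via Apprise when a selected label is detected.
        ASSUMPTIONS: the (queue size, allow_drop) constructor tuple and the
        tag_inference meta lookup are my guesses at the streams toolkit API."""

        def __init__(self):
            super().__init__([(10, False)])  # one input: (queue size, allow_drop)?
            self._last_sent = 0.0

        def run(self):
            for data in self.get_input(0):
                # Guess: AiSimpleGizmo attaches results under tag_inference
                inference = data.meta.find_last(dgstreams.tag_inference)
                if inference is not None:
                    with open(labels_path) as f:
                        wanted = set(json.load(f))  # assumed JSON list of labels
                    hits = notify_labels(
                        (det["label"] for det in inference.results),
                        wanted, time.time(), self._last_sent, cooldown,
                    )
                    if hits:
                        apobj.notify(title="Detection",
                                     body="Detected: " + ", ".join(sorted(hits)))
                        self._last_sent = time.time()
                self.send_result(data)  # pass frame through unchanged

    return NotifyGizmo()
```

The idea would then be `source >> detection >> make_notify_gizmo(apobj) >> streamout` in the Composition. Is a custom gizmo the intended way, or should this be an analyzer (EventDetector/EventNotifier) attached to the model instead?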