In this post, I’ll provide an overview of the `connect` function in PySDK. The `connect` function connects to the AI inference host and model zoo of your choice. It is the main PySDK entry point: you start your work with PySDK by calling this function.
```python
degirum.connect(inference_host_address, zoo_url=None, token=None)
```
Parameters
Name | Type | Description | Default
---|---|---|---
`inference_host_address` | `str` | Inference engine designator; it defines which inference engine to use.<br>For AI Server-based inference, it is the hostname or IP address of the AI Server host, optionally followed by a port number in the form `:port`.<br>For DeGirum AI Hub-based inference, it is the string `"@cloud"` or the `degirum.CLOUD` constant.<br>For local inference, it is the string `"@local"` or the `degirum.LOCAL` constant. | *required*
`zoo_url` | `Optional[str]` | Model zoo URL string which defines the model zoo to operate with.<br>For a hosted model zoo, it is specified in the format `<cloud server prefix>[/<zoo suffix>]`. The `<cloud server prefix>` part is the AI Hub root URL, typically `https://cs.degirum.com`. The optional `<zoo suffix>` part is the hosted zoo URL suffix in the form `<organization>/<model zoo name>`; you can confirm it by logging in to your hosted user account and opening the model zoo management page. If `<zoo suffix>` is not specified, the DeGirum public model zoo `degirum/public` is used.<br>For AI Server-based inference, you may omit both the `zoo_url` and `token` parameters; in this case, the locally deployed model zoo of the AI Server is used.<br>For local inference with a particular AI model from your local drive, set `zoo_url` to the path of that model’s `.json` configuration file; the `token` parameter is not needed in this case. | `None`
`token` | `Optional[str]` | Cloud API access token used to access the hosted zoo. To obtain a token, open a user account on the DeGirum AI Hub, log in, and generate an API access token on the token generation page. | `None`
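To make the `zoo_url` format concrete, here is a small helper that assembles a hosted zoo URL from the cloud server prefix and an optional zoo suffix. The helper name is my own for illustration; it is not part of PySDK.

```python
from typing import Optional


def hosted_zoo_url(suffix: Optional[str] = None,
                   prefix: str = "https://cs.degirum.com") -> str:
    """Build a hosted model zoo URL in the form <prefix>[/<suffix>].

    When no suffix is given, the bare AI Hub root URL is enough:
    PySDK then falls back to the DeGirum public zoo (degirum/public).
    """
    return prefix if suffix is None else f"{prefix}/{suffix}"


# Public zoo: just the AI Hub root URL
print(hosted_zoo_url())                  # https://cs.degirum.com
# A specific hosted zoo: <organization>/<model zoo name> suffix
print(hosted_zoo_url("degirum/public"))  # https://cs.degirum.com/degirum/public
```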
Returns
Type | Description
---|---
`ZooManager` | An instance of the Model Zoo manager object, configured to work with the AI inference host and model zoo of your choice.
Supported Use Cases
Inference Type | Hosted Zoo | Local Zoo | Single File Zoo
---|---|---|---
Hosted inference | ✓ | |
AI server inference | ✓ | ✓ |
Local inference | ✓ | | ✓
Model Zoo URL Cheat Sheet
In the table below, `dg` stands for the PySDK module imported as `import degirum as dg`.

Inference Type | Model Zoo Type | `connect()` parameters
---|---|---
Hosted inference | Hosted zoo | `zoo = dg.connect(dg.CLOUD, "https://cs.degirum.com[/<zoo URL>]", "<token>")`
AI server inference | Hosted zoo | `zoo = dg.connect("<hostname>", "https://cs.degirum.com[/<zoo URL>]", "<token>")`
AI server inference | Local zoo | `zoo = dg.connect("<hostname>")`
Local inference | Hosted zoo | `zoo = dg.connect(dg.LOCAL, "https://cs.degirum.com[/<zoo URL>]", "<token>")`
Local inference | Local file | `zoo = dg.connect(dg.LOCAL, "/path/to/model.json")`
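The cheat-sheet rows can be captured as a small pure-Python lookup that returns the positional arguments for `connect()` given an inference type and zoo type. This is a sketch of my own, not a PySDK facility; `"@cloud"` and `"@local"` are the string forms of `dg.CLOUD` and `dg.LOCAL`, and `"<hostname>"`, `"<token>"`, and the file path are placeholders exactly as in the table above.

```python
def connect_args(inference: str, zoo: str,
                 zoo_url: str = "https://cs.degirum.com",
                 token: str = "<token>") -> tuple:
    """Return the positional argument tuple for degirum.connect()
    matching one cheat-sheet row; raise on unsupported combinations."""
    table = {
        ("hosted", "hosted zoo"): ("@cloud", zoo_url, token),
        ("ai server", "hosted zoo"): ("<hostname>", zoo_url, token),
        ("ai server", "local zoo"): ("<hostname>",),  # zoo_url/token omitted
        ("local", "hosted zoo"): ("@local", zoo_url, token),
        ("local", "local file"): ("@local", "/path/to/model.json"),
    }
    try:
        return table[(inference, zoo)]
    except KeyError:
        raise ValueError(f"unsupported combination: {inference!r} + {zoo!r}")


# AI server inference against its locally deployed zoo needs only the host:
print(connect_args("ai server", "local zoo"))  # ('<hostname>',)
```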