How to use the kfserving.V1alpha2InferenceServiceSpec function in kfserving

To help you get started, we’ve selected a few kfserving examples, based on popular ways it is used in public projects.


Example from kubeflow/pipelines — components/kubeflow/kfserving/src/kfservingdeployer.py
from kfserving import constants
from kfserving import V1alpha2InferenceService, V1alpha2InferenceServiceSpec


def InferenceService(metadata, default_model_spec, canary_model_spec=None, canary_model_traffic=None):
    # Wrap a default (and optional canary) endpoint spec in a V1alpha2InferenceService object.
    return V1alpha2InferenceService(
        api_version=constants.KFSERVING_GROUP + '/' + constants.KFSERVING_VERSION,
        kind=constants.KFSERVING_KIND,
        metadata=metadata,
        spec=V1alpha2InferenceServiceSpec(default=default_model_spec,
                                          canary=canary_model_spec,
                                          canary_traffic_percent=canary_model_traffic))
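
For context, here is one way the helper above might be called with the kfserving 0.x Python SDK. This is a minimal sketch, not part of the original file: the service name, namespace, and model storage URI are placeholder values, and the surrounding objects (V1alpha2EndpointSpec, V1alpha2PredictorSpec, V1alpha2TensorflowSpec, KFServingClient) come from the same SDK that provides V1alpha2InferenceServiceSpec.

from kubernetes import client
from kfserving import (KFServingClient, V1alpha2EndpointSpec,
                       V1alpha2PredictorSpec, V1alpha2TensorflowSpec)

# Default endpoint spec pointing at a placeholder TensorFlow model location.
default_model_spec = V1alpha2EndpointSpec(
    predictor=V1alpha2PredictorSpec(
        tensorflow=V1alpha2TensorflowSpec(
            storage_uri='gs://my-bucket/my-model')))

# Placeholder name and namespace for the InferenceService resource.
metadata = client.V1ObjectMeta(name='my-inference-service',
                               namespace='kfserving-test')

# Build the InferenceService object with the helper above and submit it to the cluster.
isvc = InferenceService(metadata, default_model_spec)
KFServingClient().create(isvc, namespace='kfserving-test')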