How to use the fire.decorators.SetParseFns function in fire

To help you get started, we’ve selected a few fire examples based on popular ways it is used in public projects.

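fire.decorators.SetParseFns overrides Fire's default argument parsing on a per-argument basis: instead of letting Fire guess the type of a command-line value (so that 1.10 becomes the float 1.1), the named arguments are passed through the given parse function. A minimal sketch, assuming only that python-fire is installed (the function and flag names below are illustrative, not from the projects listed here):

import fire
from fire import decorators

@decorators.SetParseFns(version=str)
def show_version(version=None):
    # Without SetParseFns, Fire would parse "--version 1.10" as the
    # float 1.1; forcing str keeps the exact text "1.10".
    return '%s (%s)' % (version, type(version).__name__)

if __name__ == '__main__':
    fire.Fire(show_version)

Running python show_version.py --version 1.10 prints 1.10 (str) rather than 1.1 (float).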

Example from kubeflow/pipelines: components/gcp/container/component_sdk/python/kfp_component/google/dataproc/_create_cluster.py (view on GitHub)
@decorators.SetParseFns(image_version=str)
def create_cluster(project_id, region, name=None, name_prefix=None,
    initialization_actions=None, config_bucket=None, image_version=None,
    cluster=None, wait_interval=30):
    """Creates a DataProc cluster under a project.

    Args:
        project_id (str): Required. The ID of the Google Cloud Platform project 
            that the cluster belongs to.
        region (str): Required. The Cloud Dataproc region in which to handle the 
            request.
        name (str): Optional. The cluster name. Cluster names within a project
            must be unique. Names of deleted clusters can be reused.
        name_prefix (str): Optional. The prefix of the cluster name.
        initialization_actions (list): Optional. List of GCS URIs of executables 
            to execute on each node after config is completed. By default,
            executables are run on master and all worker nodes. 
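In this component, image_version=str prevents Fire from coercing a Dataproc image version such as 1.5 into a float once the function is exposed as a command-line entry point. A hedged sketch of such an entry point (the import path follows the file layout above, but the wrapper itself is an assumption, not code from the repo):

# Hypothetical CLI wrapper for the component above.
import fire
from kfp_component.google.dataproc._create_cluster import create_cluster

if __name__ == '__main__':
    # Thanks to SetParseFns(image_version=str), a flag such as
    # "--image_version 1.5" reaches create_cluster as the string "1.5".
    fire.Fire(create_cluster)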

Example from kubeflow/pipelines: components/gcp/container/component_sdk/python/kfp_component/google/ml_engine/_create_version.py (view on GitHub)
@decorators.SetParseFns(python_version=str, runtime_version=str)
def create_version(model_name, deployment_uri=None, version_id=None,
    runtime_version=None, python_version=None, version=None,
    replace_existing=False, wait_interval=30):
    """Creates an ML Engine version and waits for the operation to complete.

    Args:
        model_name (str): required, the name of the parent model.
        deployment_uri (str): optional, the Google Cloud Storage location of 
            the trained model used to create the version.
        version_id (str): optional, the user provided short name of 
            the version. If it is not provided, the operation uses a random name.
        runtime_version (str): optional, the Cloud ML Engine runtime version
            to use for this deployment. If not set, Cloud ML Engine uses 
            the default stable version, 1.0. 
        python_version (str): optional, the version of Python used in prediction.
            If not set, the default version is '2.7'. Python '3.5' is available

Example from kubeflow/pipelines: components/gcp/container/component_sdk/python/kfp_component/google/ml_engine/_deploy.py (view on GitHub)
@decorators.SetParseFns(python_version=str, runtime_version=str)
def deploy(model_uri, project_id, model_id=None, version_id=None, 
    runtime_version=None, python_version=None, model=None, version=None, 
    replace_existing_version=False, set_default=False, wait_interval=30):
    """Deploy a model to MLEngine from GCS URI

    Args:
        model_uri (str): Required, the GCS URI which contains a model file.
            If no model file is found, the same path will be treated as an export
            base directory of a TF Estimator. The last time-stamped sub-directory
            will be chosen as model URI.
        project_id (str): required, the ID of the parent project.
        model_id (str): optional, the user provided name of the model.
        version_id (str): optional, the user provided name of the version. 
            If it is not provided, the operation uses a random name.
        runtime_version (str): optional, the Cloud ML Engine runtime version
            to use for this deployment. If not set, Cloud ML Engine uses 

Example from kubeflow/pipelines: components/gcp/container/component_sdk/python/kfp_component/google/ml_engine/_train.py (view on GitHub)
@decorators.SetParseFns(python_version=str, runtime_version=str)
def train(project_id, python_module=None, package_uris=None, 
    region=None, args=None, job_dir=None, python_version=None, 
    runtime_version=None, master_image_uri=None, worker_image_uri=None, 
    training_input=None, job_id_prefix=None, wait_interval=30):
    """Creates a MLEngine training job.

    Args:
        project_id (str): Required. The ID of the parent project of the job.
        python_module (str): Required. The Python module name to run after 
            installing the packages.
        package_uris (list): Required. The Google Cloud Storage location of 
            the packages with the training program and any additional 
            dependencies. The maximum number of package URIs is 100.
        region (str): Required. The Google Compute Engine region to run the 
            training job in.
        args (list): Command line arguments to pass to the program.
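All three ML Engine examples pin python_version and runtime_version to str for the same reason: version strings such as '1.10' or '3.5' would otherwise be parsed as floats, and '1.10' would silently become 1.1. A small self-contained comparison, assuming only python-fire (the function names are illustrative):

import fire
from fire import decorators

def untyped(runtime_version=None):
    # Fire's default parsing turns "1.10" into the float 1.1.
    return type(runtime_version).__name__

@decorators.SetParseFns(runtime_version=str)
def typed(runtime_version=None):
    # The parse function forces the raw text through unchanged.
    return type(runtime_version).__name__

if __name__ == '__main__':
    fire.Fire(untyped, command=['--runtime_version', '1.10'])  # prints: float
    fire.Fire(typed, command=['--runtime_version', '1.10'])    # prints: str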