
AWS SDK for Java 1.x API Reference - 1.12.795

We have announced the upcoming end of support for the AWS SDK for Java (v1). We recommend that you migrate to the AWS SDK for Java v2. For dates, additional details, and information on how to migrate, see the announcement.
com.amazonaws.services.sagemaker.model

Class ModelPackageContainerDefinition

    • Constructor Detail

      • ModelPackageContainerDefinition

        public ModelPackageContainerDefinition()
    • Method Detail

      • setContainerHostname

        public void setContainerHostname(String containerHostname)

        The DNS host name for the Docker container.

        Parameters:
        containerHostname - The DNS host name for the Docker container.
      • getContainerHostname

        public String getContainerHostname()

        The DNS host name for the Docker container.

        Returns:
        The DNS host name for the Docker container.
      • withContainerHostname

        public ModelPackageContainerDefinition withContainerHostname(String containerHostname)

        The DNS host name for the Docker container.

        Parameters:
        containerHostname - The DNS host name for the Docker container.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
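The with* accessors above return the object itself, which is the SDK v1 fluent pattern. A minimal sketch of setting and reading the hostname (assuming the aws-java-sdk-sagemaker artifact is on the classpath; the hostname value is a hypothetical placeholder):

```java
import com.amazonaws.services.sagemaker.model.ModelPackageContainerDefinition;

public class ContainerHostnameExample {
    public static void main(String[] args) {
        // withContainerHostname returns the same instance, so calls chain
        ModelPackageContainerDefinition container = new ModelPackageContainerDefinition()
                .withContainerHostname("inference-container-1"); // hypothetical DNS host name

        System.out.println(container.getContainerHostname());
    }
}
```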
      • setImage

        public void setImage(String image)

        The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored.

        If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

        Parameters:
        image - The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored.

        If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

      • getImage

        public String getImage()

        The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored.

        If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

        Returns:
        The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored.

        If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

      • withImage

        public ModelPackageContainerDefinition withImage(String image)

        The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored.

        If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

        Parameters:
        image - The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored.

        If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

        Returns:
        Returns a reference to this object so that method calls can be chained together.
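Both ECR path formats described above, registry/repository[:tag] and registry/repository[@digest], are plain strings to this model class; the account ID, Region, repository name, and digest below are hypothetical placeholders, and the SDK model setters do not validate the format client-side:

```java
import com.amazonaws.services.sagemaker.model.ModelPackageContainerDefinition;

public class ImagePathExample {
    public static void main(String[] args) {
        // Tag-based ECR path (placeholder account/repository)
        String byTag = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-repo:1.0";
        // Digest-based ECR path (placeholder digest)
        String byDigest = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-repo"
                + "@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";

        ModelPackageContainerDefinition container = new ModelPackageContainerDefinition()
                .withImage(byTag); // or .withImage(byDigest)

        System.out.println(container.getImage());
    }
}
```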
      • setImageDigest

        public void setImageDigest(String imageDigest)

        An MD5 hash of the training algorithm that identifies the Docker image used for training.

        Parameters:
        imageDigest - An MD5 hash of the training algorithm that identifies the Docker image used for training.
      • getImageDigest

        public String getImageDigest()

        An MD5 hash of the training algorithm that identifies the Docker image used for training.

        Returns:
        An MD5 hash of the training algorithm that identifies the Docker image used for training.
      • withImageDigest

        public ModelPackageContainerDefinition withImageDigest(String imageDigest)

        An MD5 hash of the training algorithm that identifies the Docker image used for training.

        Parameters:
        imageDigest - An MD5 hash of the training algorithm that identifies the Docker image used for training.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setModelDataUrl

        public void setModelDataUrl(String modelDataUrl)

        The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).

        The model artifacts must be in an S3 bucket that is in the same region as the model package.

        Parameters:
        modelDataUrl - The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).

        The model artifacts must be in an S3 bucket that is in the same region as the model package.

      • getModelDataUrl

        public String getModelDataUrl()

        The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).

        The model artifacts must be in an S3 bucket that is in the same region as the model package.

        Returns:
        The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).

        The model artifacts must be in an S3 bucket that is in the same region as the model package.

      • withModelDataUrl

        public ModelPackageContainerDefinition withModelDataUrl(String modelDataUrl)

        The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).

        The model artifacts must be in an S3 bucket that is in the same region as the model package.

        Parameters:
        modelDataUrl - The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).

        The model artifacts must be in an S3 bucket that is in the same region as the model package.

        Returns:
        Returns a reference to this object so that method calls can be chained together.
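A sketch of supplying the model artifact location (the bucket and key below are hypothetical; per the description above, the path must point to a single gzip-compressed tar archive in an S3 bucket in the same region as the model package):

```java
import com.amazonaws.services.sagemaker.model.ModelPackageContainerDefinition;

public class ModelDataUrlExample {
    public static void main(String[] args) {
        // Hypothetical S3 path to a single .tar.gz archive of model artifacts
        ModelPackageContainerDefinition container = new ModelPackageContainerDefinition()
                .withModelDataUrl("s3://my-model-bucket/output/model.tar.gz");

        System.out.println(container.getModelDataUrl());
    }
}
```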
      • setModelDataSource

        public void setModelDataSource(ModelDataSource modelDataSource)

        Specifies the location of ML model data to deploy during endpoint creation.

        Parameters:
        modelDataSource - Specifies the location of ML model data to deploy during endpoint creation.
      • getModelDataSource

        public ModelDataSource getModelDataSource()

        Specifies the location of ML model data to deploy during endpoint creation.

        Returns:
        Specifies the location of ML model data to deploy during endpoint creation.
      • withModelDataSource

        public ModelPackageContainerDefinition withModelDataSource(ModelDataSource modelDataSource)

        Specifies the location of ML model data to deploy during endpoint creation.

        Parameters:
        modelDataSource - Specifies the location of ML model data to deploy during endpoint creation.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setProductId

        public void setProductId(String productId)

        The Amazon Web Services Marketplace product ID of the model package.

        Parameters:
        productId - The Amazon Web Services Marketplace product ID of the model package.
      • getProductId

        public String getProductId()

        The Amazon Web Services Marketplace product ID of the model package.

        Returns:
        The Amazon Web Services Marketplace product ID of the model package.
      • withProductId

        public ModelPackageContainerDefinition withProductId(String productId)

        The Amazon Web Services Marketplace product ID of the model package.

        Parameters:
        productId - The Amazon Web Services Marketplace product ID of the model package.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • getEnvironment

        public Map<String,String> getEnvironment()

        The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.

        Returns:
        The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.
      • setEnvironment

        public void setEnvironment(Map<String,String> environment)

        The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.

        Parameters:
        environment - The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.
      • withEnvironment

        public ModelPackageContainerDefinition withEnvironment(Map<String,String> environment)

        The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.

        Parameters:
        environment - The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • clearEnvironmentEntries

        public ModelPackageContainerDefinition clearEnvironmentEntries()
        Removes all the entries added into Environment.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
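A sketch of populating the environment map and then clearing it (the variable names and values are hypothetical placeholders; the limits noted above are up to 1024 characters per key and value and up to 16 entries):

```java
import java.util.HashMap;
import java.util.Map;
import com.amazonaws.services.sagemaker.model.ModelPackageContainerDefinition;

public class EnvironmentExample {
    public static void main(String[] args) {
        // Hypothetical container environment variables
        Map<String, String> env = new HashMap<>();
        env.put("SAGEMAKER_PROGRAM", "inference.py");
        env.put("SAGEMAKER_SUBMIT_DIRECTORY", "/opt/ml/model/code");

        ModelPackageContainerDefinition container = new ModelPackageContainerDefinition()
                .withEnvironment(env);

        System.out.println(container.getEnvironment().size()); // prints 2

        // Removes all entries added into Environment; chainable like the with* methods
        container.clearEnvironmentEntries();
    }
}
```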
      • setModelInput

        public void setModelInput(ModelInput modelInput)

        A structure with Model Input details.

        Parameters:
        modelInput - A structure with Model Input details.
      • getModelInput

        public ModelInput getModelInput()

        A structure with Model Input details.

        Returns:
        A structure with Model Input details.
      • withModelInput

        public ModelPackageContainerDefinition withModelInput(ModelInput modelInput)

        A structure with Model Input details.

        Parameters:
        modelInput - A structure with Model Input details.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setFramework

        public void setFramework(String framework)

        The machine learning framework of the model package container image.

        Parameters:
        framework - The machine learning framework of the model package container image.
      • getFramework

        public String getFramework()

        The machine learning framework of the model package container image.

        Returns:
        The machine learning framework of the model package container image.
      • withFramework

        public ModelPackageContainerDefinition withFramework(String framework)

        The machine learning framework of the model package container image.

        Parameters:
        framework - The machine learning framework of the model package container image.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setFrameworkVersion

        public void setFrameworkVersion(String frameworkVersion)

        The framework version of the Model Package Container Image.

        Parameters:
        frameworkVersion - The framework version of the Model Package Container Image.
      • getFrameworkVersion

        public String getFrameworkVersion()

        The framework version of the Model Package Container Image.

        Returns:
        The framework version of the Model Package Container Image.
      • withFrameworkVersion

        public ModelPackageContainerDefinition withFrameworkVersion(String frameworkVersion)

        The framework version of the Model Package Container Image.

        Parameters:
        frameworkVersion - The framework version of the Model Package Container Image.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setNearestModelName

        public void setNearestModelName(String nearestModelName)

        The name of a pre-trained machine learning model, benchmarked by Amazon SageMaker Inference Recommender, that matches your model. You can find a list of benchmarked models by calling ListModelMetadata.

        Parameters:
        nearestModelName - The name of a pre-trained machine learning model, benchmarked by Amazon SageMaker Inference Recommender, that matches your model. You can find a list of benchmarked models by calling ListModelMetadata.
      • getNearestModelName

        public String getNearestModelName()

        The name of a pre-trained machine learning model, benchmarked by Amazon SageMaker Inference Recommender, that matches your model. You can find a list of benchmarked models by calling ListModelMetadata.

        Returns:
        The name of a pre-trained machine learning model, benchmarked by Amazon SageMaker Inference Recommender, that matches your model. You can find a list of benchmarked models by calling ListModelMetadata.
      • withNearestModelName

        public ModelPackageContainerDefinition withNearestModelName(String nearestModelName)

        The name of a pre-trained machine learning model, benchmarked by Amazon SageMaker Inference Recommender, that matches your model. You can find a list of benchmarked models by calling ListModelMetadata.

        Parameters:
        nearestModelName - The name of a pre-trained machine learning model, benchmarked by Amazon SageMaker Inference Recommender, that matches your model. You can find a list of benchmarked models by calling ListModelMetadata.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setAdditionalS3DataSource

        public void setAdditionalS3DataSource(AdditionalS3DataSource additionalS3DataSource)

        The additional data source that is used during inference in the Docker container for your model package.

        Parameters:
        additionalS3DataSource - The additional data source that is used during inference in the Docker container for your model package.
      • getAdditionalS3DataSource

        public AdditionalS3DataSource getAdditionalS3DataSource()

        The additional data source that is used during inference in the Docker container for your model package.

        Returns:
        The additional data source that is used during inference in the Docker container for your model package.
      • withAdditionalS3DataSource

        public ModelPackageContainerDefinition withAdditionalS3DataSource(AdditionalS3DataSource additionalS3DataSource)

        The additional data source that is used during inference in the Docker container for your model package.

        Parameters:
        additionalS3DataSource - The additional data source that is used during inference in the Docker container for your model package.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • toString

        public String toString()
        Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value.
        Overrides:
        toString in class Object
        Returns:
        A string representation of this object.
        See Also:
        Object.toString()