AWS SDK for Java 1.x API Reference - 1.12.795

We announced the upcoming end-of-support for AWS SDK for Java (v1). We recommend that you migrate to AWS SDK for Java v2. For dates, additional details, and information on how to migrate, please refer to the linked announcement.
com.amazonaws.services.sagemaker.model

Class InferenceComponentSpecification

    • Constructor Detail

      • InferenceComponentSpecification

        public InferenceComponentSpecification()
    • Method Detail

      • setModelName

        public void setModelName(String modelName)

        The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

        Parameters:
        modelName - The name of an existing SageMaker model object in your account that you want to deploy with the inference component.
      • getModelName

        public String getModelName()

        The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

        Returns:
        The name of an existing SageMaker model object in your account that you want to deploy with the inference component.
      • withModelName

        public InferenceComponentSpecification withModelName(String modelName)

        The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

        Parameters:
        modelName - The name of an existing SageMaker model object in your account that you want to deploy with the inference component.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setContainer

        public void setContainer(InferenceComponentContainerSpecification container)

        Defines a container that provides the runtime environment for a model that you deploy with an inference component.

        Parameters:
        container - Defines a container that provides the runtime environment for a model that you deploy with an inference component.
      • getContainer

        public InferenceComponentContainerSpecification getContainer()

        Defines a container that provides the runtime environment for a model that you deploy with an inference component.

        Returns:
        Defines a container that provides the runtime environment for a model that you deploy with an inference component.
      • withContainer

        public InferenceComponentSpecification withContainer(InferenceComponentContainerSpecification container)

        Defines a container that provides the runtime environment for a model that you deploy with an inference component.

        Parameters:
        container - Defines a container that provides the runtime environment for a model that you deploy with an inference component.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setStartupParameters

        public void setStartupParameters(InferenceComponentStartupParameters startupParameters)

        Settings that take effect while the model container starts up.

        Parameters:
        startupParameters - Settings that take effect while the model container starts up.
      • getStartupParameters

        public InferenceComponentStartupParameters getStartupParameters()

        Settings that take effect while the model container starts up.

        Returns:
        Settings that take effect while the model container starts up.
      • withStartupParameters

        public InferenceComponentSpecification withStartupParameters(InferenceComponentStartupParameters startupParameters)

        Settings that take effect while the model container starts up.

        Parameters:
        startupParameters - Settings that take effect while the model container starts up.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • setComputeResourceRequirements

        public void setComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)

        The compute resources allocated to run the model assigned to the inference component.

        Parameters:
        computeResourceRequirements - The compute resources allocated to run the model assigned to the inference component.
      • getComputeResourceRequirements

        public InferenceComponentComputeResourceRequirements getComputeResourceRequirements()

        The compute resources allocated to run the model assigned to the inference component.

        Returns:
        The compute resources allocated to run the model assigned to the inference component.
      • withComputeResourceRequirements

        public InferenceComponentSpecification withComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)

        The compute resources allocated to run the model assigned to the inference component.

        Parameters:
        computeResourceRequirements - The compute resources allocated to run the model assigned to the inference component.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • toString

        public String toString()

        Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value.

        Overrides:
        toString in class Object
        Returns:
        A string representation of this object.
        See Also:
        Object.toString()
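
The following is a minimal usage sketch showing how the fluent with* methods documented above (withModelName, withContainer, withStartupParameters, withComputeResourceRequirements) can be chained to build a specification and pass it to a CreateInferenceComponent call. The nested setters on the container, startup-parameter, compute-requirement, and runtime-config classes, the CreateInferenceComponentRequest accessors, and all names, URIs, and numeric values shown are assumptions based on the corresponding SageMaker API fields and are not documented on this page.

    import com.amazonaws.services.sagemaker.AmazonSageMaker;
    import com.amazonaws.services.sagemaker.AmazonSageMakerClientBuilder;
    import com.amazonaws.services.sagemaker.model.CreateInferenceComponentRequest;
    import com.amazonaws.services.sagemaker.model.InferenceComponentComputeResourceRequirements;
    import com.amazonaws.services.sagemaker.model.InferenceComponentContainerSpecification;
    import com.amazonaws.services.sagemaker.model.InferenceComponentRuntimeConfig;
    import com.amazonaws.services.sagemaker.model.InferenceComponentSpecification;
    import com.amazonaws.services.sagemaker.model.InferenceComponentStartupParameters;

    public class InferenceComponentSpecificationExample {
        public static void main(String[] args) {
            // Build the specification with the with* methods documented on this page.
            // Nested setters and all literal values below are illustrative assumptions.
            InferenceComponentSpecification spec = new InferenceComponentSpecification()
                    .withModelName("my-sagemaker-model")                       // existing SageMaker model (placeholder)
                    .withContainer(new InferenceComponentContainerSpecification()
                            .withArtifactUrl("s3://amzn-s3-demo-bucket/model.tar.gz"))
                    .withStartupParameters(new InferenceComponentStartupParameters()
                            .withContainerStartupHealthCheckTimeoutInSeconds(600))
                    .withComputeResourceRequirements(new InferenceComponentComputeResourceRequirements()
                            .withNumberOfCpuCoresRequired(2.0F)
                            .withMinMemoryRequiredInMb(4096));

            // toString() is useful for debugging; sensitive data is redacted.
            System.out.println(spec);

            // Pass the specification when creating the inference component.
            // Endpoint, variant, and component names are placeholders.
            AmazonSageMaker sageMaker = AmazonSageMakerClientBuilder.defaultClient();
            sageMaker.createInferenceComponent(new CreateInferenceComponentRequest()
                    .withInferenceComponentName("my-inference-component")
                    .withEndpointName("my-endpoint")
                    .withVariantName("AllTraffic")
                    .withSpecification(spec)
                    .withRuntimeConfig(new InferenceComponentRuntimeConfig().withCopyCount(1)));
        }
    }

Because every with* method returns the same InferenceComponentSpecification instance, the fluent style above and the equivalent set* calls (setModelName, setContainer, and so on) produce the same result; the with* form is simply more convenient for chaining.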