public final class ImageClassificationModelMetadata extends GeneratedMessageV3 implements ImageClassificationModelMetadataOrBuilder
Model metadata for image classification.
Protobuf type google.cloud.automl.v1beta1.ImageClassificationModelMetadata
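Instances are immutable and are created through the nested ImageClassificationModelMetadata.Builder. The following is a minimal sketch of building the metadata for a training request; the setters shown are the standard protobuf-generated Builder methods for the fields documented on this page.
import com.google.cloud.automl.v1beta1.ImageClassificationModelMetadata;

public class BuildMetadataExample {
  public static void main(String[] args) {
    // Build metadata for a cloud-hosted model with a one-hour training budget.
    ImageClassificationModelMetadata metadata =
        ImageClassificationModelMetadata.newBuilder()
            .setModelType("cloud")  // see getModelType() for the available values
            .setTrainBudget(1L)     // int64 train_budget = 2; expressed in hours
            .build();
    System.out.println(metadata);
  }
}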
Static Fields
public static final int BASE_MODEL_ID_FIELD_NUMBER
Field Value
public static final int MODEL_TYPE_FIELD_NUMBER
Field Value
public static final int NODE_COUNT_FIELD_NUMBER
Field Value
public static final int NODE_QPS_FIELD_NUMBER
Field Value
public static final int STOP_REASON_FIELD_NUMBER
Field Value
public static final int TRAIN_BUDGET_FIELD_NUMBER
Field Value
public static final int TRAIN_COST_FIELD_NUMBER
Field Value
Static Methods
public static ImageClassificationModelMetadata getDefaultInstance()
Returns
public static final Descriptors.Descriptor getDescriptor()
Returns
public static ImageClassificationModelMetadata.Builder newBuilder()
Returns
public static ImageClassificationModelMetadata.Builder newBuilder(ImageClassificationModelMetadata prototype)
Parameter
Returns
public static ImageClassificationModelMetadata parseDelimitedFrom(InputStream input)
Parameter
Returns
Exceptions
public static ImageClassificationModelMetadata parseDelimitedFrom(InputStream input, ExtensionRegistryLite extensionRegistry)
Parameters
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(byte[] data)
Parameter
Name | Description
data | byte[]
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(byte[] data, ExtensionRegistryLite extensionRegistry)
Parameters
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(ByteString data)
Parameter
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(ByteString data, ExtensionRegistryLite extensionRegistry)
Parameters
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(CodedInputStream input)
Parameter
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
Parameters
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(InputStream input)
Parameter
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(InputStream input, ExtensionRegistryLite extensionRegistry)
Parameters
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(ByteBuffer data)
Parameter
Returns
Exceptions
public static ImageClassificationModelMetadata parseFrom(ByteBuffer data, ExtensionRegistryLite extensionRegistry)
Parameters
Returns
Exceptions
public static Parser<ImageClassificationModelMetadata> parser()
Returns
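The static parseFrom overloads and parser() deserialize previously serialized messages. A rough sketch of a byte-array round trip (toByteArray() is inherited from the protobuf message base class):
import com.google.cloud.automl.v1beta1.ImageClassificationModelMetadata;
import com.google.protobuf.InvalidProtocolBufferException;

public class ParseMetadataExample {
  public static void main(String[] args) throws InvalidProtocolBufferException {
    ImageClassificationModelMetadata original =
        ImageClassificationModelMetadata.newBuilder()
            .setModelType("mobile-versatile-1")
            .build();

    // Serialize to bytes, then parse back with the static parseFrom overload.
    byte[] bytes = original.toByteArray();
    ImageClassificationModelMetadata parsed =
        ImageClassificationModelMetadata.parseFrom(bytes);

    // parser() is equivalent and is useful when an explicit Parser<T> is required.
    ImageClassificationModelMetadata parsedAgain =
        ImageClassificationModelMetadata.parser().parseFrom(bytes);

    System.out.println(parsed.equals(parsedAgain)); // true
  }
}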
Methods
public boolean equals(Object obj)
Parameter
Returns
Overrides
public String getBaseModelId()
Optional. The ID of the base model. If it is specified, the new model will be created based on the base model. Otherwise, the new model will be created from scratch. The base model must be in the same project and location as the new model to create, and have the same model_type.
string base_model_id = 1;
Returns
Type | Description
String | The baseModelId.
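For illustration, a sketch of requesting a new model trained on top of an existing base model. The model ID is a placeholder, and setBaseModelId is the standard generated setter on the Builder.
// "your-base-model-id" is a hypothetical ID of an existing model in the same
// project and location, with the same model_type as the model being created.
ImageClassificationModelMetadata metadata =
    ImageClassificationModelMetadata.newBuilder()
        .setBaseModelId("your-base-model-id")
        .setModelType("cloud")
        .build();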
public ByteString getBaseModelIdBytes()
Optional. The ID of the base model. If it is specified, the new model will be created based on the base model. Otherwise, the new model will be created from scratch. The base model must be in the same project and location as the new model to create, and have the same model_type.
string base_model_id = 1;
Returns
Type | Description
ByteString | The bytes for baseModelId.
public ImageClassificationModelMetadata getDefaultInstanceForType()
Returns
public String getModelType()
Optional. Type of the model. The available values are:
cloud - Model to be used via prediction calls to AutoML API. This is the default value.
mobile-low-latency-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile or edge device with TensorFlow afterwards. Expected to have low latency, but may have lower prediction quality than other models.
mobile-versatile-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile or edge device with TensorFlow afterwards.
mobile-high-accuracy-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile or edge device with TensorFlow afterwards. Expected to have a higher latency, but should also have a higher prediction quality than other models.
mobile-core-ml-low-latency-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile device with Core ML afterwards. Expected to have low latency, but may have lower prediction quality than other models.
mobile-core-ml-versatile-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile device with Core ML afterwards.
mobile-core-ml-high-accuracy-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile device with Core ML afterwards. Expected to have a higher latency, but should also have a higher prediction quality than other models.
string model_type = 7;
Returns
Type | Description
String | The modelType.
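As a small illustration, choosing one of the exportable model types documented above and reading it back through this getter:
// mobile-high-accuracy-1 trades latency for prediction quality and can later be
// exported with AutoMl.ExportModel for on-device use with TensorFlow.
ImageClassificationModelMetadata metadata =
    ImageClassificationModelMetadata.newBuilder()
        .setModelType("mobile-high-accuracy-1")
        .build();
System.out.println(metadata.getModelType()); // "mobile-high-accuracy-1"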
public ByteString getModelTypeBytes()
Optional. Type of the model. The available values are:
cloud - Model to be used via prediction calls to AutoML API. This is the default value.
mobile-low-latency-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile or edge device with TensorFlow afterwards. Expected to have low latency, but may have lower prediction quality than other models.
mobile-versatile-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile or edge device with TensorFlow afterwards.
mobile-high-accuracy-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile or edge device with TensorFlow afterwards. Expected to have a higher latency, but should also have a higher prediction quality than other models.
mobile-core-ml-low-latency-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile device with Core ML afterwards. Expected to have low latency, but may have lower prediction quality than other models.
mobile-core-ml-versatile-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile device with Core ML afterwards.
mobile-core-ml-high-accuracy-1 - A model that, in addition to providing prediction via AutoML API, can also be exported (see AutoMl.ExportModel) and used on a mobile device with Core ML afterwards. Expected to have a higher latency, but should also have a higher prediction quality than other models.
string model_type = 7;
Returns
Type | Description
ByteString | The bytes for modelType.
public long getNodeCount()
Output only. The number of nodes this model is deployed on. A node is an
abstraction of a machine resource, which can handle online prediction QPS
as given in the node_qps field.
int64 node_count = 14;
Returns
Type | Description
long | The nodeCount.
public double getNodeQps()
Output only. An approximate number of online prediction QPS that can be supported by this model per node on which it is deployed.
double node_qps = 13;
Returns
Type | Description
double | The nodeQps.
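Together with getNodeCount(), this allows a rough estimate of the total online prediction throughput of a deployment. A sketch, assuming deployedModel is a metadata instance returned by the service:
// Approximate total QPS the deployment can absorb: per-node QPS times node count.
long nodes = deployedModel.getNodeCount();      // int64 node_count = 14;
double qpsPerNode = deployedModel.getNodeQps(); // double node_qps = 13;
double approxTotalQps = nodes * qpsPerNode;
System.out.printf("~%.1f QPS across %d node(s)%n", approxTotalQps, nodes);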
public Parser<ImageClassificationModelMetadata> getParserForType()
Returns
Overrides
public int getSerializedSize()
Returns
Overrides
public String getStopReason()
Output only. The reason that this create model operation stopped, e.g. BUDGET_REACHED, MODEL_CONVERGED.
string stop_reason = 5;
Returns
Type | Description
String | The stopReason.
public ByteString getStopReasonBytes()
Output only. The reason that this create model operation stopped, e.g. BUDGET_REACHED, MODEL_CONVERGED.
string stop_reason = 5;
Returns
Type | Description
ByteString | The bytes for stopReason.
public long getTrainBudget()
Required. The train budget for creating this model, expressed in hours. The actual train_cost will be equal to or less than this value.
int64 train_budget = 2;
Returns
Type | Description
long | The trainBudget.
public long getTrainCost()
Output only. The actual train cost of creating this model, expressed in hours. If this model is created from a base model, the train cost used to create the base model is not included.
int64 train_cost = 3;
Returns
Type | Description
long | The trainCost.
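The output-only fields above can be inspected once training completes. A sketch, assuming trainedModel is a metadata instance read back from the service:
// train_cost is expressed in hours and never exceeds the requested train_budget.
long budgetHours = trainedModel.getTrainBudget();
long costHours = trainedModel.getTrainCost();
String stopReason = trainedModel.getStopReason(); // e.g. "BUDGET_REACHED" or "MODEL_CONVERGED"

if (costHours < budgetHours && "MODEL_CONVERGED".equals(stopReason)) {
  System.out.println("Training converged before exhausting the budget.");
}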
public final UnknownFieldSet getUnknownFields()
Returns
Overrides
public int hashCode()
Returns
Overrides
protected GeneratedMessageV3.FieldAccessorTable internalGetFieldAccessorTable()
Returns
Overrides
public final boolean isInitialized()
Returns
Overrides
public ImageClassificationModelMetadata.Builder newBuilderForType()
Returns
protected ImageClassificationModelMetadata.Builder newBuilderForType(GeneratedMessageV3.BuilderParent parent)
Parameter
Returns
Overrides
protected Object newInstance(GeneratedMessageV3.UnusedPrivateParameter unused)
Parameter
Returns
Overrides
public ImageClassificationModelMetadata.Builder toBuilder()
Returns
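toBuilder() copies an existing message into a mutable Builder, which is the usual way to derive a modified copy. A brief sketch, assuming existingMetadata is an already-built instance:
// Derive a new metadata message from an existing one, changing only the budget.
ImageClassificationModelMetadata updated =
    existingMetadata.toBuilder()
        .setTrainBudget(8L) // hours
        .build();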
public void writeTo(CodedOutputStream output)
Parameter
Overrides
Exceptions