Google.Cloud.Storage.V1
Google.Cloud.Storage.V1
is a .NET client library for the Google Cloud Storage API.
It wraps the Google.Apis.Storage.v1
generated library, providing a higher-level API to make it easier to use.
Note:
This documentation is for version 3.7.0
of the library.
Some samples may not work with other versions.
Installation
Install the Google.Cloud.Storage.V1
package from NuGet. Add it to
your project in the normal way (for example by right-clicking on the
project in Visual Studio and choosing "Manage NuGet Packages...").
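If you use the .NET CLI instead, a command along the following lines will typically add the package reference (the version shown is simply the one this documentation covers):
dotnet add package Google.Cloud.Storage.V1 --version 3.7.0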
Authentication
When running on Google Cloud Platform, no action needs to be taken to authenticate.
Otherwise, the simplest way of authenticating your API calls is to
download a service account JSON file then set the GOOGLE_APPLICATION_CREDENTIALS
environment variable to refer to it.
The credentials will automatically be used to authenticate. See the Getting Started With
Authentication guide for more details.
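If you would rather load credentials explicitly in code (for example from a path decided at runtime), a minimal sketch looks like the following. The file name service-account.json is purely illustrative; GoogleCredential comes from the Google.Apis.Auth.OAuth2 namespace, which this package depends on.
// Load a service account key file explicitly instead of relying on
// the GOOGLE_APPLICATION_CREDENTIALS environment variable.
// "service-account.json" is an illustrative file name.
GoogleCredential credential = GoogleCredential.FromFile("service-account.json");
var client = StorageClient.Create(credential);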
Getting started
Common operations are exposed via the StorageClient class.
Error responses
All errors reported by the underlying API (including precondition
failures) are propagated as exceptions of type
GoogleApiException
.
See the Cloud Storage API status and error codes
documentation
for details of the HTTP status codes used.
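For example, a request for a missing object can be handled by catching the exception and inspecting its status code. This is only a sketch, reusing the implicit bucketName variable from the other samples:
var client = StorageClient.Create();
try
{
var obj = client.GetObject(bucketName, "no-such-object.txt");
}
catch (GoogleApiException e) when (e.HttpStatusCode == HttpStatusCode.NotFound)
{
// The object doesn't exist; the exception also exposes the structured API error via e.Error.
Console.WriteLine(e.Message);
}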
Client life-cycle management
In many cases you don't need to worry about disposing of
StorageClient
objects, and can create them reasonably freely -
but be aware that this can cause issues with memory and network
connection usage. We advise you to reuse a single client object if possible.
StorageClient
is thread-safe, so in most cases a single object is
the simplest option.
If your architecture requires you to frequently create new client objects, please dispose of them to help with timely resource clean-up. See the resource clean-up guide for more details.
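As a sketch of that pattern, a short-lived client can simply be scoped with a using statement (StorageClient is disposable), again assuming a projectId variable as in the other samples:
// Dispose of the client as soon as this short-lived piece of work is finished.
using (var client = StorageClient.Create())
{
foreach (var bucket in client.ListBuckets(projectId))
{
Console.WriteLine(bucket.Name);
}
}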
Sample code
var client = StorageClient.Create();
// Create a bucket with a globally unique name
var bucketName = Guid.NewGuid().ToString();
var bucket = client.CreateBucket(projectId, bucketName);
// Upload some files
var content = Encoding.UTF8.GetBytes("hello, world");
var obj1 = client.UploadObject(bucketName, "file1.txt", "text/plain", new MemoryStream(content));
var obj2 = client.UploadObject(bucketName, "folder1/file2.txt", "text/plain", new MemoryStream(content));
// List objects
foreach (var obj in client.ListObjects(bucketName, ""))
{
Console.WriteLine(obj.Name);
}
// Download file
using (var stream = File.OpenWrite("file1.txt"))
{
client.DownloadObject(bucketName, "file1.txt", stream);
}
Signed URLs
Signed URLs can be created to provide limited access to specific buckets and objects to anyone in possession of the URL, regardless of whether they have a Google account.
For example, Signed URLs can be created to provide read-only access to existing objects:
// Create a signed URL which can be used to get a specific object for one hour.
UrlSigner urlSigner = UrlSigner.FromServiceAccountCredential(credential);
string url = urlSigner.Sign(bucketName, objectName, TimeSpan.FromHours(1));
// Get the content at the created URL.
HttpResponseMessage response = await httpClient.GetAsync(url);
string content = await response.Content.ReadAsStringAsync();
Or write-only access to put specific object content into a bucket:
// Create a request template that will be used to create the signed URL.
var destination = "places/world.txt";
UrlSigner.RequestTemplate requestTemplate = UrlSigner.RequestTemplate
.FromBucket(bucketName)
.WithObjectName(destination)
.WithHttpMethod(HttpMethod.Put)
.WithContentHeaders(new Dictionary<string, IEnumerable<string>>
{
{ "Content-Type", new[] { "text/plain" } }
});
// Create options specifying for how long the signed URL will be valid.
UrlSigner.Options options = UrlSigner.Options.FromDuration(TimeSpan.FromHours(1));
// Create a signed URL which allows the requester to PUT data with the text/plain content-type.
UrlSigner urlSigner = UrlSigner.FromServiceAccountCredential(credential);
string url = urlSigner.Sign(requestTemplate, options);
// Upload the content into the bucket using the signed URL.
string source = "world.txt";
ByteArrayContent content;
using (FileStream stream = File.OpenRead(source))
{
byte[] data = new byte[stream.Length];
stream.Read(data, 0, data.Length);
content = new ByteArrayContent(data)
{
Headers = { ContentType = new MediaTypeHeaderValue("text/plain") }
};
}
HttpResponseMessage response = await httpClient.PutAsync(url, content);
Signing URLs without a service account credential file
If you need to sign URLs but don't have a full service account
credential file (with private keys) available, you can create a
UrlSigner.IBlobSigner
implementation to perform the signing part.
The most common implementation of this is likely to be to use the
IAM service to perform the signing, with the
Google.Apis.Iam.v1
package. Here's a sample implementation:
internal sealed class IamServiceBlobSigner : UrlSigner.IBlobSigner
{
private readonly IamService _iamService;
public string Id { get; }
internal IamServiceBlobSigner(IamService service, string id)
{
_iamService = service;
Id = id;
}
public string CreateSignature(byte[] data) =>
CreateRequest(data).Execute().Signature;
public async Task<string> CreateSignatureAsync(byte[] data, CancellationToken cancellationToken)
{
ProjectsResource.ServiceAccountsResource.SignBlobRequest request = CreateRequest(data);
SignBlobResponse response = await request.ExecuteAsync(cancellationToken).ConfigureAwait(false);
return response.Signature;
}
private ProjectsResource.ServiceAccountsResource.SignBlobRequest CreateRequest(byte[] data)
{
SignBlobRequest body = new SignBlobRequest { BytesToSign = Convert.ToBase64String(data) };
string account = $"projects/-/serviceAccounts/{Id}";
ProjectsResource.ServiceAccountsResource.SignBlobRequest request =
_iamService.Projects.ServiceAccounts.SignBlob(body, account);
return request;
}
}
(We may make this available in its own package at some point in the future.)
To make use of this, the account making the request needs the
iam.serviceAccounts.signBlob
permission, which is usually granted
via the "Service Account Token Creator" role.
Here's an example showing how you could use this to sign a URL on behalf of the default Compute Engine credential on an instance. (This example will only work when running on Google Cloud Platform, as it relies on information from the metadata server.) If you want to use a different service account, you could include the account ID as part of your application configuration.
// First obtain the email address of the default service account for this instance from the metadata server.
HttpRequestMessage serviceAccountRequest = new HttpRequestMessage
{
// Note: you could use 169.254.169.254 as the address to avoid a DNS lookup.
RequestUri = new Uri("http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email"),
Headers = { { "Metadata-Flavor", "Google" } }
};
HttpResponseMessage serviceAccountResponse = await httpClient.SendAsync(serviceAccountRequest).ConfigureAwait(false);
serviceAccountResponse.EnsureSuccessStatusCode();
string serviceAccountId = await serviceAccountResponse.Content.ReadAsStringAsync();
// Create an IAM service client object using the default application credentials.
GoogleCredential iamCredential = await GoogleCredential.GetApplicationDefaultAsync();
iamCredential = iamCredential.CreateScoped(IamService.Scope.CloudPlatform);
IamService iamService = new IamService(new BaseClientService.Initializer
{
HttpClientInitializer = iamCredential
});
// Create a request template that will be used to create the signed URL.
UrlSigner.RequestTemplate requestTemplate = UrlSigner.RequestTemplate
.FromBucket(bucketName)
.WithObjectName(objectName)
.WithHttpMethod(HttpMethod.Get);
// Create options specifying for how long the signed URL will be valid.
UrlSigner.Options options = UrlSigner.Options.FromDuration(TimeSpan.FromHours(1));
// Create a URL signer that will use the IAM service for signing. This signer is thread-safe,
// and would typically occur as a dependency, e.g. in an ASP.NET Core controller, where the
// same instance can be reused for each request.
IamServiceBlobSigner blobSigner = new IamServiceBlobSigner(iamService, serviceAccountId);
UrlSigner urlSigner = UrlSigner.FromBlobSigner(blobSigner);
// Use the URL signer to sign a request for the test object for the next hour.
string url = await urlSigner.SignAsync(requestTemplate, options);
// Prove we can fetch the content of the test object with a simple unauthenticated GET request.
HttpResponseMessage response = await httpClient.GetAsync(url);
string content = await response.Content.ReadAsStringAsync();
Specifying the signing version
(V4 signing is currently beta functionality.)
Google Cloud Storage supports two signing process versions: V2 and V4. Currently the default is V2, although in the future the library may be updated to use V4 by default.
To specify the URL signing version, use the UrlSigner.Options.WithSigningVersion method, specifying the signing version you wish to use. This does not change the options it is called on; it returns a new Options instance that uses the specified version.
Note that V4 signing is restricted to generating URLs that are valid for at most 7 days.
// Create a signed URL which can be used to get a specific object for one hour,
// using the V4 signing process.
UrlSigner urlSigner = UrlSigner.FromServiceAccountCredential(credential);
UrlSigner.RequestTemplate requestTemplate = UrlSigner.RequestTemplate
.FromBucket(bucketName)
.WithObjectName(objectName);
UrlSigner.Options options = UrlSigner.Options
.FromDuration(TimeSpan.FromHours(1))
.WithSigningVersion(SigningVersion.V4);
string url = urlSigner.Sign(requestTemplate, options);
// Get the content at the created URL.
HttpResponseMessage response = await httpClient.GetAsync(url);
string content = await response.Content.ReadAsStringAsync();
Uploading objects by using HTML forms
In some cases, you might need to allow your users to upload objects via HTML forms. You can create signed POST policies that specify what is and is not allowed in such scenarios. See the policy document documentation for more information on how a POST policy document should be built, and the POST object documentation for details on how the forms themselves should be built.
Below are some samples showing how to create a signed POST policy.
The simplest approach restricts the upload to a specific bucket and a specific object name.
// Create a signed post policy which can be used to upload a specific object and
// expires in 1 hour after creation.
UrlSigner urlSigner = UrlSigner
.FromServiceAccountCredential(credential);
UrlSigner.Options options = UrlSigner.Options
.FromDuration(TimeSpan.FromHours(1))
.WithSigningVersion(SigningVersion.V4)
.WithScheme("https");
UrlSigner.PostPolicy postPolicy = UrlSigner.PostPolicy.ForBucketAndKey(bucketName, objectName);
postPolicy.SetCustomField(UrlSigner.PostPolicyCustomElement.GoogleMetadata, "x-goog-meta-test", "data");
UrlSigner.SignedPostPolicy signedPostPolicy = await urlSigner.SignAsync(postPolicy, options);
// Create an HTML form including all the fields in the signed post policy.
StringBuilder form = new StringBuilder();
form.AppendLine($"<form action=\"{signedPostPolicy.PostUrl}\" method=\"post\" enctype=\"multipart/form-data\">");
foreach (var field in signedPostPolicy.Fields)
{
form.AppendLine($"<input type=\"hidden\" name=\"{field.Key}\" value=\"{field.Value}\">");
}
// Include the file element. It should always be the last element in the form.
form.AppendLine("<input name=\"file\" type=\"file\">");
form.AppendLine("<input type=\"submit\" value=\"Upload\">");
form.AppendLine("</form>");
// You can now save the form to file and serve it as static content
// or send it as the response to a request made to your application.
File.WriteAllText("PostPolicySimple.html", form.ToString());
You can also enforce how the browser's cache will treat the uploaded object.
// Create a signed post policy which can be used to upload a specific object with a
// specific cache-control value and expires in 1 hour after creation.
UrlSigner urlSigner = UrlSigner
.FromServiceAccountCredential(credential);
UrlSigner.Options options = UrlSigner.Options
.FromDuration(TimeSpan.FromHours(1))
.WithSigningVersion(SigningVersion.V4)
.WithScheme("https");
UrlSigner.PostPolicy postPolicy = UrlSigner.PostPolicy.ForBucketAndKey(bucketName, objectName);
postPolicy.SetField(UrlSigner.PostPolicyStandardElement.Acl, "public-read");
postPolicy.SetField(UrlSigner.PostPolicyStandardElement.CacheControl, "public,max-age=86400");
UrlSigner.SignedPostPolicy signedPostPolicy = await urlSigner.SignAsync(postPolicy, options);
// Create an HTML form including all the fields in the signed post policy.
StringBuilder form = new StringBuilder();
form.AppendLine($"<form action=\"{signedPostPolicy.PostUrl}\" method=\"post\" enctype=\"multipart/form-data\">");
foreach (var field in signedPostPolicy.Fields)
{
form.AppendLine($"<input type=\"hidden\" name=\"{field.Key}\" value=\"{field.Value}\">");
}
// Include the file element. It should always be the last element in the form.
form.AppendLine("<input name=\"file\" type=\"file\">");
form.AppendLine("<input type=\"submit\" value=\"Upload\">");
form.AppendLine("</form>");
// You can now save the form to file and serve it as static content
// or send it as the response to a request made to your application.
File.WriteAllText("PostPolicyCacheControl.html", form.ToString());
You can also set starts-with conditions for some form elements; the form used to post must then include that element with a value that matches the condition.
// Create a signed post policy which can be used to upload a specific object and
// expires in 1 hour after creation.
// It also sets a starts-with condition on the acl form element, which should be met
// by the actual form used for posting.
UrlSigner urlSigner = UrlSigner
.FromServiceAccountCredential(credential);
UrlSigner.Options options = UrlSigner.Options
.FromDuration(TimeSpan.FromHours(1))
.WithSigningVersion(SigningVersion.V4)
.WithScheme("https");
UrlSigner.PostPolicy postPolicy = UrlSigner.PostPolicy.ForBucketAndKey(bucketName, objectName);
postPolicy.SetStartsWith(UrlSigner.PostPolicyStandardElement.Acl, "public");
UrlSigner.SignedPostPolicy signedPostPolicy = await urlSigner.SignAsync(postPolicy, options);
// Create an HTML form including all the fields in the signed post policy.
StringBuilder form = new StringBuilder();
form.AppendLine($"<form action=\"{signedPostPolicy.PostUrl}\" method=\"post\" enctype=\"multipart/form-data\">");
foreach (var field in signedPostPolicy.Fields)
{
form.AppendLine($"<input type=\"hidden\" name=\"{field.Key}\" value=\"{field.Value}\">");
}
// Include also an acl element with a value that meets the condition set in the policy.
form.AppendLine("<input type=\"hidden\" name=\"acl\" value=\"public-read\">");
// Include the file element. It should always be the last element in the form.
form.AppendLine("<input name=\"file\" type=\"file\">");
form.AppendLine("<input type=\"submit\" value=\"Upload\">");
form.AppendLine("</form>");
// You can now save the form to file and serve it as static content
// or send it as the response to a request made to your application.
File.WriteAllText("PostPolicyAcl.html", form.ToString());
You can also tell the server which HTTP status code you want it to return on success.
// Create a signed post policy which can be used to upload a specific object and
// expires in 1 hour after creation.
// It also sets a specific HTTP success status code that should be returned.
// Only 200, 201 and 204 are allowed.
UrlSigner urlSigner = UrlSigner
.FromServiceAccountCredential(credential);
UrlSigner.Options options = UrlSigner.Options
.FromDuration(TimeSpan.FromHours(1))
.WithSigningVersion(SigningVersion.V4)
.WithScheme("https");
UrlSigner.PostPolicy postPolicy = UrlSigner.PostPolicy.ForBucketAndKey(bucketName, objectName);
postPolicy.SetField(UrlSigner.PostPolicyStandardElement.SuccessActionStatus, HttpStatusCode.OK);
UrlSigner.SignedPostPolicy signedPostPolicy = await urlSigner.SignAsync(postPolicy, options);
// Create an HTML form including all the fields in the signed post policy.
StringBuilder form = new StringBuilder();
form.AppendLine($"<form action=\"{signedPostPolicy.PostUrl}\" method=\"post\" enctype=\"multipart/form-data\">");
foreach (var field in signedPostPolicy.Fields)
{
form.AppendLine($"<input type=\"hidden\" name=\"{field.Key}\" value=\"{field.Value}\">");
}
// Include the file element. It should always be the last element in the form.
form.AppendLine("<input name=\"file\" type=\"file\">");
form.AppendLine("<input type=\"submit\" value=\"Upload\">");
form.AppendLine("</form>");
// You can now save the form to file and serve it as static content
// or send it as the response to a request made to your application.
File.WriteAllText("PostPolicySuccessStatus.html", form.ToString());
Upload URIs
In some cases, it may not make sense for client applications to have permission to begin an upload for an object, but an authenticated service may choose to grant this ability for individual uploads. Signed URLs are one option for this. Another option is for the service to start a resumable upload session, but instead of performing the upload itself, to send the resulting upload URI to the client application so it can perform the upload. Unlike sessions initiated with a signed URL, a pre-initiated upload session will force the client application to upload through the region in which the session began, which will likely be close to the service, and not necessarily the client.
var client = StorageClient.Create();
var source = "world.txt";
var destination = "places/world.txt";
var contentType = "text/plain";
// var acl = PredefinedObjectAcl.PublicRead; // public
var acl = PredefinedObjectAcl.AuthenticatedRead; // private
var options = new UploadObjectOptions { PredefinedAcl = acl };
// Create a temporary uploader so the upload session can be manually initiated without actually uploading.
var tempUploader = client.CreateObjectUploader(bucketName, destination, contentType, new MemoryStream(), options);
var uploadUri = await tempUploader.InitiateSessionAsync();
// Send uploadUri to (unauthenticated) client application, so it can perform the upload:
using (var stream = File.OpenRead(source))
{
// IUploadProgress defined in Google.Apis.Upload namespace
IProgress<IUploadProgress> progress = new Progress<IUploadProgress>(
p => Console.WriteLine($"bytes: {p.BytesSent}, status: {p.Status}")
);
var actualUploader = ResumableUpload.CreateFromUploadUri(uploadUri, stream);
actualUploader.ProgressChanged += progress.Report;
await actualUploader.UploadAsync();
}
Customer-supplied encryption keys
Storage objects are always stored encrypted, but if you wish to
specify your own encryption key instead of using the server-supplied
one, you can do so either for all operations with a particular
StorageClient
or on individual ones.
// Use EncryptionKey.Create if you already have a key.
EncryptionKey key = EncryptionKey.Generate();
// This will affect all relevant object-based operations by default.
var client = StorageClient.Create(encryptionKey: key);
var content = Encoding.UTF8.GetBytes("hello, world");
client.UploadObject(bucketName, "encrypted.txt", "text/plain", new MemoryStream(content));
// When downloading, either use a client with the same key...
client.DownloadObject(bucketName, "encrypted.txt", new MemoryStream());
// Or specify a key just for that operation.
var client2 = StorageClient.Create();
client2.DownloadObject(bucketName, "encrypted.txt", new MemoryStream(),
new DownloadObjectOptions { EncryptionKey = key });
Change notification via Google Cloud Pub/Sub
You can configure a bucket to send a change notification to a
Google Cloud Pub/Sub topic
when changes occur. The sample below shows how to create a Pub/Sub
topic, set its permissions so that the change notifications can be
published to it, and then create the notification configuration on a
bucket. You'll need to add a dependency on the
Google.Cloud.PubSub.V1
NuGet package to create the topic and
manage its permissions.
// First create a Pub/Sub topic.
PublisherServiceApiClient publisherClient = PublisherServiceApiClient.Create();
TopicName topicName = new TopicName(projectId, topicId);
publisherClient.CreateTopic(topicName);
// Prepare the topic for Storage notifications. The Storage Service Account must have Publish permission
// for the topic. The code below adds the service account into the "roles/pubsub.publisher" role for the topic.
// Determine the Storage Service Account name to use in IAM operations.
StorageClient storageClient = StorageClient.Create();
string storageServiceAccount = $"serviceAccount:{storageClient.GetStorageServiceAccountEmail(projectId)}";
// Fetch the IAM policy for the topic.
Iam.V1.Policy policy = publisherClient.GetIamPolicy(topicName.ToString());
var role = "roles/pubsub.publisher";
// Ensure the Storage Service Account is in the publisher role, setting the IAM policy for the topic
// on the server if necessary.
if (policy.AddRoleMember(role, storageServiceAccount))
{
publisherClient.SetIamPolicy(topicName.ToString(), policy);
}
// Now that the topic is ready, we can create a notification configuration for Storage
Notification notification = new Notification
{
Topic = $"//pubsub.googleapis.com/{topicName}",
PayloadFormat = "JSON_API_V1"
};
notification = storageClient.CreateNotification(bucket, notification);
Console.WriteLine($"Created notification ID: {notification.Id}");