Module: Google::Cloud::Storage
- Defined in:
- lib/google/cloud/storage.rb,
lib/google/cloud/storage/file.rb,
lib/google/cloud/storage/bucket.rb,
lib/google/cloud/storage/errors.rb,
lib/google/cloud/storage/policy.rb,
lib/google/cloud/storage/project.rb,
lib/google/cloud/storage/service.rb,
lib/google/cloud/storage/version.rb,
lib/google/cloud/storage/file/acl.rb,
lib/google/cloud/storage/file/list.rb,
lib/google/cloud/storage/bucket/acl.rb,
lib/google/cloud/storage/bucket/cors.rb,
lib/google/cloud/storage/bucket/list.rb,
lib/google/cloud/storage/credentials.rb,
lib/google/cloud/storage/file/signer.rb,
lib/google/cloud/storage/post_object.rb,
lib/google/cloud/storage/notification.rb,
lib/google/cloud/storage/file/verifier.rb
Overview
Google Cloud Storage
Google Cloud Storage is an Internet service to store data in Google's cloud. It allows worldwide storage and retrieval of any amount of data at any time, taking advantage of Google's own reliable and fast networking infrastructure to perform data operations in a cost-effective manner.
The goal of google-cloud is to provide an API that is comfortable to Rubyists. Your authentication credentials are detected automatically in Google Cloud Platform environments such as Google Compute Engine, Google App Engine and Google Kubernetes Engine. In other environments you can configure authentication easily, either directly in your code or via environment variables. Read more about the options for connecting in the Authentication Guide.
require "google/cloud/storage"
storage = Google::Cloud::Storage.new(
project_id: "my-project",
credentials: "/path/to/keyfile.json"
)
bucket = storage.bucket "my-bucket"
file = bucket.file "path/to/my-file.ext"
You can learn more about the options for connecting in the Authentication Guide.
To learn more about Cloud Storage, read the Google Cloud Storage Overview.
Enabling Logging
To enable logging for this library, set the logger for the underlying Google API Client library. The logger that you set may be a Ruby stdlib Logger as shown below, or a Google::Cloud::Logging::Logger that will write logs to Stackdriver Logging.
If you do not set the logger explicitly and your application is running in a Rails environment, it will default to Rails.logger. Otherwise, if you do not set the logger and you are not using Rails, logging is disabled by default.
Configuring a Ruby stdlib logger:
require "logger"
my_logger = Logger.new $stderr
my_logger.level = Logger::WARN
# Set the Google API Client logger
Google::Apis.logger = my_logger
Retrieving Buckets
A Bucket instance is a container for your data. There is no limit on the number of buckets that you can create in a project. You can use buckets to organize and control access to your data. For more information, see Working with Buckets.
Each bucket has a globally unique name, which is how it is retrieved: (See Project#bucket)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
You can also retrieve all buckets on a project: (See Project#buckets)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
all_buckets = storage.buckets
If you have a significant number of buckets, you may need to fetch them in multiple service requests.
Iterating over each bucket, potentially with multiple API calls, by invoking all with a block:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
buckets = storage.buckets
buckets.all do |bucket|
puts bucket.name
end
Limiting the number of API calls made:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
buckets = storage.buckets
buckets.all(request_limit: 10) do |bucket|
puts bucket.name
end
See Bucket::List for details.
Creating a Bucket
A unique name is all that is needed to create a new bucket: (See Project#create_bucket)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.create_bucket "my-todo-app-attachments"
Retrieving Files
A File instance is an individual data object that you store in Google Cloud Storage. Files contain the data stored as well as metadata describing the data. Files belong to a bucket and cannot be shared among buckets. There is no limit on the number of files that you can create in a bucket. For more information, see Working with Objects.
Files are retrieved by their name, which is the path of the file in the bucket: (See Bucket#file)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "avatars/heidi/400x400.png"
You can also retrieve all files in a bucket: (See Bucket#files)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
all_files = bucket.files
Or you can retrieve all files in a specified path:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
avatar_files = bucket.files prefix: "avatars/"
If you have a significant number of files, you may need to fetch them in multiple service requests.
Iterating over each file, potentially with multiple API calls, by invoking all with a block:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
files = bucket.files
files.all do |file|
puts file.name
end
Limiting the number of API calls made:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
files = bucket.files
files.all(request_limit: 10) do |file|
puts file.name
end
See File::List for details.
Creating a File
A new file can be uploaded by specifying the location of a file on the local file system, and the name/path at which the file should be stored in the bucket. (See Bucket#create_file)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
bucket.create_file "/var/todo-app/avatars/heidi/400x400.png",
"avatars/heidi/400x400.png"
Files can also be created from an in-memory StringIO object:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
bucket.create_file StringIO.new("Hello world!"), "hello-world.txt"
Customer-supplied encryption keys
By default, Google Cloud Storage manages server-side encryption keys on your behalf. However, a customer-supplied encryption key can be provided with the encryption_key option. If given, the same key must be provided to subsequently download or copy the file. If you use customer-supplied encryption keys, you must securely manage your keys and ensure that they are not lost. Also, please note that file metadata is not encrypted, with the exception of the CRC32C checksum and MD5 hash. The names of files and buckets are also not encrypted, and you can read or update the metadata of an encrypted file without providing the encryption key.
require "openssl"
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
# Key generation shown for example purposes only. Write your own.
cipher = OpenSSL::Cipher.new "aes-256-cfb"
cipher.encrypt
key = cipher.random_key
bucket.create_file "/var/todo-app/avatars/heidi/400x400.png",
"avatars/heidi/400x400.png",
encryption_key: key
# Store your key and hash securely for later use.
file = bucket.file "avatars/heidi/400x400.png",
encryption_key: key
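The comment above advises storing the key and its hash securely for later use. As context for what to store: the underlying REST API represents a customer-supplied key and its SHA256 hash as Base64 strings. The following stdlib-only sketch (no Storage client involved) derives both values from a freshly generated key:

```ruby
require "openssl"
require "base64"
require "digest"

# Generate a random 32-byte AES-256 key, as in the example above.
key = OpenSSL::Cipher.new("aes-256-cfb").encrypt.random_key

# The Cloud Storage REST API transmits customer-supplied keys as
# Base64-encoded values, along with a Base64-encoded SHA256 hash of
# the raw key bytes; both are convenient forms for secure storage.
key_b64      = Base64.strict_encode64 key
key_hash_b64 = Base64.strict_encode64 Digest::SHA256.digest(key)

puts key.bytesize          #=> 32
puts key_b64.bytesize      #=> 44
puts key_hash_b64.bytesize #=> 44
```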
Use File#rotate to rotate customer-supplied encryption keys.
require "openssl"
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
# Old key was stored securely for later use.
old_key = "y\x03\"\x0E\xB6\xD3\x9B\x0E\xAB*\x19\xFAv\xDEY\xBEI..."
file = bucket.file "path/to/my-file.ext", encryption_key: old_key
# Key generation shown for example purposes only. Write your own.
cipher = OpenSSL::Cipher.new "aes-256-cfb"
cipher.encrypt
new_key = cipher.random_key
file.rotate encryption_key: old_key, new_encryption_key: new_key
Downloading a File
Files can be downloaded to the local file system. (See File#download)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "avatars/heidi/400x400.png"
file.download "/var/todo-app/avatars/heidi/400x400.png"
Files can also be downloaded to an in-memory StringIO object:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "hello-world.txt"
downloaded = file.download
downloaded.rewind
downloaded.read #=> "Hello world!"
Download a public file with an anonymous, unauthenticated client. Use skip_lookup to avoid errors retrieving non-public bucket and file metadata.
require "google/cloud/storage"
storage = Google::Cloud::Storage.anonymous
bucket = storage.bucket "public-bucket", skip_lookup: true
file = bucket.file "path/to/public-file.ext", skip_lookup: true
downloaded = file.download
downloaded.rewind
downloaded.read #=> "Hello world!"
Creating and downloading gzip-encoded files
When uploading a gzip-compressed file, you should pass content_encoding: "gzip" if you want the file to be eligible for decompressive transcoding when it is later downloaded. In addition, giving the gzip-compressed file a name containing the original file extension (for example, .txt) will ensure that the file's Content-Type metadata is set correctly. (You can also set the file's Content-Type metadata explicitly with the content_type option.)
require "zlib"
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
gz = StringIO.new ""
z = Zlib::GzipWriter.new gz
z.write "Hello world!"
z.close
data = StringIO.new gz.string
bucket = storage.bucket "my-bucket"
bucket.create_file data, "path/to/gzipped.txt",
content_encoding: "gzip"
file = bucket.file "path/to/gzipped.txt"
# The downloaded data is decompressed by default.
file.download "path/to/downloaded/hello.txt"
# The downloaded data remains compressed with skip_decompress.
file.download "path/to/downloaded/gzipped.txt",
skip_decompress: true
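As a stdlib-only illustration of what skip_decompress leaves you with: the bytes produced by Zlib::GzipWriter in the example above can be decompressed locally with Zlib::GzipReader, mirroring the decompression the library applies by default when downloading a file stored with content_encoding: "gzip":

```ruby
require "zlib"
require "stringio"

# Compress a string with gzip, as in the upload example above.
gz = StringIO.new ""
z = Zlib::GzipWriter.new gz
z.write "Hello world!"
z.close

# A file downloaded with skip_decompress: true still contains gzip
# bytes like gz.string; Zlib::GzipReader recovers the original data.
reader = Zlib::GzipReader.new StringIO.new(gz.string)
puts reader.read #=> "Hello world!"
```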
Using Signed URLs
Access without authentication can be granted to a file for a specified period of time. This URL uses a cryptographic signature of your credentials to access the file. (See File#signed_url)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "avatars/heidi/400x400.png"
shared_url = file.signed_url method: "GET",
expires: 300 # 5 minutes from now
Controlling Access to a Bucket
Access to a bucket is controlled with Bucket#acl. A bucket has owners, writers, and readers. Permissions can be granted to an individual user's email address, a group's email address, as well as many predefined lists. See the Access Control guide for more.
Access to a bucket can be granted to a user by prepending "user-" to the email address:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
email = "heidi@example.net"
bucket.acl.add_reader "user-#{email}"
Access to a bucket can be granted to a group by prepending "group-" to the email address:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
email = "authors@example.net"
bucket.acl.add_reader "group-#{email}"
Access to a bucket can also be granted to a predefined list of permissions:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
bucket.acl.public!
Controlling Access to a File
Access to a file is controlled in two ways: either by setting the default permissions for all files in a bucket with Bucket#default_acl, or by setting permissions on an individual file with File#acl.
Access to a file can be granted to a user by prepending "user-" to the email address:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "avatars/heidi/400x400.png"
email = "heidi@example.net"
file.acl.add_reader "user-#{email}"
Access to a file can be granted to a group by prepending "group-" to the email address:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "avatars/heidi/400x400.png"
email = "authors@example.net"
file.acl.add_reader "group-#{email}"
Access to a file can also be granted to a predefined list of permissions:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-todo-app"
file = bucket.file "avatars/heidi/400x400.png"
file.acl.public!
Assigning payment to the requester
The requester pays feature enables the owner of a bucket to indicate that a client accessing the bucket or a file it contains must assume the transit costs related to the access.
Assign transit costs for bucket and file operations to requesting clients with the requester_pays flag:
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-bucket"
bucket.requester_pays = true # API call
# Clients must now provide `user_project` option when calling
# Project#bucket to access this bucket.
Once the requester_pays flag is enabled for a bucket, a client attempting to access the bucket and its files must provide the user_project option to Project#bucket. If the argument given is true, transit costs for operations on the requested bucket or a file it contains will be billed to the current project for the client. (See Project#project for the ID of the current project.)
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "other-project-bucket", user_project: true
files = bucket.files # Billed to current project
If the argument is a project ID string, and the indicated project is authorized for the currently authenticated service account, transit costs will be billed to the indicated project.
require "google/cloud/storage"
storage = Google::Cloud::Storage.new
bucket = storage.bucket "other-project-bucket",
user_project: "my-other-project"
files = bucket.files # Billed to "my-other-project"
Configuring Pub/Sub notification subscriptions
You can configure notifications to send Google Cloud Pub/Sub messages about changes to files in your buckets. For example, you can track files that are created and deleted in your bucket. Each notification contains information describing both the event that triggered it and the file that changed.
You can send notifications to any Cloud Pub/Sub topic in any project for which your service account has sufficient permissions. As shown below, you need to explicitly grant permission to your service account to enable Google Cloud Storage to publish on behalf of your account, even if your current project created and owns the topic.
require "google/cloud/pubsub"
require "google/cloud/storage"
pubsub = Google::Cloud::Pubsub.new
topic = pubsub.create_topic "my-topic"
topic.policy do |p|
p.add "roles/pubsub.publisher",
"serviceAccount:my-project" \
"@gs-project-accounts.iam.gserviceaccount.com"
end
storage = Google::Cloud::Storage.new
bucket = storage.bucket "my-bucket"
notification = bucket.create_notification topic.name
Configuring retries and timeout
You can configure how many times API requests may be automatically retried. When an API request fails, the response will be inspected to see if the request meets criteria indicating that it may succeed on retry, such as 500 and 503 status codes or a specific internal error code such as rateLimitExceeded. If it meets the criteria, the request will be retried after a delay. If another error occurs, the delay will be increased before a subsequent attempt, until the retries limit is reached.
You can also set the request timeout value in seconds.
require "google/cloud/storage"
storage = Google::Cloud::Storage.new retries: 10, timeout: 120
See the Storage status and error codes for a list of error conditions.
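The retry behavior described above can be pictured with a plain-Ruby sketch. This is illustrative only, not the library's implementation; the retry predicate and backoff parameters here (doubling a hypothetical base delay) are assumptions for the example:

```ruby
# Illustrative retry-with-backoff sketch; not the library's actual code.
def with_retries(retries: 3, base_delay: 0.01)
  attempts = 0
  begin
    yield
  rescue StandardError
    attempts += 1
    raise if attempts > retries            # retries limit reached: give up
    sleep base_delay * (2**(attempts - 1)) # increase delay each attempt
    retry
  end
end

calls = 0
result = with_retries(retries: 3) do
  calls += 1
  raise "transient" if calls < 3 # fail twice, then succeed
  "ok"
end
puts calls  #=> 3
puts result #=> "ok"
```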
Defined Under Namespace
Classes: Bucket, Credentials, File, FileVerificationError, Notification, Policy, PostObject, Project, SignedUrlUnavailable
Constant Summary
- GOOGLEAPIS_URL = "https://storage.googleapis.com".freeze
- VERSION = "1.13.0".freeze
Class Method Summary
- .anonymous(retries: nil, timeout: nil) ⇒ Google::Cloud::Storage::Project
  Creates an unauthenticated, anonymous client for retrieving public data from the Storage service.
- .configure {|Google::Cloud.configure.storage| ... } ⇒ Google::Cloud::Config
  Configure the Google Cloud Storage library.
- .new(project_id: nil, credentials: nil, scope: nil, retries: nil, timeout: nil, project: nil, keyfile: nil) ⇒ Google::Cloud::Storage::Project
  Creates a new object for connecting to the Storage service.
Class Method Details
.anonymous(retries: nil, timeout: nil) ⇒ Google::Cloud::Storage::Project
Creates an unauthenticated, anonymous client for retrieving public data from the Storage service. Each call creates a new connection.
# File 'lib/google/cloud/storage.rb', line 719
def self.anonymous retries: nil, timeout: nil
  Storage::Project.new(
    Storage::Service.new(nil, nil, retries: retries, timeout: timeout)
  )
end
.configure {|Google::Cloud.configure.storage| ... } ⇒ Google::Cloud::Config
Configure the Google Cloud Storage library.
The following Storage configuration parameters are supported:
- project_id - (String) Identifier for a Storage project. (The parameter project is considered deprecated, but may also be used.)
- credentials - (String, Hash, Google::Auth::Credentials) The path to the keyfile as a String, the contents of the keyfile as a Hash, or a Google::Auth::Credentials object. (See Credentials) (The parameter keyfile is considered deprecated, but may also be used.)
- scope - (String, Array) The OAuth 2.0 scopes controlling the set of resources and operations that the connection can access.
- retries - (Integer) Number of times to retry requests on server error.
- timeout - (Integer) Default timeout to use in requests.
# File 'lib/google/cloud/storage.rb', line 745
def self.configure
  yield Google::Cloud.configure.storage if block_given?
  Google::Cloud.configure.storage
end
.new(project_id: nil, credentials: nil, scope: nil, retries: nil, timeout: nil, project: nil, keyfile: nil) ⇒ Google::Cloud::Storage::Project
Creates a new object for connecting to the Storage service. Each call creates a new connection.
For more information on connecting to Google Cloud see the Authentication Guide.
# File 'lib/google/cloud/storage.rb', line 676
def self.new project_id: nil, credentials: nil, scope: nil, retries: nil,
             timeout: nil, project: nil, keyfile: nil
  project_id ||= (project || default_project_id)
  project_id = project_id.to_s # Always cast to a string
  raise ArgumentError, "project_id is missing" if project_id.empty?

  scope ||= configure.scope
  retries ||= configure.retries
  timeout ||= configure.timeout

  credentials ||= (keyfile || default_credentials(scope: scope))
  unless credentials.is_a? Google::Auth::Credentials
    credentials = Storage::Credentials.new credentials, scope: scope
  end

  Storage::Project.new(
    Storage::Service.new(
      project_id, credentials, retries: retries, timeout: timeout
    )
  )
end