Module: Google::Cloud::Bigquery::DataTransfer

Defined in:
lib/google/cloud/bigquery/data_transfer.rb,
lib/google/cloud/bigquery/data_transfer/v1.rb,
lib/google/cloud/bigquery/data_transfer/credentials.rb,
lib/google/cloud/bigquery/data_transfer/v1/doc/overview.rb,
lib/google/cloud/bigquery/data_transfer/v1/data_transfer_service_client.rb,
lib/google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/data_transfer/v1/transfer.rb,
lib/google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/data_transfer/v1/data_transfer.rb

Overview

Ruby Client for BigQuery Data Transfer API (Beta)

BigQuery Data Transfer API: Transfers data from partner SaaS applications to Google BigQuery on a scheduled, managed basis.

Quick Start

In order to use this library, you first need to go through the following steps:

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the BigQuery Data Transfer API.
  4. Set up authentication (see the sketch after this list).
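
As a minimal sketch of step 4 (the keyfile path below is a placeholder, and
both approaches assume a downloaded service account keyfile), credentials can
be supplied through the standard GOOGLE_APPLICATION_CREDENTIALS environment
variable or passed directly to the constructor:

require "google/cloud/bigquery/data_transfer"

# Option 1: point Application Default Credentials at a keyfile.
ENV["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/keyfile.json"
client = Google::Cloud::Bigquery::DataTransfer.new

# Option 2: pass the keyfile path directly; a String credentials value is
# treated as the path to a keyfile (see the Parameters section below).
client = Google::Cloud::Bigquery::DataTransfer.new(
  credentials: "/path/to/keyfile.json"
)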

Installation

$ gem install google-cloud-bigquery-data_transfer
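
Alternatively (a standard Bundler setup rather than anything specific to this
library), add the gem to your application's Gemfile and install with Bundler:

# Gemfile
gem "google-cloud-bigquery-data_transfer"

$ bundle install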

Preview

DataTransferServiceClient

require "google/cloud/bigquery/data_transfer"

data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new
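# project_id below is assumed to hold your Google Cloud project ID,
# for example: project_id = "my-project".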
formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)

# Iterate over all results.
data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
  # Process element.
end

# Or iterate over results one page at a time.
data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
  # Process one page at a time.
  page.each do |element|
    # Process element.
  end
end

Next Steps

Defined Under Namespace

Modules: V1
Classes: Credentials

Constant Summary

FILE_DIR =
  File.realdirpath(Pathname.new(__FILE__).join("..").join("data_transfer"))

AVAILABLE_VERSIONS =
  Dir["#{FILE_DIR}/*"]
    .select { |file| File.directory?(file) }
    .select { |dir| Google::Gax::VERSION_MATCHER.match(File.basename(dir)) }
    .select { |dir| File.exist?(dir + ".rb") }
    .map { |dir| File.basename(dir) }
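
For illustration (the output shown is what this beta gem would typically
yield, since only a v1 client ships with it):

require "google/cloud/bigquery/data_transfer"

# The API versions a client can be constructed for, derived from the files above.
Google::Cloud::Bigquery::DataTransfer::AVAILABLE_VERSIONS
# => ["v1"]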

Class Method Summary

Class Method Details

.new(version: , credentials: , scopes: , client_config: , timeout: ) ⇒ Object

The Google BigQuery Data Transfer Service API enables BigQuery users to configure the transfer of their data from other Google products into BigQuery. This service contains the methods that are exposed to end users; it backs the frontend.

Parameters:

  • credentials (Google::Auth::Credentials, String, Hash, GRPC::Core::Channel, GRPC::Core::ChannelCredentials, Proc)

    Provides the means for authenticating requests made by the client. This parameter can be one of several types:

      • Google::Auth::Credentials: uses the properties of its represented keyfile to authenticate requests made by this client.
      • String: treated as the path to a keyfile from which credentials for this client are constructed.
      • Hash: treated as the contents of a keyfile from which credentials for this client are constructed.
      • GRPC::Core::Channel: used directly to make calls through.
      • GRPC::Core::ChannelCredentials: used to set up the RPC client. The channel credentials should already be composed with a GRPC::Core::CallCredentials object.
      • Proc: used as an updater_proc for the gRPC channel. The proc transforms the request metadata, generally to supply OAuth credentials.

  • scopes (Array<String>)

    The OAuth scopes for this service. This parameter is ignored if an updater_proc is supplied.

  • client_config (Hash)

    A Hash for call options for each method. See Google::Gax#construct_settings for the structure of this data. Falls back to the default config if not specified or the specified config is missing data points.

  • timeout (Numeric)

    The default timeout, in seconds, for calls made through this client.

  • version (Symbol, String)

    The major version of the service to be used. By default :v1 is used.
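
As a brief sketch based on the parameters above (the keyfile path and timeout
value are placeholders, not defaults from the library), a client could be
constructed as:

require "google/cloud/bigquery/data_transfer"

# Construct a v1 client, authenticating with a keyfile path and using a
# 30-second default timeout for calls made through the client.
client = Google::Cloud::Bigquery::DataTransfer.new(
  version:     :v1,
  credentials: "/path/to/keyfile.json",
  timeout:     30
)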



# File 'lib/google/cloud/bigquery/data_transfer.rb', line 116

def self.new(*args, version: :v1, **kwargs)
  unless AVAILABLE_VERSIONS.include?(version.to_s.downcase)
    raise "The version: #{version} is not available. The available versions " \
      "are: [#{AVAILABLE_VERSIONS.join(", ")}]"
  end

  require "#{FILE_DIR}/#{version.to_s.downcase}"
  version_module = Google::Cloud::Bigquery::DataTransfer
    .constants
    .select {|sym| sym.to_s.downcase == version.to_s.downcase}
    .first
  Google::Cloud::Bigquery::DataTransfer.const_get(version_module).new(*args, **kwargs)
end
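
For example (an illustrative call, not taken from the library's documentation),
requesting a version that is not in AVAILABLE_VERSIONS raises with the message
built above:

# Assuming only the v1 client is installed, asking for :v2 fails fast.
Google::Cloud::Bigquery::DataTransfer.new(version: :v2)
# => RuntimeError: The version: v2 is not available. The available versions are: [v1]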