Class: Google::Cloud::Bigquery::Table
- Inherits: Object
- Defined in:
lib/google/cloud/bigquery/table.rb,
lib/google/cloud/bigquery/table/list.rb,
lib/google/cloud/bigquery/table/async_inserter.rb
Overview
Table
A named resource representing a BigQuery table that holds zero or more records. Every table is defined by a schema that may contain nested and repeated fields.
Direct Known Subclasses: Updater
Defined Under Namespace
Classes: AsyncInserter, List, Updater
Attributes
- #api_url ⇒ String
  A URL that can be used to access the table using the REST API.
- #buffer_bytes ⇒ Integer
  A lower-bound estimate of the number of bytes currently in this table's streaming buffer, if one is present.
- #buffer_oldest_at ⇒ Time?
  The time of the oldest entry currently in this table's streaming buffer, if one is present.
- #buffer_rows ⇒ Integer
  A lower-bound estimate of the number of rows currently in this table's streaming buffer, if one is present.
- #created_at ⇒ Time?
  The time when this table was created.
- #dataset_id ⇒ String
  The ID of the Dataset containing this table.
- #description ⇒ String
  A user-friendly description of the table.
- #description=(new_description) ⇒ Object
  Updates the user-friendly description of the table.
- #etag ⇒ String
  The ETag hash of the table.
- #expires_at ⇒ Time?
  The time when this table expires.
- #external ⇒ External::DataSource
  The External::DataSource (or subclass) object that represents the external data source backing the table.
- #external=(external) ⇒ Object
  Sets the External::DataSource (or subclass) object that represents the external data source backing the table.
- #external? ⇒ Boolean
  Checks if the table's type is "EXTERNAL".
- #fields ⇒ Array<Schema::Field>
  The fields of the table, obtained from its schema.
- #headers ⇒ Array<Symbol>
  The names of the columns in the table, obtained from its schema.
- #id ⇒ String
  The combined Project ID, Dataset ID, and Table ID for this table, in the format specified by the Legacy SQL Query Reference: project_name:datasetId.tableId.
- #labels ⇒ Hash<String, String>
  A hash of user-provided labels associated with this table.
- #labels=(labels) ⇒ Object
  Updates the hash of user-provided labels associated with this table.
- #location ⇒ String
  The geographic location where the table should reside.
- #modified_at ⇒ Time?
  The time when this table was last modified.
- #name ⇒ String
  The name of the table.
- #name=(new_name) ⇒ Object
  Updates the name of the table.
- #project_id ⇒ String
  The ID of the Project containing this table.
- #query_id(standard_sql: nil, legacy_sql: nil) ⇒ String
  The value returned by #id, formatted for use in legacy SQL or standard SQL queries as specified by the Query Reference.
- #schema(replace: false) {|schema| ... } ⇒ Google::Cloud::Bigquery::Schema
  Returns the table's schema.
- #table? ⇒ Boolean
  Checks if the table's type is "TABLE".
- #table_id ⇒ String
  A unique ID for this table.
- #time_partitioning? ⇒ Boolean
  Checks if the table is time-partitioned.
- #time_partitioning_expiration ⇒ Integer?
  The expiration for the table partitions, if any, in seconds.
- #time_partitioning_expiration=(expiration) ⇒ Object
  Sets the partition expiration for the table.
- #time_partitioning_type ⇒ String?
  The period for which the table is partitioned, if any.
- #time_partitioning_type=(type) ⇒ Object
  Sets the partitioning for the table.
- #view? ⇒ Boolean
  Checks if the table's type is "VIEW".
Data
- #bytes_count ⇒ Integer
  The number of bytes in the table.
- #copy(destination_table, create: nil, write: nil) ⇒ Boolean
  Copies the data from the table to another table using a synchronous method that blocks for a response.
- #copy_job(destination_table, create: nil, write: nil, dryrun: nil, job_id: nil, prefix: nil, labels: nil) ⇒ Google::Cloud::Bigquery::CopyJob
  Copies the data from the table to another table using an asynchronous method.
- #data(token: nil, max: nil, start: nil) ⇒ Google::Cloud::Bigquery::Data
  Retrieves data from the table.
- #extract(extract_url, format: nil, compression: nil, delimiter: nil, header: nil) ⇒ Boolean
  Extracts the data from the table to a Google Cloud Storage file using a synchronous method that blocks for a response.
- #extract_job(extract_url, format: nil, compression: nil, delimiter: nil, header: nil, dryrun: nil, job_id: nil, prefix: nil, labels: nil) ⇒ Google::Cloud::Bigquery::ExtractJob
  Extracts the data from the table to a Google Cloud Storage file using an asynchronous method.
- #insert(rows, skip_invalid: nil, ignore_unknown: nil) ⇒ Google::Cloud::Bigquery::InsertResponse
  Inserts data into the table for near-immediate querying, without the need to complete a load operation before the data can appear in query results.
- #insert_async(skip_invalid: nil, ignore_unknown: nil, max_bytes: 10000000, max_rows: 500, interval: 10, threads: 4) {|response| ... } ⇒ Table::AsyncInserter
  Creates an asynchronous inserter object used to insert rows in batches.
- #load(file, format: nil, create: nil, write: nil, projection_fields: nil, jagged_rows: nil, quoted_newlines: nil, encoding: nil, delimiter: nil, ignore_unknown: nil, max_bad_records: nil, quote: nil, skip_leading: nil, autodetect: nil, null_marker: nil) ⇒ Boolean
  Loads data into the table using a synchronous method that blocks for a response.
- #load_job(file, format: nil, create: nil, write: nil, projection_fields: nil, jagged_rows: nil, quoted_newlines: nil, encoding: nil, delimiter: nil, ignore_unknown: nil, max_bad_records: nil, quote: nil, skip_leading: nil, dryrun: nil, job_id: nil, prefix: nil, labels: nil, autodetect: nil, null_marker: nil) ⇒ Google::Cloud::Bigquery::LoadJob
  Loads data into the table using an asynchronous method.
- #rows_count ⇒ Integer
  The number of rows in the table.
Lifecycle
- #delete ⇒ Boolean
  Permanently deletes the table.
- #reload! ⇒ Object (also: #refresh!)
  Reloads the table with current data from the BigQuery service.
Instance Method Details
#api_url ⇒ String
A URL that can be used to access the table using the REST API.
# File 'lib/google/cloud/bigquery/table.rb', line 325

def api_url
  ensure_full_data!
  @gapi.self_link
end
#buffer_bytes ⇒ Integer
A lower-bound estimate of the number of bytes currently in this table's streaming buffer, if one is present. This field will be absent if the table is not being streamed to or if there is no data in the streaming buffer.
# File 'lib/google/cloud/bigquery/table.rb', line 686

def buffer_bytes
  ensure_full_data!
  @gapi.streaming_buffer.estimated_bytes if @gapi.streaming_buffer
end
#buffer_oldest_at ⇒ Time?
The time of the oldest entry currently in this table's streaming buffer, if one is present. This field will be absent if the table is not being streamed to or if there is no data in the streaming buffer.
# File 'lib/google/cloud/bigquery/table.rb', line 715

def buffer_oldest_at
  ensure_full_data!
  return nil unless @gapi.streaming_buffer
  oldest_entry_time = @gapi.streaming_buffer.oldest_entry_time
  begin
    ::Time.at(Integer(oldest_entry_time) / 1000.0)
  rescue
    nil
  end
end
#buffer_rows ⇒ Integer
A lower-bound estimate of the number of rows currently in this table's streaming buffer, if one is present. This field will be absent if the table is not being streamed to or if there is no data in the streaming buffer.
# File 'lib/google/cloud/bigquery/table.rb', line 701

def buffer_rows
  ensure_full_data!
  @gapi.streaming_buffer.estimated_rows if @gapi.streaming_buffer
end
#bytes_count ⇒ Integer
The number of bytes in the table.
# File 'lib/google/cloud/bigquery/table.rb', line 361

def bytes_count
  ensure_full_data!
  begin
    Integer @gapi.num_bytes
  rescue
    nil
  end
end
#copy(destination_table, create: nil, write: nil) ⇒ Boolean
Copies the data from the table to another table using a synchronous method that blocks for a response. Timeouts and transient errors are generally handled as needed to complete the job. See also #copy_job.
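A minimal usage sketch; the client setup follows the gem's standard pattern, and the dataset and table names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"
destination_table = dataset.table "my_destination_table"

# blocks until the underlying copy job completes; returns true on success
table.copy destination_table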
# File 'lib/google/cloud/bigquery/table.rb', line 913

def copy destination_table, create: nil, write: nil
  job = copy_job destination_table, create: create, write: write
  job.wait_until_done!

  if job.failed?
    begin
      # raise to activate ruby exception cause handling
      fail job.gapi_error
    rescue => e
      # wrap Google::Apis::Error with Google::Cloud::Error
      raise Google::Cloud::Error.from_error(e)
    end
  end

  true
end
#copy_job(destination_table, create: nil, write: nil, dryrun: nil, job_id: nil, prefix: nil, labels: nil) ⇒ Google::Cloud::Bigquery::CopyJob
Copies the data from the table to another table using an asynchronous method. In this method, a CopyJob is immediately returned. The caller may poll the service by repeatedly calling Job#reload! and Job#done? to detect when the job is done, or simply block until the job is done by calling Job#wait_until_done!. See also #copy.
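A sketch of the asynchronous variant; names are placeholders as above:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"
destination_table = dataset.table "my_destination_table"

copy_job = table.copy_job destination_table

# block until the job finishes, then check the outcome
copy_job.wait_until_done!
puts "copy failed" if copy_job.failed?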
# File 'lib/google/cloud/bigquery/table.rb', line 849

def copy_job destination_table, create: nil, write: nil, dryrun: nil,
             job_id: nil, prefix: nil, labels: nil
  ensure_service!
  options = { create: create, write: write, dryrun: dryrun,
              job_id: job_id, prefix: prefix, labels: labels }
  gapi = service.copy_table table_ref,
                            get_table_ref(destination_table),
                            options
  Job.from_gapi gapi, service
end
#created_at ⇒ Time?
The time when this table was created.
# File 'lib/google/cloud/bigquery/table.rb', line 393

def created_at
  ensure_full_data!
  begin
    ::Time.at(Integer(@gapi.creation_time) / 1000.0)
  rescue
    nil
  end
end
#data(token: nil, max: nil, start: nil) ⇒ Google::Cloud::Bigquery::Data
Retrieves data from the table.
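A minimal sketch of paging through rows; names are placeholders, and rows are yielded as hashes keyed by column name:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

data = table.data
data.each do |row|
  puts row[:first_name]
end

# request the next page, if any, using the returned token
more_data = table.data token: data.token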
# File 'lib/google/cloud/bigquery/table.rb', line 766

def data token: nil, max: nil, start: nil
  ensure_service!
  options = { token: token, max: max, start: start }
  data_gapi = service.list_tabledata dataset_id, table_id, options
  Data.from_gapi data_gapi, gapi, service
end
#dataset_id ⇒ String
The ID of the Dataset
containing this table.
105 106 107 |
# File 'lib/google/cloud/bigquery/table.rb', line 105 def dataset_id @gapi.table_reference.dataset_id end |
#delete ⇒ Boolean
Permanently deletes the table.
# File 'lib/google/cloud/bigquery/table.rb', line 1523

def delete
  ensure_service!
  service.delete_table dataset_id, table_id
  true
end
#description ⇒ String
A user-friendly description of the table.
# File 'lib/google/cloud/bigquery/table.rb', line 337

def description
  ensure_full_data!
  @gapi.description
end
#description=(new_description) ⇒ Object
Updates the user-friendly description of the table.
# File 'lib/google/cloud/bigquery/table.rb', line 349

def description= new_description
  @gapi.update! description: new_description
  patch_gapi! :description
end
#etag ⇒ String
The ETag hash of the table.
# File 'lib/google/cloud/bigquery/table.rb', line 313

def etag
  ensure_full_data!
  @gapi.etag
end
#expires_at ⇒ Time?
The time when this table expires. If not present, the table will persist indefinitely. Expired tables will be deleted and their storage reclaimed.
# File 'lib/google/cloud/bigquery/table.rb', line 411

def expires_at
  ensure_full_data!
  begin
    ::Time.at(Integer(@gapi.expiration_time) / 1000.0)
  rescue
    nil
  end
end
#external ⇒ External::DataSource
The External::DataSource (or subclass) object that represents the external data source backing the table. Data can be queried from the table even though the data is not stored in BigQuery. Instead of loading or streaming the data, this object references the external data source.
Present only if the table represents an External Data Source. See #external? and External::DataSource.
# File 'lib/google/cloud/bigquery/table.rb', line 649

def external
  return nil if @gapi.external_data_configuration.nil?
  External.from_gapi(@gapi.external_data_configuration).freeze
end
#external=(external) ⇒ Object
Sets the External::DataSource (or subclass) object that represents the external data source backing the table. Data can be queried from the table even though the data is not stored in BigQuery. Instead of loading or streaming the data, this object references the external data source.
Use only if the table represents an External Data Source. See #external? and External::DataSource.
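A sketch of attaching a CSV configuration built with the project-level external helper; the bucket path and table names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_external_table"

csv_url = "gs://bucket/path/to/data.csv"
csv_table = bigquery.external csv_url do |csv|
  csv.autodetect = true
  csv.skip_leading_rows = 1
end

table.external = csv_table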
# File 'lib/google/cloud/bigquery/table.rb', line 671

def external= external
  @gapi.external_data_configuration = external.to_gapi
  patch_gapi! :external_data_configuration
end
#external? ⇒ Boolean
Checks if the table's type is "EXTERNAL".
# File 'lib/google/cloud/bigquery/table.rb', line 466

def external?
  @gapi.type == "EXTERNAL"
end
#extract(extract_url, format: nil, compression: nil, delimiter: nil, header: nil) ⇒ Boolean
Extracts the data from the table to a Google Cloud Storage file using a synchronous method that blocks for a response. Timeouts and transient errors are generally handled as needed to complete the job. See also #extract_job.
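For example; the bucket and table names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

# blocks until the extract job completes; returns true on success
table.extract "gs://my-bucket/file-name.json", format: "json"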
# File 'lib/google/cloud/bigquery/table.rb', line 1050

def extract extract_url, format: nil, compression: nil,
            delimiter: nil, header: nil
  job = extract_job extract_url, format: format,
                    compression: compression,
                    delimiter: delimiter, header: header
  job.wait_until_done!

  if job.failed?
    begin
      # raise to activate ruby exception cause handling
      fail job.gapi_error
    rescue => e
      # wrap Google::Apis::Error with Google::Cloud::Error
      raise Google::Cloud::Error.from_error(e)
    end
  end

  true
end
#extract_job(extract_url, format: nil, compression: nil, delimiter: nil, header: nil, dryrun: nil, job_id: nil, prefix: nil, labels: nil) ⇒ Google::Cloud::Bigquery::ExtractJob
Extracts the data from the table to a Google Cloud Storage file using an asynchronous method. In this method, an ExtractJob is immediately returned. The caller may poll the service by repeatedly calling Job#reload! and Job#done? to detect when the job is done, or simply block until the job is done by calling Job#wait_until_done!. See also #extract.
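The asynchronous variant, sketched with the same placeholder names:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

extract_job = table.extract_job "gs://my-bucket/file-name.json",
                                format: "json"
extract_job.wait_until_done!
puts "extract failed" if extract_job.failed?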
# File 'lib/google/cloud/bigquery/table.rb', line 997

def extract_job extract_url, format: nil, compression: nil,
                delimiter: nil, header: nil, dryrun: nil,
                job_id: nil, prefix: nil, labels: nil
  ensure_service!
  options = { format: format, compression: compression,
              delimiter: delimiter, header: header, dryrun: dryrun,
              job_id: job_id, prefix: prefix, labels: labels }
  gapi = service.extract_table table_ref, extract_url, options
  Job.from_gapi gapi, service
end
#fields ⇒ Array<Schema::Field>
The fields of the table, obtained from its schema.
# File 'lib/google/cloud/bigquery/table.rb', line 606

def fields
  schema.fields
end
#headers ⇒ Array<Symbol>
The names of the columns in the table, obtained from its schema.
# File 'lib/google/cloud/bigquery/table.rb', line 628

def headers
  schema.headers
end
#id ⇒ String
The combined Project ID, Dataset ID, and Table ID for this table, in the format specified by the Legacy SQL Query Reference: project_name:datasetId.tableId. To use this value in queries see #query_id.
# File 'lib/google/cloud/bigquery/table.rb', line 242

def id
  @gapi.id
end
#insert(rows, skip_invalid: nil, ignore_unknown: nil) ⇒ Google::Cloud::Bigquery::InsertResponse
Inserts data into the table for near-immediate querying, without the need to complete a load operation before the data can appear in query results.
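A minimal streaming-insert sketch; names and row values are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

rows = [
  { "first_name" => "Alice", "age" => 21 },
  { "first_name" => "Heidi", "age" => 36 }
]
response = table.insert rows
puts "inserted #{response.insert_count} rows" if response.success?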
# File 'lib/google/cloud/bigquery/table.rb', line 1443

def insert rows, skip_invalid: nil, ignore_unknown: nil
  rows = [rows] if rows.is_a? Hash
  fail ArgumentError, "No rows provided" if rows.empty?
  ensure_service!
  options = { skip_invalid: skip_invalid,
              ignore_unknown: ignore_unknown }
  gapi = service.insert_tabledata dataset_id, table_id, rows, options
  InsertResponse.from_gapi rows, gapi
end
#insert_async(skip_invalid: nil, ignore_unknown: nil, max_bytes: 10000000, max_rows: 500, interval: 10, threads: 4) {|response| ... } ⇒ Table::AsyncInserter
Creates an asynchronous inserter object used to insert rows in batches.
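A sketch of batched asynchronous inserts; names are placeholders, and the block is assumed to be called once per batch with the insert response:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

inserter = table.insert_async do |response|
  puts "inserted #{response.insert_count} rows " \
       "with #{response.error_count} errors"
end

inserter.insert [{ "first_name" => "Alice", "age" => 21 }]

# flush buffered rows and shut down the background threads
inserter.stop.wait!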
# File 'lib/google/cloud/bigquery/table.rb', line 1496

def insert_async skip_invalid: nil, ignore_unknown: nil,
                 max_bytes: 10000000, max_rows: 500, interval: 10,
                 threads: 4, &block
  ensure_service!

  AsyncInserter.new self, skip_invalid: skip_invalid,
                          ignore_unknown: ignore_unknown,
                          max_bytes: max_bytes, max_rows: max_rows,
                          interval: interval, threads: threads, &block
end
#labels ⇒ Hash<String, String>
A hash of user-provided labels associated with this table. Labels are used to organize and group tables. See Using Labels.
The returned hash is frozen and changes are not allowed. Use #labels= to replace the entire hash.
# File 'lib/google/cloud/bigquery/table.rb', line 505

def labels
  m = @gapi.labels
  m = m.to_h if m.respond_to? :to_h
  m.dup.freeze
end
#labels=(labels) ⇒ Object
Updates the hash of user-provided labels associated with this table. Labels are used to organize and group tables. See Using Labels.
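For example; the label key and value are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

table.labels = { "department" => "shipping" }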
# File 'lib/google/cloud/bigquery/table.rb', line 537

def labels= labels
  @gapi.labels = labels
  patch_gapi! :labels
end
#load(file, format: nil, create: nil, write: nil, projection_fields: nil, jagged_rows: nil, quoted_newlines: nil, encoding: nil, delimiter: nil, ignore_unknown: nil, max_bad_records: nil, quote: nil, skip_leading: nil, autodetect: nil, null_marker: nil) ⇒ Boolean
Loads data into the table using a synchronous method that blocks for a response, returning true once the load job completes. You can pass a Google Cloud Storage file path or a Google Cloud Storage File instance. Or, you can upload a file directly. See Loading Data with a POST Request. See also #load_job.
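Two minimal sketches, one loading from Google Cloud Storage and one uploading a local file; paths and names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

# load from a Cloud Storage URL; blocks until the job completes
table.load "gs://my-bucket/file-name.csv"

# or upload a local file directly
file = File.open "my_data.csv"
table.load file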
# File 'lib/google/cloud/bigquery/table.rb', line 1379

def load file, format: nil, create: nil, write: nil,
         projection_fields: nil, jagged_rows: nil,
         quoted_newlines: nil, encoding: nil, delimiter: nil,
         ignore_unknown: nil, max_bad_records: nil, quote: nil,
         skip_leading: nil, autodetect: nil, null_marker: nil
  job = load_job file, format: format, create: create, write: write,
                 projection_fields: projection_fields,
                 jagged_rows: jagged_rows,
                 quoted_newlines: quoted_newlines, encoding: encoding,
                 delimiter: delimiter, ignore_unknown: ignore_unknown,
                 max_bad_records: max_bad_records, quote: quote,
                 skip_leading: skip_leading, autodetect: autodetect,
                 null_marker: null_marker
  job.wait_until_done!

  if job.failed?
    begin
      # raise to activate ruby exception cause handling
      fail job.gapi_error
    rescue => e
      # wrap Google::Apis::Error with Google::Cloud::Error
      raise Google::Cloud::Error.from_error(e)
    end
  end

  true
end
#load_job(file, format: nil, create: nil, write: nil, projection_fields: nil, jagged_rows: nil, quoted_newlines: nil, encoding: nil, delimiter: nil, ignore_unknown: nil, max_bad_records: nil, quote: nil, skip_leading: nil, dryrun: nil, job_id: nil, prefix: nil, labels: nil, autodetect: nil, null_marker: nil) ⇒ Google::Cloud::Bigquery::LoadJob
Loads data into the table using an asynchronous method, returning a LoadJob immediately. You can pass a Google Cloud Storage file path or a Google Cloud Storage File instance. Or, you can upload a file directly. See Loading Data with a POST Request. See also #load.
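The asynchronous variant, sketched with the same placeholder names:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

load_job = table.load_job "gs://my-bucket/file-name.csv"

# poll or block for completion, then inspect the outcome
load_job.wait_until_done!
puts "load failed" if load_job.failed?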
# File 'lib/google/cloud/bigquery/table.rb', line 1225

def load_job file, format: nil, create: nil, write: nil,
             projection_fields: nil, jagged_rows: nil,
             quoted_newlines: nil, encoding: nil, delimiter: nil,
             ignore_unknown: nil, max_bad_records: nil, quote: nil,
             skip_leading: nil, dryrun: nil, job_id: nil, prefix: nil,
             labels: nil, autodetect: nil, null_marker: nil
  ensure_service!
  options = { format: format, create: create, write: write,
              projection_fields: projection_fields,
              jagged_rows: jagged_rows,
              quoted_newlines: quoted_newlines, encoding: encoding,
              delimiter: delimiter, ignore_unknown: ignore_unknown,
              max_bad_records: max_bad_records, quote: quote,
              skip_leading: skip_leading, dryrun: dryrun,
              job_id: job_id, prefix: prefix, labels: labels,
              autodetect: autodetect, null_marker: null_marker }
  return load_storage(file, options) if storage_url? file
  return load_local(file, options) if local_file? file
  fail Google::Cloud::Error, "Don't know how to load #{file}"
end
#location ⇒ String
The geographic location where the table should reside. Possible
values include EU
and US
. The default value is US
.
478 479 480 481 |
# File 'lib/google/cloud/bigquery/table.rb', line 478 def location ensure_full_data! @gapi.location end |
#modified_at ⇒ Time?
The time when this table was last modified.
# File 'lib/google/cloud/bigquery/table.rb', line 427

def modified_at
  ensure_full_data!
  begin
    ::Time.at(Integer(@gapi.last_modified_time) / 1000.0)
  rescue
    nil
  end
end
#name ⇒ String
The name of the table.
# File 'lib/google/cloud/bigquery/table.rb', line 290

def name
  @gapi.friendly_name
end
#name=(new_name) ⇒ Object
Updates the name of the table.
# File 'lib/google/cloud/bigquery/table.rb', line 301

def name= new_name
  @gapi.update! friendly_name: new_name
  patch_gapi! :friendly_name
end
#project_id ⇒ String
The ID of the Project
containing this table.
116 117 118 |
# File 'lib/google/cloud/bigquery/table.rb', line 116 def project_id @gapi.table_reference.project_id end |
#query_id(standard_sql: nil, legacy_sql: nil) ⇒ String
The value returned by #id, wrapped in square brackets for use in legacy SQL queries, or the fully qualified table name quoted with backticks for standard SQL, as specified by the Query Reference. Useful in queries.
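For example, interpolating the value into a standard SQL query; names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

data = bigquery.query "SELECT first_name FROM #{table.query_id}"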
# File 'lib/google/cloud/bigquery/table.rb', line 275

def query_id standard_sql: nil, legacy_sql: nil
  if Convert.resolve_legacy_sql standard_sql, legacy_sql
    "[#{id}]"
  else
    "`#{project_id}.#{dataset_id}.#{table_id}`"
  end
end
#reload! ⇒ Object Also known as: refresh!
Reloads the table with current data from the BigQuery service.
# File 'lib/google/cloud/bigquery/table.rb', line 1534

def reload!
  ensure_service!
  gapi = service.get_table dataset_id, table_id
  @gapi = gapi
end
#rows_count ⇒ Integer
The number of rows in the table.
# File 'lib/google/cloud/bigquery/table.rb', line 377

def rows_count
  ensure_full_data!
  begin
    Integer @gapi.num_rows
  rescue
    nil
  end
end
#schema(replace: false) {|schema| ... } ⇒ Google::Cloud::Bigquery::Schema
Returns the table's schema. This method can also be used to set, replace, or add to the schema by passing a block. See Schema for available methods.
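A sketch of adding fields via the block form; the field names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

table.schema do |schema|
  schema.string "first_name", mode: :required
  schema.record "cities_lived", mode: :repeated do |nested|
    nested.string "place", mode: :required
    nested.integer "number_of_years", mode: :required
  end
end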
# File 'lib/google/cloud/bigquery/table.rb', line 574

def schema replace: false
  ensure_full_data!
  schema_builder = Schema.from_gapi @gapi.schema
  if block_given?
    schema_builder = Schema.from_gapi if replace
    yield schema_builder
    if schema_builder.changed?
      @gapi.schema = schema_builder.to_gapi
      patch_gapi! :schema
    end
  end
  schema_builder.freeze
end
#table? ⇒ Boolean
Checks if the table's type is "TABLE".
# File 'lib/google/cloud/bigquery/table.rb', line 443

def table?
  @gapi.type == "TABLE"
end
#table_id ⇒ String
A unique ID for this table.
# File 'lib/google/cloud/bigquery/table.rb', line 93

def table_id
  @gapi.table_reference.table_id
end
#time_partitioning? ⇒ Boolean
Checks if the table is time-partitioned. See Partitioned Tables.
# File 'lib/google/cloud/bigquery/table.rb', line 138

def time_partitioning?
  !@gapi.time_partitioning.nil?
end
#time_partitioning_expiration ⇒ Integer?
The expiration for the table partitions, if any, in seconds. See Partitioned Tables.
# File 'lib/google/cloud/bigquery/table.rb', line 195

def time_partitioning_expiration
  ensure_full_data!
  @gapi.time_partitioning.expiration_ms / 1_000 if
    time_partitioning? && !@gapi.time_partitioning.expiration_ms.nil?
end
#time_partitioning_expiration=(expiration) ⇒ Object
Sets the partition expiration for the table. See Partitioned Tables. The table must also be partitioned.
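Since partitioning itself must be set at creation time, a sketch using the block form of Dataset#create_table; names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"

table = dataset.create_table "my_table" do |table|
  table.time_partitioning_type = "DAY"
  table.time_partitioning_expiration = 86_400  # seconds (one day)
end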
# File 'lib/google/cloud/bigquery/table.rb', line 224

def time_partitioning_expiration= expiration
  @gapi.time_partitioning ||=
    Google::Apis::BigqueryV2::TimePartitioning.new
  @gapi.time_partitioning.expiration_ms = expiration * 1000
  patch_gapi! :time_partitioning
end
#time_partitioning_type ⇒ String?
The period for which the table is partitioned, if any. See Partitioned Tables.
# File 'lib/google/cloud/bigquery/table.rb', line 151

def time_partitioning_type
  ensure_full_data!
  @gapi.time_partitioning.type if time_partitioning?
end
#time_partitioning_type=(type) ⇒ Object
Sets the partitioning for the table. See Partitioned Tables.
You can only set partitioning when creating a table as in the example below. BigQuery does not allow you to change partitioning on an existing table.
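A sketch using the block form of Dataset#create_table; names are placeholders:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"

table = dataset.create_table "my_table" do |table|
  table.time_partitioning_type = "DAY"
end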
# File 'lib/google/cloud/bigquery/table.rb', line 178

def time_partitioning_type= type
  @gapi.time_partitioning ||=
    Google::Apis::BigqueryV2::TimePartitioning.new
  @gapi.time_partitioning.type = type
  patch_gapi! :time_partitioning
end
#view? ⇒ Boolean
Checks if the table's type is "VIEW".
# File 'lib/google/cloud/bigquery/table.rb', line 454

def view?
  @gapi.type == "VIEW"
end