Class: Google::Cloud::Dataproc::V1::SparkJob
- Inherits: Object
- Defined in: lib/google/cloud/dataproc/v1/doc/google/cloud/dataproc/v1/jobs.rb
Overview
A Cloud Dataproc job for running Apache Spark applications on YARN.
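The snippet below is a minimal, illustrative sketch of building this message with the generated protobuf classes. The bucket paths, class name, and property values are placeholder assumptions rather than values taken from this page, and typically only one of #main_class or #main_jar_file_uri is set.

  require "google/cloud/dataproc"

  # Hypothetical Spark job: entry point named by class, with its jar on the CLASSPATH.
  spark_job = Google::Cloud::Dataproc::V1::SparkJob.new(
    main_class:    "org.example.SparkWordCount",              # placeholder driver class
    jar_file_uris: ["gs://my-bucket/jars/wordcount.jar"],      # placeholder jar URI
    args:          ["gs://my-bucket/input/", "gs://my-bucket/output/"],
    properties:    { "spark.executor.memory" => "4g" }         # Spark configuration override
  )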
Instance Attribute Summary
- #archive_uris ⇒ Array<String>: Optional.
- #args ⇒ Array<String>: Optional.
- #file_uris ⇒ Array<String>: Optional.
- #jar_file_uris ⇒ Array<String>: Optional.
- #logging_config ⇒ Google::Cloud::Dataproc::V1::LoggingConfig: Optional.
- #main_class ⇒ String: The name of the driver's main class.
- #main_jar_file_uri ⇒ String: The HCFS URI of the jar file that contains the main class.
- #properties ⇒ Hash{String => String}: Optional.
Instance Attribute Details
#archive_uris ⇒ Array<String>
Returns Optional. HCFS URIs of archives to be extracted in the working directory of Spark drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
#args ⇒ Array<String>
Returns Optional. The arguments to pass to the driver. Do not include arguments, such as +--conf+, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
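As a hedged illustration of that rule, a setting such as spark.executor.cores belongs in #properties rather than being passed through #args as a +--conf+ flag; the URIs and values below are placeholders.

  require "google/cloud/dataproc"

  # Avoid: args: ["--conf", "spark.executor.cores=2", ...]
  spark_job = Google::Cloud::Dataproc::V1::SparkJob.new(
    main_jar_file_uri: "gs://my-bucket/jars/job.jar",     # placeholder jar URI
    args:              ["gs://my-bucket/input/"],         # application arguments only
    properties:        { "spark.executor.cores" => "2" }  # configuration goes here
  )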
#file_uris ⇒ Array<String>
Returns Optional. HCFS URIs of files to be copied to the working directory of Spark drivers and distributed tasks. Useful for naively parallel tasks.
#jar_file_uris ⇒ Array<String>
Returns Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.
#logging_config ⇒ Google::Cloud::Dataproc::V1::LoggingConfig
Returns Optional. The runtime log config for job execution.
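For illustration, a LoggingConfig can be attached as below; the driver_log_levels field and the level symbols are assumptions drawn from the LoggingConfig message rather than details documented on this page.

  require "google/cloud/dataproc"

  # Hypothetical per-package driver log levels.
  logging = Google::Cloud::Dataproc::V1::LoggingConfig.new(
    driver_log_levels: { "root" => :INFO, "org.apache.spark" => :DEBUG }
  )
  spark_job = Google::Cloud::Dataproc::V1::SparkJob.new(
    main_jar_file_uri: "gs://my-bucket/jars/job.jar",  # placeholder jar URI
    logging_config:    logging
  )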
#main_class ⇒ String
Returns The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in +jar_file_uris+.
#main_jar_file_uri ⇒ String
Returns The HCFS URI of the jar file that contains the main class.
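#main_class and #main_jar_file_uri are the two ways to name the driver entry point. The sketch below, with a hypothetical class name and placeholder URIs, shows one job defined by class name plus #jar_file_uris next to another defined by the main jar alone.

  require "google/cloud/dataproc"

  # Entry point named by class; the jar containing it is listed explicitly.
  by_class = Google::Cloud::Dataproc::V1::SparkJob.new(
    main_class:    "org.example.Pipeline",                # placeholder class name
    jar_file_uris: ["gs://my-bucket/jars/pipeline.jar"]
  )

  # Entry point supplied as the jar that contains the main class.
  by_jar = Google::Cloud::Dataproc::V1::SparkJob.new(
    main_jar_file_uri: "gs://my-bucket/jars/pipeline.jar"
  )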
#properties ⇒ Hash{String => String}
Returns Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Cloud Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
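For context, a SparkJob (including its #properties map) is normally embedded in a Job message when work is submitted. The Job and JobPlacement field names below come from the wider V1 API rather than this page, the cluster and bucket names are placeholders, and note that every property value is a String.

  require "google/cloud/dataproc"

  # Hypothetical submission payload wrapping a SparkJob.
  job = Google::Cloud::Dataproc::V1::Job.new(
    placement: Google::Cloud::Dataproc::V1::JobPlacement.new(cluster_name: "my-cluster"),
    spark_job: Google::Cloud::Dataproc::V1::SparkJob.new(
      main_class:    "org.example.Pipeline",
      jar_file_uris: ["gs://my-bucket/jars/pipeline.jar"],
      properties:    { "spark.sql.shuffle.partitions" => "200" }  # values are Strings
    )
  )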