google_api_dataproc v0.10.1 GoogleApi.Dataproc.V1.Model.OrderedJob View Source
A job executed by the workflow.
Attributes
- `hadoopJob` (*type:* `GoogleApi.Dataproc.V1.Model.HadoopJob.t`, *default:* `nil`) - Job is a Hadoop job.
- `hiveJob` (*type:* `GoogleApi.Dataproc.V1.Model.HiveJob.t`, *default:* `nil`) - Job is a Hive job.
- `labels` (*type:* `map()`, *default:* `nil`) - Optional. The labels to associate with this job. Label keys must be between 1 and 63 characters long, and must conform to the regular expression `[\p{Ll}\p{Lo}][\p{Ll}\p{Lo}\p{N}_-]{0,62}`. Label values must be between 1 and 63 characters long, and must conform to the regular expression `[\p{Ll}\p{Lo}\p{N}_-]{0,63}`. No more than 32 labels can be associated with a given job.
- `pigJob` (*type:* `GoogleApi.Dataproc.V1.Model.PigJob.t`, *default:* `nil`) - Job is a Pig job.
- `prerequisiteStepIds` (*type:* `list(String.t)`, *default:* `nil`) - Optional. The optional list of prerequisite job step_ids. If not specified, the job will start at the beginning of the workflow.
- `pysparkJob` (*type:* `GoogleApi.Dataproc.V1.Model.PySparkJob.t`, *default:* `nil`) - Job is a PySpark job.
- `scheduling` (*type:* `GoogleApi.Dataproc.V1.Model.JobScheduling.t`, *default:* `nil`) - Optional. Job scheduling configuration.
- `sparkJob` (*type:* `GoogleApi.Dataproc.V1.Model.SparkJob.t`, *default:* `nil`) - Job is a Spark job.
- `sparkSqlJob` (*type:* `GoogleApi.Dataproc.V1.Model.SparkSqlJob.t`, *default:* `nil`) - Job is a SparkSql job.
- `stepId` (*type:* `String.t`, *default:* `nil`) - Required. The step id. The id must be unique among all jobs within the template. The step id is used as a prefix for the job id, as the job `goog-dataproc-workflow-step-id` label, and in the `prerequisiteStepIds` field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). It cannot begin or end with an underscore or hyphen, and must consist of between 3 and 50 characters.
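As an illustration of how these fields fit together, a minimal two-step ordering might be built as below. This sketch is not from the reference page itself; the step ids, class names, and label values are hypothetical, and it assumes the `SparkJob` model's `mainClass` field.

```elixir
# Hypothetical two-step workflow ordering: "prep" runs first, then
# "analysis", which declares "prep" in its prerequisiteStepIds.
alias GoogleApi.Dataproc.V1.Model.{OrderedJob, SparkJob}

prep = %OrderedJob{
  stepId: "prep",
  sparkJob: %SparkJob{mainClass: "com.example.Prep"}
}

analysis = %OrderedJob{
  stepId: "analysis",
  # This step starts only after the "prep" step completes.
  prerequisiteStepIds: ["prep"],
  sparkJob: %SparkJob{mainClass: "com.example.Analysis"},
  # At most 32 labels; keys/values must match the regexes above.
  labels: %{"team" => "data"}
}
```

Exactly one of the job fields (`hadoopJob`, `sparkJob`, `pysparkJob`, etc.) should be set per step, since each `OrderedJob` represents a single job of one type.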
Link to this section Summary
Functions
Unwrap a decoded JSON object into its complex fields.
Link to this section Types
Link to this type
t()
View Source
t() :: %GoogleApi.Dataproc.V1.Model.OrderedJob{
hadoopJob: GoogleApi.Dataproc.V1.Model.HadoopJob.t(),
hiveJob: GoogleApi.Dataproc.V1.Model.HiveJob.t(),
labels: map(),
pigJob: GoogleApi.Dataproc.V1.Model.PigJob.t(),
prerequisiteStepIds: [String.t()],
pysparkJob: GoogleApi.Dataproc.V1.Model.PySparkJob.t(),
scheduling: GoogleApi.Dataproc.V1.Model.JobScheduling.t(),
sparkJob: GoogleApi.Dataproc.V1.Model.SparkJob.t(),
sparkSqlJob: GoogleApi.Dataproc.V1.Model.SparkSqlJob.t(),
stepId: String.t()
}
Link to this section Functions
Link to this function
decode(value, options) View Source
Unwrap a decoded JSON object into its complex fields.
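As a sketch of how this is typically used (assuming the usual Poison integration of these generated model modules, in which `decode/2` is invoked through the `Poison.Decoder` protocol rather than called directly):

```elixir
# Hypothetical JSON payload for one workflow step; field values are
# illustrative only.
json = ~s({
  "stepId": "teragen",
  "hadoopJob": {"mainJarFileUri": "file:///usr/lib/hadoop/examples.jar"}
})

# Poison.decode!/2 with the :as option builds the struct; the module's
# decode/2 then unwraps nested complex fields (here, hadoopJob) into
# their own model structs instead of leaving them as plain maps.
job = Poison.decode!(json, as: %GoogleApi.Dataproc.V1.Model.OrderedJob{})
```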