Gobblin Schedulers
Oozie is a very popular scheduler for the Hadoop environment. It allows users to define complex workflows using XML files. A workflow can be composed of a series of actions, such as Java jobs, Pig jobs, Spark jobs, etc. Gobblin has two integration points with Oozie: it can be run as a stand-alone Java process via Oozie's `<java>` tag, or it can be run as a MapReduce job via Oozie.
The following guides assume Oozie is already set up and running on some machine; if this is not the case, consult the Oozie documentation for getting everything set up.
This guide focuses on getting Gobblin to run as a stand-alone Java process, which means it will not launch a separate MR job to distribute its workload. It is important to understand how the current version of Oozie launches a Java process: it first starts a MapReduce job and runs Gobblin as a Java process inside a single map task. The Gobblin job then ingests all the data it is configured to pull and shuts down.
`gobblin-oozie/src/main/resources/` contains sample configuration files for launching Gobblin via Oozie. There are a number of important files in this directory:
`gobblin-oozie-example-system.properties` contains default system-level properties for Gobblin. When launched with Oozie, Gobblin will run inside a map task; it is thus recommended to configure Gobblin to write directly to HDFS rather than the local file system. The property `fs.uri` in this file should be changed to point to the NameNode of the Hadoop file system the job should write to. By default, all data is written under a folder called `gobblin-out`; to change this, modify the `gobblin.work.dir` parameter in this file.
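As a rough sketch, the relevant entries in this file might look like the following; the NameNode address and work directory below are placeholders, not values shipped with Gobblin, and should be replaced with settings from your own cluster:

```properties
# Write directly to HDFS rather than the local file system
# (replace namenode-host:8020 with your NameNode address)
fs.uri=hdfs://namenode-host:8020

# Root directory under which Gobblin writes its data (defaults to gobblin-out)
gobblin.work.dir=/user/gobblin/gobblin-out
```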
`gobblin-oozie-example-workflow.properties` contains default Oozie properties for any job launched. It is also the entry point for launching an Oozie job (e.g. to launch an Oozie job from the command line, you execute `oozie job -config gobblin-oozie-example-workflow.properties -run`). In this file one needs to update `name.node` and `resource.manager` to the values specific to their environment. Another important property in this file is `oozie.wf.application.path`; it points to a folder on HDFS that contains any workflows to be run. It is important to note that the `workflow.xml` files must be on HDFS in order for Oozie to pick them up (this is because Oozie typically runs on a separate machine from any client process).
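The environment-specific entries in this file might look roughly like the sketch below; the hostnames, ports, and HDFS path are placeholders for your own cluster, not defaults from the example file:

```properties
# Replace with the NameNode and ResourceManager of your cluster
name.node=hdfs://namenode-host:8020
resource.manager=resourcemanager-host:8032

# HDFS directory that holds workflow.xml (and its lib/ folder)
oozie.wf.application.path=hdfs://namenode-host:8020/user/gobblin/gobblin-oozie-example
```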
`gobblin-oozie-example-workflow.xml` contains an example Oozie workflow. This example simply launches a Java process that invokes the main method of the `CliLocalJobLauncher` class. The main method of this class expects two file paths to be passed to it (once again, these files need to be on HDFS). The `jobconfig` arg should point to a file on HDFS containing all job configuration parameters. An example `jobconfig` file can be found here. The `sysconfig` arg should point to a file on HDFS containing all system configuration parameters. An example `sysconfig` file for Oozie can be found here.
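The heart of such a workflow is a single `<java>` action along the lines of the sketch below. The fully-qualified class name, the argument syntax, and the HDFS paths are assumptions for illustration; defer to the actual `gobblin-oozie-example-workflow.xml` shipped with your Gobblin version:

```xml
<workflow-app name="gobblin-example-wf" xmlns="uri:oozie:workflow:0.4">
  <start to="gobblin-java-node"/>
  <action name="gobblin-java-node">
    <java>
      <job-tracker>${resource.manager}</job-tracker>
      <name-node>${name.node}</name-node>
      <!-- Class name and argument flags are assumptions; check the shipped example -->
      <main-class>gobblin.runtime.local.CliLocalJobLauncher</main-class>
      <arg>-sysconfig</arg>
      <arg>${name.node}/user/gobblin/conf/gobblin-oozie-example-system.properties</arg>
      <arg>-jobconfig</arg>
      <arg>${name.node}/user/gobblin/conf/example-job.pull</arg>
    </java>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Gobblin job failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```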
Oozie only reads a job properties file from the local file system (e.g. `gobblin-oozie-example-workflow.properties`); it expects all other configuration and dependent files to be uploaded to HDFS. Specifically, it looks for these files under the directory specified by `oozie.wf.application.path`. Make sure this is the case before trying to launch an Oozie job.
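For example, assuming `oozie.wf.application.path` points at `/user/gobblin/gobblin-oozie-example` (a hypothetical path), the workflow and configuration files could be staged with something like:

```bash
# Hypothetical application path; use whatever oozie.wf.application.path is set to
hdfs dfs -mkdir -p /user/gobblin/gobblin-oozie-example
hdfs dfs -put gobblin-oozie-example-workflow.xml /user/gobblin/gobblin-oozie-example/workflow.xml

# Hypothetical location for the sysconfig and jobconfig files referenced by the workflow
hdfs dfs -mkdir -p /user/gobblin/conf
hdfs dfs -put gobblin-oozie-example-system.properties example-job.pull /user/gobblin/conf/
```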
Gobblin has a number of `jar` dependencies that need to be available when launching a Gobblin job. These dependencies can be taken from the `gobblin-dist.tar.gz` file that is created after building Gobblin. The tarball contains a `lib` folder with the necessary dependencies. This folder should be placed into a `lib` folder under the same directory specified by `oozie.wf.application.path` in the `gobblin-oozie-example-workflow.properties` file.
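A minimal sketch of staging those dependencies, assuming the same hypothetical application path as above:

```bash
# Unpack the distribution built from the Gobblin source tree
tar -xzf gobblin-dist.tar.gz

# Copy its lib folder under the Oozie application path on HDFS
hdfs dfs -mkdir -p /user/gobblin/gobblin-oozie-example/lib
hdfs dfs -put gobblin-dist/lib/*.jar /user/gobblin/gobblin-oozie-example/lib/
```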
Assuming one has the Oozie CLI installed, the job can be launched using the following command: `oozie job -config gobblin-oozie-example-workflow.properties -run`.
Once the job has been launched, its status can be queried via the following command: `oozie job -info <oozie-job-id>`, and the logs can be shown via the following command: `oozie job -log <oozie-job-id>`.
In order to see the standard output of Gobblin, one needs to check the logs of the map task running the Gobblin process. `oozie job -info` should show the Hadoop `job_id` of the Hadoop job launched to run the Gobblin process. Using this id, one should be able to find the logs of the map tasks through the UI or other command line tools (e.g. `yarn logs`).
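Putting those steps together, a typical sequence for finding the Gobblin output might look like the following; the job and application ids below are placeholders:

```bash
# Find the external (Hadoop) job id of the map task running Gobblin
oozie job -info <oozie-job-id>

# Fetch the container logs for that Hadoop job via YARN
# (the application id is the job id with the "job_" prefix replaced by "application_")
yarn logs -applicationId application_1234567890123_0001
```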