Download and build Azkaban 3.10.0; here we simply use the pre-built, pre-configured package Azkaban3.10.0.tar.gz.
Extract Azkaban3.10.0.tar.gz into the /home/hadoop/soft directory.
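For example, assuming the archive is in the current directory (rename the extracted directory to azkaban if it unpacks under a different name):
# extract into /home/hadoop/soft
tar -zxvf Azkaban3.10.0.tar.gz -C /home/hadoop/soft/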
Go into /home/hadoop/soft/azkaban/plugins/jobtypes/hive/.
In both plugin.properties and private.properties, set hive.aux.jars.path to the Hive lib directory:
hive.aux.jars.path=/home/hadoop/soft/hive2.1.1/lib
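One way to apply this to both files at once, assuming the hive.aux.jars.path key already exists in each (GNU sed shown):
cd /home/hadoop/soft/azkaban/plugins/jobtypes/hive/
# rewrite the existing key in place in both property files
sed -i 's|^hive.aux.jars.path=.*|hive.aux.jars.path=/home/hadoop/soft/hive2.1.1/lib|' plugin.properties private.properties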
Go into /home/hadoop/soft/azkaban/plugins/jobtypes/spark/.
In private.properties, comment out jobtype.classpath and point the application home directories at the installed software (a sed sketch for the commenting step follows these settings):
hadoop.home=/home/hadoop/soft/hadoop-2.6.0-cdh5.5.0
hive.home=/home/hadoop/soft/hive2.1.1
pig.home=/home/hadoop/soft/azkaban/plugins/jobtype/pig
spark.home=/home/hadoop/soft/spark-2.3.1-bin-hadoop2.6
azkaban.home=/home/hadoop/soft/azkaban
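A minimal way to do the commenting in place, assuming the jobtype.classpath key is present and not already commented (GNU sed shown):
cd /home/hadoop/soft/azkaban/plugins/jobtypes/spark/
# prefix the line with "#" so only the global classpath from commonprivate.properties applies
sed -i 's|^jobtype.classpath=|#jobtype.classpath=|' private.properties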
Edit private.properties, changing the application home directories to match the paths above:
hadoop.home=/home/hadoop/soft/hadoop-2.6.0-cdh5.5.0
hive.home=/home/hadoop/soft/hive2.1.1
jobtype.classpath=${hadoop.home}/etc/hadoop:${hadoop.home}/share/hadoop/common/*:${hadoop.home}/share/hadoop/common/lib/*:${hadoop.home}/share/hadoop/hdfs/*:${hadoop.home}/share/hadoop/hdfs/lib/*:${hadoop.home}/share/hadoop/yarn/*:${hadoop.home}/share/hadoop/yarn/lib/*:${hadoop.home}/share/hadoop/mapreduce/*:${hadoop.home}/share/hadoop/mapreduce/lib/*:${hive.home}/conf:${hive.home}/lib/*
Edit commonprivate.properties, setting the application root directories as above and then updating the classpaths:
hadoop.home=/home/hadoop/soft/hadoop-2.6.0-cdh5.5.0
hive.home=/home/hadoop/soft/hive2.1.1
pig.home=/home/hadoop/soft/azkaban/plugins/jobtype/pig
spark.home=/home/hadoop/soft/spark-2.3.1-bin-hadoop2.6
azkaban.home=/home/hadoop/soft/azkaban
jobtype.global.classpath=${hadoop.home}/etc/hadoop:${hadoop.home}/share/hadoop/common/*:${hadoop.home}/share/hadoop/common/lib/*:${hadoop.home}/share/hadoop/hdfs/*:${hadoop.home}/share/hadoop/hdfs/lib/*:${hadoop.home}/share/hadoop/yarn/*:${hadoop.home}/share/hadoop/yarn/lib/*:${hadoop.home}/share/hadoop/mapreduce/*:${hadoop.home}/share/hadoop/mapreduce/lib/*:${hive.home}/conf:${hive.home}/lib/*
hadoop.classpath=${hadoop.home}/etc/hadoop:${hadoop.home}/share/hadoop/common/*:${hadoop.home}/share/hadoop/common/lib/*:${hadoop.home}/share/hadoop/hdfs/*:${hadoop.home}/share/hadoop/hdfs/lib/*:${hadoop.home}/share/hadoop/yarn/*:${hadoop.home}/share/hadoop/yarn/lib/*:${hadoop.home}/share/hadoop/mapreduce/*:${hadoop.home}/share/hadoop/mapreduce/lib/*
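An optional sanity check that the directories these classpaths reference actually exist (a throwaway shell loop, not part of Azkaban; adjust the list to your layout):
for d in /home/hadoop/soft/hadoop-2.6.0-cdh5.5.0/etc/hadoop \
         /home/hadoop/soft/hive2.1.1/conf \
         /home/hadoop/soft/hive2.1.1/lib \
         /home/hadoop/soft/spark-2.3.1-bin-hadoop2.6; do
  [ -d "$d" ] || echo "missing directory: $d"
done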
Edit /home/hadoop/soft/azkaban/conf/azkaban.properties:
# Azkaban Personalization Settings
azkaban.name=azkaban name
azkaban.label=azkaban Scheduling Center
azkaban.color=#FF3601
azkaban.default.servlet.path=/index
web.resource.dir=web/
default.timezone.id=Asia/Shanghai
# Azkaban UserManager class
user.manager.class=azkaban.user.XmlUserManager
user.manager.xml.file=conf/azkaban-users.xml
# Loader for projects
executor.global.properties=conf/global.properties
azkaban.project.dir=projects
database.type=mysql
mysql.port=3306
mysql.host=127.0.0.1
mysql.user=azkaban
mysql.password=azkaban
mysql.numconnections=50
mysql.database=azkaban
azkaban.webserver.url=http://127.0.0.1:2020
#h2.path=./h2
#h2.create.tables=true
# Velocity dev mode
velocity.dev.mode=false
# Azkaban Jetty server properties.
jetty.use.ssl=false
jetty.maxThreads=25
jetty.port=2020
# Azkaban Executor settings
executor.port=12321
# mail settings
mail.sender=
mail.host=
job.failure.email=
job.success.email=
lockdown.create.projects=false
cache.directory=cache
# JMX stats
jetty.connector.stats=true
executor.connector.stats=true
# Azkaban plugin settings
azkaban.jobtype.plugin.dir=plugins/jobtypes
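After saving, an optional grep to double-check the values that matter most here:
grep -E '^(database\.type|mysql\.|jetty\.port|executor\.port|azkaban\.webserver\.url)' /home/hadoop/soft/azkaban/conf/azkaban.properties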
Edit /home/hadoop/soft/azkaban/bin/azkaban-solo-start.sh and add the Java binary to the PATH:
export PATH=/home/hadoop/soft/java/bin:$PATH
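Before starting, you can confirm that the JDK path added above is valid (the /home/hadoop/soft/java location is this guide's layout):
/home/hadoop/soft/java/bin/java -version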
Copy the two jars commons-configuration-1.6.jar and hadoop-common-2.6.0-cdh5.5.0.jar into /home/hadoop/soft/azkaban/lib (they are already bundled in this package).
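If you do need to copy them yourself, both jars normally ship with the Hadoop distribution; the source paths below assume the standard Hadoop tarball layout:
cp /home/hadoop/soft/hadoop-2.6.0-cdh5.5.0/share/hadoop/common/hadoop-common-2.6.0-cdh5.5.0.jar /home/hadoop/soft/azkaban/lib/
cp /home/hadoop/soft/hadoop-2.6.0-cdh5.5.0/share/hadoop/common/lib/commons-configuration-1.6.jar /home/hadoop/soft/azkaban/lib/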
Configure the Azkaban users in /home/hadoop/soft/azkaban/conf/azkaban-users.xml:
<azkaban-users>
  <user username="azkaban" password="azkaban" roles="admin" groups="azkaban" />
  <user username="metrics" password="metrics" roles="metrics"/>
  <role name="admin" permissions="ADMIN" />
  <role name="metrics" permissions="METRICS"/>
</azkaban-users>
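Optionally, confirm the file is well-formed XML (assumes xmllint from libxml2 is available; any XML validator works):
xmllint --noout /home/hadoop/soft/azkaban/conf/azkaban-users.xml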
First, create the azkaban database in MySQL and grant the azkaban user access:
create database azkaban;
GRANT ALL PRIVILEGES ON azkaban.* TO 'azkaban'@'127.0.0.1' IDENTIFIED BY 'azkaban' WITH GRANT OPTION;
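To verify the database and grant with the same credentials configured in azkaban.properties:
mysql -h 127.0.0.1 -P 3306 -u azkaban -pazkaban -e "SHOW DATABASES LIKE 'azkaban';"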
Start Azkaban from the Azkaban root directory (required, because the configuration above uses relative paths such as conf/ and web/): bin/azkaban-solo-start.sh
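A quick way to confirm the web server came up on the port configured above:
# expect an HTTP status code such as 200 once Jetty is listening on port 2020
curl -s -o /dev/null -w '%{http_code}\n' http://127.0.0.1:2020/index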