After a full day of fiddling, I finally discovered that Sqoop2 currently only supports MySQL-to-HDFS and HDFS-to-MySQL transfers, not Hive or HBase. Frustrating. Still, I'm writing down the Sqoop2 install steps here, so if support is added later I can get back up to speed quickly.
First, download Sqoop (version 1.99.6 here); I'll skip the unpacking details. The main work is configuration, starting with the environment variables:
export SQOOP_HOME=/home/hadoop/sqoop/sqoop-1.99.6-bin-hadoop200
export PATH=$SQOOP_HOME/bin:$PATH
export CATALINA_BASE=/home/hadoop/sqoop/sqoop-1.99.6-bin-hadoop200/server
export LOGDIR=$SQOOP_HOME/logs/
Next, edit sqoop-1.99.6-bin-hadoop200/server/conf/sqoop.properties and point the property org.apache.sqoop.submission.engine.mapreduce.configuration.directory at your Hadoop configuration directory:
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/home/hadoop/hadoop/hadoop-2.6.1/etc/hadoop
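If you script the setup, that property can be patched in place with sed. This is only a sketch: the paths match this install, and the demo below works on a scratch copy rather than the real server/conf/sqoop.properties.

```shell
# Demo on a scratch copy of sqoop.properties (the real file lives in server/conf).
conf=$(mktemp)
echo "org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/old/path" > "$conf"

# Point the MapReduce engine at the Hadoop config dir (path is this install's).
HADOOP_CONF=/home/hadoop/hadoop/hadoop-2.6.1/etc/hadoop
sed -i "s|^\(org.apache.sqoop.submission.engine.mapreduce.configuration.directory\)=.*|\1=$HADOOP_CONF|" "$conf"

grep mapreduce.configuration.directory "$conf"
rm -f "$conf"
```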
Next, edit sqoop-1.99.6-bin-hadoop200/server/conf/catalina.properties and set the property common.loader so the embedded Tomcat picks up the Hadoop jars:
common.loader=/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/common/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/common/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/hdfs/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/hdfs/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/mapreduce/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/mapreduce/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/tools/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/tools/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/yarn/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/yarn/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/httpfs/tomcat/lib/*.jar
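Rather than typing that value by hand, you can generate it from the Hadoop share directories. A small sketch; the HADOOP_HOME path matches this install and is an assumption for any other layout:

```shell
# Build the common.loader value from the Hadoop share subdirectories.
HADOOP_HOME=/home/hadoop/hadoop/hadoop-2.6.1
LOADER=""
for d in common common/lib hdfs hdfs/lib mapreduce mapreduce/lib \
         tools tools/lib yarn yarn/lib httpfs/tomcat/lib; do
  # Append each jar glob, comma-separated after the first entry.
  LOADER="${LOADER:+$LOADER,}$HADOOP_HOME/share/hadoop/$d/*.jar"
done
echo "common.loader=$LOADER"
```

Paste the echoed line into catalina.properties (or append it with `>>` after removing the old setting).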
Then download the MySQL JDBC driver jar and drop it into server/lib.
Common commands:
./sqoop.sh server start  - start the server
./sqoop.sh server stop  - stop the server
./sqoop.sh client  - enter the client shell
set server --host hadoopMaster --port 12000 --webapp sqoop  - point the client at the server; note hadoopMaster is the HDFS master hostname
show connector --all  - list connector types
create link --cid 1  - create a link, where cid is the connector type id
show link  - list links
update link -l 1  - modify the link with id 1
delete link -l 1  - delete the link with id 1
create job -f 1 -t 2  - create a job from link 1 to link 2
show job  - list jobs
update job -jid 1  - modify a job
delete job -jid 1  - delete a job
start job -jid 1  - run a job
status job -jid 1  - check a job's status
stop job -jid 1  - stop a job
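Putting those together, a typical MySQL-to-HDFS session looks roughly like the transcript below. The link and job ids are assumptions (use whatever ids `show connector` and `show link` report on your install), and each `create link` drops into interactive prompts for details such as the JDBC URL, username, password, and HDFS URI:

```
./sqoop.sh client
sqoop:000> set server --host hadoopMaster --port 12000 --webapp sqoop
sqoop:000> show connector --all      # note the JDBC and HDFS connector ids
sqoop:000> create link --cid 1       # answer the prompts: JDBC driver, URL, user, password
sqoop:000> create link --cid 2       # answer the prompts: HDFS URI
sqoop:000> create job -f 1 -t 2     # from link 1 (MySQL) to link 2 (HDFS)
sqoop:000> start job -jid 1
sqoop:000> status job -jid 1
```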
Logs are under server/logs.
Installation and usage tutorials for 1.99.7:
http://blog.csdn.net/u012842205/article/details/52344196
http://blog.csdn.net/u012842205/article/details/52346595