Building an HDP Sandbox Locally on a Mac with Docker


I wanted a working HDP environment (mainly for Spark), so I built one with Docker (boot2docker).
I hit a few snags along the way, so here are my notes.

The scope here covers building the HDP sandbox and verifying that Spark/Spark2 run.

Environment

  • OS: macOS Sierra 10.12.3
  • boot2docker version: v1.8.0
  • HDP: 2.5.0.0-1245

Get Docker image

Launch HDP from the downloaded image (HDP_2.5_docker.tar.gz).
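The archive itself comes from the Hortonworks download page. A hedged sketch of the download step, where HDP_IMAGE_URL is a hypothetical placeholder for whatever URL that page hands you:

$ curl -L -o HDP_2.5_docker.tar.gz "$HDP_IMAGE_URL"   # HDP_IMAGE_URL: placeholder, not a real endpoint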

Load image

On the first attempt, the Docker host ran out of disk space:

$ docker load < HDP_2.5_docker.tar.gz
Error processing tar file(exit status 1): write /9741599793f1189469b2428f2dc6d95748c24d953401847a066a7f1e16408c72/layer.tar: no space left on device

Increase the Docker host's disk size

While I was at it, I also bumped up the memory.
First, the settings:

$ vim ~/.boot2docker/profile
DiskSize = 80000
Memory = 4096

With the settings in place, recreate the VM:

$ boot2docker delete
$ boot2docker init
$ boot2docker up
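Once the VM is back up, you can confirm the resize took effect from inside the Docker host; a quick sketch (boot2docker ssh runs a single command on the VM, and /mnt/sda1 is its data disk, as the df output further down shows):

$ boot2docker ssh df -h /mnt/sda1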

Load the image again:

$ docker load < HDP_2.5_docker.tar.gz
b1b065555b8a: Loading layer 202.2 MB/202.2 MB
3901568415a3: Loading layer 13.85 GB/13.85 GB
Loaded image: sandbox:latest

This time it fit.
The load ultimately needed about 13.3 GB of disk.
Since data will be going into HDFS later on, I deliberately allocated a size with plenty of headroom.

docker@boot2docker:~$ df -h
Filesystem                Size      Used Available Use% Mounted on
tmpfs                     1.8G    191.8M      1.6G  11% /
tmpfs                  1001.0M         0   1001.0M   0% /dev/shm
/dev/sda1                74.6G     13.3G     57.4G  19% /mnt/sda1
cgroup                 1001.0M         0   1001.0M   0% /sys/fs/cgroup
Users                   930.7G     96.1G    834.6G  10% /Users
/dev/sda1                74.6G     13.3G     57.4G  19% /mnt/sda1/var/lib/docker/aufs

Note: later, after rebooting the Mac, connecting to the Docker host failed with an x509 certificate error.
I ended up recreating the container; the details are written up in a separate post.

docker run

Back to the main task: start a container from the loaded image.

$ docker run -v hadoop:/hadoop --name sandbox --hostname "sandbox.hortonworks.com" --privileged -d \
-p 6080:6080 \
-p 9090:9090 \
-p 9000:9000 \
-p 8000:8000 \
-p 8020:8020 \
-p 42111:42111 \
-p 10500:10500 \
-p 16030:16030 \
-p 8042:8042 \
-p 8040:8040 \
-p 2100:2100 \
-p 4200:4200 \
-p 4040:4040 \
-p 8050:8050 \
-p 9996:9996 \
-p 9995:9995 \
-p 8080:8080 \
-p 8088:8088 \
-p 8886:8886 \
-p 8889:8889 \
-p 8443:8443 \
-p 8744:8744 \
-p 8888:8888 \
-p 8188:8188 \
-p 8983:8983 \
-p 1000:1000 \
-p 1100:1100 \
-p 11000:11000 \
-p 10001:10001 \
-p 15000:15000 \
-p 10000:10000 \
-p 8993:8993 \
-p 1988:1988 \
-p 5007:5007 \
-p 50070:50070 \
-p 19888:19888 \
-p 16010:16010 \
-p 50111:50111 \
-p 50075:50075 \
-p 50095:50095 \
-p 18080:18080 \
-p 60000:60000 \
-p 8090:8090 \
-p 8091:8091 \
-p 8005:8005 \
-p 8086:8086 \
-p 8082:8082 \
-p 60080:60080 \
-p 8765:8765 \
-p 5011:5011 \
-p 6001:6001 \
-p 6003:6003 \
-p 6008:6008 \
-p 1220:1220 \
-p 21000:21000 \
-p 6188:6188 \
-p 61888:61888 \
-p 2181:2181 \
-p 2222:22 \
sandbox /usr/sbin/sshd -D
7420eaae60a5915b2c7370af3b5399627157c21bef9e0e45ecb7bc74386f8f32

Confirm with docker ps:

$ docker ps
CONTAINER ID        IMAGE               COMMAND               CREATED              STATUS              PORTS                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                              NAMES
7420eaae60a5        sandbox             "/usr/sbin/sshd -D"   About a minute ago   Up 7 seconds        0.0.0.0:1000->1000/tcp, 0.0.0.0:1100->1100/tcp, 0.0.0.0:1220->1220/tcp, 0.0.0.0:1988->1988/tcp, 0.0.0.0:2100->2100/tcp, 0.0.0.0:2181->2181/tcp, 0.0.0.0:4040->4040/tcp, 0.0.0.0:4200->4200/tcp, 0.0.0.0:5007->5007/tcp, 0.0.0.0:5011->5011/tcp, 0.0.0.0:6001->6001/tcp, 0.0.0.0:6003->6003/tcp, 0.0.0.0:6008->6008/tcp, 0.0.0.0:6080->6080/tcp, 0.0.0.0:6188->6188/tcp, 0.0.0.0:8000->8000/tcp, 0.0.0.0:8005->8005/tcp, 0.0.0.0:8020->8020/tcp, 0.0.0.0:8040->8040/tcp, 0.0.0.0:8042->8042/tcp, 0.0.0.0:8050->8050/tcp, 0.0.0.0:8080->8080/tcp, 0.0.0.0:8082->8082/tcp, 0.0.0.0:8086->8086/tcp, 0.0.0.0:8088->8088/tcp, 0.0.0.0:8090-8091->8090-8091/tcp, 0.0.0.0:8188->8188/tcp, 0.0.0.0:8443->8443/tcp, 0.0.0.0:8744->8744/tcp, 0.0.0.0:8765->8765/tcp, 0.0.0.0:8886->8886/tcp, 0.0.0.0:8888-8889->8888-8889/tcp, 0.0.0.0:8983->8983/tcp, 0.0.0.0:8993->8993/tcp, 0.0.0.0:9000->9000/tcp, 0.0.0.0:9090->9090/tcp, 0.0.0.0:9995-9996->9995-9996/tcp, 0.0.0.0:10000-10001->10000-10001/tcp, 0.0.0.0:10500->10500/tcp, 0.0.0.0:11000->11000/tcp, 0.0.0.0:15000->15000/tcp, 0.0.0.0:16010->16010/tcp, 0.0.0.0:16030->16030/tcp, 0.0.0.0:18080->18080/tcp, 0.0.0.0:19888->19888/tcp, 0.0.0.0:21000->21000/tcp, 0.0.0.0:42111->42111/tcp, 0.0.0.0:50070->50070/tcp, 0.0.0.0:50075->50075/tcp, 0.0.0.0:50095->50095/tcp, 0.0.0.0:50111->50111/tcp, 0.0.0.0:60000->60000/tcp, 0.0.0.0:60080->60080/tcp, 0.0.0.0:61888->61888/tcp, 0.0.0.0:2222->22/tcp   sandbox

For convenience later on, register the HDP container's FQDN in the Mac's /etc/hosts.
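The address to register is the boot2docker VM's IP, not localhost. A small sketch for looking it up and appending the entry (192.168.59.104 is just what my VM happened to get; use whatever boot2docker ip reports):

$ boot2docker ip
$ sudo sh -c 'echo "192.168.59.104    sandbox.hortonworks.com" >> /etc/hosts'

The resulting file: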

$ cat /etc/hosts
##
# Host Database
#
# localhost is used to configure the loopback interface
# when the system is booting.  Do not change this entry.
##
127.0.0.1   localhost
255.255.255.255 broadcasthost
::1             localhost
192.168.59.104    sandbox.hortonworks.com

Connect to the HDP container

To run the setup, connect to the running HDP container over ssh.

  • User: root
  • Initial Password: hadoop

$ ssh -p 2222 [email protected]
The authenticity of host '[sandbox.hortonworks.com]:2222 ([192.168.59.104]:2222)' can't be established.
RSA key fingerprint is SHA256:kiut+RYAoazuJVU/xIAhZ8nvbfZClAOJmMlwtzY3MTY.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[sandbox.hortonworks.com]:2222' (RSA) to the list of known hosts.
[email protected]'s password:
You are required to change your password immediately (root enforced)
Changing password for root.
(current) UNIX password:
New password:
Retype new password:

Run startup_script

The warnings can reportedly be ignored (per the source tutorial).

[root@sandbox ~]# /etc/init.d/startup_script start
Starting tutorials...                                      [  Ok  ]
Starting startup_script...
Starting HDP ...
Starting mysql                                            [  OK  ]
Starting Flume                                            [  OK  ]
Starting Postgre SQL                                      [  OK  ]
Starting name node                                        [  OK  ]
Starting Zookeeper nodes                                  [  OK  ]
Starting data node                                        [  OK  ]
Starting Ranger-admin                                     [WARNINGS]
find: failed to restore initial working directory: Permission denied
Starting Oozie                                            [  OK  ]
Starting Ranger-usersync                                  [  OK  ]
Starting NFS portmap                                      [  OK  ]
Starting Hdfs nfs                                         [  OK  ]
Starting Hive server                                      [  OK  ]
Starting Hiveserver2                                      [  OK  ]
Starting Yarn history server                              [  OK  ]
Starting Node manager                                     [  OK  ]
Starting Webhcat server                                   [  OK  ]
Starting Resource manager                                 [  OK  ]
Starting Spark                                            [  OK  ]
Starting Ambari server                                    [  OK  ]
Starting Zeppelin                                         [  OK  ]
Starting Mapred history server                            [  OK  ]
Starting Ambari agent                                     [  OK  ]
Safe mode is OFF
Starting sandbox...
/etc/init.d/startup_script: line 98: /proc/sys/kernel/hung_task_timeout_secs: No such file or directory
Starting shellinaboxd:                                     [  OK  ]

Log in to Ambari

That completes the setup steps; pointing a browser at http://sandbox.hortonworks.com:8888/ now brings up the HDP sandbox welcome page.

The first place I wanted to look was Ambari, but browsing to http://sandbox.hortonworks.com:8080/ returned nothing.

SSHing into the sandbox container to investigate, I found that the Ambari server was not running.
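A quick way to check is the status subcommand that ambari-server ships with:

[root@sandbox ~]# ambari-server status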

Start it manually:

[root@sandbox ~]# /etc/init.d/ambari-server start
Using python  /usr/bin/python
Starting ambari-server
Ambari Server running with administrator privileges.
Organizing resource files at /var/lib/ambari-server/resources...
Ambari database consistency check started...
No errors were found.
Ambari database consistency check finished
Server PID at: /var/run/ambari-server/ambari-server.pid
Server out at: /var/log/ambari-server/ambari-server.out
Server log at: /var/log/ambari-server/ambari-server.log
Waiting for server start...................
/usr/bin/mysqld_safe: line 166:   340 Killed                  nohup /usr/sbin/mysqld --basedir=/usr --datadir=/var/lib/mysql --plugin-dir=/usr/lib64/mysql/plugin --user=mysql --log-error=/var/log/mysqld.log --pid-file=/var/run/mysqld/mysqld.pid --socket=/var/lib/mysql/mysql.sock < /dev/null >> /var/log/mysqld.log 2>&1
Ambari Server 'start' completed successfully.

Reset the Ambari password

Now that Ambari was running, I tried the browser again; this time I could not log in to the Ambari Web UI with the default admin/admin credentials.

A password-reset procedure is documented here, so reset it:

[root@sandbox ~]# ambari-admin-password-reset
Please set the password for admin:
Please retype the password for admin:

The admin password has been set.
Restarting ambari-server to make the password change effective...

Using python  /usr/bin/python
Restarting ambari-server
Using python  /usr/bin/python
Stopping ambari-server
Ambari Server stopped
Using python  /usr/bin/python
Starting ambari-server
Ambari Server running with administrator privileges.
Organizing resource files at /var/lib/ambari-server/resources...
Ambari database consistency check started...
No errors were found.
Ambari database consistency check finished
Server PID at: /var/run/ambari-server/ambari-server.pid
Server out at: /var/log/ambari-server/ambari-server.out
Server log at: /var/log/ambari-server/ambari-server.log
Waiting for server start....................
Ambari Server 'start' completed successfully.

Verify Spark/Spark2

Start the services

At last, I can log in to Ambari.
To verify Spark, the following services need to be started.

In the Ambari Web UI, select each service from the left-hand menu and start it via Service Actions at the top right (a REST alternative is sketched after this list).

  • HDFS
  • YARN
  • Spark
  • Spark2
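
If clicking through the UI for each service gets tedious, Ambari's REST API can start them too. A hedged sketch for HDFS, assuming the sandbox cluster is named Sandbox (verify with GET /api/v1/clusters) and using the admin password set earlier:

$ curl -u admin:NEW_PASSWORD -H 'X-Requested-By: ambari' -X PUT \
    -d '{"RequestInfo":{"context":"Start HDFS"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
    http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/services/HDFS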

Run the Spark example jar

As a smoke test, run the Pi-estimation example.

[root@sandbox ~]# su - spark
[spark@sandbox ~]$ cd /usr/hdp/2.5.0.0-1245/spark
[spark@sandbox spark]$ spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 1 --driver-memory 128m --executor-memory 128m --executor-cores 1 lib/spark-examples*.jar 10
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
17/03/18 19:24:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/18 19:24:48 INFO TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
17/03/18 19:24:48 INFO RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
17/03/18 19:24:48 INFO AHSProxy: Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
17/03/18 19:24:49 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/03/18 19:24:50 INFO Client: Requesting a new application from cluster with 1 NodeManagers
17/03/18 19:24:50 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2250 MB per container)
17/03/18 19:24:50 INFO Client: Will allocate AM container, with 512 MB memory including 384 MB overhead
17/03/18 19:24:50 INFO Client: Setting up container launch context for our AM
17/03/18 19:24:50 INFO Client: Setting up the launch environment for our AM container
17/03/18 19:24:50 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
17/03/18 19:24:50 INFO Client: Preparing resources for our AM container
17/03/18 19:24:50 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
17/03/18 19:24:50 INFO Client: Source and destination file systems are the same. Not copying hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
17/03/18 19:24:50 INFO Client: Uploading resource file:/usr/hdp/2.5.0.0-1245/spark/lib/spark-examples-1.6.2.2.5.0.0-1245-hadoop2.7.3.2.5.0.0-1245.jar -> hdfs://sandbox.hortonworks.com:8020/user/spark/.sparkStaging/application_1489825437372_0006/spark-examples-1.6.2.2.5.0.0-1245-hadoop2.7.3.2.5.0.0-1245.jar
17/03/18 19:24:52 INFO Client: Uploading resource file:/tmp/spark-9d0bed2b-a4c9-4392-808d-7ca573f5b0cb/__spark_conf__5699802957140890856.zip -> hdfs://sandbox.hortonworks.com:8020/user/spark/.sparkStaging/application_1489825437372_0006/__spark_conf__5699802957140890856.zip
17/03/18 19:24:53 INFO SecurityManager: Changing view acls to: spark
17/03/18 19:24:53 INFO SecurityManager: Changing modify acls to: spark
17/03/18 19:24:53 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
17/03/18 19:24:53 INFO Client: Submitting application 6 to ResourceManager
17/03/18 19:24:53 INFO YarnClientImpl: Submitted application application_1489825437372_0006
17/03/18 19:24:54 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:24:54 INFO Client:
     client token: N/A
     diagnostics: AM container is launched, waiting for AM container to Register with RM
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1489865093633
     final status: UNDEFINED
     tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1489825437372_0006/
     user: spark
17/03/18 19:24:55 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:24:56 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:24:57 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:24:58 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:24:59 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:00 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:03 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:04 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:05 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:06 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:07 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:08 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:09 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:10 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:11 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:12 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:13 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:14 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:15 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:16 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:17 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:18 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:19 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:21 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:22 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:23 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:24 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:25 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:26 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:27 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:28 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:29 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:30 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:31 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:32 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:33 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:34 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:35 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:36 INFO Client: Application report for application_1489825437372_0006 (state: ACCEPTED)
17/03/18 19:25:37 INFO Client: Application report for application_1489825437372_0006 (state: FAILED)
17/03/18 19:25:37 INFO Client:
     client token: N/A
     diagnostics: Application application_1489825437372_0006 failed 2 times due to AM Container for appattempt_1489825437372_0006_000002 exited with  exitCode: 15
For more detailed output, check the application tracking page: http://sandbox.hortonworks.com:8088/cluster/app/application_1489825437372_0006 Then click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1489825437372_0006_02_000001
Exit code: 15
Stack trace: ExitCodeException exitCode=15:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:933)
    at org.apache.hadoop.util.Shell.run(Shell.java:844)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1123)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:225)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 15
Failing this attempt. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1489865093633
     final status: FAILED
     tracking URL: http://sandbox.hortonworks.com:8088/cluster/app/application_1489825437372_0006
     user: spark
Exception in thread "main" org.apache.spark.SparkException: Application application_1489825437372_0006 finished with failed status
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1122)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1169)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/03/18 19:25:37 INFO ShutdownHookManager: Shutdown hook called
17/03/18 19:25:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-9d0bed2b-a4c9-4392-808d-7ca573f5b0cb

Well, that failed (lol).
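I didn't dig deeper, but two hedged next steps: pull the AM container logs through the YARN CLI, and retry with more memory, since 128 MB for the driver is very tight (my guess, untested here):

[spark@sandbox spark]$ yarn logs -applicationId application_1489825437372_0006
[spark@sandbox spark]$ spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn-cluster --num-executors 1 \
    --driver-memory 512m --executor-memory 512m --executor-cores 1 \
    lib/spark-examples*.jar 10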

Spark2

Let's run Spark2 the same way.

[spark@sandbox spark]$ cd /usr/hdp/2.5.0.0-1245/spark2
[spark@sandbox spark2]$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 1 --driver-memory 512m --executor-memory 512m --executor-cores 1 examples/jars/spark-examples*.jar 10
Warning: Master yarn-cluster is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
17/03/19 16:09:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/19 16:09:24 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/03/19 16:09:24 INFO RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
17/03/19 16:09:25 INFO Client: Requesting a new application from cluster with 1 NodeManagers
17/03/19 16:09:25 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2250 MB per container)
17/03/19 16:09:25 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/03/19 16:09:25 INFO Client: Setting up container launch context for our AM
17/03/19 16:09:25 INFO Client: Setting up the launch environment for our AM container
17/03/19 16:09:25 INFO Client: Preparing resources for our AM container
17/03/19 16:09:25 INFO Client: Use hdfs cache file as spark.yarn.archive for HDP, hdfsCacheFile:hdfs:///hdp/apps/2.5.0.0-1245/spark2/spark2-hdp-yarn-archive.tar.gz
17/03/19 16:09:25 INFO Client: Source and destination file systems are the same. Not copying hdfs:/hdp/apps/2.5.0.0-1245/spark2/spark2-hdp-yarn-archive.tar.gz
17/03/19 16:09:25 INFO Client: Uploading resource file:/usr/hdp/2.5.0.0-1245/spark2/examples/jars/spark-examples_2.11-2.0.0.2.5.0.0-1245.jar -> hdfs://sandbox.hortonworks.com:8020/user/spark/.sparkStaging/application_1489938645171_0004/spark-examples_2.11-2.0.0.2.5.0.0-1245.jar
17/03/19 16:09:26 INFO Client: Uploading resource file:/tmp/spark-676fc718-309b-4bcb-843e-ab8325f78c28/__spark_conf__4920266636660798460.zip -> hdfs://sandbox.hortonworks.com:8020/user/spark/.sparkStaging/application_1489938645171_0004/__spark_conf__.zip
17/03/19 16:09:26 INFO SecurityManager: Changing view acls to: spark
17/03/19 16:09:26 INFO SecurityManager: Changing modify acls to: spark
17/03/19 16:09:26 INFO SecurityManager: Changing view acls groups to:
17/03/19 16:09:26 INFO SecurityManager: Changing modify acls groups to:
17/03/19 16:09:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(spark); groups with view permissions: Set(); users  with modify permissions: Set(spark); groups with modify permissions: Set()
17/03/19 16:09:26 INFO Client: Submitting application application_1489938645171_0004 to ResourceManager
17/03/19 16:09:26 INFO YarnClientImpl: Submitted application application_1489938645171_0004
17/03/19 16:09:27 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:27 INFO Client:
     client token: N/A
     diagnostics: AM container is launched, waiting for AM container to Register with RM
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1489939766377
     final status: UNDEFINED
     tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1489938645171_0004/
     user: spark
17/03/19 16:09:28 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:29 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:30 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:31 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:32 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:33 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:34 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:35 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:36 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:37 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:38 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:39 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:40 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:41 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:42 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:43 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:44 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:45 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:46 INFO Client: Application report for application_1489938645171_0004 (state: ACCEPTED)
17/03/19 16:09:47 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:47 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 172.17.0.2
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1489939766377
     final status: UNDEFINED
     tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1489938645171_0004/
     user: spark
17/03/19 16:09:48 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:49 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:50 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:51 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:52 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:53 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:54 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:55 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:56 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:57 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:58 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:09:59 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:00 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:01 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:02 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:03 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:04 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:05 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:06 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:07 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:08 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:09 INFO Client: Application report for application_1489938645171_0004 (state: RUNNING)
17/03/19 16:10:10 INFO Client: Application report for application_1489938645171_0004 (state: FINISHED)
17/03/19 16:10:11 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 172.17.0.2
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1489939766377
     final status: SUCCEEDED
     tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1489938645171_0004/
     user: spark
17/03/19 16:10:13 INFO ShutdownHookManager: Shutdown hook called
17/03/19 16:10:13 INFO ShutdownHookManager: Deleting directory /tmp/spark-676fc718-309b-4bcb-843e-ab8325f78c28

Spark2 ran fine.
The Spark I'll actually be using is 2.0, so for now it's no real problem that the Spark1 run fails.
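
Incidentally, the Spark1 run above warned that SPARK_MAJOR_VERSION was not set. On HDP that environment variable tells the wrapper scripts which Spark to dispatch to, so Spark2 can also be selected without cd-ing into the spark2 directory; a minimal sketch:

[spark@sandbox ~]$ export SPARK_MAJOR_VERSION=2
[spark@sandbox ~]$ spark-submit --version   # should now report Spark 2.x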

Tips

With my machine's limited resources, HDP's services take quite a while to start. So when stopping boot2docker, I use boot2docker save instead of boot2docker down, which spares me the wait the next time I bring it up.
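The suspend/resume cycle looks like this (save suspends the VM with its memory state intact, so the HDP services come back already running):

$ boot2docker save
$ boot2docker up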

References