A Record of Spark Environment Problems and Their Solutions
Scenario
This post records the environment problems I ran into while developing with Spark, along with how each one was solved.
Notes
Problem 1: insufficient physical memory on the machine

Symptom: starting Spark with start-all.sh fails. Following the hint in the error output, I checked the relevant log:

```
cat spark-hadoop-org.apache.spark.deploy.master.Master-1-master.out
```

and found:

```
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 715849728 bytes for committing reserved memory.
# Possible reasons:
#   The system is out of physical RAM or swap space
#   In 32 bit mode, the process size limit was hit
# Possible solutions:
#   Reduce memory load on the system
#   Increase physical memory or swap space
#   Check if swap backing store is full
#   Use 64 bit Java on a 64 bit OS
#   Decrease Java heap size (-Xmx/-Xms)
#   Decrease number of Java threads
#   Decrease Java thread stack sizes (-Xss)
#   Set larger code cache with -XX:ReservedCodeCacheSize=
# This output file may be truncated or incomplete.
#
# Out of Memory Error (os_linux.cpp:2627), pid=22008, tid=139666963719936
#
# JRE version: (8.0_60-b27) (build )
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.60-b23 mixed mode linux-amd64 compressed oops)
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
```

Solution: free -m showed there was not enough available memory, so I upgraded my Aliyun (Alibaba Cloud) instance to 1 core / 2 GB. At nearly 80 yuan, it didn't hurt too much.
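If upgrading the machine is not an option, the log's own suggestion of increasing swap space is a cheaper first step. Below is a minimal shell sketch, assuming root privileges and that /swapfile does not already exist; the 1 GB size is only an illustration.

```bash
# Check available physical memory and swap (values in MB).
free -m

# Create and enable a 1 GB swap file (size chosen for illustration only).
dd if=/dev/zero of=/swapfile bs=1M count=1024
chmod 600 /swapfile   # swapon expects the file not to be world-readable
mkswap /swapfile      # format the file as swap space
swapon /swapfile      # enable it for the running system

# Confirm the extra swap is visible.
free -m
```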
Problem 2: insufficient JVM memory

Symptom: packaging the Scala program with sbt produced:

```
hadoop@master:~/sparkapp$ /usr/local/sbt/sbt package
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
[info] Set current project to Simple Project (in build file:/home/hadoop/sparkapp/)
[info] Updating {file:/home/hadoop/sparkapp/}sparkapp...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Updating {file:/home/hadoop/sparkapp/}sparkapp...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: unable to create new native thread
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at sbt.ConcurrentRestrictions$$anon$4.take(ConcurrentRestrictions.scala:188)
        at sbt.Execute.next$1(Execute.scala:83)
        at sbt.Execute.processAll(Execute.scala:86)
```

Solution: increase the memory available to the JVM:

```
set JAVA_OPTS=-Xms512m -Xmx1024m
```
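On the Linux shell shown above, the equivalent setting uses export rather than set; the standard sbt launcher script reads JAVA_OPTS and also accepts -J-prefixed JVM options. Note that "unable to create new native thread" can also stem from a low per-user process limit (see ulimit -u), not only from heap settings. A sketch of both invocation styles:

```bash
# Raise the JVM heap for this shell session, then package.
export JAVA_OPTS="-Xms512m -Xmx1024m"
/usr/local/sbt/sbt package

# Or pass JVM options for a single run via sbt's -J flags.
/usr/local/sbt/sbt -J-Xms512m -J-Xmx1024m package
```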
Problem 3: incompatible Spark and Scala versions

Symptom: running the WordCount program from IDEA fails with:

```
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
        at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
        at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
        at akka.actor.RootActorPath.$div(ActorPath.scala:159)
        at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
        at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
```
Solution: Scala 2.11.8 is not binary compatible with Spark 1.6.0. After switching the Scala environment to 2.10.4, the WordCount program ran normally.
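To keep the build consistent with that fix, the Scala version can also be pinned in the project's sbt build definition. A minimal sketch, assuming the "Simple Project" layout from Problem 2; the build.sbt contents here are illustrative, not the author's original file:

```bash
# Hypothetical build.sbt: Spark 1.6.0 is built against Scala 2.10,
# so scalaVersion must be a 2.10.x release for the %% artifact to match.
cat > ~/sparkapp/build.sbt <<'EOF'
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
EOF

/usr/local/sbt/sbt package   # repackage against the matching Scala version
```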
Summary

yes, I love problems! I am a problem-solver!