Multiple SparkContexts detected in the same JVM
Following up on my last question, I need to define multiple SparkContexts for my single JVM.
I did it another way (using Java):
SparkConf conf = new SparkConf();
conf.setAppName("Spark MultipleContest Test");
conf.set("spark.driver.allowMultipleContexts", "true");
conf.setMaster("local");
After that, I create the following source code:
SparkContext sc = new SparkContext(conf);
SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
and later in the code:
JavaSparkContext ctx = new JavaSparkContext(conf);
JavaRDD<Row> testRDD = ctx.parallelize(AllList);
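Assembled into a single driver, the failing setup looks roughly like this (a sketch: the class name is taken from the stack trace below, and AllList, which the question builds elsewhere, is replaced by an empty placeholder list):

import java.util.Collections;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

public class BinaryClassification {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();
        conf.setAppName("Spark MultipleContest Test");
        conf.set("spark.driver.allowMultipleContexts", "true");
        conf.setMaster("local");

        // First context -- created around line 41 of the original file
        SparkContext sc = new SparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // Placeholder for the question's AllList, which is populated elsewhere
        List<Row> AllList = Collections.emptyList();

        // Second context -- created around line 105; this constructor is
        // what triggers the "Only one SparkContext may be running" exception
        JavaSparkContext ctx = new JavaSparkContext(conf);
        JavaRDD<Row> testRDD = ctx.parallelize(AllList);
    }
}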
After executing this code, I get the following error message:
16/01/19 15:21:08 WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
test.MLlib.BinarryClassification.main(BinaryClassification.java:41)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2083)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2065)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2065)
at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2151)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:2023)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at test.MLlib.BinarryClassification.main(BinaryClassification.java:105)
The numbers 41 and 105 are the lines where the two context objects are defined in my Java code. My question is: given that I already use the set method to enable spark.driver.allowMultipleContexts, is it possible to run multiple SparkContexts in the same JVM, and if so, how?
Answer:
Are you sure you need the JavaSparkContext as a separate context? The previous question you refer to doesn't say so. If you already have a SparkContext, you can create a new JavaSparkContext from it instead of creating a separate context:
SparkConf conf = new SparkConf();
conf.setAppName("Spark MultipleContest Test");
conf.set("spark.driver.allowMultipleContexts", "true");
conf.setMaster("local");
SparkContext sc = new SparkContext(conf);
SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
// Create a Java context, which wraps the same Scala context under the hood
JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);
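From there, the question's RDD can be built on the derived context without ever constructing a second SparkContext; and since only one context exists, the spark.driver.allowMultipleContexts flag no longer has any effect. A minimal continuation, assuming AllList is the List<Row> from the question:

// Reuse the single underlying context for both SQL and RDD work
JavaRDD<Row> testRDD = jsc.parallelize(AllList);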