ElasticSearch Spark Error

I'm new to ElasticSearch, and I'm trying to write some Apache Spark code to save data into ElasticSearch. I typed the following lines into the Spark shell:

 import org.elasticsearch.spark._

 val myMap = Map("France" -> "FRA", "United States" -> "US")
 val myRDD = sc.makeRDD(Seq(myMap))
 myRDD.saveToEs("Country/Abbrv")

Error:

 org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot determine write shards for [Country/Abbrv]; likely its format is incorrect (maybe it contains illegal characters?)

Spark 2.0.0, ElasticSearch-Spark 2.3.4

Any ideas?

Answer:

The problem was that I hadn't set the `--conf` variable before launching the Spark shell. It needs to look like this:

 spark-shell --jars {path}/elasticsearch-spark_2.11-2.3.4.jar --conf spark.es.resource=Country/Abbrv
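An alternative to passing `--conf` on the command line is to set the connector options on the `SparkConf` in code, or to pass the resource directly to `saveToEs`. This is a sketch only; `localhost` is a placeholder for your actual ElasticSearch node, and it assumes the elasticsearch-spark 2.x jar is on the classpath (in a standalone app rather than the shell, where `sc` already exists):

 import org.apache.spark.{SparkConf, SparkContext}
 import org.elasticsearch.spark._

 // Configure the ES-Hadoop connector on the SparkConf itself.
 // "localhost" is a placeholder; replace with your ES node address.
 val conf = new SparkConf()
   .setAppName("es-write-example")
   .set("es.nodes", "localhost")          // ElasticSearch node(s)
   .set("es.resource", "Country/Abbrv")   // default index/type for writes
 val sc = new SparkContext(conf)

 val myMap = Map("France" -> "FRA", "United States" -> "US")
 val myRDD = sc.makeRDD(Seq(myMap))

 // The resource argument here overrides es.resource for this call.
 myRDD.saveToEs("Country/Abbrv")

Since this fragment needs a running Spark and ElasticSearch cluster, it is shown as configuration guidance rather than a runnable standalone program.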

Source: utcz.com/qa/412943.html
