Spark write to Elasticsearch fails with "Could not write all entries"

When writing an RDD to an Elasticsearch cluster with Spark, I get the following exception:

Could not write all entries [199/161664] (maybe ES was overloaded?). Bailing out...
    at org.elasticsearch.hadoop.rest.RestRepository.flush(RestRepository.java:250)
    at org.elasticsearch.hadoop.rest.RestRepository.doWriteToIndex(RestRepository.java:201)
    at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:163)
    at org.elasticsearch.spark.rdd.EsRDDWriter.write(EsRDDWriter.scala:49)
    at org.elasticsearch.spark.rdd.EsSpark$$anonfun$doSaveToEs$1.apply(EsSpark.scala:84)
    at org.elasticsearch.spark.rdd.EsSpark$$anonfun$doSaveToEs$1.apply(EsSpark.scala:84)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

The RDD has roughly 50 million rows, and the ES cluster has two nodes.

EsSpark.saveToEs(result, "userindex/users", Map("es.mapping.id" -> "uid"))

OP, did you ever solve this? I'm running into the same problem.

http://blog.csdn.net/dlj2324/article/details/70256486
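For anyone else hitting this: the error usually means ES rejected bulk requests faster than es-hadoop's default retries could cope with, so the common advice is to make each bulk request smaller, retry more patiently, and reduce how many tasks write concurrently. A minimal sketch of that kind of tuning, using the OP's `result` RDD; the exact values (entries per bulk, retry count/wait, partition count) are illustrative guesses for a small two-node cluster, not verified settings:

    import org.elasticsearch.spark.rdd.EsSpark

    // Shrink each bulk request and retry rejected bulks instead of bailing out.
    // Values below are starting points to tune, not recommendations.
    val cfg = Map(
      "es.mapping.id"              -> "uid",
      "es.batch.size.entries"      -> "500",  // docs per bulk request (default 1000)
      "es.batch.size.bytes"        -> "1mb",  // max bytes per bulk request
      "es.batch.write.retry.count" -> "10",   // retry count for rejected bulks
      "es.batch.write.retry.wait"  -> "60s"   // back off between retries
    )

    // Fewer partitions means fewer concurrent writers hammering the two nodes;
    // the partition count here is a guess, size it to your executors and cluster.
    EsSpark.saveToEs(result.repartition(20), "userindex/users", cfg)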