[Amazon](500310) Invalid operation: Assert
I am querying Redshift data from pyspark using spark-redshift and getting "[Amazon](500310) Invalid operation: Assert".
The query works fine if I run it against Redshift directly (e.g. in SQL Workbench). But spark-redshift unloads the data to S3 and then retrieves it, and when I run it that way it raises the following error. What is the problem here, and how can I fix it?
UNLOAD ('SELECT "x","y" FROM (select x,y from table_name where ((load_date=20171226 and hour>=16) or (load_date between 20171227 and
20171226) or (load_date=20171227 and hour<=16))) ') TO 's3:s3path' WITH
CREDENTIALS 'aws_access_key_id=xxx;aws_secret_access_key=yyy' ESCAPE
MANIFEST
:
py4j.protocol.Py4JJavaError: An error occurred while calling o124.save. : java.sql.SQLException: [Amazon](500310) Invalid operation: Assert
Details:
-----------------------------------------------
error: Assert
code: 1000
context: !AmLeaderProcess -
query: 583860
location: scheduler.cpp:642
process: padbmaster [pid=31521]
-----------------------------------------------;
at com.amazon.redshift.client.messages.inbound.ErrorResponse.toErrorException(ErrorResponse.java:1830)
at com.amazon.redshift.client.PGMessagingContext.handleErrorResponse(PGMessagingContext.java:822)
at com.amazon.redshift.client.PGMessagingContext.handleMessage(PGMessagingContext.java:647)
at com.amazon.jdbc.communications.InboundMessagesPipeline.getNextMessageOfClass(InboundMessagesPipeline.java:312)
at com.amazon.redshift.client.PGMessagingContext.doMoveToNextClass(PGMessagingContext.java:1080)
at com.amazon.redshift.client.PGMessagingContext.getErrorResponse(PGMessagingContext.java:1048)
at com.amazon.redshift.client.PGClient.handleErrorsScenario2ForPrepareExecution(PGClient.java:2524)
at com.amazon.redshift.client.PGClient.handleErrorsPrepareExecute(PGClient.java:2465)
at com.amazon.redshift.client.PGClient.executePreparedStatement(PGClient.java:1420)
at com.amazon.redshift.dataengine.PGQueryExecutor.executePreparedStatement(PGQueryExecutor.java:370)
at com.amazon.redshift.dataengine.PGQueryExecutor.execute(PGQueryExecutor.java:245)
at com.amazon.jdbc.common.SPreparedStatement.executeWithParams(Unknown Source)
at com.amazon.jdbc.common.SPreparedStatement.execute(Unknown Source)
at com.databricks.spark.redshift.JDBCWrapper$$anonfun$executeInterruptibly$1.apply(RedshiftJDBCWrapper.scala:108)
at com.databricks.spark.redshift.JDBCWrapper$$anonfun$executeInterruptibly$1.apply(RedshiftJDBCWrapper.scala:108)
at com.databricks.spark.redshift.JDBCWrapper$$anonfun$2.apply(RedshiftJDBCWrapper.scala:126)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
Caused by: com.amazon.support.exceptions.ErrorException: [Amazon](500310) Invalid operation: Assert
The UNLOAD above is the query that spark-redshift generates.
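For reference, a minimal sketch of how such a read might be issued from pyspark with the databricks spark-redshift connector; the JDBC URL, S3 tempdir, and credentials below are placeholders, and the option names may differ between connector versions:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("redshift-read").getOrCreate()

# The query that spark-redshift wraps in an UNLOAD and stages through S3.
query = """
    select x, y from table_name
    where ((load_date=20171226 and hour>=16)
        or (load_date between 20171227 and 20171226)
        or (load_date=20171227 and hour<=16))
"""

df = (spark.read
      .format("com.databricks.spark.redshift")
      .option("url", "jdbc:redshift://host:5439/db?user=xxx&password=yyy")  # placeholder JDBC URL
      .option("query", query)
      .option("tempdir", "s3a://bucket/tempdir")  # placeholder S3 staging path
      .option("forward_spark_s3_credentials", "true")
      .load())

df.show()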
Answer:
The Assert error usually occurs when something goes wrong while interpreting data types, for example in a query that unions two subqueries where the Nth column is a varchar in one part and an integer or NULL in the other. Your Assert error may be happening on data coming from different nodes (as it would in a union query). Try adding an explicit cast for each column, such as x::integer.
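As a sketch, assuming x is numeric and y is text in table_name (the target types here are guesses and should match the real column types), the pushed-down query with explicit casts could look like this:

# Same filter as before, but every selected column gets an explicit cast.
query_with_casts = """
    select x::integer as x, y::varchar as y
    from table_name
    where ((load_date=20171226 and hour>=16)
        or (load_date between 20171227 and 20171226)
        or (load_date=20171227 and hour<=16))
"""
# Pass it to the same reader as before, e.g. .option("query", query_with_casts)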