spark-issues mailing list archives

From "Steve Loughran (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-24273) Failure while using .checkpoint method to private S3 store via S3A connector
Date Sun, 27 May 2018 21:10:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-24273?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran updated SPARK-24273:
-----------------------------------
    Description: 
We are getting the following error:

{code}
com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 416, AWS Service: Amazon S3, AWS Request ID: tx000000000000000014126-005ae9bfd9-9ed9ac2-default, AWS Error Code: InvalidRange, AWS Error Message: null, S3 Extended Request ID: 9ed9ac2-default-default
{code}
when we use the checkpoint method as shown below:
{code}
val streamBucketDF = streamPacketDeltaDF
  .filter('timeDelta > maxGap && 'timeDelta <= 30000)
  .withColumn("bucket",
    when('timeDelta <= mediumGap, "medium").otherwise("large"))
  .checkpoint()
{code}
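
For context, and as an assumption rather than something stated in the report: the checkpoint only involves S3A once the checkpoint directory has been pointed at the private store. A minimal sketch of that setup in the spark-shell, with placeholder endpoint, bucket, and option values:
{code}
// Assumed setup (all values are placeholders, not the reporter's configuration):
// point Spark's checkpoint directory at the private S3-compatible store via S3A.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.endpoint", "https://s3.example.internal") // private store endpoint
hadoopConf.set("fs.s3a.path.style.access", "true")               // common for non-AWS stores
spark.sparkContext.setCheckpointDir("s3a://my-bucket/spark-checkpoints")
{code}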
Do you have any idea how to prevent the invalid range header from being sent, or how this can be worked around or fixed?
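
One possible way to narrow this down (a suggested diagnostic, not part of the original report) is to read the checkpoint files back through the S3A connector outside of Spark's checkpoint path; if the same 416/InvalidRange surfaces there, the problem sits between the connector and the store rather than in checkpoint() itself. A sketch, with a placeholder path:
{code}
// Diagnostic sketch (placeholder path): read every checkpoint file back through
// the Hadoop S3A FileSystem API, bypassing Spark's checkpoint code entirely.
import org.apache.hadoop.fs.{FileSystem, Path}

val checkpointDir = new Path("s3a://my-bucket/spark-checkpoints")
val fs = FileSystem.get(checkpointDir.toUri, spark.sparkContext.hadoopConfiguration)

val files = fs.listFiles(checkpointDir, true) // recursive, returns files only
val buf = new Array[Byte](8192)
while (files.hasNext) {
  val status = files.next()
  val in = fs.open(status.getPath)
  try {
    var total = 0L
    var n = in.read(buf)
    while (n >= 0) { total += n; n = in.read(buf) } // a 416 would surface here as an exception
    println(s"${status.getPath}: read $total of ${status.getLen} bytes")
  } finally {
    in.close()
  }
}
{code}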

Thanks.

  was:
We are getting the following error:

com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 416, AWS Service: Amazon S3, AWS Request ID: tx000000000000000014126-005ae9bfd9-9ed9ac2-default, AWS Error Code: InvalidRange, AWS Error Message: null, S3 Extended Request ID: 9ed9ac2-default-default

when we use the checkpoint method as shown below:

val streamBucketDF = streamPacketDeltaDF
 .filter('timeDelta > maxGap && 'timeDelta <= 30000)
 .withColumn("bucket", when('timeDelta <= mediumGap, "medium")
 .otherwise("large")
 )
 .checkpoint()

Do you have any idea how to prevent the invalid range header from being sent, or how this can be worked around or fixed?

Thanks.


> Failure while using .checkpoint method to private S3 store via S3A connector
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-24273
>                 URL: https://issues.apache.org/jira/browse/SPARK-24273
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.3.0
>            Reporter: Jami Malikzade
>            Priority: Major
>
> We are getting the following error:
> {code}
> com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 416, AWS Service: Amazon S3, AWS Request ID: tx000000000000000014126-005ae9bfd9-9ed9ac2-default, AWS Error Code: InvalidRange, AWS Error Message: null, S3 Extended Request ID: 9ed9ac2-default-default
> {code}
> when we use the checkpoint method as shown below:
> {code}
> val streamBucketDF = streamPacketDeltaDF
>   .filter('timeDelta > maxGap && 'timeDelta <= 30000)
>   .withColumn("bucket",
>     when('timeDelta <= mediumGap, "medium").otherwise("large"))
>   .checkpoint()
> {code}
> Do you have any idea how to prevent the invalid range header from being sent, or how this can be worked around or fixed?
> Thanks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

