spark-user mailing list archives

From Buntu Dev <buntu...@gmail.com>
Subject Re: Error while running Streaming examples - no snappyjava in java.library.path
Date Mon, 20 Oct 2014 22:51:03 GMT
Thanks Akhil.

On Mon, Oct 20, 2014 at 1:57 AM, Akhil Das <akhil@sigmoidanalytics.com>
wrote:

> It's a known bug caused by JDK 7 and OS X's native-library naming
> convention; here's how to work around it:
>
>  1. Get the snappy-java jar from
> http://central.maven.org/maven2/org/xerial/snappy/snappy-java/
>  2. Copy the appropriate version onto your project's classpath.
>
>
>
> Thanks
> Best Regards
>
> On Sun, Oct 19, 2014 at 10:18 PM, bdev <buntudev@gmail.com> wrote:
>
>> I built the latest Spark project, and I'm running into these errors when
>> attempting to run the streaming examples locally on my Mac. How do I fix
>> them?
>>
>> java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
>>         at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
>>         at java.lang.Runtime.loadLibrary0(Runtime.java:849)
>>         at java.lang.System.loadLibrary(System.java:1088)
>>         at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:170)
>>         at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:145)
>>         at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
>>         at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:81)
>>         at org.apache.spark.io.SnappyCompressionCodec.compressedOutputStream(CompressionCodec.scala:125)
>>         at org.apache.spark.storage.BlockManager.wrapForCompression(BlockManager.scala:1083)
>>         at org.apache.spark.storage.BlockManager$$anonfun$7.apply(BlockManager.scala:579)
>>         at org.apache.spark.storage.BlockManager$$anonfun$7.apply(BlockManager.scala:579)
>>         at org.apache.spark.storage.DiskBlockObjectWriter.open(BlockObjectWriter.scala:126)
>>         at org.apache.spark.storage.DiskBlockObjectWriter.write(BlockObjectWriter.scala:192)
>>         at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$4$$anonfun$apply$2.apply(ExternalSorter.scala:732)
>>         at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$4$$anonfun$apply$2.apply(ExternalSorter.scala:731)
>>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>         at org.apache.spark.util.collection.ExternalSorter$IteratorForPartition.foreach(ExternalSorter.scala:789)
>>         at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$4.apply(ExternalSorter.scala:731)
>>         at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$4.apply(ExternalSorter.scala:727)
>>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>         at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:727)
>>         at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:70)
>>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
>>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
>>         at org.apache.spark.scheduler.Task.run(Task.scala:56)
>>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:181)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>
>> Thanks!
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Error-while-running-Streaming-examples-no-snappyjava-in-java-library-path-tp16765.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>
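For anyone who lands on this thread later: the naming mismatch behind the
error can be seen with a few lines of plain Java (no Snappy jar needed).
This is just an illustrative sketch; the class name is made up, and the
exact file extension printed depends on your OS and JDK.

```java
// Minimal sketch of why the loader fails: System.loadLibrary("snappyjava")
// asks the JVM to map the logical name to a platform-specific file name and
// search java.library.path for it. On OS X, JDK 7 expects a .dylib, while
// older snappy-java releases shipped a .jnilib, so the lookup comes up empty.
public class SnappyLibName {
    public static void main(String[] args) {
        // Platform-specific file name the JVM will look for,
        // e.g. libsnappyjava.so on Linux or libsnappyjava.dylib on JDK 7 / OS X
        System.out.println(System.mapLibraryName("snappyjava"));

        // Directories searched when the load fails with UnsatisfiedLinkError
        System.out.println(System.getProperty("java.library.path"));
    }
}
```

Newer snappy-java jars avoid the problem by bundling the native library
inside the jar and extracting it at runtime, which is why dropping a recent
jar on the classpath (as Akhil suggests) is enough.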
