Me 2 :)
On 12/22/2014 06:14 PM, Andrew Ash wrote:
> Hi Xiangrui,
>
> That link is currently returning a 503 Over Quota error message.
> Would you mind pinging the list again when the page is back up?
>
> Thanks!
> Andrew
>
> On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng <mengxr@gmail.com> wrote:
>
> Dear Spark users and developers,
>
> I’m happy to announce Spark Packages (http://spark-packages.org), a
> community package index to track the growing number of open source
> packages and libraries that work with Apache Spark. Spark Packages
> makes it easy for users to find, discuss, rate, and install packages
> for any version of Spark, and makes it easy for developers to
> contribute packages.
>
> Spark Packages will feature integrations with various data sources,
> management tools, higher level domain-specific libraries, machine
> learning algorithms, code samples, and other Spark content. Thanks to
> the package authors, the initial listing of packages includes
> scientific computing libraries, a job execution server, a connector
> for importing Avro data, tools for launching Spark on Google Compute
> Engine, and many others.
>
> I’d like to invite you to contribute and use Spark Packages and
> provide feedback! As a disclaimer: Spark Packages is a community index
> maintained by Databricks and (by design) will include packages outside
> of the ASF Spark project. We are excited to help showcase and support
> all of the great work going on in the broader Spark community!
>
> Cheers,
> Xiangrui
>