spark-dev mailing list archives

From Boric Tan <>
Subject How does Spark utilize low-level architecture features?
Date Wed, 20 Jan 2016 00:12:46 GMT
Hi there,

I am new to Spark, and would like to get some help to understand if Spark
can utilize the underlying architectures for better performance. If so, how
does it do it?

For example, assume there is a cluster built from machines with different
CPUs. Will Spark inspect each machine's CPU and apply machine-specific
settings for the tasks assigned to it? Or does it depend entirely on the
underlying JVM implementation that runs the JAR file, so that the JVM is
the place to check whether certain CPU features can be utilized?
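(For what it's worth, one way to probe the second possibility locally: the JVM itself reports the architecture it is running on, and on HotSpot the JIT compiler detects CPU features such as AVX or AES at startup and emits machine-specific code accordingly. A minimal sketch, assuming a standard HotSpot JVM; the class name is just an illustration:)

```java
public class ArchProbe {
    public static void main(String[] args) {
        // Standard system properties exposed by any JVM: the architecture
        // and VM implementation the bytecode is actually running on.
        System.out.println("os.arch      = " + System.getProperty("os.arch"));
        System.out.println("java.vm.name = " + System.getProperty("java.vm.name"));
        // On HotSpot, CPU-feature-specific JIT flags can be inspected
        // externally, e.g.:
        //   java -XX:+PrintFlagsFinal -version | grep -E 'UseAVX|UseSSE|UseAES'
    }
}
```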

