lundi 24 août 2015

Spark local testing reports OutOfMemoryException, how to fix?

I followed this article to write some Spark tests that run locally: http://ift.tt/1dzuqET

The main code looks like this:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfter, FlatSpec}

class SparkExampleSpec extends FlatSpec with BeforeAndAfter {

  private val master = "local[2]"
  private val appName = "example-spark"

  private var sc: SparkContext = _

  before {
    val conf = new SparkConf()
      .setMaster(master)
      .setAppName(appName)

    sc = new SparkContext(conf)
  }

  after {
    if (sc != null) {
      sc.stop()
    }
  }
  (...)

But I found that as the number of tests grew, the run reported an OutOfMemoryError and the process hung.

How to fix it?
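A likely cause is that each test creates and stops its own SparkContext, which puts repeated pressure on the driver JVM's heap; in `local` mode the driver runs inside the test JVM itself. A common workaround (not from the linked article; the suite name and test here are illustrative) is to share one context across the whole suite with ScalaTest's `BeforeAndAfterAll` instead of `BeforeAndAfter`:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

// Sketch: create one SparkContext for the whole suite instead of one per
// test, so contexts are not repeatedly constructed and torn down.
class SharedContextSpec extends FlatSpec with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("example-spark")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    if (sc != null) {
      sc.stop()
    }
  }

  "an RDD" should "sum its elements" in {
    assert(sc.parallelize(1 to 10).sum() == 55)
  }
}
```

Since the local-mode driver lives in the test JVM, it can also help to raise that JVM's heap, e.g. in sbt with `fork in Test := true` and `javaOptions in Test += "-Xmx2g"` (the 2g value is an assumption; tune it to your tests).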
