java - Cannot set Spark memory


I am trying to set the maximum memory for a Spark application running locally. I have tried many different ways.

In the program:

SparkConf conf = new SparkConf();
conf.setMaster("local[2]");
conf.setAppName("app");
conf.set("spark.executor.memory", "4g");
conf.set("spark.driver.memory", "4g");
sc = new JavaSparkContext(conf);

And when executing:

./bin/spark-submit.cmd ./local/app.jar --master local[2] --driver-memory 4g 
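
Note that spark-submit treats everything after the application jar as arguments to the application's main class, so in the command above the --master and --driver-memory options never reach Spark. A corrected invocation (same paths as above) would place the options before the jar:

./bin/spark-submit.cmd --master local[2] --driver-memory 4g ./local/app.jar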

but the log shows the following:

INFO [MemoryStore] MemoryStore started with capacity 366.3 MB

and Spark spills the RDD to disk. The Spark version is 2.0.

What should I do?

When running locally, Spark will not use more memory than the Java process has available. In local mode the driver JVM is already running by the time your SparkConf is read, so spark.driver.memory set in code cannot enlarge the heap. You should try running the application with the JVM memory parameters: -Xms2048m -Xmx4096m
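
For example, a minimal sketch of such a launch, assuming the application's main class is named App (a placeholder; substitute your own) and that the Spark dependencies are bundled in the jar or otherwise on the classpath:

# App is a placeholder main class; adjust the jar path and classpath for your setup
java -Xms2048m -Xmx4096m -cp ./local/app.jar App

-Xms sets the initial heap size and -Xmx the maximum; Spark's MemoryStore capacity is a fraction of that heap, which is why it reported only 366.3 MB under the default heap size.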

