
Errors when starting Spark slave nodes, as seen in the log files

Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

https://www.cnblogs.com/tijun/p/7562282.html
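This warning usually means Spark cannot find Hadoop's native libraries and falls back to the built-in Java implementations; it is often harmless. One common workaround (a sketch, assuming Hadoop is installed and its native libraries sit under $HADOOP_HOME/lib/native; the install path below is illustrative) is to export the library path in conf/spark-env.sh:

# conf/spark-env.sh -- sketch only; adjust HADOOP_HOME to the actual install path
export HADOOP_HOME=/opt/hadoop
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH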

Consider explicitly setting the appropriate port for the service 'sparkWorker' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

https://my.oschina.net/u/2329800/blog/1826179
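As the message says, the two options are to raise the retry count or to pin the service to a port known to be free. A minimal sketch of the relevant entries in conf/spark-defaults.conf (the values below are illustrative, not from the original post):

# conf/spark-defaults.conf -- illustrative values
spark.port.maxRetries   32
spark.ui.port           4050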

Fix: add export SPARK_LOCAL_IP=127.0.0.1 (see the sketch below)

SPARK_LOCAL_IP can also be set to the slave's own IP address or hostname.
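A minimal sketch of the slave's conf/spark-env.sh; the 192.168.x.x address below is hypothetical, substitute the slave's real address or hostname:

# conf/spark-env.sh on the slave node
export SPARK_LOCAL_IP=127.0.0.1    # or the slave's own address, e.g. 192.168.1.101, or its hostname

After editing, restart the workers so the setting takes effect (for example via sbin/stop-all.sh followed by sbin/start-all.sh on the master).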