I wanted to build a Spark environment on Windows. I installed it, but got an error when starting the shell from the batch file, so here is a note on the workaround.
Environment: Windows 10 64bit
○ Install Apache Spark: download it from http://spark.apache.org/downloads.html and unzip it
○ Download winutils.exe from https://github.com/steveloughran/winutils/blob/master/hadoop-2.7.1/bin/winutils.exe and place it in the bin directory under the Spark folder
○ Set HADOOP_HOME to the directory where you unzipped Apache Spark: D:\tools\spark-2.1.1-bin-hadoop2.7\
○ Add %HADOOP_HOME%\bin to PATH
Now, start the Scala shell. On 32-bit Windows, it seems you can simply run the following file as-is with administrator privileges: D:\tools\spark-2.1.1-bin-hadoop2.7\bin\spark-shell.cmd
On 64-bit Windows, however, it complained about the path and failed to start. After some investigation, it turned out that the path used when launching java was not being handled correctly, so modify the following file: D:\tools\spark-2.1.1-bin-hadoop2.7\bin\spark-class2.cmd
The relevant part of spark-class2.cmd, after the fix:
rem Figure out where java is.
set RUNNER=java
if not "x%JAVA_HOME%"=="x" (
  rem ★ removed the double quotes ("") around this path
  set RUNNER=%JAVA_HOME%\bin\java
) else (
  where /q "%RUNNER%"
  if ERRORLEVEL 1 (
    echo Java not found and JAVA_HOME environment variable is not set.
    echo Install Java and set JAVA_HOME to point to the Java installation directory.
    exit /b 1
  )
)
With that change, the shell now starts up without errors!
You can play with Spark from here.
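As a quick smoke test, you can paste a few lines at the spark-shell prompt. This is just a minimal sketch: `sc` is the SparkContext that spark-shell predefines, and the sample data is made up.

```scala
// Word count over a tiny in-memory dataset, using the predefined SparkContext `sc`.
val lines = sc.parallelize(Seq("hello spark", "hello windows"))
val counts = lines.flatMap(_.split(" ")).map(w => (w, 1)).reduceByKey(_ + _)
counts.collect().foreach(println)
// yields (hello,2), (spark,1), (windows,1), in some order
```

If this prints the word counts, the environment (including winutils.exe and HADOOP_HOME) is working.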
I hope you find this useful as a reference.