Exception in thread "main" java.lang.ExceptionInInitializerError when installing Spark without Hadoop

I am trying to install Spark 2.3.0, more specifically spark-2.3.0-bin-hadoop2.7.

'D:\spark\bin' has been added to the PATH environment variable. JDK 10 is installed as well, but Hadoop is not. However, from what I found on Google, Spark should be able to run without Hadoop.

Here is the error message:

C:\Users\a>spark-shell
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2464)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:222)
    at org.apache.spark.deploy.SparkSubmit$.secMgr$lzycompute$1(SparkSubmit.scala:393)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:393)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:400)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(Unknown Source)
    at java.base/java.lang.String.substring(Unknown Source)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
    ... 21 more

Does anyone know what I need to do to get Spark installed?

Many thanks in advance.

The current Spark version (2.3) does not support JDK 9 or 10. The latest supported JDK version is JDK 8. You should downgrade your Java installation.
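
The "Caused by" section of your trace shows why: Hadoop 2.7's Shell class (loaded via StringUtils during static initialization) parses the java.version system property with a fixed three-character substring. That works for version strings like "1.8.0_171", but on JDK 10 the property is just "10" (length 2), so substring(0, 3) throws the "begin 0, end 3, length 2" error you see. Here is a minimal Java sketch of the failing pattern (an illustration of the check, not the literal Hadoop source; the class name is arbitrary):

public class VersionParseDemo {
    public static void main(String[] args) {
        // Hadoop 2.7 assumes "java.version" is at least 3 characters long,
        // which holds for "1.8.0_171" but not for JDK 10's plain "10".
        String version = System.getProperty("java.version");
        System.out.println("java.version = \"" + version + "\"");

        // On JDK 10 this line throws:
        // java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
        boolean isJava7OrAbove = version.substring(0, 3).compareTo("1.7") >= 0;
        System.out.println("Java 7 or above: " + isJava7OrAbove);
    }
}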
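
To switch, install JDK 8 alongside JDK 10 and point JAVA_HOME and PATH at the JDK 8 directory before launching spark-shell. A sketch for the current console session, assuming JDK 8 lives under C:\Program Files\Java\jdk1.8.0_171 (a placeholder; substitute your actual install path):

C:\Users\a>set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_171
C:\Users\a>set PATH=%JAVA_HOME%\bin;%PATH%
C:\Users\a>java -version
C:\Users\a>spark-shell

java -version should now report 1.8.x; if it still reports 10, another JDK entry earlier in PATH is taking precedence. Note that set only affects the current console window; for a permanent change, edit the variables under System Properties > Environment Variables (or use setx and open a new console).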