Describe the bug
The latest snapshot jars for v21.10 are failing with the following error:
21/09/22 09:13:04 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.ClassNotFoundException: com.nvidia.spark.rapids.ColumnarRdd
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at ml.dmlc.xgboost4j.scala.spark.rapids.GpuUtils$.toColumnarRdd(GpuUtils.scala:37)
at ml.dmlc.xgboost4j.scala.spark.rapids.GpuXGBoost$.trainOnGpuInternal(GpuXGBoost.scala:240)
at ml.dmlc.xgboost4j.scala.spark.rapids.GpuXGBoost$.trainDistributedOnGpu(GpuXGBoost.scala:186)
at ml.dmlc.xgboost4j.scala.spark.rapids.GpuXGBoost$.trainOnGpu(GpuXGBoost.scala:91)
at ml.dmlc.xgboost4j.scala.spark.rapids.GpuXGBoost$.fitOnGpu(GpuXGBoost.scala:52)
at ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier.fit(XGBoostClassifier.scala:170)
at com.nvidia.spark.examples.mortgage.GPUMain$.$anonfun$main$4(GPUMain.scala:72)
at com.nvidia.spark.examples.utility.Benchmark.time(Benchmark.scala:29)
at com.nvidia.spark.examples.mortgage.GPUMain$.main(GPUMain.scala:72)
at com.nvidia.spark.examples.mortgage.GPUMain.main(GPUMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Error in spark-shell
root@87c06bb09f8d:~/notebook# spark-shell
......
scala> Class.forName("com.nvidia.spark.SQLPlugin")
res2: Class[_] = class com.nvidia.spark.SQLPlugin
scala>
scala> Class.forName("com.nvidia.spark.rapids.ColumnarRdd")
java.lang.ClassNotFoundException: com.nvidia.spark.rapids.ColumnarRdd
at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:72)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
... 47 elided
scala>
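The spark-shell session above can be reproduced without an interactive shell by probing the classpath with reflection. The sketch below is illustrative (the `ClassProbe` name and helper are hypothetical, not part of the RAPIDS jars); the class names come from the stack trace in this report:

```java
public class ClassProbe {
    // Returns true if the named class is loadable from the current classpath.
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Class names taken from the stack trace in this report; run with the
        // snapshot jar on the classpath to see which classes it actually ships.
        String[] probes = {
            "com.nvidia.spark.SQLPlugin",
            "com.nvidia.spark.rapids.ColumnarRdd"
        };
        for (String name : probes) {
            System.out.println(name + " loadable: " + isLoadable(name));
        }
    }
}
```

On the failing snapshot, the first probe reports loadable and the second does not, matching the shell session above.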
Environment details (please complete the following information)
Environment location: YARN, Standalone, local