Error while importing sparkdl in google colab #226
Comments
Hey @jai-dewani, this is expected behavior. Google Colab's environment doesn't include all of Spark's dependencies, including pyspark, hence the ModuleNotFoundError. You'll need to install these dependencies first. This repo (https://github.com/asifahmed90/pyspark-ML-in-Colab) has an example of that, but it's a bit dated, so you might ask @asifahmed90 if you run into any issues. Good luck!
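As a minimal sketch (assuming the stock Colab runtime already provides a Java runtime and that a plain pip install of pyspark is sufficient for your use case), the missing pyspark dependency can be added like this:

```python
# Minimal sketch: install pyspark into the Colab runtime (version left unpinned here)
!pip install -q pyspark

# Confirm the module now resolves, so the ModuleNotFoundError goes away
import pyspark
print(pyspark.__version__)
```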
Actually, I did all the necessary steps from the start, yet I still end up with this problem. While running the notebook, just run the first two subsections and you will end up with the same result. I am looking hard for any minor mistake I could be making or something I missed, but can't seem to find anything :/

Edit: A similar issue has been posted with the same problem.
@jai-dewani, this setup worked for me:

```python
# Install Java, download and unpack Spark, and install findspark
!apt-get install openjdk-8-jdk-headless -qq > /dev/null
!wget -q https://downloads.apache.org/spark/spark-3.1.1/spark-3.1.1-bin-hadoop3.2.tgz
!tar xf spark-3.1.1-bin-hadoop3.2.tgz
!pip install -q findspark

# Point the environment at the Java and Spark installations
import os
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["SPARK_HOME"] = "/content/spark-3.1.1-bin-hadoop3.2"

# Make the Spark installation importable and start a local session
import findspark
findspark.init()

from pyspark.sql import SparkSession
spark = SparkSession.builder.master("local[*]").getOrCreate()
```

I came around to this solution by looking at the latest Spark package distribution page; you can do the same. Change these filenames in the above code as required.
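As an illustrative follow-up (not part of the original comment), you can sanity-check the session created by the snippet above before attempting to import sparkdl:

```python
# Quick check that the SparkSession from the setup above is usable
print(spark.version)  # should report 3.1.1 for the build downloaded above

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
df.show()
```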
Here is the error traceback while importing sparkdl:
Spark version ->
sparkdl-0.2.2