diff --git a/README.md b/README.md
index 3ef9f43f..9e104584 100644
--- a/README.md
+++ b/README.md
@@ -25,7 +25,7 @@ You can link against this library in your program at the following coordinates:
 ```
 groupId: com.databricks
 artifactId: spark-xml_2.11
-version: 0.9.0
+version: 0.10.0
 ```
 
 ### Scala 2.12
@@ -33,7 +33,7 @@ version: 0.9.0
 ```
 groupId: com.databricks
 artifactId: spark-xml_2.12
-version: 0.9.0
+version: 0.10.0
 ```
 
 ## Using with Spark shell
@@ -42,12 +42,12 @@ This package can be added to Spark using the `--packages` command line option. F
 
 ### Spark compiled with Scala 2.11
 ```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.9.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.10.0
 ```
 
 ### Spark compiled with Scala 2.12
 ```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.9.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.10.0
 ```
 
 ## Features
@@ -400,7 +400,7 @@ Automatically infer schema (data types)
 ```R
 library(SparkR)
 
-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.9.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
 
 df <- read.df("books.xml", source = "xml", rowTag = "book")
@@ -412,7 +412,7 @@ You can manually specify schema:
 ```R
 library(SparkR)
 
-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.9.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
 customSchema <- structType(
 structField("_id", "string"),
 structField("author", "string"),