Mjohns 0.4.0 docs 11 #529

Merged
53 changes: 26 additions & 27 deletions docs/source/index.rst

Mosaic provides:

* easy conversion between common spatial data encodings (WKT, WKB and GeoJSON);
* constructors to easily generate new geometries from Spark native data types;
* many of the OGC SQL standard :code:`ST_` functions implemented as Spark Expressions for transforming, aggregating and joining spatial datasets;
* high performance through implementation of Spark code generation within the core Mosaic functions;
* optimisations for performing point-in-polygon joins using an approach we co-developed with Ordnance Survey (`blog post <https://databricks.com/blog/2021/10/11/efficient-point-in-polygon-joins-via-pyspark-and-bng-geospatial-indexing.html>`_); and
* the choice of a Scala, SQL and Python API.

.. note::
    For Mosaic versions < 0.4, please use the `0.3 docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.
We recommend using Databricks Runtime version 13.3 LTS with Photon enabled.

The Mosaic 0.4.x series supports DBR 13.x only. If run on a different DBR, it will throw an exception:

DEPRECATION ERROR: Mosaic v0.4.x series only supports Databricks Runtime 13.
You can specify `%pip install 'databricks-mosaic<0.4,>=0.3'` for DBR < 13.
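The version gate above can be sketched as a plain helper that maps a DBR version string to a matching pip pin. This function is hypothetical (not part of Mosaic or its tooling), and the 0.4.x upper bound shown for DBR 13+ is an assumption rather than an official constraint:

```python
def mosaic_pip_pin(dbr_version: str) -> str:
    """Pick a databricks-mosaic pip pin for a DBR version string like '13.3'.

    Hypothetical helper; the '<0.5' upper bound for DBR 13+ is an assumption,
    not a constraint published by the project.
    """
    major = int(dbr_version.split(".")[0])
    if major >= 13:
        return "databricks-mosaic>=0.4,<0.5"
    # Pin documented by Mosaic for older runtimes.
    return "databricks-mosaic<0.4,>=0.3"

print(mosaic_pip_pin("13.3"))  # databricks-mosaic>=0.4,<0.5
print(mosaic_pip_pin("12.2"))  # databricks-mosaic<0.4,>=0.3
```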

The Mosaic 0.4.x series issues an ERROR on standard, non-Photon clusters (`ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ |
`AWS <https://docs.databricks.com/runtime/index.html/>`_ |
`GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_):

DEPRECATION ERROR: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for
spatial AI benefits; Mosaic 0.4.x series restricts executing this cluster.

As of Mosaic 0.4.0 (subject to change in follow-on releases):

* `Assigned Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_: Mosaic Python, SQL, R, and Scala APIs.
* `Shared Access Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_: Mosaic Scala API (JVM) with
  Admin `allowlisting <https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/allowlist.html>`_;
  Python bindings to Mosaic Scala APIs are blocked by Py4J Security on Shared Access Clusters.

.. warning::
    Mosaic 0.4.x SQL bindings for DBR 13 can register with Assigned clusters (as Hive UDFs), but not with Shared Access
    clusters, due to `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ API changes; more `here <https://docs.databricks.com/en/udf/index.html>`_.

.. note::
    As of Mosaic 0.4.0 (subject to change in follow-on releases):

    * `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ enforces process isolation, which is difficult
      to accomplish with custom JVM libraries; as such, only built-in (i.e. platform-provided) JVM APIs can be invoked from
      other supported languages in Shared Access Clusters.
    * Along the same principle of isolation, clusters (both Assigned and Shared Access) can read
      `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ via the relevant built-in readers and
      writers, or via custom Python calls which do not involve any custom JVM code.

Version 0.3.x Series
====================
Expand All @@ -111,10 +110,10 @@ For Mosaic versions < 0.4.0 please use the `0.3.x docs <https://databrickslabs.g
As of the 0.3.11 release, Mosaic issues the following WARNING when initialized on a cluster that is neither Photon Runtime
nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ |
`AWS <https://docs.databricks.com/runtime/index.html/>`_ |
`GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_:

DEPRECATION WARNING: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial
AI benefits; Mosaic will stop working on this cluster after v0.3.x.

If you are receiving this warning in v0.3.11+, you should begin planning for a supported runtime. The reason we are
making this change is that we are streamlining Mosaic internals to be more aligned with future product APIs, which are
30 changes: 15 additions & 15 deletions docs/source/usage/installation.rst
Supported platforms
===================

The Mosaic 0.4.x series supports DBR 13.x only. If run on a different DBR, it will throw an exception:

DEPRECATION ERROR: Mosaic v0.4.x series only supports Databricks Runtime 13.
You can specify `%pip install 'databricks-mosaic<0.4,>=0.3'` for DBR < 13.

The Mosaic 0.4.x series issues an ERROR on standard, non-Photon clusters (`ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ |
`AWS <https://docs.databricks.com/runtime/index.html/>`_ |
`GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_):

DEPRECATION ERROR: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for
spatial AI benefits; Mosaic 0.4.x series restricts executing this cluster.

As of Mosaic 0.4.0 (subject to change in follow-on releases):

* `Assigned Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_: Mosaic Python, SQL, R, and Scala APIs.
* `Shared Access Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_: Mosaic Scala API (JVM) with
  Admin `allowlisting <https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/allowlist.html>`_;
  Python bindings to Mosaic Scala APIs are blocked by Py4J Security on Shared Access Clusters.
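On an Assigned cluster, the Python API above is enabled in a notebook roughly as follows. This is a sketch assuming Mosaic 0.4's documented setup: the pip pin's upper bound, the sample data, and the column expressions are illustrative; `spark` and `dbutils` are the notebook-provided globals, so this only runs inside Databricks.

```python
# In a Databricks notebook on an Assigned (single-user) DBR 13.x cluster.
# Run in its own cell first (upper bound is an assumption):
# %pip install "databricks-mosaic>=0.4,<0.5"

import mosaic as mos

# Registers Mosaic's Spark expressions against the active session
# (and, on Assigned clusters, the SQL bindings as Hive UDFs).
mos.enable_mosaic(spark, dbutils)

# Round-trip a WKT point through Mosaic's geometry constructors.
df = spark.createDataFrame([("POINT (30 10)",)], ["wkt"])
df.select(mos.st_astext(mos.st_geomfromwkt("wkt"))).show()
```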

.. note::
    As of Mosaic 0.4.0 (subject to change in follow-on releases):

    * `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ enforces process isolation, which is difficult
      to accomplish with custom JVM libraries; as such, only built-in (i.e. platform-provided) JVM APIs can be invoked from
      other supported languages in Shared Access Clusters.
    * Along the same principle of isolation, clusters (both Assigned and Shared Access) can read
      `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ via the relevant built-in readers and
      writers, or via custom Python calls which do not involve any custom JVM code.
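The two Volume access paths mentioned in the note can be sketched as below. The catalog/schema/volume path is hypothetical, and the snippet assumes a Databricks notebook (it relies on the notebook-provided `spark` session), so it is illustrative rather than runnable as-is.

```python
# Reading from a Unity Catalog Volume without any custom JVM code.
# The /Volumes path below is hypothetical.
volume_path = "/Volumes/my_catalog/my_schema/my_volume/points.wkt"

# Option 1: a built-in Spark reader (platform-provided JVM API,
# allowed on both Assigned and Shared Access clusters).
df = spark.read.text(volume_path)

# Option 2: plain Python file access, which stays entirely on the
# Python side and involves no custom JVM code.
with open(volume_path) as f:
    first_line = f.readline()
```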

If you have cluster creation permissions in your Databricks
workspace, you can create a cluster using the instructions