Commit

Merge branch 'master' into docs/2024-02-02
igorlukanin authored Feb 5, 2024
2 parents 25ffb13 + 2b7cc30 commit 45410ab
Showing 37 changed files with 303 additions and 144 deletions.
15 changes: 9 additions & 6 deletions docs/pages/product/caching/using-pre-aggregations.mdx
@@ -238,7 +238,7 @@ This issue can be addressed by lambda pre-aggregations.

Alternatively, if you want to explicitly introduce key partitioning, you can use multi-tenancy to introduce multiple orchestrator IDs.
Each orchestrator ID can use a different pre-aggregation schema, so you may define those based on the partitioning key you want to introduce.
- This technique, together with multi router Cube Store approach, allows you to
+ This technique, together with the multi-router Cube Store approach, allows you to achieve linear scaling on the partitioning key of your choice.

## Using indexes

@@ -924,10 +924,10 @@ Cube Store in parallel:
Enabling the export bucket functionality requires extra configuration; please
refer to the database-specific documentation for more details:

- - [AWS Athena][ref-connect-db-athena]
- - [AWS Redshift][ref-connect-db-redshift]
- - [BigQuery][ref-connect-db-bigquery]
- - [Snowflake][ref-connect-db-snowflake]
+ - [AWS Athena][ref-export-athena]
+ - [AWS Redshift][ref-export-redshift]
+ - [BigQuery][ref-export-bigquery]
+ - [Snowflake][ref-export-snowflake]

When using cloud storage, it is important to correctly configure any data
retention policies to clean up the data in the export bucket as Cube does not
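For illustration, an S3 export bucket for Athena or Redshift is typically configured through environment variables like the following; the bucket name, region, and credentials are placeholders, not working values.

```shell
# Placeholder values; replace with your own bucket, region, and credentials.
export CUBEJS_DB_EXPORT_BUCKET=my-cube-export-bucket
export CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
export CUBEJS_DB_EXPORT_BUCKET_AWS_KEY=AKIA_PLACEHOLDER
export CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=SECRET_PLACEHOLDER
export CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=us-east-1
```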
@@ -954,7 +954,10 @@ please [let us know](https://cube.dev/contact) if you are interested in early
access to any of these drivers or would like Cube to support any other SQL
streaming engine.


+ [ref-export-athena]: /product/configuration/data-sources/aws-athena#export-bucket
+ [ref-export-redshift]: /product/configuration/data-sources/aws-redshift#export-bucket
+ [ref-export-bigquery]: /product/configuration/data-sources/google-bigquery#export-bucket
+ [ref-export-snowflake]: /product/configuration/data-sources/snowflake#export-bucket
[ref-caching-in-mem-default-refresh-key]: /product/caching#default-refresh-keys
[ref-config-db]: /product/configuration/data-sources
[ref-config-driverfactory]: /reference/configuration/config#driverfactory
13 changes: 8 additions & 5 deletions docs/pages/product/configuration/vpc/gcp.mdx
@@ -5,10 +5,13 @@ redirect_from:

# Connecting with a VPC on GCP

## Prerequisites
+ Work with your Cube sales or customer success team to initiate this process.

- - [Google Cloud Project ID][gcp-docs-projects]
- - Google Cloud VPC Network Name
+ - [Sign up for a Cube Cloud account][cube-cloud-signup] if you haven't already.
+ - Let the Cube team know your Cube Cloud tenant name (e.g. example.cubecloud.dev) and your desired GCP region for the deployment. For best performance, select one of the "Supported Regions" listed below.
+ - Cube will provision the VPC and provide the following information you can use to create the peering request:
+   - [Google Cloud Project ID][gcp-docs-projects]
+   - Google Cloud VPC Network Name

## Setup

@@ -20,8 +23,7 @@ Console][gcp-console] or an infrastructure-as-code tool. To send a VPC peering
request through the Google Cloud Console, follow [the instructions
here][gcp-docs-create-vpc-peering], with the following amendments:

- - In Step 6, use the project name `XXXXX` and network name(s) provided by Cube Cloud.
+ - In Step 6, use the project name `XXXXX`, and the project ID and network name(s) provided by Cube.
- In Step 7, ensure **Import custom routes** and **Export custom routes** are
selected so that the necessary routes are created.
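For reference, the same peering request can be sketched as a single `gcloud` invocation; the peering name, your VPC name, and the Cube-provided project ID and network name below are all placeholders.

```shell
# Placeholder names; substitute the project ID and network name provided
# by Cube, and the name of your own VPC. The two --*-custom-routes flags
# correspond to the checkboxes in Step 7 above.
gcloud compute networks peerings create cube-cloud-peering \
  --network=my-vpc \
  --peer-project=cube-cloud-project-id \
  --peer-network=cube-cloud-network \
  --import-custom-routes \
  --export-custom-routes
```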

@@ -51,3 +53,4 @@ run the [Cloud SQL Auth Proxy][gcp-cloudsql-auth-proxy].
[gcp-docs-projects]:
https://cloud.google.com/resource-manager/docs/creating-managing-projects#before_you_begin
[gcp-docs-vpc-peering]: https://cloud.google.com/vpc/docs/vpc-peering
[cube-cloud-signup]: https://cubecloud.dev/auth/signup
@@ -5,14 +5,6 @@ redirect_from:

# Connect to Snowflake

- <InfoBox heading="Getting Started with Cube Cloud">
-
- [Join our Cube Cloud Office Hours on August 23rd](https://event.on24.com/wcc/r/4314738/57FE0F7DCCEA887295E4BCD818061E2B?partnerref=docs)
- to jumpstart your Cube Cloud account and gain insights on connecting data
- sources, creating data models, and integrating your data apps.
-
- </InfoBox>

In this section, we’ll create a Cube Cloud deployment and connect it to
Snowflake. A deployment represents a data model, configuration, and managed
infrastructure.
@@ -5,14 +5,6 @@ redirect_from:

# Create your first data model

- <InfoBox heading="Getting Started with Cube Cloud">
-
- [Join our Cube Cloud Office Hours on August 23rd](https://event.on24.com/wcc/r/4314738/57FE0F7DCCEA887295E4BCD818061E2B?partnerref=docs)
- to jumpstart your Cube Cloud account and gain insights on connecting data
- sources, creating data models, and integrating your data apps.
-
- </InfoBox>

Cube follows a dataset-oriented data modeling approach, which is inspired by and
expands upon dimensional modeling. Cube incorporates this approach and provides
a practical framework for implementing dataset-oriented data modeling.
8 changes: 0 additions & 8 deletions docs/pages/product/getting-started/cloud/load-data.mdx
@@ -5,14 +5,6 @@ redirect_from:

# Load data

- <InfoBox heading="Getting Started with Cube Cloud">
-
- [Join our Cube Cloud Office Hours on August 23rd](https://event.on24.com/wcc/r/4314738/57FE0F7DCCEA887295E4BCD818061E2B?partnerref=docs)
- to jumpstart your Cube Cloud account and gain insights on connecting data
- sources, creating data models, and integrating your data apps.
-
- </InfoBox>

The following steps will guide you through setting up a Snowflake account and
uploading the demo dataset, which is stored as CSV files in a public S3 bucket.

8 changes: 0 additions & 8 deletions docs/pages/product/getting-started/cloud/query-from-bi.mdx
@@ -5,14 +5,6 @@ redirect_from:

# Query from a BI tool

- <InfoBox heading="Getting Started with Cube Cloud">
-
- [Join our Cube Cloud Office Hours on August 23rd](https://event.on24.com/wcc/r/4314738/57FE0F7DCCEA887295E4BCD818061E2B?partnerref=docs)
- to jumpstart your Cube Cloud account and gain insights on connecting data
- sources, creating data models, and integrating your data apps.
-
- </InfoBox>

You can query Cube using a BI or visualization tool through the Cube SQL API. To
provide a good end-user experience in your BI tool, we recommend mapping the
BI's data model to Cube's semantic layer. This can be done automatically with
@@ -5,14 +5,6 @@ redirect_from:

# Query from a React app

- <InfoBox heading="Getting Started with Cube Cloud">
-
- [Join our Cube Cloud Office Hours on August 23rd](https://event.on24.com/wcc/r/4314738/57FE0F7DCCEA887295E4BCD818061E2B?partnerref=docs)
- to jumpstart your Cube Cloud account and gain insights on connecting data
- sources, creating data models, and integrating your data apps.
-
- </InfoBox>

Cube offers both [REST](/product/apis-integrations/rest-api) and
[GraphQL](/product/apis-integrations/graphql-api) APIs, which can be used to
query data from applications built in React or other frontend frameworks.
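As a sketch of what a REST call might look like, the snippet below builds a load query; the cube and member names (`orders.count`, `orders.status`, `orders.created_at`) are hypothetical, and authentication (the API token header) is omitted.

```javascript
// A hypothetical REST API query; substitute members from your own data model.
const query = {
  measures: ['orders.count'],
  dimensions: ['orders.status'],
  timeDimensions: [{ dimension: 'orders.created_at', granularity: 'month' }],
};

// The query travels URL-encoded in the `query` parameter of the
// /cubejs-api/v1/load endpoint.
const url = `/cubejs-api/v1/load?query=${encodeURIComponent(JSON.stringify(query))}`;
```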
16 changes: 8 additions & 8 deletions packages/cubejs-backend-native/Cargo.lock

Some generated files are not rendered by default.

@@ -103,8 +103,7 @@ export class DatabricksQuery extends BaseQuery {
templates.functions.BTRIM = 'TRIM({% if args[1] is defined %}{{ args[1] }} FROM {% endif %}{{ args[0] }})';
templates.functions.LTRIM = 'LTRIM({{ args|reverse|join(", ") }})';
templates.functions.RTRIM = 'RTRIM({{ args|reverse|join(", ") }})';
- // Databricks has a DATEDIFF function but produces values different from Redshift
- delete templates.functions.DATEDIFF;
+ templates.functions.DATEDIFF = 'DATEDIFF({{ date_part }}, DATE_TRUNC(\'{{ date_part }}\', {{ args[1] }}), DATE_TRUNC(\'{{ date_part }}\', {{ args[2] }}))';
return templates;
}
}
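To see why the replacement truncates both arguments, recall that Redshift's DATEDIFF counts crossed boundaries of the date part rather than elapsed intervals. The JavaScript sketch below (helper names are illustrative, not part of Cube) contrasts the two semantics for `day`:

```javascript
// Boundary-crossing semantics the new template targets: truncate both
// timestamps to the date part, then diff. 23:59 -> 00:01 the next day
// counts as 1 day even though only two minutes elapsed.
const MS_PER_DAY = 24 * 60 * 60 * 1000;

const truncToDay = (d) => Date.UTC(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate());

// Redshift-style DATEDIFF('day', a, b): number of day boundaries crossed.
const dateDiffDays = (a, b) => (truncToDay(b) - truncToDay(a)) / MS_PER_DAY;

const a = new Date(Date.UTC(2024, 0, 1, 23, 59));
const b = new Date(Date.UTC(2024, 0, 2, 0, 1));
console.log(dateDiffDays(a, b)); // 1: a day boundary was crossed
console.log((b - a) / MS_PER_DAY); // elapsed time, well under 1 day
```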
2 changes: 2 additions & 0 deletions packages/cubejs-schema-compiler/src/adapter/BaseQuery.js
@@ -2514,6 +2514,8 @@ export class BaseQuery {
in_list: '{{ expr }} {% if negated %}NOT {% endif %}IN ({{ in_exprs_concat }})',
negative: '-({{ expr }})',
not: 'NOT ({{ expr }})',
+ true: 'TRUE',
+ false: 'FALSE',
},
quotes: {
identifiers: '"',
3 changes: 2 additions & 1 deletion packages/cubejs-schema-compiler/src/adapter/PostgresQuery.ts
@@ -54,7 +54,8 @@ export class PostgresQuery extends BaseQuery {
templates.functions.NOW = 'NOW({{ args_concat }})';
// DATEADD is being rewritten to DATE_ADD
// templates.functions.DATEADD = '({{ args[2] }} + \'{{ interval }} {{ date_part }}\'::interval)';
- delete templates.functions.DATEDIFF;
+ // TODO: is DATEDIFF expr worth documenting?
+ templates.functions.DATEDIFF = 'CASE WHEN LOWER(\'{{ date_part }}\') IN (\'year\', \'quarter\', \'month\') THEN (EXTRACT(YEAR FROM AGE(DATE_TRUNC(\'{{ date_part }}\', {{ args[2] }}), DATE_TRUNC(\'{{ date_part }}\', {{ args[1] }}))) * 12 + EXTRACT(MONTH FROM AGE(DATE_TRUNC(\'{{ date_part }}\', {{ args[2] }}), DATE_TRUNC(\'{{ date_part }}\', {{ args[1] }})))) / CASE LOWER(\'{{ date_part }}\') WHEN \'year\' THEN 12 WHEN \'quarter\' THEN 3 WHEN \'month\' THEN 1 END ELSE EXTRACT(EPOCH FROM DATE_TRUNC(\'{{ date_part }}\', {{ args[2] }}) - DATE_TRUNC(\'{{ date_part }}\', {{ args[1] }})) / EXTRACT(EPOCH FROM \'1 {{ date_part }}\'::interval) END::bigint';
templates.expressions.interval = 'INTERVAL \'{{ interval }}\'';
templates.expressions.extract = 'EXTRACT({{ date_part }} FROM {{ expr }})';

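The added template emulates DATEDIFF with two branches: for month-based parts (`year`, `quarter`, `month`) it counts whole months between the truncated dates via AGE and divides by months-per-part; for time-based parts it divides the epoch difference of the truncated values by the part's interval length. A JavaScript sketch of the month-based branch (function names are illustrative, not part of Cube):

```javascript
// Month-based branch of the emulated DATEDIFF: truncate each date to the
// start of its part, count whole months between them, then divide by
// months-per-part (12 for year, 3 for quarter, 1 for month).
const MONTHS_PER_PART = { year: 12, quarter: 3, month: 1 };

// Truncate a zero-based month to the start of the given part.
const truncMonth = (part, month) =>
  part === 'year' ? 0 : part === 'quarter' ? month - (month % 3) : month;

function dateDiffMonths(part, from, to) {
  const index = (d) => d.getUTCFullYear() * 12 + truncMonth(part, d.getUTCMonth());
  return (index(to) - index(from)) / MONTHS_PER_PART[part];
}

// Nov 30, 2023 (Q4) to Jan 1, 2024 (Q1) crosses exactly one quarter boundary.
console.log(dateDiffMonths('quarter',
  new Date(Date.UTC(2023, 10, 30)), new Date(Date.UTC(2024, 0, 1)))); // 1
```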
@@ -15,6 +15,7 @@ export class RedshiftQuery extends PostgresQuery {
public sqlTemplates() {
const templates = super.sqlTemplates();
templates.functions.DLOG10 = 'LOG(10, {{ args_concat }})';
+ templates.functions.DATEDIFF = 'DATEDIFF({{ date_part }}, {{ args[1] }}, {{ args[2] }})';
delete templates.functions.COVAR_POP;
delete templates.functions.COVAR_SAMP;
return templates;
16 changes: 8 additions & 8 deletions rust/cubesql/Cargo.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion rust/cubesql/cubesql/Cargo.toml
@@ -10,7 +10,7 @@ homepage = "https://cube.dev"

[dependencies]
arc-swap = "1"
- datafusion = { git = 'https://github.com/cube-js/arrow-datafusion.git', rev = "2841b5a0383f15ee5dedead6e2155fb7c398d35b", default-features = false, features = ["regex_expressions", "unicode_expressions"] }
+ datafusion = { git = 'https://github.com/cube-js/arrow-datafusion.git', rev = "3c85ef6583587f5b0b037be5810e979bede9c7dc", default-features = false, features = ["regex_expressions", "unicode_expressions"] }
anyhow = "1.0"
thiserror = "1.0.50"
cubeclient = { path = "../cubeclient" }
@@ -1,9 +1,8 @@
---
source: cubesql/e2e/tests/postgres.rs
- assertion_line: 297
expression: "self.print_query_result(res, with_description, true).await"
---
- Utf8(NULL) type: 25 (text)
+ NULL type: 25 (text)
f32 type: 700 (float4)
f64 type: 701 (float8)
i16 type: 21 (int2)
@@ -27,8 +26,8 @@ interval_month_day_nano type: 1186 (interval)
str_arr type: 1009 (_text)
i64_arr type: 1016 (_int8)
f64_arr type: 1022 (_float8)
- +------------+-------+-------+-----+-----+-----+-----+-----+-----+-----------+------------+------+----+------+---------+--------------+------------+----------------------------+---------------------+-------------------+-------------------------------+-------------+---------+-------------+
- | Utf8(NULL) | f32 | f64 | i16 | u16 | i32 | u32 | i64 | u64 | bool_true | bool_false | str | d0 | d2 | d5 | d10 | date | tsmp | interval_year_month | interval_day_time | interval_month_day_nano | str_arr | i64_arr | f64_arr |
- +------------+-------+-------+-----+-----+-----+-----+-----+-----+-----------+------------+------+----+------+---------+--------------+------------+----------------------------+---------------------+-------------------+-------------------------------+-------------+---------+-------------+
- | NULL | 1.234 | 1.234 | 1 | 1 | 1 | 1 | 1 | 1 | true | false | test | 1 | 1.25 | 1.25000 | 1.2500000000 | 2022-04-25 | 2022-04-25 16:25:01.164774 | 1 year 1 mons | 01:30:00 | 1 year 1 mons 1 days 01:30:00 | test1,test2 | 1,2,3 | 1.2,2.3,3.4 |
- +------------+-------+-------+-----+-----+-----+-----+-----+-----+-----------+------------+------+----+------+---------+--------------+------------+----------------------------+---------------------+-------------------+-------------------------------+-------------+---------+-------------+
+ +------+-------+-------+-----+-----+-----+-----+-----+-----+-----------+------------+------+----+------+---------+--------------+------------+----------------------------+---------------------+-------------------+-------------------------------+-------------+---------+-------------+
+ | NULL | f32 | f64 | i16 | u16 | i32 | u32 | i64 | u64 | bool_true | bool_false | str | d0 | d2 | d5 | d10 | date | tsmp | interval_year_month | interval_day_time | interval_month_day_nano | str_arr | i64_arr | f64_arr |
+ +------+-------+-------+-----+-----+-----+-----+-----+-----+-----------+------------+------+----+------+---------+--------------+------------+----------------------------+---------------------+-------------------+-------------------------------+-------------+---------+-------------+
+ | NULL | 1.234 | 1.234 | 1 | 1 | 1 | 1 | 1 | 1 | true | false | test | 1 | 1.25 | 1.25000 | 1.2500000000 | 2022-04-25 | 2022-04-25 16:25:01.164774 | 1 year 1 mons | 01:30:00 | 1 year 1 mons 1 days 01:30:00 | test1,test2 | 1,2,3 | 1.2,2.3,3.4 |
+ +------+-------+-------+-----+-----+-----+-----+-----+-----+-----------+------------+------+----+------+---------+--------------+------------+----------------------------+---------------------+-------------------+-------------------------------+-------------+---------+-------------+
