Embed lang #132

Draft — wants to merge 6 commits into base: master
103 changes: 77 additions & 26 deletions README.md
@@ -4,7 +4,7 @@

# Info

Rewatch is an alternative build system for the [ReScript Compiler](https://rescript-lang.org/) (which uses a combination of Ninja, OCaml, and a Node.js script). It strives to deliver consistent and faster builds in monorepo setups. Bsb doesn't support a watch mode in a monorepo setup, and a watcher that instead runs a global incremental compile is consistent but very inefficient, and thus slow.

We couldn't find a way to improve this without re-architecting the whole build system. The benefit of a specialized build system is that it can be tailored completely to ReScript instead of being bound by the constraints of a generic build system like Ninja. This allowed significant performance improvements even in non-monorepo setups (30% to 3x improvements reported).

@@ -14,27 +14,27 @@ This project should be considered in beta status. We run it in production at [Wa

# Usage

1. Install the package

```
yarn add @rolandpeelen/rewatch
```

2. Build / Clean / Watch

```
yarn rewatch build
```

```
yarn rewatch clean
```

```
yarn rewatch watch
```

You can pass the folder where the 'root' `bsconfig.json` lives as the second argument. If you encounter a 'stale build error', either immediately or after a while, a `clean` may be needed to clear out old compiler assets.

## Full Options

@@ -67,7 +67,7 @@

-c, --create-sourcedirs <CREATE_SOURCEDIRS>
This creates a source_dirs.json file at the root of the monorepo, which is needed when you want to use Reanalyze

[possible values: true, false]

--compiler-args <COMPILER_ARGS>
@@ -88,16 +88,67 @@

# Contributing

Pre-requisites:

- [Rust](https://rustup.rs/)
- [NodeJS](https://nodejs.org/en/) - For running test scripts only
- [Yarn](https://yarnpkg.com/) or [Npm](https://www.npmjs.com/) - Npm probably comes with your node installation

1. `cd testrepo && yarn` (install dependencies for submodule)
2. `cargo run`

Running tests:

1. `cargo build --release`
2. `./tests/suite.sh`

### embed-lang

- Parse -> `MyModule.res`
  -> the ReScript parser would generate (and pass along the embeds):

  ```
  MyModule.embeds
  [
    {
      "name": "MyModule.graphql.0.embeds.res",
      "content": "query MyQuery { ... }",
      "hash": "123" <- fast hash of the rescript file
    },
    {
      "name": "MyModule.graphql.1.embeds.res",
      "content": "query MyQuery { ... }"
    }
  ]
  ```
- After parsing everything, track all the embeds in the build state
  - remove the embeds that are extraneous
  - read the first line of each embed -> mark it dirty or not && check that the ReScript file is there
  - track the embeds in the compiler state
- Run the embeds
  - run the dirty embeds:
    STDIN -> rescript-code -> STDOUT / STDERR (exit code)
    -> /lib/ocaml/__generated__/MyModule.graphql.0.res
    -> /lib/ocaml/__generated__/MyModule.graphql.1.res

-> Parse the outputs of the embeds
-> Determine the dependency tree (and add the embeds as deps)
-> Run the compiler
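The STDIN/STDOUT generator step above can be sketched in Rust. This is a minimal illustration, not the PR's code: `cat` stands in for a real generator executable (assuming a Unix environment), and `run_generator` is a hypothetical helper name.

```rust
use std::io::Write;
use std::process::{Command, Stdio};

// Feed the embed contents to a generator over STDIN and collect the
// generated ReScript code from STDOUT, mirroring the pipeline above.
fn run_generator(generator: &str, embed_contents: &str) -> std::io::Result<String> {
    let mut child = Command::new(generator)
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;

    // Write the embed source to the generator's stdin; the handle is
    // dropped at the end of the statement, which closes the pipe.
    child
        .stdin
        .take()
        .expect("stdin was piped")
        .write_all(embed_contents.as_bytes())?;

    // Wait for the generator to exit and collect its stdout.
    let output = child.wait_with_output()?;
    Ok(String::from_utf8_lossy(&output.stdout).into_owned())
}

fn main() -> std::io::Result<()> {
    // `cat` simply echoes its input, standing in for a real generator.
    let generated = run_generator("cat", "query MyQuery { ... }")?;
    println!("{}", generated);
    Ok(())
}
```

A real implementation would also read STDERR and the exit code to surface generator errors, as the error variants later in this PR suggest.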

#### configuration of embeds

- bsconfig.json

```json
{
"embed-generators": [
{
"name": "graphql",
"tags": ["graphql"],
"path": "./path/to/graphql/embed",
"package": "my-generator-package"
}
]
}
```

-> Profit
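The "read the first line of the embed -> mark dirty or not" step described earlier can be sketched as follows. The hashing scheme and helper names here are assumptions for illustration, not the PR's actual code.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// A stand-in for the "fast hash of the rescript file" mentioned above.
fn fast_hash(source: &str) -> String {
    let mut hasher = DefaultHasher::new();
    source.hash(&mut hasher);
    format!("{:x}", hasher.finish())
}

// The first line of a generated embed file records the hash of the source
// .res file; if the stored hash no longer matches, the embed is dirty and
// must be regenerated.
fn is_dirty(generated_file: &str, current_source: &str) -> bool {
    let stored_hash = generated_file.lines().next().unwrap_or("");
    stored_hash != fast_hash(current_source)
}

fn main() {
    let source = "let x = %graphql(`query MyQuery { ... }`)";
    let generated = format!("{}\n// ...generated code...", fast_hash(source));
    assert!(!is_dirty(&generated, source)); // unchanged source: clean
    assert!(is_dirty(&generated, "let x = 1")); // edited source: dirty
    println!("dirty check ok");
}
```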
26 changes: 26 additions & 0 deletions src/bsconfig.rs
@@ -157,13 +157,23 @@ pub struct Config {
pub namespace: Option<NamespaceConfig>,
pub jsx: Option<JsxSpecs>,
pub uncurried: Option<bool>,
#[serde(rename = "embed-generators")]
pub embed_generators: Option<Vec<EmbedGenerator>>,

// this is a new feature of rewatch, and it's not part of the bsconfig.json spec
#[serde(rename = "namespace-entry")]
pub namespace_entry: Option<String>,
// this is a new feature of rewatch, and it's not part of the bsconfig.json spec
#[serde(rename = "allowed-dependents")]
pub allowed_dependents: Option<Vec<String>>,
}
#[derive(Deserialize, Debug, Clone)]
pub struct EmbedGenerator {
pub name: String,
pub tags: Vec<String>,
pub path: String,
pub package: Option<String>,
}

/// This flattens string flags
pub fn flatten_flags(flags: &Option<Vec<OneOrMore<String>>>) -> Vec<String> {
@@ -250,6 +260,22 @@ fn namespace_from_package_name(package_name: &str) -> String {
.to_case(Case::Pascal)
}

pub fn get_embed_generators_bsc_flags(config: &Config) -> Vec<String> {
config
.embed_generators
.as_ref()
.unwrap_or(&vec![])
.iter()
.flat_map(|generator| {
generator
.tags
.iter()
.map(|tag| vec![format!("-embed"), tag.to_string()])
})
.collect::<Vec<Vec<String>>>()
.concat()
}

impl Config {
pub fn get_namespace(&self) -> packages::Namespace {
let namespace_from_package = namespace_from_package_name(&self.name);
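A dependency-free sketch of the tag-to-flag expansion that `get_embed_generators_bsc_flags` performs, with the `Config`/serde plumbing omitted: every tag of every configured generator expands to a `-embed <tag>` pair of compiler flags.

```rust
// Simplified stand-in for the bsconfig EmbedGenerator struct.
struct EmbedGenerator {
    tags: Vec<String>,
}

// Expand each configured tag into a ["-embed", tag] flag pair.
fn embed_flags(generators: &[EmbedGenerator]) -> Vec<String> {
    generators
        .iter()
        .flat_map(|g| g.tags.iter())
        .flat_map(|tag| ["-embed".to_string(), tag.clone()])
        .collect()
}

fn main() {
    let gens = vec![EmbedGenerator {
        tags: vec!["graphql".into(), "sql.one".into()],
    }];
    let flags = embed_flags(&gens);
    assert_eq!(flags, vec!["-embed", "graphql", "-embed", "sql.one"]);
    println!("{:?}", flags);
}
```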
10 changes: 10 additions & 0 deletions src/build.rs
@@ -278,6 +278,16 @@ pub fn incremental_build(

let timing_ast = Instant::now();
let result_asts = parse::generate_asts(build_state, || pb.inc(1));
// Run a second parse pass (e.g. to pick up files produced by the embed
// generators during the first pass) and concatenate the output of both.
let result_asts = match result_asts {
Ok(err) => {
let result_asts = parse::generate_asts(build_state, || pb.inc(1));
match result_asts {
Ok(new_err) => Ok(err + &new_err),
Err(new_err) => Err(err + &new_err),
}
}
Err(err) => Err(err),
};
let timing_ast_elapsed = timing_ast.elapsed();

match result_asts {
97 changes: 97 additions & 0 deletions src/build/build_types.rs
@@ -1,5 +1,6 @@
use crate::build::packages::{Namespace, Package};
use ahash::{AHashMap, AHashSet};
use serde::{Deserialize, Serialize};
use std::time::SystemTime;

#[derive(Debug, Clone, PartialEq)]
@@ -35,10 +36,106 @@ pub struct Implementation {
pub parse_dirty: bool,
}

#[derive(Deserialize, Serialize, Debug, Clone, PartialEq)]
pub struct Location {
pub line: u32,
pub col: u32,
}

#[derive(Deserialize, Serialize, Debug, Clone, PartialEq)]
pub struct EmbedLoc {
pub start: Location,
pub end: Location,
}

// example of the *.embeds.json file
// [
// {
// "tag": "sql.one",
// "filename": "Tst__sql_one_1.res",
// "contents": "\n SELECT * FROM tst.res\n WHERE id = 1\n",
// "loc": {"start": {"line": 1, "col": 22}, "end": {"line": 4, "col": 64}}
// },
// {
// "tag": "sql.many",
// "filename": "Tst__sql_many_1.res",
// "contents": "\n SELECT * FROM tst.res\n WHERE id > 1\n",
// "loc": {"start": {"line": 6, "col": 86}, "end": {"line": 9, "col": 128}}
// },
// {
// "tag": "sql.one",
// "filename": "Tst__sql_one_2.res",
// "contents":

#[derive(Deserialize, Debug, Clone, PartialEq)]
pub struct EmbedJsonData {
pub tag: String,
pub filename: String,
pub contents: String,
pub loc: EmbedLoc,
}

#[derive(Debug, Clone, PartialEq)]
pub struct Embed {
pub embed: EmbedJsonData,
pub hash: String,
pub dirty: bool,
}

#[derive(Serialize, Clone, PartialEq, Eq)]
pub struct EmbedGeneratorConfig {
pub tag: String,
pub content: String,
pub source_file_path: String,
}

#[derive(Serialize, Deserialize, Debug)]
pub struct EmbedGeneratorResponseOk {
pub content: String,
}

#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct EmbedGeneratorError {
pub message: String,
pub loc: EmbedLoc,
}

#[derive(Serialize, Deserialize, Debug)]
pub struct EmbedGeneratorResponseError {
pub errors: Vec<EmbedGeneratorError>,
}

#[derive(Debug, Clone)]
pub struct GeneratorReturnedError {
pub errors: Vec<EmbedGeneratorError>,
pub source_file_path: String,
pub embed_data: EmbedJsonData,
pub package_name: String,
}

#[derive(Debug, Clone)]
pub enum ProcessEmbedsErrorReason {
EmbedsJsonFileCouldNotBeRead(String),
EmbedsJsonDataParseError(String),
CouldNotWriteToGeneratorStdin(String, String),
RunningGeneratorCommandFailed(String, String),
GeneratorReturnedError(GeneratorReturnedError),
GeneratorReturnedInvalidJSON(String),
CouldNotWriteGeneratedFile(String),
NoEmbedGeneratorFoundForTag(String),
}

#[derive(Debug, Clone)]
pub struct ProcessEmbedsError {
pub reason: ProcessEmbedsErrorReason,
pub generator_name: Option<String>,
}

#[derive(Debug, Clone, PartialEq)]
pub struct SourceFile {
pub implementation: Implementation,
pub interface: Option<Interface>,
pub embeds: Vec<Embed>, // embeds extracted from this source file
}

#[derive(Debug, Clone, PartialEq)]
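As an illustration of the `NoEmbedGeneratorFoundForTag` variant above, a hypothetical lookup from an embed tag to a configured generator might look like this (the struct and function names here are illustrative, not the PR's code):

```rust
// Simplified stand-in for the bsconfig EmbedGenerator struct.
struct EmbedGenerator {
    name: String,
    tags: Vec<String>,
}

// Find the first configured generator that claims the given tag; a miss
// corresponds to the NoEmbedGeneratorFoundForTag error case.
fn find_generator<'a>(
    generators: &'a [EmbedGenerator],
    tag: &str,
) -> Result<&'a EmbedGenerator, String> {
    generators
        .iter()
        .find(|g| g.tags.iter().any(|t| t == tag))
        .ok_or_else(|| format!("no embed generator found for tag: {}", tag))
}

fn main() {
    let gens = vec![EmbedGenerator {
        name: "graphql".into(),
        tags: vec!["graphql".into()],
    }];
    println!("{}", find_generator(&gens, "graphql").unwrap().name);
    assert!(find_generator(&gens, "sql.one").is_err());
}
```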
8 changes: 8 additions & 0 deletions src/build/packages.rs
@@ -50,6 +50,7 @@ pub struct Package {
pub name: String,
pub bsconfig: bsconfig::Config,
pub source_folders: AHashSet<bsconfig::PackageSource>,
pub generated_file_folder: PathBuf,
// these are the relative file paths (relative to the package root)
pub source_files: Option<AHashMap<String, SourceFileMeta>>,
pub namespace: Namespace,
@@ -381,6 +382,11 @@ fn make_package(
source_files: None,
namespace: bsconfig.get_namespace(),
modules: None,
generated_file_folder: match &bsconfig.sources {
bsconfig::OneOrMore::Single(source) => {
PathBuf::from(bsconfig::to_qualified_without_children(source, None).dir)
.join("__generated__")
}
bsconfig::OneOrMore::Multiple(sources) if !sources.is_empty() => {
PathBuf::from(bsconfig::to_qualified_without_children(&sources[0], None).dir)
.join("__generated__")
}
_ => panic!("Error: Invalid or empty sources configuration in bsconfig.json. Please ensure at least one valid source is specified."),
},
// we canonicalize the path name so it's always the same
path: PathBuf::from(package_path)
.canonicalize()
Expand Down Expand Up @@ -652,6 +658,7 @@ pub fn parse_packages(build_state: &mut BuildState) {
parse_dirty: true,
},
interface: None,
embeds: vec![],
}),
deps: AHashSet::new(),
dependents: AHashSet::new(),
@@ -705,6 +712,7 @@
last_modified: metadata.modified,
parse_dirty: true,
}),
embeds: vec![],
}),
deps: AHashSet::new(),
dependents: AHashSet::new(),
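The `generated_file_folder` selection above places a `__generated__` folder under the first configured source directory. A simplified, dependency-free sketch of that choice (the `bsconfig` path resolution is elided, and the function name is illustrative):

```rust
use std::path::PathBuf;

// Put the __generated__ folder under the first configured source dir,
// mirroring the make_package logic above.
fn generated_file_folder(source_dirs: &[&str]) -> PathBuf {
    let first = source_dirs
        .first()
        .expect("bsconfig.json must configure at least one source");
    PathBuf::from(first).join("__generated__")
}

fn main() {
    let folder = generated_file_folder(&["src"]);
    assert_eq!(folder, PathBuf::from("src/__generated__"));
    println!("{}", folder.display());
}
```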