Merge pull request #958 from svilupp/fix-mini-typos-in-docs
Fix mini typos in docs
ablaom authored Aug 1, 2022
2 parents 2828c60 + d82d1c4 commit 56ea42d
Showing 38 changed files with 262 additions and 262 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -47,7 +47,7 @@ Science Investment
Fund](https://www.mbie.govt.nz/science-and-technology/science-and-innovation/funding-information-and-opportunities/investment-funds/strategic-science-investment-fund/ssif-funded-programmes/university-of-auckland/)
awarded to the University of Auckland.

-MLJ been developed with the support of the following organizations:
+MLJ has been developed with the support of the following organizations:

<div align="center">
<img src="material/Turing_logo.png" width = 100/>
@@ -60,7 +60,7 @@ MLJ been developed with the support of the following organizations:

### The MLJ Universe

-The functionality of MLJ is distributed over a number of repositories
+The functionality of MLJ is distributed over several repositories
illustrated in the dependency chart below. These repositories live at
the [JuliaAI](https://github.com/JuliaAI) umbrella organization.

30 changes: 15 additions & 15 deletions ROADMAP.md
@@ -43,7 +43,7 @@ list](https://github.com/alan-turing-institute/MLJ.jl/issues/673).
### Adding models

- [ ] **Integrate deep learning** using [Flux.jl](https://github.com/FluxML/Flux.jl.git) deep learning. [Done](https://github.com/FluxML/MLJFlux.jl) but can
-improve experience by:
+improve the experience by:

- [x] finishing iterative model wrapper [#139](https://github.com/alan-turing-institute/MLJ.jl/issues/139)

@@ -86,7 +86,7 @@ list](https://github.com/alan-turing-institute/MLJ.jl/issues/673).
[done](https://github.com/JuliaAI/MLJTuning.jl/pull/96)

- [ ] Bayesian methods, starting with Gaussian Process methods a
-la PyMC3. Some preliminary research done .
+la PyMC3. Some preliminary research done.

- [ ] POC for AD-powered gradient descent [#74](https://github.com/alan-turing-institute/MLJ.jl/issues/74)

@@ -99,36 +99,36 @@ list](https://github.com/alan-turing-institute/MLJ.jl/issues/673).
- [ ] Genetic algorithms
[#38](https://github.com/alan-turing-institute/MLJ.jl/issues/38)

-- [ ] Particle Swarm Optization (current WIP, GSoC project @lhnguyen-vn)
+- [ ] Particle Swarm Optimization (current WIP, GSoC project @lhnguyen-vn)

- [ ] tuning strategies for non-Cartesian spaces of models [MLJTuning
#18](https://github.com/JuliaAI/MLJTuning.jl/issues/18), architecture search, and other AutoML workflows

- [ ] Systematic benchmarking, probably modeled on
[MLaut](https://arxiv.org/abs/1901.03678) [#69](https://github.com/alan-turing-institute/MLJ.jl/issues/74)

-- [ ] Give `EnsembleModel` more extendible API and extend beyond bagging
-(boosting, etc) and migrate to separate repository?
+- [ ] Give `EnsembleModel` a more extendible API and extend beyond bagging
+(boosting, etc) and migrate to a separate repository?
[#363](https://github.com/alan-turing-institute/MLJ.jl/issues/363)

-- [ ] **** Enhance complex model compostition:
+- [ ] **** Enhance complex model composition:

- [x] Introduce a canned
stacking model wrapper ([POC](https://alan-turing-institute.github.io/DataScienceTutorials.jl/getting-started/stacking/)). WIP @olivierlabayle

- [ ] Get rid of macros for creating pipelines and possibly
-implement target transforms as wrapper ([MLJBase
+implement target transforms as wrappers ([MLJBase
#594](https://github.com/alan-turing-institute/MLJ.jl/issues/594))
WIP @CameronBieganek and @ablaom


### Broadening scope

-- [ ] Integrate causal and counterfactual methods for, example,
+- [ ] Integrate causal and counterfactual methods for example,
applications to FAIRness; see [this
proposal](https://julialang.org/jsoc/gsoc/MLJ/#causal_and_counterfactual_methods_for_fairness_in_machine_learning)

-- [ ] Explore possibility of closer integration of Interpretable
+- [ ] Explore the possibility of closer integration of Interpretable
Machine Learning approaches, such as Shapley values and lime; see
[Shapley.jl](https://gitlab.com/ExpandingMan/Shapley.jl),
[ShapML.jl](https://github.com/nredell/ShapML.jl),
@@ -146,7 +146,7 @@ list](https://github.com/alan-turing-institute/MLJ.jl/issues/673).
- [ ] Add sparse data support and better support for NLP models; we
could use [NaiveBayes.jl](https://github.com/dfdx/NaiveBayes.jl)
as a POC (currently wrapped only for dense input) but the API
-needs finalizing first
+needs to be finalized first
{#731](https://github.com/alan-turing-institute/MLJ.jl/issues/731). Probably
need a new SparseTables.jl package.

@@ -159,27 +159,27 @@ list](https://github.com/alan-turing-institute/MLJ.jl/issues/673).
first, and someone to finish [PR on time series
CV](https://github.com/JuliaAI/MLJBase.jl/pull/331). See also [this proposal](https://julialang.org/jsoc/gsoc/MLJ/#time_series_forecasting_at_scale_-_speed_up_via_julia)

-- [ ] Add tools or separate repository for visualization in MLJ.
+- [ ] Add tools or a separate repository for visualization in MLJ.

- [x] Extend visualization of tuning plots beyond two-parameters
[#85](https://github.com/alan-turing-institute/MLJ.jl/issues/85)
(closed).
[#416](https://github.com/alan-turing-institute/MLJ.jl/issues/416)
[Done](https://github.com/JuliaAI/MLJTuning.jl/pull/121) but might be worth adding alternatives suggested in issue.

-- [ ] visualizing decision boundaries ? [#342](https://github.com/alan-turing-institute/MLJ.jl/issues/342)
+- [ ] visualizing decision boundaries? [#342](https://github.com/alan-turing-institute/MLJ.jl/issues/342)

- [ ] provide visualizations that MLR3 provides via [mlr3viz](https://github.com/mlr-org/mlr3viz)

-- [ ] Extend API to accomodate outlier detection, as provided by [OutlierDetection.jl](https://github.com/davnn/OutlierDetection.jl) [#780](https://github.com/alan-turing-institute/MLJ.jl/issues/780) WIP @davn and @ablaom
+- [ ] Extend API to accommodate outlier detection, as provided by [OutlierDetection.jl](https://github.com/davnn/OutlierDetection.jl) [#780](https://github.com/alan-turing-institute/MLJ.jl/issues/780) WIP @davn and @ablaom

- [ ] Add more pre-processing tools:

-- [x] missing value imputation using Gaussina Mixture Model. Done,
+- [x] missing value imputation using Gaussian Mixture Model. Done,
via addition of BetaML model, `MissingImputator`.

- [ ] improve `autotype` method (from ScientificTypes), perhaps by
-training on large collection of datasets with manually labelled
+training on a large collection of datasets with manually labelled
scitype schema.

- [ ] Add integration with [MLFlow](https://julialang.org/jsoc/gsoc/MLJ/#mlj_and_mlflow_integration); see [this proposal](https://julialang.org/jsoc/gsoc/MLJ/#mlj_and_mlflow_integration)
2 changes: 1 addition & 1 deletion docs/make.jl
@@ -43,7 +43,7 @@ pages = [
"Learning Curves" => "learning_curves.md",
"Preparing Data" => "preparing_data.md",
"Transformers and Other Unsupervised models" => "transformers.md",
-"More on Probablistic Predictors" => "more_on_probabilistic_predictors.md",
+"More on Probabilistic Predictors" => "more_on_probabilistic_predictors.md",
"Composing Models" => "composing_models.md",
"Linear Pipelines" => "linear_pipelines.md",
"Target Transformations" => "target_transformations.md",
16 changes: 8 additions & 8 deletions docs/src/about_mlj.md
@@ -7,7 +7,7 @@ models](@ref model_list) written in Julia and other languages. In
particular MLJ wraps a large number of
[scikit-learn](https://scikit-learn.org/stable/) models.

-MLJ is released under the MIT licensed.
+MLJ is released under the MIT license.

## Lightning tour

@@ -47,7 +47,7 @@ booster = Booster(max_depth=2) # specify hyper-parameter at construction
booster.nrounds=50 # or mutate afterwards
```

-This model is an example of an iterative model. As is stands, the
+This model is an example of an iterative model. As it stands, the
number of iterations `nrounds` is fixed.


@@ -146,7 +146,7 @@ Extract:
* Data agnostic, train models on any data supported by the
[Tables.jl](https://github.com/JuliaData/Tables.jl) interface.

-* Extensive, state-of-the art, support for model composition
+* Extensive, state-of-the-art, support for model composition
(*pipelines*, *stacks* and, more generally, *learning networks*). See more
[below](#model-composability).

@@ -156,7 +156,7 @@ Extract:

* Extensible [tuning
interface](https://github.com/JuliaAI/MLJTuning.jl),
-to support growing number of optimization strategies, and designed
+to support a growing number of optimization strategies, and designed
to play well with model composition.

* Options to accelerate model evaluation and tuning with
@@ -225,7 +225,7 @@ See also, [Known Issues](@ref)

## Installation

-Initially it is recommended that MLJ and associated packages be
+Initially, it is recommended that MLJ and associated packages be
installed in a new
[environment](https://julialang.github.io/Pkg.jl/v1/environments/) to
avoid package conflicts. You can do this with
@@ -247,7 +247,7 @@ julia> Pkg.test("MLJ")
```

It is important to note that MLJ is essentially a big wrapper
-providing unified access to _model providing packages_. For this
+providing unified access to _model-providing packages_. For this
reason, one generally needs to add further packages to your
environment to make model-specific code available. This
happens automatically when you use MLJ's interactive load command
@@ -267,12 +267,12 @@ module or function) see [Loading Model Code](@ref).
It is recommended that you start with models from more mature
packages such as DecisionTree.jl, ScikitLearn.jl or XGBoost.jl.

-MLJ is supported by a number of satellite packages (MLJTuning,
+MLJ is supported by several satellite packages (MLJTuning,
MLJModelInterface, etc) which the general user is *not* required to
install directly. Developers can learn more about these
[here](https://github.com/alan-turing-institute/MLJ.jl/blob/master/ORGANIZATION.md).

-See also the alternative instalation instructions for [Modifying Behavior](@ref).
+See also the alternative installation instructions for [Modifying Behavior](@ref).


## Funding
