Releases: JuliaAI/MLJ.jl

MLJ v0.11.1 (29 Apr 10:05, commit 7bd2ed1)

Diff since v0.11.0

Minor issues only.

Closed issues:

  • Add sample-weight interface point? (#177)
  • Add default_measure to learning_curve! (#283)
  • Flush out unsupervised models in "Adding models for general use" section of manual (#285)
  • is_probabilistic=true in @pipeline syntax is clunky (#305)
  • [suggestions] Unroll the network in @from_network (#311)
  • Towards stabilisation of the core API (#318)
  • failure on nightly (1.4) (#384)
  • Documentation of extracting best fitted params (#386)
  • incorporate input_scitype and target_scitype declarations for @pipeline models (#412)
  • "Supervised" models with no predict method (#460)
  • Use OpenML.load to iris data set in the Getting Started page of docs? (#461)
  • Review cheatsheet (#474)
  • Re-export UnsupervisedNetwork from MLJBase (#497)
  • Broken link for MLJ tour in documentation (#501)

MLJ v0.11.0 (24 Apr 20:05, commit 3936fd2)

Diff since v0.10.3

Make compatibility updates to MLJBase and MLJModels to effect the following changes to MLJ (see the linked release notes for links to the issues/PRs):

  • (new models) Add the LightGBM models LightGBMClassifier and
    LightGBMRegressor

  • (new model) Add a new built-in model, ContinuousEncoder, for
    transforming all features of a table to the Continuous scitype,
    dropping any features that cannot be so transformed (usage sketch
    after this list)

  • (new model) Add ParallelKMeans model, KMeans, loaded with
    @load KMeans pkg=ParallelKMeans

  • (mildly breaking enhancement) Arrange for the CV resampling
    strategy to spread fold "remainders" evenly among the folds in
    train_test_pairs(::CV, ...) (a small change, only noticeable on
    small datasets)

  • (breaking) Restyle report and fitted_params for exported
    learning networks (e.g., pipelines) to include a dictionary of reports or
    fitted_params, keyed on the machines in the underlying learning
    network. New doc-strings detail the new behaviour.

  • (enhancement) Allow calling of transform on machines with Static models without
    first calling fit!

  • Allow machine constructor to work on supervised models that take nothing for
    the input features X (for models that simply fit a
    sampler/distribution to the target data y) (#51)
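
A minimal usage sketch of the new ContinuousEncoder follows; the column names and data are illustrative only:

    using MLJ

    # Table with mixed scitypes; the Textual column cannot be converted and
    # is therefore dropped by the encoder:
    X = (height   = [1.85, 1.67, 1.50],
         gender   = coerce(["m", "f", "f"], Multiclass),
         comments = ["the tall one", "average", "the short one"])

    encoder = ContinuousEncoder()
    mach = fit!(machine(encoder, X))
    Xcont = transform(mach, X)   # remaining features all have Continuous scitype
    schema(Xcont)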

Also:

  • (documentation) In the "Adding New Models for General Use"
    section of the manual, add detail on how to wrap unsupervised
    models, as well as models that fit a sampler/distribution to data

  • (documentation) Expand the "Transformers" sections of the
    manual, including more material on static transformers and
    transformers that implement predict (#393)

Closed issues:

  • Add tuning by stochastic search (#37)
  • Improve documentation around static transformers (#393)
  • Error in docs for model search (#478)
  • Update [compat] StatsBase="^0.32,^0.33" (#481)
  • For a 0.10.3 release (#483)
  • Help with coercing strings for binary data into Continuous variables (#489)
  • EvoTree Error (#490)
  • Add info with workaround to avoid MKL error (#491)
  • LogisticClassifier pkg = MLJLinearModels computes a number of coefficients but not the same number of mean_and_std_given_feature (#492)
  • MethodError: no method matching... (#493)
  • For a 0.10.4 release (#495)
  • Error: fitted_params(LogisticModel) (#498)

MLJ v0.10.3 (03 Apr 18:05, commit 7ff111c)

Diff since v0.10.2

  • Allow MLJ to use StatsBase v0.33 (PR #484, #481)

  • Enable use of RandomSearch tuning strategy (PR #482, #37)
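
A minimal sketch of tuning with the new RandomSearch strategy, assuming DecisionTree.jl is installed; the model, range and settings are illustrative only:

    using MLJ

    X, y = @load_iris
    tree = @load DecisionTreeClassifier pkg=DecisionTree

    # Tune max_depth by evaluating 20 randomly sampled values:
    r = range(tree, :max_depth, lower=1, upper=10)
    tuned_tree = TunedModel(model=tree,
                            tuning=RandomSearch(),
                            range=r,
                            resampling=CV(nfolds=3),
                            measure=cross_entropy,
                            n=20)
    mach = fit!(machine(tuned_tree, X, y))
    fitted_params(mach).best_model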

Merged pull requests:

  • Enable hyper-parameter tuning using random search (#482) (@ablaom)
  • Extend [compat] StatsBase = "^0.32,^0.33" (#484) (@ablaom)
  • For a 0.10.3 release (#485) (@ablaom)

MLJ v0.10.2 (25 Mar 09:05, commit 5c2bed1)

Diff since v0.10.1

  • Extend [compat] Distributions = "^0.21,^0.22,^0.23"

  • Minor doc fixes

Closed issues:

  • Task design discussion (#166)
  • Non-normalized versions of measures (#445)
  • Overload model traits to work on the named-tuple "proxies" for models listed by models() (#464)
  • Multiprocess issue (#468)
  • Julia v1.4.0 is downloading MLJ v0.2.3 instead of MLJ v0.10.1 (#476)

MLJ v0.10.1 (14 Mar 21:07, commit 4f229c1)

Diff since v0.10.0

(enhancement) Add serialization for machines. Serialization is model-specific, with a fallback implementation using JLSO. The user serializes with MLJBase.save(path, mach) and de-serializes with machine(path) (#138, #292)
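
A minimal sketch, assuming DecisionTree.jl is installed; any fitted machine can be substituted:

    using MLJ
    import MLJBase   # may need to be added to your environment

    X, y = @load_iris
    tree = @load DecisionTreeClassifier pkg=DecisionTree
    mach = fit!(machine(tree, X, y))

    # Serialize (generic JLSO fallback) ...
    MLJBase.save("tree.jlso", mach)

    # ... and later de-serialize, ready for prediction:
    mach2 = machine("tree.jlso")
    predict(mach2, X)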

Closed issues:

  • Adhere by Invenia's bluestyle (#434)
  • Update list of scikitlearn models in readme table. (#469)

MLJ v0.10.0 (11 Mar 11:07, commit c369a90)

Diff since v0.9.3

Upgrade to MLJBase 0.12.0 and MLJModels 0.9.0 to effect the following changes:

  • (breaking) Suppress normalisation of measure weights (MLJBase PR #208)

  • (breaking) Shift the optional rng argument of iterator to first position (MLJBase #215)

  • (mildly breaking) Let all models (supervised and unsupervised) share a common set of traits. So, for example, unsupervised models now have the target_scitype trait (usually taking the value Unknown). For a list of the common traits, do models()[1] |> keys |> collect (JuliaAI/MLJBase.jl#163); see also the snippet after this list.

  • (enhancement) Add sampler wrapper for one-dimensional ranges, for random sampling from ranges using rand (MLJBase #213)

  • Change default value of num_round in XGBoost models from 1 to 100 (MLJModels PR #201)
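
A short sketch of querying the common traits, as referenced above; the filter shown is illustrative only:

    using MLJ

    # The traits now shared by all models; `models()` returns a vector of
    # named-tuple "proxies", one per registered model:
    models()[1] |> keys |> collect

    # The same trait-based queries therefore work uniformly across supervised
    # and unsupervised models, e.g. filtering on package_name:
    clustering_models = models(m -> m.package_name == "Clustering")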

Closed issues:

  • Help with loading code on multiple processes for paralleled tuning of a pipeline (#440)
  • Re-export CPU1, CPUProcesses, CPUThreads (#447)
  • Taking loss functions seriously (#450)
  • @pipeline to accept multiple Supervised models (#455)
  • What parts of MLJBase should be reexported in MLJ (#462)
  • unpack not working (#465)
  • Automatic Ensembling options (#466)

MLJ v0.9.3 (29 Feb 21:06, commit ed04561)

Diff since v0.9.2

MLJ v0.9.2 (26 Feb 03:06, commit 6c6d53f)

Diff since v0.9.1

  • (enhancement) Update Tables requirement to "^1.0" (#444)

  • (new models) Add the pure-Julia gradient-boosted tree models from EvoTrees: EvoTreeRegressor, EvoTreeClassifier, EvoTreeCount and EvoTreeGaussian (#122) (loading sketch after this list)

  • (documentation) Update README.md and fix some documentation errors
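
A minimal loading sketch for one of the EvoTrees models, assuming EvoTrees.jl is installed; the synthetic data is illustrative only:

    using MLJ

    booster = @load EvoTreeRegressor pkg=EvoTrees

    # Synthetic regression data:
    X = (x1 = rand(100), x2 = rand(100))
    y = 2 .* X.x1 .+ 0.1 .* rand(100)

    mach = fit!(machine(booster, X, y))
    predict(mach, X)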

Closed issues:

  • Implementing MLJ model interface for EvoTrees.jl (#122)
  • Improve the tuning strategy interface (#315)
  • Re-organizing the MLJ stack (#317)
  • Add Tables 1.0 (#444)

MLJ v0.9.1 (14 Feb 06:07, commit d7d189e)

Diff since v0.9.0

  • (enhancement) Enable dataset loading from OpenML using OpenML.load(id) (example after this list).

  • (documentation) Update the MLJ manual to add missing measure docstrings and to reflect the use of MLJScientificTypes in place of ScientificTypes

  • (documentation - developers) Update the manual to reflect the split of MLJBase into MLJBase and MLJModelInterface
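
A minimal sketch of OpenML loading; 61 is the OpenML id of the classic iris dataset (substitute any valid id), and the exact form of the returned object may depend on the version of the OpenML integration:

    using MLJ

    # Download the dataset with the given OpenML id:
    table = OpenML.load(61)
    # If the result is a Tables.jl-compatible table, it can be inspected
    # with, e.g., schema(table).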

Closed issues:

  • Evaluation error logs on loading model (#433)

MLJ v0.9.0 (12 Feb 15:00, commit 3573d8a)

Minor release introducing the lightweight model interface (#439)