This tutorial shows how to use and implement optional but handy features of tfaip. See the minimal tutorial for a scenario that only implements the required classes and functions.
This tutorial sets up training on the MNIST dataset; the trained model can then be used to predict the digits in image files.
The following features are covered by this tutorial:
- setting up a DataPipeline using DataProcessors, see here
- setting up different data generators for training and prediction, see also here
- selecting and configuring different dynamic graphs
- writing image files to the TensorBoard, see here
- setting up a Predictor that votes the predictions of multiple individual models, see here
- setting up an Evaluator
Dynamic graphs allow changing and setting up layers via parameters that can be configured from the command line.
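Because all parameters live in dataclasses that tfaip exposes on the command line, switching between graph variants requires no code change. The following invocation is only a sketch: the `tfaip-train` entry point and the tutorial module follow the tfaip examples, but the exact flag spelling (e.g. `--model.graph`) and the parameter names should be checked against the example code.

```bash
# Hypothetical invocation: select the CNN variant of the dynamic graph
# and override one of its (assumed) parameters.
tfaip-train examples.tutorial.full --model.graph CNN --model.graph.filters 16 32
```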
- First, set up a static graph that handles the creation of the dynamic layers. For MNIST, this graph also adds the final output layer, since that is obligatory; furthermore, the data is normalized and reshaped (see the sketches after this list).
- Next, create a base class and base params. The base class is derived from `keras.layers.Layer` and must be implemented by each variant. Add an abstract method to the params that defines how to create the layer; here (`cls()`), only the class type is returned, under the assumption that the first and only argument of `__init__` is the params instance. Optionally, define a generic `TypeVar` for the params so that each implemented layer can declare its actual param type (see the first sketch after this list).
- Now, implement the base class and base params. The tutorial provides a CNN and an MLP setup (see the second sketch after this list).
- Finally, add a parameter to the base params for selecting the layers, here called `graph` in the `ModelParams`. Optionally, set the `choices` flag of `pai_meta` to provide the list of available params that can be selected. The static graph calls the abstract `cls()` method to retrieve the actual implementation and instantiates it (see the last sketch after this list).
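The following sketches illustrate the steps above with plain Keras and Python dataclasses; all class names (`LayerParams`, `DynamicLayer`, and so on) are illustrative, not the exact identifiers from the tutorial source. First, the base params with the abstract `cls()` method and the optional generic `TypeVar`:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Generic, Type, TypeVar

from tensorflow import keras


@dataclass
class LayerParams(ABC):
    """Base params that every dynamic layer variant derives from."""

    @abstractmethod
    def cls(self) -> Type["DynamicLayer"]:
        # Only the class type is returned; the static graph assumes the
        # params instance is the first and only argument of __init__.
        raise NotImplementedError


TLayerParams = TypeVar("TLayerParams", bound=LayerParams)


class DynamicLayer(keras.layers.Layer, Generic[TLayerParams], ABC):
    """Base layer that each variant must implement."""

    def __init__(self, params: TLayerParams, **kwargs):
        super().__init__(**kwargs)
        self.params = params
```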
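Next, the two concrete variants, building on the base classes from the previous sketch; the layer sizes are made-up defaults:

```python
from dataclasses import dataclass, field
from typing import List, Type

from tensorflow import keras


@dataclass
class MLPParams(LayerParams):
    nodes: List[int] = field(default_factory=lambda: [128, 64])

    def cls(self) -> Type["MLPLayer"]:
        return MLPLayer


class MLPLayer(DynamicLayer[MLPParams]):
    def __init__(self, params: MLPParams, **kwargs):
        super().__init__(params, **kwargs)
        self.flatten = keras.layers.Flatten()
        self.hidden = [keras.layers.Dense(n, activation="relu") for n in params.nodes]

    def call(self, inputs, **kwargs):
        x = self.flatten(inputs)
        for layer in self.hidden:
            x = layer(x)
        return x


@dataclass
class CNNParams(LayerParams):
    filters: List[int] = field(default_factory=lambda: [16, 32])

    def cls(self) -> Type["CNNLayer"]:
        return CNNLayer


class CNNLayer(DynamicLayer[CNNParams]):
    def __init__(self, params: CNNParams, **kwargs):
        super().__init__(params, **kwargs)
        self.convs = [keras.layers.Conv2D(f, 3, padding="same", activation="relu")
                      for f in params.filters]
        self.pool = keras.layers.MaxPool2D()
        self.flatten = keras.layers.Flatten()

    def call(self, inputs, **kwargs):
        x = inputs
        for conv in self.convs:
            x = self.pool(conv(x))
        return self.flatten(x)
```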
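Finally, the selection parameter and the static graph. In tfaip the params classes are `pai_dataclass`es and the tutorial sets the `choices` flag of `pai_meta` on the `graph` field so that the command line can list the available variants; the sketch below keeps that wiring as a comment and otherwise sticks to plain Keras:

```python
from dataclasses import dataclass, field

import tensorflow as tf
from tensorflow import keras


@dataclass
class ModelParams:
    # In the tutorial this field additionally carries
    # pai_meta(choices=[MLPParams(), CNNParams()]) so that the
    # available variants are selectable on the command line.
    graph: LayerParams = field(default_factory=MLPParams)


class TutorialGraph(keras.layers.Layer):
    """Static graph: normalizes and reshapes the input, instantiates the
    selected dynamic layer via its params' cls(), and appends the
    obligatory output layer."""

    def __init__(self, params: ModelParams, **kwargs):
        super().__init__(**kwargs)
        # cls() returns the class; the params instance is its only argument.
        self.backbone = params.graph.cls()(params.graph)
        self.logits = keras.layers.Dense(10)  # final output layer

    def call(self, images, **kwargs):
        x = tf.cast(images, tf.float32) / 255.0  # normalize to [0, 1]
        x = tf.reshape(x, [-1, 28, 28, 1])       # reshape to NHWC
        return self.logits(self.backbone(x))
```

With this setup, switching variants is a one-liner, e.g. `TutorialGraph(ModelParams(graph=CNNParams()))`, and the same switch is what the command-line selection performs under the hood.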