General
In general, to solve our own tasks with our own models in carefree-learn, we need to consider:
- How to define a new model and how to use it for training.
- How to customize the pre-processing of the dataset.
- How to control some fine-grained behaviours of the training loop.
In this section, we will focus on the general customizations.
Customize Models
In carefree-learn, a `Model` should implement the core algorithms. It's basically an `nn.Module`, with some extra useful functions:
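A rough outline of this interface, as a pure-Python sketch (the method names `onnx_forward` / `summary_forward` and the exact signatures are assumptions for illustration; the real class is an `nn.Module` working on `torch.Tensor` dictionaries):

```python
from typing import Any, Dict, Optional

# Stand-in for carefree-learn's tensor dictionary type (Dict[str, torch.Tensor]);
# plain values are used here so the sketch stays self-contained.
tensor_dict_type = Dict[str, Any]

class Model:
    def forward(
        self,
        batch_idx: int,
        batch: tensor_dict_type,
        state: Optional[Any] = None,
        **kwargs: Any,
    ) -> tensor_dict_type:
        # core algorithm; must be overridden by subclasses
        raise NotImplementedError

    def onnx_forward(self, batch: tensor_dict_type) -> tensor_dict_type:
        # hook for customizing the onnx export procedure (falls back to `forward`)
        return self.forward(0, batch)

    def summary_forward(self, batch_idx: int, batch: tensor_dict_type) -> None:
        # hook for customizing the summary procedure (falls back to `forward`)
        self.forward(batch_idx, batch)
```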
As shown above, there are two special forward methods defined in a `Model`, which allow us to customize the onnx export procedure and the summary procedure respectively.
If we want to define our own models, we will need to override the `forward` method (required) and the `_init_with_trainer` method (optional).
forward
- `batch_idx` - Indicates the batch index of the current batch.
- `batch` - Input batch. It will be a dictionary (`Dict[str, torch.Tensor]`) returned by `DataLoader`. In general, it will:
  - always contain an `"input"` key, which represents the input data.
  - usually contain a `"labels"` key, which represents the target labels.

  Other constants could be found here.
- `state` [default = `None`] - The `TrainerState` instance.
- `kwargs` - Other keyword arguments.
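Putting the signature together, a minimal `forward` override might look like this — a hypothetical model in plain Python, with lists standing in for `torch.Tensor`s:

```python
from typing import Any, Dict, Optional

class LinearModel:
    """Hypothetical example model; in practice this would subclass the Model base."""

    def __init__(self, w: float = 2.0, b: float = 1.0):
        # stand-ins for learnable parameters
        self.w, self.b = w, b

    def forward(
        self,
        batch_idx: int,
        batch: Dict[str, Any],
        state: Optional[Any] = None,
        **kwargs: Any,
    ) -> Dict[str, Any]:
        net = batch["input"]  # the "input" key is always present
        # batch may also carry "labels" during training; losses are computed elsewhere
        predictions = [self.w * x + self.b for x in net]
        return {"predictions": predictions}
```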
_init_with_trainer
This is an optional method, which is useful when we need to initialize our models with the prepared `Trainer` instance.
tip
Since the prepared `Trainer` instance will contain the dataset information, this method is very useful when our models depend on that information.
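For instance, a model whose output dimension depends on the dataset could finish its setup here. Everything below — including the stub's `num_classes` attribute — is an assumption for illustration:

```python
from typing import Any

class TrainerStub:
    # stand-in for the prepared Trainer; `num_classes` is a made-up attribute
    num_classes = 10

class MyClassifier:
    def __init__(self) -> None:
        self.out_dim: Any = None  # unknown until the trainer is prepared

    def _init_with_trainer(self, trainer: Any) -> None:
        # read dataset information off the trainer to finish initialization
        self.out_dim = trainer.num_classes

model = MyClassifier()
model._init_with_trainer(TrainerStub())
```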
Register & Apply
After defining the `forward` (and, optionally, the `_init_with_trainer`) method, we need to register our model to apply it in carefree-learn:
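The exact registration API depends on the carefree-learn version, so here is the general pattern it follows — a name-to-class registry populated by a decorator (the names `register_model` and `model_registry` are assumptions):

```python
from typing import Any, Callable, Dict, Type

# Illustrative registry; carefree-learn maintains something equivalent internally.
model_registry: Dict[str, Type[Any]] = {}

def register_model(name: str) -> Callable[[Type[Any]], Type[Any]]:
    def _register(cls: Type[Any]) -> Type[Any]:
        model_registry[name] = cls  # make the class discoverable by name
        return cls
    return _register

@register_model("my_model")
class MyModel:
    def forward(
        self,
        batch_idx: int,
        batch: Dict[str, Any],
        state: Any = None,
        **kwargs: Any,
    ) -> Dict[str, Any]:
        return {"predictions": batch["input"]}
```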
After which we can:
- set the `model_name` in `Pipeline` to the corresponding name to apply it.
- set the `model_config` in `Pipeline` to the corresponding configurations.
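Conceptually, the `Pipeline` then resolves these two settings roughly as follows — a self-contained sketch in which `build_model`, the registry, and the config key are all hypothetical:

```python
from typing import Any, Dict, Type

# Minimal name -> class registry standing in for carefree-learn's internal one.
model_registry: Dict[str, Type[Any]] = {}

class MyModel:
    def __init__(self, hidden_dim: int = 16):
        self.hidden_dim = hidden_dim

model_registry["my_model"] = MyModel

def build_model(model_name: str, model_config: Dict[str, Any]) -> Any:
    # look the class up by `model_name`, then construct it with `model_config`
    return model_registry[model_name](**model_config)

model = build_model("my_model", {"hidden_dim": 32})
```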
note
For Machine Learning tasks, the APIs will remain the same but the internal design will be a little different. Please refer to the `MLModel` section for more details.
Customize Training Loop
caution
To be continued...