Losses
mlpoppyns.learning.losses.loss_base
Base loss.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
LossBase
Base abstract class for all losses.
This class serves as a blueprint for creating the various loss functions used in machine learning models. It defines the essential interface that all loss functions must implement, ensuring a consistent API across implementations.
Source code in mlpoppyns/learning/losses/loss_base.py
__call__(output, target)
abstractmethod
Compute the loss between the model output and the target.
This method takes the predicted output from the model and the corresponding target values, and computes the loss. The specific implementation of this method will vary depending on the type of loss function being implemented (e.g., Mean Squared Error, Cross-Entropy).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| output | Tensor | The predicted output from the model. | required |
| target | Tensor | The ground truth values to compare against. | required |

Returns:

| Type | Description |
|---|---|
| Tensor | The computed loss value, a scalar tensor representing the difference between the output and the target. |
Source code in mlpoppyns/learning/losses/loss_base.py
__str__()
abstractmethod
String representation of the loss.
This method should provide a human-readable description of the loss function, including its name and any relevant parameters or characteristics.
Returns:

| Type | Description |
|---|---|
| str | A string that describes the loss function. |
Source code in mlpoppyns/learning/losses/loss_base.py
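The interface above can be sketched as a small abstract base class. The snippet below is a minimal illustration, not the actual mlpoppyns source: it uses NumPy arrays in place of the library's Tensor type, and LossZero is a hypothetical subclass added only to show the contract.

```python
from abc import ABC, abstractmethod

import numpy as np


class LossBase(ABC):
    """Blueprint for all losses: compute a value and describe yourself."""

    @abstractmethod
    def __call__(self, output: np.ndarray, target: np.ndarray) -> np.ndarray:
        """Compute the loss between model output and target."""

    @abstractmethod
    def __str__(self) -> str:
        """Human-readable description of the loss function."""


class LossZero(LossBase):
    """Hypothetical concrete loss, shown only to illustrate the contract."""

    def __call__(self, output: np.ndarray, target: np.ndarray) -> np.ndarray:
        # Always zero: one loss value per sample in the batch.
        return np.zeros(len(output))

    def __str__(self) -> str:
        return "Zero loss (illustration only)"
```

Because both methods are abstract, instantiating LossBase directly raises a TypeError; only subclasses that implement `__call__` and `__str__` can be constructed.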
mlpoppyns.learning.losses.loss_mae
Mean absolute error loss.
Authors:
Michele Ronchi (ronchi@ice.csic.es)
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
LossMAE
Bases: LossBase
Mean Absolute Error (MAE) loss.
Source code in mlpoppyns/learning/losses/loss_mae.py
__call__(output, target)
Computation of the MAE loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| output | Tensor | Network output tensor (predictions). | required |
| target | Tensor | Ground truth tensor (labels). | required |

Returns:

| Type | Description |
|---|---|
| Tensor | Tensor with an MAE loss value for each output-target pair. |
Source code in mlpoppyns/learning/losses/loss_mae.py
__str__()
String representation for the MAE loss.
Returns:

| Type | Description |
|---|---|
| str | String representation for the MAE loss. |
Source code in mlpoppyns/learning/losses/loss_mae.py
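A minimal sketch of how such an MAE loss could look, assuming the per-sample reduction runs over the last axis; it uses NumPy arrays rather than the library's Tensor type and is not the actual mlpoppyns code.

```python
import numpy as np


class LossMAE:
    """Mean Absolute Error, reduced per sample over the last axis."""

    def __call__(self, output: np.ndarray, target: np.ndarray) -> np.ndarray:
        # One MAE value per output-target pair in the batch.
        return np.mean(np.abs(output - target), axis=-1)

    def __str__(self) -> str:
        return "Mean Absolute Error (MAE) loss"


output = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([[1.5, 2.0], [2.0, 4.0]])
mae = LossMAE()(output, target)  # one value per pair: 0.25 and 0.5
```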
mlpoppyns.learning.losses.loss_mse
Mean square error loss.
Authors:
Michele Ronchi (ronchi@ice.csic.es)
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
LossMSE
Bases: LossBase
Mean Square Error (MSE) loss.
Source code in mlpoppyns/learning/losses/loss_mse.py
__call__(output, target)
Computation of the MSE loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| output | Tensor | Network output tensor (predictions). | required |
| target | Tensor | Ground truth tensor (labels). | required |

Returns:

| Type | Description |
|---|---|
| Tensor | Tensor with an MSE loss value for each output-target pair. |
Source code in mlpoppyns/learning/losses/loss_mse.py
__str__()
String representation for the MSE loss.
Returns:

| Type | Description |
|---|---|
| str | String representation for the MSE loss. |
Source code in mlpoppyns/learning/losses/loss_mse.py
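As with the MAE case, a hedged NumPy sketch of what this computation amounts to (not the mlpoppyns source, and assuming per-sample reduction over the last axis):

```python
import numpy as np


class LossMSE:
    """Mean Square Error, reduced per sample over the last axis."""

    def __call__(self, output: np.ndarray, target: np.ndarray) -> np.ndarray:
        # One MSE value per output-target pair in the batch.
        return np.mean((output - target) ** 2, axis=-1)

    def __str__(self) -> str:
        return "Mean Square Error (MSE) loss"


output = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([[1.5, 2.0], [2.0, 4.0]])
mse = LossMSE()(output, target)  # one value per pair: 0.125 and 0.5
```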
mlpoppyns.learning.losses.loss_nll
Negative log-likelihood loss.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
LossNLL
Bases: LossBase
Negative log-likelihood (NLL) loss.
Source code in mlpoppyns/learning/losses/loss_nll.py
__call__(output, target)
Computation of the negative log-likelihood.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| output | Tensor | Network output tensor (predictions). | required |
| target | Tensor | Ground truth tensor (labels). | required |

Returns:

| Type | Description |
|---|---|
| Tensor | Tensor with an NLL loss value for each output-target pair. |
Source code in mlpoppyns/learning/losses/loss_nll.py
__str__()
String representation for the NLL loss.
Returns:

| Type | Description |
|---|---|
| str | String representation for the NLL loss. |
Source code in mlpoppyns/learning/losses/loss_nll.py
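To illustrate what such a loss computes, here is a hedged NumPy sketch (not the mlpoppyns source). It assumes the usual NLL convention: the network output holds log-probabilities per class, and the target holds integer class indices.

```python
import numpy as np


class LossNLL:
    """Negative log-likelihood over per-class log-probability outputs."""

    def __call__(self, output: np.ndarray, target: np.ndarray) -> np.ndarray:
        # Pick the log-probability of the true class for each sample
        # and negate it: one NLL value per output-target pair.
        return -output[np.arange(output.shape[0]), target]

    def __str__(self) -> str:
        return "Negative log-likelihood (NLL) loss"


log_probs = np.log(np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]]))
labels = np.array([0, 2])
nll = LossNLL()(log_probs, labels)  # -log(0.7) and -log(0.6)
```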
mlpoppyns.learning.losses.loss_rmse
Root mean square error loss.
Authors:
Michele Ronchi (ronchi@ice.csic.es)
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
LossRMSE
Bases: LossBase
Root mean square error (RMSE) loss.
Source code in mlpoppyns/learning/losses/loss_rmse.py
__call__(output, target)
Computation of the RMSE loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| output | Tensor | Network output tensor (predictions). | required |
| target | Tensor | Ground truth tensor (labels). | required |

Returns:

| Type | Description |
|---|---|
| Tensor | Tensor with an RMSE loss value for each output-target pair. |
Source code in mlpoppyns/learning/losses/loss_rmse.py
__str__()
String representation for the RMSE loss.
Returns:

| Type | Description |
|---|---|
| str | String representation for the RMSE loss. |
Source code in mlpoppyns/learning/losses/loss_rmse.py
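Since RMSE is simply the square root of the per-sample MSE, a hedged NumPy sketch (again, not the mlpoppyns source, assuming reduction over the last axis) is:

```python
import numpy as np


class LossRMSE:
    """Root Mean Square Error: square root of the per-sample MSE."""

    def __call__(self, output: np.ndarray, target: np.ndarray) -> np.ndarray:
        # One RMSE value per output-target pair in the batch.
        return np.sqrt(np.mean((output - target) ** 2, axis=-1))

    def __str__(self) -> str:
        return "Root mean square error (RMSE) loss"


output = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([[1.5, 2.0], [2.0, 4.0]])
rmse = LossRMSE()(output, target)  # sqrt(0.125) and sqrt(0.5)
```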
mlpoppyns.learning.losses.losses
Losses.
This module contains no definitions of its own; it simply gathers all the available losses in one place for convenient import.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)