
Losses

mlpoppyns.learning.losses.loss_base

Base loss.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

LossBase

Base abstract class for all losses.

This class serves as a blueprint for creating various loss functions used in machine learning models. It defines the essential interface that all loss functions must implement, ensuring consistency.

Source code in mlpoppyns/learning/losses/loss_base.py
class LossBase:
    """
    Base abstract class for all losses.

    This class serves as a blueprint for creating various loss functions
    used in machine learning models. It defines the essential interface
    that all loss functions must implement, ensuring consistency.
    """

    @abc.abstractmethod
    def __call__(
        self, output: torch.Tensor, target: torch.Tensor
    ) -> torch.Tensor:
        """
        Compute the loss between the model output and the target.

        This method takes the predicted output from the model and the
        corresponding target values, and computes the loss. The specific
        implementation of this method will vary depending on the type of
        loss function being implemented (e.g., Mean Squared Error, Cross-Entropy).

        Args:
            output (torch.Tensor): The predicted output from the model.
            target (torch.Tensor): The ground truth values to compare against.

        Returns:
            (torch.Tensor): The computed loss value, which is a scalar tensor
            representing the difference between the output and the target.
        """

        raise NotImplementedError

    @abc.abstractmethod
    def __str__(self) -> str:
        """
        String representation of the loss.

        This method should provide a human-readable description of the
        loss function, including its name and any relevant parameters or
        characteristics.

        Returns:
            (str): A string that describes the loss function.
        """

        raise NotImplementedError

__call__(output, target) abstractmethod

Compute the loss between the model output and the target.

This method takes the predicted output from the model and the corresponding target values, and computes the loss. The specific implementation of this method will vary depending on the type of loss function being implemented (e.g., Mean Squared Error, Cross-Entropy).

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output` | `Tensor` | The predicted output from the model. | *required* |
| `target` | `Tensor` | The ground truth values to compare against. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | The computed loss value, which is a scalar tensor representing the difference between the output and the target. |

Source code in mlpoppyns/learning/losses/loss_base.py
@abc.abstractmethod
def __call__(
    self, output: torch.Tensor, target: torch.Tensor
) -> torch.Tensor:
    """
    Compute the loss between the model output and the target.

    This method takes the predicted output from the model and the
    corresponding target values, and computes the loss. The specific
    implementation of this method will vary depending on the type of
    loss function being implemented (e.g., Mean Squared Error, Cross-Entropy).

    Args:
        output (torch.Tensor): The predicted output from the model.
        target (torch.Tensor): The ground truth values to compare against.

    Returns:
        (torch.Tensor): The computed loss value, which is a scalar tensor
        representing the difference between the output and the target.
    """

    raise NotImplementedError

__str__() abstractmethod

String representation of the loss.

This method should provide a human-readable description of the loss function, including its name and any relevant parameters or characteristics.

Returns:

| Type | Description |
| --- | --- |
| `str` | A string that describes the loss function. |

Source code in mlpoppyns/learning/losses/loss_base.py
@abc.abstractmethod
def __str__(self) -> str:
    """
    String representation of the loss.

    This method should provide a human-readable description of the
    loss function, including its name and any relevant parameters or
    characteristics.

    Returns:
        (str): A string that describes the loss function.
    """

    raise NotImplementedError
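Concrete losses implement this interface by overriding `__call__` and `__str__`. A minimal, torch-free sketch of the pattern (the `LossMAEDemo` subclass and its plain-list inputs are illustrative, not part of the library):

```python
import abc


class LossBase(abc.ABC):
    """Torch-free stand-in for the abstract loss interface."""

    @abc.abstractmethod
    def __call__(self, output, target):
        raise NotImplementedError

    @abc.abstractmethod
    def __str__(self):
        raise NotImplementedError


class LossMAEDemo(LossBase):
    """Hypothetical subclass: mean absolute error over lists of floats."""

    def __call__(self, output, target):
        # Average of element-wise absolute differences.
        return sum(abs(o - t) for o, t in zip(output, target)) / len(output)

    def __str__(self):
        return "MAE Loss (demo)"


loss = LossMAEDemo()
print(loss([1.0, 2.0], [0.0, 4.0]))  # (1.0 + 2.0) / 2 = 1.5
print(loss)                          # MAE Loss (demo)
```

Because both methods are abstract, instantiating `LossBase` itself raises a `TypeError`; only fully implemented subclasses can be constructed.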

mlpoppyns.learning.losses.loss_mae

Mean absolute error loss.

Authors:

Michele Ronchi (ronchi@ice.csic.es)
Alberto Garcia Garcia (garciagarcia@ice.csic.es)

LossMAE

Bases: LossBase

Mean Absolute Error (MAE) loss.

Source code in mlpoppyns/learning/losses/loss_mae.py
class LossMAE(LossBase):
    """
    Mean Absolute Error (MAE) loss.
    """

    def __call__(
        self, output: torch.Tensor, target: torch.Tensor
    ) -> torch.Tensor:
        """
        Computation of the MAE loss.

        Args:
            output (torch.Tensor): Network output tensor (predictions).
            target (torch.Tensor): Ground truth tensor (labels).

        Returns:
            (torch.Tensor): Tensor with a MAE loss value for each input pair output-target.
        """
        self.mae = nn.L1Loss()

        loss = self.mae(output, target)
        return loss

    def __str__(self) -> str:
        """
        String representation for the MAE loss.

        Returns:
            (str): String representation for the MAE loss.
        """

        return "MAE Loss"

__call__(output, target)

Computation of the MAE loss.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output` | `Tensor` | Network output tensor (predictions). | *required* |
| `target` | `Tensor` | Ground truth tensor (labels). | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Tensor with a MAE loss value for each input pair output-target. |

Source code in mlpoppyns/learning/losses/loss_mae.py
def __call__(
    self, output: torch.Tensor, target: torch.Tensor
) -> torch.Tensor:
    """
    Computation of the MAE loss.

    Args:
        output (torch.Tensor): Network output tensor (predictions).
        target (torch.Tensor): Ground truth tensor (labels).

    Returns:
        (torch.Tensor): Tensor with a MAE loss value for each input pair output-target.
    """
    self.mae = nn.L1Loss()

    loss = self.mae(output, target)
    return loss

__str__()

String representation for the MAE loss.

Returns:

| Type | Description |
| --- | --- |
| `str` | String representation for the MAE loss. |

Source code in mlpoppyns/learning/losses/loss_mae.py
def __str__(self) -> str:
    """
    String representation for the MAE loss.

    Returns:
        (str): String representation for the MAE loss.
    """

    return "MAE Loss"

mlpoppyns.learning.losses.loss_mse

Mean square error loss.

Authors:

Michele Ronchi (ronchi@ice.csic.es)
Alberto Garcia Garcia (garciagarcia@ice.csic.es)

LossMSE

Bases: LossBase

Mean Square Error (MSE) loss.

Source code in mlpoppyns/learning/losses/loss_mse.py
class LossMSE(LossBase):
    """
    Mean Square Error (MSE) loss.
    """

    def __call__(
        self, output: torch.Tensor, target: torch.Tensor
    ) -> torch.Tensor:
        """
        Computation of the MSE loss.

        Args:
            output (torch.Tensor): Network output tensor (predictions).
            target (torch.Tensor): Ground truth tensor (labels).

        Returns:
            (torch.Tensor): Tensor with a MSE loss value for each input pair output-target.
        """
        self.mse = nn.MSELoss()

        loss = self.mse(output, target)
        return loss

    def __str__(self) -> str:
        """
        String representation for the MSE loss.

        Returns:
            (str): String representation for the MSE loss.
        """

        return "MSE Loss"

__call__(output, target)

Computation of the MSE loss.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output` | `Tensor` | Network output tensor (predictions). | *required* |
| `target` | `Tensor` | Ground truth tensor (labels). | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Tensor with a MSE loss value for each input pair output-target. |

Source code in mlpoppyns/learning/losses/loss_mse.py
def __call__(
    self, output: torch.Tensor, target: torch.Tensor
) -> torch.Tensor:
    """
    Computation of the MSE loss.

    Args:
        output (torch.Tensor): Network output tensor (predictions).
        target (torch.Tensor): Ground truth tensor (labels).

    Returns:
        (torch.Tensor): Tensor with a MSE loss value for each input pair output-target.
    """
    self.mse = nn.MSELoss()

    loss = self.mse(output, target)
    return loss

__str__()

String representation for the MSE loss.

Returns:

| Type | Description |
| --- | --- |
| `str` | String representation for the MSE loss. |

Source code in mlpoppyns/learning/losses/loss_mse.py
def __str__(self) -> str:
    """
    String representation for the MSE loss.

    Returns:
        (str): String representation for the MSE loss.
    """

    return "MSE Loss"

mlpoppyns.learning.losses.loss_nll

Negative log-likelihood loss.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

LossNLL

Bases: LossBase

Negative log-likelihood (NLL) loss.

Source code in mlpoppyns/learning/losses/loss_nll.py
class LossNLL(LossBase):
    """
    Negative log-likelihood (NLL) loss.
    """

    def __call__(
        self,
        output: torch.Tensor,
        target: torch.Tensor,
    ) -> torch.Tensor:
        """
        Computation of the negative log-likelihood.

        Args:
            output (torch.Tensor): Network output tensor (predictions).
            target (torch.Tensor): Ground truth tensor (labels).

        Returns:
            (torch.Tensor): Tensor with a NLL loss value for each input pair output-target.
        """
        return F.nll_loss(output, target)

    def __str__(self) -> str:
        """
        String representation for the NLL loss.

        Returns:
            (str): String representation for the NLL loss.
        """

        return "NLL Loss"

__call__(output, target)

Computation of the negative log-likelihood.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output` | `Tensor` | Network output tensor (predictions). | *required* |
| `target` | `Tensor` | Ground truth tensor (labels). | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Tensor with a NLL loss value for each input pair output-target. |

Source code in mlpoppyns/learning/losses/loss_nll.py
def __call__(
    self,
    output: torch.Tensor,
    target: torch.Tensor,
) -> torch.Tensor:
    """
    Computation of the negative log-likelihood.

    Args:
        output (torch.Tensor): Network output tensor (predictions).
        target (torch.Tensor): Ground truth tensor (labels).

    Returns:
        (torch.Tensor): Tensor with a NLL loss value for each input pair output-target.
    """
    return F.nll_loss(output, target)

__str__()

String representation for the NLL loss.

Returns:

Type Description
str

String representation for the NLL loss.

Source code in mlpoppyns/learning/losses/loss_nll.py
def __str__(self) -> str:
    """
    String representation for the NLL loss.

    Returns:
        (str): String representation for the NLL loss.
    """

    return "NLL Loss"

mlpoppyns.learning.losses.loss_rmse

Root mean square error loss.

Authors:

Michele Ronchi (ronchi@ice.csic.es)
Alberto Garcia Garcia (garciagarcia@ice.csic.es)

LossRMSE

Bases: LossBase

Root mean square error (RMSE) loss.

Source code in mlpoppyns/learning/losses/loss_rmse.py
class LossRMSE(LossBase):
    """
    Root mean square error (RMSE) loss.
    """

    def __call__(
        self, output: torch.Tensor, target: torch.Tensor
    ) -> torch.Tensor:
        """
        Computation of the RMSE loss.

        Args:
            output (torch.Tensor): Network output tensor (predictions).
            target (torch.Tensor): Ground truth tensor (labels).

        Returns:
            (torch.Tensor): Tensor with a RMS loss value for each input pair output-target.

        """
        self.mse = nn.MSELoss()

        loss = torch.sqrt(self.mse(output, target))
        return loss

    def __str__(self) -> str:
        """
        String representation for the RMSE loss.

        Returns:
            (str): String representation for the RMSE loss.
        """

        return "RMSE Loss"

__call__(output, target)

Computation of the RMSE loss.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output` | `Tensor` | Network output tensor (predictions). | *required* |
| `target` | `Tensor` | Ground truth tensor (labels). | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Tensor with an RMSE loss value for each input pair output-target. |

Source code in mlpoppyns/learning/losses/loss_rmse.py
def __call__(
    self, output: torch.Tensor, target: torch.Tensor
) -> torch.Tensor:
    """
    Computation of the RMSE loss.

    Args:
        output (torch.Tensor): Network output tensor (predictions).
        target (torch.Tensor): Ground truth tensor (labels).

    Returns:
        (torch.Tensor): Tensor with a RMS loss value for each input pair output-target.

    """
    self.mse = nn.MSELoss()

    loss = torch.sqrt(self.mse(output, target))
    return loss

__str__()

String representation for the RMSE loss.

Returns:

| Type | Description |
| --- | --- |
| `str` | String representation for the RMSE loss. |

Source code in mlpoppyns/learning/losses/loss_rmse.py
def __str__(self) -> str:
    """
    String representation for the RMSE loss.

    Returns:
        (str): String representation for the RMSE loss.
    """

    return "RMSE Loss"

mlpoppyns.learning.losses.losses

Losses.

This is just an empty module that gathers all the available losses.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)