Initializers
mlpoppyns.learning.initializers.initializer_base
Base initializer.
This is an abstract class that contains the skeleton for any weight initialization scheme. Note that instances of such weight initializer classes behave as callable functions.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
InitializerBase
Base abstract class for all weight initializers.
This class serves as a blueprint for creating various initialization methods. It defines the essential methods that all weight initializers must implement, ensuring consistency.
Source code in mlpoppyns/learning/initializers/initializer_base.py
__call__(m)
abstractmethod
Custom call operator for initializing the parameters of a torch module.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `m` | `module` | Module with parameters to be initialized. Could be anything from a linear layer to a convolutional one. | required |
Source code in mlpoppyns/learning/initializers/initializer_base.py
__str__()
abstractmethod
String representation of the weight initializer.
This method should provide a human-readable description of the weight initializer, including its name and any relevant parameters or characteristics.
Returns:
| Type | Description |
|---|---|
| `str` | String representation of the weight initializer. |
Source code in mlpoppyns/learning/initializers/initializer_base.py
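The contract described above can be sketched as a small abstract base class. This is a hypothetical reconstruction based on the documented methods, not the actual mlpoppyns source:

```python
from abc import ABC, abstractmethod


class InitializerBase(ABC):
    """Blueprint for weight initializers: instances behave as callables."""

    @abstractmethod
    def __call__(self, m):
        """Initialize the parameters of the given module in place."""

    @abstractmethod
    def __str__(self):
        """Return a human-readable description of the initializer."""
```

A concrete subclass implementing both methods can then be applied module by module, e.g. with torch's `model.apply(initializer)` idiom.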
mlpoppyns.learning.initializers.initializer_kaiming
Kaiming uniform initializer.
Class for a Kaiming uniform weight initializer.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
InitializerKaiming
Bases: InitializerBase
Kaiming uniform weight initializer class.
Any instance of this class is a callable function that will initialize a given torch module which contains trainable parameters (weights and biases) with a Kaiming uniform distribution for the weights and a constant value for the biases.
Source code in mlpoppyns/learning/initializers/initializer_kaiming.py
__call__(m)
Custom call operator for initializing the parameters of a module.
Weights are initialized using the Kaiming uniform distribution (see "Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification" - He, K. et al. (2015)), in which the weight values are sampled from a uniform distribution U(-bound, bound) where: bound = gain * sqrt(3 / fan_mode). Biases are just filled with a constant close-to-zero value (0.01).
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `m` | `module` | Module with parameters to be initialized. Could be anything from a linear layer to a convolutional one. Right now, only initialization of Linear layers is performed. | required |
Source code in mlpoppyns/learning/initializers/initializer_kaiming.py
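The bound formula can be checked in isolation. The sketch below is an illustration of the scheme described above, not the mlpoppyns implementation; the function names are hypothetical, and `fan_mode` is assumed to be the number of input features (fan-in) with a gain of 1:

```python
import math
import random


def kaiming_uniform_bound(fan_mode, gain=1.0):
    # bound = gain * sqrt(3 / fan_mode), as in He et al. (2015)
    return gain * math.sqrt(3.0 / fan_mode)


def init_linear(fan_in, fan_out):
    # Weights ~ U(-bound, bound); biases get a constant close-to-zero value.
    bound = kaiming_uniform_bound(fan_in)
    weights = [[random.uniform(-bound, bound) for _ in range(fan_in)]
               for _ in range(fan_out)]
    biases = [0.01] * fan_out
    return weights, biases
```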
__str__()
Custom to string operator for the weight initializer.
Returns:
| Type | Description |
|---|---|
| `str` | A string which describes the weight initializer for output purposes. |
Source code in mlpoppyns/learning/initializers/initializer_kaiming.py
mlpoppyns.learning.initializers.initializer_normal
Normal initializer.
Class for a normal weight initializer.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
InitializerNormal
Bases: InitializerBase
Normal weight initializer class.
Any instance of this class is a callable function that will initialize a given torch module which contains trainable parameters (weights and biases) with a normal distribution for the weights and zeros for the biases.
Source code in mlpoppyns/learning/initializers/initializer_normal.py
__call__(m)
Custom call operator for initializing the parameters of a module.
Weights are initialized using a normal distribution (with values taken from the N(mean, std^2) distribution) whilst biases are just filled with a constant zero value. In this case, std = 1 / sqrt(y) where y is the number of input features.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `m` | `module` | Module with parameters to be initialized. Could be anything from a linear layer to a convolutional one. Right now, only initialization of Linear layers is performed. | required |
Source code in mlpoppyns/learning/initializers/initializer_normal.py
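The scheme above can be sketched in a few lines. This is an illustrative standalone version (hypothetical names, assuming a zero mean, which the docstring leaves unspecified), not the mlpoppyns code:

```python
import math
import random


def normal_init(fan_in, fan_out, mean=0.0):
    # std = 1 / sqrt(y), where y is the number of input features
    std = 1.0 / math.sqrt(fan_in)
    weights = [[random.gauss(mean, std) for _ in range(fan_in)]
               for _ in range(fan_out)]
    biases = [0.0] * fan_out  # biases are zero-filled
    return weights, biases
```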
__str__()
Custom to string operator for the weight initializer.
Returns:
| Type | Description |
|---|---|
| `str` | A string which describes the weight initializer for output purposes. |
Source code in mlpoppyns/learning/initializers/initializer_normal.py
mlpoppyns.learning.initializers.initializer_uniform
Uniform initializer.
Class for a uniform weight initializer.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
InitializerUniform
Bases: InitializerBase
Uniform weight initializer class.
Any instance of this class is a callable function that will initialize a given torch module which contains trainable parameters (weights and biases) with a uniform distribution for the weights and zeros for the biases.
Source code in mlpoppyns/learning/initializers/initializer_uniform.py
__call__(m)
Custom call operator for initializing the parameters of a module.
Weights are initialized using a uniform distribution (with values taken from the U(0, 1) distribution) whilst biases are just filled with a constant zero value.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `m` | `module` | Module with parameters to be initialized. Could be anything from a linear layer to a convolutional one. Right now, only initialization of Linear layers is performed. | required |
Source code in mlpoppyns/learning/initializers/initializer_uniform.py
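A minimal standalone sketch of the scheme above (hypothetical names, not the mlpoppyns code):

```python
import random


def uniform_init(fan_in, fan_out):
    # Weights drawn from U(0, 1); biases zero-filled.
    weights = [[random.random() for _ in range(fan_in)]
               for _ in range(fan_out)]
    biases = [0.0] * fan_out
    return weights, biases
```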
__str__()
Custom to string operator for the weight initializer.
Returns:
| Type | Description |
|---|---|
| `str` | A string which describes the weight initializer for output purposes. |
Source code in mlpoppyns/learning/initializers/initializer_uniform.py
mlpoppyns.learning.initializers.initializer_uniform_rule
Uniform rule initializer.
Class for a uniform rule weight initializer.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
InitializerUniformRule
Bases: InitializerBase
Uniform rule weight initializer class.
Any instance of this class is a callable function that will initialize a given torch module which contains trainable parameters (weights and biases) with a uniform rule distribution for the weights and zeros for the biases.
Source code in mlpoppyns/learning/initializers/initializer_uniform_rule.py
__call__(m)
Custom call operator for initializing the parameters of a module.
Weights are initialized using a uniform rule distribution (with values drawn from the distribution U(-y, y) where y = 1 / sqrt(n), with n the number of input features to the module) whilst biases are just filled with a constant zero value.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `m` | `module` | Module with parameters to be initialized. Could be anything from a linear layer to a convolutional one. Right now, only initialization of Linear layers is performed. | required |
Source code in mlpoppyns/learning/initializers/initializer_uniform_rule.py
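The uniform rule can be sketched standalone as follows (hypothetical names, not the mlpoppyns code):

```python
import math
import random


def uniform_rule_init(fan_in, fan_out):
    # y = 1 / sqrt(n), with n the number of input features; weights ~ U(-y, y)
    y = 1.0 / math.sqrt(fan_in)
    weights = [[random.uniform(-y, y) for _ in range(fan_in)]
               for _ in range(fan_out)]
    biases = [0.0] * fan_out  # biases zero-filled
    return weights, biases
```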
__str__()
Custom to string operator for the weight initializer.
Returns:
| Type | Description |
|---|---|
| `str` | A string which describes the weight initializer for output purposes. |
Source code in mlpoppyns/learning/initializers/initializer_uniform_rule.py
mlpoppyns.learning.initializers.initializer_xavier
Xavier uniform initializer.
Class for a Xavier uniform weight initializer.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
InitializerXavier
Bases: InitializerBase
Xavier uniform weight initializer class.
Any instance of this class is a callable function that will initialize a given torch module which contains trainable parameters (weights and biases) with a Xavier uniform distribution for the weights and a constant value for the biases.
Source code in mlpoppyns/learning/initializers/initializer_xavier.py
__call__(m)
Custom call operator for initializing the parameters of a module.
Weights are initialized using the Xavier uniform distribution (see "Understanding the difficulty of training deep feedforward neural networks" - Glorot, X. & Bengio, Y. (2010)), in which the weight values are sampled from a uniform distribution U(-a, a) where: a = gain * sqrt(6 / (fan_in + fan_out)). Biases are just filled with a constant close-to-zero value (0.01).
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `m` | `module` | Module with parameters to be initialized. Could be anything from a linear layer to a convolutional one. Right now, only initialization of Linear layers is performed. | required |
Source code in mlpoppyns/learning/initializers/initializer_xavier.py
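The Xavier bound can be checked in isolation. As before, this is an illustrative sketch with hypothetical names (gain defaults to 1), not the mlpoppyns implementation:

```python
import math
import random


def xavier_uniform_bound(fan_in, fan_out, gain=1.0):
    # a = gain * sqrt(6 / (fan_in + fan_out)), as in Glorot & Bengio (2010)
    return gain * math.sqrt(6.0 / (fan_in + fan_out))


def init_linear(fan_in, fan_out):
    # Weights ~ U(-a, a); biases get a constant close-to-zero value.
    a = xavier_uniform_bound(fan_in, fan_out)
    weights = [[random.uniform(-a, a) for _ in range(fan_in)]
               for _ in range(fan_out)]
    biases = [0.01] * fan_out
    return weights, biases
```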
__str__()
Custom to string operator for the weight initializer.
Returns:
| Type | Description |
|---|---|
| `str` | A string which describes the weight initializer for output purposes. |
Source code in mlpoppyns/learning/initializers/initializer_xavier.py
mlpoppyns.learning.initializers.initializers
Initializers.
This is just an empty module that gathers all the available weight initializers.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)