Benchmarking
utilities.benchmark.cprofile
cProfile helper module.
This module provides helper functions to perform deep profiling of other routines using Python's built-in cProfile profiler.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
do_cprofile(enabled, output_dir)
Function to be used as a decorator to perform a deep cProfile of another routine. It calls the wrapped function with the provided arguments, with profiling enabled; gathers the results in a readable format sorted by total time; and writes the cProfile output to a text file in the specified folder, using the function name as the file name.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `enabled` | `bool` | Whether profiling is enabled. | required |
| `output_dir` | `str` | Output directory for the profile text file. | required |

Returns:

| Type | Description |
|---|---|
| `Callable` | If profiling is disabled, the wrapped function runs and returns its result without profiling or generating any text file (seamless execution). If profiling is enabled, the result is returned just as seamlessly, but the text output described above is also generated. |
Source code in utilities/benchmark/cprofile.py
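The decorator described above can be sketched with the standard `cProfile` and `pstats` APIs. This is an illustrative sketch under the stated behavior, not the module's actual implementation; the name `do_cprofile_sketch` is hypothetical.

```python
import cProfile
import functools
import io
import pstats
from pathlib import Path


def do_cprofile_sketch(enabled, output_dir):
    """Hypothetical sketch of a cProfile decorator factory."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if not enabled:
                # Seamless execution: no profiling, no file generated.
                return func(*args, **kwargs)
            profiler = cProfile.Profile()
            profiler.enable()
            try:
                result = func(*args, **kwargs)
            finally:
                profiler.disable()
            stream = io.StringIO()
            # Sort entries by total time, as described above.
            pstats.Stats(profiler, stream=stream).sort_stats("tottime").print_stats()
            # Use the function name as the file name.
            out = Path(output_dir) / f"{func.__name__}.txt"
            out.write_text(stream.getvalue())
            return result
        return wrapper
    return decorator
```

Note that the disabled path adds essentially no overhead beyond one attribute check per call, which is what makes execution "seamless".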
utilities.benchmark.json_profile_to_rst_table
JSON profile to RST table.
This script parses a timing profile from TimeWith contexts and generates an RST table representation to include in our documentation automagically.
To display the help message, run `python json_profile_to_rst_table.py --help`; this lists all the relevant arguments that can be used.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
generate_header(fields)
Generates a header for the table from a list of field names, with automatic width handling.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `fields` | `List` | Names for the fields (columns), excluding the first one. | required |

Returns:

| Type | Description |
|---|---|
| `str` | An RST string representation of the table header. |
Source code in utilities/benchmark/json_profile_to_rst_table.py
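A header builder of this kind might look roughly as follows. The fixed `WIDTH`, the blank first cell (reserved for checkpoint names), and the function name are assumptions based on the description above, not the script's actual code.

```python
WIDTH = 16  # assumed fixed column width


def generate_header_sketch(fields):
    """Hypothetical sketch: build an RST grid-table header row."""
    cells = [""] + list(fields)  # first column left blank for checkpoint names
    border = "+" + "+".join("-" * WIDTH for _ in cells) + "+"
    row = "|" + "|".join(f"{name:<{WIDTH}}" for name in cells) + "|"
    # RST grid tables mark the header with '=' instead of '-'.
    header_rule = "+" + "+".join("=" * WIDTH for _ in cells) + "+"
    return "\n".join([border, row, header_rule])
```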
generate_multicolumn(name, columns)
Generates a multicolumn cell spanning the specified number of fields, populated with the given name and with automatic width handling.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | Text to populate the row. | required |
| `columns` | `int` | Number of fields or columns to span. | required |

Returns:

| Type | Description |
|---|---|
| `str` | An RST string representation of the multicolumn. |
Source code in utilities/benchmark/json_profile_to_rst_table.py
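The key detail of a spanning cell is that its width must also absorb the column separators it replaces. A minimal sketch, assuming the same hypothetical `WIDTH` as above:

```python
WIDTH = 16  # assumed fixed column width


def generate_multicolumn_sketch(name, columns):
    """Hypothetical sketch: one cell spanning `columns` fields."""
    # Total width = the spanned columns plus the (columns - 1)
    # separator characters that the span absorbs.
    span = WIDTH * columns + (columns - 1)
    return "|" + f"{name:^{span}}" + "|"
```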
generate_row(name, values)
Generates a row for the table with the given checkpoint name and the values for the fields (each value occupies one column). The width is automatically adjusted to a maximum of WIDTH characters per column.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | Name of the checkpoint (row). | required |
| `values` | `List` | Values for each field in the checkpoint (columns). | required |

Returns:

| Type | Description |
|---|---|
| `str` | An RST string representation of the row, with a newline at the end. |
Source code in utilities/benchmark/json_profile_to_rst_table.py
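A row builder along these lines might pad each cell and cap it at the maximum width. This sketch (with its hypothetical name and assumed four-decimal formatting) only illustrates the described behavior:

```python
WIDTH = 16  # assumed maximum column width


def generate_row_sketch(name, values):
    """Hypothetical sketch: one table row, newline-terminated."""
    cells = [name] + [f"{v:.4f}" if isinstance(v, float) else str(v) for v in values]
    # Pad each cell to WIDTH, then truncate so no cell exceeds WIDTH.
    return "|" + "|".join(f"{c:<{WIDTH}}"[:WIDTH] for c in cells) + "|\n"
```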
generate_separator(fields)
Generates a separator for the table with automatic width handling.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `fields` | `int` | Number of columns or fields (excluding the checkpoint names). | required |

Returns:

| Type | Description |
|---|---|
| `str` | An RST string containing the separator representation. |
Source code in utilities/benchmark/json_profile_to_rst_table.py
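Since `fields` excludes the checkpoint-name column, a separator sketch needs one extra column for it. Again a hedged sketch with an assumed `WIDTH`, not the script's real code:

```python
WIDTH = 16  # assumed fixed column width


def generate_separator_sketch(fields):
    """Hypothetical sketch: a grid-table separator line."""
    # +1 accounts for the leading checkpoint-name column.
    return "+" + "+".join("-" * WIDTH for _ in range(fields + 1)) + "+"
```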
print_table(filename)
Prints a table in RST format by parsing the specified JSON profile file.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `filename` | `str` | Path to the JSON profile to parse. | required |
Source code in utilities/benchmark/json_profile_to_rst_table.py
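End to end, the script's job is: load the JSON profile, then emit an RST table built from its entries. The sketch below assumes a `{"checkpoint": [cumulative, partial], ...}` JSON layout, which is a guess at the format TimeWith dumps; the real file format and column titles may differ, and it emits a simple RST table rather than the grid table built by the helpers above.

```python
import json


def print_table_sketch(filename):
    """Hypothetical sketch: JSON profile -> RST simple table on stdout."""
    with open(filename) as handle:
        profile = json.load(handle)
    rule = " ".join("=" * 16 for _ in range(3))
    header = f"{'Checkpoint':<16} {'Cumulative (s)':<16} {'Partial (s)':<16}"
    lines = [rule, header, rule]
    for name, (cumulative, partial) in profile.items():
        lines.append(f"{name:<16} {cumulative:<16.4f} {partial:<16.4f}")
    lines.append(rule)
    print("\n".join(lines))
```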
utilities.benchmark.pyinstrument
PyInstrument helper module.
This module provides helper functions to perform deep profiling of other routines using the third-party PyInstrument package.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
profile(enabled=True, show=True, output_dir=None)
Function to be used as a decorator to perform a deep PyInstrument profile of another routine. It calls the wrapped function with the provided arguments, with profiling enabled; gathers the results in a readable format sorted by total time; and writes the PyInstrument output to a text file in the specified folder, using the function name as the file name.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `enabled` | `bool` | Whether profiling is enabled. | `True` |
| `show` | `bool` | Whether to print info to the terminal. | `True` |
| `output_dir` | `str` | Output directory for the profile text file. | `None` |

Returns:

| Type | Description |
|---|---|
| `Callable` | If profiling is disabled, the wrapped function runs and returns its result without profiling or generating any text file (seamless execution). If profiling is enabled, the result is returned just as seamlessly, but the text output described above is also generated. |
Source code in utilities/benchmark/pyinstrument.py
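The decorator-factory pattern behind this kind of helper can be sketched as follows. The deferred `pyinstrument` import, the file naming, and the `profile_sketch` name are assumptions; the hook into PyInstrument uses its public `Profiler` API (`start`, `stop`, `output_text`).

```python
import functools


def profile_sketch(enabled=True, show=True, output_dir=None):
    """Hypothetical sketch of a PyInstrument decorator factory."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if not enabled:
                # Seamless pass-through: pyinstrument is never even imported.
                return func(*args, **kwargs)
            from pyinstrument import Profiler  # third-party dependency
            profiler = Profiler()
            profiler.start()
            try:
                return func(*args, **kwargs)
            finally:
                profiler.stop()
                text = profiler.output_text()
                if show:
                    print(text)
                if output_dir is not None:
                    with open(f"{output_dir}/{func.__name__}.txt", "w") as f:
                        f.write(text)
        return wrapper
    return decorator
```

Deferring the import keeps the `enabled=False` path dependency-free, which matches the "seamless execution" contract above.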
utilities.benchmark.timefunc
TimeFunc.
This module provides a function for decorating other subroutines to automatically obtain timings for them.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
time_function(filename=None, show=True)
Function to use as a decorator to time another function on each call, seamlessly. It sets up a timer, calls the specified function with the provided arguments, prints the elapsed time, and returns the result of the function call.
The profiling result is optionally printed to screen, and dumped to a file if a filename is specified.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `filename` | `str` | Name of the file to dump profiling information to. | `None` |
| `show` | `bool` | Whether to print info to the terminal. | `True` |

Returns:

| Type | Description |
|---|---|
| `Callable` | A wrapped callable that, when invoked, returns the result of the specified function. |
Source code in utilities/benchmark/timefunc.py
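The timing-decorator idea can be sketched with `time.perf_counter`. This is a minimal sketch in the spirit of the description; the exact message format and the `time_function_sketch` name are assumptions.

```python
import functools
import time


def time_function_sketch(filename=None, show=True):
    """Hypothetical sketch of a per-call timing decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            message = f"{func.__name__}: {elapsed:.6f} s"
            if show:
                print(message)
            if filename is not None:
                # Append so repeated calls accumulate in one log.
                with open(filename, "a") as handle:
                    handle.write(message + "\n")
            return result
        return wrapper
    return decorator
```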
utilities.benchmark.timewith
TimeWith.
This module provides a class for handling timings within a with scope, with support for checkpoints inside the context.
Authors:
Alberto Garcia Garcia (garciagarcia@ice.csic.es)
TimeWith
Class for timing contexts or scopes with checkpointing.
Source code in utilities/benchmark/timewith.py
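The class's overall shape, combining the `__enter__`/`__exit__`/`checkpoint`/`elapsed` methods documented below, can be sketched compactly. Log/JSON dumping is omitted here, and the attribute names are assumptions about the internals, not the real class:

```python
import time


class TimeWithSketch:
    """Hypothetical sketch of a timing context with checkpoints."""

    def __init__(self, name="", show=True):
        self.name = name
        self.show = show
        self.start = time.perf_counter()  # scope start time
        self.last = self.start            # last checkpoint time

    def elapsed(self):
        """Return (cumulative, partial) elapsed times in seconds."""
        now = time.perf_counter()
        cumulative = now - self.start
        partial = now - self.last
        self.last = now
        return cumulative, partial

    def checkpoint(self, name=""):
        cumulative, partial = self.elapsed()
        if self.show:
            print(f"{self.name} {name}: {partial:.6f} s (total {cumulative:.6f} s)")

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, exc_traceback):
        # Report total time when the scope closes.
        self.checkpoint("finished")
```

A typical use would be `with TimeWithSketch("training") as timer:` followed by `timer.checkpoint("epoch done")` inside the block.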
__enter__()
Enter method when a context is created.
Returns:

| Type | Description |
|---|---|
| `Self` | The instance of the class. |
Source code in utilities/benchmark/timewith.py
__exit__(type, value, traceback)
Boilerplate exit method called when the context finishes. It is overridden to optionally print the total time elapsed since the start of the context. This info is also dumped to a file if a filename is specified.
Note: the signature of exit is painful, forgive me for not typing all the arguments here.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `type` | `Optional[Type[BaseException]]` | The exception type if an exception occurred, else None. | required |
| `value` | `Optional[BaseException]` | The exception instance if an exception occurred, else None. | required |
| `traceback` | `Optional[TracebackType]` | The traceback object if an exception occurred, else None. | required |
Source code in utilities/benchmark/timewith.py
__init__(name='', log_filename='profile.log', json_filename='profile.json', show=True)
Initializes the timing context, storing a name for it and capturing the current time as the starting time of the scope.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | A name for the context, used when printing info. | `''` |
| `log_filename` | `str` | Name of the log file to dump profiling information to. | `'profile.log'` |
| `json_filename` | `str` | Name of the JSON file to dump profiling information to. | `'profile.json'` |
| `show` | `bool` | Whether to print info to the terminal. | `True` |
Source code in utilities/benchmark/timewith.py
checkpoint(name='')
Records a checkpoint at the current time within the context, optionally printing the checkpoint's name and the time elapsed since the last checkpoint (or since the start of the scope if no checkpoint has been made). This info is also dumped to a file if a filename is specified.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | A name for the checkpoint, used when printing information. | `''` |
Source code in utilities/benchmark/timewith.py
elapsed()
Returns the elapsed time since the start of the scope and since the previous elapsed call (i.e., the time between checkpoints).
Returns:

| Type | Description |
|---|---|
| `Tuple[float, float]` | Tuple containing the cumulative time in seconds since the start of the context, and the time in seconds spent in the window since the previous call (or since the start, for the first call). |
Source code in utilities/benchmark/timewith.py