
Benchmarking

utilities.benchmark.cprofile

cProfile helper module.

This module provides helper functions to perform deep profiling of other routines using Python's built-in cProfile profiler.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

do_cprofile(enabled, output_dir)

Decorator factory for deep cProfile profiling of another routine. The decorated function is called with its original arguments while profiling is enabled; the results are then gathered in a readable format, sorted by total time, and written to a text file in the specified folder, named after the profiled function.

Parameters:

Name Type Description Default
enabled bool

Whether profiling is enabled.

required
output_dir str

Output directory for the profile text file.

required

Returns:

Type Description
Callable

If profiling is disabled, the decorator returns the function unchanged, so it runs without profiling and no text file is generated. If profiling is enabled, the function still runs transparently and returns its result, but the profile is additionally written to the text file described above.

Source code in utilities/benchmark/cprofile.py
def do_cprofile(enabled: bool, output_dir: str) -> typing.Callable:
    """
    Function to be used as decorator to perform a deep cProfile of another
    routine. It will call such function with the provided arguments with
    profiling enabled, later it gathers all the results in a readable format
    sorting them by total time, and then outputs the cProfile result to a text
    file in the specified folder with the name of the function as file name.

    Args:
        enabled (bool): Whether or not profiling is toggled.
        output_dir (str): Output directory for the profile text file.

    Returns:
        (typing.Callable): If profiling is disabled, it just returns the result of the function
            without profiling and generating any text file (seamless execution). If
            profiling is enabled, it also returns the result of executing the
            function seamlessly but generates the text output as specified above.
    """

    def inner(func: typing.Callable) -> typing.Callable:
        if not enabled:
            return func

        def profiled_func(*args, **kwargs):
            profile = cProfile.Profile()

            profile.enable()
            try:
                return func(*args, **kwargs)
            finally:
                # Disable profiling even if the routine raised, then dump the
                # stats sorted by total time.
                profile.disable()

                s = io.StringIO()
                ps = pstats.Stats(profile, stream=s).sort_stats("tottime")
                ps.print_stats()

                path = pathlib.Path(output_dir)
                path.mkdir(parents=True, exist_ok=True)
                filename = path / (func.__name__ + ".txt")

                with open(filename, "w") as f:
                    f.write(s.getvalue())

        return profiled_func

    return inner
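A usage sketch of the decorator (a condensed, self-contained re-statement that writes the profile into a temporary directory; `slow_sum` and the directory are illustrative names, not part of the module):

```python
import cProfile
import io
import pathlib
import pstats
import tempfile
import typing


def do_cprofile(enabled: bool, output_dir: str) -> typing.Callable:
    # Condensed re-statement of the decorator above for a self-contained demo.
    def inner(func: typing.Callable) -> typing.Callable:
        if not enabled:
            return func

        def profiled_func(*args, **kwargs):
            profile = cProfile.Profile()
            profile.enable()
            try:
                return func(*args, **kwargs)
            finally:
                # Disable profiling, sort the stats by total time, and dump
                # them to <output_dir>/<function name>.txt.
                profile.disable()
                s = io.StringIO()
                pstats.Stats(profile, stream=s).sort_stats("tottime").print_stats()
                path = pathlib.Path(output_dir)
                path.mkdir(parents=True, exist_ok=True)
                (path / (func.__name__ + ".txt")).write_text(s.getvalue())

        return profiled_func

    return inner


out_dir = tempfile.mkdtemp()


@do_cprofile(enabled=True, output_dir=out_dir)
def slow_sum(n: int) -> int:
    return sum(i * i for i in range(n))


result = slow_sum(10_000)
report = pathlib.Path(out_dir) / "slow_sum.txt"
```

With `enabled=False` the decorator is a no-op, so it can be left in place in production code.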

utilities.benchmark.json_profile_to_rst_table

JSON profile to RST table.

This script parses a timing profile from TimeWith contexts and generates an RST table representation to include in our documentation automagically.

To display the help message, run:

python json_profile_to_rst_table.py --help

This lists all the relevant arguments that can be used.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

generate_header(fields)

Generates the table header from a list of field names, with automatic width handling.

Parameters:

Name Type Description Default
fields List

Names for the fields (columns) excluding the first one.

required

Returns:

Type Description
str

An RST string representation for the table header.

Source code in utilities/benchmark/json_profile_to_rst_table.py
def generate_header(fields: typing.List) -> str:
    """
    Generates the table header from a list of field names, with automatic
    width handling.

    Args:
        fields (List): Names for the fields (columns) excluding the first one.

    Returns:
        (str): An RST string representation for the table header.
    """
    name = "Context"
    row_str = "| " + name + " " * (WIDTH - len(name) - 2) + " |"
    for field in fields:
        row_str += " " + field + " " * (WIDTH - len(field) - 2) + " |"
    row_str += "\n"
    row_str += "+"
    for _ in range(len(fields) + 1):
        row_str += "=" * WIDTH + "+"
    row_str += "\n"
    return row_str

generate_multicolumn(name, columns)

Generates a multicolumn cell spanning the specified number of fields, populated with a given name and with automatic width handling.

Parameters:

Name Type Description Default
name str

Text to populate the row.

required
columns int

Number of fields or columns to take.

required

Returns:

Type Description
str

An RST string representation of the multicolumn.

Source code in utilities/benchmark/json_profile_to_rst_table.py
def generate_multicolumn(name: str, columns: int) -> str:
    """
    Generates a multicolumn cell spanning the specified number of fields,
    populated with a given name and with automatic width handling.

    Args:
        name (str): Text to populate the row.
        columns (int): Number of fields or columns to take.

    Returns:
        (str): An RST string representation of the multicolumn.
    """
    multicolumn_str = (
        "| "
        + name
        + " " * (WIDTH * (columns + 1) + columns - len(name) - 2)
        + " |\n"
    )
    return multicolumn_str

generate_row(name, values)

Generates a table row with the given checkpoint name and the values for the fields (each of which spans one column). The width is automatically adjusted to a maximum WIDTH per column.

Parameters:

Name Type Description Default
name str

Name of the checkpoint (row).

required
values List

Values for each field in the checkpoint (columns).

required

Returns:

Type Description
str

An RST string representation of the row with newline at the end.

Source code in utilities/benchmark/json_profile_to_rst_table.py
def generate_row(name: str, values: typing.List) -> str:
    """
    Generates a table row with the given checkpoint name and the values for
    the fields (each of which spans one column). The width is automatically
    adjusted to a maximum WIDTH per column.

    Args:
        name (str): Name of the checkpoint (row).
        values (List): Values for each field in the checkpoint (columns).

    Returns:
        (str): An RST string representation of the row with newline at the end.
    """
    row_str = "| " + name + " " * (WIDTH - len(name) - 2) + " |"
    for v in values:
        row_str += (
            " "
            + ("{:" + str(FLOAT_WIDTH) + ".4f}").format(v)
            + " " * (WIDTH - FLOAT_WIDTH - 2)
            + " |"
        )
    row_str += "\n"
    return row_str

generate_separator(fields)

Generates a table separator with automatic width handling.

Parameters:

Name Type Description Default
fields int

Number of columns or fields (excluding the checkpoint names).

required

Returns:

Type Description
str

An RST string containing the separator representation.

Source code in utilities/benchmark/json_profile_to_rst_table.py
def generate_separator(fields: int) -> str:
    """
    Generates a table separator with automatic width handling.

    Args:
        fields (int): Number of columns or fields (excluding the checkpoint names).

    Returns:
        (str): An RST string containing the separator representation.
    """
    separator_str = "+"
    for _ in range(fields + 1):
        separator_str += "-" * WIDTH + "+"
    separator_str += "\n"
    return separator_str
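Taken together, the generators above can assemble a small table. A minimal sketch, assuming illustrative values for the module-level `WIDTH` and `FLOAT_WIDTH` constants (which are not shown on this page):

```python
import typing

# Assumed values for the module-level constants (not shown on this page).
WIDTH = 20
FLOAT_WIDTH = 10


def generate_separator(fields: int) -> str:
    # Horizontal rule: one WIDTH-wide dashed cell per column.
    separator_str = "+"
    for _ in range(fields + 1):
        separator_str += "-" * WIDTH + "+"
    return separator_str + "\n"


def generate_header(fields: typing.List) -> str:
    # Header row ("Context" plus one cell per field) and its "=" rule.
    name = "Context"
    row_str = "| " + name + " " * (WIDTH - len(name) - 2) + " |"
    for field in fields:
        row_str += " " + field + " " * (WIDTH - len(field) - 2) + " |"
    row_str += "\n+"
    for _ in range(len(fields) + 1):
        row_str += "=" * WIDTH + "+"
    return row_str + "\n"


def generate_row(name: str, values: typing.List) -> str:
    # One data row: checkpoint name plus fixed-precision float cells.
    row_str = "| " + name + " " * (WIDTH - len(name) - 2) + " |"
    for v in values:
        row_str += (
            " "
            + ("{:" + str(FLOAT_WIDTH) + ".4f}").format(v)
            + " " * (WIDTH - FLOAT_WIDTH - 2)
            + " |"
        )
    return row_str + "\n"


table = (
    generate_separator(1)
    + generate_header(["Elapsed [s]"])
    + generate_row("load_data", [0.1234])
    + generate_separator(1)
)
print(table)
```

Because all cells are padded to the same WIDTH, every line of the table ends up with the same length, which is what makes the output a valid RST grid table.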

print_table(filename)

Prints a table in RST format by parsing the specified JSON profile file.

Parameters:

Name Type Description Default
filename str

Path to the JSON profile to parse.

required
Source code in utilities/benchmark/json_profile_to_rst_table.py
def print_table(filename: str) -> None:
    """
    Prints a table in RST format by parsing the specified JSON profile file.

    Args:
        filename (str): Path to the JSON profile to parse.
    """

    with open(filename, "r") as f:
        data = json.load(f)

        # Accumulate total time field.
        accumulated_time = 0.0

        # Add a separator and the header with the selected field names.
        table_str = generate_separator(len(fields.items()))
        table_str += generate_header([v for _, v in fields.items()])

        for key, value in data.items():
            # For each context, place a multicolumn with its name.
            table_str += generate_multicolumn(key, len(fields.items()))
            table_str += generate_separator(len(fields.items()))

            # Traverse each checkpoint of the context.
            for k, v in value.items():
                if k == "time":
                    continue

                # Accumulate this checkpoint total time.
                accumulated_time += v[accumulate]

                # Get the values for the tracked fields in the table and insert
                # a row with the name of the checkpoint and the values for those
                # selected fields.
                time_values = [x for kk, x in v.items() if kk in fields.keys()]
                table_str += generate_row(k, time_values)
                table_str += generate_separator(len(fields.items()))

        # Insert an empty multicolumn to separate total time.
        table_str += generate_multicolumn(" ", len(fields.items()))
        table_str += generate_separator(len(fields.items()))

        # Insert total time row.
        table_str += generate_multicolumn(
            (
                "Total "
                + fields[accumulate]
                + ": "
                + ("{:" + str(FLOAT_WIDTH) + ".4f}").format(accumulated_time)
            ),
            len(fields.items()),
        )
        table_str += generate_separator(len(fields.items()))

        print(table_str)
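Note that `print_table` relies on the module-level `fields` mapping and `accumulate` key (set elsewhere in the script), and it expects a JSON profile with the shape that TimeWith dumps: one object per context, a `time` entry holding the context total, and one object per checkpoint. A hypothetical example (context and checkpoint names are illustrative):

```python
import json

# Hypothetical profile, with the shape inferred from the TimeWith class
# documented below: one object per context, a "time" key with the context
# total, and one nested object per checkpoint.
profile = {
    "training": {
        "time": 1.5,
        "load_data": {"elapsed_time": 0.5, "cumulative_time": 0.5},
        "fit": {"elapsed_time": 1.0, "cumulative_time": 1.5},
    }
}
print(json.dumps(profile, indent=2))
```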

utilities.benchmark.pyinstrument

PyInstrument helper module.

This module provides helper functions to perform deep profiling of other routines using the third-party PyInstrument package.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

profile(enabled=True, show=True, output_dir=None)

Decorator factory for deep profiling of another routine using PyInstrument. The decorated function is called with its original arguments while profiling is enabled; the results are then gathered in a readable format, sorted by total time, and written to a text file in the specified folder, named after the profiled function.

Parameters:

Name Type Description Default
enabled bool

Whether profiling is enabled.

True
show bool

Whether or not to print info to terminal.

True
output_dir str

Output directory for the profile text file.

None

Returns:

Type Description
Callable

If profiling is disabled, the decorator returns the function unchanged, so it runs without profiling and no text file is generated. If profiling is enabled, the function still runs transparently and returns its result, but the profile is additionally written to the text file described above.

Source code in utilities/benchmark/pyinstrument.py
def profile(
    enabled: bool = True, show: bool = True, output_dir: str = None
) -> typing.Callable:
    """
    Function to be used as decorator to perform a deep PyInstrument of another
    routine. It will call such function with the provided arguments with
    profiling enabled, later it gathers all the results in a readable format
    sorting them by total time, and then outputs the PyInstrument result to a
    text file in the specified folder with the name of the function as file name.

    Args:
        enabled (bool): Whether or not profiling is toggled.
        show (bool): Whether or not to print info to terminal.
        output_dir (str): Output directory for the profile text file.

    Returns:
        (typing.Callable): If profiling is disabled, it just returns the result of the function
            without profiling and generating any text file (seamless execution). If
            profiling is enabled, it also returns the result of executing the
            function seamlessly but generates the text output as specified above.
    """

    def inner(func: typing.Callable) -> typing.Callable:
        if not enabled:
            return func

        def profiled_func(*args, **kwargs):
            profiler = pyinstrument.Profiler()

            profiler.start()
            try:
                return func(*args, **kwargs)
            finally:
                # Stop profiling even if the routine raised, then render the
                # report.
                profiler.stop()

                output = profiler.output_text(unicode=True, color=True)

                if show:
                    print(output)

                if output_dir is not None:
                    path = pathlib.Path(output_dir)
                    path.mkdir(parents=True, exist_ok=True)
                    filename = path / (func.__name__ + ".pyinstrument")

                    with open(filename, "w") as f:
                        f.write(output)

        return profiled_func

    return inner

utilities.benchmark.timefunc

TimeFunc.

This module provides a function for decorating other subroutines to automatically obtain timings for them.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

time_function(filename=None, show=True)

Function to use as a decorator to time another function for each call seamlessly. It sets up a timer, calls the specified function with the provided arguments and prints the elapsed time, returning the result of the provided function call.

The profiling result is optionally printed to screen and dumped to a file if a filename is specified.

Parameters:

Name Type Description Default
filename str

Name of the file to dump profiling information.

None
show bool

Whether or not to print info to terminal.

True

Returns:

Type Description
Callable

A decorator that wraps the given function; each call of the wrapped function returns the original function's result.

Source code in utilities/benchmark/timefunc.py
def time_function(filename: str = None, show: bool = True) -> typing.Callable:
    """
    Function to use as a decorator to time another function for each call
    seamlessly. It sets up a timer, calls the specified function with the
    provided arguments and prints the elapsed time, returning the result
    of the provided function call.

    The profiling result is optionally printed to screen and dumped to a file
    if a filename is specified.

    Args:
        filename (str): Name of the file to dump profiling information.
        show (bool): Whether or not to print info to terminal.

    Returns:
        (typing.Callable): A decorator that wraps the given function; each
            call of the wrapped function returns the original function's result.
    """

    def inner(func: typing.Callable) -> typing.Callable:
        def f_timer(*args, **kwargs):
            start = time.time()
            result = func(*args, **kwargs)
            end = time.time()

            output = "<prof>{} took {:.4f} [s]".format(
                func.__name__, end - start
            )

            if show:
                print(termcolor.colored(output, "green"))

            if filename is not None:
                with open(filename, "a") as f:
                    f.write(output + "\n")

            return result

        return f_timer

    return inner
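A usage sketch (a condensed re-statement of the decorator with the termcolor coloring and file output dropped; `busy_sum` is an illustrative function name):

```python
import time
import typing


def time_function(show: bool = True) -> typing.Callable:
    # Condensed re-statement of the decorator above (coloring and the
    # optional file dump dropped for brevity).
    def inner(func: typing.Callable) -> typing.Callable:
        def f_timer(*args, **kwargs):
            start = time.time()
            result = func(*args, **kwargs)
            end = time.time()

            output = "<prof>{} took {:.4f} [s]".format(func.__name__, end - start)
            if show:
                print(output)

            return result

        return f_timer

    return inner


@time_function(show=True)
def busy_sum(n: int) -> int:
    return sum(range(n))


total = busy_sum(100_000)
```

The wrapped function behaves exactly like the original (same arguments, same return value); the timing line is a side effect of each call.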

utilities.benchmark.timewith

TimeWith.

This module provides a class for timing code within a with scope, including checkpoints inside such a context.

Authors:

Alberto Garcia Garcia (garciagarcia@ice.csic.es)

TimeWith

Class for timing contexts or scopes with checkpointing.

Source code in utilities/benchmark/timewith.py
class TimeWith:
    """
    Class for timing contexts or scopes with checkpointing.
    """

    def __init__(
        self,
        name: str = "",
        log_filename: str = "profile.log",
        json_filename: str = "profile.json",
        show: bool = True,
    ) -> None:
        """
        Initialization of the timing context by holding a name for it and also
        capturing the current time as the starting time for the scope.

        Args:
            name (str): A name for the context to be used when printing info.
            log_filename (str): Name of the LOG file to dump profiling information.
            json_filename (str): Name of the JSON file to dump profiling information.
            show (bool): Whether or not to print info to terminal.
        """

        self.name = name
        self.start = time.time()
        self.last = self.start
        self.log_filename = log_filename
        self.json_filename = json_filename
        self.show = show

        if self.json_filename is not None:
            if not os.path.isfile(self.json_filename):
                with open(self.json_filename, "w+") as f:
                    f.write(json.dumps({}))

            with open(self.json_filename, "r") as f:
                data = json.load(f)

            data[name] = {}
            data[name]["time"] = 0.0

            with open(self.json_filename, "w") as f:
                json.dump(data, f, indent=2)

    def elapsed(self) -> Tuple[float, float]:
        """
        Elapsed time getter since start of scope and between individual elapsed
        calls (i.e., time between checkpoints).

        Returns:
            (Tuple[float, float]): Tuple containing the cumulative time since
                the start of the context and the time elapsed since the
                previous call, both in seconds.
        """

        current = time.time()
        cumulative = current - self.start
        total = current - self.last

        # Mark the last time we fetched time with the current one for the next
        # partial timing on checkpoint.
        self.last = current

        return cumulative, total

    def checkpoint(self, name: str = "") -> None:
        """
        Checkpoints at the current time within the context optionally printing
        out the name given to the checkpoint and showing the amount of time
        elapsed since the last checkpoint (or the start of the scope if no
        checkpoint was done). Such info is also dumped to a file if a filename
        is specified.

        Args:
            name (str): A name for the checkpoint to print information.
        """

        cumulative, total = self.elapsed()
        output = "<prof>{}{} took {:.4f} [s] (cumulative {:.4f} [s])".format(
            self.name, name, total, cumulative
        ).strip()

        if self.show:
            print(termcolor.colored(output, "green"))

        if self.log_filename is not None:
            with open(self.log_filename, "a") as f:
                f.write(output + "\n")

        if self.json_filename is not None:
            with open(self.json_filename) as f:
                data = json.load(f)

            data[self.name][name] = {}
            data[self.name][name]["elapsed_time"] = total
            data[self.name][name]["cumulative_time"] = cumulative

            with open(self.json_filename, "w") as f:
                json.dump(data, f, indent=2)

    def __enter__(self) -> Self:
        """
        Enter method when a context is created.

        Returns:
            (Self): The instance of the class.
        """

        return self

    def __exit__(
        self,
        type: Optional[Type[BaseException]],
        value: Optional[BaseException],
        traceback: Optional[BaseException],
    ) -> None:
        """
        Boilerplate exit method when the context is finished. In this case, it
        is overridden to optionally print the total time elapsed since its
        beginning. Such info is also dumped to a file if a filename is specified.

        Note: the signature of __exit__ is painful, forgive me for not typing
        all the arguments here.

        Args:
            type (Optional[Type[BaseException]]): The exception type if an exception occurred, else None.
            value (Optional[BaseException]): The exception instance if an exception occurred, else None.
            traceback (Optional[BaseException]): The traceback object if an exception occurred, else None.
        """

        cumulative, _ = self.elapsed()
        output = "<prof>{} {} took {:.4f} [s]".format(
            self.name, "finished", cumulative
        ).strip()

        if self.show:
            print(termcolor.colored(output, "green"))

        if self.log_filename is not None:
            with open(self.log_filename, "a") as f:
                f.write(output + "\n")

        if self.json_filename is not None:
            with open(self.json_filename, "r") as f:
                data = json.load(f)

            data[self.name]["time"] = cumulative

            with open(self.json_filename, "w") as f:
                json.dump(data, f, indent=2)

__enter__()

Enter method when a context is created.

Returns:

Type Description
Self

The instance of the class.

Source code in utilities/benchmark/timewith.py
def __enter__(self) -> Self:
    """
    Enter method when a context is created.

    Returns:
        (Self): The instance of the class.
    """

    return self

__exit__(type, value, traceback)

Boilerplate exit method when the context is finished. In this case, it is overridden to optionally print the total time elapsed since its beginning. Such info is also dumped to a file if a filename is specified.

Note: the signature of __exit__ is painful, forgive me for not typing all the arguments here.

Parameters:

Name Type Description Default
type Optional[Type[BaseException]]

The exception type if an exception occurred, else None.

required
value Optional[BaseException]

The exception instance if an exception occurred, else None.

required
traceback Optional[BaseException]

The traceback object if an exception occurred, else None.

required
Source code in utilities/benchmark/timewith.py
def __exit__(
    self,
    type: Optional[Type[BaseException]],
    value: Optional[BaseException],
    traceback: Optional[BaseException],
) -> None:
    """
    Boilerplate exit method when the context is finished. In this case, it
    is overridden to optionally print the total time elapsed since its
    beginning. Such info is also dumped to a file if a filename is specified.

    Note: the signature of __exit__ is painful, forgive me for not typing
    all the arguments here.

    Args:
        type (Optional[Type[BaseException]]): The exception type if an exception occurred, else None.
        value (Optional[BaseException]): The exception instance if an exception occurred, else None.
        traceback (Optional[BaseException]): The traceback object if an exception occurred, else None.
    """

    cumulative, _ = self.elapsed()
    output = "<prof>{} {} took {:.4f} [s]".format(
        self.name, "finished", cumulative
    ).strip()

    if self.show:
        print(termcolor.colored(output, "green"))

    if self.log_filename is not None:
        with open(self.log_filename, "a") as f:
            f.write(output + "\n")

    if self.json_filename is not None:
        with open(self.json_filename, "r") as f:
            data = json.load(f)

        data[self.name]["time"] = cumulative

        with open(self.json_filename, "w") as f:
            json.dump(data, f, indent=2)

__init__(name='', log_filename='profile.log', json_filename='profile.json', show=True)

Initialization of the timing context by holding a name for it and also capturing the current time as the starting time for the scope.

Parameters:

Name Type Description Default
name str

A name for the context to be used when printing info.

''
log_filename str

Name of the LOG file to dump profiling information.

'profile.log'
json_filename str

Name of the JSON file to dump profiling information.

'profile.json'
show bool

Whether or not to print info to terminal.

True
Source code in utilities/benchmark/timewith.py
def __init__(
    self,
    name: str = "",
    log_filename: str = "profile.log",
    json_filename: str = "profile.json",
    show: bool = True,
) -> None:
    """
    Initialization of the timing context by holding a name for it and also
    capturing the current time as the starting time for the scope.

    Args:
        name (str): A name for the context to be used when printing info.
        log_filename (str): Name of the LOG file to dump profiling information.
        json_filename (str): Name of the JSON file to dump profiling information.
        show (bool): Whether or not to print info to terminal.
    """

    self.name = name
    self.start = time.time()
    self.last = self.start
    self.log_filename = log_filename
    self.json_filename = json_filename
    self.show = show

    if self.json_filename is not None:
        if not os.path.isfile(self.json_filename):
            with open(self.json_filename, "w+") as f:
                f.write(json.dumps({}))

        with open(self.json_filename, "r") as f:
            data = json.load(f)

        data[name] = {}
        data[name]["time"] = 0.0

        with open(self.json_filename, "w") as f:
            json.dump(data, f, indent=2)

checkpoint(name='')

Checkpoints at the current time within the context optionally printing out the name given to the checkpoint and showing the amount of time elapsed since the last checkpoint (or the start of the scope if no checkpoint was done). Such info is also dumped to a file if a filename is specified.

Parameters:

Name Type Description Default
name str

A name for the checkpoint to print information.

''
Source code in utilities/benchmark/timewith.py
def checkpoint(self, name: str = "") -> None:
    """
    Checkpoints at the current time within the context optionally printing
    out the name given to the checkpoint and showing the amount of time
    elapsed since the last checkpoint (or the start of the scope if no
    checkpoint was done). Such info is also dumped to a file if a filename
    is specified.

    Args:
        name (str): A name for the checkpoint to print information.
    """

    cumulative, total = self.elapsed()
    output = "<prof>{}{} took {:.4f} [s] (cumulative {:.4f} [s])".format(
        self.name, name, total, cumulative
    ).strip()

    if self.show:
        print(termcolor.colored(output, "green"))

    if self.log_filename is not None:
        with open(self.log_filename, "a") as f:
            f.write(output + "\n")

    if self.json_filename is not None:
        with open(self.json_filename) as f:
            data = json.load(f)

        data[self.name][name] = {}
        data[self.name][name]["elapsed_time"] = total
        data[self.name][name]["cumulative_time"] = cumulative

        with open(self.json_filename, "w") as f:
            json.dump(data, f, indent=2)

elapsed()

Elapsed time getter since start of scope and between individual elapsed calls (i.e., time between checkpoints).

Returns:

Type Description
Tuple[float, float]

Tuple containing the cumulative time since the start of the context and the time elapsed since the previous call, both in seconds.

Source code in utilities/benchmark/timewith.py
def elapsed(self) -> Tuple[float, float]:
    """
    Elapsed time getter since start of scope and between individual elapsed
    calls (i.e., time between checkpoints).

    Returns:
        (Tuple[float, float]): Tuple containing the cumulative time since the
            start of the context and the time elapsed since the previous call,
            both in seconds.
    """

    current = time.time()
    cumulative = current - self.start
    total = current - self.last

    # Mark the last time we fetched time with the current one for the next
    # partial timing on checkpoint.
    self.last = current

    return cumulative, total