ModuleProfiler

Main class used to profile an arbitrary nn.Module and describe several of its characteristics, such as tracing input and output shapes, counting model parameters, or estimating the number of operations the model performs.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `input_size_attr` | `str` | Hidden attribute in the module under measurement used to store its input size. | `'__input_size__'` |
| `output_size_attr` | `str` | Hidden attribute in the module under measurement used to store its output size. | `'__output_size__'` |
| `ops_attr` | `str` | Hidden attribute in the module under measurement used to store the number of operations it performs. | `'__ops__'` |
| `inference_start_attr` | `str` | Hidden attribute used to store inference start times while timing a model. | `'__inference_start__'` |
| `inference_end_attr` | `str` | Hidden attribute used to store inference end times while timing a model. | `'__inference_end__'` |
| `io_size_fn_map` | `dict` | Dictionary mapping module types to the functions used to trace their input and output sizes. | `get_default_io_size_map()` |
| `ops_fn_map` | `dict` | Dictionary mapping module types to the functions used to estimate the number of operations they perform. | `get_default_ops_map()` |
| `exclude_from_ops` | `Optional[List[Module]]` | Modules to exclude from ops estimations. | `None` |
| `ts_fmt` | `str` | Timestamp format used to print messages if `verbose=True`. | `'%Y-%m-%d %H:%M:%S'` |
| `verbose` | `bool` | If `True`, enables verbose output mode. | `False` |
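To make the role of the `*_attr` parameters concrete, the sketch below (an illustration of the idea, not the library's actual implementation) shows why hidden-attribute names are configurable: during profiling, measurements are stashed directly on the module object, so the names must not clash with anything the module already defines. `FakeLayer`, `attach_measurement`, `read_measurement`, and `detach_measurement` are illustrative stand-ins.

```python
class FakeLayer:
    """Minimal stand-in for an nn.Module."""

def attach_measurement(module, attr, value):
    # Stash a measurement under a hidden attribute name.
    setattr(module, attr, value)

def read_measurement(module, attr):
    # Read a previously stashed measurement, or None if absent.
    return getattr(module, attr, None)

def detach_measurement(module, attr):
    # Clean up the hidden attribute once profiling is done.
    if hasattr(module, attr):
        delattr(module, attr)

layer = FakeLayer()
attach_measurement(layer, "__input_size__", (1, 3, 224, 224))
print(read_measurement(layer, "__input_size__"))  # (1, 3, 224, 224)
detach_measurement(layer, "__input_size__")
print(read_measurement(layer, "__input_size__"))  # None
```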

count_params(module, param_size=True, param_dtype=True, percent=True)

Counts the number of parameters in a model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module` | `Module` | Model whose parameters will be counted. | required |
| `param_size` | `bool` | If `True`, the size in bits of each parameter will be calculated. | `True` |
| `param_dtype` | `bool` | If `True`, the data type of each parameter will be reported. | `True` |
| `percent` | `bool` | If `True`, the percentage each parameter represents with respect to the total number of parameters in the model will be reported. | `True` |

Returns:

| Type | Description |
| --- | --- |
| `dict` | Analysis results containing the measured module names and each corresponding parameter count. |
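The sketch below illustrates the kind of accounting this method performs, using fake parameter records instead of real torch parameters; the field names and the 32-bit float size are assumptions for illustration only, not the library's actual output schema.

```python
from dataclasses import dataclass

@dataclass
class FakeParam:
    name: str
    numel: int       # number of elements
    dtype_bits: int  # bits per element, e.g. 32 for float32

def count_params_sketch(params):
    total = sum(p.numel for p in params)
    return {
        p.name: {
            "params": p.numel,
            "size_bits": p.numel * p.dtype_bits,  # reported if param_size=True
            "dtype_bits": p.dtype_bits,           # reported if param_dtype=True
            "percent": 100.0 * p.numel / total,   # reported if percent=True
        }
        for p in params
    }

params = [FakeParam("fc.weight", 10 * 5, 32), FakeParam("fc.bias", 10, 32)]
report = count_params_sketch(params)
print(report["fc.weight"]["size_bits"])  # 1600
```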

count_params_csv(file, *args, **kwargs)

Same as count_params but saves a .csv file instead.

count_params_df(*args, **kwargs)

Same as count_params but returns a DataFrame instead.

count_params_html(file, *args, **kwargs)

Same as count_params but saves a .html file instead.

count_params_latex(*args, index=False, **kwargs)

Same as count_params but returns a LaTeX output instead.

estimate_inference_time(module, input, eval=True, num_iters=1000, drop_first=100)

Estimates the time spent in each module during the forward pass of a model. The final results are statistical aggregations over num_iters iterations, dropping the first drop_first iterations to avoid outliers caused by warmup routines.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module` | `Module` | Input module. | required |
| `input` | `Union[Tensor, Tuple[Tensor]]` | Model input. | required |
| `eval` | `bool` | If `True`, the module is set to eval mode before computing the inference time. | `True` |
| `num_iters` | `int` | Number of iterations to be performed. | `1000` |
| `drop_first` | `int` | Number of initial inferences to drop before aggregating the results. | `100` |

Returns:

| Type | Description |
| --- | --- |
| `dict` | Measurement results. |
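The measurement loop described above can be sketched as follows; this is an assumption about the implementation, not the library's actual code, and `run_once` stands in for a model forward pass.

```python
import statistics
import time

def time_inference_sketch(run_once, num_iters=1000, drop_first=100):
    samples = []
    for _ in range(num_iters):
        start = time.perf_counter()
        run_once()
        samples.append(time.perf_counter() - start)
    kept = samples[drop_first:]  # discard warmup iterations
    return {
        "iters": len(kept),
        "mean_s": statistics.mean(kept),
        "min_s": min(kept),
        "max_s": max(kept),
    }

stats = time_inference_sketch(lambda: sum(range(1000)), num_iters=50, drop_first=10)
print(stats["iters"])  # 40
```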

estimate_inference_time_csv(file, *args, **kwargs)

Same as estimate_inference_time but saves a .csv file instead.

estimate_inference_time_df(*args, aggr=True, **kwargs)

Same as estimate_inference_time but returns a DataFrame instead. Additional argument aggr can be set to True if only aggregations should be kept.

estimate_inference_time_html(file, *args, **kwargs)

Same as estimate_inference_time but saves a .html file instead.

estimate_inference_time_latex(*args, index=False, **kwargs)

Same as estimate_inference_time but returns a LaTeX output instead.

estimate_ops(module, input, pred_fn=None, eval=True)

Estimates the number of operations computed in a forward pass of a module.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module` | `Module` | Input module. | required |
| `input` | `Union[Tensor, Tuple[Tensor]]` | Model input. | required |
| `pred_fn` | `Optional[Callable]` | Optional prediction function that replaces the forward call if the module requires additional steps. | `None` |
| `eval` | `bool` | If `True`, the module is set to eval mode before estimating the operations. | `True` |

Returns:

| Type | Description |
| --- | --- |
| `dict` | Results containing the estimated operations per module. |
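As a concrete example of what a per-module operation estimate looks like, the sketch below computes the standard multiply-accumulate count for a fully connected layer. The formula and the convention of counting a MAC as two operations are common choices, not necessarily the ones the library's `ops_fn_map` entries use.

```python
def linear_ops(batch, in_features, out_features, bias=True):
    macs = batch * in_features * out_features  # one MAC per weight per sample
    ops = 2 * macs                             # count multiply and add separately
    if bias:
        ops += batch * out_features            # one add per output element
    return ops

print(linear_ops(batch=1, in_features=128, out_features=64))  # 16448
```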

estimate_ops_csv(file, *args, **kwargs)

Same as estimate_ops but saves a .csv file instead.

estimate_ops_df(*args, **kwargs)

Same as estimate_ops but returns a DataFrame instead.

estimate_ops_html(file, *args, **kwargs)

Same as estimate_ops but saves a .html file instead.

estimate_ops_latex(*args, index=False, **kwargs)

Same as estimate_ops but returns a LaTeX output instead.

estimate_total_inference_time(module, input, eval=True, num_iters=1000, drop_first=100)

Estimates the total time taken by the model to run a single inference.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module` | `Module` | Input module. | required |
| `input` | `Union[Tensor, Tuple[Tensor]]` | Model input. | required |
| `eval` | `bool` | If `True`, the module is set to eval mode before computing the inference time. | `True` |
| `num_iters` | `int` | Number of iterations to be performed. | `1000` |
| `drop_first` | `int` | Number of initial inferences to drop before aggregating the results. | `100` |

Returns:

| Type | Description |
| --- | --- |
| `dict` | Measurement results. |

estimate_total_inference_time_csv(file, *args, **kwargs)

Same as estimate_total_inference_time but saves a .csv file instead.

estimate_total_inference_time_df(*args, aggr=False, **kwargs)

Same as estimate_total_inference_time but returns a DataFrame instead. Additional argument aggr can be set to True if only aggregations should be kept.

estimate_total_inference_time_html(file, *args, **kwargs)

Same as estimate_total_inference_time but saves a .html file instead.

estimate_total_inference_time_latex(*args, index=False, **kwargs)

Same as estimate_total_inference_time but returns a LaTeX output instead.

trace_io_sizes(module, input, pred_fn=None, eval=False)

Traces the input and output tensor shapes of a module given a sample input.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module` | `Module` | Input module. | required |
| `input` | `Union[torch.Tensor, Tuple[torch.Tensor]]` | Model input. | required |
| `pred_fn` | `Optional[Callable]` | Optional prediction function that replaces the forward call if the module requires additional steps. | `None` |
| `eval` | `bool` | If `True`, the module is set to eval mode before tracing the shapes. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `dict` | Results containing input and output shapes of each module. |
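The idea behind this kind of shape tracing can be sketched without torch: wrap each callable so that every call records its input and output shapes. This is an illustration only; the real library works on nn.Module forward hooks and torch tensors, while here plain nested lists stand in for tensors and wrapped callables stand in for modules.

```python
def shape_of(nested):
    """Shape of a nested list, e.g. [[1, 2, 3]] -> (1, 3)."""
    shape = []
    while isinstance(nested, list):
        shape.append(len(nested))
        nested = nested[0]
    return tuple(shape)

def traced(name, fn, results):
    """Wrap fn so every call records its input and output shapes."""
    def wrapper(x):
        out = fn(x)
        results[name] = {"input": shape_of(x), "output": shape_of(out)}
        return out
    return wrapper

results = {}
double = traced("double", lambda x: [[2 * v for v in row] for row in x], results)
pool = traced("pool", lambda x: [[sum(row)] for row in x], results)

out = pool(double([[1, 2, 3], [4, 5, 6]]))
print(results["pool"])  # {'input': (2, 3), 'output': (2, 1)}
```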

trace_io_sizes_csv(file, *args, **kwargs)

Same as trace_io_sizes but saves a .csv file instead.

trace_io_sizes_df(*args, **kwargs)

Same as trace_io_sizes but returns a DataFrame instead.

trace_io_sizes_html(file, *args, **kwargs)

Same as trace_io_sizes but saves a .html file instead.

trace_io_sizes_latex(*args, index=False, **kwargs)

Same as trace_io_sizes but returns a LaTeX output instead.