Miscellaneous utilities
Configuration management
Configuration files are managed by subclasses of
lisa.conf.MultiSrcConf
, which allows loading from a YAML file (not to be confused with serializing the instance).
- class lisa.conf.DeferredValue(callback, *args, **kwargs)[source]
Bases:
object
Wrapper similar to
functools.partial()
allowing to defer computation of the value until the key is actually used.
Once computed, the deferred value is replaced by the value that was computed. This is useful for values that are very costly to compute, but it should be used with care, as the value will usually not be available in offline
lisa.platforms.platinfo.PlatformInfo
instances. This means that client code such as submodules of lisa.analysis will typically not have it available (unless eval_deferred() was called), although they might need it.
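The deferral pattern described above can be sketched as follows; this is a simplified, self-contained stand-in for illustration, not the actual lisa.conf.DeferredValue implementation:

```python
class Deferred:
    # Simplified sketch of a deferred value: the callback runs at most
    # once, the first time the value is requested, and the result is
    # cached for subsequent lookups.
    def __init__(self, callback, *args, **kwargs):
        self._compute = lambda: callback(*args, **kwargs)
        self._computed = False
        self._value = None

    def get(self):
        if not self._computed:
            self._value = self._compute()
            self._computed = True
        return self._value
```

The costly callback is only invoked on first access, which is the behaviour the documentation describes for values replaced in place after computation.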
- class lisa.conf.FilteredDeferredValue(callback, sentinel=None)[source]
Bases:
DeferredValue
Same as lisa.conf.DeferredValue, except that the given sentinel value will be interpreted as no value.
- class lisa.conf.DeferredExcep(excep)[source]
Bases:
DeferredValue
Specialization of DeferredValue to lazily raise an exception.
- Parameters:
excep (BaseException) – Exception to raise when the value is used.
- exception lisa.conf.TopLevelKeyError(key)[source]
Bases:
ValueError
Exception raised when no top-level key matches the expected one in the given configuration file.
- class lisa.conf.KeyDescBase(name, help)[source]
Bases:
ABC
Base class for configuration file key descriptors.
This allows defining the structure of the configuration file, in order to sanitize user input and generate help snippets used in various places.
- INDENTATION = ' '
- property qualname
“Qualified” name of the key.
This is a slash-separated path in the config file from the root to that key: <parent qualname>/<name>
- property path
Path in the config file from the root to that key.
Note
This includes the top-level key name, which must be removed before it’s fed to
MultiSrcConf.get_nested_key()
.
- abstract get_help(style=None, last=False)[source]
Get a help message describing the key.
- Parameters:
style (str) – When “rst”, reStructuredText formatting may be applied.
last (bool) – True if this is the last item in a list.
- abstract validate_val(val)[source]
Validate a value to be used for that key.
- Raises:
TypeError – When the value has the wrong type
ValueError – If the value does not comply with some other constraints. Note that constraints should ideally be encoded in the type itself, to make the help message as straightforward as possible.
- class lisa.conf.KeyDesc(name, help, classinfo, newtype=None, deepcopy_val=True)[source]
Bases:
KeyDescBase
Key descriptor describing a leaf key in the configuration.
- Parameters:
name – Name of the key
help – Short help message describing the use of that key
classinfo (collections.abc.Sequence) – sequence of allowed types for that key. As a special case, None is allowed in that sequence of types, even though it is not strictly speaking a type.
newtype (str or None) – If specified, a type with the given name will be created for that key. Otherwise, a camel-case name derived from the key name will be used: toplevel-key/sublevel/mykey will give a type named SublevelMykey. This class will be exposed as an attribute of the parent MultiSrcConf (which is why the top-level key is omitted from its name). A getter will also be created on the parent configuration class, so that the typed key is exposed to exekall. If the key is not present in the configuration object, the getter will return None.
deepcopy_val (bool) – If True, the values will be deepcopied upon lookup. This prevents accidental modification of mutable types (like lists) by the user.
- property newtype
- validate_val(val)[source]
Check that the value is an instance of one of the types specified in self.classinfo.
If the value is not an instance of any of these types, a TypeError is raised, corresponding to the first type in the tuple, which is assumed to be the main one.
- exception lisa.conf.ConfigKeyError(msg, key=None, src=None)[source]
Bases:
KeyError
Exception raised when a key is not found in the config instance.
- exception lisa.conf.MissingBaseKeyError(msg, key=None, src=None)[source]
Bases:
ConfigKeyError
Exception raised when a base key needed to compute a derived key is missing.
- exception lisa.conf.DeferredValueComputationError(msg, excep, key=None, src=None)[source]
Bases:
ConfigKeyError
Raised when computing the value of a DeferredValue leads to an exception.
- exception lisa.conf.KeyComputationRecursionError(msg, key=None, src=None)[source]
Bases:
ConfigKeyError, RecursionError
Raised when DerivedKeyDesc.compute_val() is reentered while computing a given key on a configuration instance, or when a DeferredValue callback is reentered.
- class lisa.conf.DerivedKeyDesc(name, help, classinfo, base_key_paths, compute, newtype=None)[source]
Bases:
KeyDesc
Key descriptor describing a key derived from other keys.
Derived keys cannot be added from a source, since they are purely computed out of other keys. It is also not possible to change their source priorities. To achieve that, set the source priorities on the keys it is based on.
- Parameters:
base_key_paths (list(list(str))) – List of paths to the keys this key is derived from. The paths, in the form of a list of strings, are relative to the current level. To reference a level above the current one, use the special key “..”.
compute (collections.abc.Callable) – Function used to compute the value of the key. It takes a dictionary of the base keys specified in base_key_paths as its only parameter and is expected to return the key’s value.
- property help
- get_non_evaluated_base_keys(conf)[source]
Get the KeyDescBase of the base keys that have a DeferredValue value.
- class lisa.conf.LevelKeyDesc(name, help, children, value_path=None)[source]
Bases:
KeyDescBase, Mapping
Key descriptor defining a hierarchical level in the configuration.
- Parameters:
name – name of the key in the configuration
help – Short help describing the use of the keys inside that level
children (collections.abc.Sequence) – Sequence of KeyDescBase defining the allowed keys under that level.
value_path (list(str) or None) – Relative path to a sub-key that will receive assignments to that level for non-mapping types. This allows turning a leaf key into a level while preserving backward compatibility, as long as:
The key did not accept mapping values, otherwise it would be ambiguous and is therefore rejected.
The old leaf key has a matching new leaf key, that is, a sub-key of the new level key.
In practice, this allows turning a single knob into a tree of settings.
Children keys will get this key assigned as a parent when passed to the constructor.
- property key_desc
- class lisa.conf.VariadicLevelKeyDesc(name, help, child, value_path=None)[source]
Bases:
LevelKeyDesc
Level key descriptor that allows configuration-source-defined sub-level keys.
- Parameters:
child (lisa.conf.LevelKeyDesc) – Variadic level. Its name will only be used for documentation purposes; the configuration instances will be able to hold any string.
- Variable keyword arguments:
Forwarded to
lisa.conf.LevelKeyDesc
.
This allows “instantiating” a whole sub-configuration for variable level keys.
- class lisa.conf.DelegatedLevelKeyDesc(name, help, conf, **kwargs)[source]
Bases:
LevelKeyDesc
Level key descriptor that imports the keys from another MultiSrcConfABC subclass.
- Parameters:
conf (MultiSrcConfABC) – Configuration class to extract keys from.
- Variable keyword arguments:
Forwarded to
lisa.conf.LevelKeyDesc
.
This allows embedding a configuration inside another one, mostly to be able to split a configuration class while preserving backward compatibility.
Note
Only the children keys are taken from the passed level; other information such as value_path is ignored and must be set explicitly.
- class lisa.conf.TopLevelKeyDescBase(levels, *args, **kwargs)[source]
Bases:
LevelKeyDesc
Top-level key descriptor, which defines the top-level key to use in the configuration files.
- Parameters:
levels (list(str)) – Levels of the top-level key, as a list of strings. Each item specifies a level in a mapping, so that multiple classes can share the same actual top-level without specific cooperation.
This top-level key is omitted in all interfaces except for the configuration file, since it only reflects the configuration class.
- class lisa.conf.TopLevelKeyDesc(name, *args, **kwargs)[source]
Bases:
TopLevelKeyDescBase
Regular top-level key descriptor, with only one level.
- Parameters:
name (str) – Name of the top-level key, as a string.
- class lisa.conf.NestedTopLevelKeyDesc(levels, *args, **kwargs)[source]
Bases:
TopLevelKeyDescBase
Top-level key descriptor, with an arbitrary amount of levels.
- class lisa.conf.MultiSrcConfABC[source]
Bases:
Serializable, ABC
Warning
Arbitrary code can be executed while loading an instance from a YAML or Pickle file. To include untrusted data in YAML, use the !untrusted tag along with a string
- classmethod from_yaml_map(path, add_default_src=True)[source]
Allow reloading from a plain mapping, to avoid having to specify a tag in the configuration file. The content is hosted under the top-level key specified in STRUCTURE.
- Parameters:
Note
Only load YAML files from trusted sources, as loading can lead to arbitrary code execution.
- classmethod from_yaml_map_list(path_list, add_default_src=True)[source]
Create a mapping of configuration classes to instances, by loading them from the list of paths using from_yaml_map() and merging them.
- Parameters:
path_list (list(str)) – List of paths to YAML configuration files.
add_default_src – See
from_yaml_map()
.
Note
When merging, the configuration coming from the rightmost path will win if it defines keys that were also defined in another file. Each file will be mapped to a different source, named after the basename of the file.
Note
Only load YAML files from trusted sources, as loading can lead to arbitrary code execution.
- property as_yaml_map
Give a mapping suitable for storing in a YAML configuration file.
See also
- to_yaml_map(path)[source]
Write a configuration file, with the key descriptions in comments.
- Parameters:
path (str) – Path to the file to write to.
- to_yaml_map_str(**kwargs)[source]
Return the content of the file that would be created by to_yaml_map(), as a string.
- Variable keyword arguments:
Forwarded to
to_yaml_map()
- class lisa.conf.MultiSrcConf(conf=None, src='user', add_default_src=True)[source]
Bases:
MultiSrcConfABC, Loggable, Mapping
Base class providing layered configuration management.
- Parameters:
conf (collections.abc.Mapping) – Mapping to initialize the configuration with. This parameter must remain optional; when it is not provided, the object is assumed to contain a default base configuration.
src (str) – Name of the source added when passing conf.
The class inherits from collections.abc.Mapping, which means it can be used like a read-only dict. Writing to it is handled by a different API that allows naming the source of the values that are stored.
Each configuration key can be either a leaf key, which holds a value, or a level key, which allows defining nested levels. The allowed keys are set by the STRUCTURE class attribute.
Each leaf key can hold different values coming from different named sources. By default, the last added source has the highest priority and will be served when looking up that key. A different priority order can be defined for a specific key if needed.
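The layered-source lookup can be illustrated with a minimal sketch; nested levels, validation and deferred values are omitted, and the name LayeredConf is made up for the purpose of this example:

```python
class LayeredConf:
    # Minimal sketch of multi-source lookup: the last added source wins,
    # unless a per-key priority override was set with force_src().
    def __init__(self):
        self._srcs = {}       # source name -> flat mapping of key/value
        self._prio = []       # source names, highest priority first
        self._key_prio = {}   # key -> overriding priority list

    def add_src(self, src, conf, fallback=False):
        self._srcs[src] = dict(conf)
        if fallback:
            self._prio.append(src)      # lowest priority
        else:
            self._prio.insert(0, src)   # highest priority

    def force_src(self, key, src_prio):
        if src_prio is None:
            self._key_prio.pop(key, None)
        else:
            self._key_prio[key] = list(src_prio)

    def get_key(self, key):
        for src in self._key_prio.get(key, self._prio):
            try:
                return self._srcs[src][key]
            except KeyError:
                continue
        raise KeyError(key)
```

A 'user' source added after a 'default' source shadows it for the keys it defines, while force_src() lets a specific key prefer 'default' again, mirroring the semantics described above.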
See also
This base class will modify the docstring of subclasses, using it as an str.format template with the following placeholders:
{generated_help}: snippet of reStructuredText containing the list of allowed keys.
{yaml_example}: example snippet of YAML.
It will also create the types specified using newtype in the KeyDesc, along with a getter to expose them to exekall.
Note
Since the docstring is interpreted as a template, “{” and “}” characters must be doubled to appear in the final output.
Attention
The layout of the configuration is typically guaranteed to be backward-compatible in terms of accepted input, but the layout itself might change. This means that the path to a given key could change, as long as old input is still accepted. Types of values can also be widened, so third-party code reusing config classes from lisa might have to evolve along with the configuration changes.
Warning
Arbitrary code can be executed while loading an instance from a YAML or Pickle file. To include untrusted data in YAML, use the !untrusted tag along with a string
- abstract STRUCTURE()[source]
Class attribute defining the structure of the configuration file, as an instance of TopLevelKeyDescBase.
- DEFAULT_SRC = {}
Source added automatically using
add_src()
under the name ‘default’ when instances are built.
- __copy__()[source]
Shallow copy of the nested configuration tree, without duplicating the leaf values.
- to_map()[source]
Export the configuration as a mapping.
The return value preserves key-specific priority override lists, which is not the case when directly passing the instance to dict().
- classmethod from_map(mapping, add_default_src=True)[source]
Create a new configuration instance, using the output of
to_map()
- add_src(src, conf, filter_none=False, fallback=False, inplace=True)[source]
Add a source of configuration.
- Parameters:
src (str) – Name of the source to add.
conf (collections.abc.Mapping) – Nested mapping of key/values to overlay
filter_none (bool) – Ignore the keys that have a None value. That simplifies the creation of the mapping, by having keys always present. It should not be used if a None value for a key is expected, as opposed to not having that key set at all.
fallback (bool) – If True, the source will be added as a fallback, which means at the end of the priority list. By default, the source will have the highest priority and will be used unless a key-specific priority override is set up.
inplace (bool) – If True, the object is modified. If False, a mutated copy is returned and the original object is left unmodified.
This method provides a way to update the configuration, by importing a mapping as a new source.
- set_default_src(src_prio)[source]
Set the default source priority list.
- Parameters:
src_prio (collections.abc.Sequence(str)) – list of source names, first is the highest priority
Adding sources using
add_src()
in the right order is preferable, but the default priority order can be specified using that method.
- force_src_nested(key_src_map)[source]
Force the source priority list for all the keys defined in the nested mapping
key_src_map
- Parameters:
key_src_map (collections.abc.Mapping) – nested mapping of keys to priority list of sources
- force_src(key, src_prio)[source]
Force the source priority list for a given key
- Parameters:
key (str) – name of the key. Only leaf keys are allowed here, since level keys have no source on their own.
src_prio (collections.abc.Sequence(str) or None) – List of sources in priority order (first is highest priority). Special value
None
can be used to remove the key-specific priority override, so the default priority list will be used instead.
- eval_deferred(cls=<class 'lisa.conf.DeferredValue'>, src=None, resolve_src=True, error='raise')[source]
Evaluate instances of
DeferredValue
that can be used for values that are expensive to compute.
- Parameters:
cls (subclass of DeferredValue) – Only evaluate values that are instances of that class. This can be used to have different categories of DeferredValue by subclassing.
src (str or None) – If not None, only evaluate values that were added under that source name.
resolve_src (bool) – If True, resolve the source of each key and only compute deferred values for that source.
error (str or None) – If 'raise' or None, exceptions are raised as usual. If 'log', the exception is logged at error level.
- __getstate__()[source]
Filter instances of DeferredValue that are not computed already, since their runtime parameters will probably not be available after deserialization.
If needed, call eval_deferred() before serializing.
- get_key(key, src=None, eval_deferred=True, quiet=False)[source]
Get the value of the given key. It returns a deepcopy of the value.
The special key “..” can be used to refer to the parent in the hierarchy.
- Parameters:
key (str) – name of the key to lookup
src (str or None) – If not None, look up the value of the key in that source
eval_deferred (bool) – If True, evaluate instances of DeferredValue if needed.
quiet (bool) – Avoid logging the access.
Note
Using the indexing operator self[key] is preferable in most cases, but this method provides more parameters.
- get_nested_key(key, *args, **kwargs)[source]
Same as
get_key()
but works on a list of keys to access nested mappings.
- get_src_map(key)[source]
Get a mapping of all sources for the given
key
, in priority order (first item is the highest priority source).
- class lisa.conf.SimpleMultiSrcConf(conf=None, src='user', add_default_src=True)[source]
Bases:
MultiSrcConf
Like MultiSrcConf, but with a simpler config file. conf and source are not available, and the behaviour is as if all keys were located under a conf key. We do not allow overriding the source for this kind of configuration, to keep the YAML interface simple and dict-like.
Warning
Arbitrary code can be executed while loading an instance from a YAML or Pickle file. To include untrusted data in YAML, use the !untrusted tag along with a string
- classmethod from_map(*args, **kwargs)[source]
Create a new configuration instance, using the output of
to_map()
- to_map()[source]
Export the configuration as a mapping.
The return value preserves key-specific priority override lists, which is not the case when directly passing the instance to dict().
- to_yaml_map(path, add_placeholder=False, placeholder='<no default>')[source]
Write a configuration file, with the key descriptions in comments.
- Parameters:
path (str) – Path to the file to write to.
add_placeholder (bool) – If True, a placeholder value will be used for keys that don’t have values. This allows creating template configuration files that list all keys.
placeholder (object) – Placeholder to use for missing values when add_placeholder is used.
- class lisa.conf.Configurable[source]
Bases:
ABC
Pair a regular class with a configuration class.
The pairing is achieved by inheriting from Configurable and setting the CONF_CLASS attribute. The benefits are:
The docstring of the class is processed as a string template and {configurable_params} is replaced with a Sphinx-compliant list of parameters. The help and type of each parameter are extracted from the configuration class.
The DEFAULT_SRC attribute of the configuration class is updated with the non-None default values of the class __init__ parameters.
The conf_to_init_kwargs() method allows turning a configuration object into a dictionary suitable for passing to __init__ as **kwargs.
The check_init_param() method allows checking the types of __init__ parameters according to what is specified in the configuration class.
Most of the time, the configuration keys and __init__ parameters have the same name (modulo underscores/dashes, which are handled automatically). In that case, the mapping between config keys and __init__ parameters is done without user intervention. When that is not the case, the INIT_KWARGS_KEY_MAP class attribute can be used. It is a dictionary with __init__ parameter names as keys, and paths to configuration keys as values. Such a path is a list of strings, to take into account sublevels like ['level-key', 'sublevel', 'foo'].
Note
A given configuration class must be paired to only one class. Otherwise, the
DEFAULT_SRC
conf class attribute will be updated multiple times, leading to unexpected results.
Note
Some services offered by
Configurable
are not extended to subclasses of a class using it. For example, it would not make sense to update DEFAULT_SRC using a subclass's __init__ parameters.
- INIT_KWARGS_KEY_MAP = {}
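The key-to-parameter mapping can be sketched as follows. This is a hypothetical illustration of the naming convention (dashes in config keys, underscores in parameters, with INIT_KWARGS_KEY_MAP-style overrides), not the actual conf_to_init_kwargs() implementation:

```python
def conf_to_init_kwargs(conf, param_names, key_map=None):
    # Hypothetical sketch: each __init__ parameter name maps to a config
    # key by turning underscores into dashes, unless key_map provides an
    # explicit path of keys (to reach sublevels).
    key_map = key_map or {}
    kwargs = {}
    for param in param_names:
        path = key_map.get(param, [param.replace('_', '-')])
        val = conf
        try:
            for key in path:
                val = val[key]
        except KeyError:
            continue  # parameter not set in the configuration
        kwargs[param] = val
    return kwargs
```

Parameters absent from the configuration are simply left out of the resulting kwargs, letting __init__ defaults apply.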
- classmethod __init_subclass__(**kwargs)[source]
This method is called when a class is subclassed.
The default implementation does nothing. It may be overridden to extend subclasses.
Regression testing
- class lisa.regression.ResultCount(passed, failed)
Bases:
tuple
Create new instance of ResultCount(passed, failed)
- __getnewargs__()
Return self as a plain tuple. Used by copy and pickle.
- __match_args__ = ('passed', 'failed')
- static __new__(_cls, passed, failed)
Create new instance of ResultCount(passed, failed)
- failed
Alias for field number 1
- passed
Alias for field number 0
- class lisa.regression.RegressionResult(testcase_id, old_count, new_count, alpha=None)[source]
Bases:
object
Compute failure-rate regression between old and new series.
The regression is checked using Fisher’s exact test.
- Parameters:
testcase_id (str) – ID of the testcase, used for pretty-printing
old_count (ResultCount) – number of times the test passed and failed in the old series
new_count (ResultCount) – number of times the test passed and failed in the new series
alpha (float) – Alpha risk when carrying out the statistical test
- classmethod from_result_list(testcase_id, old_list, new_list, alpha=None)[source]
Build a RegressionResult from two lists of lisa.tests.base.Result, or objects that can be converted to bool.
Note
Only FAILED and PASSED results are taken into account; other results are ignored.
- Parameters:
testcase_id (str) – ID of the testcase
old_list (list(lisa.tests.base.Result)) – old series
new_list (list(lisa.tests.base.Result)) – new series
alpha (float) – Alpha risk of the statistical test
- property sample_size
Tuple of sample sizes for old and new series.
- property failure_pc
Tuple of failure rates in percent for the old and new series.
- property failure_delta_pc
Delta between old and new failure rate in percent.
- property significant
True if there is a significant difference in failure rate, False otherwise.
- property p_val
P-value of the statistical test.
- get_p_val(alternative='two-sided')[source]
Compute the p-value of the statistical test, with the given alternative hypothesis.
- property fix_validation_min_iter_nr
Number of iterations required to validate a fix that would “revert” a regression.
Assuming that the “fixed” failure rate is exactly equal to the “old” one, this gives the number of iterations after which comparing the “fixed” failure rate with the “new” failure rate will give a statistically significant result.
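Fisher's exact test on a passed/failed contingency table can be sketched in pure Python. This uses the minimum-likelihood two-sided convention (the default of scipy.stats.fisher_exact) and is only an illustration, not the lisa.regression implementation:

```python
from math import comb

def fisher_exact_p(old_passed, old_failed, new_passed, new_failed):
    # Two-sided Fisher's exact test on the 2x2 contingency table
    # [[old_passed, old_failed], [new_passed, new_failed]].
    a, b, c, d = old_passed, old_failed, new_passed, new_failed
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, row1)

    def prob(x):
        # Hypergeometric probability of x in the top-left cell,
        # with all margins fixed.
        return comb(col1, x) * comb(n - col1, row1 - x) / denom

    p_obs = prob(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    # Sum the probabilities of all tables at least as extreme as the
    # observed one (small tolerance guards against float rounding).
    return sum(p for p in map(prob, range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))
```

A small p-value (below the alpha risk) indicates a statistically significant difference between the old and new failure rates.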
- lisa.regression.compute_regressions(old_list, new_list, remove_tags=None, **kwargs)[source]
Compute a list of RegressionResult out of two lists of exekall.engine.FrozenExprVal.
The tests are first grouped by their ID, and a RegressionResult is then computed for each of these IDs.
- Parameters:
old_list (list(exekall.engine.FrozenExprVal)) – old series of exekall.engine.FrozenExprVal.
new_list (list(exekall.engine.FrozenExprVal)) – new series of exekall.engine.FrozenExprVal. Values with a UUID that is also present in old_list will be removed from that list before the regressions are computed.
remove_tags (list(str) or None) – Remove the given list of tags from the IDs before computing the regression. That allows computing regressions with a different “board” tag, for example.
- Variable keyword arguments:
Forwarded to
RegressionResult.from_result_list()
.
Utilities
Miscellaneous utilities that don’t fit anywhere else.
Also used as a home for everything that would create cyclic dependency issues between modules if they were hosted in their “logical” module. This is mostly done for secondary utilities that are not used often.
- lisa.utils.LISA_HOME = '/home/ubuntu/builds/-JSzPHSA/1/tooling/lisa/tmp/tmp.P6QKibgwKj-worktree'
The detected location of your LISA installation
- lisa.utils.LISA_CACHE_HOME = '/home/ubuntu/builds/-JSzPHSA/1/tooling/lisa/tmp/tmp.P6QKibgwKj-worktree/cache/git-32168f3cc4c33a6512aef14975782cce0662bd6f'
Base folder used for caching files.
- lisa.utils.LISA_HOST_ABI = 'x86_64'
ABI of the machine that imported that module.
- lisa.utils.TASK_COMM_MAX_LEN = 15
Value of the TASK_COMM_LEN - 1 macro in the kernel, to account for the \0 terminator.
- lisa.utils.sphinx_register_nitpick_ignore(x)[source]
Register an object whose name cannot be resolved, and therefore cannot be cross-referenced, by Sphinx.
- lisa.utils.sphinx_nitpick_ignore()[source]
Set of objects to ignore without warning when cross referencing in Sphinx.
- class lisa.utils.UnboundMethodType[source]
Bases:
object
Dummy class to be used to check if a function is a method defined in a class or not:

class C:
    def f(self):
        ...

    @classmethod
    def f_class(cls):
        ...

    @staticmethod
    def f_static():
        ...

def g():
    ...

assert isinstance(C.f, UnboundMethodType)
assert isinstance(C.f_class, UnboundMethodType)
assert isinstance(C.f_static, UnboundMethodType)
assert not isinstance(g, UnboundMethodType)
- class lisa.utils.bothmethod(f)[source]
Bases:
object
Decorator to allow a method to be used both as an instance method and a classmethod.
If it’s called on a class, the first parameter will be bound to the class, otherwise it will be bound to the instance.
- class lisa.utils.instancemethod(f)[source]
Bases:
object
Decorator providing a hybrid of a normal method and a classmethod:
Like a classmethod, it can be looked up on the class itself, and the class is passed as first parameter. This allows selecting the class “manually” before applying on an instance.
Like a normal method, it can be looked up on an instance. In that case, the first parameter is the class of the instance and the second parameter is the instance itself.
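This hybrid lookup can be sketched with the descriptor protocol. The name hybridmethod and the code below are a simplified illustration, not the actual lisa.utils.instancemethod implementation:

```python
import functools

class hybridmethod:
    # Sketch of a descriptor combining classmethod and method lookup:
    # looked up on the class, it binds the class as first parameter;
    # looked up on an instance, it binds the class first and the
    # instance second.
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, objtype=None):
        if obj is None:
            return functools.partial(self.f, objtype)
        else:
            return functools.partial(self.f, type(obj), obj)
```

This reproduces the two lookup behaviours described above: class access passes only the class, instance access passes the class then the instance.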
- class lisa.utils.Loggable[source]
Bases:
object
A simple class for uniformly named loggers
- property logger
Convenience short-hand for
self.get_logger()
.
- lisa.utils.compose(*fs)[source]
Compose multiple functions such that compose(f, g)(x) == g(f(x)).
Note
This handles functions with arity higher than 1 well, as if they were curried. Each function consumes the number of parameters it needs out of the parameters passed to the composed function. Innermost functions are served first.
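A simplified single-argument version of this composition can be sketched as follows (the real lisa.utils.compose also handles higher-arity functions, as the note describes):

```python
from functools import reduce

def compose(*fs):
    # compose(f, g)(x) == g(f(x)): feed the value through the
    # functions from left to right, innermost first.
    def composed(x):
        return reduce(lambda acc, f: f(acc), fs, x)
    return composed
```

For example, compose(f, g) applies f first and then g, matching the equation above.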
- class lisa.utils.mappable(f)[source]
Bases:
object
Decorator that allows the decorated function to be mapped on an iterable:

@mappable
def f(x):
    return x * 2

f @ [1, 2, 3] == map(f, [1, 2, 3])
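The @ operator support relies on __matmul__; a minimal sketch of such a decorator, consistent with the example above but not necessarily identical to the real implementation:

```python
class mappable:
    # Wrap a callable so that `f @ iterable` means map(f, iterable),
    # while plain calls still go through to the wrapped function.
    def __init__(self, f):
        self.f = f

    def __call__(self, *args, **kwargs):
        return self.f(*args, **kwargs)

    def __matmul__(self, iterable):
        return map(self.f, iterable)
```
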
- lisa.utils.get_subclasses(cls, only_leaves=False, cls_set=None)[source]
Get all indirect subclasses of the class.
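A sketch of the recursive walk over __subclasses__(); the cls_set parameter of the real function is ignored here:

```python
def get_subclasses(cls, only_leaves=False):
    # Recursively collect every direct and indirect subclass; with
    # only_leaves, keep only classes that have no subclass themselves.
    result = set()
    for sub in cls.__subclasses__():
        result.add(sub)
        result |= get_subclasses(sub)
    if only_leaves:
        result = {c for c in result if not c.__subclasses__()}
    return result
```
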
- lisa.utils.get_cls_name(cls, style=None, fully_qualified=True)[source]
Get a prettily-formatted name for the class given as parameter.
- Parameters:
cls (type) – Class or typing hint to get the name from.
style (str) – When “rst”, a reStructuredText snippet is returned.
- lisa.utils.get_common_ancestor(classes)[source]
Pick the most derived common ancestor between the classes, assuming single inheritance.
If multiple inheritance is used, only the first base of each class is considered.
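Under the single-inheritance assumption, this amounts to intersecting the MROs; a sketch, where "most derived" is taken to be the earliest common entry in the first class's MRO:

```python
def get_common_ancestor(classes):
    # Intersect the MROs of all classes, then pick the entry that
    # appears earliest in the first class's MRO, i.e. the most
    # derived common base.
    classes = list(classes)
    common = set(classes[0].__mro__)
    for cls in classes[1:]:
        common &= set(cls.__mro__)
    for cls in classes[0].__mro__:
        if cls in common:
            return cls
```

Since object is in every MRO, the loop always finds a common base.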
- class lisa.utils.HideExekallID[source]
Bases:
object
Hide the subclasses in the simplified ID format of exekall.
That is mainly used for uninteresting classes that do not add any useful information to the ID. This should not be used on domain-specific classes since alternatives may be used by the user while debugging for example. Hiding too many classes may lead to ambiguity, which is exactly what the ID is fighting against.
- lisa.utils.memoized(f)[source]
Decorator to memoize the result of a callable, based on
functools.lru_cache()
Note
The first parameter of the callable is cached with a weak reference. This suits well the method use-case, since we don’t want the memoization of methods to prevent garbage collection of the instances they are bound to.
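The weak-reference caching of the first parameter can be sketched as follows. It exploits the fact that weakref.ref objects of a live referent hash and compare like the referent, so they work as lru_cache keys; this is an illustration of the idea, not the actual lisa.utils.memoized code (keyword arguments are not handled here):

```python
import functools
import weakref

def memoized(f):
    # Cache on a weak reference to the first argument, so memoizing a
    # method does not keep instances alive just because of the cache.
    @functools.lru_cache(maxsize=None)
    def cached(ref, *args):
        return f(ref(), *args)

    @functools.wraps(f)
    def wrapper(first, *args):
        return cached(weakref.ref(first), *args)
    return wrapper
```

Once the instance is garbage collected, the dead reference no longer matches any live key, so the cache entry becomes unreachable through normal lookups.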
- lisa.utils.lru_memoized(first_param_maxsize=None, other_params_maxsize=1024)[source]
Decorator to memoize the result of a callable, based on
functools.lru_cache()
- Parameters:
Note
The first parameter of the callable is cached with a weak reference when the function is a method. This suits well the method use-case, since we don’t want the memoization of methods to prevent garbage collection of the instances they are bound to.
- lisa.utils.resolve_dotted_name(name)[source]
Only resolve names where __qualname__ == __name__, i.e. the callable is a module-level name.
- lisa.utils.import_all_submodules(pkg, best_effort=False)[source]
Import all submodules of a given package.
- Parameters:
pkg (types.ModuleType) – Package to import.
best_effort (bool) – If
True
, modules in the hierarchy that cannot be imported will be silently skipped.
- lisa.utils.docstring_update(msg)[source]
Create a class to inherit from in order to add a snippet of doc at the end of the docstring of all direct and indirect subclasses:

class C(docstring_update('world')):
    "hello"

assert C.__doc__ == 'hello\nworld'
- class lisa.utils.Serializable[source]
Bases:
Loggable, _DocstringAppend
A helper class for YAML serialization/deserialization
The following YAML tags are supported on top of what YAML provides out of the box:
!call: call a Python callable with a mapping of arguments:

# will execute:
# package.module.Class(arg1='foo', arg2='bar', arg3=42)
# NB: there is no space after "call:"
!call:package.module.Class
arg1: foo
arg2: bar
arg3: 42

!include: include the content of another YAML file. Environment variables are expanded in the given path:

!include /foo/$ENV_VAR/bar.yml

Relative paths are treated as relative to the file in which the !include tag appears.
!include-untrusted: similar to !include, but will disable custom tag interpretation when loading the content of the file. This is suitable for loading untrusted input. Note that the env var interpolation and the relative path behaviour depend on the mode of the YAML parser. This means that the path itself must be trusted, as this could leak environment variable values. Only the content of the included file is treated as untrusted.
!env: take the value of an environment variable and convert it to a Python type:

!env:int MY_ENV_VAR

If interpolate is used as the type, the value will be interpolated using os.path.expandvars() and the resulting string returned:

!env:interpolate /foo/$MY_ENV_VAR/bar

!var: reference a module-level variable:

!var package.module.var

!untrusted: interpret the given string as a YAML snippet, without any of the special constructors being enabled. This provides a way of safely including untrusted input in the YAML document, without running the risk of the user being able to use e.g. !call.

# Note the "|": this allows having a multiline string, leaving
# its interpretation to the untrusted loader.
!untrusted |
    foo: bar
Note
Not to be used on its own - instead, your class should inherit from this class to gain serialization superpowers.
Warning
Arbitrary code can be executed while loading an instance from a YAML or Pickle file. To include untrusted data in YAML, use the !untrusted tag along with a string
- ATTRIBUTES_SERIALIZATION = {'allowed': [], 'ignored': [], 'placeholders': {}}
- YAML_ENCODING = 'utf-8'
Encoding used for YAML files
- DEFAULT_SERIALIZATION_FMT = 'yaml'
Default format used when serializing objects
- classmethod from_path(filepath, fmt=None)[source]
Deserialize an object from a file
- Parameters:
- Raises:
AssertionError – if the deserialized object is not an instance of the class.
Note
Only deserialize files from trusted source, as both pickle and YAML formats can lead to arbitrary code execution.
- __getstate__()[source]
Filter the instance’s attributes upon serialization.
The following keys in
ATTRIBUTES_SERIALIZATION
can be used to customize the serialized content:allowed
: list of attribute names to serialize. All other attributes will be ignored and will not be saved/restored.ignored
: list of attribute names to not serialize. All other attributes will be saved/restored.placeholders
: Map of attribute names to placeholder values. These attributes will not be serialized, and the placeholder value will be used upon restoration.
If both allowed and ignored are specified, ignored is ignored.
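The interaction of these keys can be sketched with a standalone helper (hypothetical, not the actual LISA implementation) that filters an instance's __dict__ according to the rules above:

```python
# Hypothetical helper mirroring the documented filtering rules; the
# real logic lives inside __getstate__() itself.
def filter_state(dct, allowed=(), ignored=(), placeholders=None):
    placeholders = placeholders or {}
    if allowed:
        # "allowed" wins over "ignored" when both are specified
        state = {k: v for k, v in dct.items() if k in allowed}
    else:
        state = {k: v for k, v in dct.items() if k not in ignored}
    # Placeholder attributes are dropped from the serialized state; the
    # placeholder value is used upon restoration instead.
    return {k: v for k, v in state.items() if k not in placeholders}
```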
- lisa.utils.setup_logging(filepath='logging.conf', level=None)[source]
Initialize logging used for all the LISA modules.
- Parameters:
filepath (str) – the relative or absolute path of the logging configuration to use. Relative path uses
lisa.utils.LISA_HOME
as base folder.
level (int or str) – Override the conf file and force logging level. Defaults to
logging.INFO
.
- class lisa.utils.ArtifactPath(root, relative, *args, **kwargs)[source]
Bases:
str
,Loggable
,HideExekallID
Path to a folder that can be used to store artifacts of a function. This must be a clean folder, already created on disk.
- classmethod join(path1, path2)[source]
Join two paths together, similarly to
os.path.join()
. If
path1
is an ArtifactPath
, the result will also be one, and the root of path1
will be used as the root of the new path.
- lisa.utils.value_range(start, stop, step=None, nr_steps=None, inclusive=False, type_=None, clip=False)[source]
Equivalent to builtin
range
function, but works for floats as well.
- Parameters:
start (numbers.Number) – First value to use.
stop (numbers.Number) – Last value to use.
step (numbers.Number) – Mutually exclusive with
nr_steps
: increment. If None, increment defaults to 1.
nr_steps (int or None) – Mutually exclusive with
step
: number of steps.
inclusive (bool) – If
True
, the stop
value will be included (unlike the builtin range
)
type_ (collections.abc.Callable) – If specified, will be mapped on the resulting values.
clip (bool) – If
True
, the last value is set to stop
, rather than potentially being different if inclusive=True
.
Note
Unlike
range
, it will raise ValueError
if start > stop and step > 0
.
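To make these semantics concrete, here is a hypothetical re-implementation restricted to positive steps; the real lisa.utils.value_range() also supports nr_steps and clip, which are omitted here:

```python
# Illustrative sketch only: positive steps, no nr_steps/clip handling.
def value_range(start, stop, step=1, inclusive=False, type_=None):
    # Unlike the builtin range(), this case is an error
    if start > stop and step > 0:
        raise ValueError('start > stop with step > 0')
    vals = []
    x = start
    # "stop" is included when inclusive=True, unlike the builtin range()
    while x < stop or (inclusive and x <= stop):
        vals.append(type_(x) if type_ is not None else x)
        x += step
    return vals
```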
- lisa.utils.filter_values(iterable, values)[source]
Yield values from iterable unless they are in values.
- lisa.utils.groupby(iterable, key=None, reverse=False)[source]
Equivalent of
itertools.groupby()
, with a pre-sorting so it works as expected.
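itertools.groupby() only groups consecutive items, which is the usual surprise the pre-sorting avoids. A minimal sketch of the documented behaviour, assuming the pre-sort uses the same key as the grouping:

```python
import itertools

# Pre-sort with the same key so itertools.groupby() behaves like a
# SQL-style GROUP BY instead of only grouping consecutive runs.
def groupby(iterable, key=None, reverse=False):
    return itertools.groupby(
        sorted(iterable, key=key, reverse=reverse),
        key=key,
    )
```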
- lisa.utils.grouper(iterable, n, fillvalue=None)[source]
Collect data into fixed-length chunks or blocks
- lisa.utils.group_by_value(mapping, key_sort=<function <lambda>>)[source]
Group a mapping by its values
- Parameters:
mapping (collections.abc.Mapping or collections.abc.Sequence) – Mapping to reverse. If a sequence is passed, it is assumed to contain key/value subsequences.
key_sort (collections.abc.Callable) – The
key
parameter to a sorted()
call on the mapping keys
- Return type:
The idea behind this method is to “reverse” a mapping, in other words to create a new mapping that has the passed mapping’s values as keys. Since different keys can point to the same value, the new values will be lists of old keys.
Example:
>>> group_by_value({0: 42, 1: 43, 2: 42})
OrderedDict([(42, [0, 2]), (43, [1])])
- lisa.utils.deduplicate(seq, keep_last=True, key=<function <lambda>>)[source]
Deduplicate items in the given sequence and return a list.
- Parameters:
seq (collections.abc.Sequence) – Sequence to deduplicate.
key (collections.abc.Callable) – Key function that will be used to determine duplication. It takes one item at a time, returning a hashable key value.
keep_last (bool) – If True, keep the last occurrence of each duplicated item. Otherwise, keep the first occurrence.
- lisa.utils.order_as(items, order_as, key=None)[source]
Reorder the iterable of items to match the sequence in order_as. Items present in items and not in order_as will be appended at the end, in appearance order.
- Parameters:
key (collections.abc.Callable) – If provided, will be called on each item of
items
before being compared to order_as
to determine the order.
- lisa.utils.fold(f, xs, init=None)[source]
Fold the given function over
xs
, with init as the initial accumulator value.
This is very similar to
functools.reduce()
, except that it is not assumed that the function returns values of the same type as the item type.
This means that this function enforces non-empty input.
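The difference from functools.reduce() can be sketched as follows (illustrative only, not the LISA implementation): the accumulator type is free to differ from the item type, so the initial value is threaded through explicitly:

```python
# Illustrative left fold: the accumulator and the items may have
# different types, e.g. folding integers into a string or a list.
def fold(f, xs, init=None):
    acc = init
    for x in xs:
        acc = f(acc, x)
    return acc
```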
- lisa.utils.foldr(f, xs, init=None)[source]
Right-associative version of
fold()
.Note
This requires reversing xs. If the iterator does not support reversing, it will first be converted to a tuple.
- lisa.utils.is_monotonic(iterable, decreasing=False)[source]
Return
True
if the given sequence is monotonic,False
otherwise.- Parameters:
decreasing (bool) – If
True
, check that the sequence is decreasing rather than increasing.
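A sketch of the documented check, assuming non-strict monotonicity (equal neighbours allowed); the real implementation may differ:

```python
# Pairwise comparison of neighbouring items; equal values are accepted.
def is_monotonic(iterable, decreasing=False):
    items = list(iterable)
    pairs = list(zip(items, items[1:]))
    if decreasing:
        return all(a >= b for a, b in pairs)
    return all(a <= b for a, b in pairs)
```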
- lisa.utils.fixedpoint(f, init, limit=None)[source]
Find the fixed point of a function
f
with the initial parameter init.
- Parameters:
limit (int or None) – If provided, set a limit on the number of iterations.
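The iteration scheme can be sketched as follows (hypothetical, assuming convergence is detected by equality of successive values):

```python
# Apply f repeatedly until the value stops changing, or until "limit"
# iterations have been performed.
def fixedpoint(f, init, limit=None):
    x = init
    iterations = 0
    while limit is None or iterations < limit:
        new = f(x)
        if new == x:
            return x
        x = new
        iterations += 1
    return x
```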
- lisa.utils.get_common_prefix(*iterables)[source]
Return the common prefix of the passed iterables as an iterator.
- lisa.utils.take(n, iterable)[source]
Yield the first n items of an iterator if n is positive, or the last items otherwise.
Yield nothing if the iterator is empty.
- lisa.utils.consume(n, iterator)[source]
Advance the iterator n-steps ahead. If
n
is None, consume entirely.
- lisa.utils.unzip_into(n, iterator)[source]
Unzip a given
iterator
into n variables.
Example:
orig_a = [1, 3]
orig_b = [2, 4]
a, b = unzip_into(2, zip(orig_a, orig_b))
assert a == orig_a
assert b == orig_b
Note
n
is needed in order to properly handle the case where an empty iterator is passed.
- lisa.utils.get_nested_key(mapping, key_path, getitem=<built-in function getitem>)[source]
Get a key in a nested mapping
- Parameters:
mapping (collections.abc.Mapping) – The mapping to lookup in
key_path (list) – Path to the key in the mapping, in the form of a list of keys.
getitem (collections.abc.Callable) – Function used to get items on the mapping. Defaults to
operator.getitem()
.
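The lookup simply walks the nested mapping one key at a time. A minimal sketch mirroring the documented signature:

```python
import operator

# Walk the mapping following each key in key_path, using the provided
# getitem function at every level.
def get_nested_key(mapping, key_path, getitem=operator.getitem):
    val = mapping
    for key in key_path:
        val = getitem(val, key)
    return val
```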
- lisa.utils.set_nested_key(mapping, key_path, val, level=None)[source]
Set a key in a nested mapping
- Parameters:
mapping (collections.abc.MutableMapping) – The mapping to update
key_path (list) – Path to the key in the mapping, in the form of a list of keys.
level (collections.abc.Callable) – Factory used when a new level needs to be created. By default,
type(mapping)
will be called without any parameter.
- lisa.utils.loopify(items)[source]
Try to factor an iterable into a prefix that is repeated a number of times.
Returns a tuple
(N, prefix)
withN
such thatN * prefix == list(items)
.
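The factoring can be sketched by trying prefix lengths that divide the total length (illustrative, not the LISA implementation):

```python
# Return (N, prefix) such that N * prefix == list(items), preferring
# the shortest prefix (i.e. the largest repetition count).
def loopify(items):
    items = list(items)
    n = len(items)
    for plen in range(1, n + 1):
        if n % plen == 0 and items[:plen] * (n // plen) == items:
            return (n // plen, items[:plen])
    return (1, items)
```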
- lisa.utils.get_call_site(levels=0, exclude_caller_module=False)[source]
Get the location of the source that called that function.
- Returns:
A (caller, filename, lineno) tuple. Any component can be None if nothing was found. caller is a string containing the function name.
- Parameters:
levels (int) – How many levels to look at in the stack
exclude_caller_module – Return the first function in the stack that is not defined in the same module as the direct caller of
get_call_site()
.
Warning
This function will exclude all source files that are not part of the lisa package. It will also exclude functions of the
lisa.utils
module.
- lisa.utils.is_running_sphinx()[source]
Returns True if the module is imported when Sphinx is running, False otherwise.
- lisa.utils.is_running_ipython()[source]
Returns True if running in IPython console or Jupyter notebook, False otherwise.
- lisa.utils.non_recursive_property(f)[source]
Create a property that raises an
AttributeError
if it is re-entered.Note
This only guards against single-thread accesses; it is not thread-safe.
- lisa.utils.get_short_doc(obj, strip_rst=False)[source]
Get the short documentation paragraph at the beginning of docstrings.
- Parameters:
strip_rst (bool) – If
True
, remove reStructuredText markup.
- lisa.utils.optional_kwargs(func)[source]
Decorator used to allow another decorator to be used both with keyword parameters when called, and bare (without being called):
@optional_kwargs
def decorator(func, xxx=42):
    ...

# Both of these work:

@decorator
def foo(...):
    ...

@decorator(xxx=42)
def foo(...):
    ...
Note
This only works for keyword parameters.
Note
When decorating classmethods,
optional_kwargs()
must be above@classmethod
so it can handle it properly.
- lisa.utils.update_params_from(f, ignore=None)[source]
Decorator to update the signature of the decorated function using annotations and default values from the specified
f
function.
If the parameter already has a default value, it will be used instead of being copied over. Same goes for annotations.
- lisa.utils.kwargs_forwarded_to(f, ignore=None)[source]
Similar to
functools.wraps()
, except that it will only fixup the signature.The signature is modified in the following way:
Variable keyword parameters are removed
All the parameters that
f
takes are added as keyword-only in the decorated function’s signature, under the assumption that
in the decorated function is used to relay the parameters tof
.
Example:
def f(x, y, z):
    pass

@kwargs_forwarded_to(f)
def g(z, **kwargs):
    f(**kwargs)
    return z

# The signature of g() is now "(z, *, x, y)", i.e. x and y are
# keyword-only.
- lisa.utils.update_wrapper_doc(func, added_by=None, sig_from=None, description=None, remove_params=None, include_kwargs=False)[source]
Equivalent to
functools.wraps()
that updates the signature by taking into account the wrapper’s extra keyword-only parameters and the given description.- Parameters:
func (collections.abc.Callable) – callable to decorate
added_by (collections.abc.Callable or str or None) – Add some kind of reference to give a sense of where the new behaviour of the wrapped function comes from.
sig_from (collections.abc.Callable) – By default, the signature containing the added parameters will be taken from
func
. This allows overriding that, in casefunc
is just a wrapper around something else.
description (str or None) – Extra description output in the docstring.
remove_params (list(str) or None) – Set of parameter names of
func
to not include in the decorated function signature. This can be used to hide parameters that are only used as part of a decorated/decorator protocol, and not exposed in the final decorated function.
include_kwargs (bool) – If True, variable keyword parameter (
**kwargs
) of the decorator is kept in the signature. It is usually removed, since it’s mostly used to transparently forward arguments to the inner func
, but can also serve other purposes.
Note
functools.wraps()
is applied by this decorator, which will not work if you applied it yourself.
- lisa.utils.sig_bind(sig, args, kwargs, partial=True, include_defaults=True)[source]
Similar to
inspect.Signature.bind()
but expands variable keyword arguments so that the resulting dictionary can be used directly in a function call.
- The function returns a tuple
(kwargs, missing)
with: missing
a set of the missing mandatory parameters.kwargs
a dictionary of parameter names to values, ready to be used to call a function.
- Parameters:
sig (inspect.Signature) – Signature to extract parameters from.
partial (bool) – If
True
, behave likeinspect.Signature.bind_partial()
. Otherwise, behave likeinspect.Signature.bind()
.include_defaults (bool) – If
True
, the returnedkwargs
will include the default values.
- lisa.utils.dispatch_kwargs(funcs, kwargs, call=True, allow_overlap=False)[source]
Dispatch the provided
kwargs
mapping to the funcs functions, based on their signature.
- Parameters:
funcs (list(collections.abc.Callable)) – List of functions to dispatch to.
kwargs (dict(str, object)) – Dictionary of arguments to pass to the functions.
call (bool) – If
True
, the functions are called and the return value is a {f: result} mapping, with f the functions of funcs. If False, the result is just a mapping of arguments ready to be used to call the given function.
allow_overlap (bool) – If
False, the provided functions are not allowed to have overlapping parameters. If they do, a TypeError is raised.
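The dispatch can be sketched with inspect.signature() (hypothetical sketch; the allow_overlap check of the real function is omitted here):

```python
import inspect

# Route each keyword argument to the functions whose signature accepts
# it, then either call them or return the per-function kwargs.
def dispatch_kwargs(funcs, kwargs, call=True):
    out = {}
    for f in funcs:
        params = inspect.signature(f).parameters
        f_kwargs = {k: v for k, v in kwargs.items() if k in params}
        out[f] = f(**f_kwargs) if call else f_kwargs
    return out
```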
- lisa.utils.kwargs_dispatcher(f_map, ignore=None, allow_overlap=True)[source]
Decorate a function so that it acts as an argument dispatcher between multiple other functions.
- Parameters:
f_map (dict(collections.abc.Callable, str) or list(collections.abc.Callable)) – Mapping of functions to name of the parameter that will receive the collected parameters.
None
values will be replaced with f'{f.__name__}_kwargs' for convenience. If passed a non-mapping iterable, it will be transformed into {f: None, f2: None, ...}
.
ignore (list(str) or None) – Set of parameters to ignore in the
f_map
functions. They will not be added to the signature and will not be collected.
allow_overlap (bool) – If
True
, the functions in f_map are allowed to have overlapping parameters. If False, a TypeError will be raised if there is any overlap.
Example:
def f(x, y):
    print('f', x, y)

def g(y, z):
    print('g', y, z)

# f_kwargs will receive a dict of parameters to pass to "f", same for
# g_kwargs.
# The values will also be passed to the function directly in x, y and z
# parameters.
@kwargs_dispatcher({f: "f_kwargs", g: "g_kwargs"})
def h(x, y, f_kwargs, g_kwargs, z):
    print('f', f_kwargs)
    print('g', g_kwargs)
    print(x, y, z)

h(y=2, x=1, z=3)
h(1, 2, 3)
h(1, y=2, z=3)
- lisa.utils.DEPRECATED_MAP = {'lisa.analysis.base.AnalysisHelpers.cycle_colors': {'deprecated_in': '2.0', 'msg': 'Made irrelevant by the use of holoviews', 'obj': <function AnalysisHelpers.cycle_colors>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.analysis.base.AnalysisHelpers.do_plot': {'deprecated_in': '2.0', 'msg': 'Made irrelevant by the use of holoviews', 'obj': <function AnalysisHelpers.do_plot>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.analysis.base.AnalysisHelpers.get_next_color': {'deprecated_in': '2.0', 'msg': 'Made irrelevant by the use of holoviews', 'obj': <function AnalysisHelpers.get_next_color>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.analysis.base.AnalysisHelpers.set_axis_cycler': {'deprecated_in': '2.0', 'msg': 'Made irrelevant by the use of holoviews', 'obj': <function AnalysisHelpers.set_axis_cycler>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.analysis.base.AnalysisHelpers.set_axis_rc_params': {'deprecated_in': '2.0', 'msg': 'Made irrelevant by the use of holoviews', 'obj': <function AnalysisHelpers.set_axis_rc_params>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.analysis.base.AnalysisHelpers.setup_plot': {'deprecated_in': '2.0', 'msg': 'Made irrelevant by the use of holoviews', 'obj': <function AnalysisHelpers.setup_plot>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.analysis.load_tracking.LoadTrackingAnalysis.df_cpus_signals': {'deprecated_in': '2.0', 'msg': None, 'obj': <function LoadTrackingAnalysis.df_cpus_signals>, 'removed_in': (4, 0), 'replaced_by': <function LoadTrackingAnalysis.df_cpus_signal>}, 'lisa.analysis.load_tracking.LoadTrackingAnalysis.df_tasks_signals': {'deprecated_in': '2.0', 'msg': None, 'obj': <function LoadTrackingAnalysis.df_tasks_signals>, 'removed_in': (4, 0), 'replaced_by': <function LoadTrackingAnalysis.df_tasks_signal>}, 'lisa.analysis.tasks.TasksAnalysis.get_task_by_pid': {'deprecated_in': '2.0', 'msg': 'This function raises exceptions when faced with ambiguity instead 
of giving the choice to the user', 'obj': <function TasksAnalysis.get_task_by_pid>, 'removed_in': (4, 0), 'replaced_by': <function TasksAnalysis.get_task_pid_names>}, 'lisa.analysis.tasks.TasksAnalysis.get_task_pid': {'deprecated_in': '2.0', 'msg': None, 'obj': <function TasksAnalysis.get_task_pid>, 'removed_in': (4, 0), 'replaced_by': <function TasksAnalysis.get_task_id>}, 'lisa.analysis.tasks.TasksAnalysis.plot_task_activation': {'deprecated_in': '2.0', 'msg': 'Deprecated since it does not provide anything more than plot_tasks_activation', 'obj': <function TasksAnalysis.plot_task_activation>, 'removed_in': (4, 0), 'replaced_by': <function TasksAnalysis.plot_tasks_activation>}, 'lisa.datautils.SignalDesc.from_event': {'deprecated_in': '3.0', 'msg': 'No new signals will be added to this list, use explicit signal description where appropriate in the Trace API', 'obj': <function SignalDesc.from_event>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.datautils.series_tunnel_mean': {'deprecated_in': '2.0', 'msg': None, 'obj': <function series_tunnel_mean>, 'removed_in': (4, 0), 'replaced_by': <function series_envelope_mean>}, 'lisa.energy_meter.ACME': {'deprecated_in': '2.0', 'msg': 'LISA energy meters are deprecated, please use devlib instruments or contribute the instrument to devlib', 'obj': <class 'lisa.energy_meter.ACME'>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.energy_meter.AEP': {'deprecated_in': '2.0', 'msg': 'LISA energy meters are deprecated, please use devlib instruments or contribute the instrument to devlib', 'obj': <class 'lisa.energy_meter.AEP'>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.energy_meter.Gem5EnergyMeter': {'deprecated_in': '2.0', 'msg': 'LISA energy meters are deprecated, please use devlib instruments or contribute the instrument to devlib', 'obj': <class 'lisa.energy_meter.Gem5EnergyMeter'>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.energy_meter.HWMon': {'deprecated_in': '2.0', 'msg': 'LISA energy meters are 
deprecated, please use devlib instruments or contribute the instrument to devlib', 'obj': <class 'lisa.energy_meter.HWMon'>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.energy_meter.Monsoon': {'deprecated_in': '2.0', 'msg': 'LISA energy meters are deprecated, please use devlib instruments or contribute the instrument to devlib', 'obj': <class 'lisa.energy_meter.Monsoon'>, 'removed_in': (4, 0), 'replaced_by': None}, 'lisa.energy_model.EnergyModel.from_debugfsEM_target': {'deprecated_in': '2.0', 'msg': None, 'obj': <staticmethod(<function EnergyModel.from_debugfsEM_target>)>, 'removed_in': (4, 0), 'replaced_by': 'lisa.energy_model.LinuxEnergyModel.from_target'}, 'lisa.energy_model.EnergyModel.from_sd_target': {'deprecated_in': '2.0', 'msg': None, 'obj': <staticmethod(<function EnergyModel.from_sd_target>)>, 'removed_in': (4, 0), 'replaced_by': 'lisa.energy_model.LegacyEnergyModel.from_target'}, 'lisa.fuzz.Gen.lift': {'deprecated_in': '2.0', 'msg': 'Note that GenMonad.do() will not automatically await on arguments if they are Gen instances, this must be done manually.', 'obj': <function Gen.lift>, 'removed_in': (4, 0), 'replaced_by': <bound method MonadTrans.do of <class 'lisa.fuzz.GenMonad'>>}, 'lisa.trace.CollectorBase.get_trace': {'deprecated_in': '2.0', 'msg': None, 'obj': <function CollectorBase.get_trace>, 'removed_in': (4, 0), 'replaced_by': <function CollectorBase.get_data>}, 'lisa.trace.TraceBase.df_events': {'deprecated_in': '2.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.df_events>, 'removed_in': (4, 0), 'replaced_by': 'df_event'}, 'lisa.trace.TraceBase.get_task_by_name': {'deprecated_in': '2.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_task_by_name>, 'removed_in': (4, 0), 'replaced_by': <function TraceBase.get_task_name_pids>}, 'lisa.trace.TraceBase.get_task_by_pid': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': 
<function TraceBase.get_task_by_pid>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_task_by_pid'}, 'lisa.trace.TraceBase.get_task_id': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_task_id>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_task_id'}, 'lisa.trace.TraceBase.get_task_ids': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_task_ids>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_task_ids'}, 'lisa.trace.TraceBase.get_task_name_pids': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_task_name_pids>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_task_name_pids'}, 'lisa.trace.TraceBase.get_task_pid': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_task_pid>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_task_pid'}, 'lisa.trace.TraceBase.get_task_pid_names': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_task_pid_names>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_task_pid_names'}, 'lisa.trace.TraceBase.get_tasks': {'deprecated_in': '3.0', 'msg': 'This method has been deprecated and is an alias', 'obj': <function TraceBase.get_tasks>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.get_tasks'}, 'lisa.trace.TraceBase.task_ids': {'deprecated_in': '3.0', 'msg': 'This property has been deprecated and is an alias', 'obj': <property object>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.tasks.TasksAnalysis.task_ids'}, 'lisa.trace._InternalTraceBase.add_events_deltas': {'deprecated_in': '2.0', 'msg': 'Prefer adding delta once signals 
have been extracted from the event dataframe for correctness', 'obj': <function _InternalTraceBase.add_events_deltas>, 'removed_in': (4, 0), 'replaced_by': <function df_add_delta>}, 'lisa.trace._InternalTraceBase.df_all_events': {'deprecated_in': '2.0', 'msg': 'This method has been deprecated and is an alias for "trace.ana.notebook.df_all_events()"', 'obj': <function _InternalTraceBase.df_all_events>, 'removed_in': (4, 0), 'replaced_by': 'lisa.analysis.notebook.NotebookAnalysis.df_all_event'}, 'lisa.wa_results_collector.WaResultsCollector': {'deprecated_in': '2.0', 'msg': None, 'obj': <class 'lisa.wa_results_collector.WaResultsCollector'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wa.WAOutput'>}, 'lisa.wlgen.rta.Periodic': {'deprecated_in': '2.0', 'msg': 'Replaced by :class:`lisa.wlgen.rta.RTAPhase` along with :class:`lisa.wlgen.rta.PeriodicWload` workload', 'obj': <class 'lisa.wlgen.rta.Periodic'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wlgen.rta.RTAPhase'>}, 'lisa.wlgen.rta.Pulse': {'deprecated_in': '2.0', 'msg': None, 'obj': <class 'lisa.wlgen.rta.Pulse'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wlgen.rta.RTAPhase'>}, 'lisa.wlgen.rta.RTA.by_profile': {'deprecated_in': '2.0', 'msg': None, 'obj': <classmethod(<function RTA.by_profile>)>, 'removed_in': (4, 0), 'replaced_by': <lisa.utils.PartialInit._PartialFactory object>}, 'lisa.wlgen.rta.RTA.by_str': {'deprecated_in': '2.0', 'msg': None, 'obj': <classmethod(<function RTA.by_str>)>, 'removed_in': (4, 0), 'replaced_by': <lisa.utils.PartialInit._PartialFactory object>}, 'lisa.wlgen.rta.RTATask': {'deprecated_in': '2.0', 'msg': None, 'obj': <class 'lisa.wlgen.rta.RTATask'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wlgen.rta.RTAPhase'>}, 'lisa.wlgen.rta.Ramp': {'deprecated_in': '2.0', 'msg': None, 'obj': <class 'lisa.wlgen.rta.Ramp'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wlgen.rta.DutyCycleSweepPhase'>}, 'lisa.wlgen.rta.RunAndSync': {'deprecated_in': '2.0', 'msg': 
'Replaced by :class:`lisa.wlgen.rta.RTAPhase` along with :class:`lisa.wlgen.rta.RunWload` and :class:`lisa.wlgen.rta.BarrierWload` workloads', 'obj': <class 'lisa.wlgen.rta.RunAndSync'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wlgen.rta.RTAPhase'>}, 'lisa.wlgen.rta.Step': {'deprecated_in': '2.0', 'msg': None, 'obj': <class 'lisa.wlgen.rta.Step'>, 'removed_in': (4, 0), 'replaced_by': <class 'lisa.wlgen.rta.DutyCycleSweepPhase'>}, 'lisa.wlgen.workload.Workload.output': {'deprecated_in': '2.0', 'msg': 'Processed output is returned by run() or by the ".output" attribute of the value returned by the run_background() context manager', 'obj': <function Workload.output>, 'removed_in': (4, 0), 'replaced_by': None}}
Global dictionary of deprecated classes, functions and so on.
- lisa.utils.deprecate(msg=None, replaced_by=None, deprecated_in=None, removed_in=None, parameter=None)[source]
Mark a class, method, function etc as deprecated and update its docstring.
- Parameters:
msg (str or None) – Message to tell more about this deprecation.
replaced_by (object) – Other object the deprecated object is replaced by.
deprecated_in (str) – Version in which the object was flagged as deprecated.
removed_in (str) – Version in which the deprecated object will be removed.
parameter (str or None) – If not
None
, the deprecation will only apply to the usage of the given parameter. The relevant:param:
block in the docstring will be updated, and the deprecation warning will be emitted anytime a caller gives a value to that parameter (default or not).
Note
In order to decorate all the accessors of properties, apply the decorator once the property is fully built:
class C:
    @property
    def foo(self):
        pass

    @foo.setter
    def foo(self, val):
        pass

    # Once all getters/setter/deleters are set, apply the decorator
    foo = deprecate()(foo)
- lisa.utils.show_doc(obj, iframe=False)[source]
Show the online LISA documentation about the given object.
- lisa.utils.split_paragraphs(string)[source]
Split string into a list of paragraphs.
A paragraph is delimited by empty lines, or lines containing only whitespace characters.
- lisa.utils.guess_format(path)[source]
Guess the file format from a path, using the mime types database.
- lisa.utils.ignore_exceps(exceps, cm, callback=None)[source]
Wrap a context manager and handle exceptions raised in
__enter__()
and __exit__()
.- Parameters:
exceps (BaseException or tuple(BaseException)) – Tuple of exceptions to catch.
Note
If the
__enter__()
method failed, __exit__()
will not be called.
- exception lisa.utils.ContextManagerExit[source]
Bases:
Exception
Dummy exception raised in the generator wrapped by
destroyablecontextmanager()
when anything other than GeneratorExit
happened during yield
.
- exception lisa.utils.ContextManagerExcep(e)[source]
Bases:
ContextManagerExit
Exception raised when an exception was raised during
yield
in a context manager created with destroyablecontextmanager()
. The
e
attribute holds the original exception.
- exception lisa.utils.ContextManagerNoExcep[source]
Bases:
ContextManagerExit
Exception raised when no exception was raised during
yield
in a context manager created with destroyablecontextmanager()
.
- exception lisa.utils.ContextManagerDestroyed[source]
Bases:
GeneratorExit
Exception raised in context managers created by
destroyablecontextmanager()
when no exception was raised during yield
per se, but the context manager was destroyed without calling __exit__
.
- lisa.utils.destroyablecontextmanager(f)[source]
Similar to
contextlib.contextmanager()
but treats all cases of yield
as an exception.
This forces the user to handle them as such, and makes it more apparent that the
finally
clause in try/yield/finally
also catches the case where the context manager is simply destroyed.
The user can handle
ContextManagerExit
to run cleanup code regardless of exceptions, but not when the context manager is simply destroyed without calling __exit__()
(standard behavior of context managers not created with contextlib.contextmanager()
).
Handling exceptions is achieved by handling
ContextManagerExcep
, with the original exception stored in the e
attribute.
Handling destruction is achieved with
ContextManagerDestroyed
.
Unlike
contextlib.contextmanager()
and like a normal __exit__()
, swallowing exceptions is achieved by returning a truthy value. If a falsy value is returned, destroyablecontextmanager()
will re-raise the exception as appropriate.
- class lisa.utils.ExekallTaggable[source]
Bases:
object
Allows tagging the objects produced in exekall expression IDs.
- lisa.utils.annotations_from_signature(sig)[source]
Build a PEP484
__annotations__
dictionary from an inspect.Signature
.
- lisa.utils.namedtuple(*args, module, **kwargs)[source]
Same as
collections.namedtuple()
, withcollections.abc.Mapping
behaviour.Warning
Iterating over instances will yield the field names rather than the values, unlike regular
collections.namedtuple()
.- Parameters:
module (str) – Name of the module the type is defined in.
- lisa.utils.measure_time(clock=<built-in function monotonic>)[source]
Context manager to measure time in seconds.
- Parameters:
clock (collections.abc.Callable) – Clock to use.
Example:
with measure_time() as measure:
    ...
print(measure.start, measure.stop, measure.delta, measure.exclusive_delta)
Note
The
exclusive_delta
discounts the time spent in nested measure_time
context managers.
- lisa.utils.checksum(file_, method)[source]
Compute a checksum on a given file-like object.
- Parameters:
The file is read block by block to avoid clogging the memory with a huge read.
- lisa.utils.get_sphinx_name(obj, style=None, abbrev=False)[source]
Get a Sphinx-friendly name of an object.
- lisa.utils.newtype(cls, name, doc=None, module=None)[source]
Make a new class inheriting from
cls
with the given name
.- Parameters:
The instances of
cls
class will be recognized as instances of the new type as well when using isinstance
.
- class lisa.utils.FrozenDict(x, deepcopy=True, type_=<class 'dict'>)[source]
Bases:
Mapping
Read-only mapping that is therefore hashable.
- Parameters:
deepcopy (bool) – If
True
, a deepcopy of the input will be done after applying type_
.type (collections.abc.Callable) – Called on the input to provide a suitable mapping, so that the input can be any iterable.
Note
The content of the iterable passed to the constructor is deepcopied to ensure non-mutability.
Note
Hashability allows using it as a key in other mappings.
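Why hashability matters can be shown with a simplified stand-in (not the real FrozenDict, which also handles deepcopy and arbitrary iterables):

```python
# Minimal read-only mapping: hashable, so usable as a key in a dict.
class FrozenDict:
    def __init__(self, x):
        self._d = dict(x)

    def __getitem__(self, key):
        return self._d[key]

    def __hash__(self):
        # Hash the frozen set of items; requires hashable values.
        return hash(frozenset(self._d.items()))

    def __eq__(self, other):
        return isinstance(other, FrozenDict) and self._d == other._d
```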
- class lisa.utils.SimpleHash[source]
Bases:
object
Base class providing a basic implementation of
__eq__
and __hash__
: two instances are equal if their __dict__
and __class__
attributes are equal.
- HASH_COERCE(x, coerce)[source]
Used to coerce the values of
self.__dict__
to hashable values.
- Parameters:
x (object) – the value to coerce to a hashable value
coerce (collections.abc.Callable) –
Function to be used to recurse, rather than
self.HASH_COERCE
. This takes care of memoization to avoid infinite recursion.
Attention
The
coerce
function should only be called on values that will be alive after the call has ended, i.e. it can only be passed parts of the x
structure. If temporary objects are passed, memoization will not work as it relies on id()
, which is only guaranteed to provide unique ID to objects that are alive.
- class lisa.utils.PartialInit(*args, **kwargs)[source]
Bases:
object
Allow partial initialization of instances with curry-like behaviour for the constructor.
Subclasses will be able to be used in this way:
class Sub(PartialInit):
    def __init__(self, x, y):
        self.x = x
        self.y = y

    # This decorator allows the classmethod to be partially applied as
    # well.
    #
    # Note: since PartialInit relies on accurate signatures, **kwargs
    # cannot be used, unless the signature is patched-up with something
    # like lisa.utils.kwargs_forwarded_to()
    @PartialInit.factory
    def factory(cls, x, y):
        return cls(x=x, y=y)

# Bind x=1
# Sub.__init__ not yet called
obj = Sub(1)

# Provide a value for "y", which will trigger a call to the
# user-provided __init__
obj = obj(y=2)

# Make a new instance with value of x=42, and all the other parameters
# being the same as what was provided to build "obj"
obj2 = obj(x=42)
Note
Any attribute access on a partially initialized instance will result in a
TypeError
exception.
- class lisa.utils.ComposedContextManager(cms)[source]
Bases:
object
Compose context managers together.
- Parameters:
cms (list(contextlib.AbstractContextManager or collections.abc.Callable)) – Context manager factories to compose. Each item can either be a context manager already, or a function that will be called to produce one.
Example:
with ComposedContextManager([cm1, cm2]):
    ...

# Equivalent to
with cm1() as _cm1:
    with cm2() as _cm2:
        ...
- lisa.utils.chain_cm(*fs)[source]
Chain the context managers returned by the given callables.
This is equivalent to:
@contextlib.contextmanager
def combined(x):
    fs = list(reversed(fs))
    with fs[0](x) as y:
        with fs[1](y) as z:
            with fs[2](z) as ...:
                ...
                with ... as final:
                    yield final
It is typically used instead of regular function composition when the functions return context managers:
@contextlib.contextmanager
def f(a, b):
    print(f'f a={a} b={b}')
    yield a * 2

@contextlib.contextmanager
def g(x):
    print(f'g x={x}')
    yield f'final x={x}'

combined = chain_cm(g, f)
with combined(a=1, b=2) as res:
    print(res)

# Would print:
# f a=1 b=2
# g x=2
# final x=2
- class lisa.utils.DirCache(category, populate)[source]
Bases:
Loggable
Provide a folder-based cache.
- Parameters:
category (str) – Unique name for the cache category. This allows an arbitrary number of categories to be used under
lisa.utils.LISA_CACHE_HOME
.populate (collections.abc.Callable) –
Callback to populate a new cache entry if none is found. It will be passed the following parameters:
The key that is being looked up
The path to populate
It must return a subfolder of the passed path to populate, or
None
, which is the same as returning the passed path.
The cache is managed in a process-safe way, so that there can be no race between concurrent processes or threads.
- has_key(key)[source]
Check if the given
key
is already present in the cache. If the key is present, return the path, otherwise return None
.- Parameters:
key – Same as for
get_entry()
.
- get_entry(key)[source]
Return the folder of a cache entry.
If no entry is found, a new one is created using the
populate()
callback.- Parameters:
key (object) –
Key of the cache entry. All the components of the key must be isomorphic to their
repr()
, otherwise the cache will be hit in cases where it should not. For convenience, some types are normalized:Mapping
is only considered for its keys, values and type name. Keys are sorted. If the passed object contains other relevant metadata, it should be rendered to a string first by the caller.Iterable
keys are normalized and the object is only considered as an iterable. If other relevant metadata is contained in the object, it should be rendered to a string by the caller.
Note
The returned folder must never be modified, as it would lead to races.
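The populate/get_entry contract described above can be illustrated with a minimal, standalone sketch. The hypothetical _DirCacheSketch class below is not the real implementation: lisa.utils.DirCache additionally provides process-safe locking and richer key normalization.

```python
# Hypothetical sketch of the DirCache populate/get_entry contract.
# The real lisa.utils.DirCache also provides locking and key normalization.
import hashlib
import os
import tempfile

class _DirCacheSketch:
    def __init__(self, category, populate):
        self._root = os.path.join(tempfile.mkdtemp(), category)
        os.makedirs(self._root, exist_ok=True)
        self._populate = populate

    def _key_path(self, key):
        # Keys are hashed via their repr(), hence the requirement that
        # keys be isomorphic to their repr()
        digest = hashlib.sha256(repr(key).encode()).hexdigest()
        return os.path.join(self._root, digest)

    def has_key(self, key):
        path = self._key_path(key)
        return path if os.path.exists(path) else None

    def get_entry(self, key):
        path = self._key_path(key)
        if not os.path.exists(path):
            os.makedirs(path)
            # populate() may return a subfolder of "path", or None to
            # mean the passed path itself
            path = self._populate(key, path) or path
        return path

calls = []
def populate(key, path):
    calls.append(key)
    with open(os.path.join(path, 'data.txt'), 'w') as f:
        f.write(repr(key))

cache = _DirCacheSketch('demo', populate)
p1 = cache.get_entry(('answer', 42))
p2 = cache.get_entry(('answer', 42))  # second lookup is a cache hit
```

Note how populate() runs only on the first lookup of a given key; subsequent lookups return the already-populated folder.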
- lisa.utils.subprocess_log(cmd, level=None, name=None, logger=None, **kwargs)[source]
Similar to
subprocess.check_output()
but merges stdout and stderr and logs them using thelogging
module as it goes.- Parameters:
cmd (str or list(str)) – Command passed to
subprocess.Popen
level (int or None) – Log level to use (e.g.
logging.INFO
).name (str or None) – Name of the logger to use. Defaults to the beginning of the command.
logger (logging.Logger) – Logger to use.
- Variable keyword arguments:
Forwarded to
subprocess.Popen
.
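The merging behaviour can be sketched with plain subprocess primitives. This is a simplified illustration of the idea, not the actual implementation; the hypothetical _subprocess_log_sketch helper is only meant to show how stdout and stderr end up interleaved and logged line by line.

```python
# Hypothetical sketch of what subprocess_log() does conceptually:
# merge stderr into stdout and log each line as it is produced.
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO)

def _subprocess_log_sketch(cmd, level=logging.INFO, logger=None):
    logger = logger or logging.getLogger(cmd[0])
    lines = []
    with subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,  # merge the two streams
        text=True,
    ) as proc:
        for line in proc.stdout:
            line = line.rstrip('\n')
            lines.append(line)
            logger.log(level, line)
    if proc.returncode:
        raise subprocess.CalledProcessError(proc.returncode, cmd)
    return '\n'.join(lines)

out = _subprocess_log_sketch([
    sys.executable, '-c',
    'import sys; print("to stdout"); print("to stderr", file=sys.stderr)',
])
```

Since both streams feed the same pipe, their relative ordering depends on buffering, but every line from either stream is captured and logged.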
- class lisa.utils.SerializeViaConstructor(*args, **kwargs)[source]
Bases:
object
Base class providing serialization to objects that typically cannot due to unpicklable attributes.
This works by recording the constructor that was used and the parameters passed to it in order to recreate an equivalent object, under the assumption that the constructor arguments will be picklable.
Alternative constructors (e.g. classmethod) can be decorated with
SerializeViaConstructor.constructor()
in order to record the parameters passed to them if necessary.- classmethod constructor(f)[source]
Decorator to apply on alternative constructors if arguments passed to the class are not serializable, or if the alternative constructor makes necessary initialization.
Example:
class Foo(SerializeViaConstructor):
    def __init__(self, x):
        self.x = x

    @classmethod
    @SerializeViaConstructor.constructor
    def from_path(cls, path):
        return cls(x=open(path))
Note
This only works on classmethods. staticmethods are not supported.
- class lisa.utils.LazyMapping(*args, **kwargs)[source]
Bases:
Mapping
Lazy Mapping dict-like class for elements evaluated on the fly.
It takes the same set of arguments as a dict with keys as the mapping keys and values as closures that take a key and return the value. The class does no automatic memoization but memoization can easily be achieved using
functools.lru_cache()
, as shown in the example below.Example:
LazyMapping({
    x: lru_cache()(lambda k: k + 42)
    for x in [1, 2, 3, 4]
})
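For illustration, the semantics described above can be emulated with a minimal stand-in. The hypothetical _LazyMapping class below is not the actual implementation; it only demonstrates the "values are closures taking the key" contract.

```python
# Minimal stand-in for the LazyMapping semantics: each value is a closure
# called with the key on access, optionally memoized with lru_cache().
from collections.abc import Mapping
from functools import lru_cache

class _LazyMapping(Mapping):
    def __init__(self, closures):
        self._closures = dict(closures)

    def __getitem__(self, key):
        # Evaluate the closure on the fly, passing the key
        return self._closures[key](key)

    def __iter__(self):
        return iter(self._closures)

    def __len__(self):
        return len(self._closures)

m = _LazyMapping({
    x: lru_cache()(lambda k: k + 42)
    for x in [1, 2, 3]
})
```

Each key gets its own memoized closure, so repeated accesses of m[1] evaluate the closure only once.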
- lisa.utils.mp_spawn_pool(import_main=False, **kwargs)[source]
Create a context manager wrapping
multiprocessing.pool.Pool
using thespawn
method, which is safe even in multithreaded applications.- Parameters:
import_main (bool) – If
True
, let the spawned process import the__main__
module. This is usually not necessary when the functions executed in the pool are small and not importing__main__
saves a lot of time (actually, unbounded amount of time).- Variable keyword arguments:
Forwarded to
multiprocessing.pool.Pool
.
- lisa.utils.is_link_dead(url)[source]
Check if a link is dead. Returns a truthy value if it is dead, a falsy one otherwise.
- lisa.utils.subprocess_detailed_excep()[source]
Context manager that will replace
subprocess.CalledProcessError
by a subclass that shows more details.
Kernel modules
This module provides classes to build kernel modules from source on the fly.
Here is an example of such module:
import time
from lisa.target import Target
from lisa.trace import DmesgCollector
from lisa._kmod import KmodSrc
from lisa.utils import setup_logging
setup_logging()
target = Target(
kind='linux',
name='my_board',
host='192.158.1.38',
username='root',
password='root',
lazy_platinfo=True,
kernel_src='/path/to/kernel/tree/',
kmod_build_env='alpine',
)
# Example module from: https://tldp.org/LDP/lkmpg/2.6/html/x279.html
code = r'''
/*
* hello-4.c - Demonstrates module documentation.
*/
#include <linux/module.h> /* Needed by all modules */
#include <linux/kernel.h> /* Needed for KERN_INFO */
#include <linux/init.h> /* Needed for the macros */
#define DRIVER_AUTHOR "XXX"
#define DRIVER_DESC "A sample driver"
static int __init init_hello(void)
{
printk(KERN_INFO "Hello, world\n");
return 0;
}
static void __exit cleanup_hello(void)
{
printk(KERN_INFO "Goodbye, world\n");
}
module_init(init_hello);
module_exit(cleanup_hello);
/*
* You can use strings, like this:
*/
/*
* Get rid of taint message by declaring code as GPL.
*/
MODULE_LICENSE("GPL");
/*
* Or with defines, like this
*/
MODULE_AUTHOR(DRIVER_AUTHOR); /* Who wrote this module? */
MODULE_DESCRIPTION(DRIVER_DESC); /* What does this module do */
'''
# This object represents the kernel sources, and needs to be turned into a
# DynamicKmod to be compiled and run.
src = KmodSrc({'hello.c': code})
# Create a DynamicKmod from the target and the module sources.
kmod = target.get_kmod(src=src)
# Collect the dmesg output while running the module
dmesg_coll = DmesgCollector(target, output_path='dmesg.log')
# kmod.run() will compile the module, install it and then uninstall it at the
# end of the "with" statement.
with dmesg_coll, kmod.run():
time.sleep(1)
for entry in dmesg_coll.entries:
print(entry)
- exception lisa._kmod.KmodVersionError[source]
Bases:
Exception
Raised when the kernel module is not found with the expected version.
- class lisa._kmod.OverlayResource[source]
Bases:
ABC
Resource to be applied as an overlay in an existing folder.
- class lisa._kmod._FileOverlayBase[source]
Bases:
OverlayResource
Base class for file overlays.
- class lisa._kmod.FileOverlay[source]
Bases:
_FileOverlayBase
Overlay representing a file content.
- class lisa._kmod._PathOverlayBase[source]
Bases:
_FileOverlayBase
Base class for path-based overlays.
- class lisa._kmod._PathFileOverlay(path)[source]
Bases:
_PathOverlayBase
- class lisa._kmod._CompressedPathFileOverlay(path)[source]
Bases:
_PathOverlayBase
- class lisa._kmod._ContentFileOverlay(content)[source]
Bases:
_FileOverlayBase
- class lisa._kmod.TarOverlay(path)[source]
Bases:
_PathOverlayBase
The
__init__
constructor is considered private. Use factory classmethod to create instances.
- class lisa._kmod.PatchOverlay(overlay)[source]
Bases:
OverlayResource
Patch to be applied on an existing file.
- Parameters:
overlay (_FileOverlayBase) – Overlay providing the content of the patch.
- class lisa._kmod.KmodSrc(src, name=None)[source]
Bases:
Loggable
Sources of a kernel module.
- Parameters:
- property code_files
- property c_files
- property checksum
Checksum of the module’s sources & Makefile.
- property mod_name
Name of the module.
- property makefile
- compile(kernel_build_env, make_vars=None)[source]
Compile the module and return the
bytestring
content of the.ko
file.- Parameters:
kernel_build_env (_KernelBuildEnv) – kernel build env to build the module against.
make_vars (dict(str, object) or None) – Variables passed on
make
command line. This can be used for variables only impacting the module, otherwise it’s better to set them when creating thekernel_build_env
.
- exception lisa._kmod.CannotLoadModuleError[source]
Bases:
Exception
Raised when a kernel module cannot be loaded (or will not be loaded because of nasty side effects).
- class lisa._kmod.DynamicKmod(target, src, kernel_build_env=None)[source]
Bases:
Loggable
Dynamic kernel module that can be compiled on the fly by LISA.
- Parameters:
target (lisa.target.Target) – Target that will be used to load the module.
src (lisa._kmod.KmodSrc) – Sources of the module.
kernel_build_env (lisa._kmod._KernelBuildEnv) – Kernel source tree to use to build the module against.
- property mod_name
- classmethod from_target(target, **kwargs)[source]
Build a module from the given target. Use this constructor on subclasses rather than making assumptions on the signature of the class.
- Variable keyword arguments:
Forwarded to
__init__
.
- property kernel_build_env
- class lisa._kmod.FtraceDynamicKmod(target, src, kernel_build_env=None)[source]
Bases:
DynamicKmod
Dynamic module providing some custom ftrace events.
- property defined_events
Ftrace events defined in that module.
- property possible_events
Ftrace events possibly defined in that module.
Note that this is based on crude source code analysis so it’s expected to be a superset of the actually defined events.
- class lisa._kmod.LISADynamicKmod(target, src, kernel_build_env=None)[source]
Bases:
FtraceDynamicKmod
Module providing ftrace events used in various places by
lisa
.The kernel must be compiled with the following options in order for the module to be created successfully:
CONFIG_DEBUG_INFO=y
CONFIG_DEBUG_INFO_BTF=y
CONFIG_DEBUG_INFO_REDUCED=n
- classmethod from_target(target, **kwargs)[source]
Build a module from the given target. Use this constructor on subclasses rather than making assumptions on the signature of the class.
- Variable keyword arguments:
Forwarded to
__init__
.
Generic types
Generic types inspired by the typing
module.
- lisa._generic.check_type(x, classinfo)[source]
Equivalent of
isinstance()
that will also work with typing hints.
- lisa._generic.is_instance(obj, classinfo)[source]
Same as builtin
isinstance()
but works with type hints.
- lisa._generic.is_hint(obj)[source]
Heuristic to check if a given
obj
is a typing hint or anything else. This function will returnFalse
for classes.Warning
Since there is currently no way to identify hints for sure, the check might return
False
even if it is a hint.
- lisa._generic.hint_to_class(hint)[source]
Convert a typing hint to a class that will do a runtime check against the hint when
isinstance()
is used.
- class lisa._generic.SortedSequence[source]
Bases:
Generic
[T
],_TypeguardCustom
Same as
typing.List
but enforces sorted values when runtime checked usingtypeguard
.- __orig_bases__ = (typing.Generic[~T], <class 'lisa._generic._TypeguardCustom'>)
- __parameters__ = (~T,)
Typeclasses
This module provides a trait system, known as typeclasses in Haskell and Scala, and as traits in Rust.
The fundamental idea is to decouple the following:
1. the definition of an interface, as a set of methods to implement.
2. the implementation of the aforementioned methods for a given class.
3. the class definitions themselves.
Decoupling 2. and 3. allows providing an implementation of the interface for any type, including foreign types coming from other libraries, or even builtin types. This is the core benefit of typeclasses as opposed to regular classes in Object Oriented Programming: they allow extending existing types without having to modify their inheritance hierarchy.
Note
The names of the concepts are drawn from Haskell typeclasses:
typeclass: This is the description of an interface, as a set of mandatory methods to implement, and optionally helper functions with default implementations. It’s pretty close in concept to abstract base classes.
superclass: The mother typeclass of a given typeclass.
instance: This is the implementation of a given typeclass for a given (set of) type.
values: Values as opposed to types. Since instance is already used to refer to the implementation of a typeclass, we use the word value.
type: That is just a type, also known as class in Python.
Here is an example on how to work with typeclasses as provided by this module:
from lisa._typeclass import TypeClass
class FooBar(TypeClass):
"Foobar interface"
# TypeClass.required is an equivalent of abc.abstractmethod: it forces
# instances to implement a given set of methods
@TypeClass.required
def my_method(self):
pass
# This helper can be used in the implementation of the typeclass, and
# can be overridden by any instance.
def extra_helper(self):
return 42
class ARandomClass:
"Just a random class, it could be anything"
pass
# ``types`` can be either a tuple of types or a single type
class ARandomClassFooBarInstance(FooBar, types=(ARandomClass, int)):
"Implement the FooBar typeclass for both ARandomClass type and int at once."
def my_method(self):
return 'ARandomClass or int value'
value = ARandomClass()
# Both are equivalent
# The @ version is more useful when combining multiple typeclasses on the fly
value_as_foobar = FooBar(value)
value_as_foobar = value @ FooBar
# The in-place variant allows "casting" the value directly.
# These are all equivalent:
# value @= FooBar
# value = value @ FooBar
# value = FooBar(value)
# The typeclass machinery will dispatch the call to my_method() to the
# right implementation
value_as_foobar.my_method()
# We also implemented FooBar for int type
FooBar(3).my_method()
# Raises a TypeError, since there is no instance for float
FooBar(3.0).my_method()
# Add an instance of FooBar for float type
class FloatFooBarInstance(FooBar, types=float):
def my_method(self):
return 'float'
# Now works once we added the instance
FooBar(3.0).my_method()
Classmethods also work, so typeclasses can be used to define factory interfaces:
from lisa._typeclass import TypeClass
class FromString(TypeClass):
"Build a value by parsing a string"
@TypeClass.required
@classmethod
def from_str(cls, string):
pass
class IntFromStringInstance(FromString, types=int):
@classmethod
def from_str(cls, string):
# Although cls is a value of type TypeProxy, it can be called just
# like a regular class
return cls(string)
# Types can be cast just like values, so we can use the classmethods and
# the staticmethods on them as well
assert 33 == FromString(int).from_str('33')
A more advanced usage can involve a hierarchy of typeclasses that gets combined together:
from lisa._typeclass import TypeClass
class MyTP1(TypeClass):
@TypeClass.required
def meth1(self):
pass
@TypeClass.required
def another_meth(self):
pass
class MyTP2(TypeClass):
@TypeClass.required
def meth2(self):
pass
class IntTP1Instance(MyTP1, types=int):
def meth1(self):
return 'int'
def another_meth(self):
return 42
class IntTP2Instance(MyTP2, types=int):
def meth2(self):
return 'int'
# Reuse an existing function implementation
another_meth = IntTP1Instance.another_meth
# Both are equivalent and allow creating a typeclass that provides
# interfaces of both MyTP1 and MyTP2. If some methods are required by both
# MyTP1 and MyTP2, the conflict is detected and a TypeError is raised:
MyTP1AndMyTP2 = MyTP1 & MyTP2
# This combined typeclass will automatically get the instances from its
# superclasses
class MyTP1AndMyTP2(MyTP1, MyTP2):
pass
# All are equivalent
value = 2 @ (MyTP1 & MyTP2)
value = 2 @ MyTP1AndMyTP2
value = MyTP1AndMyTP2(2)
value = (MyTP1 & MyTP2)(2)
# We can now use the API of both MyTP1 and MyTP2
value.meth1()
value.meth2()
Note that it’s possible to implement a typeclass for a type that has no values,
but for which isinstance(value, thetype)
will return True. This can be
achieved using __instancecheck__
or __subclasscheck__
and is used in
particular by the abstract base classes provided by collections.abc
.
lisa._generic.SortedSequence
is another example. Typing hints from the
typing
module can also be used. Casting values “registered” as instances
of these types is expensive though, as validity of the cast depends on the
value itself. That means it's not possible to memoize the result of the cast
and associate it with the type of the value.
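Such a value-dependent "type" can be defined with __instancecheck__. Here is a small, self-contained example (the hypothetical Even class is for illustration only):

```python
# A "type" with no direct instances: membership is decided per value
# through __instancecheck__, as used by collections.abc and by
# lisa._generic.SortedSequence.
class _EvenMeta(type):
    def __instancecheck__(cls, value):
        # Validity depends on the value itself, not just its type
        return isinstance(value, int) and value % 2 == 0

class Even(metaclass=_EvenMeta):
    """No value is ever constructed from this class directly."""
```

Because the check inspects the value, isinstance(4, Even) holds while isinstance(3, Even) does not, even though neither value was created from Even.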
One might wonder what casting a value to a typeclass gives. When possible, a
new value with a synthetic type is returned. That is implemented using a
shallow copy of the value, and then updating its __class__
attribute. This
will provide native attribute lookup speed, and casting will be efficient. If
that is not possible (non-heap types, types using __slots__
etc), an
instance of lisa._typeclass.ValueProxy
will be returned for values, and
a synthetic type will be created for types.
- class lisa._typeclass.TypeClassMeta(name, bases, dct, *args, types=None, **kwargs)[source]
Bases:
type
Metaclass of all typeclasses.
This implements most of the typeclass magic.
- Parameters:
name (str) – Name of the typeclass or instance being created.
bases (tuple(type)) – tuple of superclasses of the typeclass being defined. When an instance is created, bases must have exactly one element, which is the typeclass being implemented.
dct (dict(str, object)) – Dictionary of attributes defined in the body of the
class
statement.types (type or tuple(type) or None) – Type or tuple of types for which the typeclass instance is provided.
- static required(f)[source]
Decorator used in a typeclass to flag a method to be required to be implemented by all instances.
This is very similar to
abc.abstractmethod()
.
- __matmul__(obj)[source]
Use the matrix multiplication operator (
@
) as a “cast” operator, to cast a value or a type to a typeclass.
- __rmatmul__(obj)
Use the matrix multiplication operator (
@
) as a “cast” operator, to cast a value or a type to a typeclass.
- class lisa._typeclass.TypeClass(obj)[source]
Bases:
object
Base class to inherit from to define a new typeclass.
- class lisa._typeclass.ValueProxy(obj, dct)[source]
Bases:
object
Values of this class are returned when casting a value to a typeclass, if the value does not support shallow copy or
__class__
attribute assignment.It implements the modified attribute lookup, so we can use the typeclass methods. All other attribute lookups will go through untouched, except magic methods lookup (also known as dunder names).
- class lisa._typeclass.FromString(obj)[source]
Bases:
TypeClass
Build values by parsing a string.
- classmethod get_format_description(short)[source]
Returns the description of the format parsed by
from_str()
.- Parameters:
short (bool) – If
True
, a short description should be returned. Otherwise a lengthier description is acceptable
- DEFAULTS = {}
- INSTANCES = {<class 'bool'>: (<class 'lisa._typeclass._BoolFromStringInstance'>, {'__doc__': '\n Parse boolean from a string.\n ', '__module__': 'lisa._typeclass', '__qualname__': '_BoolFromStringInstance', 'from_str': <classmethod(<function _BoolFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _BoolFromStringInstance.get_format_description>)>}), <class 'float'>: (<class 'lisa._typeclass._BuiltinFromStringInstance'>, {'__doc__': '\n Parse the following types from a string:\n * ``int``\n * ``float``\n ', '__module__': 'lisa._typeclass', '__qualname__': '_BuiltinFromStringInstance', 'from_str': <classmethod(<function _BuiltinFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _BuiltinFromStringInstance.get_format_description>)>}), <class 'int'>: (<class 'lisa._typeclass._BuiltinFromStringInstance'>, {'__doc__': '\n Parse the following types from a string:\n * ``int``\n * ``float``\n ', '__module__': 'lisa._typeclass', '__qualname__': '_BuiltinFromStringInstance', 'from_str': <classmethod(<function _BuiltinFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _BuiltinFromStringInstance.get_format_description>)>}), <class 'lisa._generic.List'>: (<class 'lisa._typeclass._IntSeqFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for :class:`int` type.\n ', '__module__': 'lisa._typeclass', '__qualname__': '_IntSeqFromStringInstance', 'from_str': <classmethod(<function _IntSeqFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _IntSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.List'>: (<class 'lisa._typeclass._StrSeqFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for :class:`str` type.\n ', '__module__': 'lisa._typeclass', '__qualname__': '_StrSeqFromStringInstance', 'from_str': <classmethod(<function _StrSeqFromStringInstance.from_str>)>, 
'get_format_description': <classmethod(<function _StrSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.List'>: (<class 'lisa.trace._CPUSeqFromStringInstance'>, {'__module__': 'lisa.trace', '__qualname__': '_CPUSeqFromStringInstance', 'from_str': <bound method _IntSeqFromStringInstance.from_str of <class 'lisa._typeclass.Sequence'>>, 'get_format_description': <classmethod(<function _CPUSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.List'>: (<class 'lisa.analysis.tasks._TaskIDSeqFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for lists :class:`TaskID` type.\n ', '__module__': 'lisa.analysis.tasks', '__qualname__': '_TaskIDSeqFromStringInstance', 'from_str': <classmethod(<function _TaskIDSeqFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _TaskIDSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.Sequence'>: (<class 'lisa._typeclass._IntSeqFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for :class:`int` type.\n ', '__module__': 'lisa._typeclass', '__qualname__': '_IntSeqFromStringInstance', 'from_str': <classmethod(<function _IntSeqFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _IntSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.Sequence'>: (<class 'lisa._typeclass._StrSeqFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for :class:`str` type.\n ', '__module__': 'lisa._typeclass', '__qualname__': '_StrSeqFromStringInstance', 'from_str': <classmethod(<function _StrSeqFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _StrSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.Sequence'>: (<class 'lisa.trace._CPUSeqFromStringInstance'>, {'__module__': 'lisa.trace', '__qualname__': '_CPUSeqFromStringInstance', 'from_str': <bound method 
_IntSeqFromStringInstance.from_str of <class 'lisa._typeclass.Sequence'>>, 'get_format_description': <classmethod(<function _CPUSeqFromStringInstance.get_format_description>)>}), <class 'lisa._generic.Sequence'>: (<class 'lisa.analysis.tasks._TaskIDSeqFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for lists :class:`TaskID` type.\n ', '__module__': 'lisa.analysis.tasks', '__qualname__': '_TaskIDSeqFromStringInstance', 'from_str': <classmethod(<function _TaskIDSeqFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _TaskIDSeqFromStringInstance.get_format_description>)>}), <class 'lisa.analysis.tasks.TaskID'>: (<class 'lisa.analysis.tasks._TaskIDFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for :class:`TaskID` type.\n ', '__module__': 'lisa.analysis.tasks', '__qualname__': '_TaskIDFromStringInstance', 'from_str': <classmethod(<function _TaskIDFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _TaskIDFromStringInstance.get_format_description>)>}), <class 'str'>: (<class 'lisa._typeclass._StrFromStringInstance'>, {'__doc__': '\n Instance of :class:`lisa._typeclass.FromString` for :class:`str` type.\n ', '__module__': 'lisa._typeclass', '__qualname__': '_StrFromStringInstance', 'from_str': <classmethod(<function _StrFromStringInstance.from_str>)>, 'get_format_description': <classmethod(<function _StrFromStringInstance.get_format_description>)>})}
- REQUIRED = {'from_str': <class 'lisa._typeclass.FromString'>, 'get_format_description': <class 'lisa._typeclass.FromString'>}
- SUPERCLASSES = []
Monads
Monads with syntactic sugar.
All monads share the following API:
# for a given monad Monad
# Turns a regular function into a function returning an instance of Monad,
# and able to await on monadic values. Similar to the "do notation" in
# Haskell.
@Monad.do
async def foo(x, y):
# Inside a decorated function, await can be used to "extract" the value
# contained in the monad, like ``<-`` in Haskell.
z = await Monad.something()
return x
# Equivalent to
@Monad.do
async def foo(x, y):
# note: await is used automatically if x is an instance of Monad
x_ = await x
y_ = await y
# Monad.pure() is called if x_ is not an instance of Monad
return Monad.pure(x_)
This allows composing decorated functions easily.
Note
There currently is no overridable bind
operation, since nothing
in Python currently allows getting a proper continuation without explicit
manipulations of lambdas. The closest thing that is used is coroutine
functions, where await
somewhat provides a continuation using
coroutine.send()
. The limitation comes from the fact that it can only be called
at most once (preventing anything like the list monad). Early-return
control flow such as the maybe monad is typically not necessary, as Python
has exceptions already.
Note
async/await
is used as syntactic sugar instead of yield
since
the grammar works better for await
. yield
cannot be used in
comprehensions, which prevents some idiomatic functional patterns based on
generator expressions.
- class lisa.monad.Monad[source]
Bases:
_MonadBase
- classmethod __init_subclass__(*args, **kwargs)[source]
This method is called when a class is subclassed.
The default implementation does nothing. It may be overridden to extend subclasses.
- exception lisa.monad.AlreadyCalledError[source]
Bases:
Exception
Exception raised by
_CallOnce
when the wrapped function has already been called once.
- class lisa.monad.MonadTrans[source]
-
Base class for monad transformers.
Heavily inspired by transformers as defined by: https://hackage.haskell.org/package/transformers
And stack manipulation inspired by: https://hackage.haskell.org/package/mmorph
- classmethod __init_subclass__(*args, **kwargs)[source]
This method is called when a class is subclassed.
The default implementation does nothing. It may be overridden to extend subclasses.
- abstract classmethod lift(m)[source]
Lift a monadic value
m
by one level in the stack, i.e.: Given a stack for 3 transformersT1(T2(T3(Identity)))
, a valuem = T3(Identity).pure(42)
. We haveT2.lift(m) == T2(T3(Identity)).pure(42)
.See also
lift
as defined in https://hackage.haskell.org/package/transformers
- abstract classmethod hoist(self, nat)[source]
Lift a monadic value
m
by one level in the stack, i.e.: Given a stack for 3 transformersT1(T2(T3(Identity)))
, a valuem = T2(Identity).pure(42)
. We haveT2.hoist(m, T3.pure) == T2(T3(Identity)).pure(42)
.In other words, it allows adding a level “from below”, whereas
lift
adds a level “from above”. It’s similar tomap
, except that instead of traversing all the nested functor layers, it stops at the first one.- Parameters:
self (lisa.monad.Monad) – Monadic value to hoist.
nat (collections.abc.Callable) – Natural transform. i.e. a morphism from
Monad1[A]
toMonad2[A]
that obeys certain laws.
See also
hoist
as defined in https://hackage.haskell.org/package/mmorphNote
Note for implementers: A monad transformer
t m a
(t
is the transformer HKT,m
is the base monad anda
is the “contained” type) usually ends up containing an “m (f a)” (f
being some kind of functor). For example,MaybeT
in Haskell (Option
here) is more or less defined asdata MaybeT m a = MaybeT (m (Maybe a))
. What thehoist
implementation must do is to “rebuild” a value with a call tonat()
around them (...)
part. ForMaybeT
, this giveshoist nat (MaybeT (m (Maybe a))) = MaybeT(nat(m (Maybe a)))
.
- lisa.monad.TransformerStack(*stack)[source]
Allows stacking together multiple
MonadTrans
, e.g.:

class Stack(TransformerStack(T1, T2, T3)):
    pass

@Stack.do
async def foo():
    # Any monadic value from the stack's direct components can be used.
    await T1.pure(42)
    await T2.pure(42)
    await T3.pure(42)
- class lisa.monad.Some(x)[source]
Bases:
_Optional
Wraps an arbitrary value to indicate its presence.
- __slots__ = ('x',)
- x
- class lisa.monad.Option(x)[source]
Bases:
MonadTrans
,_AddPureNothing
Monad transformer that manipulates
Some
andNothing
.Option.bind()
will short-circuit ifNothing
is passed, much like the Rust or Javascript?
operator, or theMaybeT
monad transformer in Haskell.- __slots__ = ('_x',)
- property x
Wrapped value, of type
Base[_Optional[A]]
withBase
the base monad of the transformer.
- bind(continuation)[source]
Takes a monadic value Monad[A], a function that takes an A and returns Monad[B], and returns a Monad[B].
Note
It is allowed to return a
_TailCall
instance.
- classmethod lift(m)[source]
Lift a monadic value
m
by one level in the stack, i.e.: Given a stack for 3 transformersT1(T2(T3(Identity)))
, a valuem = T3(Identity).pure(42)
. We haveT2.lift(m) == T2(T3(Identity)).pure(42)
.See also
lift
as defined in https://hackage.haskell.org/package/transformers
- hoist(self, nat)[source]
Lift a monadic value
m
by one level in the stack, i.e.: Given a stack for 3 transformersT1(T2(T3(Identity)))
, a valuem = T2(Identity).pure(42)
. We haveT2.hoist(m, T3.pure) == T2(T3(Identity)).pure(42)
.In other words, it allows adding a level “from below”, whereas
lift
adds a level “from above”. It’s similar tomap
, except that instead of traversing all the nested functor layers, it stops at the first one.- Parameters:
self (lisa.monad.Monad) – Monadic value to hoist.
nat (collections.abc.Callable) – Natural transform. i.e. a morphism from
Monad1[A]
toMonad2[A]
that obeys certain laws.
See also
hoist
as defined in https://hackage.haskell.org/package/mmorphNote
Note for implementers: A monad transformer
t m a
(t
is the transformer HKT,m
is the base monad anda
is the “contained” type) usually ends up containing an “m (f a)” (f
being some kind of functor). For example,MaybeT
in Haskell (Option
here) is more or less defined asdata MaybeT m a = MaybeT (m (Maybe a))
. What thehoist
implementation must do is to “rebuild” a value with a call tonat()
around them (...)
part. ForMaybeT
, this giveshoist nat (MaybeT (m (Maybe a))) = MaybeT(nat(m (Maybe a)))
.
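The short-circuiting behaviour described above can be illustrated with a minimal standalone sketch. The bind helper below operates on raw values and a hypothetical Nothing marker; it is not the actual Option transformer API, which wraps a full monad stack.

```python
# Minimal illustration of Maybe-style short-circuiting: Nothing
# propagates through bind() without ever calling the continuation.
class Nothing:
    """Marker for the absence of a value."""

def bind(maybe, continuation):
    # Short-circuit: skip the continuation entirely on Nothing
    return maybe if isinstance(maybe, Nothing) else continuation(maybe)

res_some = bind(bind(3, lambda x: x + 1), lambda x: x * 2)
res_nothing = bind(bind(Nothing(), lambda x: x + 1), lambda x: x * 2)
```

The first chain computes (3 + 1) * 2; the second returns the original Nothing untouched, much like the ? operator in Rust.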
- class lisa.monad.State(f)[source]
Bases:
MonadTrans
Monad transformer analogous to Haskell’s
StateT
transformer.It manipulates state-transforming functions of type
state -> (value, new_state)
. This allows simulating a global state, without actually requiring one.- __slots__ = ('_f',)
- property f
State-transforming function of type
state -> (value, new_state)
- classmethod make_state(x)[source]
Create an initial state. All the parameters of
State.__call__()
are passed toState.make_state()
.
- __call__(*args, **kwargs)[source]
Allow calling monadic values to run the state-transforming function, with the initial state provided by
State.make_state()
.
- bind(continuation)[source]
Takes a monadic value Monad[A], a function that takes an A and returns Monad[B], and returns a Monad[B].
Note
It is allowed to return a
_TailCall
instance.
- classmethod lift(m)[source]
Lift a monadic value
m
by one level in the stack, i.e.: Given a stack for 3 transformersT1(T2(T3(Identity)))
, a valuem = T3(Identity).pure(42)
. We haveT2.lift(m) == T2(T3(Identity)).pure(42)
.See also
lift
as defined in https://hackage.haskell.org/package/transformers
- hoist(self, nat)[source]
Lift a monadic value m by one level in the stack, i.e. given a stack of
3 transformers T1(T2(T3(Identity))) and a value
m = T2(Identity).pure(42), we have
T2.hoist(m, T3.pure) == T2(T3(Identity)).pure(42).
In other words, it allows adding a level "from below", whereas lift adds
a level "from above". It's similar to map, except that instead of
traversing all the nested functor layers, it stops at the first one.
- Parameters:
self (lisa.monad.Monad) – Monadic value to hoist.
nat (collections.abc.Callable) – Natural transform, i.e. a morphism from
Monad1[A] to Monad2[A] that obeys certain laws.
See also
hoist as defined in https://hackage.haskell.org/package/mmorph
Note
Note for implementers: A monad transformer t m a (t is the transformer
HKT, m is the base monad and a is the "contained" type) usually ends up
containing an m (f a) (f being some kind of functor). For example,
MaybeT in Haskell (Option here) is more or less defined as
data MaybeT m a = MaybeT (m (Maybe a)). What the hoist implementation
must do is to "rebuild" a value with a call to nat() around the m (...)
part. For MaybeT, this gives
hoist nat (MaybeT (m (Maybe a))) = MaybeT (nat (m (Maybe a))).
- classmethod from_f(f)[source]
Build a monadic value out of a state modifying function of type
state -> (value, new_state)
.
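The state-transforming functions State manipulates can be sketched in plain Python. This is a minimal illustration of the state -> (value, new_state) idea, not lisa.monad.State's actual implementation; pure, bind, get and put are illustrative helper names:

```python
# Minimal sketch of state-transforming functions: a "monadic value" here
# is just a function of type state -> (value, new_state).

def pure(x):
    # Wrap a plain value, leaving the state untouched.
    return lambda state: (x, state)

def bind(m, continuation):
    # Run m, feed its value to the continuation, thread the state through.
    def run(state):
        value, new_state = m(state)
        return continuation(value)(new_state)
    return run

def get():
    # Expose the current state as the value.
    return lambda state: (state, state)

def put(new_state):
    # Replace the state.
    return lambda state: (None, new_state)

# Increment a counter held in the state and return its old value.
incr = bind(get(), lambda old: bind(put(old + 1), lambda _: pure(old)))
value, final_state = incr(41)
```

Note how no mutable global is involved: the "state" only exists as a parameter threaded through the composed functions.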
- class lisa.monad.StateDiscard(f)[source]
Bases:
State
Same as State except that calling monadic values will return the
computed value instead of a tuple (value, state).
This is useful for APIs where the final state is of no interest to the user.
- __call__(*args, **kwargs)[source]
Allow calling monadic values to run the state-transforming function, with the initial state provided by
State.make_state()
.
- class lisa.monad.Async(coro)[source]
Bases:
MonadTrans
Monad transformer allowing the decorated coroutine function to await on non-monadic values. This is useful to mix any monad transformer defined in this module with other async APIs, such as
asyncio
.- __slots__ = ('_coro',)
- property coro
Coroutine that will only yield non-monadic values. All the monadic values will be processed by the monad transformer stack as expected and will stay hidden.
- property x
Run the coroutine to completion in its event loop.
- classmethod lift(x)[source]
Lift a monadic value m by one level in the stack, i.e. given a stack of
3 transformers T1(T2(T3(Identity))) and a value
m = T3(Identity).pure(42), we have
T2.lift(m) == T2(T3(Identity)).pure(42).
See also
lift as defined in https://hackage.haskell.org/package/transformers
- hoist(self, nat)[source]
Lift a monadic value m by one level in the stack, i.e. given a stack of
3 transformers T1(T2(T3(Identity))) and a value
m = T2(Identity).pure(42), we have
T2.hoist(m, T3.pure) == T2(T3(Identity)).pure(42).
In other words, it allows adding a level "from below", whereas lift adds
a level "from above". It's similar to map, except that instead of
traversing all the nested functor layers, it stops at the first one.
- Parameters:
self (lisa.monad.Monad) – Monadic value to hoist.
nat (collections.abc.Callable) – Natural transform, i.e. a morphism from
Monad1[A] to Monad2[A] that obeys certain laws.
See also
hoist as defined in https://hackage.haskell.org/package/mmorph
Note
Note for implementers: A monad transformer t m a (t is the transformer
HKT, m is the base monad and a is the "contained" type) usually ends up
containing an m (f a) (f being some kind of functor). For example,
MaybeT in Haskell (Option here) is more or less defined as
data MaybeT m a = MaybeT (m (Maybe a)). What the hoist implementation
must do is to "rebuild" a value with a call to nat() around the m (...)
part. For MaybeT, this gives
hoist nat (MaybeT (m (Maybe a))) = MaybeT (nat (m (Maybe a))).
Fuzzing
Fuzzing API to build random constrained values.
Note
The following example shows a direct use of the Gen monad, but be aware
that the lisa.wlgen.rta API allows mixing both Gen and the RTA DSL into
the same coroutine function using lisa.wlgen.rta.task_factory().
Example:
from lisa.platforms.platinfo import PlatformInfo
from lisa.fuzz import GenMonad, Choice, Int, Float, retry_until
# The function must be decorated with GenMonad.do() so that "await" gains
# its special meaning.
@GenMonad.do
async def make_data(duration=None):
# Draw a value from an iterable.
period = await Choice([16e-3, 8e-3])
nr = await Choice(range(1, 4))
duration = duration or (await Float(1, 2))
# Arbitrary properties can be enforced. If they are not satisfied, the
# function will run again until the condition is true.
await retry_until(0 < nr <= 2)
return (nr, duration, period)
# seed (or rng) can be fixed for reproducible results
data = make_data(duration=42)(seed=1)
print(data)
- exception lisa.fuzz.RetryException[source]
Bases:
Exception
Exception raised to signify to
lisa.fuzz.Gen
to retry the random draw.
- lisa.fuzz.retry_until(cond)[source]
Returns an awaitable that will signify to the
lisa.fuzz.Gen
monad to retry the computation until cond is True. This is used to
enforce arbitrary constraints on generated data.
Note
If possible, it's better to generate the data in a way that satisfies
the constraints directly, as retrying can happen an arbitrary number of
times and thus become quite costly.
- class lisa.fuzz.GenMonad(f, name=None)[source]
Bases:
StateDiscard
, Loggable
Random generator monad inspired by Haskell’s QuickCheck.
- classmethod make_state(*, rng=None, seed=None)[source]
Initialize the RNG state with either an rng or a seed.
- Parameters:
seed (object) – Seed to initialize the
random.Random
instance.
rng (random.Random) – Instance of RNG.
- class lisa.fuzz.Gen(*args, **kwargs)[source]
Bases:
object
- classmethod lift(f)[source]
Attention
Deprecated since version 2.0.
lift()
is deprecated and will be removed in version 4.0, uselisa.monad.MonadTrans.do()
instead: Note that GenMonad.do() will not automatically await on arguments if they are Gen instances, this must be done manually.
- class lisa.fuzz.Choices(n, xs, typ=None)[source]
Bases:
Gen
Randomly choose
n
values among xs.
- Parameters:
n (int) – Number of values to yield every time.
xs (collections.abc.Iterable) – Finite iterable of values to choose from.
typ (type) – Callable used to build the output from an iterable.
- class lisa.fuzz.Set(n, xs, typ=None)[source]
Bases:
Choices
Same as
lisa.fuzz.Choices
but returns a set.
Note
The values are drawn without replacement to ensure the set is of the
correct size, assuming the input contains no duplicates.
- class lisa.fuzz.Tuple(n, xs, typ=None)[source]
Bases:
Choices
Same as
lisa.fuzz.Choices
but returns a tuple.
- class lisa.fuzz.SortedList(n, xs, typ=None)[source]
Bases:
Choices
Same as
lisa.fuzz.Choices
but returns a sorted list.
- class lisa.fuzz.Shuffle(xs)[source]
Bases:
Choices
Randomly shuffle the given sequence.
- Parameters:
xs (collections.abc.Sequence) – Finite sequence of values to shuffle.
- class lisa.fuzz.Int(min_=0, max_=0)[source]
Bases:
Gen
Draw a random int fitting within the
[min_, max_]
range.
- class lisa.fuzz.Float(min_=0, max_=0)[source]
Bases:
Gen
Draw a random float fitting within the
[min_, max_]
range.
- class lisa.fuzz.Dict(n, xs, typ=None)[source]
Bases:
Choices
Same as
lisa.fuzz.Choices
but returns a dictionary.
Note
The input must be an iterable of tuple(key, value).
Note
The values are drawn without replacement to ensure the dict is of the
correct size, assuming the input contains no duplicates.
- class lisa.fuzz.Choice(xs)[source]
Bases:
Gen
Randomly choose one value among
xs
.- Parameters:
xs (collections.abc.Iterable) – Finite iterable of values to choose from.
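All of these combinators ultimately draw from the single random.Random instance held in the GenMonad state, which is why fixing seed (or rng) gives reproducible results. A sketch of that property (draw_choice is an illustrative stand-in, not LISA internals):

```python
# Two generators seeded identically produce identical sequences of
# draws, which is the basis of reproducible fuzzing.
import random

def draw_choice(rng, xs):
    # What a Choice draw boils down to.
    return rng.choice(list(xs))

rng1 = random.Random(1)
rng2 = random.Random(1)
a = [draw_choice(rng1, range(10)) for _ in range(5)]
b = [draw_choice(rng2, range(10)) for _ in range(5)]
# Same seed, same sequence of draws.
```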
Dataframe and Series handling utilities
- class lisa.datautils.Timestamp(ts, unit='s', rounding='down')[source]
Bases:
float
Nanosecond-precision timestamp. It inherits from
float
and as such can be manipulated as a floating point number of seconds.
The as_nanoseconds attribute allows getting the exact timestamp
regardless of the magnitude of the float, allowing for more precise
computation.
- Parameters:
unit (str) – Unit of the
ts
value being passed. One of"s"
,"ms"
,"us"
and"ns"
.rounding (str) – How to round the value when converting to float. Timestamps of large magnitude will suffer from the loss of least significant digits in their float value which will not have nanosecond precision. The rounding determines if a value below or above the actual nanosecond-precision timestamp should be used. One of
"up"
or"down"
.
- __slots__ = ('as_nanoseconds', '_Timestamp__rounding')
- as_nanoseconds
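The precision issue Timestamp works around can be demonstrated with plain Python floats (a sketch independent of the Timestamp class itself): a float64 mantissa has 53 bits, so a large timestamp stored as float seconds cannot resolve individual nanoseconds.

```python
# Float resolution near 1e9 seconds is roughly 100 ns, so the trailing
# nanosecond below cannot survive a round-trip through a float.
ts_ns = 10**18 + 1            # about 31.7 years in ns, plus 1 ns (exact int)
ts_s = ts_ns / 10**9          # as float seconds: the trailing 1 ns is lost
roundtrip = round(ts_s * 10**9)
```

This is why keeping the exact nanosecond count alongside the float, and choosing a rounding direction for the float value, matters for large timestamps.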
- class lisa.datautils.DataAccessor(data)[source]
Bases:
object
Proxy class that allows extending the
pandas.DataFrame
API.Example:
# Define and register a dataframe accessor
@DataFrameAccessor.register_accessor
def df_foobar(df, baz):
    ...

df = pandas.DataFrame()

# Use the accessor with the "lisa" proxy
df.lisa.foobar(baz=1)
- class lisa.datautils.DataFrameAccessor(data)[source]
Bases:
DataAccessor
- FUNCTIONS = {'add_delta': <function df_add_delta>, 'combine_duplicates': <function df_combine_duplicates>, 'convert_to_nullable': <function df_convert_to_nullable>, 'deduplicate': <function df_deduplicate>, 'delta': <function df_delta>, 'filter': <function df_filter>, 'filter_task_ids': <function df_filter_task_ids>, 'find_redundant_cols': <function df_find_redundant_cols>, 'make_empty_clone': <function df_make_empty_clone>, 'refit_index': <function df_refit_index>, 'split_signals': <function df_split_signals>, 'squash': <function df_squash>, 'update_duplicates': <function df_update_duplicates>, 'window': <function df_window>, 'window_signals': <function df_window_signals>}
- class lisa.datautils.SeriesAccessor(data)[source]
Bases:
DataAccessor
- FUNCTIONS = {'align_signal': <function series_align_signal>, 'convert': <function series_convert>, 'deduplicate': <function series_deduplicate>, 'derivate': <function series_derivate>, 'envelope_mean': <function series_envelope_mean>, 'integrate': <function series_integrate>, 'local_extremum': <function series_local_extremum>, 'mean': <function series_mean>, 'refit_index': <function series_refit_index>, 'rolling_apply': <function series_rolling_apply>, 'window': <function series_window>}
- lisa.datautils.series_refit_index(series, start=None, end=None, window=None, method='inclusive', clip_window=True)[source]
Slice a series using
series_window()
and ensure we have a value at exactly the specified boundaries, unless the signal started after the beginning of the required window.- Parameters:
series (pandas.Series) – Series to act on.
start (object) – First index value to find in the returned series.
end (object) – Last index value to find in the returned series.
window (tuple(float or None, float or None) or None) –
window=(start, end)
is the same as start=start, end=end. These two parameter styles are mutually exclusive.
method (str) – Windowing method used to select the first and last values of the series using
series_window()
. Defaults to inclusive, which is suitable for signals where all the value changes have a corresponding row without any fixed sample-rate constraints. If they have been downsampled, nearest might be a better choice.
Note
If
end
is past the end of the data, the last row will be duplicated so that we can have a start and end index at the right location, without moving the point at which the transition to the last value happened. This also allows plotting series with only one item using matplotlib, which would otherwise be impossible.- Parameters:
clip_window – Passed down to
series_window()
.
- lisa.datautils.df_refit_index(df, start=None, end=None, window=None, method='inclusive')[source]
Same as
series_refit_index()
but acting onpandas.DataFrame
- lisa.datautils.df_split_signals(df, signal_cols, align_start=False, window=None)[source]
Yield subset of
df
that only contain one signal, along with the signal identification values.- Parameters:
df (pandas.DataFrame) – The dataframe to split.
signal_cols (list(str)) – Columns that uniquely identify a signal.
window (tuple(float or None, float or None) or None) – Apply
df_refit_index()
on the yielded dataframes with the given window.align_start (bool) – If
True
, same aswindow=(df.index[0], None)
. This makes sure all yielded signals start at the same index as the original dataframe.
- lisa.datautils.df_squash(df, start, end, column='delta')[source]
Slice a dataframe of deltas in [start:end] and ensure we have an event at exactly those boundaries.
The input dataframe is expected to have a “column” which reports the time delta between consecutive rows, as for example dataframes generated by
df_add_delta()
The returned dataframe is guaranteed to have an initial and final event at the specified "start" ("end") index values, whose values are the same as those of the last event before (first event after) the specified "start" ("end") time.
Examples:
Slice a dataframe to [start:end], and work on the time data so that it makes sense within the interval.
Examples to make it clearer:
df is: Time len state 15 1 1 16 1 0 17 1 1 18 1 0 ------------- df_squash(df, 16.5, 17.5) => Time len state 16.5 .5 0 17 .5 1 df_squash(df, 16.2, 16.8) => Time len state 16.2 .6 0
- Returns:
a new df that fits the above description
- lisa.datautils.df_filter(df, filter_columns, exclude=False)[source]
Filter the content of a dataframe.
- Parameters:
df (pandas.DataFrame) – DataFrame to filter
filter_columns (dict(str, object)) – Dict of {"column": value} that rows have to match to be selected.
exclude (bool) – If
True
, the matching rows will be excluded rather than selected.
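The filtering semantics can be sketched with plain pandas (equivalent in spirit to df_filter, not its actual implementation; filter_rows is a hypothetical name):

```python
# Select rows matching every {"column": value} pair; exclude=True
# inverts the selection, as described above.
import pandas as pd

def filter_rows(df, filter_columns, exclude=False):
    mask = pd.Series(True, index=df.index)
    for col, value in filter_columns.items():
        mask &= (df[col] == value)
    return df[~mask] if exclude else df[mask]

df = pd.DataFrame({'pid': [1, 1, 2], 'state': ['R', 'S', 'R']})
selected = filter_rows(df, {'pid': 1, 'state': 'R'})
excluded = filter_rows(df, {'pid': 1, 'state': 'R'}, exclude=True)
```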
- lisa.datautils.df_merge(df_list, drop_columns=None, drop_inplace=False, filter_columns=None)[source]
Merge a list of
pandas.DataFrame
, keeping the index sorted.- Parameters:
drop_columns (list(str)) – List of columns to drop prior to merging. This avoids ending up with extra renamed columns if some dataframes have column names in common.
drop_inplace (bool) – Drop columns in the original dataframes instead of creating copies.
filter_columns (dict(str, object)) – Dict of {"column": value} used to filter each dataframe prior to dropping columns. The columns are then dropped as they have a constant value.
- lisa.datautils.df_delta(pre_df, post_df, group_on=None)[source]
Given pre_df and post_df containing paired/consecutive events indexed by
time, df_delta() merges the two dataframes and adds a delta column
containing the time spent between the two events. A typical use case
would be adding pre/post events at the entry/exit of a function.
pre_df
andpost_df
are grouped by thegroup_on
columns. E.g.:['pid', 'comm']
to group by task. Except columns listed ingroup_on
,pre_df
andpost_df
must have columns with different names.Events that cannot be paired are ignored.
- Parameters:
pre_df (pandas.DataFrame) – Dataframe containing the events that start a record.
post_df (pandas.DataFrame) – Dataframe containing the events that end a record.
group_on (list(str)) – Columns used to group
pre_df
andpost_df
. E.g.: This would be['pid', 'comm']
to group by task.
- Returns:
a
pandas.DataFrame
indexed by thepre_df
dataframe with:All the columns from the
pre_df
dataframe.All the columns from the
post_df
dataframe.- A
delta
column (duration between the emission of a ‘pre’ event and its consecutive ‘post’ event).
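The pairing idea can be sketched on a toy example with plain pandas (illustrative only, not df_delta's implementation, and ignoring the handling of unpaired events):

```python
# Pair each 'pre' event with the matching 'post' event of the same pid
# and compute the time spent in between.
import pandas as pd

pre = pd.DataFrame({'pid': [1, 2]}, index=[1.0, 2.0])    # e.g. function entry
post = pd.DataFrame({'pid': [1, 2]}, index=[1.5, 3.0])   # e.g. function exit

merged = pre.reset_index().merge(
    post.reset_index(), on='pid', suffixes=('_pre', '_post'),
)
# Duration between each paired pre/post event.
merged['delta'] = merged['index_post'] - merged['index_pre']
```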
- lisa.datautils.series_derivate(y, x=None, order=1)[source]
Compute a derivative of a
pandas.Series
with respect to another series.- Returns:
A series of dy/dx, where x is either the index of y or another series.
- Parameters:
y (pandas.Series) – Series with the data to derivate.
x (pandas.Series or None) – Series with the x data. If
None
, the index of y will be used. Note that y and x are expected to have the same index.order (int) – Order of the derivative (1 is speed, 2 is acceleration etc).
- lisa.datautils.series_integrate(y, x=None, sign=None, method='rect', rect_step='post')[source]
Compute the integral of y with respect to x.
- Returns:
A scalar \(\int_{x=A}^{x=B} y \, dx\), where x is either the index of y or another series.
- Parameters:
y (pandas.Series) – Series with the data to integrate.
x (pandas.Series or None) – Series with the x data. If
None
, the index of y will be used. Note that y and x are expected to have the same index.sign (str or None) –
Clip the data for the area in positive or negative regions. Can be any of:
+
: ignore negative data-
: ignore positive dataNone
: use all data
method – The method for area calculation. This can be any of the integration methods supported in
numpy
or rect.
rect_step (str) – The step behaviour for the rect method.
Rectangular Method
Step: Post
Consider the following time series data:
2            *----*----*----+
             |              |
1            |              *----*----+
             |
0  *----*----+
   0    1    2    3    4    5    6    7

import pandas as pd
a = [0, 0, 2, 2, 2, 1, 1]
s = pd.Series(a)
The area under the curve is:
\[\begin{split}\sum_{k=0}^{N-1} (x_{k+1} - {x_k}) \times f(x_k) \\ (2 \times 3) + (1 \times 2) = 8\end{split}\]Step: Pre
2       +----*----*----*
        |              |
1       |              +----*----*----+
        |
0  *----*
   0    1    2    3    4    5    6    7

import pandas as pd
a = [0, 0, 2, 2, 2, 1, 1]
s = pd.Series(a)
The area under the curve is:
\[\begin{split}\sum_{k=1}^{N} (x_k - x_{k-1}) \times f(x_k) \\ (2 \times 3) + (1 \times 3) = 9\end{split}\]
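The post-step rectangle rule can be written out directly (a sketch, not series_integrate itself; note the extra trailing x value that holds the last sample for one more unit, as in the diagram above):

```python
# Post-step rectangle rule: each sample y[k] holds over [x[k], x[k+1]).
def integrate_rect_post(y, x):
    # sum over k of (x[k+1] - x[k]) * y[k]
    return sum((x1 - x0) * v for x0, x1, v in zip(x, x[1:], y))

area = integrate_rect_post(
    [0, 0, 2, 2, 2, 1, 1],
    [0, 1, 2, 3, 4, 5, 6, 7],
)
# area == 8, matching the post-step example above
```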
- lisa.datautils.series_mean(y, x=None, **kwargs)[source]
Compute the average of y by integrating with respect to x and dividing by the range of x.
- Returns:
A scalar \(\int_{x=A}^{x=B} \frac{y}{| B - A |} \, dx\), where x is either the index of y or another series.
- Parameters:
y (pandas.Series) – Series with the data to integrate.
x (pandas.Series or None) – Series with the x data. If
None
, the index of y will be used. Note that y and x are expected to have the same index.
- Variable keyword arguments:
Forwarded to
series_integrate()
.
- lisa.datautils.series_window(series, window, method='pre', clip_window=True)[source]
Select a portion of a
pandas.Series
- Parameters:
series (
pandas.Series
) – series to slicewindow (tuple(object)) – two-tuple of index values for the start and end of the region to select.
clip_window (bool) – Only
True
is now allowed: clip the requested window to the bounds of the index,
rather than raising an exception if the window is too large.
method –
Choose how edges are handled:
- inclusive: When no exact match is found, include both the previous and
next values around the window.
- exclusive: When no exact match is found, only index values within the
range are selected. This is the default pandas float slicing behavior.
- nearest: Not supported with polars objects: when no exact match is
found, take the nearest index value.
- pre: When no exact match is found, take the previous index value.
- post: When no exact match is found, take the next index value.
Note
The index of series must be monotonic and without duplicates.
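The 'pre' method can be sketched with pandas.Index.searchsorted (illustrative, not LISA's implementation; window_pre is a hypothetical name):

```python
# 'pre' windowing: when the window edges fall between index values, take
# the previous index value, so the signal value at the window start is known.
import pandas as pd

def window_pre(series, start, end):
    index = series.index
    # Last index value <= start (i.e. the previous value on no exact match).
    i = max(index.searchsorted(start, side='right') - 1, 0)
    j = index.searchsorted(end, side='right')
    return series.iloc[i:j]

s = pd.Series([10, 20, 30, 40], index=[0.0, 1.0, 2.0, 3.0])
windowed = window_pre(s, 0.5, 2.5)
# The previous sample at index 0.0 is included.
```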
- lisa.datautils.df_window(df, window, method='pre', clip_window=True)[source]
Same as
series_window()
but acting on apandas.DataFrame
- lisa.datautils.df_make_empty_clone(df)[source]
Make an empty clone of the given dataframe.
- Parameters:
df (pandas.DataFrame) – The template dataframe.
More specifically, the following aspects are cloned:
Column names
Column dtypes
- lisa.datautils.df_window_signals(df, window, signals, compress_init=False, clip_window=True)[source]
Similar to
df_window()
with method='pre' but guarantees that each signal will have a value at
the beginning of the window.
- Parameters:
window (tuple(object)) – two-tuple of index values for the start and end of the region to select.
signals (list(SignalDesc)) – List of
SignalDesc
describing the signals to fixup.compress_init (bool) – When
False
, the timestamps of the init value of signals (right before the window) are preserved. IfTrue
, they are changed into values as close as possible to the beginning of the window.clip_window – See
df_window()
- lisa.datautils.series_align_signal(ref, to_align, max_shift=None)[source]
Align a signal to an expected reference signal using their cross-correlation.
- Returns:
(ref, to_align) tuple, with to_align shifted by an amount computed to align as well as possible with ref. Both ref and to_align are resampled to have a fixed sample rate.
- Parameters:
ref (pandas.Series) – reference signal.
to_align (pandas.Series) – signal to align
max_shift (object or None) – Maximum shift allowed to align signals, in index units.
- lisa.datautils.df_filter_task_ids(df, task_ids, pid_col='pid', comm_col='comm', invert=False, comm_max_len=15)[source]
Filter a dataframe using a list of
lisa.analysis.tasks.TaskID
- Parameters:
task_ids (list(lisa.analysis.tasks.TaskID)) – List of task IDs to filter
df (pandas.DataFrame) – Dataframe to act on.
pid_col (str or None) – Column name in the dataframe with PIDs.
comm_col (str or None) – Column name in the dataframe with comm.
comm_max_len – Maximum expected length of the strings in
comm_col
. Thetask_ids
comm field will be truncated at that length before being matched.invert (bool) – Invert selection
- lisa.datautils.series_local_extremum(series, kind)[source]
Returns a series of local extremum.
- Parameters:
series (pandas.Series) – Series to look at.
kind (str) – Kind of extremum:
min
ormax
.
- lisa.datautils.series_envelope_mean(series)[source]
Compute the average between the mean of local maximums and local minimums of the series.
Assuming that the values are ranging inside a tunnel, this will give the average center of that tunnel.
- lisa.datautils.series_tunnel_mean(*args, **kwargs)[source]
Attention
Deprecated since version 2.0.
series_tunnel_mean()
is deprecated and will be removed in version 4.0, uselisa.datautils.series_envelope_mean()
instead
- lisa.datautils.series_rolling_apply(series, func, window, window_float_index=True, center=False)[source]
Apply a function on a rolling window of a series.
- Returns:
The series of results of the function.
- Parameters:
series (pandas.Series) – Series to act on.
func (collections.abc.Callable) – Function to apply on each window. It must take a
pandas.Series
as only parameter and return one value.window (float) – Rolling window width in seconds.
center (bool) – Label values generated by
func
with the center of the window, rather than the highest index in it.window_float_index (bool) – If
True
, the series passed tofunc
will be of typepandas.Index
(float64), in nanoseconds. Disabling is recommended if the index is not used byfunc
since it will remove the need for a conversion.
- lisa.datautils.series_deduplicate(series, keep, consecutives)[source]
Remove duplicate values in a
pandas.Series
.- Parameters:
keep (str) – Keep the first occurrences if
first
, or the last iflast
.consecutives (bool) –
If
True
, will only remove consecutive duplicates, for example:s = pd.Series([1,2,2,3,4,2], index=[1,2,20,30,40,50]) s2 = series_deduplicate(s, keep='first', consecutives=True) assert (s2 == [1,2,3,4,2]).all() s3 = series_deduplicate(s, keep='first', consecutives=False) assert (s3 == [1,2,3,4]).all()
- lisa.datautils.df_deduplicate(df, keep, consecutives, cols=None, all_col=True)[source]
Same as
series_deduplicate()
but forpandas.DataFrame
.
- lisa.datautils.series_update_duplicates(series, func=None)[source]
Update a given series to avoid duplicated values.
- Parameters:
series (pandas.Series) – Series to act on.
func (collections.abc.Callable or None) – The function used to update the column. It must take a
pandas.Series
of duplicated entries to update as its only parameter, and return a new
pandas.Series. The function will be called as long as there are
remaining duplicates. If None, the column is assumed to be a floating
point number of seconds and will be updated so that no duplicated
timestamps exist once translated to an integer number of nanoseconds.
- lisa.datautils.df_update_duplicates(df, col=None, func=None, inplace=False)[source]
Same as
series_update_duplicates()
but on apandas.DataFrame
.- Parameters:
df (pandas.DataFrame) – Dataframe to act on.
col (str or None) – Column to update. If
None
, the index is used.func (collections.abc.Callable or None) – See
series_update_duplicates()
.inplace (bool) – If
True
, the passed dataframe will be modified.
- lisa.datautils.df_combine_duplicates(df, func, output_col, cols=None, all_col=True, prune=True, inplace=False)[source]
Combine the duplicated rows using
func
and remove the duplicates.- Parameters:
df (pandas.DataFrame) – The dataframe to act on.
func (collections.abc.Callable) – Function to combine a group of duplicates. It will be passed a
pandas.DataFrame
corresponding to the group and must return either apandas.Series
with the same index as its input dataframe, or a scalar depending on the value ofprune
.prune (bool) –
If
True
,func
will be expected to return a single scalar that will be used instead of a whole duplicated group. Only the first row of the group is kept, the other ones are removed.If
False
,func
is expected to return apandas.Series
that will be used as replacement for the group. No rows will be removed.output_col (str) – Column in which the output of
func
should be stored.cols (list(str) or None) – Columns to use for duplicates detection
all_col (bool) – If
True
, all columns will be used.inplace (bool) – If
True
, the passed dataframe is modified.
- lisa.datautils.df_add_delta(df, col='delta', src_col=None, window=None, inplace=False)[source]
Add a column containing the delta of the given other column.
- Parameters:
df (pandas.DataFrame) – The dataframe to act on.
col (str) – The name of the column to add.
src_col (str or None) – Name of the column to compute the delta of. If
None
, the index is used.window (tuple(float or None, float or None) or None) – Optionally, a window. It will be used to compute the correct delta of the last row. If
inplace=False
, the dataframe will be pre-filtered using df_refit_index()
. This implies that the last row will have a NaN delta, but will be suitable e.g. for plotting, and for aggregation functions that ignore NaN such as pandas.DataFrame.sum()
.inplace (bool) – If
True
,df
is modified inplace to add the column
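The basic delta computation (src_col=None, i.e. deltas of the index) can be sketched with plain pandas:

```python
# Each row gets the time until the next row, computed from the index.
import pandas as pd

df = pd.DataFrame({'state': ['R', 'S', 'R']}, index=[0.0, 1.5, 2.0])
index_s = pd.Series(df.index, index=df.index)
df['delta'] = index_s.diff().shift(-1)
# The last row has no next event, hence a NaN delta (which df_add_delta
# can fix up when given a window).
```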
- lisa.datautils.series_combine(series_list, func, fill_value=None)[source]
Same as
pandas.Series.combine()
on a list of series rather than just two.
- lisa.datautils.df_combine(series_list, func, fill_value=None)[source]
Same as
pandas.DataFrame.combine()
on a list of dataframes rather than just two.
- lisa.datautils.series_dereference(series, sources, inplace=False, method='ffill')[source]
Replace each value in
series
by the value, at the corresponding index, of the source indicated by
’s value.- Parameters:
series (pandas.Series) – Series of “pointer” values.
sources (collections.abc.Mapping or pandas.DataFrame) –
Dictionary with keys corresponding to
series
values. For each value ofseries
, a source will be chosen and its value at the current index will be used. If apandas.DataFrame
is passed, the column names will be used as keys and the column series as values.Note
Unless
series
and thesources
share the same index, thesources
will be reindexed withffill
method.inplace (bool) – If
True
, modify the series inplace.method (str) –
sources
is reindexed so that it shares the same index asseries
.method
is forwarded topandas.Series.reindex()
.
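The dereferencing idea can be sketched with plain pandas (illustrative, not the actual implementation; this assumes the pointer series and the sources already share the same index, so no reindexing is needed):

```python
# Each value of the pointer series names which source to read at that index.
import pandas as pd

pointers = pd.Series(['A', 'B', 'A'], index=[0, 1, 2])
sources = pd.DataFrame(
    {'A': ['A0', 'A1', 'A2'], 'B': ['B0', 'B1', 'B2']},
    index=[0, 1, 2],
)
dereferenced = pd.Series(
    [sources.at[i, p] for i, p in pointers.items()],
    index=pointers.index,
)
```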
- lisa.datautils.df_dereference(df, col, pointer_col=None, sources=None, inplace=False, **kwargs)[source]
Similar to
series_dereference()
.Example:
df = pd.DataFrame({ 'ptr': ['A', 'B'], 'A' : ['A1', 'A2'], 'B' : ['B1', 'B2'], }) df = df_dereference(df, 'dereferenced', pointer_col='ptr') # ptr A B dereferenced # 0 A A1 B1 A1 # 1 B A2 B2 B2
- Parameters:
df (pandas.DataFrame) – Dataframe to act on.
col (str) – Name of the column to create.
pointer_col (str or None) – Name of the column containing “pointer” values. Defaults to the same value as
col
.sources (collections.abc.Mapping or pandas.DataFrame) – Same meaning as in
series_dereference()
. If omitted,df
is used.inplace (bool) – If
True
, the dataframe is modified inplace.
- Variable keyword arguments:
Forwarded to
series_dereference()
.
- class lisa.datautils.SignalDesc(event, fields)[source]
Bases:
object
Define a signal to be used by various signal-oriented APIs.
- Parameters:
- classmethod from_event(*args, **kwargs)[source]
Attention
Deprecated since version 3.0.
from_event()
is deprecated and will be removed in version 4.0: No new signals will be added to this list, use explicit signal description where appropriate in the Trace API
- lisa.datautils.series_convert(series, dtype, nullable=None)[source]
Convert a
pandas.Series
with a best effort strategy.Nullable types may be used if necessary and possible, otherwise
object
dtype will be used.- Parameters:
series (pandas.Series) – Series of another type than the target one. Strings are allowed.
dtype (str or collections.abc.Callable) –
dtype to convert to. If it is a string (like
"uint8"
), the following strategy will be used:Convert to the given dtype
If it failed, try converting to an equivalent nullable dtype
If it failed, try to parse it with an equivalent Python object constructor, and then convert it to the dtype.
If an integer dtype was requested, parsing as hex string will be attempted too
If it is a callable, it will be applied on the series, converting all values considered as nan by
pandas.isna()
intoNone
values. The result will haveobject
dtype. The callable has a chance to handle the conversion from nan itself.Note
In some cases, asking for an unsigned dtype might let through negative values, as there is no way to reliably distinguish between conversion failures reasons.
nullable (bool or None) –
If:
True
, use the nullable dtype equivalent of the requested dtype.None
, use the equivalent nullable dtype if there is any missing data, otherwise a non-nullable dtype will be used for lower memory consumption.
- lisa.datautils.df_convert_to_nullable(df)[source]
Convert the columns of the dataframe to their equivalent nullable dtype, when possible.
- Parameters:
df (pandas.DataFrame) – The dataframe to convert.
- Returns:
The dataframe with converted columns.
- lisa.datautils.df_find_redundant_cols(df, col, cols=None)[source]
Find the columns that are redundant to
col
, i.e. that can be computed asdf[x] = df[col].map(dict(...))
.- Parameters:
df (pandas.DataFrame) – Dataframe to analyse.
col (str) – Reference column
cols (str or None) – Columns to restrict the analysis to. If
None
, all columns are used.
Interactive notebooks utilities
Various utilities for interactive notebooks, plus some generic plot-related functions.
- lisa.notebook.COLOR_CYCLE = ['#377eb8', '#ff7f00', '#4daf4a', '#f781bf', '#a65628', '#984ea3', '#999999', '#e41a1c', '#dede00']
Colorblind-friendly cycle, see https://gist.github.com/thriveth/8560036
- class lisa.notebook.WrappingHBox(**kwargs: Any)[source]
Bases:
HBox
HBox that will overflow on multiple lines if the content is too large to fit on one line.
Public constructor
- lisa.notebook.axis_link_dataframes(axis, df_list, before=1, after=5, cursor_color='red', follow_cursor=False)[source]
Link some dataframes to an axis displayed in the interactive matplotlib widget.
- Parameters:
axis (matplotlib.axes.Axes) – Axis to link to.
df_list (list(pandas.DataFrame)) – List of pandas dataframe to link.
before (int) – Number of dataframe rows to display before the selected location.
after (int) – Number of dataframe rows to display after the selected location.
cursor_color (str) – Color of the vertical line added at the clicked location.
follow_cursor (bool) – If
True
, the cursor will be followed without the need to click.
When the user clicks on the graph, a vertical marker will appear and the dataframe slice will update to show the relevant row.
Note
This requires the matplotlib widget enabled using
%matplotlib widget
magic.
- lisa.notebook.axis_cursor_delta(axis, colors=('blue', 'green'), buttons=(MouseButton.LEFT, MouseButton.RIGHT))[source]
Display the time delta between two vertical lines drawn on clicks.
- Parameters:
axis (matplotlib.axes.Axes) – Axis to link to.
colors (list(str)) – List of colors to use for vertical lines.
buttons (list(matplotlib.backend_bases.MouseButton)) – Mouse buttons to use for each vertical line.
Note
This requires the matplotlib widget backend, enabled with the %matplotlib widget magic.
- lisa.notebook.interact_tasks(trace, tasks=None, kind=None)[source]
Decorator to make a block of code parametrized on a task that can be selected from a dropdown.
- Parameters:
trace (lisa.trace.Trace) – Trace object in use
tasks (list(int or str or lisa.analysis.tasks.TaskID) or None) – List of tasks that are available. See kind for an alternative way of specifying tasks.
kind (str or None) – Alternatively to tasks, a kind can be provided and the tasks will be selected from the trace for you. It can be:
rtapp: to select all rt-app tasks
all: to select all tasks.
Example:
trace = Trace('trace.dat')

# Allow selecting any rtapp task
@interact_tasks(trace, kind='rtapp')
def do_plot(task):
    trace.ana.load_tracking.plot_task_signals(task)
- lisa.notebook.make_figure(width, height, nrows, ncols, interactive=None, **kwargs)[source]
Make a matplotlib.figure.Figure and its axes.
- Parameters:
- Variable keyword arguments:
Forwarded to matplotlib.figure.Figure
- Returns:
A tuple of:
matplotlib.figure.Figure
matplotlib.axes.Axes as a scalar, an iterable (1D) or an iterable of iterables (2D matrix)
- lisa.notebook.plot_signal(series, name=None, interpolation=None, add_markers=True, vdim=None)[source]
Plot a signal using the holoviews library.
- Parameters:
series (pandas.Series) – Series of values to plot.
name (str or None) – Name of the signal. Defaults to the series name.
interpolation (str or None) – Interpolation type for the signal. Defaults to steps-post, which is the correct value for signals encoded as a series of updates.
add_markers (bool) – Add markers to the plot.
vdim (holoviews.core.dimension.Dimension) – Value axis dimension.
PELT signals simulations
- lisa.pelt.PELT_WINDOW = 0.001048576
PELT window in seconds.
- lisa.pelt.PELT_HALF_LIFE = 32
PELT half-life in number of windows.
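Together, these two constants define the filter's geometric decay: the per-window decay factor y satisfies y ** PELT_HALF_LIFE == 0.5, so a contribution loses half its weight every 32 windows (about 33.5 ms). A sketch of that relationship (standard PELT arithmetic, not lisa code):

```python
import math

# PELT constants as documented above
PELT_WINDOW = 1024 * 1024 * 1e-9  # 0.001048576 s, i.e. 1024 * 1024 ns
PELT_HALF_LIFE = 32               # in number of windows

# Per-window decay factor: y ** 32 == 0.5
y = 0.5 ** (1 / PELT_HALF_LIFE)

# Equivalent time constant of a first-order filter approximation (~48.4 ms)
tau = PELT_WINDOW * PELT_HALF_LIFE / math.log(2)
```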
- lisa.pelt.simulate_pelt(activations, init=0, index=None, clock=None, capacity=None, windowless=False, window=0.001048576, half_life=32, scale=1024)[source]
Simulate a PELT signal out of a series of activations.
- Parameters:
activations (pandas.Series) – Series of a task’s activations: 1 == running and 0 == sleeping.
init (float) – Initial value of the signal.
index (pandas.Index) – Optional index at which the PELT values should be computed. If None, a value will be computed when the task starts sleeping and when it wakes up. Note that there is no emulation of the scheduler tick updating the signal while it is running.
clock (pandas.Series) – Series of clock values to be used instead of the timestamp index.
capacity (pandas.Series or None) – Capacity of the CPU at all points. This is used to fix up the clock on enqueue and dequeue, since the clock is typically provided by a PELT event rather than by the enqueue or dequeue events. If no clock is passed at all, the CPU capacity will be used to create one from scratch based on the activations index values.
window (float) – PELT window in seconds.
windowless (bool) – If True, a windowless simulator is used. This avoids the artifacts of the windowing in PELT.
half_life (int) – PELT half-life in number of windows.
scale (float) – Scale of the signal, i.e. maximum value it can take.
Note
PELT windowing is not time-invariant, i.e. it depends on the absolute value of the timestamp. This means that the timestamp of the activations matters, and it is recommended to use the clock parameter to provide the actual clock used by PELT.
Also note that the kernel uses integer arithmetic with a different way of computing the signal. This means that the simulation cannot perfectly match the kernel’s signal.
- lisa.pelt.pelt_swing(period, duty_cycle, window=0.001048576, half_life=32, scale=1024, kind='peak2peak')[source]
Compute an approximation of the PELT signal swing for a given periodic task.
- Parameters:
period (float) – Period of the task in seconds.
duty_cycle (float) – Duty cycle of the task.
window (float) – PELT window in seconds.
half_life (int) – PELT half life, in number of windows.
scale (float) – PELT scale.
kind (str) –
One of:
peak2peak: the peak-to-peak swing of PELT.
above: the amplitude of the swing above the average value.
below: the amplitude of the swing below the average value.
Note
The PELT signal is approximated as a first order filter. This does not take into account the averaging inside a window, but the window is small enough in practice for that effect to be negligible.
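Under that first-order approximation, the steady-state swing has a closed form: during the running phase the signal rises toward the scale, during the sleeping phase it decays toward 0, and equating the two phases at steady state gives the peak and trough. A sketch of the derivation (illustrative only; first_order_swing is a hypothetical name, not the implementation of pelt_swing()):

```python
import math

def first_order_swing(period, duty_cycle, window=0.001048576,
                      half_life=32, scale=1024):
    """
    Steady-state peak-to-peak swing of a first-order filter driven by a
    periodic square wave (running for duty_cycle * period, then sleeping).
    """
    tau = window * half_life / math.log(2)              # filter time constant
    r_on = math.exp(-duty_cycle * period / tau)         # decay over running phase
    r_off = math.exp(-(1 - duty_cycle) * period / tau)  # decay over sleeping phase
    # At steady state:  peak   = scale * (1 - r_on) + trough * r_on
    #                   trough = peak * r_off
    peak = scale * (1 - r_on) / (1 - r_on * r_off)
    trough = peak * r_off
    return peak - trough

# A longer period leaves more time to drift away from the average,
# so the swing grows with the period
swing = first_order_swing(period=16e-3, duty_cycle=0.5)
assert 0 < swing < 1024
```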
- lisa.pelt.pelt_step_response(t, window=0.001048576, half_life=32, scale=1024)[source]
Compute an approximation of the PELT value at time t when subject to a step input (i.e. a continuously running task, with PELT starting at 0).
- lisa.pelt.pelt_settling_time(margin=1, init=0, final=1024, window=0.001048576, half_life=32, scale=1024)[source]
Compute an approximation of the PELT settling time.
- Parameters:
Note
The PELT signal is approximated as a first order filter. This does not take into account the averaging inside a window, but the window is small enough in practice for that effect to be negligible.
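Under the same first-order model, the settling time follows directly from the exponential step response, by solving |final - init| * exp(-t / tau) == margin for t. This sketch is illustrative only (first_order_settling_time is a hypothetical name and may not match the exact semantics of the margin parameter in pelt_settling_time()):

```python
import math

def first_order_settling_time(margin=1, init=0, final=1024,
                              window=0.001048576, half_life=32):
    """
    Time for a first-order filter to get within `margin` of `final`,
    starting from `init`.
    """
    tau = window * half_life / math.log(2)
    return tau * math.log(abs(final - init) / margin)

# Getting within 1/1024 of full scale takes log2(1024) == 10 half-lives,
# i.e. 320 windows, about 0.336 s
t = first_order_settling_time()
assert abs(t - 10 * 32 * 0.001048576) < 1e-9
```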
- lisa.pelt.kernel_util_mean(util, plat_info)[source]
Compute the mean of a utilization signal as output by the kernel.
- Parameters:
util (pandas.Series) – Series of utilization over time.
plat_info (lisa.platforms.platinfo.PlatformInfo) – Platform info of the kernel used to generate the utilization signal.
Warning
It is currently only fully accurate for a task with a 512 utilisation mean.
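For comparison, the plain (continuous-time) mean of a signal encoded as a series of updates is a time-weighted average, sketched below (time_weighted_mean is a hypothetical helper; the kernel's fixed-point windowed averaging that kernel_util_mean() models deviates from this, hence the warning above):

```python
def time_weighted_mean(times, values):
    """
    Mean of a signal encoded as a series of updates (steps-post):
    values[i] holds from times[i] until times[i + 1].
    The last value is ignored, since its holding duration is unknown.
    """
    total = sum(
        v * (t_next - t)
        for v, t, t_next in zip(values, times, times[1:])
    )
    return total / (times[-1] - times[0])

# Signal at 512 for 1 s, then 256 for 1 s: mean is (512 + 256) / 2
assert time_weighted_mean([0.0, 1.0, 2.0], [512, 256, 256]) == 384.0
```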
Sphinx documentation
- class lisa._doc.helpers.RecursiveDirective(name, arguments, options, content, lineno, content_offset, block_text, state, state_machine)[source]
Bases:
Directive
Base class helping nested parsing.
Options:
literal: If set, a literal block will be used, otherwise the text will be interpreted as reStructuredText.
- option_spec = {'literal': <function flag>}
Mapping of option names to validator functions.
- class lisa._doc.helpers.ExecDirective(name, arguments, options, content, lineno, content_offset, block_text, state, state_machine)[source]
Bases:
RecursiveDirective
reStructuredText directive to execute the specified python code and insert the output into the document:
.. exec::

    import sys
    print(sys.version)
Options:
literal: If set, a literal block will be used, otherwise the text will be interpreted as reStructuredText.
- has_content = True
May the directive have content?
- class lisa._doc.helpers.RunCommandDirective(name, arguments, options, content, lineno, content_offset, block_text, state, state_machine)[source]
Bases:
RecursiveDirective
reStructuredText directive to execute the specified command and insert the output into the document:
.. run-command::
    :capture-stderr:
    :ignore-error:
    :literal:

    exekall --help
Options:
literal: If set, a literal block will be used, otherwise the text will be interpreted as reStructuredText.
capture-stderr: If set, stderr will be captured in addition to stdout.
ignore-error: The return status of the command will be ignored. Otherwise, a failing command will raise an exception and building the documentation will fail.
- has_content = True
May the directive have content?
- option_spec = {'capture-stderr': <function flag>, 'ignore-error': <function flag>, 'literal': <function flag>}
Mapping of option names to validator functions.
- lisa._doc.helpers.autodoc_process_analysis_events(app, what, name, obj, options, lines)[source]
Append the list of required trace events
- lisa._doc.helpers.autodoc_skip_member_handler(app, what, name, obj, skip, options, default_exclude_members=None)[source]
Enforce the “exclude-members” option, even in cases where it seems to be ignored by Sphinx.
- class lisa._doc.helpers.DocPlotConf(conf=None, src='user', add_default_src=True)[source]
Bases:
SimpleMultiSrcConf
Analysis plot method arguments configuration for the documentation.
doc-plot-conf: Plot methods configuration
plots (Mapping): Mapping of function qualnames to their settings.
- Example YAML:
# Plot methods configuration
doc-plot-conf:
    # Mapping of function qualnames to their settings
    # type: Mapping
    plots: _
Warning
Arbitrary code can be executed while loading an instance from a YAML or Pickle file. To include untrusted data in YAML, use the !untrusted tag along with a string.
- STRUCTURE = <lisa.conf.TopLevelKeyDesc object>
- class Plots
Bases:
HideExekallID
Mapping of function qualnames to their settings
- lisa._doc.helpers.autodoc_process_analysis_plots(app, what, name, obj, options, lines, plots)[source]
- lisa._doc.helpers.autodoc_process_analysis_methods(app, what, name, obj, options, lines)[source]
Append the list of required trace events
- lisa._doc.helpers.find_dead_links(content)[source]
Look for HTTP URLs in content and return a dict mapping each URL to the error encountered when trying to open it.
- lisa._doc.helpers.check_dead_links(filename)[source]
Check filename for broken links, and raise an exception if any are found.
- lisa._doc.helpers.get_deprecated_map()[source]
Get the mapping of deprecated names with some metadata.
- lisa._doc.helpers.get_deprecated_table()[source]
Get reStructuredText tables with titles for all the deprecated names in lisa.
- lisa._doc.helpers.get_xref_type(obj)[source]
Infer the Sphinx type a cross reference to obj should have.
For example, :py:class:`FooBar` has the type py:class.
- lisa._doc.helpers.get_subclasses_bullets(cls, abbrev=True, style=None, only_leaves=False)[source]
Return a formatted bullet list of the subclasses of the given class, including a short description for each.
- lisa._doc.helpers.make_changelog(repo, since=None, head_release_name='Next release', fmt='rst')[source]
Generate a reStructuredText changelog to be included in the documentation.
Note
The git repository cannot be a shallow clone, as the changelog is extracted from the git history.
Note
The refs/notes/changelog notes are concatenated at the end of commit messages, and the resulting text is parsed. This allows fixing up changelog entries when markers were forgotten, without rewriting the history.
Version
- lisa.version.VERSION_TOKEN = 'git-32168f3cc4c33a6512aef14975782cce0662bd6f'
Unique token related to code version.
When the LISA_DEVMODE environment variable is set to 1, the git sha1 followed by the uncommitted patch’s sha1 will be used, so that the LISA code can be uniquely identified even in a development state.