PreprocManager
- class torch_ecg._preprocessors.PreprocManager(*pps: Optional[Tuple[torch_ecg._preprocessors.base.PreProcessor, ...]], random: bool = False)
Bases: torch_ecg.utils.misc.ReprMixin
Manager of preprocessors.
This class manages a sequence of preprocessors. It can be used to add preprocessors to the manager and to apply them to a signal.
- Parameters
pps (Tuple[PreProcessor], optional) – The sequence of preprocessors to be added to the manager.
random (bool, default False) – Whether to apply the preprocessors in random order.
Examples
import torch
from torch_ecg.cfg import CFG
from torch_ecg._preprocessors import PreprocManager

config = CFG(
    random=False,
    resample={"fs": 500},
    bandpass={"filter_type": "fir"},
    normalize={"method": "min-max"},
)
ppm = PreprocManager.from_config(config)
sig = torch.randn(12, 80000).numpy()
sig, fs = ppm(sig, 200)
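The manager can also be constructed directly from preprocessor instances instead of a configuration. The following is a minimal sketch, assuming the BandPass and Normalize preprocessors exported by torch_ecg._preprocessors accept the arguments shown; random=True makes the manager apply the preprocessors in random order.

import torch
from torch_ecg._preprocessors import BandPass, Normalize, PreprocManager

# build the manager from preprocessor instances;
# random=True applies them in random order on each call
ppm = PreprocManager(BandPass(), Normalize(method="min-max"), random=True)
sig = torch.randn(12, 80000).numpy()
sig, fs = ppm(sig, 500)  # processed signal and its sampling frequency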
- add_(pp: torch_ecg._preprocessors.base.PreProcessor, pos: int = -1) → None
Add a (custom) preprocessor to the manager.
This method is preferred over directly manipulating the internal list of preprocessors via PreprocManager.preprocessors.append(pp).
- Parameters
pp (PreProcessor) – The PreProcessor to be added.
pos (int, default -1) – The position at which to insert the preprocessor; should be >= -1, with -1 indicating the end of the sequence.
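As an illustration of add_, the sketch below inserts a hypothetical custom preprocessor (a per-lead DC-offset remover, not part of the library) at the front of the pipeline; it assumes that subclassing PreProcessor and implementing apply(sig, fs) is sufficient for a custom preprocessor.

import numpy as np
from torch_ecg._preprocessors import PreprocManager
from torch_ecg._preprocessors.base import PreProcessor

class RemoveDC(PreProcessor):
    """Hypothetical preprocessor that subtracts the per-lead mean."""

    def apply(self, sig: np.ndarray, fs: int):
        # center each lead around zero; the sampling frequency is unchanged
        return sig - sig.mean(axis=-1, keepdims=True), fs

ppm = PreprocManager()
ppm.add_(RemoveDC(), pos=0)  # insert at the front; the default pos=-1 appends
sig, fs = ppm(np.random.randn(12, 5000), 500)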
- classmethod from_config(config: dict) → torch_ecg._preprocessors.preproc_manager.PreprocManager
Create a new instance of PreprocManager from a configuration.
- Parameters
config (dict) – The configuration of the preprocessors, preferably an OrderedDict.
- Returns
ppm – A new instance of PreprocManager.
- Return type