torchscalers

April 10, 2026

torchscalers is a Python package providing a set of feature scalers for PyTorch pipelines. Unlike scikit-learn scalers, every scaler in torchscalers is a torch.nn.Module subclass. Fitted statistics are stored as module buffers, which means they travel with the model to any device, are saved and restored with every torch.save / torch.load checkpoint, and plug directly into nn.Sequential pipelines.
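To make the buffer-based design concrete, here is a minimal sketch of what a scaler built this way could look like. It is illustrative only: the class body, the fit() method, and the constructor signature are assumptions, not the actual torchscalers API. The key point it demonstrates is that register_buffer puts the fitted statistics into state_dict() and makes them follow .to(device).

```python
import torch
from torch import nn


class ZScoreScaler(nn.Module):
    """Minimal sketch of a buffer-backed z-score scaler (hypothetical API).

    Statistics are stored as buffers, not parameters: they are saved and
    restored with state_dict() and moved by .to(device), as described above.
    """

    def __init__(self, num_features: int, eps: float = 1e-8):
        super().__init__()
        self.eps = eps
        # Buffers appear in state_dict() but receive no gradients.
        self.register_buffer("mean", torch.zeros(num_features))
        self.register_buffer("std", torch.ones(num_features))

    @torch.no_grad()
    def fit(self, x: torch.Tensor) -> "ZScoreScaler":
        # Per-feature statistics over the batch dimension.
        self.mean.copy_(x.mean(dim=0))
        self.std.copy_(x.std(dim=0, unbiased=False))
        return self

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x - self.mean) / (self.std + self.eps)
```

Because it is an nn.Module with a standard forward(), a scaler like this drops straight into nn.Sequential ahead of the rest of the model.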

Available Scalers

Scaler             Description
ZScoreScaler       Standardise to zero mean and unit variance.
MinMaxScaler       Scale to [0, 1] using per-feature min/max.
MaxAbsScaler       Scale to [-1, 1] by dividing by the maximum absolute value.
RobustScaler       Scale using median and IQR; robust to outliers.
ShiftScaleScaler   Apply a user-specified (x + shift) * scale transformation.
LogScaler          Apply a log transformation: log(x + eps).
PerDomainScaler    Apply a separate scaler instance per string domain ID.
MixedDomainScaler  Apply a different scaler type per string domain ID.
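The per-domain scalers in the table dispatch on a string domain ID. A plausible way to implement that pattern, sketched here under assumed names (MinMaxScaler, PerDomainScaler, and their fit() methods are illustrative, not the real torchscalers classes), is an nn.ModuleDict keyed by domain ID, so every domain's buffers are checkpointed and moved to device together.

```python
import torch
from torch import nn


class MinMaxScaler(nn.Module):
    # Minimal [0, 1] scaler, included only to make the sketch self-contained.
    def __init__(self, num_features: int, eps: float = 1e-8):
        super().__init__()
        self.eps = eps
        self.register_buffer("min", torch.zeros(num_features))
        self.register_buffer("max", torch.ones(num_features))

    @torch.no_grad()
    def fit(self, x: torch.Tensor) -> "MinMaxScaler":
        self.min.copy_(x.min(dim=0).values)
        self.max.copy_(x.max(dim=0).values)
        return self

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x - self.min) / (self.max - self.min + self.eps)


class PerDomainScaler(nn.Module):
    """Hypothetical per-domain dispatch: one scaler instance per string
    domain ID, held in an nn.ModuleDict so all fitted statistics live in
    one state_dict and follow .to(device)."""

    def __init__(self, make_scaler, domains):
        super().__init__()
        self.scalers = nn.ModuleDict({d: make_scaler() for d in domains})

    def fit(self, x: torch.Tensor, domain: str) -> "PerDomainScaler":
        self.scalers[domain].fit(x)
        return self

    def forward(self, x: torch.Tensor, domain: str) -> torch.Tensor:
        # Route the batch to the scaler fitted for this domain.
        return self.scalers[domain](x)
```

A MixedDomainScaler would follow the same shape, except make_scaler would be a per-domain mapping of scaler types rather than a single factory.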

Links