nemos.basis.HistoryConv#

class nemos.basis.HistoryConv(window_size, label='HistoryConv', conv_kwargs=None)[source]#

Bases: ConvBasisMixin, HistoryBasis

Basis for history effects.

This basis includes the history of the samples as a predictor, reshaped into a 2D array. It is intended for including the raw sample history as a predictor.

Parameters:
  • window_size (int) – History window as the number of samples.

  • label (Optional[str]) – The label of the basis, intended to be descriptive of the task variable being processed. For example: velocity, position, spike_counts.

  • conv_kwargs (Optional[dict]) – Additional keyword arguments passed to nemos.convolve.create_convolutional_predictor(); these arguments are used to change the default behavior of the convolution, for example changing the predictor_causality, which by default is set to "causal". Note that one cannot change the default value for the axis parameter: the basis assumes that the convolution axis is axis=0.
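
For example, a minimal sketch of overriding the convolution behavior at construction (assuming create_convolutional_predictor() accepts a predictor_causality keyword, as described above):

>>> from nemos.basis import HistoryConv
>>> # request an acausal convolution instead of the default "causal"
>>> basis = HistoryConv(10, conv_kwargs={"predictor_causality": "acausal"})
>>> basis.conv_kwargs["predictor_causality"]
'acausal'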

Attributes

conv_kwargs

The convolutional kwargs.

input_shape

label

Label for the basis.

mode

Mode of operation, either "conv" or "eval".

n_basis_funcs

Read-only property for history basis.

n_output_features

Number of features returned by the basis.

window_size

Duration of the convolutional kernel in number of samples.

__init__(window_size, label='HistoryConv', conv_kwargs=None)[source]#
Parameters:
  • window_size (int)

  • label (str | None)

  • conv_kwargs (dict | None)

Methods

__init__(window_size[, label, conv_kwargs])

compute_features(xi)

Convolve basis functions with input time series.

evaluate_on_grid(n_samples)

Evaluate the basis set on a grid of equi-spaced sample points.

get_params([deep])

From scikit-learn, get parameters by inspecting init.

set_input_shape(xi)

Set the expected input shape for the basis object.

set_kernel()

Prepare or compute the convolutional kernel for the basis functions.

set_params(**params)

Set the parameters of this estimator.

setup_basis(*xi)

Set all basis states.

split_by_feature(x[, axis])

Decompose an array along a specified axis into sub-arrays based on the number of expected inputs.

to_transformer()

Turn the Basis into a TransformerBasis for use with scikit-learn.

__add__(other)#

Add two Basis objects together.

Parameters:

other (Basis) – The other Basis object to add.

Returns:

The resulting Basis object.

Return type:

AdditiveBasis
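
For instance, a minimal sketch, assuming HistoryConv can be composed with other bases like the rest of the nemos basis classes:

>>> import numpy as np
>>> from nemos.basis import HistoryConv
>>> add_basis = HistoryConv(5, label="x") + HistoryConv(5, label="y")
>>> X = add_basis.compute_features(np.random.randn(20), np.random.randn(20))
>>> X.shape
(20, 10)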

__iter__()#

Makes the basis iterable. Re-implemented for additive bases.

__mul__(other)#

Multiply two Basis objects together.

Parameters:

other (Basis) – The other Basis object to multiply.

Return type:

MultiplicativeBasis

Returns:

The resulting Basis object.

__pow__(exponent)#

Exponentiation of a Basis object.

Define the power of a basis by repeatedly applying the method __mul__. The exponent must be a positive integer.

Parameters:

exponent (int) – Positive integer exponent

Return type:

MultiplicativeBasis

Returns:

The product of the basis with itself “exponent” times. Equivalent to self * self * ... * self.

Raises:
  • TypeError – If the provided exponent is not an integer.

  • ValueError – If the integer is zero or negative.
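
A minimal sketch, assuming multiplicative composition is supported for this basis:

>>> from nemos.basis import HistoryConv
>>> basis = HistoryConv(3)
>>> squared = basis ** 2  # equivalent to basis * basis
>>> type(squared).__name__
'MultiplicativeBasis'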

__sklearn_clone__()#

Clone the basis while preserving attributes related to input shapes.

This method ensures that input shape attributes (e.g., _input_shape_product, _input_shape_) are preserved during cloning. Reinitializing the class as in the regular sklearn clone would drop these attributes, rendering cross-validation unusable.

Return type:

Basis
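
For example, a sketch assuming a scikit-learn version whose clone dispatches to this method:

>>> from sklearn.base import clone
>>> from nemos.basis import HistoryConv
>>> basis = HistoryConv(5).set_input_shape(2)
>>> cloned = clone(basis)
>>> cloned.n_output_features  # input-shape information is preserved
10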

compute_features(xi)[source]#

Convolve basis functions with input time series.

A bank of basis filters is convolved with the input data. All the dimensions except for the sample-axis are flattened, so that the method always returns a matrix.

For example, if inputs are of shape (num_samples, 2, 3), the output will be (num_samples, num_basis_funcs * 2 * 3).

Parameters:

*xi (ArrayLike) – The input data over which to apply the basis transformation. The samples can be passed as multiple arguments, each representing a different dimension for multivariate inputs.

Return type:

TsdFrame | NDArray

Notes

This method is intended to be one-to-one mappable to the sklearn transformer transform method. This means that, for the method to be callable, all the state attributes have to be pre-computed in a method that is mappable to fit, which for us is _fit_basis. It is fundamental that both methods behave like the corresponding transformer method, with the only difference being the input structure: a single (X, y) pair for the transformer, a number of time series for the Basis.

Examples

>>> import numpy as np
>>> from nemos.basis import HistoryConv
>>> # Generate data
>>> num_samples = 1000
>>> X = np.random.normal(size=(num_samples, ))  # raw time series
>>> basis = HistoryConv(10)
>>> features = basis.compute_features(X)  # basis transformed time series
>>> features.shape
(1000, 10)
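
The flattening described above can be illustrated with a multivariate input (a short sketch continuing the example; axis 0 is always the sample axis):

>>> basis_multi = HistoryConv(10)
>>> X_multi = np.random.normal(size=(num_samples, 2, 3))  # 2-by-3 inputs per sample
>>> basis_multi.compute_features(X_multi).shape
(1000, 60)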
property conv_kwargs#

The convolutional kwargs.

Keyword arguments passed to nemos.convolve.create_convolutional_predictor().

evaluate_on_grid(n_samples)[source]#

Evaluate the basis set on a grid of equi-spaced sample points.

Parameters:

n_samples (int) – The number of points used to construct the identity matrix.

Return type:

Tuple[NDArray, NDArray]

Returns:

  • X – Array of shape (n_samples,) containing equi-spaced samples between 0 and 1.

  • basis_funcs – The identity matrix, i.e. np.eye(window_size, n_samples).

Examples

>>> import matplotlib.pyplot as plt
>>> from nemos.basis import HistoryConv
>>> window_size=100
>>> basis = HistoryConv(window_size=window_size)
>>> sample_points, basis_values = basis.evaluate_on_grid(window_size)
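
The returned arrays have the shapes described above (a short check continuing the example):

>>> sample_points.shape
(100,)
>>> basis_values.shape
(100, 100)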
get_params(deep=True)#

From scikit-learn, get parameters by inspecting init.

Parameters:

deep (bool, default=True) – If True, return the parameters of this estimator and of any contained sub-objects that are estimators.

Return type:

dict

Returns:

out:

A dictionary containing the parameters. Key is the parameter name, value is the parameter value.
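
For example, a minimal sketch (the returned keys are the __init__ parameters of the basis):

>>> from nemos.basis import HistoryConv
>>> params = HistoryConv(5, label="history").get_params()
>>> sorted(params)
['conv_kwargs', 'label', 'window_size']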

property input_shape: NDArray#
property label: str#

Label for the basis.

property mode#

Mode of operation, either "conv" or "eval".

property n_basis_funcs: tuple | None#

Read-only property for history basis.

property n_output_features: int | None#

Number of features returned by the basis.

Notes

The number of output features can be determined only when the number of inputs provided to the basis is known. Therefore, before the first call to compute_features, this property will return None. After that call, or after setting the input shape with set_input_shape, n_output_features will be available.
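
For instance, a short sketch of the behavior described above:

>>> from nemos.basis import HistoryConv
>>> basis = HistoryConv(10)
>>> print(basis.n_output_features)  # input shape not yet known
None
>>> _ = basis.set_input_shape(1)
>>> basis.n_output_features
10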

set_input_shape(xi)[source]#

Set the expected input shape for the basis object.

This method configures the shape of the input data that the basis object expects. xi can be specified as an integer, a tuple of integers, or derived from an array. The method also calculates the total number of input features and output features based on the number of basis functions.

Parameters:

xi (int | tuple[int, …] | NDArray) –

The input shape specification.

  • An integer: Represents the dimensionality of the input. A value of 1 is treated as scalar input.

  • A tuple: Represents the exact input shape excluding the first axis (sample axis). All elements must be integers.

  • An array: The shape is extracted, excluding the first axis (assumed to be the sample axis).

Raises:

ValueError – If a tuple is provided and it contains non-integer elements.

Returns:

Returns the instance itself to allow method chaining.

Return type:

self

Notes

All state attributes that depend on the input must be set in this method in order for the basis API to work correctly. In particular, this method is called by setup_basis, which is equivalent to fit for a transformer. If any input-dependent state is not set in this method, then compute_features (equivalent to fit_transform) will break.

Examples

>>> import nemos as nmo
>>> import numpy as np
>>> basis = nmo.basis.HistoryConv(5)
>>> # Configure with an integer input:
>>> _ = basis.set_input_shape(3)
>>> basis.n_output_features
15
>>> # Configure with a tuple:
>>> _ = basis.set_input_shape((4, 5))
>>> basis.n_output_features
100
>>> # Configure with an array:
>>> x = np.ones((10, 4, 5))
>>> _ = basis.set_input_shape(x)
>>> basis.n_output_features
100
set_kernel()#

Prepare or compute the convolutional kernel for the basis functions.

This method is called to prepare the basis functions for convolution operations in subclasses. It computes a kernel based on the basis functions that will be used for convolution with the input data. The specifics of kernel computation depend on the subclass implementation and the nature of the basis functions.

Returns:

The instance itself, modified to include the computed kernel. This allows for method chaining and integration into transformation pipelines.

Return type:

self

Notes

Subclasses implementing this method should detail the specifics of how the kernel is computed and how the input parameters are utilized.
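
For HistoryConv, a minimal sketch (assuming the kernel is stored in the kernel_ attribute and, for a history basis, is the identity matrix of size window_size):

>>> from nemos.basis import HistoryConv
>>> basis = HistoryConv(5).set_kernel()
>>> basis.kernel_.shape
(5, 5)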

set_params(**params)#

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as Pipeline). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Parameters:

**params (dict) – Estimator parameters.

Returns:

self – Estimator instance.

Return type:

estimator instance
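
For example, updating the window size in place (a minimal sketch):

>>> from nemos.basis import HistoryConv
>>> basis = HistoryConv(5)
>>> basis = basis.set_params(window_size=10)
>>> basis.window_size
10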

setup_basis(*xi)#

Set all basis states.

This method corresponds to the sklearn transformer fit. Like fit, it must receive the input and it must set all basis states, i.e. kernel_ and all states relative to the input shape. The difference between this method and the transformer fit is in the expected input structure: the transformer fit requires the inputs to be concatenated in a 2D array, while here each input is provided as a separate time series for each basis element.

Parameters:

xi (NDArray) – Input arrays.

Return type:

Basis

Returns:

The basis, ready for evaluation.
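
A short sketch of the expected input structure (each time series is passed as a separate argument rather than concatenated):

>>> import numpy as np
>>> from nemos.basis import HistoryConv
>>> basis = HistoryConv(5).setup_basis(np.random.randn(20))
>>> basis.n_output_features
5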

split_by_feature(x, axis=1)[source]#

Decompose an array along a specified axis into sub-arrays based on the number of expected inputs.

This function takes an array (e.g., a design matrix or model coefficients) and splits it along a designated axis.

How it works:

  • If the basis expects an input shape (n_samples, n_inputs), then the feature axis length will be total_n_features = n_inputs * n_basis_funcs. This axis is reshaped into dimensions (n_inputs, n_basis_funcs).

  • If the basis expects an input of shape (n_samples,), then the feature axis length will be total_n_features = n_basis_funcs. This axis is reshaped into (1, n_basis_funcs).

For example, if the input array x has shape (1, 2, total_n_features, 4, 5), then after applying this method, it will be reshaped into (1, 2, n_inputs, n_basis_funcs, 4, 5).

The specified axis (axis) determines where the split occurs, and all other dimensions remain unchanged. See the example section below for the most common use cases.

Parameters:
  • x (NDArray) –

    The input array to be split, representing concatenated features, coefficients, or other data. The shape of x along the specified axis must match the total number of features generated by the basis, i.e., self.n_output_features.

    Examples:

    • For a design matrix: (n_samples, total_n_features)

    • For model coefficients: (total_n_features,) or (total_n_features, n_neurons).

  • axis (int, optional) – The axis along which to split the features. Defaults to 1. Use axis=1 for design matrices (features along columns) and axis=0 for coefficient arrays (features along rows). All other dimensions are preserved.

Raises:

ValueError – If the shape of x along the specified axis does not match self.n_output_features.

Returns:

A dictionary where:

  • Key: Label of the basis.

  • Value: the array reshaped to: (..., n_inputs, n_basis_funcs, ...)

Return type:

dict

Examples

>>> import numpy as np
>>> from nemos.basis import HistoryConv
>>> from nemos.glm import GLM
>>> # Define a history basis
>>> basis = HistoryConv(5, label="feature")
>>> # Generate a sample input array and compute features
>>> inp = np.random.randn(20)
>>> X = basis.compute_features(inp)
>>> # Split the feature matrix along axis 1
>>> split_features = basis.split_by_feature(X, axis=1)
>>> for feature, arr in split_features.items():
...     print(f"{feature}: shape {arr.shape}")
feature: shape (20, 5)
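
With a multivariate input, the reshaping described above becomes visible (a short sketch continuing the example):

>>> basis_2d = HistoryConv(5, label="feature")
>>> X_2d = basis_2d.compute_features(np.random.randn(20, 2))
>>> X_2d.shape
(20, 10)
>>> basis_2d.split_by_feature(X_2d, axis=1)["feature"].shape
(20, 2, 5)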
to_transformer()#

Turn the Basis into a TransformerBasis for use with scikit-learn.

Return type:

TransformerBasis

Examples

Jointly cross-validating basis and GLM parameters with scikit-learn.

>>> import numpy as np
>>> import nemos as nmo
>>> from sklearn.pipeline import Pipeline
>>> from sklearn.model_selection import GridSearchCV
>>> # load some data
>>> X, y = np.random.normal(size=(30, 1)), np.random.poisson(size=30)
>>> basis = nmo.basis.RaisedCosineLinearEval(10).set_input_shape(1).to_transformer()
>>> glm = nmo.glm.GLM(regularizer="Ridge", regularizer_strength=1.)
>>> pipeline = Pipeline([("basis", basis), ("glm", glm)])
>>> param_grid = dict(
...     glm__regularizer_strength=(0.1, 0.01, 0.001, 1e-6),
...     basis__n_basis_funcs=(3, 5, 10, 20, 100),
... )
>>> gridsearch = GridSearchCV(
...     pipeline,
...     param_grid=param_grid,
...     cv=5,
... )
>>> gridsearch = gridsearch.fit(X, y)
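
A HistoryConv-specific sketch along the same lines, assuming window_size can be searched over in the same way (n_basis_funcs is read-only for this basis):

>>> basis = nmo.basis.HistoryConv(10).set_input_shape(1).to_transformer()
>>> pipeline = Pipeline([("basis", basis), ("glm", glm)])
>>> param_grid = dict(basis__window_size=(5, 10, 20))
>>> gridsearch = GridSearchCV(pipeline, param_grid=param_grid, cv=3).fit(X, y)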
property window_size#

Duration of the convolutional kernel in number of samples.