Algebraic signal processing (ASP) is an emerging area of theoretical signal processing (SP). In the algebraic theory of signal processing, a set of filters is treated as an (abstract) algebra, a set of signals is treated as a module or vector space, and convolution is treated as an algebra representation. The advantage of algebraic signal processing is its generality and portability.
In the original formulation of algebraic signal processing by Puschel and Moura, the signals are collected in an [math]\displaystyle{ \mathcal{A} }[/math]-module for some algebra [math]\displaystyle{ \mathcal{A} }[/math] of filters, and filtering is given by the action of [math]\displaystyle{ \mathcal{A} }[/math] on the [math]\displaystyle{ \mathcal{A} }[/math]-module.[1]
Let [math]\displaystyle{ K }[/math] be a field, for instance the complex numbers, and let [math]\displaystyle{ \mathcal{A} }[/math] be a [math]\displaystyle{ K }[/math]-algebra (i.e. a vector space over [math]\displaystyle{ K }[/math] with a binary operation [math]\displaystyle{ \ast: \mathcal{A} \otimes \mathcal{A} \to \mathcal{A} }[/math] that is linear in both arguments) treated as a set of filters. Suppose [math]\displaystyle{ \mathcal{M} }[/math] is a vector space representing a set of signals. A representation of [math]\displaystyle{ \mathcal{A} }[/math] consists of an algebra homomorphism [math]\displaystyle{ \rho: \mathcal{A} \to \mathrm{End}(\mathcal{M}) }[/math], where [math]\displaystyle{ \mathrm{End}(\mathcal{M}) }[/math] is the algebra of linear transformations [math]\displaystyle{ T: \mathcal{M} \to \mathcal{M} }[/math] under composition (equivalent, in the finite-dimensional case, to matrix multiplication). For convenience, we write [math]\displaystyle{ \rho_{a} }[/math] for the endomorphism [math]\displaystyle{ \rho(a) }[/math]. To be an algebra homomorphism, [math]\displaystyle{ \rho }[/math] must not only be a linear transformation, but also satisfy the property

[math]\displaystyle{ \rho_{a \ast b} = \rho_{a}\circ\rho_b \quad \forall a, b \in \mathcal{A}. }[/math]

Given a signal [math]\displaystyle{ x \in \mathcal{M} }[/math], convolution of the signal by a filter [math]\displaystyle{ a \in \mathcal{A} }[/math] yields a new signal [math]\displaystyle{ \rho_a(x) }[/math].

Some additional terminology is needed from the representation theory of algebras. A subset [math]\displaystyle{ \mathcal{G} \subseteq \mathcal{A} }[/math] is said to generate the algebra if every element of [math]\displaystyle{ \mathcal{A} }[/math] can be written as a polynomial in the elements of [math]\displaystyle{ \mathcal{G} }[/math]. The image [math]\displaystyle{ \rho_g }[/math] of a generator [math]\displaystyle{ g \in \mathcal{G} }[/math] is called a shift operator. In practically all examples, convolutions are formed as polynomials in [math]\displaystyle{ \mathrm{End}(\mathcal{M}) }[/math] generated by shift operators. However, this is not necessarily the case for a representation of an arbitrary algebra.
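To make these definitions concrete, the following sketch (not part of the original formulation) represents the polynomial algebra [math]\displaystyle{ \mathbb{C}[t] }[/math] on [math]\displaystyle{ \mathbb{C}^n }[/math] by substituting a cyclic shift matrix for the indeterminate and checks the homomorphism property numerically; the choice of shift matrix and the filter coefficients are illustrative assumptions.

```python
import numpy as np

# A minimal sketch, assuming the algebra A = C[t] acts on M = C^n by substituting
# a chosen "shift" matrix S for the indeterminate t (an assumed, illustrative choice).
n = 8
S = np.roll(np.eye(n), 1, axis=0)          # cyclic shift matrix: (S x)_m = x_{m-1}

def rho(coeffs, S):
    """Representation rho: send the polynomial sum_k coeffs[k] t^k to sum_k coeffs[k] S^k."""
    out = np.zeros_like(S)
    P = np.eye(S.shape[0])
    for c in coeffs:
        out = out + c * P
        P = P @ S
    return out

# Two filters a(t), b(t) given by their coefficient lists (hypothetical values)
a = [1.0, 0.5]            # a(t) = 1 + 0.5 t
b = [0.0, 1.0, -0.25]     # b(t) = t - 0.25 t^2

# Homomorphism property: rho(a * b) = rho(a) composed with rho(b)
ab = np.polynomial.polynomial.polymul(a, b)   # coefficients of the product polynomial
assert np.allclose(rho(ab, S), rho(a, S) @ rho(b, S))
```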
In discrete signal processing (DSP), the signal space is the set of square-summable complex-valued sequences [math]\displaystyle{ \mathcal{M} = \ell^2(\mathbb{Z}) }[/math], i.e. sequences with bounded energy. This means the infinite series [math]\displaystyle{ \sum_{n = -\infty}^{\infty} |(x)_n|^2 \lt \infty }[/math] where [math]\displaystyle{ |\cdot| }[/math] is the modulus of a complex number. The shift operator is given by the linear endomorphism [math]\displaystyle{ (S x)_n = (x)_{n-1} }[/math]. The filter space is the algebra of Laurent polynomials with complex coefficients [math]\displaystyle{ \mathcal{A} = \mathbb{C}[z^{-1},z] }[/math], and convolution is given by [math]\displaystyle{ \rho_{h} = \sum_{k = -\infty}^{\infty} h_k S^k }[/math] where [math]\displaystyle{ h(z) = \sum_{k=-\infty}^{\infty} h_k z^k }[/math] is an element of the algebra (with only finitely many nonzero coefficients). Filtering a signal [math]\displaystyle{ x }[/math] by [math]\displaystyle{ h }[/math] then yields [math]\displaystyle{ (y)_n = \sum_{k=-\infty}^{\infty} h_k (x)_{n-k} }[/math] because [math]\displaystyle{ (S^k x)_n = (x)_{n-k} }[/math].
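The sketch below illustrates this on a truncated, finite-support version of the [math]\displaystyle{ \ell^2(\mathbb{Z}) }[/math] setting; the particular signal values and filter taps are hypothetical.

```python
import numpy as np

# A small sketch (assumed finite truncation of the l^2(Z) setting): filtering a
# finite-support signal by a finite-support filter via powers of the shift.
x = np.array([1.0, 2.0, 0.0, -1.0, 3.0])   # hypothetical signal values x_0, ..., x_4
h = np.array([0.5, 0.25, 0.25])            # hypothetical filter taps h_0, h_1, h_2

def shift(x, k):
    """(S^k x)_n = x_{n-k}, with zeros outside the recorded support."""
    y = np.zeros_like(x)
    if k < len(x):
        y[k:] = x[:len(x) - k]
    return y

# y_n = sum_k h_k x_{n-k}, i.e. rho_h = sum_k h_k S^k applied to x
y = sum(h_k * shift(x, k) for k, h_k in enumerate(h))

# Agrees with the usual convolution, restricted to the support of x
assert np.allclose(y, np.convolve(x, h)[:len(x)])
```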
A weighted graph is an undirected graph [math]\displaystyle{ \mathcal{G} = (\mathcal{V}, \mathcal{E}) }[/math] with a pseudometric on the node set [math]\displaystyle{ \mathcal{V} }[/math], written [math]\displaystyle{ a_{ij} }[/math]. A graph signal is simply a real-valued function on the set of nodes of the graph. In graph neural networks, graph signals are sometimes called features. The signal space is the set of all graph signals [math]\displaystyle{ \mathcal{M} = \R^\mathcal{V} }[/math], where [math]\displaystyle{ \mathcal{V} }[/math] is the set of [math]\displaystyle{ n = |\mathcal{V}| }[/math] nodes in [math]\displaystyle{ \mathcal{G} = (\mathcal{V}, \mathcal{E}) }[/math]. The filter algebra is the algebra of polynomials in one indeterminate [math]\displaystyle{ \mathcal{A} = \mathbb{R}[t] }[/math]. There are a few possible choices for a graph shift operator (GSO). The (un)normalized weighted adjacency matrix [math]\displaystyle{ [A]_{ij} = a_{ij} }[/math] is a popular choice, as is the (un)normalized graph Laplacian [math]\displaystyle{ [L]_{ij} = \begin{cases} \sum_{k = 1}^{n}a_{ik} & i = j \\ -a_{ij} & i \neq j \end{cases} }[/math]. The choice depends on performance and design considerations. If [math]\displaystyle{ S }[/math] is the GSO, then a graph convolution is the linear transformation [math]\displaystyle{ \rho_h = \sum_{k=0}^{\infty} h_k S^k }[/math] for some [math]\displaystyle{ h(t) = \sum_{k=0}^{\infty} h_k t^k }[/math], and convolution of a graph signal [math]\displaystyle{ \mathbf{x}: \mathcal{V} \to \mathbb{R} }[/math] by a filter [math]\displaystyle{ h(t) }[/math] yields a new graph signal [math]\displaystyle{ \mathbf{y} = \left(\sum_{k=0}^{\infty} h_k S^k \right) \cdot \mathbf{x} }[/math].
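As an illustration, the sketch below applies a low-order graph filter to a signal on a small hypothetical weighted graph, using the adjacency matrix as the GSO; the weights, filter coefficients, and signal values are made up for the example.

```python
import numpy as np

# A sketch of a graph convolution on an assumed 4-node weighted graph.
# The adjacency matrix A is used as the graph shift operator (GSO) S here;
# the graph Laplacian L would be used in exactly the same way.
A = np.array([[0.0, 1.0, 0.0, 2.0],
              [1.0, 0.0, 3.0, 0.0],
              [0.0, 3.0, 0.0, 1.0],
              [2.0, 0.0, 1.0, 0.0]])       # symmetric weights a_ij (hypothetical)
L = np.diag(A.sum(axis=1)) - A             # unnormalized graph Laplacian

S = A                                      # chosen GSO
h = [1.0, -0.5, 0.1]                       # filter coefficients h_0, h_1, h_2 (hypothetical)
x = np.array([1.0, 0.0, 2.0, -1.0])        # a graph signal, one value per node

# y = (sum_k h_k S^k) x  -- evaluated iteratively, reusing S^k x
y = np.zeros_like(x)
Skx = x.copy()
for h_k in h:
    y += h_k * Skx
    Skx = S @ Skx
```

Since [math]\displaystyle{ S^k \mathbf{x} }[/math] only mixes signal values within [math]\displaystyle{ k }[/math] hops of each node, a filter of degree [math]\displaystyle{ K }[/math] is a local operation on the graph.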
Other mathematical objects with their own proposed signal-processing frameworks also give rise to algebraic signal models. These objects include quivers,[2] graphons,[3] semilattices,[4] finite groups and Lie groups,[5] among others.
In the framework of representation theory, relationships between two representations of the same algebra are described with intertwining maps, which in the context of signal processing translate to transformations of signals that respect the algebra structure. Suppose [math]\displaystyle{ \rho: \mathcal{A} \to \mathrm{End}(\mathcal{M}) }[/math] and [math]\displaystyle{ \rho': \mathcal{A} \to \mathrm{End}(\mathcal{M}') }[/math] are two different representations of [math]\displaystyle{ \mathcal{A} }[/math]. An intertwining map is a linear transformation [math]\displaystyle{ \alpha: \mathcal{M} \to \mathcal{M}' }[/math] such that
[math]\displaystyle{ \alpha \circ \rho_a = \rho'_a \circ \alpha \quad \forall a \in \mathcal{A} }[/math]
Intuitively, this means that filtering a signal by [math]\displaystyle{ a }[/math] and then transforming it with [math]\displaystyle{ \alpha }[/math] is equivalent to first transforming the signal with [math]\displaystyle{ \alpha }[/math] and then filtering by [math]\displaystyle{ a }[/math]. The z-transform[1] is a prototypical example of an intertwining map.
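As a finite-dimensional stand-in for the z-transform on [math]\displaystyle{ \ell^2(\mathbb{Z}) }[/math] (an assumption made only to obtain a runnable example), the discrete Fourier transform intertwines filtering by polynomials in the cyclic shift with pointwise (diagonal) filtering; the sketch below checks this numerically.

```python
import numpy as np

# A numerical check of the intertwining property, assuming the finite cyclic
# analogue of the z-transform: the DFT matrix F intertwines polynomials in the
# cyclic shift S with the same polynomials in a diagonal matrix D.
n = 6
S = np.roll(np.eye(n), 1, axis=0)                      # cyclic shift, (S x)_m = x_{m-1}
F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)  # (unnormalized) DFT matrix
D = np.diag(np.exp(-2j * np.pi * np.arange(n) / n))    # eigenvalues of S seen through F

h = [1.0, 0.3, -0.2]                                   # hypothetical filter coefficients

def poly(coeffs, M):
    """Evaluate sum_k coeffs[k] M^k."""
    out = np.zeros_like(M, dtype=complex)
    P = np.eye(M.shape[0], dtype=complex)
    for c in coeffs:
        out += c * P
        P = P @ M
    return out

rho_h     = poly(h, S)     # filtering in the "time" representation
rho_h_hat = poly(h, D)     # filtering in the "frequency" representation

# Intertwining: alpha o rho_a = rho'_a o alpha, with alpha = F
assert np.allclose(F @ rho_h, rho_h_hat @ F)
```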
Inspired by a recent perspective that popular graph neural network (GNN) architectures are in fact convolutional neural networks (CNNs),[6] recent work has focused on developing novel neural network architectures from the algebraic point of view.[7][8] An algebraic neural network is a composition of algebraic convolutions, possibly with multiple features and feature aggregations, and nonlinearities.
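A minimal single-feature sketch of such a composition is given below, assuming a graph convolution as the algebraic convolution and a pointwise nonlinearity; the layer structure, function names, and weights are illustrative rather than a specific published architecture.

```python
import numpy as np

# A minimal sketch of an "algebraic" (here: graph) neural network with a single
# input and output feature per layer: an algebraic convolution followed by a
# pointwise nonlinearity. Names and structure are hypothetical.
def algebraic_layer(x, S, h, sigma=np.tanh):
    """Apply rho_h(x) = sum_k h[k] S^k x, then a pointwise nonlinearity sigma."""
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)
    for h_k in h:
        y += h_k * Skx
        Skx = S @ Skx
    return sigma(y)

def algebraic_nn(x, S, filters):
    """Compose one algebraic layer per coefficient list in `filters`."""
    for h in filters:
        x = algebraic_layer(x, S, h)
    return x

# Example use (hypothetical): two layers with 3-tap filters on some chosen GSO S
# y = algebraic_nn(x, S, filters=[[1.0, -0.5, 0.1], [0.5, 0.2, 0.0]])
```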
Original source: https://en.wikipedia.org/wiki/Algebraic_signal_processing