Introduction to Graph Signal Processing

Sybernix
Dec 4, 2021 · 7 min read
Source: https://epfl-lts2.github.io/gspbox-html/

Graph Signal Processing (GSP) is, as its name implies, signal processing applied to signals that live on graphs. Classical signal processing deals with signals that are ordered along some axis. For example, the alternating current (AC) waveform can be represented as follows.

AC Wave. Source: https://www.electrically4u.com/ac-waveform-types-and-advantages/

You can observe that there is some kind of ordering along the time axis. However, if we consider a graph signal, we will not have such ordering.

For example, let’s consider a state with 8 cities. Suppose we need to represent the number of automobiles in each city. We can represent this information using a graph signal as follows.

Graph Signal. Source: https://link.springer.com/chapter/10.1007/978-3-030-03574-7_1

Each light green vertex represents a city and its location in the state. The edges connecting the cities can be taken to represent the roads between the connected cities. The red vertical line at each vertex represents the number of automobiles in that city, which is a scalar value. So the higher the red line, the higher the number of automobiles in that city.

What we have seen above is a simple planar graph. However, we can represent complex models with higher-dimensional graphs. For example, the human body can be represented as a 3D point cloud graph. Sensor networks and even images can be represented as graphs. This is shown in the image below.

Source: https://sipi.usc.edu/~ortega/Papers/GraphSignalProcessing-ICIP-2013-Ortega.pdf

In this blog, I assume that the reader has some knowledge of the following mathematical and signal processing concepts. You can Google these topics and learn about them if you need to!

  • Vectors and Matrices
  • Eigenvectors and eigenvalues
  • Fourier Transform
  • Convolution

Formal Definition of Graph Signal

Now let’s dive into the mathematical representation of graphs and graph signals. We can represent a graph as G = (V, E, W), where V is the set of vertices, E is the set of edges, and W is the weight matrix. Edges can have different weights, and the weight matrix stores these values.

Let the cardinality of V be N (|V| = N). This means that there are N vertices in our graph.

Suppose that each vertex i has a signal value g(i). In other words, g is a function that takes a vertex of the graph and gives a real number as output.

Since there are N vertices, g(i) takes N different values, one per vertex. The graph signal can be defined as the ensemble of these N individual values as follows.

Graph signal g = [g(1), g(2), …, g(N)]ᵀ, a vector in ℝᴺ
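As a concrete and purely illustrative sketch, assuming NumPy and made-up automobile counts for the 8-city example above, a graph signal is simply a vector with one real value per vertex:

```python
import numpy as np

# Hypothetical graph signal on N = 8 vertices (cities):
# g[i] holds the number of automobiles in city i (made-up values).
g = np.array([120.0, 300.0, 80.0, 150.0, 210.0, 95.0, 60.0, 175.0])

N = len(g)        # |V| = 8
print(N, g[0])    # number of vertices and the signal value at the first vertex
```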

Graph Laplacian

The graph Laplacian is a crucial element of graph signal processing because the eigenvectors and eigenvalues of this matrix lay the foundation for frequency-domain analysis of graph signals. The graph Laplacian L is defined as L = D − W, where W is the weight matrix we saw in the graph definition and D is the degree matrix of the graph. The degree matrix is a diagonal matrix, i.e., only its diagonal elements have non-zero values. Its i-th diagonal element di gives the sum of the weights of all edges that touch the i-th vertex.

This can be better understood using a ring graph as follows.

Ring graph

This ring graph has 8 vertices. So N = 8. It also has 8 edges. Let’s assume that each edge has a weight of 1. Then its weight matrix W will be as follows.

Weight matrix W of the ring graph

If we number the vertices from 1 to 8, the i-th vertex is connected to the (i-1)-th and (i+1)-th vertices, each with weight 1. The first and the last (eighth) vertices are also connected. This is represented in the weight matrix above.

The degree matrix D of the ring graph will be as follows.

Degree matrix D of the ring graph

Each vertex has two edges that touch it, and each edge has a weight of 1. So the sum of the weights of all edges touching any vertex is 2. In other words, the degree of every vertex is 2. This is represented in the degree matrix above.

Using these two matrices, we can calculate the Laplacian according to the equation L = D - W as follows.

Graph Laplacian L
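If you would like to verify this numerically, here is a minimal NumPy sketch (assuming the unit edge weights used above) that builds W, D, and L = D − W for the 8-vertex ring graph:

```python
import numpy as np

N = 8  # number of vertices in the ring graph

# Weight matrix W: vertex i is connected to vertices (i - 1) and (i + 1),
# with indices wrapping around so the first and last vertices are connected.
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = 1.0
    W[i, (i - 1) % N] = 1.0

# Degree matrix D: diagonal matrix whose i-th entry is the sum of edge weights at vertex i.
D = np.diag(W.sum(axis=1))

# Graph Laplacian.
L = D - W

print(np.diag(D))  # every vertex has degree 2
print(L[0])        # first row: [ 2. -1.  0.  0.  0.  0.  0. -1.]
```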

Eigenvectors and Eigenvalues of Graph Laplacian

For any matrix A, we can calculate the eigenvectors and eigenvalues by solving the equation (A − λI)v = 0, where I is the identity matrix and 0 is the zero vector. The values of λ for which this equation has non-zero solutions are the eigenvalues, and the corresponding non-zero vectors v are the eigenvectors. You can watch the video at [3] to understand the concept of eigenvectors and eigenvalues. I will leave the calculation of eigenvalues and eigenvectors as an exercise for interested readers.

Let’s suppose the eigenvalues of our graph Laplacian are λ0, λ1, …, λN-1 and the eigenvectors are u0, u1, …, uN-1. Since I cannot type subscript and superscript notation in Medium, please refer to the image below to see the actual notation we will use for the eigenvalues and eigenvectors.
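If you would like to check the exercise numerically, here is a small sketch using NumPy's symmetric eigensolver on the ring-graph Laplacian built earlier (the eigenvalues come back in ascending order, and the columns of the second return value are the eigenvectors):

```python
import numpy as np

# Ring-graph Laplacian from the previous snippet.
N = 8
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = W[i, (i - 1) % N] = 1.0
L = np.diag(W.sum(axis=1)) - W

# eigh handles symmetric matrices; eigvals[l] is lambda_l, eigvecs[:, l] is u_l.
eigvals, eigvecs = np.linalg.eigh(L)

print(np.round(eigvals, 3))        # from lambda_0 = 0 up to lambda_7 = 4
print(np.round(eigvecs[:, 0], 3))  # u_0 is constant across all vertices
```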

Graph Fourier Transform

Now, using the eigenvalues and eigenvectors of the graph Laplacian and the graph signal g, we can define the graph Fourier transform as follows.

Graph Fourier transform: ĝ(λl) = Σ g(i) ul(i), summing over the vertices i = 1, …, N

If we consider the l-th frequency λl, the amplitude at λl is given by ĝ(λl) (the term on the left-hand side of the equation above).

The expression g(i) ul(i) inside the sum is the product of the value of the graph signal at the i-th vertex and the value of the l-th eigenvector at the i-th vertex.

Doing the above product for all vertices from 1 to N and taking the sum will give the amplitude of the l-th frequency.

The inverse graph Fourier transform can be defined as follows.

Inverse graph Fourier transform: g(i) = Σ ĝ(λl) ul(i), summing over the frequencies l = 0, …, N−1

This inverse transform expands the original graph signal g in terms of the Laplacian eigenvectors, with the graph Fourier coefficients ĝ(λl) as the weights.
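Putting the two transforms together, here is a hedged sketch (NumPy, using the ring graph and a made-up signal): since the eigenvectors of a symmetric Laplacian form an orthonormal basis, the forward transform is a projection onto them and the inverse transform recovers the original signal exactly.

```python
import numpy as np

# Ring-graph Laplacian and its eigendecomposition, as in the earlier snippets.
N = 8
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = W[i, (i - 1) % N] = 1.0
L = np.diag(W.sum(axis=1)) - W
eigvals, U = np.linalg.eigh(L)   # columns of U are the eigenvectors u_l

# A made-up graph signal g, one value per vertex.
g = np.array([120.0, 300.0, 80.0, 150.0, 210.0, 95.0, 60.0, 175.0])

# Graph Fourier transform: g_hat(lambda_l) = sum_i g(i) u_l(i), i.e. g_hat = U^T g.
g_hat = U.T @ g

# Inverse graph Fourier transform: g(i) = sum_l g_hat(lambda_l) u_l(i), i.e. g = U g_hat.
g_reconstructed = U @ g_hat

print(np.allclose(g, g_reconstructed))  # True: the signal is recovered exactly
```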

Graph Spectral Domain

In the classical Fourier transform, we intuitively understand that the spectral or frequency domain gives the amplitudes of sinusoidal waves of different frequencies that can be combined to form the original waveform. We need a similar justification to convince ourselves that the graph Fourier transform we defined above actually provides a spectral-domain representation.

In order to do so, we will look at the eigenvectors. We will use a random sensor network graph to facilitate our analysis and inspect a few of its Laplacian eigenvectors, say u0, u1, and u50. The first eigenvector u0 is always constant for a connected graph: its value is the same for every vertex. This can be taken to represent the DC component in classical signals.

u0

The blue vertical lines represent the value of the eigenvector at each vertex. Note that this value is constant for u0. Next, we will look at the eigenvector u1, shown below.

u1

Note that the black lines represent negative values and the blue lines represent positive values. We can think of frequency as the number of edges in the graph for which the vertex on one end has a positive value and the vertex on the other end has a negative value. Formally, these are the edges (i, j) ∈ E for which ul(i) · ul(j) < 0.

Let’s call such an edge a zero-crossing. The number of zero-crossings of an eigenvector can then be taken to represent some sort of frequency of that eigenvector. Now, let’s look at u50 below.

u50

We can see that u50 has a lot more zero-crossings than u1.

If we plot the number of zero-crossings in eigenvectors u0, u1, …, uN-1 against the eigenvalues λ0, λ1, ….., λN-1, we will see a graph as follows.

This shows that as λ increases, the number of zero-crossings of the corresponding eigenvector increases. So we can think of the eigenvalues as ordering the eigenvectors from low frequency to high frequency.
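To check this behaviour on the ring graph, here is a small sketch (NumPy, same Laplacian as before) that counts zero-crossings for each eigenvector:

```python
import numpy as np

# Ring-graph Laplacian and eigendecomposition, as before.
N = 8
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = W[i, (i - 1) % N] = 1.0
L = np.diag(W.sum(axis=1)) - W
eigvals, U = np.linalg.eigh(L)

def zero_crossings(u, W):
    """Count edges whose endpoints have opposite signs in the vector u."""
    count = 0
    n = len(u)
    for i in range(n):
        for j in range(i + 1, n):               # each undirected edge counted once
            if W[i, j] != 0 and u[i] * u[j] < 0:
                count += 1
    return count

for l in range(N):
    print(f"lambda_{l} = {eigvals[l]:.3f}  zero-crossings = {zero_crossings(U[:, l], W)}")
```

For repeated eigenvalues the exact counts depend on which eigenvectors the solver returns, but the overall increasing trend is still visible.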

Now we know that we can represent a graph signal both in the vertex domain and in the graph spectral domain. Let’s look at the Minnesota road graph in the two domains.

Minnesota road graph in vertex domain (left) and in graph spectral domain (right)

In this blog, I have presented a brief introduction to graph signal processing. You can learn further about operations on graph signals, such as filtering, convolution, translation, modulation, and dilation, using the references linked below.

References

[1] https://www.youtube.com/watch?v=7DV1TU_jtBg
[2] Shuman et al., “The Emerging Field of Signal Processing on Graphs,” IEEE Signal Processing Magazine, 2013.
[3] https://www.youtube.com/watch?v=PFDu9oVAE-g
