Randomized Singular Value Decomposition
By Luis Carmona | 01/30/2024
In the world of Numerical Linear Algebra, finding algorithms that can efficiently and effectively handle big datasets is crucial. This is where Randomized Singular Value Decomposition (RSVD) comes into play: a high-quality approximation to the conventional SVD at a fraction of the computational expense, especially for large datasets. SVD, as you probably know, is a classic result of Linear Algebra that factors any matrix into a rotation, then a stretch or compression, and then another rotation.
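To make the idea concrete, here is a minimal sketch of the standard randomized range-finder approach in NumPy. The function name, parameters (target rank `k`, oversampling `p`, power iterations `n_iter`), and the test matrix are my own illustrative choices, not taken from the slides:

```python
import numpy as np

def rsvd(A, k, p=5, n_iter=2):
    """Sketch of randomized SVD: rank-k approximation of A.

    k: target rank; p: oversampling amount; n_iter: power iterations.
    All parameter choices here are illustrative defaults.
    """
    m, n = A.shape
    # Sample the range of A with a random Gaussian test matrix.
    Omega = np.random.default_rng(1).standard_normal((n, k + p))
    Y = A @ Omega
    # Optional power iterations sharpen the singular-value decay.
    for _ in range(n_iter):
        Y = A @ (A.T @ Y)
    # Orthonormal basis for the sampled range.
    Q, _ = np.linalg.qr(Y)
    # Project A onto that basis and take the exact SVD of the
    # much smaller matrix B.
    B = Q.T @ A
    U_tilde, S, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_tilde
    return U[:, :k], S[:k], Vt[:k, :]

# Usage: approximate a 500x300 matrix of exact rank 20.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 300))
U, S, Vt = rsvd(A, k=20)
err = np.linalg.norm(A - U @ np.diag(S) @ Vt) / np.linalg.norm(A)
```

The expensive SVD is only ever taken of the small projected matrix `B`, which is where the computational savings come from when the matrix is large but approximately low-rank.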
You’ll be able to move through the explanation slide by slide—just hit the arrow keys to go forward or back.
Looking ahead, a next project on this subject is to implement RSVD on GPUs. As shown at the end of the slides, it would be a fantastic way to use parallel processing to improve efficiency, so that we get results faster or can even handle larger data samples.
This project was part of a class inspired by J. Nathan Kutz and Steven L. Brunton's book "Data-Driven Science and Engineering". It was an amazing way of learning applications, mostly related to Machine Learning, from a Linear Algebra perspective.
For those eager to delve further into the concepts and applications discussed, I highly recommend visiting the YouTube channels of Steven L. Brunton at @Eigensteve and J. Nathan Kutz at @nathankutzUW. Both channels are incredible sources of knowledge, offering brilliant insights into the world of data science and engineering through engaging and informative content.