# Fourier Transform

New Fourier Transform Promises Hundred Times Faster Data Speeds

A research team at the Massachusetts Institute of Technology has produced yet another, even faster way to compute the Fourier transform, a mathematical foundation of most data compression and processing algorithms. In January, Dina Katabi, Haitham Hassanieh, Piotr Indyk, and Eric Price announced a new, faster method for handling patterned data. The new algorithm, called the Sparse **Fourier Transform**, could replace the existing Fast Fourier Transform, the mathematical principle underlying modern data processing and compression.

## What Is the Fourier Transform?

The Fourier transform re-expresses a function of time as a function of frequency, called its frequency spectrum; the theory guarantees that this can always be done for suitable functions. For example, a musical chord made up of individual notes can be expressed as a set of amplitudes and phases. The original function of time and its frequency spectrum are called the time-domain representation and the frequency-domain representation. Each value of the spectrum is a complex number, the complex amplitude, combining a magnitude and a phase component. The term *Fourier transform* refers to both the transform operation itself and the resulting complex amplitude function. The Fourier and inverse Fourier transforms are used throughout mathematics and engineering, in differential equations, physics, chemistry, biology, medicine, and many other fields.
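To make the chord example concrete, here is a minimal sketch of the discrete Fourier transform in plain Python. The function name `dft`, the signal length, and the chosen tone frequencies are illustrative assumptions, not taken from the research: the point is only that a two-tone "chord" decomposes into complex amplitudes whose magnitudes peak at the component frequencies.

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform: O(n^2) sums of complex exponentials."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

# A toy "chord": two sine waves, at 3 and 5 cycles over the sampling window.
n = 64
signal = [math.sin(2 * math.pi * 3 * t / n) + 0.5 * math.sin(2 * math.pi * 5 * t / n)
          for t in range(n)]

spectrum = dft(signal)
# The magnitudes peak at bins 3 and 5 (and their mirror images 61 and 59,
# which appear because the input signal is real-valued).
peaks = sorted(sorted(range(n), key=lambda k: -abs(spectrum[k]))[:4])
print(peaks)  # [3, 5, 59, 61]
```

Each entry of `spectrum` is one complex amplitude: `abs(spectrum[k])` is the magnitude of frequency `k`, and `cmath.phase(spectrum[k])` is its phase.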

## Fast Fourier Transform

A computer-friendly variant of the transform, the Fast Fourier Transform (FFT), was invented in the mid-1960s. MP3, JPEG, and many other compression formats owe their effectiveness to the FFT. It is also used in quantum mechanics, solid-state physics, and wave mechanics, and the aviation, automotive, and rail industries rely on FFT-based analysis for their smooth-running vehicles.
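The mid-1960s breakthrough, commonly attributed to Cooley and Tukey, cuts the transform's cost from O(n²) to O(n log n) by recursively splitting the input into its even- and odd-indexed samples. A minimal radix-2 sketch in Python (assuming the input length is a power of two; the function name is illustrative):

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; input length must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    # Recursively transform the even- and odd-indexed halves,
    # then combine them with "twiddle factor" rotations.
    even = fft(x[0::2])
    odd = fft(x[1::2])
    twiddles = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n // 2)]
    return ([even[k] + twiddles[k] * odd[k] for k in range(n // 2)] +
            [even[k] - twiddles[k] * odd[k] for k in range(n // 2)])

print(fft([1, 1, 1, 1]))  # a constant signal puts all its energy in bin 0
```

The recursion reuses each half-size transform twice, which is where the n log n saving comes from compared with the naive n² sum.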

## Sparse Fourier Transform

The Sparse Fourier Transform (SFT), the new algorithm from the MIT group, promises up to 100 times faster media processing and compression on the same hardware by exploiting the sparsity of data: an average media signal typically carries only a fraction of the information it could contain. In other words, most of the spectrum is sparse, with only a handful of significant frequencies, even though the raw data consumes a lot of space. Team member Katabi believes that sparsity is everywhere: "it's in nature around us; it's in video signals; it's in audio signals". Previously, it was thought that any algorithm tailored to sparse data would be far less capable than the well-tested FFT, but the SFT proves otherwise.
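The sparsity idea can be illustrated with a toy example (the signal and threshold below are assumptions chosen for illustration, not the team's test data): a signal built from only three tones has a spectrum in which almost every coefficient is zero, and it is precisely this structure that the SFT exploits to skip work that a full FFT must do.

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform, used here just to inspect the spectrum."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

n = 256
# A "sparse" signal: only 3 tones out of 256 possible frequencies.
tones = [7, 40, 101]
signal = [sum(math.cos(2 * math.pi * f * t / n) for f in tones) for t in range(n)]

spectrum = dft(signal)
# Count the bins that carry real energy; each real cosine shows up
# at frequency f and at its mirror image n - f.
significant = [k for k in range(n) if abs(spectrum[k]) > 1e-6 * n]
print(len(significant))  # 6 significant bins out of 256
```

A sparse-transform algorithm only needs to locate and estimate those few significant bins, rather than computing all 256 coefficients.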

The new algorithm treats media data as a complex but well-structured whole. Although the SFT is not designed to handle every kind of data, it can take shortcuts unavailable to general-purpose algorithms, making processing more efficient. A faster transform means faster calculations, which in turn means a lighter load on processing hardware. The new algorithm could trigger a revolution in the mobile media device market: the same amount of computing power would process and deliver higher-quality media faster, internet servers could manage individual data streams at a new level without upgrading their resources, and switches and routers could inspect internet traffic more closely.

References: MIT, Piotr Indyk, Dina Katabi, Eric Price, Haitham Hassanieh, Webb Chappell