ARLA/CLUSTER: MIT speeds up the FFT algorithm
João Gonçalves Costa
joao.a.costa ctt.pt
Friday, 20 January 2012 - 12:49:15 WET
The faster-than-fast Fourier transform
MIT researchers have found a way to increase the speed of one of the most important algorithms in signal processing: the fast Fourier transform (FFT).
The FFT is a method for representing an irregular signal as a combination of pure frequencies. It is ubiquitous in signal processing, but it can also be used to compress image and audio files, solve differential equations, and price stock options, among other things.
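As a minimal sketch of that idea, the snippet below (my illustration, with arbitrarily chosen tones and sample rate) builds a signal from two pure frequencies and recovers them with NumPy's standard FFT:

```python
import numpy as np

# Mix two pure tones: 50 Hz at full amplitude, 120 Hz at half amplitude.
fs = 1024                       # samples per second
t = np.arange(fs) / fs          # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The FFT re-expresses the signal as weights on pure frequencies.
magnitudes = np.abs(np.fft.rfft(signal))

# With fs = 1024 and a 1-second window, bin index equals frequency in Hz,
# so the two heaviest bins are exactly the tones we mixed in.
dominant = sorted(np.argsort(magnitudes)[-2:].tolist())
print(dominant)  # [50, 120]
```

The sparse-FFT work described below targets exactly this kind of signal, where only a few frequency bins carry significant weight.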
Ever since the FFT was proposed in the mid-1960s, people have wondered whether an even faster algorithm could be found.
At the Association for Computing Machinery's Symposium on Discrete Algorithms (SODA) this week, a group of MIT researchers will present a new algorithm that, in a large range of practically important cases, improves on the fast Fourier transform. Under some circumstances, the improvement can be dramatic - a tenfold increase in speed. The new algorithm could be particularly useful for image compression, enabling, say, smartphones to wirelessly transmit large video files without draining their batteries or consuming their monthly bandwidth allotments.
The new algorithm - which associate professor Dina Katabi and professor Piotr Indyk, both of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), developed together with their students Eric Price and Haitham Hassanieh - relies on two key ideas. The first is to divide a signal into narrower slices of bandwidth, sized so that a slice will generally contain only one frequency with a heavy weight.
In signal processing, the basic tool for isolating particular frequencies is a filter. But filters tend to have blurry boundaries:
One range of frequencies will pass through the filter more or less intact; frequencies just outside that range will be somewhat attenuated; frequencies farther outside will be attenuated still more; and so on, until you reach the frequencies that are filtered out almost perfectly.
If it so happens that the one frequency with a heavy weight is at the edge of the filter, however, it could end up so attenuated that it can't be identified. So the researchers' first contribution was to find a computationally efficient way to combine filters so that they overlap, ensuring that no frequencies inside the target range will be unduly attenuated, but that the boundaries between slices of spectrum are still fairly sharp.
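The effect of those blurry boundaries, and why overlap helps, can be shown with a toy filter bank (my own Gaussian construction, not the researchers' actual filter design): a tone sitting exactly on a filter boundary is badly attenuated by a non-overlapping bank, but survives once adjacent filters overlap.

```python
import numpy as np

n_bins = 256
freqs = np.arange(n_bins)

def soft_filter(center, width):
    # Gaussian frequency response: ~1 at the center, fading toward the edges.
    return np.exp(-((freqs - center) ** 2) / (2.0 * width ** 2))

centers = range(32, n_bins, 64)                   # filter centers, 64 bins apart
narrow = [soft_filter(c, 16) for c in centers]    # narrow, non-overlapping bank
wide = [soft_filter(c, 32) for c in centers]      # wider bank, neighbors overlap

edge = 64  # a frequency exactly on the boundary between two filters
narrow_gain = max(f[edge] for f in narrow)  # ~0.14: heavily attenuated
wide_gain = max(f[edge] for f in wide)      # ~0.61: mostly preserved
print(round(narrow_gain, 2), round(wide_gain, 2))
```

The researchers' contribution was a computationally efficient way to get this overlap while keeping the slice boundaries sharp; the Gaussian shapes above only illustrate the boundary problem itself.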
Once they've isolated a slice of spectrum, however, the researchers still have to identify the most heavily weighted frequency in that slice. In the SODA paper, they do this by repeatedly cutting the slice of spectrum into smaller pieces and keeping only those in which most of the signal power is concentrated. But in the paper 'Nearly Optimal Sparse Fourier Transform', they describe a much more efficient technique, which borrows a signal-processing strategy from 4G cellular networks. Frequencies are generally represented as up-and-down squiggles, but they can also be thought of as oscillations; by sampling the same slice of bandwidth at different times, the researchers can determine where the dominant frequency is in its oscillatory cycle.
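One way to see the time-shift idea (a simplified sketch under my own assumptions, not the paper's exact procedure): if a slice is dominated by a single complex tone x[n] = a·exp(2πi·f·n/N), then the phase advance between two adjacent samples is 2π·f/N, which reveals f directly with no transform over the whole slice.

```python
import numpy as np

N = 1024
f_true = 217                   # the one heavy frequency in this slice
n = np.arange(N)
x = 0.8 * np.exp(2j * np.pi * f_true * n / N)

# The ratio of two samples one step apart is exp(2j*pi*f/N), so its phase
# pins down where the tone is in its oscillatory cycle.
phase_step = np.angle(x[1] / x[0])
f_est = round((phase_step % (2 * np.pi)) * N / (2 * np.pi))
print(f_est)  # 217
```

Two samples suffice here only because the slice is noiseless and exactly one tone is present; the paper handles the general case.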
Read the full MIT press release at
http://web.mit.edu/newsoffice/2012/faster-fourier-transforms-0118.html
Read the paper 'Nearly Optimal Sparse Fourier Transform' at
http://arxiv.org/PS_cache/arxiv/pdf/1201/1201.2501v1.pdf