Hello,
Consider the implementation of a “naive” Fourier transform in C++.
That is, a piece of code that directly implements the discrete Fourier
Transform (DFT) as written in the formula, using two nested for
loops.
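For concreteness, such a naive implementation might look like this (a minimal sketch; the function name and types are my own):

```cpp
#include <complex>
#include <cmath>
#include <vector>

// Direct transcription of the DFT formula:
//   X[k] = sum over n of x[n] * exp(-2*pi*i*k*n / N)
// Two nested loops, O(N^2) complex multiply-adds.
std::vector<std::complex<double>> dft(const std::vector<std::complex<double>>& x) {
    const std::size_t N = x.size();
    const double pi = std::acos(-1.0);
    std::vector<std::complex<double>> X(N);
    for (std::size_t k = 0; k < N; ++k) {       // one output bin per pass
        for (std::size_t n = 0; n < N; ++n) {   // full scan of the input per bin
            const double angle = -2.0 * pi * double(k) * double(n) / double(N);
            X[k] += x[n] * std::polar(1.0, angle);
        }
    }
    return X;
}
```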
Now consider the Fast Fourier Transform (FFT) algorithm. The basic
idea behind the FFT is that redundant calculations are avoided, and
results are reused across intermediate arrays by mapping them in
a butterfly pattern.
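A minimal recursive radix-2 Cooley-Tukey sketch of that butterfly reuse (assuming the input length is a power of two; the names are my own):

```cpp
#include <complex>
#include <cmath>
#include <vector>

// Recursive radix-2 Cooley-Tukey FFT (input length must be a power of two).
// Each level splits the input into even- and odd-indexed halves, transforms
// both, then combines them with the butterfly:
//   X[k]       = E[k] + w^k * O[k]
//   X[k + N/2] = E[k] - w^k * O[k]
// so each half-size result is reused for two outputs.
void fft(std::vector<std::complex<double>>& a) {
    const std::size_t N = a.size();
    if (N <= 1) return;
    const double pi = std::acos(-1.0);
    std::vector<std::complex<double>> even(N / 2), odd(N / 2);
    for (std::size_t i = 0; i < N / 2; ++i) {
        even[i] = a[2 * i];
        odd[i]  = a[2 * i + 1];
    }
    fft(even);
    fft(odd);
    for (std::size_t k = 0; k < N / 2; ++k) {
        const std::complex<double> w =
            std::polar(1.0, -2.0 * pi * double(k) / double(N));
        a[k]         = even[k] + w * odd[k];  // butterfly: both halves reused
        a[k + N / 2] = even[k] - w * odd[k];
    }
}
```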
The question is: could a compiler effectively output an FFT algorithm
when given a naive DFT implementation?
Of course, I am not trying to argue that compilers can arbitrarily reduce
any algorithm to a more optimal one, but this particular case seems
plausible to me.
Compilers are good at re-ordering instructions in order to
avoid redundant calculation, especially when given a structured source,
and so I wonder how closely a compiled DFT would actually resemble
an FFT.
I’m also wondering what constraints would need to be added to a
compiler in order to come closer to an FFT, and if these constraints
might optimise a whole class of algorithms.
Best,
Natanijel Vasic
At least for C and C++, unless there are specific constraints that pin down
the exact computation — such as printing intermediate results an FFT never
computes, or, more likely, strict floating-point semantics (i.e. the default,
-fno-fast-math) — compilers are free to make this transformation under the
abstract machine. Only the DFT's outputs are observable, not the inner
workings.
IMHO, a compiler that transforms a generic DFT into FFT without having
been programmed for that specific pattern is entering AI territory. I've
noted in the past[1] that I've been a little freaked out at just how far
optimizer reasoning can go. Both GCC and Clang can optimize non-tail
recursive algorithms into iterative algorithms — without it being some
specific algorithm — and your DFT example is really just a sophisticated
version of this idea.
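For instance, here's a minimal example of the kind of non-tail recursion I mean (my own illustration, not from any particular codebase):

```cpp
// Non-tail recursion: the addition happens *after* the recursive call
// returns, so this is not a tail call. At -O2, both GCC and Clang still
// compile it down to a loop (or even a closed-form expression).
long sum_to(long n) {
    if (n <= 0) return 0;
    return n + sum_to(n - 1);  // work remains after the call
}
```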
Caching/reusing large results (i.e. beyond simple register spills)
especially via some compiler-defined table or data structure, is quite
beyond the sorts of optimizations I'm used to seeing from compilers.
> I’m also wondering what constraints would need to be added to a
> compiler in order to come closer to an FFT
The first thing that comes to mind is something like -ffast-math. For the
code itself, it might be necessary to reject invalid or extreme inputs,
specifically those where a generic DFT computes a different result than an
FFT would.
Otherwise the compiler must faithfully reproduce the invalid or unstable
results.
[1]: https://nullprogram.com/blog/2018/05/01/
Natanijel Vasic <natanijel.vasic@gmail.com> wrote:
> Consider the implementation of a “naive” Fourier transform in C++.
> That is, a piece of code that directly implements the discrete Fourier
> Transform (DFT) as written in the formula, using two nested for
> loops.
>
> Now consider the Fast Fourier Transform (FFT) algorithm. The basic
> idea behind the FFT is that redundant calculations are avoided, and
> results are reused across intermediate arrays by mapping them in
> a butterfly pattern.
>
> The question is: could a compiler effectively output an FFT algorithm
> when given a naive DFT implementation?
I'm not particularly knowledgeable about our field's history, but I believe this
was a fairly serious research topic in the 1960s, under the heading of "symbolic
transformation". One place I recall seeing this discussed was Marvin Minsky's
1969 Turing Award lecture; he uses the example of compiling a naive Fibonacci
implementation into a more efficient algorithm -- quite similar in spirit to
your Fourier transform example.
https://amturing.acm.org/award_winners/minsky_7440781.cfm
Christopher Wellons wrote:
> IMHO, a compiler that transforms a generic DFT into FFT without having
> been programmed for that specific pattern is entering AI territory.
Indeed it is -- it's not at all a coincidence that the Turing Award lecture I
linked above was given by Minsky, who received his award "For his central role
in creating, shaping, promoting, and advancing the field of Artificial
Intelligence". I believe that this area of research suffered significantly
during the first AI winter in the 1970s, and I'm not familiar with it being
picked back up with the same enthusiasm later (though, as I said, I'm far from
an expert).
Best,
Daniel