The Centar FFT architecture performs all additions at full precision, so that round-off errors occur primarily in the twiddle-multiplication steps. Consequently, the SQNR is much higher than that of other FFT architectures for a given input bit length. The SQNR is defined by the expression
SQNR = 10 log10 ( ∑n z(n)^2 / ∑n (z(n) − zref(n))^2 ),
where z(n) are the coefficient output magnitudes and zref(n) are exact-precision reference magnitudes. Based on this expression, statistics are tabulated in Table 1 for 256-point and 1024-point FFTs with 16-bit fixed-point input word lengths and a 16-bit output mantissa plus a 5-bit exponent. For comparison, the same values are calculated for the Altera streaming FFT circuits (v13.1) with 16-bit and 20-bit input/output word lengths, using random real and imaginary data based on Altera's supplied Matlab model. The benefit of the Centar scaling approach is equivalent to approximately 4 bits, an advantage that is particularly important for smaller word lengths.
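The SQNR expression above can be sketched directly in code. The following is a minimal illustration, not the model used to produce Table 1; the function name and array arguments are assumptions for the example.

```python
import numpy as np

def sqnr_db(z, z_ref):
    """SQNR in dB per the expression above.

    z     : fixed-point FFT output magnitudes z(n)
    z_ref : exact-precision reference magnitudes zref(n)
    (argument names are illustrative, not from the source)
    """
    z = np.asarray(z, dtype=float)
    z_ref = np.asarray(z_ref, dtype=float)
    signal = np.sum(z ** 2)                 # ∑n z(n)^2
    noise = np.sum((z - z_ref) ** 2)        # ∑n (z(n) - zref(n))^2
    return 10.0 * np.log10(signal / noise)
```

In practice z would come from the fixed-point FFT under test and z_ref from a double-precision FFT of the same input block, with the statistic accumulated over many random blocks as in Table 1.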
| Centar 16-bit word | Altera 16-bit word | Altera 20-bit word |
Table 1. SQNR statistics for fixed-point streaming FFTs (1000 FFT blocks of random data).