G.-J. Hovey (1), Z. Heng (1), G. Herriot (1), Z. Ljusic (1), M. Smith, J.-P. Veran (1), F. Gamache (2), D. Quinn (2), R. Conan (3)
(1)Herzberg Institute of Astrophysics, National Research Council, British Columbia, Canada; (2)Lyrtech Signal Processing, Quebec City, Quebec, Canada; (3)University of Victoria, Victoria, British Columbia, Canada
Turbulent atmosphere distorts the wavefront and limits the resolution of large ground-based optical telescopes, such as the proposed Thirty Meter Telescope (TMT). Such distortions can largely be compensated by adaptive optics controllers that reconstruct the distortions along the light path and adjust deformable mirrors to improve the resolution of the telescope. Reconstruction is an inverse problem that is by nature compute-intensive, and the problem is exacerbated by a large aperture and the time variability of the atmosphere. In the case of the TMT, megabytes of data must be processed within a few hundred microseconds with low jitter and latency. Such a computing load is of order 10^11 operations per second and cannot practically be handled by general-purpose computing approaches. Loads of this magnitude, however, are commonly handled by FPGA (Field Programmable Gate Array) based digital signal processing systems; what is atypical is using them to solve inverse problems. In this paper we present our work investigating the feasibility of such an approach and developing a practical design. We begin by reviewing the requirements of such a system and the trade-offs among various alternative approaches. We then outline our concept design and the results of our benchmark measurements.
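To make the quoted figure of order 10^11 operations per second concrete, the sketch below estimates the load of a matrix-vector wavefront reconstructor. The system dimensions and loop rate are illustrative assumptions chosen to be of roughly TMT scale, not specifications from this paper.

```python
# Back-of-envelope estimate of the AO reconstruction compute load.
# All dimensions below are illustrative assumptions, not TMT specifications.
import numpy as np

n_slopes = 30_000      # wavefront-sensor slope measurements per frame (assumed)
n_actuators = 7_000    # deformable-mirror actuators (assumed)
frame_rate = 800       # AO control loop rate in Hz (assumed)

# A matrix-vector reconstructor computes actuator commands a = R @ s,
# where R is the (n_actuators x n_slopes) reconstruction matrix and
# s is the slope vector. Each matrix entry costs one multiply and one add.
ops_per_frame = 2 * n_actuators * n_slopes
ops_per_second = ops_per_frame * frame_rate
print(f"{ops_per_second:.1e} operations per second")

# The same operation demonstrated at toy scale:
rng = np.random.default_rng(0)
R = rng.standard_normal((4, 6))   # toy reconstruction matrix
s = rng.standard_normal(6)        # toy slope measurement vector
a = R @ s                         # toy actuator command vector
```

Under these assumed numbers the load comes out near 3 x 10^11 operations per second, consistent with the order of magnitude quoted above; the real figure depends on the actual subaperture and actuator counts and the loop rate.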