Chairs: Maureen P. Quirk, Center for Communications Research (USA); David J. Thomson, AT&T Bell Laboratories (USA)
Shrinivas R. Kulkarni, California Institute of Technology (USA)
Signal processing is used extensively in astronomy for spectroscopy and synthesis imaging. Signals from celestial sources are stochastic in nature and weak compared with local sources of noise, so the emphasis is on improving the signal-to-noise ratio. This necessitates the use of large bandwidths and real-time processing of the signal. Considerable practical gain is obtained by using 1- or 2-bit sampling. Traditionally, spectrometers have been based on correlators; recently, however, FFT engines have become popular. Pulsar signals are chirped by propagation through the dispersive interstellar medium, and a variety of innovative processing ideas, including time-domain filtering, have been used in this field. Two major trends may change the landscape altogether: (1) high-bandwidth tape recording of the signal, followed by the use of supercomputers to analyze the data; and (2) fully digital interferometers with the ability to form multiple beams as well as adaptive beams to deal with man-made interference.
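As an illustration of the time-domain filtering mentioned above, the sketch below shows coherent dedispersion of a pulsar voltage stream: the chirp imposed by the dispersive interstellar medium is undone by applying the inverse of the ISM transfer function in the Fourier domain. The function name, argument names, and the use of NumPy are assumptions for this example, not details taken from the abstract.

```python
import numpy as np

# Coherent dedispersion: a minimal sketch, assuming complex baseband voltages
# `v` sampled at bandwidth `bw_hz` about a centre frequency `f0_hz`, and a
# dispersion measure `dm` in pc cm^-3.  The sign convention and names are
# illustrative; real pipelines also handle channelization and overlap-save.
D_CONST = 4.148808e15  # dispersion constant in Hz^2 s / (pc cm^-3)

def coherent_dedisperse(v, f0_hz, bw_hz, dm):
    f = np.fft.fftfreq(v.size, d=1.0 / bw_hz)        # offsets from band centre (Hz)
    # Phase of the interstellar-medium transfer function at frequency f0 + f
    phase = 2.0 * np.pi * D_CONST * dm * f**2 / (f0_hz**2 * (f0_hz + f))
    # Multiply by the inverse (dedispersing) chirp and return to the time domain
    return np.fft.ifft(np.fft.fft(v) * np.exp(-1j * phase))
```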
Rudy Schild, Harvard-Smithsonian Center for Astrophysics (USA)
David J. Thomson, AT&T Bell Laboratories (USA)
Astronomers study quasars, the enigmatic luminous black-hole cores of distant galaxies, because, owing to their huge distances, they sample the universe when it was only one-tenth of its present age. In about a dozen known cases, a galaxy lying by chance along the line of sight causes the quasar's image to be double or multiple, providing cosmologists with a tool for studying young galaxies and the structure of space and time itself. In these gravitational lens alignments, light in the different images does not arrive at the same time, and measurement of the quasar's irregular brightness fluctuations allows determination of the light-travel-time differences. For the first discovered gravitational lens, Q0957+561 A,B, a 15-year record of brightness fluctuations (Schild & Thomson, 1995) has been analyzed for the time delay. Bad weather, telescope availability, and seasonal effects cause sub-optimal data sampling, but all time scales from a day to 15 years are reasonably sampled. We find the cosmologically interesting time delay to be 405 +/- 10 days, implying a universe about as old as the oldest known stars. Complications in the data analysis arise from our discoveries that the quasar has internal structure in the form of reflecting regions and that planetary-mass objects in the lens galaxy introduce additional brightness fluctuations. The quasar also shows periodic variability at multiple frequencies, probably due to oscillations of the luminous disc surrounding the black hole.
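The delay measurement described above rests on correlating two unevenly sampled light curves. As a hedged illustration only, and not the analysis of Schild & Thomson (1995), the sketch below recovers a delay with a discrete correlation function in the spirit of Edelson & Krolik (1988); the function name, the synthetic light curve, and the sampling pattern are all assumptions made for the example.

```python
import numpy as np

# Hedged sketch: estimate the image A-B time delay from two unevenly sampled
# light curves with a discrete correlation function (DCF).
def dcf_delay(t_a, m_a, t_b, m_b, lags, width):
    a = (m_a - m_a.mean()) / m_a.std()
    b = (m_b - m_b.mean()) / m_b.std()
    dt = t_b[None, :] - t_a[:, None]                  # all pairwise time differences
    prod = a[:, None] * b[None, :]                    # all pairwise products
    dcf = [prod[np.abs(dt - lag) < width / 2].mean() for lag in lags]
    return lags[np.argmax(dcf)]                       # lag at the correlation peak

# Synthetic demonstration: an injected 405-day delay with irregular sampling
rng = np.random.default_rng(0)
truth = np.convolve(rng.normal(size=6000), np.ones(60) / 60, mode="same")
t_a = np.sort(rng.uniform(0, 5000, 800))              # epochs (days), image A
t_b = np.sort(rng.uniform(450, 5000, 800))            # epochs (days), image B
m_a = np.interp(t_a, np.arange(6000), truth)
m_b = np.interp(t_b - 405.0, np.arange(6000), truth)  # image B lags A by 405 days
lags = np.arange(300.0, 500.0, 5.0)
print(dcf_delay(t_a, m_a, t_b, m_b, lags, width=5.0)) # expected to land near 405
```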
Louis J. Lanzerotti, AT&T Bell Laboratories (USA)
David J. Thomson, AT&T Bell Laboratories (USA)
Many geophysical and space physics data sets have time as one of the key variables in data acquisition and in the subsequent analysis. Generally, in order to achieve physical understanding of these data, it is necessary to carry out comprehensive time series analyses. This paper briefly outlines the driving sources that produce time variability in many space and geophysical systems. It then discusses time series analyses of several selected data sets that describe the evolution of physical systems ranging from the Sun through the interplanetary medium to the surface of the Earth.
George J. M. Aitken, Queen's University (CANADA)
Advances in image-reconstruction techniques and in computational capability have made it practical to use post-detection processing to overcome the resolution limits imposed on conventional astronomical observations by turbulence in the atmosphere. Techniques based on second- and third-order spectra have proved very successful. The astronomical application requires special attention to the calibration of the measured atmospheric transfer function, the removal of photon-noise biases, and the effects of finite detector size. A new area of application is the enhancement of images partially compensated by adaptive optics.
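As a hedged illustration of the second-order (power-spectrum) part of such processing, the sketch below averages the power spectra of a stack of short-exposure frames, subtracts a Poisson photon-noise bias, and calibrates the atmospheric transfer function with an unresolved reference star. The function and variable names are assumptions for the example; recovery of the Fourier phase via the third-order spectrum (bispectrum) is not shown.

```python
import numpy as np

# Second-order (Labeyrie-style) speckle processing: a minimal sketch, assuming
# photon-counting frames so that the photon-noise bias of the mean power
# spectrum equals the mean number of detected photons per frame.  `frames`
# and `ref_frames` are (n_frames, ny, nx) stacks of the target and of an
# unresolved reference star observed under similar seeing.
def speckle_power_spectrum(frames, ref_frames, eps=1e-12):
    def mean_unbiased_power(stack):
        spec = np.fft.fft2(stack, axes=(-2, -1))
        power = (np.abs(spec) ** 2).mean(axis=0)   # average over frames
        bias = stack.sum(axis=(-2, -1)).mean()     # mean photon count per frame
        return power - bias
    # Dividing by the reference-star power calibrates the atmospheric (speckle)
    # transfer function; the floor guards against division by zero beyond the
    # telescope cutoff frequency.
    return mean_unbiased_power(frames) / np.maximum(mean_unbiased_power(ref_frames), eps)
```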