INFORMATION OPTICS: 5th International Workshop on Information Optics (WIO'06)
860 (2006); http://dx.doi.org/10.1063/1.2361201
Image deconvolution is a powerful tool for improving the quality of images corrupted by blurring and noise. However, in some cases the imaging system is affected by anisotropic resolution, i.e., the resolution depends on the direction in the imaging plane or volume. Such a distortion cannot be corrected by image deconvolution. One example, from astronomy, is the Large Binocular Telescope (LBT) under construction on the top of Mount Graham (Arizona). A second example, from microscopy, is the confocal microscope. In both cases, the situation can be improved if different images of the same target can be detected by rotating the instrument or by rotating the target. The problem then arises of obtaining a unique high-resolution image from different images taken at different orientation angles. Such a problem is called multiple-image deconvolution.
In this paper, after a brief illustration of the two examples mentioned above, the problem of multiple‐image deconvolution is formulated and preliminarily investigated in a continuous setting (all directions are available), showing that, while resolution is anisotropic in the multiple images, it becomes isotropic in the reconstructed image. Next, methods and algorithms for the solution of the problem are presented and their accuracy illustrated by means of the results of a few numerical experiments. Finally, the possibility of a further improvement of resolution by means of super‐resolving methods is briefly discussed and demonstrated.
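The core idea, that several anisotropic views can be combined into one reconstruction with more isotropic resolution, can be illustrated with a toy iterative scheme. The following numpy sketch is not the authors' algorithm: anisotropic Gaussian PSFs stand in for the per-orientation blurs, and a Landweber-type iteration sums the gradient contributions from every orientation. All function names and parameters are illustrative.

```python
import numpy as np

def psf_anisotropic(shape, sigma_sharp, sigma_blur, angle):
    """Anisotropic Gaussian PSF whose sharp axis is rotated by `angle` (radians)."""
    ny, nx = shape
    y, x = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
    u = x * np.cos(angle) + y * np.sin(angle)    # high-resolution direction
    v = -x * np.sin(angle) + y * np.cos(angle)   # low-resolution direction
    h = np.exp(-0.5 * (u / sigma_sharp) ** 2 - 0.5 * (v / sigma_blur) ** 2)
    return h / h.sum()

def multi_image_landweber(images, psfs, n_iter=50, tau=1.0):
    """Landweber iteration combining all orientations:
    f <- f + tau * sum_j H_j^T (g_j - H_j f), computed via FFTs."""
    Hs = [np.fft.fft2(np.fft.ifftshift(h)) for h in psfs]
    f = np.mean(images, axis=0)          # crude starting guess
    tau = tau / len(images)              # keeps the iteration contractive
    for _ in range(n_iter):
        grad = np.zeros_like(f)
        for g, H in zip(images, Hs):
            resid = g - np.real(np.fft.ifft2(H * np.fft.fft2(f)))
            grad += np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(resid)))
        f = f + tau * grad
    return f
```

Because each orientation contributes its own sharp direction, the summed normal equations cover frequency space more isotropically than any single view, which is the qualitative effect the paper analyzes.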
860 (2006); http://dx.doi.org/10.1063/1.2361202
This paper presents a unifying approach to the blind deconvolution and superresolution problem for multiple degraded low-resolution frames of the original scene. We do not assume any prior information about the shape of the degradation blurs. The proposed approach consists of building a regularized energy function and minimizing it with respect to the original image and blurs, where regularization is carried out in both the image and blur domains. The image regularization, based on variational principles, maintains stable performance under severe noise corruption. The blur regularization guarantees consistency of the solution by exploiting differences among the acquired low-resolution images. Experiments on real data illustrate the robustness and utility of the proposed technique.
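The alternating-minimization structure described above can be sketched with a toy single-frame version that swaps the paper's variational image term and blur-consistency term for plain Tikhonov penalties, so each half-step has a closed-form Fourier-domain solution. All names are illustrative, and this is not the authors' method.

```python
import numpy as np

def alternating_blind_deconv(g, n_outer=10, lam=1e-2, gam=1e-1):
    """Toy alternating minimization of
    E(f, h) = ||h * f - g||^2 + lam ||f||^2 + gam ||h||^2,
    minimized exactly in the Fourier domain at each half-step."""
    G = np.fft.fft2(g)
    h = np.zeros_like(g)
    h[0, 0] = 1.0                      # initialize the blur as a delta
    H = np.fft.fft2(h)
    for _ in range(n_outer):
        F = np.conj(H) * G / (np.abs(H) ** 2 + lam)   # image half-step
        H = np.conj(F) * G / (np.abs(F) ** 2 + gam)   # blur half-step
    return np.real(np.fft.ifft2(F)), np.real(np.fft.ifft2(H))
```

This toy energy admits degenerate solutions (e.g., a delta blur with f = g); the blur-domain regularization exploiting differences among several low-resolution frames is precisely what rules such solutions out in the actual method.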
860 (2006); http://dx.doi.org/10.1063/1.2361203
We analyze the coherence properties of partially polarized light. For that purpose, we discuss different theories that have been introduced recently. We focus our attention on invariance properties, which are fundamental in physics for defining relevant parameters.
860 (2006); http://dx.doi.org/10.1063/1.2361204
A holographic three-dimensional (3D) display system with data processing techniques is presented. By use of phase-shifting digital holography, the complex amplitude of a 3D object is obtained. Data processing techniques applied to the complex amplitude are introduced to improve the quality of the reconstructed 3D object or to manipulate 3D objects. Elimination and addition of objects are presented.
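Phase-shifting digital holography recovers the complex object wave from several interferograms taken with known reference-phase shifts. A minimal four-step sketch of the standard technique (not the paper's specific pipeline) follows; the function name and unit-amplitude reference are illustrative.

```python
import numpy as np

def recover_complex(i0, i1, i2, i3, ref_amp=1.0):
    """Four-step phase-shifting recovery of the object wave O from
    interferograms I_k = |O + R exp(i k pi/2)|^2 with R = ref_amp.
    Expanding the squares gives I0 - I2 = 4 R Re(O) and
    I1 - I3 = 4 R Im(O), hence:"""
    return ((i0 - i2) + 1j * (i1 - i3)) / (4.0 * ref_amp)
```

Once the complex amplitude is in hand, operations such as removing or inserting objects reduce to arithmetic on this array before numerical reconstruction.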
860 (2006); http://dx.doi.org/10.1063/1.2361205
We present approaches for real-time automated three-dimensional (3D) visualization and recognition of biological microorganisms. Single-exposure on-line digital holographic microscopy records a Fresnel digital hologram of the biological microorganisms. The original complex images of the microorganisms are then computationally reconstructed at different depths along the longitudinal direction by inverse Fresnel transformation of the digital hologram. For recognition and classification, two approaches are overviewed: one is 3D morphology-based recognition, and the other is shape-independent recognition based on statistical estimation and inference algorithms.
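Numerical refocusing at different depths can be sketched with a standard Fresnel transfer-function propagator. The sampling parameters below are illustrative, the constant phase factor exp(ikz) is dropped, and the sign convention varies between texts; this is a generic sketch, not the paper's reconstruction code.

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field by a distance z using the
    Fresnel transfer function H = exp(-i pi lambda z (fx^2 + fy^2))."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)               # cycles per meter
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * z * (FX ** 2 + FY ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Reconstructing "at different depths along the longitudinal direction" then amounts to calling the propagator with a sweep of z values and examining where each microorganism comes into focus; propagating with -z inverts the operation exactly.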
860 (2006); http://dx.doi.org/10.1063/1.2361206
In microscopy, high magnifications are achievable for investigating micro-objects, but the trade-off is that the higher the required magnification, the lower the depth of focus. For an object with a complex three-dimensional (3D) shape, only a portion of it appears in good focus to the observer, who is essentially looking at a single image plane. Currently, two approaches exist to obtain an extended focused image (EFI), both with severe limitations: the first requires mechanical scanning, while the other requires specially designed optics. We demonstrate that an EFI of an object can be obtained through digital holography (DH) without any mechanical scanning or special optical components. The conceptual novelty of the proposed approach lies in the fact that it fully exploits the unique feature of DH of extracting all the information content stored in the hologram, both amplitude and phase, to extend the depth of focus.
860 (2006); http://dx.doi.org/10.1063/1.2361207
We propose a new optical method for multifactor image encoding and authentication. The encoded complex-amplitude image function fulfils the general requirements of invisible content, extreme difficulty in copying or counterfeiting, and real-time automatic verification. This paper describes both the encoded information contained in the ID tag and the optoelectronic processor that validates an identity on the basis of multiple signal recognition. This optical technique is attractive for high-security purposes that require reliable multifactor authentication. A demonstration using a combination of biometric images, alphanumeric signatures, and key codes is provided. Retina images, which are very effective for authentication, are used as the biometric signals.
860 (2006); http://dx.doi.org/10.1063/1.2361208
We contrast standard stochastic process methods with those of quantum mechanics. While the end results of both methods are common quantities, such as expectation values and probabilities, the methods of calculation are dramatically different. We present a simple example of a nonstationary stochastic process that can be exactly solved in both standard and quantum mechanics. The example illustrates both approaches and crystallizes the similarities and differences between the methods.
860 (2006); http://dx.doi.org/10.1063/1.2361209
The ambiguity function and the Wigner distribution function have both been applied in optics for many years. More recently, the fractional Fourier transform has also been used. The connection between the ambiguity function and the defocused optical transfer function has also been described. Here we consider the connections with the generalized optical transfer function, first proposed in 1965, which is a two- (three-) dimensional optical transfer function for the two- (three-) dimensional case. The two-dimensional form can be used as the basis for phase-retrieval algorithms, but is also valid in the non-paraxial domain.
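For reference, the standard definitions of the two phase-space functions discussed here (textbook forms, not reproduced from the paper) are:

```latex
% Ambiguity function of a field f(x)
A(u, x') = \int f\!\left(x + \tfrac{x'}{2}\right) f^{*}\!\left(x - \tfrac{x'}{2}\right) e^{-2\pi i u x}\,\mathrm{d}x
% Wigner distribution function of the same field
W(x, u) = \int f\!\left(x + \tfrac{x'}{2}\right) f^{*}\!\left(x - \tfrac{x'}{2}\right) e^{-2\pi i u x'}\,\mathrm{d}x'
```

The two are related by a double Fourier transform over their arguments, which is what allows defocus (and, more generally, fractional Fourier transformation) to appear as simple geometric operations in this phase space.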
860 (2006); http://dx.doi.org/10.1063/1.2361210
The concept of Integrated Sensing and Processing (ISP) suggests that a sensor should collect data in a manner consistent with the end objective. ISP thus seeks to minimize the collection of redundant data, reduce processing time, and improve overall performance. A traditional approach designs the “best” sensor in terms of SNR, resolution, data rates, integration time, and so forth, while traditional algorithms seek to optimize metrics such as probability of detection, false-alarm rate, and class separability. The goal of ISP is to change this disjoint “sensing then processing” approach by allowing the algorithms to control the sensing parameters so as to collect the “best” information for optimal algorithm performance.
At Lockheed Martin, we are experimenting with an ISP system which utilizes a near-infrared (NIR) Hadamard multiplexing imaging sensor, built by PlainSight Systems. This prototype sensor incorporates a digital mirror array (DMA) device in order to realize a Hadamard multiplexed imaging system. Specific Hadamard codes can be encoded on the sensor aperture to measure their inner products with the underlying scene rather than the scene itself. The developed ISP algorithm uses an automatic target recognition (ATR) metric to send particular codes to the sensor in order to collect only the information relevant to the ATR problem. The result is a variable-resolution hyperspectral cube with full resolution where targets are present and less than full resolution where there are no targets. This approach greatly improves the sensing process by reducing the overall volume of data and the time required to collect it. Several examples are also presented for illustrative purposes.
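The measurement model can be sketched as follows: each exposure records the inner product of one Hadamard code with the flattened scene, and with the full code set the scene is recovered from the orthogonality H^T H = nI. This is an idealized numpy sketch; a physical DMA realizes the ±1 codes with offset or complementary measurements, a detail ignored here, and the ATR-driven variant would send only a subset of codes.

```python
import numpy as np

def hadamard(n):
    """Sylvester-construction Hadamard matrix; n must be a power of two."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def measure(scene, H):
    """One measurement per row: the inner product of a Hadamard code
    with the flattened scene, as encoded on the aperture."""
    return H @ scene.ravel()

def reconstruct(meas, H, shape):
    """Invert using H^T H = n I."""
    n = H.shape[0]
    return (H.T @ meas / n).reshape(shape)
```

The multiplexing advantage is that every measurement gathers light from the whole scene, improving SNR relative to scanning one pixel at a time.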
860 (2006); http://dx.doi.org/10.1063/1.2361211
We present a novel microscopy technique based on the four-wave mixing (FWM) process that is enhanced by two-photon electronic resonance induced by a pump pulse along with stimulated emission induced by a dump pulse. A Ti:sapphire laser and an optical parametric oscillator are used as the light sources for the pump and dump pulses, respectively. We demonstrate that our FWM technique can be used to obtain two-dimensional microscopic images of an unstained leaf of Camellia sinensis and an unlabeled tobacco BY-2 cell.
860 (2006); http://dx.doi.org/10.1063/1.2361212
The Compact Optoelectronic Integrated Neural (COIN) coprocessor is a rugged, pixelated, parallel optoelectronic system designed to run neural-network-type algorithms in native hardware. The goal of the COIN project is to explore the potential capabilities and limitations of such systems. The optoelectronics, holographic interconnections, and VLSI circuits of the first prototype machine have been fabricated and characterized individually. Recent work on the project has focused on designing the computational components and developing hierarchical system models that can provide accurate, timely, and efficient performance estimates of the COIN coprocessor while it is still in the pre-integration stage. In this paper, we present an overview of the project and some simulated results that demonstrate the potential training and learning characteristics of the system.
860 (2006); http://dx.doi.org/10.1063/1.2361213
We propose novel methods for jointly optimizing the design of the optics, detector, and digital image processing of an imaging system such as a document scanner or digital camera. Our design methodology predicts end-to-end imaging performance in terms of an average per-pixel sum-squared-error criterion using models of the components; it then relies on Wiener theory to adjust the optical and image-processing parameters to minimize this criterion. Our method relaxes the traditional strict (and hence often expensive) implicit goal that the intermediate optical image be of high quality: inexpensive image processing compensates for the resulting optical performance degradations. Certain optical aberrations are easier to correct through image processing than others, and our method automatically adjusts both optical and image-processing parameters to find the best joint design.
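The Wiener-theory step can be sketched as follows: for a candidate OTF H, image power spectrum S_ff, and noise power spectrum S_nn, the per-frequency minimum mean-squared error after Wiener restoration is S_ff S_nn / (|H|^2 S_ff + S_nn), and candidate optical designs are ranked by this end-to-end criterion. The spectra and OTFs below are illustrative stand-ins, not the paper's component models.

```python
import numpy as np

def wiener_mse(H, S_ff, S_nn):
    """Total predicted MSE after optimal Wiener restoration:
    sum over frequencies of S_ff S_nn / (|H|^2 S_ff + S_nn)."""
    return np.sum(S_ff * S_nn / (np.abs(H) ** 2 * S_ff + S_nn))

def best_design(candidate_otfs, S_ff, S_nn):
    """Rank candidate optical designs by end-to-end (optics plus
    restoration) predicted MSE and return the best index."""
    mses = [wiener_mse(H, S_ff, S_nn) for H in candidate_otfs]
    return int(np.argmin(mses)), mses
```

In a real design loop, the candidate OTFs would be generated by sweeping optical parameters (aberration coefficients, f-number, and so on), so that easy-to-restore aberrations are penalized less than unrestorable ones.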
860 (2006); http://dx.doi.org/10.1063/1.2361214
We propose and discuss three methods for increasing the resolution of an aperture-limited optical system by illuminating the input with structured light. The illumination can be varied, ranging from tilted plane waves to coherence-shaped light or speckle patterns. In each case the high resolution of the projected pattern demodulates the high frequencies of the sample and permits their passage through the system aperture. A decoding process then provides the superresolved image after a digital post-processing stage.
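The demodulation idea can be illustrated in 1-D: a sample frequency beyond the aperture cutoff, once multiplied by a sinusoidal illumination pattern, is shifted down into the passband and survives the aperture. This is a generic frequency-shifting sketch with illustrative numbers, not one of the paper's three methods; recovering the original band additionally requires decoding with several pattern phases, which is omitted here.

```python
import numpy as np

def lowpass(sig, cutoff, dx):
    """Ideal low-pass filter modeling the finite system aperture."""
    S = np.fft.fft(sig)
    f = np.fft.fftfreq(sig.size, d=dx)
    S[np.abs(f) > cutoff] = 0.0
    return np.real(np.fft.ifft(S))

N, dx = 1024, 1.0
x = np.arange(N) * dx
f0, fc, fi = 0.3125, 0.20, 0.25     # sample freq f0 > aperture cutoff fc
sample = np.cos(2 * np.pi * f0 * x)

direct = lowpass(sample, fc, dx)    # f0 is blocked by the aperture
# Multiplying by the illumination pattern creates a component at
# f0 - fi = 0.0625, which passes through the aperture:
encoded = lowpass(sample * np.cos(2 * np.pi * fi * x), fc, dx)
```

The `encoded` output carries the otherwise-lost frequency at a downshifted location, ready for the decoding stage.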
High‐speed Signal Evaluation in Optical Coherence Tomography Based on Sub‐Nyquist Sampling and Kalman Filtering Method
860 (2006); http://dx.doi.org/10.1063/1.2361215
A sub-Nyquist sampling method is considered that reduces the sampling rate several-fold for narrow-band OCT signals. To provide high noise immunity and stability when processing OCT signals with randomly varying parameters in real time, the Kalman filtering method is applied to evaluate the sub-Nyquist-sampled OCT signals.
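The evaluation step can be illustrated with a generic scalar Kalman filter; this is a stand-in sketch (random-walk state, additive measurement noise), as the actual state model for sub-Nyquist-sampled OCT fringes is more elaborate. All names and noise variances are illustrative.

```python
import numpy as np

def kalman_1d(z, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter:
    state    x_k = x_{k-1} + w_k,  w_k ~ N(0, q)
    measure  z_k = x_k + v_k,      v_k ~ N(0, r)."""
    x, p = x0, p0
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        p = p + q                  # predict: state variance grows
        K = p / (p + r)            # Kalman gain
        x = x + K * (zk - x)       # update with the new sample
        p = (1.0 - K) * p
        out[k] = x
    return out
```

The recursive structure is what makes the approach attractive for real-time processing: each new sample costs a constant amount of work regardless of record length.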
860 (2006); http://dx.doi.org/10.1063/1.2361216
In this paper we review the concept of "Laser Firefly Clusters" for atmospheric probing and describe the development of the first- and second-generation versions. The laser firefly cluster is a mobile and versatile distributed sensing system whose purpose is to profile the chemical and particulate composition of the atmosphere on the basis of its optical properties. The primary applications are pollution monitoring and contamination detection, as well as scientific research, meteorology, and others. In this paper we extend the concept from the atmospheric to the oceanic environment and introduce "Optical Plankton". The special features of the ocean as a channel for optical wireless communication are characterized, and the distinctive requirements of monitoring the oceanic particulate composition are addressed.
Simple Jones Method for describing Modulation Properties of Reflective Liquid Crystal Spatial Light Modulators
860 (2006); http://dx.doi.org/10.1063/1.2361217
We present a review of techniques for the calibration and optimization of twisted nematic liquid crystal displays (TNLCDs), with special focus on their application to describing multiple-reflection effects. The inclusion of these effects improves the accuracy of predictions of the modulation properties of transmissive and reflective displays. In newer reflective liquid crystal on silicon (LCoS) displays we noticed a significant degree of depolarization. Polarimetric measurements have been performed showing depolarization as large as 10% at certain gray levels and for certain incident polarizations. Fluctuations in the instantaneous state of polarization of the reflected beam have been shown to be the main origin of this depolarization. In the near future we plan to extend this type of polarimetric study to the whole range of gray levels, in order to find the optimal modulation response of the LCoS display.
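The Jones formalism underlying such models can be illustrated with a generic sketch, standard Jones calculus rather than the paper's reflective-LCoS model: a retarder between polarizers, reproducing the textbook result that a retarder at 45° between crossed polarizers transmits sin²(δ/2).

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def waveplate(delta, theta):
    """Jones matrix of a retarder with retardance delta, fast axis at theta
    (symmetric phase convention), via rotation of the diagonal retarder."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    W = np.array([[np.exp(-1j * delta / 2), 0], [0, np.exp(1j * delta / 2)]])
    return R @ W @ R.T

def intensity(jones_chain, e_in):
    """Apply a chain of Jones matrices and return the output intensity."""
    e_out = e_in
    for J in jones_chain:
        e_out = J @ e_out
    return float(np.abs(e_out[0]) ** 2 + np.abs(e_out[1]) ** 2)
```

A display model along these lines simply inserts the LC-cell Jones matrix (and, for reflective devices, the mirror and a second pass) into the chain, which is why multiple-reflection terms enter naturally as extra factors.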
860 (2006); http://dx.doi.org/10.1063/1.2361218
The Fourier plane encryption algorithm is subjected to a heuristic known-plaintext attack. The simulated annealing algorithm is used to estimate the key from a known plaintext-ciphertext pair, yielding a key that decrypts the ciphertext with arbitrarily low error. The strength of the algorithm is tested by using this key to decrypt a different ciphertext encrypted with the same original key. The Fourier plane encryption algorithm is found to be susceptible to this known-plaintext heuristic attack. Phase-only encryption, a variation of the Fourier plane encoding algorithm, is found to successfully defend against the attack.
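The attack can be sketched on a toy 1-D version of Fourier-plane encryption with a binary (0/π) phase key: simulated annealing flips one key element per step and accepts uphill moves with Boltzmann probability. The binary-key restriction, the tiny key length, and all names are illustrative simplifications of the attacked scheme.

```python
import numpy as np

def encrypt(plain, phase_key):
    """Toy 1-D Fourier-plane encryption: a phase mask in the Fourier plane."""
    return np.fft.ifft(np.fft.fft(plain) * np.exp(1j * phase_key))

def decrypt(cipher, phase_key):
    return np.real(np.fft.ifft(np.fft.fft(cipher) * np.exp(-1j * phase_key)))

def anneal_key(plain, cipher, n_iter=20000, t0=1.0, cooling=0.9995, seed=0):
    """Simulated-annealing known-plaintext attack: flip one binary phase
    element per step; accept worse keys with probability exp(-dc / t)."""
    rng = np.random.default_rng(seed)
    n = plain.size
    key = np.pi * rng.integers(0, 2, n).astype(float)
    cost = np.sum((decrypt(cipher, key) - plain) ** 2)
    t = t0
    for _ in range(n_iter):
        cand = key.copy()
        i = rng.integers(n)
        cand[i] = (cand[i] + np.pi) % (2 * np.pi)   # flip one key element
        c = np.sum((decrypt(cipher, cand) - plain) ** 2)
        if c < cost or rng.random() < np.exp(-(c - cost) / t):
            key, cost = cand, c
        t *= cooling                                 # geometric cooling
    return key, cost
```

The cost is the decryption error on the known pair; a recovered key with near-zero cost can then be tried against other ciphertexts produced with the same original key, which is the susceptibility test described above.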
Remote Sensing Image Fusion with a Multiresolution Directional‐Oriented Image Transform Based on Gaussian Derivatives
860 (2006); http://dx.doi.org/10.1063/1.2361219
A methodology for image fusion based on the Hermite transform is presented. First, we demonstrate fusion with multispectral images from the same satellite (Landsat 7 ETM+) with different spatial resolutions. In this case we show how the proposed method can help improve spatial resolution. In the second case we show fusion with different sensor images, namely SAR and multispectral Landsat 5 TM. The fusion algorithm is based on the directional oriented Hermite transform which is an image representation model based on Gaussian derivatives that mimics some of the more important properties of human vision. The local analysis properties of the Hermite transform help fusion and noise reduction adapt to the local orientation and content of the image.
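The Gaussian-derivative basis underlying such a directional-oriented transform can be sketched via steerability: a first-order derivative-of-Gaussian filter at any orientation is a linear combination of the x and y basis kernels. This is an illustrative numpy sketch of that one property, not the authors' full Hermite-transform implementation.

```python
import numpy as np

def gaussian_deriv_kernels(sigma=2.0, size=13):
    """First-order Gaussian derivative kernels Gx, Gy (the steerable basis)."""
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    g = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    gx = -x / sigma ** 2 * g     # d/dx of the Gaussian
    gy = -y / sigma ** 2 * g     # d/dy of the Gaussian
    return gx, gy

def steer(gx, gy, theta):
    """Derivative filter oriented at angle theta, built from the basis
    kernels (steerability of first-order Gaussian derivatives)."""
    return np.cos(theta) * gx + np.sin(theta) * gy
```

Local orientation analysis, the property the fusion scheme relies on, follows by comparing the responses of a bank of such steered filters at each image location.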