Sample Paper
I. INTRODUCTION

Traditional imaging is based on lenses that map the scene plane onto the sensor plane. In this physics-based approach, imaging quality depends on parameters such as lens quality, numerical aperture, sensor-array density, and pixel size. Recently, this approach has been challenged by modern signal processing techniques. Fundamentally, the goal is to transfer most of the burden of imaging from high-quality hardware to computation. This is known as computational imaging: the measurement encodes the target features, which are later computationally decoded to produce the desired image. The end goal is to completely eliminate the need for high-quality lenses, which are heavy, bulky, and expensive.

One of the key workhorses in computational imaging is compressive sensing (CS) [1], [2]. For example, CS enabled the single pixel camera [3], which demonstrated imaging with a single pixel that captures scene information encoded with a spatial light modulator (SLM). The pixel measurement is a set of consecutive readings taken with different SLM patterns; the scene is then recovered by compressive deconvolution.

Broadly, traditional imaging and the single pixel camera represent two extremes: traditional cameras take a pure hardware approach, whereas single pixel cameras minimize the requirement for high-quality hardware by relying on modern signal processing. There are many trade-offs between the two approaches. One notable difference is the overall acquisition time: the physics-based approach works in one shot (i.e., all the sensing is done in parallel), while the single pixel camera and its variants require hundreds of consecutive acquisitions, which translates into a substantially longer overall acquisition time.

Recently, time-resolved sensors have enabled new imaging capabilities. Here we consider a time-resolved system that combines pulsed active illumination with a sensor whose time resolution is on the order of picoseconds.
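The single-pixel measurement model described above can be sketched in a few lines: each SLM pattern yields one scalar reading, and a sparse scene is recovered from fewer readings than pixels. This is a minimal illustration, not the paper's actual pipeline; the solver (ISTA, i.e., iterative soft thresholding) and the canonical-basis sparsity assumption are choices made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64    # scene size (flattened to n pixels)
m = 32    # number of SLM patterns: compressive, since m < n

# Toy scene: sparse in the canonical basis (an assumption for this sketch)
x = np.zeros(n)
x[[5, 20, 40]] = [1.0, -0.5, 0.8]

# Each row of A is one SLM pattern; each reading is y_i = <a_i, x>
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

# ISTA for min 0.5*||A z - y||^2 + lam*||z||_1
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/Lipschitz constant of the gradient
x_hat = np.zeros(n)
for _ in range(3000):
    g = x_hat - step * (A.T @ (A @ x_hat - y))           # gradient step
    x_hat = np.sign(g) * np.maximum(np.abs(g) - lam * step, 0.0)  # shrinkage

print(np.linalg.norm(x_hat - x))  # small: scene recovered from m < n readings
```

The point of the sketch is the trade-off discussed in the text: recovery succeeds with half as many readings as pixels, but only by paying for it with iterative computation after acquisition.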
Picosecond time resolution makes it possible to distinguish photons arriving from different parts of the target with millimeter resolution. The sensor therefore provides more information per acquisition than a regular pixel, so fewer masks are needed. Moreover, the time-resolved sensor is characterized by a measurement matrix that lets us optimize the active illumination patterns and reduce the required number of masks even further.

Currently available time-resolved sensors allow a wide range of potential implementations. For example, streak cameras provide picosecond or even sub-picosecond time resolution [4], but they suffer from poor sensitivity. Alternatively, single-photon avalanche diodes (SPADs) are compatible with standard CMOS technology [5] and allow time tagging with resolutions on the order of tens of picoseconds. These devices are available as single pixels or in pixel arrays.

In this paper we present a method that leverages both time-resolved sensing and compressive sensing. The method enables lensless imaging for reflectance recovery with fewer illumination patterns than traditional single pixel cameras require; this relaxed requirement translates into a shorter overall acquisition time. The presented framework provides guidelines and decision tools for designing time-resolved lensless imaging systems. Within this framework, the traditional single pixel camera is one extreme design point: it minimizes cost with simple hardware but requires many illumination patterns (long acquisition time), while better hardware reduces the acquisition time with fewer illumination patterns at the cost of complexity. We provide a sensitivity analysis of reconstruction quality with respect to changes in various system parameters. Simulations with system parameters chosen based on currently available hardware indicate potential savings of up to 50× fewer illumination patterns compared to traditional single pixel cameras.
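The claim that a time-resolved sensor needs fewer masks can be illustrated with a toy linear model. Assume (purely for this sketch; the paper's actual forward model is not reproduced here) that each scene pixel contributes to one time bin determined by its light travel distance. Then every illumination pattern yields T readings instead of one, so roughly n/T patterns suffice to determine an n-pixel scene, versus n patterns for a time-integrating single pixel.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 64, 16    # scene pixels, time bins per acquisition

# Assumed time-of-flight map: pixel j's photons land in bin (j mod T).
# This is a stand-in for distance-dependent arrival times.
H = np.zeros((T, n))
H[np.arange(n) % T, np.arange(n)] = 1.0

x = rng.random(n)            # arbitrary reflectance map (not even sparse)

k = n // T                   # only n/T = 4 illumination patterns needed
patterns = rng.standard_normal((k, n))

# Pattern a_p modulates the scene, then photons are binned by arrival
# time: the T readings are H @ (a_p * x). Stack T rows per pattern.
A = np.vstack([H * p for p in patterns])   # shape (k*T, n) = (64, 64)
y = A @ x

x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.max(np.abs(x_hat - x)))   # exact recovery up to numerical error
```

With T = 1 (a time-integrating pixel) the same construction would need all n patterns, which is the single-pixel-camera extreme of the design space described above; larger T trades hardware complexity for acquisition time, mirroring the paper's framework.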