1. Department of Applied Physics, The Hong Kong Polytechnic University, 999077 Hong Kong, China
2. Photonics Research Institute (PRI), The Hong Kong Polytechnic University, 999077 Hong Kong, China
3. Key Laboratory of Spectral Imaging Technology, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, 710119 Xi'an, China
4. Anhui Province Key Laboratory of Measuring Theory and Precision Instrument, School of Instrument Science and Opto-Electronics Engineering, Hefei University of Technology, 230009 Hefei, China
5. Department of Mechanical Engineering, The Hong Kong Polytechnic University, 999077 Hong Kong, China
6. Research Institute for Advanced Manufacturing (RIAM), The Hong Kong Polytechnic University, 999077 Hong Kong, China
Corresponding author: Xuming Zhang (xuming.zhang@polyu.edu.hk)
Published: 30 November 2024
Published Online: 18 September 2024
Received: 27 February 2024
Revised: 28 July 2024
Accepted: 10 August 2024
Jiang, H. et al. Optical fibre based artificial compound eyes for direct static imaging and ultrafast motion detection. Light: Science & Applications, 13, 2649-2667 (2024). DOI: 10.1038/s41377-024-01580-5.
Natural selection has driven arthropods to evolve remarkable natural compound eyes (NCEs) with a unique anatomical structure, providing a promising blueprint for artificial compound eyes (ACEs) to achieve static and dynamic perception in complex environments. Specifically, each NCE uses an array of ommatidia, the imaging units, distributed on a curved surface, which confers many merits. This has inspired the development of many ACEs based on various microlens arrays, but the reported ACEs offer limited performance in static imaging and motion detection. In particular, it is challenging to mimic the apposition modality, in which light rays collected by many microlenses on a curved surface must be transmitted to a flat imaging sensor chip while preserving their spatial relationships without interference. In this study, we integrate 271 lensed polymer optical fibres into a dome-like structure to faithfully mimic the structure of an NCE. Our ACE matches the NCEs in several parameters: 271 ommatidia versus 272 for bark beetles, and a 180° field of view (FOV) versus the 150–180° FOV of most arthropods. In addition, our ACE outperforms typical NCEs by roughly 100 times in dynamic response: 31.3 kHz versus 205 Hz for Glossina morsitans. Compared with other reported ACEs, our ACE enables real-time, 180° panoramic direct imaging and depth estimation within its nearly infinite depth of field. Moreover, our ACE can respond to angular motion up to 5.6×10⁶ deg/s and can distinguish translation from rotation, making it suitable for capturing high-speed objects in applications such as surveillance, unmanned aerial/ground vehicles, and virtual reality.
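As an illustrative back-of-the-envelope check (an assumption for exposition, not a description of the authors' measurement method), the quoted angular-motion limit is consistent with the FOV swept within one response period, i.e. the product of the 180° field of view and the 31.3 kHz dynamic response rate:

```python
# Hedged consistency check: assume the fastest detectable motion is one that
# crosses the full field of view within a single response period.
fov_deg = 180.0        # field of view of the ACE (degrees)
response_hz = 31.3e3   # dynamic response rate (31.3 kHz)

max_angular_speed = fov_deg * response_hz  # degrees per second
print(f"{max_angular_speed:.2e} deg/s")    # close to the reported 5.6e6 deg/s
```

Under this simple bound, 180° × 31 300 Hz ≈ 5.63×10⁶ deg/s, which agrees with the reported 5.6×10⁶ deg/s figure to within about 1%.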