Jeremy Dylan Batterson – Lifeboat News: The Blog
https://lifeboat.com/blog
Safeguarding Humanity

WiMi Developed Metasurface Eyepiece for Augmented Reality with Ultra-wide FOV
https://lifeboat.com/blog/2023/09/wimi-developed-metasurface-eyepiece-for-augmented-reality-with-ultra-wide-fov
Fri, 08 Sep 2023

Metalens for AR and VR.


BEIJING, Sept. 8, 2023 /PRNewswire/ — WiMi Hologram Cloud Inc. (NASDAQ: WIMI) (“WiMi” or the “Company”), a leading global Hologram Augmented Reality (“AR”) technology provider, today announced that it has developed a metasurface eyepiece for augmented reality, built from artificially fabricated subwavelength structures. The eyepiece combines a special optical design with an engineered anisotropic optical response to achieve an ultra-wide field of view (FOV), full-color imaging, and a high-resolution near-eye display.

At the heart of WiMi’s design is a see-through metalens with a high numerical aperture (NA), a large area, and broadband operation. Its anisotropic optical response lets it perform two optical functions simultaneously. First, it images virtual information, acting as an imaging lens for the virtual content. Second, it transmits ambient light, serving as a transparent window onto the real-world scene. This design allows the transparent metalens to be placed directly in front of the eye without additional optics, resulting in a wider FOV.
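WiMi's actual design is proprietary, but the textbook starting point for any high-NA flat lens is the hyperbolic phase profile, which focuses a normally incident plane wave to a single point. A minimal sketch (all numbers illustrative, not WiMi's specifications):

```python
import numpy as np

def metalens_phase(r, f, wavelength):
    """Ideal hyperbolic phase profile (radians) a flat metalens must
    impose at radius r to focus a plane wave to focal length f."""
    return -2 * np.pi / wavelength * (np.sqrt(r**2 + f**2) - f)

def numerical_aperture(radius, f):
    """NA (in air) of a metalens with the given semi-diameter and focal length."""
    return radius / np.sqrt(radius**2 + f**2)

# Example: 2 mm semi-diameter, 1 mm focal length, green light (532 nm).
# Short focal lengths relative to the aperture give the high-NA regime.
na = numerical_aperture(2e-3, 1e-3)
phase_at_edge = metalens_phase(2e-3, 1e-3, 532e-9)
```

The phase is zero at the lens center and grows toward the edge; the nanostructure pattern is chosen so each subwavelength element locally imposes this phase.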

The metalens is fabricated by nanoimprint lithography, which can replicate subwavelength structures over large areas. First, a mould (template) carrying the desired structure is prepared. The mould is then brought into contact with a transparent substrate, and the nanoscale structure is transferred by applying pressure and heat. This imprinting step replicates the subwavelength structure of the metalens onto the substrate, forming the finished lens.

Sending Mixed Signals
https://lifeboat.com/blog/2023/07/sending-mixed-signals
Sat, 01 Jul 2023

Doing away with limiting computer data to 1s and 0s could increase speeds by orders of magnitude for a given volume of chip function.


This tiny photonic chip can multiplex optical data transmissions to support the next generation of massively scalable AI applications.
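The article does not detail the chip's architecture, but the core idea of multiplexing several data channels onto one optical medium can be sketched with ordinary frequency-division multiplexing. The carriers, bandwidth, and bit values below are made up for illustration:

```python
import numpy as np

# Two on/off-keyed data channels share one "waveguide" signal by riding
# on different carrier frequencies; a receiver separates them by
# filtering around each carrier in the frequency domain.
fs = 10_000                      # samples per second (hypothetical)
t = np.arange(fs) / fs           # one second of samples
f1, f2 = 500.0, 1500.0           # carrier frequencies (hypothetical)
bit1, bit2 = 1.0, 0.0            # one data bit per channel
signal = bit1 * np.sin(2 * np.pi * f1 * t) + bit2 * np.sin(2 * np.pi * f2 * t)

def demux(sig, carrier_hz, bw=50.0):
    """Recover a channel's power by selecting bins near its carrier."""
    spectrum = np.fft.rfft(sig)
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    mask = np.abs(freqs - carrier_hz) < bw
    return np.sum(np.abs(spectrum[mask]) ** 2) / len(sig)

ch1_power = demux(signal, f1)    # large: channel 1 sent a "1"
ch2_power = demux(signal, f2)    # near zero: channel 2 sent a "0"
```

Photonic chips do the same separation with physical wavelength filters rather than an FFT, which is what allows many channels to share one waveguide in parallel.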

Potential of E-beam lithography for micro- and nano-optics fabrication on large areas
https://lifeboat.com/blog/2023/06/potential-of-e-beam-lithography-for-micro-and-nano-optics-fabrication-on-large-areas
Mon, 26 Jun 2023

Large area metalenses are on the horizon.


In this section, we briefly discuss the presented technique of VSB/CP e-beam writing in comparison with two other contemporary high-resolution lithographic patterning approaches: multibeam e-beam lithography, also known as complementary electron beam lithography (CEBL),16 and optical lithography.

Today’s optical lithography tools are, in principle, well able to address the feature sizes of the elements presented in the previous section. Exposure in those tools is based on demagnified imaging of a photomask that contains the pattern. With this parallel approach, optical lithography is always much faster than any direct-write technique. The central question in choosing between optical lithography and VSB/CP-based e-beam lithography is therefore the effort needed to achieve the required optical performance. For optical lithography, this depends on the quality of the mask: a high-resolution optical pattern may require a very fine approximation of the mask pattern, leading to long writing times in the mask shop and thus to considerable cost. Consequently, the choice between the techniques must take into account the number of elements required and the price at which the final consumer product can be sold. This cannot be generalized.
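The cost argument can be made concrete with a toy amortization model: mask-based optical lithography pays a large one-time mask cost but a small per-element exposure cost, while direct-write e-beam pays mostly per element. All dollar figures below are hypothetical:

```python
def cost_per_element(n_elements, fixed_cost, variable_cost):
    """One-time cost (e.g. a mask set) amortized over the production run,
    plus a per-element cost (exposure or write time)."""
    return fixed_cost / n_elements + variable_cost

def break_even(mask_cost, optical_per_unit, ebeam_per_unit):
    """Run size above which mask-based optical lithography becomes cheaper
    than direct-write e-beam (assumed to have no mask cost)."""
    return mask_cost / (ebeam_per_unit - optical_per_unit)

# Hypothetical: $50k mask set, $5/element optical exposure,
# $500/element of direct e-beam write time.
n_star = break_even(50_000, 5, 500)          # ~101 elements
small_run_ebeam = cost_per_element(50, 0, 500)       # cheaper below n_star
small_run_optical = cost_per_element(50, 50_000, 5)  # mask cost dominates
```

Below the break-even run size, direct write wins; well above it, the mask amortizes away, which is why the choice "cannot be generalized" without knowing volumes and sale price.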

Canon developing world-first ultra-high-sensitivity ILC equipped with SPAD sensor, supporting precise monitoring through clear color image capture of subjects several km away, even in darkness
https://lifeboat.com/blog/2023/04/canon-developing-world-first-ultra-high-sensitivity-ilc-equipped-with-spad-sensor-supporting-precise-monitoring-through-clear-color-image-capture-of-subjects-several-km-away-even-in-darkness
Mon, 03 Apr 2023

The first SPAD camera.


TOKYO, April 3, 2023—Canon Inc. announced today that the company is developing the MS-500, the world’s first ultra-high-sensitivity interchangeable-lens camera (ILC) equipped with a 1.0-inch Single Photon Avalanche Diode (SPAD) sensor featuring the world’s highest pixel count of 3.2 megapixels. The camera leverages the special characteristics of SPAD sensors to achieve superb low-light performance while also utilizing broadcast lenses that deliver high performance at telephoto focal lengths. Thanks to these advantages, the MS-500 is expected to be ideal for applications such as high-precision monitoring.
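What makes SPAD sensors attractive in low light is that each pixel registers individual photons as digital events, so analog read noise is effectively eliminated and only photon shot noise remains. A back-of-the-envelope SNR comparison (the read-noise figure is a hypothetical value for a conventional sensor, not a Canon specification):

```python
import math

def snr_conventional(signal_e, read_noise_e):
    """SNR of a conventional pixel: shot noise plus Gaussian read noise
    (dark current ignored for simplicity)."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

def snr_photon_counting(signal_e):
    """A photon-counting (SPAD) pixel is limited by shot noise alone."""
    return signal_e / math.sqrt(signal_e)

# At 4 detected photons, a conventional pixel with 3 e- read noise is
# swamped by its own readout, while the SPAD pixel keeps the full
# shot-noise-limited SNR.
weak_conventional = snr_conventional(4, 3)
weak_spad = snr_photon_counting(4)
```

The gap is largest at the faintest signals, which is exactly the regime of km-range imaging in darkness.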

The MS-500

NeRF in the Dark: High Dynamic Range View Synthesis from Noisy Raw Images
https://lifeboat.com/blog/2023/03/nerf-in-the-dark-high-dynamic-range-view-synthesis-from-noisy-raw-images
Sat, 04 Mar 2023

ALGORITHMS TURN PHOTO SNAPSHOTS INTO 3D VIDEO AND/OR IMMERSIVE SPACE. The technique has been termed “Neural Radiance Fields.” Now Google wants to turn Google Maps into a gigantic 3D space. Three videos below demonstrate the method: 1) a simple demonstration, 2) Google’s immersive maps, and 3) using this principle to make dark, grainy photographs clear and immersive.

This technique is different from “time of flight” cameras, which build a 3D snapshot from the time light takes to travel to and from objects. Combined with that technology, however, and with a constellation of microsatellites as small as cell phones, a new version of “Google Earth” with live, continual imaging of the whole planet could eventually be envisioned.
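At the core of Neural Radiance Fields is a simple volume-rendering rule: a pixel's color is an alpha-composite of density and color samples taken along the camera ray. A minimal sketch of that standard compositing rule, with made-up densities standing in for a trained network:

```python
import numpy as np

def composite(densities, colors, deltas):
    """NeRF volume rendering along one ray:
    w_i = T_i * (1 - exp(-sigma_i * delta_i)), where the transmittance
    T_i is the product of (1 - alpha_j) for all samples j in front of i."""
    alpha = 1.0 - np.exp(-densities * deltas)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = trans * alpha
    return weights @ colors, weights

# Toy ray with 3 samples: a faint white haze, a greenish mid layer,
# and a dense red surface at the back.
sigma = np.array([0.5, 5.0, 50.0])            # volume densities
rgb = np.array([[1.0, 1.0, 1.0],
                [0.0, 1.0, 0.0],
                [1.0, 0.0, 0.0]])
delta = np.array([0.1, 0.1, 0.1])             # distances between samples
pixel, w = composite(sigma, rgb, delta)
```

Training adjusts the network that produces `sigma` and `rgb` so that rendered pixels match the input photos from every camera pose; the 3D structure falls out as a byproduct.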

2) https://www.youtube.com/watch?v=EUP5Fry24ao

3)


We present RawNeRF, a method for optimizing neural radiance fields directly on linear raw image data. More details at https://bmild.github.io/rawnerf.

CVPR 2022

This Next-Generation Display Technology Is Going to Change the World
https://lifeboat.com/blog/2023/01/this-next-generation-display-technology-is-going-to-change-the-world
Mon, 16 Jan 2023

“MicroLED” or electroluminescent quantum dot (QD) screens and sensors are coming to your neighborhood soon. The linked article states: “What does this mean? Just about any flat or curved surface could be a screen. This has long been the promise of a variety of technologies, not to mention countless sci-fi shows and movies, but electroluminescent QD has the potential to actually make it happen.”


We’ve seen a new, top-secret prototype display technology that will soon be in TVs, phones and more.

Google’s New AI Learned To See In The Dark! 🤖
https://lifeboat.com/blog/2023/01/googles-new-ai-learned-to-see-in-the-dark-%f0%9f%a4%96
Sat, 07 Jan 2023

GOOGLE’S NEW SENSOR-DENOISING ALGORITHM brings yet another game changer for LOW LIGHT PHOTOGRAPHY. Within a handful of years this will join other factors coming down the pipe, giving further impetus to a revolution in night vision. The video below speaks for itself. In effect, the system takes a series of images from different angles, exposures, and so on, then accurately reconstructs what is missing:
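The linked paper (RawNeRF) trains directly on the noisy raw pixels using a relative mean-squared error: each error is normalized by a stop-gradient of the rendered value, so dark pixels are not drowned out by bright ones. A sketch of that loss in plain NumPy (no autodiff here, so the stop-gradient is simply treated as a constant weight):

```python
import numpy as np

def rawnerf_style_loss(rendered, noisy_raw, eps=1e-3):
    """Relative MSE in the spirit of RawNeRF: normalize each residual by
    the (detached) rendered value, so errors in shadows count as much as
    errors in highlights; eps guards against division by zero."""
    weight = 1.0 / (rendered + eps)   # stand-in for stop-gradient(rendered)
    return np.mean(((rendered - noisy_raw) * weight) ** 2)

# The same absolute error is penalized far more in a dark region
# than in a bright one.
dark_loss = rawnerf_style_loss(np.array([0.1]), np.array([0.2]))
bright_loss = rawnerf_style_loss(np.array([0.8]), np.array([0.9]))
```

Averaging this loss over many noisy raw views is what lets the trained scene representation come out cleaner than any single input frame.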



📝 The paper “NeRF in the Dark: High Dynamic Range View Synthesis from Noisy Raw Images” is available here:
https://bmild.github.io/rawnerf/index.html.


Quantum Dots Enable Spacecraft-as-Sensor Concept
https://lifeboat.com/blog/2022/08/quantum-dots-enable-spacecraft-as-sensor-concept
Wed, 24 Aug 2022

A nano-scale sensor technology records precise signatures of light striking a surface.

Metasurface Optics for Space Applications
https://lifeboat.com/blog/2022/07/metasurface-optics-for-space-applications
Wed, 20 Jul 2022

Metamaterial Space Applications:


In this presentation I will talk about nanophotonics, more specifically metasurfaces – subwavelength patterned surfaces – and explain how this can be used for space applications. As recently displayed by the stunning images from the James Webb space telescope, we often rely on recording the intensity of light (e.g. with a camera) to study the universe. However, light fundamentally has several additional degrees of freedom which can carry information, e.g. polarization, phase, and spectral content. While it is true that many conventional optical components can address these degrees of freedom individually (e.g., polarizers, phase retarders, and filters), metasurfaces enable general manipulations of phase, amplitude, and polarization on the nanoscale, thereby providing ample opportunity to create new versions of existing components and even enable functionality not possible using conventional technologies. In the presentation I will cover several examples of metasurfaces I have been working on and explain their relevance for space applications. I will attempt to explain the working principles, why metasurfaces can be useful, as well as how we fabricate metasurfaces in a cleanroom.

About the speaker: Dr. Tobias Wenger is a postdoc at JPL’s Microdevices Laboratory (MDL), where his main efforts relate to nanophotonics (light at the nanoscale) and how we can engineer structures and components to control light in new ways. Tobias received his PhD from Chalmers University of Technology, Sweden, where he worked on understanding the physical properties of plasmons in graphene.

At JPL, Tobias applies his knowledge of subwavelength electromagnetics to design metasurface-based optical components, mainly for infrared wavelengths. Metasurfaces are a novel approach to optics that uses subwavelength elements to control the phase, amplitude, and polarization of transmitted and/or reflected electromagnetic radiation. Tobias’s research interests intersect optics, computational electromagnetics, and microfabrication, and he enjoys both the practical and theoretical aspects of this work. During his postdoc at MDL, he has worked on metasurface-based optical concentrators, IR detectors, plasmonic filters, wavefront sensing, and grating replication.

Gigajot Announces the World’s Highest Resolution Photon Counting Sensor
https://lifeboat.com/blog/2022/04/gigajot-announces-the-worlds-highest-resolution-photon-counting-sensor
Tue, 05 Apr 2022

41 Megapixel Quanta Image Sensor’s Low Light and HDR Imaging Capabilities with Small Pixels are Unrivaled in the Market.

PASADENA, Calif., April 4, 2022 /PRNewswire/ — Gigajot Technology, inventor and developer of Quanta Image Sensors (QIS), today announced the expansion of its groundbreaking QIS product portfolio with the GJ04122 sensor and the associated QIS41 camera. With market-leading low read noise, the GJ04122 sensor is capable of photon counting and photon-number resolving at room temperature. The QIS41 camera, built around the GJ04122 sensor, pairs well with standard 4/3-inch microscopy optics, bringing unparalleled resolution and low-light performance to scientific and industrial imaging applications.
