
Metamaterials: Shaping The Future Of Optics And Electromagnetism

Metamaterials are artificial materials engineered to exhibit properties not found in naturally occurring materials, including negative refractive index, perfect absorption of electromagnetic radiation, and tunable optical properties. Researchers have been exploring their use across optics, electromagnetism, and acoustics; one active area is sensing and imaging, where metamaterials enable ultra-compact optical devices such as beam splitters and lenses.
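
To make the negative-refraction idea concrete, the short numerical sketch below applies Snell's law at an interface with a hypothetical negative-index medium (the index value is purely illustrative, not that of any real metamaterial). The refracted angle comes out negative, meaning the ray bends to the same side of the surface normal as the incident ray.

import numpy as np

# Snell's law: n1*sin(theta1) = n2*sin(theta2).
# n2 = -1.5 is a hypothetical negative-index medium used only for illustration.
n1, n2 = 1.0, -1.5
theta1 = np.radians(30.0)                      # angle of incidence
theta2 = np.arcsin(n1 * np.sin(theta1) / n2)   # refraction angle
print(np.degrees(theta2))                      # ~ -19.5 degrees: same-side bending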


The first practical demonstration of a metamaterial was achieved in 2000 by David Smith and his team at the University of California, San Diego. They created a composite material consisting of copper strips and dielectric materials, which exhibited a negative refractive index at microwave frequencies. This breakthrough sparked widespread interest in the field, and soon researchers began exploring various applications of metamaterials.

One of the key areas of research has been the development of optical metamaterials. In 2005, a team led by Xiang Zhang at the University of California, Berkeley demonstrated an optical metamaterial with a negative refractive index, achieved with a fishnet-like structure of silver and dielectric layers. This work paved the way for further research into optical metamaterials and their potential applications in photonics.

Metamaterials have also been explored for their potential use in electromagnetic cloaking devices. In 2006, a team led by David Smith demonstrated the creation of a metamaterial cloak that could bend light around an object, effectively making it invisible. This work was based on earlier theoretical proposals by John Pendry and his colleagues.

Russia’s plasma engine spacecraft could reach Mars in just 30 days

Rosatom scientists have announced the development of a plasma electric rocket engine that they claim could send spacecraft to Mars in just one to two months.

According to Russia’s Izvestia newspaper, the propulsion system uses a magnetic plasma accelerator rather than the fuel combustion of traditional rocket engines, and promises to significantly reduce interplanetary travel time.
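
As a rough sanity check on that timescale (a kinematic sketch, not a model of the Rosatom engine), a spacecraft that accelerates continuously for the first half of the trip and decelerates for the second half covers a distance d in time t = 2*sqrt(d/a). The acceleration below is a hypothetical placeholder chosen only to show what a roughly six-week transit implies.

import numpy as np

AU = 1.496e11                 # astronomical unit, metres
d = 0.52 * AU                 # rough Earth-Mars closest-approach distance, metres
a = 0.02                      # hypothetical sustained acceleration, m/s^2
t = 2.0 * np.sqrt(d / a)      # accelerate-then-decelerate transit time, seconds
print(t / 86400.0)            # ~46 days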



From lead to gold in a flash at the Large Hadron Collider

At the Large Hadron Collider, scientists from the University of Kansas achieved a fleeting form of modern-day alchemy — turning lead into gold for just a fraction of a second. Using ultra-peripheral collisions, where ions nearly miss but interact through powerful photon exchanges, they managed to knock protons out of nuclei, creating new, short-lived elements. This breakthrough not only grabbed global attention but could help design safer, more advanced particle accelerators of the future.
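
The bookkeeping behind the headline is simply proton counting: an element is defined by its atomic number, so removing three protons from a lead nucleus (Z = 82) leaves gold (Z = 79). A toy sketch of that arithmetic, with an illustrative lead-208 starting isotope:

# Knocking protons out of a nucleus lowers both the atomic number Z
# and the mass number A; Z alone determines the element.
ELEMENTS = {79: "Au (gold)", 80: "Hg (mercury)", 81: "Tl (thallium)", 82: "Pb (lead)"}

def remove_protons(z, a, n):
    return z - n, a - n

z, a = remove_protons(82, 208, 3)   # start from lead-208, remove three protons
print(ELEMENTS[z], "A =", a)        # Au (gold) A = 205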

Delivering 1.5 M TPS Inference on NVIDIA GB200 NVL72, NVIDIA Accelerates OpenAI gpt-oss Models from Cloud to Edge

NVIDIA and OpenAI began pushing the boundaries of AI with the launch of NVIDIA DGX back in 2016. The collaborative AI innovation continues with the OpenAI gpt-oss-20b and gpt-oss-120b launch. NVIDIA has optimized both new open-weight models for accelerated inference performance on NVIDIA Blackwell architecture, delivering up to 1.5 million tokens per second (TPS) on an NVIDIA GB200 NVL72 system.

The gpt-oss models are text-reasoning LLMs with chain-of-thought and tool-calling capabilities, built on the popular mixture-of-experts (MoE) architecture with SwiGLU activations. The attention layers use RoPE with a 128k context, alternating between full-context attention and a sliding 128-token window. The models are released in FP4 precision, which fits on a single 80 GB data center GPU and is natively supported by Blackwell.
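
For readers unfamiliar with SwiGLU, the sketch below spells out the gated feed-forward computation in plain NumPy. It illustrates the activation itself under toy dimensions; it is not the gpt-oss implementation, and the weight names are placeholders.

import numpy as np

def swish(x, beta=1.0):
    # Swish / SiLU: x * sigmoid(beta * x)
    return x / (1.0 + np.exp(-beta * x))

def swiglu_ffn(x, W, V, W_out):
    # Gated feed-forward block: (Swish(x @ W) elementwise-times (x @ V)) @ W_out
    return (swish(x @ W) * (x @ V)) @ W_out

rng = np.random.default_rng(0)
d_model, d_ff = 8, 16                          # toy sizes for illustration
x = rng.standard_normal((4, d_model))          # 4 token embeddings
W = rng.standard_normal((d_model, d_ff))
V = rng.standard_normal((d_model, d_ff))
W_out = rng.standard_normal((d_ff, d_model))
print(swiglu_ffn(x, W, V, W_out).shape)        # (4, 8)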

The models were trained on NVIDIA H100 Tensor Core GPUs, with gpt-oss-120b requiring over 2.1 million GPU-hours and gpt-oss-20b about 10x less. NVIDIA worked with several top open-source frameworks such as Hugging Face Transformers, Ollama, and vLLM, in addition to NVIDIA TensorRT-LLM, for optimized kernels and model enhancements. The full NVIDIA post shows how gpt-oss has been integrated across the software platform to meet developers’ needs.
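
As a quick illustration of the framework support mentioned above, here is a minimal sketch of running one of the models through Hugging Face Transformers. The model ID and generation settings are assumptions for illustration only; consult the published model card for the supported configuration.

from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",   # assumed Hugging Face model ID
    torch_dtype="auto",           # let Transformers pick a supported precision
    device_map="auto",            # place weights on the available GPU(s)
)
out = pipe("Explain mixture-of-experts in one sentence.", max_new_tokens=128)
print(out[0]["generated_text"])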
