
What is a superconductor?


Materials can be divided into two categories based on their ability to conduct electricity. Metals, such as copper and silver, allow electrons to move freely and carry electrical charge with them. Insulators, such as rubber or wood, hold on to their electrons tightly and will not allow an electrical current to flow.

In the early 20th century, physicists developed new laboratory techniques to cool materials to temperatures near absolute zero (-273 °C), and began investigating how the ability to conduct electricity changes under such extreme conditions. In some simple elements, such as mercury and lead, they noticed something remarkable – below a certain temperature these materials could conduct electricity with no resistance. In the decades since this discovery scientists have found the same behaviour in thousands of compounds, from ceramics to carbon nanotubes.

We now think of this state of matter as neither a metal nor an insulator, but an exotic third category, called a superconductor. A superconductor conducts electricity perfectly, meaning an electrical current in a superconducting wire would continue to flow round in circles for billions of years, never degrading or dissipating.

Electrons in the fast lane

On a microscopic level the electrons in a superconductor behave very differently from those in a normal metal. Superconducting electrons pair together, allowing them to travel with ease from one end of a material to another. The effect is a bit like a priority commuter lane on a busy motorway. Solo electrons get stuck in traffic, bumping into other electrons and obstacles as they make their journey. Paired electrons on the other hand are given a priority pass to travel in the fast lane through a material, able to avoid congestion.

Superconductors have already found applications outside the laboratory in technologies such as Magnetic Resonance Imaging (MRI). MRI machines use superconductors to generate a large magnetic field that gives doctors a non-invasive way to image the inside of a patient’s body. Superconducting magnets also made possible the recent detection of the Higgs boson at CERN, by bending and focusing the beams of colliding particles.

Superconductors are used in medical Magnetic Resonance Imaging.
Jan Ainali, CC BY

One interesting and potentially useful property of superconductors arises when they are placed near a strong magnet. The magnetic field causes electrical currents to spontaneously flow on the surface of a superconductor, which then give rise to their own, counteracting, magnetic field. The effect is that the superconductor dramatically levitates above the magnet, suspended in the air by an invisible magnetic force.

What prevents more widespread use of these materials is the fact that the superconductors we know about operate only at very low temperatures. In the simple elements, for instance, superconductivity dies out at just 10 kelvin (-263 °C). In more complicated compounds, such as yttrium barium copper oxide (YBa2Cu3O7), superconductivity may persist to higher temperatures, up to 100 kelvin (-173 °C). While this is an improvement on the simple elements, it is still much colder than the coldest winter night in Antarctica.

Scientists dream of finding a material whose superconducting properties can be used at room temperature, but it’s a challenging task. Turning up the temperature tends to destroy the glue that binds the electrons into superconducting pairs, which returns the material to its boring metallic state. One of the great challenges in the field arises from the fact that we don’t yet understand very much about this glue, except in a few limited cases.

From superatom to superconductor

New research from the University of Southern California has taken a novel step towards improving our understanding of how superconductivity arises. Rather than study superconductivity in bulk materials like wires, Vitaly Kresin and his coworkers have managed to isolate and examine small clumps of a few dozen aluminium atoms at a time. These tiny clusters of atoms can act like a “superatom”, sharing electrons in a way that mimics a single, giant atom.

What is surprising is that measurements of these clusters reveal what may be the signature of electron pairing persisting all the way up to 100 kelvin (-173 °C). This is still a frosty temperature of course, but it is 100 times higher than the superconducting temperature of a piece of aluminium wire. Why does a small handful of atoms superconduct at a much higher temperature than the millions of atoms that form a wire? Physicists have some ideas but the effect is largely unexplored, and it might prove an interesting way forward in the quest for superconductivity at higher temperatures.

The need for speed: MagLev trains in Japan use superconductivity to achieve ultra-high speeds.
Shutterstock

Hoverboards anyone?

If physicists were able to achieve the goal of room temperature superconductivity in a material that was easy to fashion into wires, important new technologies would soon follow. For starters, devices which use electricity would become considerably more efficient and consume less power.

Transporting electricity over long distances would also become much easier, which is particularly useful for renewable energy applications – and some have proposed giant superconducting cables linking Europe with solar energy farms in North Africa.

The fact that superconductors will levitate above a strong magnet also creates possibilities for efficient, ultra-high speed trains that float above a magnetic track, much like Marty McFly’s hoverboard in Back to the Future. Japanese engineers have experimented with replacing the wheels of a train with large superconductors that hold the carriages a few centimetres above the track. The idea works in principle, but suffers from the fact that the trains need to carry expensive tanks of liquid helium with them in order to keep the superconductors cold.

Many superconducting technologies will probably remain on the drawing board, or too expensive to implement, unless a room temperature superconductor is discovered. It’s just possible however that the advances made by Kresin’s group might mark a milestone on this journey.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Simultaneous Observation Might Change Our Understanding of Quantum Mechanics


New data could shed light on a decades-old gap in our understanding of quantum mechanics – but how?

There is no shame in struggling to conceptualize quantum mechanics, considering that some of the best minds on the planet struggle with it as well. In fact, the field has confused even the most forward-thinking, capable scientists. The new piece of data comes from a complicated but relatively easy-to-grasp experiment, published on March 2, 2015, entitled “Simultaneous observation of the quantization and the interference pattern of a plasmonic near-field”.

This new result was made possible through a collaboration between the Laboratory for Ultrafast Microscopy and Electron Scattering at EPFL, the Department of Physics of Trinity College (US) and the Physical and Life Sciences Directorate of the Lawrence Livermore National Laboratory. The image was captured with EPFL’s ultrafast energy-filtered transmission electron microscope, one of only two such microscopes in the world.

Until now, the traditional understanding of quantum mechanics has not been able to explain why light and other quantum objects can behave simultaneously as a particle and a wave. The often-referenced experiments demonstrating the effect of observation always left me asking why you can’t try both at the same time – yet no experiment had been able to capture both states of light simultaneously. Science had only recorded evidence of light as waves or as particles; this new photograph captures both from the exact same moment in time. Finally, a novel experiment was devised, developed and executed: simultaneous observation.

 

Traditional particle/wave observation works like this: ultraviolet light hits a metal surface, causing the metal to emit electrons in a predictable, observable time-frame. Until Albert Einstein wrote about what he called the photoelectric effect, light was thought to be a wave. Once the logic is understood, the photoelectric effect is hard proof of light behaving as a particle, able to knock into other particles.
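The textbook relation for the effect is worth writing down: the maximum kinetic energy of an ejected electron depends on the light’s frequency, not on its brightness,

E_{\mathrm{max}} = h\nu - \phi

where h is Planck’s constant, ν is the frequency of the light and φ is the work function, the energy needed to pull an electron out of the metal. A bright enough wave of any colour should eventually shake electrons loose; the fact that light below a threshold frequency ejects none at all is what forced physicists to accept the particle picture.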

Researcher Fabrizio Carbone led his team at EPFL as they performed a modified version of this classic experiment: using electrons to image light. The researchers have captured, for the first time ever, a single snapshot of light behaving simultaneously as both a wave and a stream of particles.

Carbone’s team used nanotechnology to exploit the wave aspect of light and create a standing wave. They used a laser to fire a short pulse of light at a nano-thin metal wire; the light travels along the wire’s surface, and the overlapping, counter-propagating waves form a standing wave. By sending a beam of electrons close to the wire and measuring how their speed changed, the researchers were able to build an image of the wave aspect of the light during the pulse. Those same imaging electrons passed so close to the standing light-wave that they underwent a measurable exchange of energy with it, as only a particle can do.
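The particle-like side of the observation is that this energy exchange did not happen continuously but only in discrete packets – the “quantization” in the paper’s title. In the standard picture of such photon exchange, an electron passing through the light field can only gain or lose energy in whole multiples of the photon energy:

\Delta E = n\hbar\omega, \quad n = \pm 1, \pm 2, \ldots

where ω is the angular frequency of the laser light and ħ is the reduced Planck constant.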

Fabrizio Carbone explains the significance: “This experiment demonstrates that, for the first time ever, we can film quantum mechanics – and its paradoxical nature – directly.”

I wonder what this new way of observing the exact same light in both states will mean for developing applications that involve quantum theory. Carbone gives a great example: “Being able to image and control quantum phenomena at the nanometer scale like this opens up a new route towards quantum computing.”

Jonathan Howard
Jonathan is a freelance writer living in Brooklyn, NY

Tiny capsules can have big impact on carbon capture


By Roger Aines, Lawrence Livermore National Laboratory

Using the same baking soda found in most grocery stores, we and our colleagues from Harvard University and the University of Illinois at Urbana-Champaign have created a significant advance in carbon dioxide capture.

We developed a new type of carbon capture media composed of microcapsules, tiny capsules designed to separate carbon dioxide in flue gases from power plants and other sources of emissions. Our approach offers a number of advantages over current methods.

The capsules are made of a highly permeable polymer shell, similar to what you find on a red kitchen spatula. Inside them is a fluid containing sodium carbonate, a common chemical with many industrial uses. The sodium carbonate reacts with carbon dioxide (CO2) and transforms into harmless, stable sodium bicarbonate – better known as baking soda, the ingredient found in most kitchens.

The capsules keep the liquid contained inside the capsule core while allowing the CO2 gas to pass back and forth through their shells.

During absorption, the CO2 diffuses through the thin capsule shells. Then, the gas dissolves and reacts in the liquid core to form the desired baking soda.
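The underlying chemistry is the ordinary carbonate-bicarbonate reaction, which runs forward during capture and in reverse when the capsules are later heated:

\mathrm{Na_2CO_3 + CO_2 + H_2O \rightleftharpoons 2\,NaHCO_3}

In other words, the capture step simply converts sodium carbonate and CO2, in water, into baking soda.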

To extract the gas, the microcapsules are heated, releasing high-purity CO2 that can subsequently be compressed for storage or utilization. Once the CO2 is removed, the capsules can be reused.

To date, microcapsules have been used for controlled delivery and release of pharmaceuticals, food flavoring and cosmetics. But this is the first demonstration of this approach for controlled CO2 capture and release.

Material innovation

The aim of carbon capture is to prevent the release of large quantities of CO2 — a greenhouse gas that traps heat and makes the planet warmer — into the atmosphere from burning fossil fuel for power generation and other industries.

However, currently used methods, while successful, can be harmful to the environment. One well-established carbon capture method is to use chemicals called amines to react chemically with CO2 and thus separate it from effluent gas. Our method is more environmentally benign because it uses carbonates rather than caustic fluids, such as monoethanolamine, to capture CO2 – a key attribute of our research.

The capsules on a quarter for scale.
Lawrence Livermore National Laboratory

Also, the microcapsules only react with the gas of interest (in this case CO2).

The encapsulation process was developed as one of the Department of Energy’s inaugural Advanced Research Projects Agency-Energy (ARPA-E) innovative carbon capture projects. The new process can be used in a wide range of situations. It can be designed to work with coal or natural gas-fired power plants, as well as in industrial processes like steel and cement production.

The technique is not a single, short-term solution to carbon capture, but a broad, sustainable approach. The sodium carbonate used in the process is mined domestically, rather than being made in a complex chemical process like the amines used in current technology. In addition, the carbonate has no recycling or degradation issues: it can be reused indefinitely because the capsules are mechanically robust, whereas amines break down over a period of months to years.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Documents Link Leading Climate Denier to Corporate Funds


2015 follows what scientists have roundly labeled the hottest year on record – and the new year has already produced the second warmest January on record. Although climate deniers continue to dismiss the data as skewed, misinterpreted, or merely controversial, the numbers are clear, and the refusal to act looks more and more like a scam with each passing year.

Deniers benefit from making climate change appear to be a matter of controversy. Don’t mention that 97 percent of the world’s climate scientists have reached a consensus on the evidence – say instead that there are cycles, natural periods of warming and cooling; that at one point we thought we were in danger of an ice age (read: one year and misinterpreted data); that there is no consistent warming and hasn’t been for decades; or – yes – the climate is changing, but we don’t know how fast or even whether people are responsible for the pattern. Most of all, there’s money to be made in green energy – to say nothing of the oil industry. None of these arguments really holds much water if looked at carefully, and some even seem to conflict with one another. Yet we fall right in line, because in science nothing is absolute: there’s always room for debate. Winters are still happening, so maybe the polar ice caps are safe after all. Often, anyone who has made up their mind on global warming will have a set of statistics they are ready to jump to – many of which were made possible by one man, Wei-Hock Soon.

Soon, who takes the “it’s not our fault” approach to the climate controversy, has received over $1.2 million from several energy companies over the last decade, according to The New York Times. In particular, these include oil and gas corporations, which tend to pay a bit more than solar energy companies – a tip for any scientists thinking they can get wealthy by investing in renewable energy. This is hardly the first time that Soon has been generously compensated for his work. Back in 2011, he received $131,000 from ExxonMobil, funds allocated to study what role the sun had to play in climate change. Despite having minimal credentials in climate science, Soon argues that an increase in sunspots is the direct cause of climate change.

This is little more than a long-recycled argument among deniers – that sunspots emerging over the last 100 years have caused the Earth’s surface temperatures to increase. In case you were wondering, the opposite is true: over the past 35 years the sun has shown a slight cooling trend while surface temperatures have continued to climb. The only way to draw a correlation between the two would be to deliberately cherry-pick the data, revealing only a few years when slight increases in the sun’s energy coincided with high temperatures. The overwhelming consensus, in fact, is that changes in the sun can account for at most ten percent of the observed climate change. Soon’s research, however, is indicative of something else.

“What it shows is the continuation of a long-term campaign by specific fossil-fuel companies and interests to undermine the scientific consensus on climate change,” said Kert Davies, executive director of the Climate Investigations Center, in a statement to the Times. The journals in which Soon published his work are now investigating the matter further, as Soon evidently failed to report conflicts of interest while his papers were under peer review – at least eleven times since 2008 – a breach of publication standards and ethical guidelines.

It has become something of an irritating post-holiday tradition over the last few winters to have a chorus of deniers accompany each gust of cold wind, insisting that the miserable weather proves they were right all along. Deniers (not climate skeptics) have not shown this much delight over blasts of Arctic wind since the so-called Climategate scandal of 2009, when right-wing bloggers quote-mined the hacked e-mails of several climatologists to imply that they had cooked the data used in their graphs.

Among the companies that funneled money to Soon were API, Exxon Mobil, Southern Company and Koch Industries; much of the money was sent in the form of anonymous donations through the organization DonorsTrust.

“I think that’s inappropriate behavior,” said Charles R. Alcock, director of the Harvard-Smithsonian Center for Astrophysics. “This frankly becomes a personnel matter, which we have to handle with Dr. Soon internally.” Soon is a part-time employee of the Smithsonian Institution and has sometimes been falsely represented as an astrophysicist. Greenpeace was able to request the release of the documents through the Freedom of Information Act because the Smithsonian is a government agency.

In the past, corporations have shelled out sizable sums of money to cover up the harmfulness of products like lead-based paint or tobacco. Climate change, however, affects just about every aspect of life as we know it, with the potential to cause an unprecedented amount of harm throughout the world – and we are already feeling the effects. It is time to approach cases of corporate-financed climate denial with that in mind.

James Sullivan
James Sullivan is the assistant editor of Brain World Magazine and a contributor to Truth Is Cool and OMNI Reboot. He can usually be found on TVTropes or RationalWiki when not exploiting life and science stories for another blog article.

Could Dark Matter Be Behind Earth’s Extinction Events?


Earth’s orbit along the Galactic Disc is a long yet predictable journey that lasts for eons, but not one without consequence, as Michael Rampino, a professor of biology at New York University, recently observed. Rampino’s newest research, published in the Monthly Notices of the Royal Astronomical Society, suggests that these infrequent passages through the disc have coincided with the state of life on Earth.

While we often think of the comet that brought a rather dramatic end to the dinosaurs 65 million years ago when we hear of extinction, it was hardly the first time that many species died out together. Nor was it even remotely the worst. That distinction belongs to the Permian-Triassic extinction event, which occurred 252 million years ago and, like the others, coincided with a passage through the Galactic Disc; an estimated 83 percent of all genera became extinct, owing not only to volcanic events but also to ocean acidification and the impact of several meteors. It took approximately 10 million years for much of the life left on Earth to replenish itself. While it took more than one event to make things hostile for life on Earth, Rampino attributes the increased number of meteoric impacts to a buildup of dark matter, which may upset the orbits of comets and also increase heating at the Earth’s core – igniting volcanic activity, a trend currently being seen in Iceland.

Even the era that paleontologists regard as the golden age of dinosaurs – the Jurassic period, in which some of the largest species of sauropods thrived – was preceded by another violent extinction event, the Triassic-Jurassic extinction, which took place 201.3 million years ago. That die-off unfolded over roughly 10,000 years, driven partly by increased activity in a massive volcanic region known as the Central Atlantic Magmatic Province, as well as several meteoric events in Europe. Today these periods are known by the rock layers they left behind, yet it is clear that each was caused by the same violent reactions in nature and the resulting changes in climate.

The Galactic Disc is the region of the Milky Way Galaxy that defines its shape and contains our solar system, amid a heavy clutter of stars, clouds of cosmic dust and reactive gases. Surrounding the disc, however, is the elusive dark matter – particles known primarily through the remarkable gravitational pull they exert, while remaining invisible to light and every other form of electromagnetic radiation.

According to prior studies, the solar system completes an orbit around the Galaxy, travelling along the Galactic Disc, once every 250 million years. The path is not flat but wavy, as the Sun and its planets weave their way in and out of the crowded disc at intervals of approximately 30 million years. The Cretaceous-Paleogene extinction also coincides with this pattern.

So why does dark matter in particular seem like the culprit in these occurrences? As the solar system moves through the disc, concentrations of dark matter can intensify to the point that they begin to throw comets off course; sometimes this instability sends them colliding with the planet – impacts that have shaped Earth throughout its history, and that may even have supplied it with the very amino acids necessary to sustain life. But dark matter may also affect our planet in another, more pernicious way.

Rampino found that, as the Earth is exposed to dark matter on these passages, the particles could build up within the planetary core, producing intense heat as they collide and annihilate inside. Eventually the heat builds up considerable pressure, leading to mountain building, volcanic eruptions, and even reversals of the planet’s magnetic field. The history of rises and falls in sea level also shows a peak roughly every 30 million years.

This new model of dark matter and its interactions with planets as they move across the Galaxy could significantly change how we understand geological and biological development. Our current picture of the Earth’s natural history is already one of violent and destructive events, and dark matter could be a critical cause behind many of them. Meanwhile, in what geologists have petitioned to call the Anthropocene epoch (the Age of Humans, so named because our species has shaped the planet for better or for worse), many researchers believe we are in the midst of a sixth extinction event – with climbing levels of CO2 adding to the acidification of the ocean each year. Like dark matter, humans have the power to impact the universe too.

To put this all in perspective, Rampino said in his paper: “We are fortunate enough to live on a planet that is ideal for the development of complex life. But the history of Earth is punctuated by large scale extinction events, some of which we struggle to explain. It may be that dark matter — the nature of which is still unclear but which makes up around a quarter of the universe — holds the answer. As well as being important on the largest scales, dark matter may have a direct influence on life on Earth.”

James Sullivan
James Sullivan is the assistant editor of Brain World Magazine and a contributor to Truth Is Cool and OMNI Reboot. He can usually be found on TVTropes or RationalWiki when not exploiting life and science stories for another blog article.

For a scientist, there really is no such thing as a stupid question


By Jillian Scudder, University of Sussex

I am an astrophysicist. I run into a lot of people who are extremely curious about space, how it works, or what it is that I study in particular. Many of these questions begin with an apology: “Not to sound like an idiot, but …” or “Sorry if this is a stupid question”. It doesn’t have to be this way.

Quite often I hear: “Oh! I have so many questions! I’ve always wondered …” I get a lot of extremely interesting questions this way. Some of the questions are easy to answer. Some of them require a much more nuanced response, with a careful understanding of how technical I can make my reply. Some questions are about things I’ve never thought of before, and for those I can only offer basic insight as an answer.

Ask without fear

One of the best things about coming into your own as a scientist is that you learn to let go of the shame that society often attaches to not knowing something. As a scientist, it is not the “not knowing” that bothers me. It is not wanting to fix that lack of knowledge that bothers me. I don’t know the answers to many things – but if I meet someone who might know the answer, I will certainly ask.

You don’t have to think of a “good question”, just ask the questions that are on your mind. Good scientists won’t judge your ignorance. I expect most people to know much less about outer space than I do – most people haven’t spent nearly as much time studying it as I have.

It is not a waste of my time to help you understand the universe. If you sincerely have a question, and have gone to the trouble of articulating it, most scientists will respect the question and the curiosity that prompted you to ask it. They will generally answer it as well as they can.

On a cautionary note – not every scientist is always going to be in the mood to answer questions about their work. If we are hunkered over a computer, typing furiously, with headphones in, it is probably not a good time. And not all scientists will have the skills required to break their subject down to the right level for any given query. But almost universally scientists like talking about the things they have spent so much time learning about.

Astrocurious Anonymous

I get so many questions that I have launched a blog, Astroquizzical, where I write up longer-form answers to the questions people ask, in the hope that they will be useful reading for other people. The blog is my contribution – an attempt to create a friendly and non-intimidating space for the public to ask questions. To that end, it even accepts anonymous submissions, so there is no way for me to figure out who submitted a question.

Society would be in a better place if we were less fearful of looking ignorant in the face of knowledge. Allow yourself to be curious and ask your questions. If someone makes fun of your question – ask someone else. If you get a good answer, great. You will have learnt something you didn’t know – say thank you to your temporary teacher.

Your questions are valid. You don’t need to be ashamed of not knowing things, especially things that people spend their lives learning about. So if you find a friendly person or venue where you can ask your questions and get satisfactory answers, make the most of it. You never know what you might learn.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Astronomers find oldest known star


International astronomers say they have found the oldest known star encircled by five Earth-sized planets, signaling that planets formed throughout the history of the universe. The system is 11.2 billion years…

Light technologies illuminate global challenges


During these dark winter months, spare a thought for artificial lights. From strings of lights adding holiday cheer to artificial sunlamps alleviating seasonal affective disorder, they brighten our days. And light’s applications can go much further than that. The United Nations designated 2015 as the International Year of Light and Light-Based Technologies to raise awareness of how photonic technologies offer solutions to international challenges. Light technology is now an active area of research in energy, health and agriculture.

First lighting the way

Thomas Edison with some of his incandescent bulbs.

In the late 1800s, Thomas Edison created a practical light bulb, an electrically powered, long-lasting light source that significantly changed our work, play and sleep habits. The ability to control light in new ways transformed how we experience and see the world. Light-based technologies such as optical fiber networks allow us to connect rapidly with people worldwide over the internet. Light-emitting diodes (LEDs) are now everywhere, from consumer electronics like smart phones to light bulbs for home lighting.

CoeLux’s artificial skylight harnesses technology to mimic our most vital light source: the sun.
James Holloway, CC BY-NC

One recent example is the artificial skylight invented by researchers who spent over ten years refining the CoeLux system. This invention, which received the Lux Awards 2014 Light Source Innovation of the Year, can fill a room’s ceiling, mimicking sunlight from different latitudes, from the equator to northern Europe. The key to its success in replicating a sunny sky is the use of nanostructured materials that scatter light from LEDs in the same way tiny particles in the atmosphere scatter sunlight – so-called Rayleigh scattering. Funding for this project from the European Commission enabled scientific advances in light management and nanotechnology, as well as the completion of a device that may improve quality of life in indoor settings, from hospitals to underground parking garages.
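Rayleigh scattering strongly favours short wavelengths: the scattered intensity grows with the inverse fourth power of the wavelength,

I_{\mathrm{scattered}} \propto 1/\lambda^{4}

so blue light at about 450 nm is scattered roughly (650/450)^4 ≈ 4 times more strongly than red light at 650 nm – which is why both the real sky and a Rayleigh-scattering ceiling look blue.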

Blue LEDs were the missing link.
Pete Brown, CC BY

Illuminating research

Only recently has the full utility of LEDs been realized for general lighting. While red and green LEDs had been in commercial use for more than a decade, the missing color for producing white light was blue. Isamu Akasaki, Hiroshi Amano, and Shuji Nakamura cracked the blue conundrum in the early 1990s. Now, thanks to their work, white light LEDs are ubiquitous. In recognition of this energy-saving invention, they received the Nobel Prize in Physics last year.

Light was also recognized in last year’s Nobel Prize in Chemistry, awarded for light-based microscopy tools that use a few tricks to sense the presence of a single molecule. Microscopy had been limited by diffraction: two adjacent objects can only be resolved if they are separated by more than half the wavelength of the light used for imaging. But Nobel laureates Eric Betzig, Stefan Hell and W.E. Moerner took different approaches, based on similar principles, to control the fluorescence of individual molecules and so get beyond the diffraction barrier, viewing the molecules in high detail. By turning the light emitted from the molecules on or off, the scientists could reconstruct the locations of the molecules at the nanometer scale.
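In its simplest form, the diffraction limit described above says that two points closer together than about half a wavelength blur into a single spot,

d_{\mathrm{min}} \approx \lambda/2

For green light with a wavelength of about 500 nm, that is roughly 250 nm – enormous compared with a single protein, which is only a few nanometres across.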

Microscope images of human protein vimentin. Note the higher resolution on the right.
Fabian Göttfert, Christian Wurm, CC BY-SA

Here’s how it works: a fraction of the fluorescent molecules or proteins is first excited by a weak light pulse. Then, after their emission fades, another subgroup of fluorescent molecules is excited. This cycle of on and off continues, and the resulting images are processed and superimposed to form a high-resolution map of individual proteins. We have only just begun to peer into the nano-world of living cells and observe, for example, how proteins aggregate in the earliest stages of diseases like Alzheimer’s and Huntington’s. Understanding disease progression at the single-molecule level could help identify when early intervention might be advantageous.
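For readers who like to see the localize-and-superimpose idea spelled out, here is a minimal simulated sketch in Python. All numbers and names are invented for illustration – this is not the laureates’ actual method – but the structure mirrors the cycle described above: only a few molecules are “on” in each frame, each bright spot is localized far more precisely than the diffraction-limited blur, and the accumulated positions are binned into a super-resolved map.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" structure: a row of fluorophores spaced 5 nm apart,
# far closer than the ~250 nm diffraction limit.
true_positions = np.array([[100.0 + 5.0 * i, 200.0] for i in range(40)])  # nm

PSF_SIGMA_NM = 130.0   # width of the diffraction-limited spot
PHOTONS = 500          # photons collected per molecule per frame
MAP_BIN_NM = 10.0      # bin size of the super-resolved map

localizations = []
for frame in range(2000):
    # Only a small random subset of molecules is "on" in any one frame
    # (the on/off cycle described in the text).
    active = true_positions[rng.random(len(true_positions)) < 0.02]
    for x, y in active:
        # Each detected photon is blurred by the point-spread function...
        photons = rng.normal(loc=[x, y], scale=PSF_SIGMA_NM, size=(PHOTONS, 2))
        # ...but the molecule's position is the centroid of those photons,
        # known to roughly sigma / sqrt(N), i.e. a few nanometres.
        localizations.append(photons.mean(axis=0))

# Superimpose all localizations into one high-resolution 2D map.
loc = np.array(localizations)
super_res_map, _, _ = np.histogram2d(
    loc[:, 0], loc[:, 1],
    bins=[np.arange(0.0, 400.0, MAP_BIN_NM), np.arange(100.0, 300.0, MAP_BIN_NM)],
)
print(len(loc), "localizations, precision ~",
      round(PSF_SIGMA_NM / PHOTONS ** 0.5, 1), "nm")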

Let there be light in the darkness.
martinak15, CC BY

Investors must see the light

Light is a unifying science across fields like chemistry and physics, improving our lives and the world. But learning how to manipulate light is costly and takes time. Technologies are largely built on investments in basic science research as well as, of course, serendipity and circumstantial opportunities. Take LEDs for example. Research in blue LEDs started more than 40 years ago at Radio Corporation of America, but changes in the company’s funding structure stymied their development for two decades — until last year’s Nobel Prize winners solved the materials problem and the scale-up process.

Continued and sustained support of fundamental research is critical for future technologies not yet imagined or seen but that could have a transformative impact on our daily lives. For example, in agriculture, more effective harvesting of solar energy and its conversion into heat via greenhouses could enable year-round production as well as access to crops not currently available in certain climates.

(Left) Cartoon of nanoparticle lasers. (Right) Electron microscopy image of an array of bow-tie nanolasers.
Teri Odom, CC BY-ND

In my own work as a chemistry researcher, my group invented a laser the size of a virus particle, which should not be possible based on traditional ways to control light but is, thanks to metal nanoparticles that can squeeze light into small volumes. These tiny lasers are promising light sources that can be used to send and receive data with high bandwidths as well as to detect trace molecules or bio-agents.

Construction of our nano-laser required precise control over the shape and location of the adjacent gold nanoparticles. That such nanostructures could even be made is because of the decades-long investment by the electronics industry in developing nanofabrication tools to make the tiny components in computers. Investments in both fundamentals and applications are critical, as has been highlighted by last year’s Nobel Prizes in Chemistry and Physics.

The UN’s designation of this International Year of Light will spotlight the potential of these kinds of innovations and the need to continue investing in future technologies. From new ways to shake off those winter blues to manipulating light in small spaces, the trajectory for artificial light is bright indeed.

The Conversation

This article was originally published on The Conversation.
Read the original article.