
Tackling the World’s Energy Challenges: 7 Recent Innovations from the U.S. National Laboratories

Needless to say, basic and applied research play a crucial role in addressing the world’s energy and environmental issues. Scientific research is the foundation for innovations that could fundamentally change our energy system. Hundreds of promising cleantech companies, including Aquion Energy, Bloom Energy, Ambri and Alphabet Energy, grew out of research from university labs or other scientific institutions.

The development of new energy technologies generally takes a long time and requires a lot of capital. The scientific and technological challenges in the energy sector are also increasingly complex and demand multidisciplinary solutions that often only large research institutions can provide. At the same time, private companies usually avoid long-term, high-risk, costly and science-heavy R&D projects. That’s why publicly funded research is particularly important to the development of new sustainable technologies. To tackle energy challenges, many countries are increasing the budgets of public labs and research centers that can create affordable clean-energy technologies and slow the rate of climate change. With government-backed support, these institutions do not have to focus on generating quick, profitable results, and so they can undertake projects that would otherwise never be addressed.

In the United States, the Department of Energy National Laboratory system has played a central role in developing breakthrough science and technologies for the past 70 years. With origins in the Manhattan Project during World War II, the Energy Department’s 17 National Labs currently conduct more than $12.5 billion in publicly funded R&D annually on a wide range of issues, including high-performance computing, national security and energy innovation. Each year, hundreds of scientists at the National Labs launch new R&D projects in physics, chemistry, materials science and engineering to create innovative solutions to the world’s toughest energy challenges. In this article I would like to present seven cutting-edge scientific advances related to clean energy, developed at different National Labs over the last few years. Each of these innovations has strong potential to move from the lab into the marketplace in the near future, becoming a transformative technological solution to the energy and environmental challenges facing the world in the 21st century.

Tracking building energy efficiency with Rapid Building Energy Modeler (RAPMOD)

The Environmental Energy Technologies Division of Lawrence Berkeley National Laboratory and partners are developing a portable system of sensing and computer hardware to rapidly generate indoor thermal and physical building maps. Rapid Building Energy Modeler (RAPMOD) is fitted with several different sensors, including laser scanners (LiDAR), a visible light camera, and an infrared sensor. The camera and LiDAR generate a photorealistic three-dimensional model of the building’s interior as the user walks through hallways, into rooms, and up and down staircases.

The infrared sensor measures the thermal properties of windows and detects thermal defects such as gaps in wall insulation or moisture leaks. It also measures the heat coming from lighting systems, other equipment, and building occupants, providing the model with the information required to calculate the energy needed to heat and cool the building.

The cameras and scanners are mounted on a backpack, allowing a person to walk through and record the interior of an entire building. RAPMOD generates a visual map of the building that can be input into energy simulation models and used to develop an understanding of the building’s energy performance, leading to a list of recommendations for improving its efficiency.
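As a concrete picture of that scan-to-model flow, here is a minimal sketch in Python. Everything in it is a hypothetical illustration (the ScanSample fields and the crude floor-bucketing heuristic are invented for this example), not RAPMOD’s actual software interface.

```python
from dataclasses import dataclass

@dataclass
class ScanSample:
    position: tuple          # (x, y, z) point from the LiDAR, in meters
    rgb: tuple               # visible-light camera pixel at that point
    surface_temp_c: float    # infrared reading, degrees Celsius

def build_energy_model_input(samples):
    """Reduce a walkthrough scan to the kind of geometry and thermal
    summary an energy simulation would consume."""
    zones = {}
    for s in samples:
        zone = round(s.position[2])          # crude: bucket points by height
        zones.setdefault(zone, []).append(s)
    return {
        zone: {
            "n_points": len(pts),
            "mean_surface_temp_c": sum(p.surface_temp_c for p in pts) / len(pts),
        }
        for zone, pts in zones.items()
    }

samples = [
    ScanSample((2.0, 3.5, 1.2), (180, 180, 175), 21.5),
    ScanSample((2.1, 3.6, 1.3), (182, 181, 176), 26.0),  # warm spot: possible defect
]
print(build_energy_model_input(samples))
```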

Rapid Building Energy Modeler (RAPMOD) / Photo: Lawrence Berkeley National Laboratory

Why it matters:

A multi-billion-dollar market exists for reducing the energy use of existing buildings, if we can only figure out a way to substantially reduce the cost and time required to assess building energy performance, recommend energy performance measures, and identify problems in building operations. Today, creating building energy models is expensive, time-consuming, and skill-intensive. Many existing buildings have incomplete, outdated, or no design documentation, requiring specialists to go into the building and laboriously make the measurements that they can then import into the modeling software.

RAPMOD is designed to tackle this problem head on. The system enables energy service companies, architects, engineering firms, and utilities to rapidly assess problem areas, identify energy-efficiency opportunities, and pursue a course of action. RAPMOD doesn’t need to be operated by high-cost energy experts: technicians can do the building walkthrough, and the measured data is uploaded automatically for processing and importing into the energy modeling software. All this drives down the cost and time of producing a model, enabling many more existing buildings to be analyzed for energy-efficiency opportunities.

A nano-sized, environmentally friendly hydrogen generator

Researchers at Argonne National Laboratory have created a small-scale “hydrogen generator” that uses light and a two-dimensional graphene platform to boost production of hydrogen. The concept is inspired by the function of an ancient protein known to turn light into energy. Researchers have long known that some single-celled organisms use a protein called bacteriorhodopsin (bR) to absorb sunlight and pump protons through a membrane, creating a form of chemical energy. They also know that water can be split into oxygen and hydrogen by combining these proteins with titanium dioxide and platinum and then exposing them to light.

There is just one downside to this process: titanium dioxide reacts only in the presence of ultraviolet light, which makes up a mere four percent of the total solar spectrum. To produce greater amounts of hydrogen using visible light, the researchers looked for a new material, one with enough surface area to move electrons across quickly and evenly and to boost the overall electron-transfer efficiency. Graphene fit the bill: a super-strong, super-light, nearly transparent sheet of carbon atoms and one of the best conductors of electricity ever discovered.

The mini hydrogen generator works like this: both the bR protein and the graphene platform absorb visible light. Electrons from this reaction are transmitted to the titanium dioxide on which these two materials are anchored, making the titanium dioxide sensitive to visible light. Simultaneously, light from the green end of the solar spectrum triggers the bR protein to begin pumping protons along its membrane. These protons make their way to the platinum nanoparticles that sit on top of the titanium dioxide. Hydrogen is produced by the interaction of the protons and electrons as they converge on the platinum.
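The endpoint of this cascade is the familiar water-splitting chemistry. As a worked summary (these are the standard textbook half-reactions, not equations quoted from the Argonne work):

```latex
% Reduction at the platinum nanoparticles, where protons meet electrons:
2\,\mathrm{H}^{+} + 2\,e^{-} \longrightarrow \mathrm{H}_{2}
% The matching oxidation that liberates protons and electrons from water:
2\,\mathrm{H_2O} \longrightarrow \mathrm{O}_{2} + 4\,\mathrm{H}^{+} + 4\,e^{-}
% Net reaction:
2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H}_{2} + \mathrm{O}_{2}
```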

Why it matters:

Hydrogen is virtually everywhere on the planet, but the element is typically bonded to other elements, and free hydrogen must be separated out, for instance from the oxygen in H2O. The commercial separation process reacts natural gas with superheated steam to strip away hydrogen atoms, producing hydrogen fuel but also carbon dioxide, a greenhouse-gas byproduct that escapes into the atmosphere. Argonne’s early-stage generator, composed of many tiny assemblies, is proof that hydrogen can be produced without burning fossil fuels. Scaling this research up may one day mean that we could replace the gas in our cars and generators with hydrogen, a greener option, because burning hydrogen fuel emits only water vapor.

Algae to crude oil: million-year natural process takes minutes in the lab

Researchers at Pacific Northwest National Laboratory (PNNL) have created a continuous chemical process that produces useful crude oil minutes after they pour in harvested algae — a verdant green paste with the consistency of pea soup.

In the PNNL process, a slurry of wet algae is pumped into the front end of a chemical reactor. Once the system is up and running, out comes crude oil in less than an hour, along with water and a byproduct stream of material containing phosphorus that can be recycled to grow more algae. With additional conventional refining, the crude algae oil is converted into aviation fuel, gasoline or diesel fuel.

PNNL scientists and engineers simplified the production of crude oil from algae by combining several chemical steps into one continuous process. The most important cost-saving step is that the process works with wet algae. Most current processes require the algae to be dried, which takes a lot of energy and is expensive. The new process works with an algae slurry that contains as much as 80 to 90 percent water. Researchers have also been able to extract usable gas from the water and then recycle the remaining water and nutrients to help grow more algae, which further reduces costs.

The PNNL system also eliminates another step required in today’s most common algae-processing method: the need for complex processing with solvents like hexane to extract the energy-rich oils from the rest of the algae. Instead, the PNNL team works with the whole algae, subjecting it to very hot water under high pressure to tear apart the substance, converting most of the biomass into liquid and gas fuels.

The PNNL system runs continuously, processing about 1.5 liters of algae slurry in the research reactor per hour, an amount that’s much closer to the type of continuous system required for large-scale commercial production than systems that run a batch at a time.
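As a back-of-the-envelope illustration of that throughput, using only the figures quoted above (the 85% water fraction is just the midpoint of the quoted 80–90% range, and uninterrupted year-round operation is an assumption for the arithmetic):

```python
# Rough scaling of the research reactor's quoted throughput.
slurry_rate_l_per_h = 1.5          # research reactor, liters of slurry per hour
water_fraction = 0.85              # midpoint of the quoted 80-90% range
hours_per_year = 24 * 365          # assumes uninterrupted continuous operation

algae_solids_l_per_h = slurry_rate_l_per_h * (1 - water_fraction)
annual_slurry_l = slurry_rate_l_per_h * hours_per_year

print(f"Algae solids processed: {algae_solids_l_per_h:.3f} L/h")
print(f"Slurry processed per year of continuous operation: {annual_slurry_l:,.0f} L")
```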

Steps in the process for making fuel from algae – the algae slurry, crude oil, and refined diesel fuel / Photo: Pacific Northwest National Laboratory

Why it matters:

The crude oil produced from algae at Pacific Northwest National Laboratory can be refined into gasoline, heating oil, diesel fractions, and, with a bit more expense and effort, aviation-grade kerosene. While algae have long been considered a potential source of biofuel, and several companies have produced algae-based fuels at research scale, the fuel is projected to be expensive. The PNNL technology harnesses algae’s energy potential efficiently and incorporates a number of methods to reduce the cost of producing algae fuel. “Cost is the big roadblock for algae-based fuel,” said Douglas Elliott, the laboratory fellow who led the PNNL team’s research. “We believe that the process we’ve created will help make algae biofuels much more economical.”

Quantum Dots Promise to Significantly Boost Solar Cell Efficiencies

Scientists at the National Renewable Energy Laboratory (NREL) have shown that quantum-dot solar cells operating under concentrated sunlight can have maximum theoretical conversion efficiencies twice those achievable by conventional solar cells: up to 66%, compared to 31% for present-day first- and second-generation solar cells. “Quantum dots” are tiny spheres of semiconductor material measuring only about 2–10 nanometers in diameter, which can generate more than one bound electron-hole pair, or exciton, per incoming photon. NREL predicted as early as 2001 that quantum dots would be capable of generating more than one electron-hole pair from a single photon of light.

The external quantum efficiency for photocurrent, usually expressed as a percentage, is the number of electrons flowing per second in the external circuit of a solar cell divided by the number of photons per second of a specific energy (or wavelength) that enter the solar cell. Today’s solar cells produce only one exciton per incoming photon. The “multiple exciton generation” (MEG) effect of quantum dots promises to wring more energy out of each photon. In addition, varying the size of quantum dots effectively “tunes” them to respond to different wavelengths of light: as quantum dots get smaller, the light they absorb shifts toward the blue, i.e. toward greater energy and shorter wavelength.
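Since that definition is just a ratio of particle fluxes, it can be written down directly. A minimal sketch, with illustrative numbers rather than NREL measurements:

```python
# External quantum efficiency as defined above:
# EQE = (electrons per second out) / (photons per second in).
ELEMENTARY_CHARGE = 1.602e-19   # coulombs per electron
PLANCK = 6.626e-34              # joule-seconds
LIGHT_SPEED = 3.0e8             # meters per second

def external_quantum_efficiency(current_a, optical_power_w, wavelength_m):
    electrons_per_s = current_a / ELEMENTARY_CHARGE
    photon_energy_j = PLANCK * LIGHT_SPEED / wavelength_m
    photons_per_s = optical_power_w / photon_energy_j
    return electrons_per_s / photons_per_s

# Illustrative example: 10 mA of photocurrent from 25 mW of 550 nm light.
# A value above 100% would be a signature of multiple exciton generation.
print(f"EQE = {external_quantum_efficiency(0.010, 0.025, 550e-9):.1%}")
```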

Why it matters:

The solar power industry has a problem: researchers have pushed the efficiencies of the current generation of solar cell technologies close to their practical limits, so significant new gains in photovoltaic efficiency will depend on the development of new technologies. In the search for a third generation of solar-cell technologies (as a follow-up to silicon and thin-film solar cells), quantum dots are a leading candidate.

The theoretical and experimental work in nanocrystals being done by NREL scientists opens the door to the potential application of quantum dots to greatly enhance the conversion efficiency of solar cells based on silicon and other semiconductor materials. The result is that more of the sun’s energy may be available to generate solar electricity and produce solar fuels. This is a key step toward making solar electricity and fuels more efficient and cost competitive with conventional power sources.

Designing future cities: merging urban planning with scientific analysis

Researchers at Argonne National Laboratory have developed a platform called LakeSim that merges urban design with scientific analysis to aid in the planning of 21st-century cities. To address the uncertainty of large-scale planning with so many complex variables, LakeSim’s creators have prototyped a platform that helps developers plan at massive scale while building in future scenarios such as climate change, improved efficiency in buildings and transportation systems, and increased renewable energy and micro-grid deployment.

LakeSim connects existing urban design tools with scientific computer models to create detailed simulations relevant to large-scale development. Instead of planning separately for different pieces of the infrastructure, the framework allows developers to simulate how various designs influence the environment, transportation, and business spheres under a range of possible scenarios, over hundreds of acres and decades of time. LakeSim also lets planners and developers explore a wider range of designs more quickly, without the lengthy periods of re-analysis currently required for each aspect of the development when a new strategy is proposed. Urban planners make changes to their plans, for example rezoning a residential block as commercial, triggering the execution of computational models that rapidly predict the effects of those adjustments across multiple complex systems, such as energy supply or stormwater management.
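In schematic form, that rezone-and-recompute loop might look like the sketch below. This is a hypothetical illustration: the demand intensities and data layout are invented for the example, and LakeSim’s real models are far more sophisticated.

```python
# Assumed illustrative demand intensities, kWh per square meter per year.
DEMAND_KWH_PER_M2 = {"residential": 150, "commercial": 250}

city_blocks = [
    {"id": "block-1", "use": "residential", "floor_area_m2": 40_000},
    {"id": "block-2", "use": "commercial",  "floor_area_m2": 25_000},
]

def annual_energy_demand(blocks):
    """Stand-in for one of the coupled models (here: energy supply)."""
    return sum(DEMAND_KWH_PER_M2[b["use"]] * b["floor_area_m2"] for b in blocks)

baseline = annual_energy_demand(city_blocks)
city_blocks[0]["use"] = "commercial"       # the planner rezones a block...
rerun = annual_energy_demand(city_blocks)  # ...and the model re-executes
print(f"Demand change from rezoning: {rerun - baseline:+,.0f} kWh/year")
```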

LakeSim computational modeling for the massive 600-acre urban development planned for the South Side of Chicago, along Lake Michigan

Why it matters:

A hundred years ago, one out of every five people lived in urban areas. By 2050, that number will grow to over four out of five. This rapid urbanization presents significant problems for the world. Even a modest annual population growth of three to five percent can mean thousands of new inhabitants, and each new resident will require energy, transportation, potable water, food and other infrastructure services that strain finite resources.

With LakeSim, urban planners could better anticipate extreme weather or power outage events and build safeguards into the structures and energy systems they design. By running a multitude of scenarios and determining how each structure and block responds to certain events, planners could optimize the energy balance with more solar or wind power, or add energy storage capacity to a particular section of a city development, minimizing the effects of major weather or energy events. This would give energy producers and builders a more scientifically rigorous range of potential energy demands in the near and far future within minutes, allowing them to better anticipate the need for renewable and traditional forms of energy.

New material doubles carbon dioxide capture potential

A new material that is highly selective in capturing carbon dioxide over other flue gases and that has a surface area the size of a football field in just one teaspoonful has been designed by researchers at Pacific Northwest National Laboratory. This material, called Molexol, captures carbon dioxide efficiently, selectively, and economically.

The researchers created Molexol by heating zinc nitrate with an organic building-block material that serves as the foundation for the new material, plus a solvent, to 100°C to form a crystalline structure filled with solvent molecules. (Organic in this context means the molecules are made up of carbon, hydrogen and oxygen.) The solvent molecules, trapped in the empty spaces of the crystalline material, are then removed by heating, leaving behind an abundance of channels that can hold enormous amounts of carbon dioxide. The scientists estimate that a teaspoon of Molexol can hold 3 to 4 times more carbon dioxide gas than conventional solvents. Beyond its capacity to capture large amounts of carbon dioxide and its high selectivity for it, Molexol can also be incorporated into other engineered structures, such as thin films, membranes, and microporous materials, making the technology relatively simple to apply.

Why it matters:

Before carbon dioxide emissions from power plants can be permanently stored, they must be separated from other flue gases and captured as a relatively pure gas. Existing technology for capturing carbon is neither efficient nor cost-effective: studies indicate that carbon capture using current technologies would add more than 30 percent to the cost of electricity. In addition, the net electricity produced by existing plants would be significantly reduced, since 20 to 30 percent of the power generated by the plant would be used to capture and compress the carbon dioxide.
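To make that parasitic-load figure concrete, here is the arithmetic for a hypothetical plant (the 500 MW gross output is an assumed round number, not from the studies cited):

```python
# Net output after diverting 20-30% of generation to capture and compression.
gross_output_mw = 500  # assumed plant size for illustration
for parasitic_load in (0.20, 0.30):
    net_mw = gross_output_mw * (1 - parasitic_load)
    print(f"{parasitic_load:.0%} parasitic load -> {net_mw:.0f} MW net output")
```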

PNNL’s Molexol material has been molecularly engineered to selectively capture carbon dioxide from flue gases. With Molexol’s high selectivity, no additional steps are needed to separate the carbon dioxide from the flue gases, making it very economical. In addition, because the bonds between Molexol and the carbon dioxide molecules are weak, the material does not require the huge amounts of energy that conventional capture materials need to release the trapped carbon dioxide.

New enzyme speeds up biomass-to-sugar conversion up to 14 times

Scientists at the National Renewable Energy Laboratory (NREL) have developed an enzyme that could change the economics of biofuel production by converting biomass to sugars up to 14 times faster and much cheaper than competing catalysts.

Production of renewable fuels, especially bioethanol, holds remarkable potential to meet current energy demand while mitigating greenhouse gas emissions. At the same time, first-generation biofuels made from sugarcane or starch-based grains and tubers (mainly corn and potatoes) have put significant stress on food prices and food security.

Lignocellulosic raw materials offer an attractive alternative: they are abundant and do not compete with food and feed production. Lignocellulosic ethanol is made by freeing the sugar molecules from cellulose using enzymes, steam heating, or other pre-treatments. These sugars can then be fermented to produce ethanol in the same way as in first-generation bioethanol production. The major problem with second-generation lignocellulosic biofuels is that the existing enzymes needed to release sugars from cellulose are inefficient and expensive.

The enzyme developed at NREL, called CelA, was isolated from the bacterium Caldicellulosiruptor bescii. Unlike most catalysts, CelA can digest not one but two major components of biomass: both cellulose and xylan. And it works in two mechanical realms, not just one. It is an ablator, scraping the valuable material off the cell walls of plants, but it is also a borer, digging deep into the wall to grab more of the digestible biomass; it is the only enzyme known to dig pits into biomass, while others only ablate. CelA can also operate at much higher temperatures than other enzymes, which is important because high temperatures mean faster action. And because CelA works above the boiling point of alcohol, the alcohol separates naturally, saving a costly step in the conversion process, while the high temperatures kill many of the microorganisms that would otherwise interfere with the process.

Where the best commercially used enzyme converted about 30% of the sugars in seven days, CelA, with a small boost from an extra beta-glucosidase, achieved double that conversion in just about two days. “If you can achieve in one day what typically takes seven, you are saving the better part of a week of processing. And that can have a huge economic impact,” says NREL Senior Scientist Roman Brunecky.
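One illustrative way to read those quoted figures is as average daily conversion rates. This is a rough comparison built only from the numbers above; actual performance depends heavily on substrate and process conditions, and the headline “up to 14 times” refers to the best case across NREL’s benchmarks, not to this single pairing.

```python
# Average daily conversion rates implied by the quoted figures.
conventional = {"conversion": 0.30, "days": 7}   # ~30% of sugars in 7 days
cela = {"conversion": 0.60, "days": 2}           # roughly double, in ~2 days

conv_rate = conventional["conversion"] / conventional["days"]
cela_rate = cela["conversion"] / cela["days"]
print(f"Conventional enzyme: {conv_rate:.1%} of sugars per day")
print(f"CelA:                {cela_rate:.1%} of sugars per day "
      f"(~{cela_rate / conv_rate:.0f}x the daily rate in this comparison)")
```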

NREL Principal Investigator Dan Schell holds a flask of “beer” — a lignocellulosic ethanol broth — that represents a middle stage in the process of making high-grade ethanol fuel from non-edible biomass / Photo: Pat Corkery

Why it matters:

Lignocellulosic biomass is the most plentiful and sustainable resource on Earth, largely made up of plant residues that would otherwise go unused and be left to decay. Using this biomass as a source of alternative fuels can help offset the world’s dependence on fossil fuels and reduce greenhouse gas emissions. For many years, cellulosic-ethanol researchers have been looking for ways to break down the lignocellulosic materials in biomass into usable sugars more efficiently. An enzyme that can produce sugars more efficiently will considerably lower the cost of converting biomass into fuel and, with it, the cost of everything from jet fuel to ethanol, butanol, drop-in fuels, and countless chemicals.

Innovation through basic and applied research is a key means of addressing global energy and environmental issues. However, promising solutions discovered in laboratories and research centers can’t effectively address energy challenges unless and until they are successfully transferred to the marketplace as commercial products and services. To maximize the technological and commercial impact of their innovations, research organizations can pursue a variety of pathways, from startup creation to industry agreements, technology licensing and other partnerships with the private sector. In fact, the link between science and industry is critical to turning a test tube of algae biofuel into a running oil-production system, or a solar cell prototype into a workable module that can be mass-produced. The success of publicly funded energy research programs is directly tied to strong relationships between research labs, academia, industry and government. These partners, and the shared resources they can bring to bear on scientific and technological problems, are key to identifying and solving the major global challenges in energy and climate change.
