DOE Science News Source
The DOE Science News Source is a Newswise initiative to promote research news from the Office of Science of the DOE to the public and news media.
  • 2017-07-05 12:00:41
  • Article ID: 677436

Will Brain-Inspired Chips Make a Dent in Science's Big Data Problems?

Two Berkeley Lab teams are running experiments on IBM's TrueNorth to find out

  • Image: IBM TrueNorth setup at Berkeley Lab. (Credit: Rebecca Carney, Berkeley Lab)

The average human adult brain weighs about three pounds and is composed mostly of fat and water, but it is extremely efficient at processing information. To simulate just one second of biological brain activity several years ago, researchers used 82,994 processors, one petabyte of system memory and 40 minutes on the RIKEN research institute’s K supercomputer. At the time, this system consumed enough electricity to power about 10,000 homes. In contrast, the brain uses the equivalent of about 20 watts of electricity—barely enough to power a dim light bulb.

Our brains are also much better than computers at tasks like recognizing images and navigating unfamiliar spaces. Although the precise mechanism by which our brain performs these tasks is still unknown, we do know that visual information is processed in a massively parallel and concerted fashion by millions of neurons connected by synapses. Each neuron responds to visual stimuli in a simple, on-demand fashion, but their collective responses can yield cognitive outcomes that currently cannot be easily described by a simple mathematical model. Mathematical models like these are essentially the foundation of current image processing software executed on traditional computing systems. All computing systems since the 1940s—from smartphones to supercomputers—have been built from the same blueprint, called the von Neumann architecture, which relies on mathematical models to execute linear sequences of instructions.

The von Neumann design has also led computing to its current limits in efficiency and cooling. As engineers built increasingly complex chips to carry out sequential operations faster and faster, the speedier chips have also been producing more waste heat. Recognizing that modern computing cannot continue on this trajectory, a number of companies are looking to the brain for inspiration and developing “neuromorphic” chips that process data the way our minds do. One such technology is IBM's TrueNorth Neurosynaptic System.

Although neuromorphic computing is still in its infancy, researchers in the Computational Research Division (CRD) at the U.S. Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) hope that these tiny, low-power, brain-inspired computing systems could one day help alleviate some of science’s big data challenges. With funding from the Laboratory Directed Research and Development (LDRD) program, two groups of researchers are exploring how science might benefit from this new technology.

One group of CRD researchers is looking at how neuromorphic chips might provide low-power, real-time data processing for charged-particle tracking in high energy physics experiments and for predicting movement from neural signals in brain-machine interfaces. To that end, they are working to implement Kalman filters on TrueNorth chips, effectively extending this neuromorphic technology to any computing problem that benefits from real-time, continuous tracking or control.

Meanwhile, another collaboration of researchers from CRD and the Molecular Biophysics and Integrated Bioimaging (MBIB) division looked at the viability of applying convolutional neural networks (CNNs) on IBM’s TrueNorth to classify images and extract features from experimental observations generated at DOE facilities. Based on their initial results, the team is currently working to identify problems in the areas of structural biology, materials science and cosmology that may benefit from this setup.

“The field of neuromorphic computing is very new, so it is hard to say conclusively whether science will benefit from it. But from a particle physics perspective, the idea of a tiny processing unit that is self-contained and infinitely replicable is very exciting,” says Paolo Calafiura, software & computing manager for the Large Hadron Collider’s ATLAS experiment and a CRD scientist.

He adds: “For one reason or another—be it I/O (input/output), CPU (computer processing unit) or memory—every computing platform that we’ve come across so far hasn’t been able to scale to meet our data processing needs. But if you can replicate the same tiny unit of processing 10 million times or more, as neuromorphic computing aims to do, and find the right balance between power consumption and processing speed, this sounds like it will meet our needs.”

Why Neuromorphic Computing?

In the traditional von Neumann design, computers consist primarily of two components: a CPU that handles data, and random access memory (RAM) that stores data and the instructions for what to do with it. The CPU fetches its first instruction from memory, and then the data needed to execute it. Once the instruction is performed, the result is sent back to memory and the cycle repeats.
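The fetch-execute cycle described above can be sketched in a few lines. The toy machine below (its instruction set is invented for illustration) keeps program and data in one shared memory, which is exactly the property that forces every operand to travel through the CPU:

```python
# Toy von Neumann machine: program and data share one memory, and every
# operand shuttles between memory and a single CPU register (the
# accumulator). The instruction set is invented for illustration.
def run(memory):
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        op, addr = memory[pc]          # fetch the next instruction...
        pc += 1
        if op == "LOAD":               # ...then fetch the data it needs
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":            # send the result back to memory
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program: compute memory[6] = memory[4] + memory[5]
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 7, 8, 0]
result = run(mem)                      # result[6] == 15
```

Every step of the loop touches memory at least once, which is why speeding up the CPU alone eventually stops helping.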

Rather than go back and forth between CPU and memory, the TrueNorth chip is a self-contained computing system in which processing units and memory are colocated. Each chip contains 4,096 neurosynaptic cores that together hold 1 million programmable neurons and 256 million configurable synapses interconnected via an on-chip network. The neurons transmit, receive and accumulate signals known as spikes. A neuron produces a spike whenever its accumulated inputs reach a programmed activation threshold. Spikes are weighted and redirected by the synapses that connect layers of neurons, mapping input to output.
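The accumulate-and-fire behavior described above can be modeled with a simple integrate-and-fire neuron. This is a generic textbook sketch, not the actual TrueNorth neuron model, which is richer and hardware-configurable:

```python
# Integrate-and-fire sketch: the neuron accumulates weighted input spikes
# and fires only when the running total crosses its programmed threshold.
# (Illustrative only; TrueNorth's configurable neuron model is richer.)
class Neuron:
    def __init__(self, threshold):
        self.threshold = threshold
        self.potential = 0

    def receive(self, weighted_spike):
        """Accumulate one input; return True (a spike) on crossing threshold."""
        self.potential += weighted_spike
        if self.potential >= self.threshold:
            self.potential = 0          # reset after firing
            return True
        return False

n = Neuron(threshold=3)
spikes = [n.receive(1) for _ in range(5)]   # five unit inputs in sequence
# the neuron fires on the 3rd input, resets, and starts accumulating again
```

Because each neuron only does this trivial local computation, millions of them can run in parallel at very low power.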

TrueNorth chips natively tile in two dimensions using the on-chip network, essentially allowing the system to seamlessly scale to any size. Because synapses serve a dual function of memory and CPU, neuromorphic chips pack a lot of computing power into a tiny footprint and use significantly less power. For instance, TrueNorth uses about 70 milliwatts of electricity while running and has a power density of 20 milliwatts per square centimeter—almost 1/10,000th the power of most modern microprocessors.   

“Low-energy consumption and compact size are some of the reasons we’re interested in neuromorphic computing,” says Chao Yang, an applied mathematician in Berkeley Lab’s CRD. “With these miniature computing systems, we expect that soon we will enable scientific instruments to be more intelligent by doing real-time analysis as detectors collect information.” 

According to CRD scientist Daniela Ushizima, incorporating these neuromorphic chips into detectors could mean huge computational savings for imaging facilities. Rather than send raw data directly to a storage facility and then figure out post-acquisition whether the information collected is relevant, good quality or includes the object of interest, researchers could just do this exploration in situ as the data is being collected.

The size of the chips also presents new possibilities for wearables and prosthetics. “In our time-series work, we’re exploring the potential of this technology for people who have prosthetics implanted in their brains to restore movement,” says Kristofer Bouchard, a Berkeley Lab computational neuroscientist. “While today’s supercomputers are powerful, it is not really feasible for someone to tote that around in everyday life. But if you have that same computing capability packed into something the size of a postage stamp, that opens a whole new range of opportunities.”

Translating Science Methods: From von Neumann to Neuromorphic

Because neuromorphic chips are vastly different from today’s microprocessors, the first step for both projects is to translate the scientific methods developed for modern computers into a framework for the TrueNorth architecture. Here is a more detailed look at these two projects.


Particle Physics and Brain Machine Interfaces

Co-leads: Kristofer Bouchard and Paolo Calafiura

In particle physics experiments, researchers smash beams of protons together at the center of detectors and measure the energy and momentum of escaping particles. By tracking the trajectory of escaping material with algorithms called Kalman filters, physicists can infer the existence of massive particles that were created, and decayed, right after the collision.

Kalman filters are essentially optimal estimators. They can infer structures of interest, relatively accurately, from a series of measurements taken over time in difficult environments that produce data with statistical noise and other inaccuracies. Because these algorithms are recursive, new measurements can be processed in real time, making them convenient for online processing. In addition to particle physics, Kalman filters are also widely used for navigation, signal processing and even modeling the central nervous system’s control of movement.
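The recursive, real-time character of a Kalman filter is easiest to see in one dimension. The sketch below estimates a constant signal from noisy readings; the tracking filters described in the text operate on multi-dimensional particle-state vectors, and all numbers here are illustrative:

```python
# Minimal 1D Kalman filter: each new noisy measurement refines the estimate
# recursively, so data can be processed in real time without storing history.
def kalman_step(x, p, z, r, q=1e-4):
    """One predict/update cycle for a constant signal.
    x, p: current estimate and its variance
    z, r: new measurement and its variance
    q:    small process noise added during prediction
    """
    p = p + q                    # predict: uncertainty grows slightly
    k = p / (p + r)              # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)          # update: pull estimate toward the measurement
    p = (1 - k) * p              # updated uncertainty shrinks
    return x, p

x, p = 0.0, 100.0                # poor initial guess, large uncertainty
for z in [5.1, 4.8, 5.3, 4.9, 5.0]:   # noisy readings of a true value near 5
    x, p = kalman_step(x, p, z, r=0.5)
# x converges toward 5 as measurements accumulate, and p shrinks
```

Each step uses only the previous estimate and the newest measurement, which is what makes the filter suitable for online processing on a detector.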

Currently, Bouchard and Calafiura are working to set up their scientific framework on the TrueNorth architecture. They implemented Kalman filters using IBM’s TrueNorth Corelet Programming Language and explored the strengths and weaknesses of TrueNorth’s various transcoding schemes, which convert incoming data into spikes. Once fully tested, this TrueNorth Kalman filter will be broadly applicable to any research group interested in sequential data processing on the TrueNorth architecture.

“As these transcoding schemes have different strengths and weaknesses, it will be important to explore how the transcoding scheme affects performance in different domain areas. The ability to translate any input stream into spikes will be broadly applicable to any research group interested in experimenting with the TrueNorth architecture,” says Calafiura.
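Transcoding is the step that turns ordinary numbers into spike trains. As a generic illustration (not IBM's implementation), one common scheme is rate coding, where a value in [0, 1] is represented by the firing rate of a random spike train:

```python
import random

# Rate coding sketch: a real-valued input in [0, 1] becomes a spike train
# whose average firing rate is proportional to the value. (One of several
# possible transcoding schemes; illustrative only.)
def rate_encode(value, n_steps, rng):
    """Return a list of 0/1 spikes; expected spike count = value * n_steps."""
    return [1 if rng.random() < value else 0 for _ in range(n_steps)]

rng = random.Random(42)            # fixed seed for reproducibility
train = rate_encode(0.25, n_steps=1000, rng=rng)
rate = sum(train) / len(train)     # empirical rate approximates the input value
```

The trade-off Calafiura alludes to is visible even here: a longer spike train encodes the value more precisely but takes more time steps to deliver.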

“Brain-machine interfaces (BMIs) for restoring lost behavioral functions entail recording brain signals and transforming them for a particular task. The computations required for a BMI need to occur in real time, as delays can cause instabilities in the system,” says Bouchard. “Today, the majority of state-of-the-art BMIs utilizes some variation of the Kalman filter for transforming observed brain signals into a prediction of intended behavior.” 

Once the team has successfully set up their workflow on TrueNorth, they will train their spiking neural network Kalman filters on real neural recordings taken directly from the cortical surface of neurosurgical patients, collected by Dr. Edward Chang at the University of California, San Francisco. These data consist of neural recordings from 100-256 electrodes with signal rates of ~400 Hz, well within the constraints of a single TrueNorth system. The team will also train their implementations on high energy physics data collected at the Large Hadron Collider in Geneva, Switzerland and at Liquid Argon Time Projection Chambers at Fermilab.

Image Analysis and Pattern Recognition

Co-leads: Chao Yang, Nick Sauter and Dani Ushizima

Convolutional neural networks are extremely useful for image recognition and classification. In fact, companies like Google and Facebook are using CNNs to identify and categorize faces, locations, animals, etc., using billions of images uploaded to the Internet every day. Users essentially help “train” these CNNs every time they tag a location or friend in a picture. CNNs learn from these tags, so the next time someone tries to tag a face in an uploaded image the system can offer suggestions based on what it’s learned.

Because CNN designs evolved from early research on the brain’s visual cortex and how neurons propagate information through complex cell organizations, Yang and his colleagues thought that this algorithm might be a good fit for neuromorphic computing. So they explored a number of CNN architectures, targeting image-based data that requires time-consuming feature extraction and classification. Given Berkeley Lab’s broad interest in structural biology, materials science and cosmology, scientists from these areas came together to select problems that could be processed efficiently on the TrueNorth architecture.

X-ray Crystallography

In biology and materials science, X-ray crystallography is a popular technique for determining the three-dimensional atomic structure of salts, minerals, organic compounds and proteins. When researchers illuminate the crystalline atoms or molecules with an X-ray beam, the beam is scattered in many directions. By measuring the angles and intensities of these diffracted beams, scientists can create a 3D picture of the density of electrons inside the crystals.

One of the key steps in X-ray crystallography is to identify images with clear Bragg peaks, which are essentially bright spots created when light waves constructively interfere. Scientists typically keep images with Bragg peaks for further processing and discard those without them. Although an experienced scientist can easily spot these features, current software requires a lot of manual tuning to identify them. Yang’s team proposed to use a set of previously collected and labeled diffraction images to train a CNN to become a machine classifier. In addition to separating good images from bad ones, CNNs can also be used to segment the Bragg spots for subsequent analysis and indexing.
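The screening task can be caricatured in a few lines. The real classifier is a trained CNN; the sketch below uses a single hand-built convolution (a 3x3 local-mean filter) plus a threshold, just to illustrate the kind of local feature a CNN's first layer learns. All sizes and thresholds are invented:

```python
import numpy as np

# Toy Bragg-spot screen: flag an image if any 3x3 neighborhood is much
# brighter than the image as a whole. (A trained CNN replaces this
# hand-tuned rule in the project described above.)
def has_bright_spot(image, threshold=5.0):
    """True if any 3x3 neighborhood mean exceeds threshold x the image mean."""
    k = 3
    h, w = image.shape
    local = np.array([[image[i:i+k, j:j+k].mean()
                       for j in range(w - k + 1)]
                      for i in range(h - k + 1)])
    return bool((local > threshold * image.mean()).any())

rng = np.random.default_rng(0)
background = rng.uniform(0.0, 1.0, size=(32, 32))   # featureless noise
with_peak = background.copy()
with_peak[15:18, 15:18] += 50.0                     # a bright Bragg-like spot
# background alone is rejected; the image with the spot is kept
```

The appeal of the CNN is precisely that it learns such filters and thresholds from the labeled images instead of requiring manual tuning.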

“Our detectors produce images at about 133 frames per second, but currently our software takes two seconds of CPU time to compute the answer. So one of our challenges is analyzing our data quickly,” says Nicholas Sauter, a structural biologist in Berkeley Lab’s Molecular Biophysics and Integrated Bioimaging Division. “We can buy expensive parallel computing systems to keep up with the processing demands, but our hope is that IBM TrueNorth may potentially provide us a way to save money and electrical power by putting a special chip on the back of the detector, which will have a CNN that can quickly do the job that those eight expensive computers sitting in a rack would otherwise do.”

Cryo-Electron Microscopy (CryoEM)

To determine the 3D structures of molecules without crystallizing them first, researchers use a method called cryo-electron microscopy (cryoEM), which involves freezing a large number of randomly oriented and purified samples and photographing them with electrons instead of light. The 2D projected views of randomly oriented but identical particles are then assembled to generate a near-atomic resolution 3D structure of the molecule.

Because cryoEM images tend to have a very low signal-to-noise ratio—meaning it is relatively hard to spot the desired feature against the background—one of the key steps in the analysis process is to group images with similar views into the same class. Averaging images within the same class boosts the signal-to-noise ratio.
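The benefit of class averaging follows from basic statistics: averaging N images of the same view shrinks the noise standard deviation by a factor of sqrt(N). A quick numerical check, with an invented feature and noise level:

```python
import numpy as np

# Class averaging sketch: 100 noisy copies of the same view, averaged.
# Noise std drops from ~2.0 to ~2.0/sqrt(100) = ~0.2, a 10x SNR gain.
rng = np.random.default_rng(1)
signal = np.zeros((16, 16))
signal[6:10, 6:10] = 1.0                     # the feature shared by the class

n = 100
stack = signal + rng.normal(0.0, 2.0, size=(n, 16, 16))  # noisy copies
class_average = stack.mean(axis=0)

single_noise = (stack[0] - signal).std()         # residual noise, one image
averaged_noise = (class_average - signal).std()  # residual noise, the average
```

In a single image the unit-height feature is buried under noise twice its size; in the class average it stands well clear of it.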

Yang and his teammates used simulated projection images to train a CNN to classify images into different orientation classes. For noise-free images, their CNN classifier successfully grouped images into as many as 84 distinct classes with a success rate of over 90 percent. The team also investigated the possibility of lowering the precision of the CNN by constraining both the inputs and the CNN weights, and found that reliable predictions can be made when the inputs and weights are constrained down to 3 or 4 bits. They are currently examining how well this approach holds up on noisy images.
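Constraining weights to a few bits amounts to quantization. The uniform quantizer below is a generic sketch of the idea; the team's actual constraint scheme on TrueNorth may differ, and the weight distribution here is invented:

```python
import numpy as np

# Uniform quantization sketch: map weights onto 2**bits evenly spaced
# levels spanning their range. Fewer bits means coarser levels and larger
# worst-case error, but far cheaper storage and arithmetic in hardware.
def quantize(weights, bits):
    levels = 2 ** bits
    lo, hi = weights.min(), weights.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((weights - lo) / step) * step

rng = np.random.default_rng(2)
w = rng.normal(0.0, 1.0, size=1000)        # illustrative weight distribution
err4 = np.abs(quantize(w, 4) - w).max()    # 4-bit: error at most half a step
err2 = np.abs(quantize(w, 2) - w).max()    # 2-bit: coarser, larger error
```

The empirical question the team is answering is how much of this quantization error a CNN can absorb before its classification accuracy degrades.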

Grazing Incidence Small Angle X-ray Scattering (GISAXS)

Grazing incidence small angle X-ray scattering (GISAXS) is an imaging technique used for studying thin films that play a vital role as building blocks for the next generation of renewable energy technology. One of the challenges in GISAXS imaging is to accurately infer the crystal structure of a sample from its two-dimensional diffraction pattern.

In collaboration with Advanced Light Source (ALS) scientist Alex Hexemer, Ushizima used categorization algorithms to label large collections of computer-simulated images, each containing a variety of crystal structures. They used this dataset to train a deep CNN to classify these images by their structures. When they tested the performance of their classifier on multiple datasets, they achieved about 83 to 92 percent accuracy, depending on the number of crystal lattices in each test case. Preliminary classification results using real images indicate that models trained on massive simulations, including realistic background noise levels, have the potential to enable categorization of experimentally obtained data.

“We believe that these initial results are really encouraging, and an indication that we should continue to study the use of CNNs for GISAXS and other synchrotron based scientific experiments,” says Ushizima. 


Supernova Detection

To find Type Ia supernovae and other transient events in the night sky, astronomers rely on sky surveys that image the same patches of sky every night for months and years. Astronomers warp and average some of these images together to create a template of a particular patch of sky. When a new observation comes in, they compare it to the template and subtract the known objects to uncover new events like supernovae. Because images of the night sky have to be warped to correct for optical effects or artifacts—caused by defective sensors, cosmic ray hits and foreground objects—the subtractions are not always perfect. In fact, 93 percent of potential candidates identified by the subtraction pipeline are artifacts.
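The subtraction step, and why it produces so many false candidates, can be sketched numerically. The pixel values, sizes and threshold below are invented for illustration:

```python
import numpy as np

# Difference imaging sketch: subtract the template from a new observation
# and flag strongly deviating pixels. A cosmic-ray artifact survives the
# subtraction exactly like a real transient does, which is why a classifier
# is needed downstream to tell them apart.
rng = np.random.default_rng(3)
template = rng.normal(100.0, 1.0, size=(64, 64))       # averaged sky patch
new_obs = template + rng.normal(0.0, 1.0, size=(64, 64))

new_obs[10, 10] += 500.0      # a genuine transient (e.g. a supernova)
new_obs[40, 40] += 500.0      # a cosmic-ray artifact: identical signature here

diff = new_obs - template
candidates = np.argwhere(np.abs(diff) > 10.0)          # strongly deviant pixels
# both pixels are flagged; subtraction alone cannot distinguish them
```

Everything downstream of this step, including the CNN described next, exists to separate the two kinds of flagged candidate.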

To sift the real candidates from the false ones post-subtraction, Thorsten Kurth, an HPC consultant at the National Energy Research Scientific Computing Center (NERSC), created a two-layer CNN and applied a method that used 80 percent of the data for training, 10 percent for validation and 10 percent for testing to evaluate the algorithm’s performance on TrueNorth. To test the robustness of the algorithm, he also included images of the night sky in varying orientations in the training dataset. Ultimately, he achieved about 95 percent classification accuracy.
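The 80/10/10 protocol mentioned above can be sketched as follows; the dataset and seed are placeholders:

```python
import random

# 80/10/10 split sketch: shuffle the labeled examples once, then carve off
# fixed fractions for training, validation (tuning) and final testing.
def split_80_10_10(examples, seed=0):
    examples = examples[:]                 # don't mutate the caller's list
    random.Random(seed).shuffle(examples)
    n = len(examples)
    n_train, n_val = int(0.8 * n), int(0.1 * n)
    return (examples[:n_train],
            examples[n_train:n_train + n_val],
            examples[n_train + n_val:])

train, val, test = split_80_10_10(list(range(1000)))
# 800 training, 100 validation and 100 held-out test examples
```

Holding the test fraction out until the very end is what makes the quoted 95 percent accuracy an honest estimate rather than a memorization score.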

“Increasing the network with more layers does not necessarily improve performance,” says Ushizima. “The next step involves trying our approach on a different dataset, which contains images with low signal-to-noise ratio, images with defects, as well as noise and defect pixel maps. With this dataset, the neural network can learn correlations between all those characteristics and thus hopefully deliver better performance.”

Microtomography (MicroCT)

Microtomography (MicroCT) is an imaging method very similar to the CT or CAT scans that hospitals perform on patients, but it images at a much smaller scale. It allows researchers to image the internal structure of objects at very fine scales and in a non-destructive way. This means that no sample preparation needs to occur—no staining, no thin slicing—and a single scan can capture the sample’s complete internal structure in 3D and at high resolution.

Using microCT, scientists can test the robustness of materials that may one day be used in batteries, automobiles, airplanes and more by searching for microscopic deformations in their internal structure. But sometimes finding these fissures can be a lot like searching for a needle in a haystack. So Ushizima and Yang teamed up with the ALS’s Dula Parkinson to develop algorithms to extract these features from raw microCT images.

“Computer vision algorithms have allowed us to construct labeled data banks to support supervised learning algorithms, like CNNs. One particular tool that we created allows the researcher to segment and label image samples with high accuracy by providing an intuitive user interface and mechanisms to curate data,” says Ushizima.

Although these tools were developed specifically to extract features from microCT images, she notes that they are applicable to other science areas as well.

“As the volume and complexity of science data increases, it will become important to optimize CNNs and explore cutting-edge architectures like TrueNorth,” says Yang. “Currently, we are determining the CNN parameters— number of layers, size of the filters and down sampling rate—with ad hoc estimates. In our future work, we would like to examine systematic approaches to optimizing these parameters.”  

For these LDRD projects, the researchers primarily used IBM’s TrueNorth because it was the first neuromorphic chip they had access to. In the future they hope to explore the viability of other neuromorphic computing architectures.

In addition to Yang, Ushizima, Sauter, Hexemer and Kurth, the other Berkeley Lab collaborators on the Image Analysis and Pattern Recognition LDRD include Karen Davies (MBIB), Xiaoye Li (CRD), Peter Nugent (CRD, NERSC), Dilworth Parkinson (ALS) and Singanallur Venkatakrishnan (ORNL). In addition to Bouchard and Calafiura, the other collaborators on the Particle Physics and Brain Machine Interfaces LDRD include David Donofrio (CRD), Maurice Garcia-Sciveres and Rebecca Carney (Physics), and David Clarke and Jesse Livezey (UCB, CRD).

The work was funded through Berkeley Lab’s Laboratory Directed Research and Development (LDRD) program designed to seed innovative science and new research directions. ALS and NERSC are DOE Office of Science User Facilities.

The Office of Science of the U.S. Department of Energy supports Berkeley Lab. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit


Monday November 01, 2010, 12:50 PM

Rensselaer Smart Lighting Engineering Research Center Announces First Deployment of New Technology on Campus

Rensselaer Polytechnic Institute (RPI)

Friday September 10, 2010, 12:40 PM

Ithaca College Will Host Regional Clean Energy Summit

Ithaca College

Tuesday July 27, 2010, 10:30 AM

Texas Governor Announces $8.4 Million Award to Create Renewable Energy Institute

Texas Tech University

Friday May 07, 2010, 04:20 PM

Creighton University to Offer New Alternative Energy Program

Creighton University

Wednesday May 05, 2010, 09:30 AM

National Engineering Program Seeks Subject Matter Experts in Energy

JETS Junior Engineering Technical Society

Wednesday April 21, 2010, 12:30 PM

Students Using Solar Power To Create Sustainable Solutions for Haiti, Peru

Rensselaer Polytechnic Institute (RPI)

Wednesday March 03, 2010, 07:00 PM

Helping Hydrogen: Student Inventor Tackles Challenge of Hydrogen Storage

Rensselaer Polytechnic Institute (RPI)

Thursday February 04, 2010, 02:00 PM

Turning Exercise into Electricity

Furman University

Thursday November 12, 2009, 12:45 PM

Campus Leaders Showing the Way to a Sustainable, Clean Energy Future

National Wildlife Federation (NWF)

Tuesday November 03, 2009, 04:20 PM

Furman University Receives $2.5 Million DOE Grant for Geothermal Project

Furman University

Thursday September 17, 2009, 02:45 PM

Could Sorghum Become a Significant Alternative Fuel Source?

Salisbury University

Wednesday September 16, 2009, 11:15 AM

Students Navigating the Hudson River With Hydrogen Fuel Cells

Rensselaer Polytechnic Institute (RPI)

Wednesday September 16, 2009, 10:00 AM

College Presidents Flock to D.C., Urge Senate to Pass Clean Energy Bill

National Wildlife Federation (NWF)

Wednesday July 01, 2009, 04:15 PM

Northeastern Announces New Professional Master's in Energy Systems

Northeastern University

Friday October 12, 2007, 09:35 AM

Kansas Rural Schools To Receive Wind Turbines

Kansas State University

Thursday August 17, 2006, 05:30 PM

High Gas Prices Here to Stay, Says Engineering Professor

Rowan University

Wednesday May 17, 2006, 06:45 PM

Time Use Expert's 7-Year Fight for Better Gas Mileage

University of Maryland, College Park
