

A SHOUT OUT at the State of the Union address tells a federal agency its work finds favor at the highest levels. So when President Obama lauded brain research as an example of how the government should “invest in the best ideas,” National Institutes of Health program directors were thrilled, thinking he meant their current efforts to map the static brain for answers to Alzheimer’s and other ills. In fact, Obama had a bigger challenge in mind for NIH: a $3 billion, 10-year initiative to plot the complex activity of some 100 billion cells that make up the typical human brain — an undertaking that could yield the same broad technological breakthroughs and economic returns as the Human Genome Project or America’s war on cancer.

“The brain is the last great frontier,” says Story Landis, head of the NIH National Institute of Neurological Disorders and Stroke (NINDS), whose agency may collaborate on the Brain Activity Map (BAM) with the National Science Foundation, Defense Advanced Research Projects Agency, and other government research arms. “It’s what makes us human, how we think, how we write poetry. And the burden of disease that affects the brain is pretty extraordinary,” she told USA Today.

Equally extraordinary is the potential reward from mapping this frontier, a reason the National Academy of Engineering made reverse engineering the human brain one of the profession’s 21st-century Grand Challenges. By figuring out the brain’s inner workings, engineers can simulate its activities and develop new drugs for mental disorders and smarter prosthetics for amputees. Studying neural circuitry may one day spawn implants that work around damaged tissue to halt dementia’s memory loss or let blind people see. BAM could yield advances as well in artificial intelligence, robotics, and manufacturing. Learning how the brain learns also has implications for the design of supercomputers that can process multiple streams of information in parallel rather than step by step, as today’s machines do.




While critics have questioned devoting $300 million a year to this one initiative, the quest to develop a Brain Activity Map has generated some of the excitement and urgency of the space race. The federal money, if approved by Congress, would gain added leverage from privately funded and overseas research efforts. Earlier this year, the European Union announced a $1.5 billion, decade-long initiative led by Swiss researchers to build a supercomputer simulation of the human brain based on the inner workings of its molecules, neurons, and circuits. Meanwhile, several private ventures — including a nonprofit research institute founded by Microsoft cofounder Paul Allen, who lost his mother to Alzheimer’s disease — have invested heavily in figuring out how the brain is wired.

The engineering and scientific challenges of mapping the brain are enormous, calling to mind the words of James Watson, codiscoverer of DNA: “The brain boggles the mind.” Reverse-engineering means dismantling something to find out how it works and then building copies. That may work for a machine, but the human brain is no bucket of bolts. It has circuitry and a complex wiring pattern honed by evolution. At the most basic level — neurons — a nerve impulse travels down the cell’s long, tail-like axon and triggers the release of chemicals called neurotransmitters into a synapse, the space between neurons. Those chemicals then spur another neuron to fire, and thus the signal passes from one cell to the next.
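
That relay can be caricatured in a few lines of code. The sketch below is a toy threshold model, not a biophysical simulation; the threshold, leak, and synaptic-weight constants are arbitrary illustrations of how repeated presynaptic spikes can sum until the downstream neuron fires.

```python
# Toy model of one synapse: presynaptic spikes nudge the next cell's
# membrane potential; when it crosses threshold, that neuron fires too.
# All constants are illustrative, not physiological values.

THRESHOLD = 1.0   # firing threshold (arbitrary units)
LEAK = 0.9        # fraction of potential retained each time step
WEIGHT = 0.4      # effect of one presynaptic spike on the downstream cell

def run_chain(presynaptic_spikes):
    """Propagate a train of presynaptic spikes across one synapse."""
    potential = 0.0
    postsynaptic_spikes = []
    for t, spike in enumerate(presynaptic_spikes):
        potential *= LEAK                  # membrane potential decays
        if spike:
            potential += WEIGHT            # neurotransmitter depolarizes the cell
        if potential >= THRESHOLD:
            postsynaptic_spikes.append(t)  # the downstream neuron fires...
            potential = 0.0                # ...and resets
    return postsynaptic_spikes

# Three rapid spikes summate and push the downstream cell past threshold.
print(run_chain([1, 1, 1, 0, 0, 1, 0]))   # -> [2]
```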

Neuroscientists traditionally have used electrodes to detect the activity of one or several neurons within a particular region. Since brain circuits involve millions of cells, however, these methods fall far short, noted the six researchers who first proposed the BAM project in a June 2012 Neuron magazine article.

It has taken a decade for NIH researchers to develop a “circuit diagram” of the roundworm’s nervous system, which has 302 neurons that make about 7,000 connections. BAM requires a quantum leap beyond that into territory that until recently was the preserve of science fiction writers. As NIH Director Francis Collins noted in a PBS NewsHour interview, each of the human brain’s 100 billion neurons has 10,000 connections. Developing tools and techniques to track where the mind’s roughly 1,000 trillion connections occur – and recording every firing of every neuron in every circuit – is like untangling an incredibly complex bowl of spaghetti. And mapping is not simply a matter of seeing which neuron connects where. How does gray matter code and store information? Understanding how the brain cements or retrieves information, or goes haywire in people with mental illness, autism, or depression “is not going to be an overnight experience,” says Collins.
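
The scale is easy to state and hard to grasp. A back-of-envelope calculation, using the article’s neuron and connection counts plus assumed (not sourced) figures for firing rate and storage per spike, hints at the data deluge:

```python
# Back-of-envelope arithmetic behind the numbers in the text. The firing
# rate and bytes-per-spike figures below are illustrative assumptions,
# not from the article.

neurons = 100e9            # ~100 billion neurons (per the article)
synapses_per_neuron = 1e4  # ~10,000 connections each (per the article)
connections = neurons * synapses_per_neuron
print(f"{connections:.0e} connections")    # 1e+15, i.e., 1,000 trillion

# What would recording "every firing of every neuron" cost in raw data?
avg_rate_hz = 1.0          # assumed average firing rate, spikes/sec
bytes_per_spike = 8        # assumed: neuron ID plus timestamp
bytes_per_second = neurons * avg_rate_hz * bytes_per_spike
print(f"~{bytes_per_second/1e12:.0f} TB of spike data per second")
```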

Fundamental Questions

No one disputes the strides made already as a result of years of interdisciplinary research. With tools developed by engineers, like the functional magnetic resonance imaging machine (fMRI) and micro-endoscopes, science can now address such fundamental questions as “Why do we sleep?” or “What is memory?”
Neuroscientists can peer into the active brain as someone learns, remembers, sees, or snoozes. With fMRI scans that show blood flow, they can glimpse the different areas of the brain that light up, say, for strong readers versus those with dyslexia. Diffusion tensor imaging (DTI), a variation of conventional scans that tracks the movement of water molecules in the brain and can detect subtle wiring problems in axons, has been used to study addiction, schizophrenia, and traumatic brain injury.
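
DTI reduces the water motion at each voxel to a diffusion tensor, and a standard summary of that tensor is fractional anisotropy (FA): near zero where water diffuses equally in all directions, approaching one along tightly bundled axons. Below is a minimal sketch of the standard FA formula; the example eigenvalues are invented.

```python
# Fractional anisotropy from the three eigenvalues of a voxel's
# diffusion tensor: 0 = isotropic diffusion, ~1 = strongly directional
# (e.g., along a fiber bundle). Example values are made up.

import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the diffusion tensor's eigenvalues."""
    mean = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - mean)**2 + (l2 - mean)**2 + (l3 - mean)**2)
    den = math.sqrt(l1**2 + l2**2 + l3**2)
    return math.sqrt(1.5) * num / den

print(fractional_anisotropy(1.7, 0.3, 0.2))  # elongated diffusion: FA ~0.84
print(fractional_anisotropy(0.7, 0.7, 0.7))  # isotropic diffusion: FA = 0.0
```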

Advances in computer science have paid off in algorithms used in speech recognition technologies and automated “seeing” machines used in factories. Indeed, engineering and neuroscience discoveries “go hand in hand,” yielding hundreds of technologies, says bioengineer Kip Ludwig, who oversees brain-research technologies as a NINDS program director. He cites devices to record extracellular and intracellular voltage levels, neurotransmitter detectors, and specialized microscopes and voltage-sensitive dye that can display the activity of hundreds of neurons. In the collaboration, “engineers design new tools for neuroscientists to use to study the brain and neuroscientists say, ‘This is good, but what we really need now is a tool that will do ABC,’ and ask engineers for tools to do that.”

eagle

Monitoring Neurons

Mark Schnitzer, an associate professor of biology and applied physics at Stanford University and a Howard Hughes Medical Institute (HHMI) investigator, worked with his team to develop a system that “reads” the minds of mice as they run around an enclosure. The rodents have genetically engineered neurons that express a fluorescent protein when the brain cells fire, releasing calcium ions. By implanting a micro-endoscope connected to a camera chip just above the hippocampus, an area of the brain that maps an animal’s surroundings, researchers can monitor the firing of hundreds of neurons in near real time in the living, behaving mouse. By looking at these lights — they resemble random bursts of little green fireworks on the computer display — “we can literally figure out where the mouse is,” says Schnitzer, noting that different neurons would fire at specific spots. In essence, the mouse’s brain made a representational map of its space. Schnitzer’s team has linked that activity to long-term information storage — offering a potential tool for studying new therapies for Alzheimer’s and other brain diseases.
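
The “mind reading” here is, at bottom, decoding: each hippocampal place cell fires preferentially at one spot, so location can be estimated from which cells are active. The toy decoder below illustrates the idea only; the place-field centers and spike counts are invented, and Schnitzer’s actual analysis is far more sophisticated.

```python
# Toy place-cell decoder: estimate an animal's position as the
# firing-rate-weighted average of each active cell's preferred spot.
# All tuning data and spike counts are invented for illustration.

# Assumed "place field" centers for five recorded neurons (x, y).
place_fields = {
    "cell_a": (0.1, 0.2), "cell_b": (0.8, 0.3), "cell_c": (0.5, 0.5),
    "cell_d": (0.2, 0.9), "cell_e": (0.9, 0.8),
}

def decode_position(spike_counts):
    """Estimate location from how often each place cell fired."""
    total = sum(spike_counts.values())
    x = sum(n * place_fields[c][0] for c, n in spike_counts.items()) / total
    y = sum(n * place_fields[c][1] for c, n in spike_counts.items()) / total
    return x, y

# Mostly cell_b firing: the decoded position lands near its field (0.8, 0.3).
print(decode_position({"cell_a": 1, "cell_b": 9, "cell_c": 2}))
```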

Terry Sejnowski is another brain-research pioneer. For years, the HHMI investigator and neurobiology professor at the University of California, San Diego (UCSD), has tried to bridge the gap between molecules and systems. Starting his career as a physicist, he found it harder and harder to collect data without big instruments. By contrast, neuroscience offered many interesting problems to tackle. So he signed up for a neuroscience course at the Marine Biological Laboratory in Woods Hole, Mass. “That was a turning point,” says Sejnowski, who went on to get a postdoctoral position, enjoying the lab work so much he switched fields.

Now at the Salk Institute for Biological Studies in La Jolla, Calif., Sejnowski directs a lab focused on understanding the way brain mechanisms – such as how neurons communicate at the synapses or what regulates the flow of information to the cortex — link to behavior. Unlike many labs, however, his laboratory looks at the brain at multiple levels. Some teams examine the synapse, while others look at the ion channels or a whole system. They also collaborate with researchers elsewhere to figure out, for example, why we sleep. Behind this big-picture approach is Sejnowski’s fascination with how different gray matter is from computers. “When you look at the brain, you realize this is not the way an engineer would design the system,” he says in his HHMI profile. “The brain is redundant, massively parallel, and regenerative,” with neurons dedicated to one activity able to be recruited for new uses.

The approach has produced significant breakthroughs, including a state-of-the-art cellular simulation program called MCell. A collaborative effort between the Pittsburgh Supercomputing Center and the Salk Institute, with support from the NIH, HHMI, and NSF, MCell took more than 15 years to develop and allows researchers to account for every molecule and protein inside and outside a cell, and to document their activities by the microsecond. Sejnowski’s lab also has forged fruitful collaborations to probe why humans sleep. Matching EEG measurements with fMRI imaging and in vitro studies, for example, has revealed patterns of nerve impulses that correlate to sleep patterns and has yielded new algorithms to automatically classify sleep states from single EEG readings. Analysis of spike patterns overturned the long-held assumption that the most important information transmitted in the visual cortex, the section of the brain responsible for vision, was the total number of times the neurons fired. In fact, the timing of those neural spikes is important in coding and transmitting information. Ironically, notes Sejnowski, researchers know more about how the brain learns than how it pays attention.
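
That spike-timing finding can be shown in miniature: two responses can carry the identical total spike count, the “rate code,” yet differ sharply in when the spikes occur. In the sketch below, with invented spike times, a count-based readout sees no difference while a timing-based one does.

```python
# Rate code vs. timing code, in miniature. Spike times (in ms) are
# invented for illustration.

def spike_count(times):
    """The classic 'rate code' readout: just count the spikes."""
    return len(times)

def mean_interval(times):
    """A timing-sensitive readout: average gap between successive spikes."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps)

burst  = [10, 12, 14, 16]    # four spikes packed into 6 ms
spread = [10, 40, 70, 100]   # four spikes spread over 90 ms

print(spike_count(burst), spike_count(spread))      # 4 4   -> identical counts
print(mean_interval(burst), mean_interval(spread))  # 2.0 30.0 -> timing differs
```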

Deciphering the brain’s biology can lead to new technologies, and vice versa. “Once you’ve figured out a principle,” explains Sejnowski, “it becomes possible to build on that principle and develop practical devices.” For example, Tobi Delbrück of the Institute of Neuroinformatics at the University of Zurich and colleagues have built a camera based on the timing of neural spikes, which is how the eye’s retina processes images for the brain. The process of engineering a device, in turn, helps confirm that basic understanding. “If all you have is a bunch of simulations, then you have a shadow of what’s there,” says Sejnowski. “If you can actually build a device that uses the same principles, and the device works in the same way that the complex system does, then you’ve achieved a better understanding of that process.”
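
The principle behind such a retina-style camera is to report changes rather than frames: each pixel emits an event only when its brightness shifts past a threshold, so timing carries the information. The sketch below mimics that idea in software on plain 2D arrays; real event cameras do this in analog circuitry, and the threshold here is an assumed value.

```python
# Retina-inspired change detection: compare two frames and emit an
# event only where brightness moved past a threshold, with polarity
# indicating brighter (+1) or darker (-1). Threshold is an assumption.

THRESHOLD = 20  # brightness change needed to trigger an event

def frame_to_events(prev_frame, new_frame, timestamp):
    """Emit (t, x, y, polarity) events where brightness changed enough."""
    events = []
    for y, (prev_row, new_row) in enumerate(zip(prev_frame, new_frame)):
        for x, (old, new) in enumerate(zip(prev_row, new_row)):
            diff = new - old
            if abs(diff) >= THRESHOLD:
                events.append((timestamp, x, y, +1 if diff > 0 else -1))
    return events

frame0 = [[100, 100], [100, 100]]
frame1 = [[100, 150], [100,  90]]  # one pixel brightens, one dims slightly
print(frame_to_events(frame0, frame1, timestamp=0.001))
# [(0.001, 1, 0, 1)] -- only the large change produces an event
```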

University of Georgia researchers developed diffusion tensor images showing fibrous connections in the brain.




Genetic Software Decoded

At the Allen Institute for Brain Science, launched in 2003 with $500 million from Paul Allen, researchers have developed cutting-edge technologies to decode the mind’s genetic software, not just its wiring. The Seattle-based nonprofit has a mission to accelerate the progress of neuroscience research by generating publicly available databases. “Generally, the best discoveries in biology are made by connecting many data sets,” notes Allen Institute researcher Michael Hawrylycz, director of the modeling and analysis groups, “so you try to link them up.” In 2006, Allen Institute researchers completed a genetic atlas of the mouse brain. Last November, they published an atlas of the human brain. Current efforts include using a laser to peer into a live mouse’s mind through a surgically implanted glass window in its skull, capturing images that could illuminate how nerve impulses from the eyes become behavior as the animal runs.

The institute’s broad goal is to generate data in a very comprehensive and systematic way, so that the entire neuroscience community “can have the benefit of access… without having to do the experiments themselves,” says Allen Institute neuroscientist Ed Lein. “Generating any single piece of this data could occupy months to years of a particular researcher’s time to do. And then most of that data actually doesn’t become public. So the concept here is based on the model of the genome projects: Do it once, very, very well—and make it completely open access.”

To map the mouse brain’s genes, Allen Institute scientists took tissue sections and then looked for these expressed genes using a technique called in situ hybridization. This involves binding labeled probes to the tissue. The probes attach themselves to bits of RNA, which indicate the genes being expressed in those cells. In situ hybridization allows researchers to capture images of gene expression and then plot each expression’s physical location in the brain.
Since the human brain is 1,000 times as large as that of a mouse, the mapping process was even more complicated. Allen Institute researchers acquired two “control” brains for analysis — ones that were as normal as possible. That in itself was a tall order: If a person suffered from disease or took drugs, it could alter the brain’s circuitry or chemistry. Taking tissue samples, the researchers used microarrays containing probes for many possible RNA sequences. (Each brain had 1,000 different structures from which 20,000 genes were assayed.) The results then were mapped back onto coordinates of the brain obtained earlier by magnetic resonance imaging.
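
In data terms, that mapping step ties each tissue sample to both an expression readout and the brain coordinate it came from, so expression values can be laid back onto the imaged volume. Here is a bare-bones sketch of that structure, with placeholder structure names, genes, coordinates, and values rather than Allen Institute data:

```python
# Each sample pairs an expression readout with the MRI coordinate it
# came from, so any gene's levels can be overlaid on the brain volume.
# All names, coordinates, and values below are placeholders.

samples = [
    # (brain structure, MRI coordinate (x, y, z) in mm, {gene: expression})
    ("hippocampus_A", (28.0, -20.0, -14.0), {"GENE1": 7.2, "GENE2": 1.1}),
    ("cortex_B",      (-42.0, 18.0, 30.0),  {"GENE1": 2.4, "GENE2": 6.8}),
]

def expression_at(gene):
    """Return (coordinate, level) pairs ready to overlay on the MRI volume."""
    return [(coord, levels[gene]) for _, coord, levels in samples]

print(expression_at("GENE1"))
```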

“What’s really unusual about the data set,” says Hawrylycz, “is that it’s so broad anatomically that for the first time we can look at brain-wide expression patterns.” For example, researchers found genetic signatures for various regions of the brain, including patterns in the cortex, the wrinkly outer layer that does much of our higher-order information processing. They now can pinpoint which part of the cortex a tissue sample came from just by looking at the pattern of genes expressed. If researchers find that particular genes active in some regions are also active in others, “that has implications for how diseases may be treated,” Hawrylycz explains. “So we try to understand the global relationships in the brain.” Since completion of the brain atlas, the institute has acquired four more brains, generating enormous amounts of data. Researchers hope to determine the common signatures among the six, and perhaps expand into examining diseased brains.
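
Pinpointing which part of the cortex a sample came from by its gene pattern is, computationally, a classification problem. A bare-bones version compares a new sample’s expression vector with the average signature of each region; the signatures below are invented, not real atlas data.

```python
# Nearest-signature classification: assign a tissue sample to the
# region whose average expression profile it most resembles.
# Region names and expression values are invented for illustration.

signatures = {
    "visual_cortex":  [8.1, 1.2, 3.3],
    "motor_cortex":   [2.0, 7.5, 3.1],
    "frontal_cortex": [3.2, 3.0, 8.4],
}

def classify(sample):
    """Pick the region with the smallest squared distance to the sample."""
    def dist(sig):
        return sum((a - b) ** 2 for a, b in zip(sample, sig))
    return min(signatures, key=lambda region: dist(signatures[region]))

print(classify([7.9, 1.5, 3.0]))  # -> visual_cortex
```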

“Neuromorphic engineering” is the name given to the emerging field of using what neuroscientists learn about the brain’s architecture to improve computer simulations and thinking devices. IBM, creator of Deep Blue, the supercomputer that defeated chess grandmaster Garry Kasparov in 1997, and of Watson, which bested two Jeopardy! champs, is building on its 2011 development of a cognitive computing microchip. Created in collaboration with DARPA’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNAPSE, project, the TrueNorth chip uses advanced algorithms and silicon circuitry to “simulate the phenomena between spiking neurons and synapses in the brain,” according to the company’s announcement. Last November, IBM Research announced it had used the chip to simulate 530 billion neurons and 100 trillion synapses by mimicking the connectivity of a monkey’s brain. IBM Research and four university labs now are working “to bring together neuroscience, supercomputing, and nanotechnology to create a radically different computer architecture that mimics the function, low power, small size, and real time of the human brain,” electrical engineer Dharmendra Modha, manager of cognitive computing at IBM’s Almaden Research Center in San Jose, Calif., explains in an IBM video.
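
A toy flavor of what such a chip computes: a small population of spiking units wired by a weight matrix, updated step by step, with one unit driven by external input. This is a generic illustration of spiking-network simulation, not IBM’s architecture, and every constant is arbitrary.

```python
# Minimal spiking network: units integrate input, leak, fire at a
# threshold, and reset; last step's spikes become this step's input.
# Weights and constants are arbitrary, for illustration only.

import random

N = 5
random.seed(1)
weights = [[random.uniform(0.0, 0.9) if i != j else 0.0 for j in range(N)]
           for i in range(N)]       # random wiring, no self-connections
potential = [0.0] * N
THRESHOLD, LEAK = 1.0, 0.8

spikes = []
for step in range(8):
    inputs = [0.6 if i == 0 else 0.0 for i in range(N)]   # steady drive to unit 0
    for src in spikes:                                    # plus last step's spikes
        for dst in range(N):
            inputs[dst] += weights[src][dst]
    spikes = []
    for i in range(N):
        potential[i] = potential[i] * LEAK + inputs[i]    # leak, then integrate
        if potential[i] >= THRESHOLD:
            spikes.append(i)                              # unit fires...
            potential[i] = 0.0                            # ...and resets
    # Activity spreads from the driven unit through the weight matrix.
    print(f"step {step}: spikes at {spikes}")
```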

Despite such achievements, the era of brain-in-the-box learning systems remains a long way from becoming reality. “We have not built a biologically realistic simulation of the complete human brain,” IBM’s TrueNorth team cautioned in a paper. While experiments involving neuromorphic networks have included learning to play the game of Pong, the knowledge was gained offline — not acquired on the fly, as humans learn.

The point, Modha says, is to draw inspiration from the brain, not construct an artificial one. In the world of cognitive computing, that means designing networks capable of parallel, short, complex thinking. For Ralph Greenspan, associate director of UCSD’s Kavli Institute and one of the six researchers who first proposed a large-scale, public Brain Activity Map project, the key to “functional connectomics” lies in designing novel imaging techniques and nanoprobe-sensing tools. The project, which emerged from a 2011 conference hosted by his institute, could help answer questions such as how humans move their fingers or understand economics. Meanwhile, Greenspan told local public radio station KPBS, the “fancy, science fiction-y new kinds of detectors” needed to record and analyze all that neuroactivity could produce technological spinoffs that have nothing to do with the brain.

So far, engineers have “done incredible proof-of-concept demonstrations,” says NIH’s Ludwig, but commercially viable products – like a durable, inexpensive thought-controlled prosthesis as opposed to a multi-million-dollar, fragile prototype – are in some cases years away. Still, the potential job creation clearly fires the president’s imagination. In his State of the Union speech, he noted that every dollar spent on the Human Genome Project returned $140 to the U.S. economy.

You don’t need a map to find those connections.


Mary Lord is deputy editor of Prism. Corinna Wu is a print and radio journalist specializing in science.


Photo by Lance Long; courtesy Electronic Visualization Laboratory, University of Illinois at Chicago

 
