I am a mariner of Odysseus with heart of fire but with mind ruthless and clear

Archive for the ‘computer science’ Category

Cell-inspired electronics

In computer science on February 26, 2010 at 3:11 pm

A single cell in the human body is approximately 10,000 times more energy-efficient than any nanoscale digital transistor, the fundamental building block of electronic chips. In one second, a cell performs about 10 million energy-consuming chemical reactions, which altogether require only about one picowatt (a millionth of a millionth of a watt) of power.
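The arithmetic behind those figures is easy to check. A quick sketch (the comparison to thermal energy kT is an illustrative addition, not a figure from the article):

```python
# Back-of-the-envelope check of the cell's energy budget, using the
# article's figures: ~10 million reactions per second at ~1 picowatt.
reactions_per_second = 1e7
cell_power_watts = 1e-12

energy_per_reaction = cell_power_watts / reactions_per_second  # joules
print(f"Energy per reaction: {energy_per_reaction:.1e} J")

# For scale: thermal energy kT at body temperature (~310 K).
k_boltzmann = 1.380649e-23  # J/K
kT = k_boltzmann * 310
print(f"... roughly {energy_per_reaction / kT:.0f} kT per reaction")
```

That is, each reaction costs on the order of a few tens of kT, not far above the thermodynamic floor, which is why the cell is such a demanding benchmark for electronics.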

MIT’s Rahul Sarpeshkar is now applying architectural principles from these ultra-energy-efficient cells to the design of low-power, highly parallel, hybrid analog-digital electronic circuits. Such circuits could one day be used to create ultra-fast supercomputers that predict complex cell responses to drugs. They may also help researchers to design synthetic genetic circuits in cells.

In his new book, Ultra Low Power Bioelectronics (Cambridge University Press, 2010), Sarpeshkar outlines the deep underlying similarities between the chemical reactions that occur in a cell and the flow of current through an analog circuit. He discusses how biological cells perform reliable computation with unreliable components and noise (random variations in signals, whether electronic or genetic). Future circuits built on similar design principles could be made robust to electronic noise and unreliable components while remaining highly energy-efficient. Promising applications include image processors in cell phones and brain implants for the blind.

“Circuits are a language for representing and trying to understand almost anything, whether it be networks in biology or cars,” says Sarpeshkar, an associate professor of electrical engineering and computer science. “There’s a unified way of looking at the biological world through circuits that is very powerful.”

Circuit designers already know hundreds of strategies to run analog circuits at low power, amplify signals, and reduce noise, which have helped them design low-power electronics such as mobile phones, mp3 players and laptop computers.

“Here’s a field that has devoted 50 years to studying the design of complex systems,” says Sarpeshkar, referring to electrical engineering. “We can now start to think of biology in the same way.” He hopes that physicists, engineers, biologists and biological engineers will work together to pioneer this new field, which he has dubbed “cytomorphic” (cell-inspired or cell-transforming) electronics.

Finding connections

Sarpeshkar, an electrical engineer with many years of experience in designing low-power and biomedical circuits, has frequently turned his attention to finding and exploiting links between electronics and biology. In 2009, he designed a low-power radio chip that mimics the structure of the human cochlea to separate and process cell phone, Internet, radio and television signals more rapidly and with more energy efficiency than had been believed possible.

That chip, known as the RF (radio frequency) cochlea, is an example of “neuromorphic electronics,” a 20-year-old field founded by Carver Mead, Sarpeshkar’s thesis advisor at Caltech. Neuromorphic circuits mimic biological structures found in the nervous system, such as the cochlea, retina and brain cells.

Sarpeshkar’s expansion from neuromorphic to cytomorphic electronics is based on his analysis of the equations that govern the dynamics of chemical reactions and the flow of electrons through analog circuits. He has found that those equations, which predict the reaction’s (or circuit’s) behavior, are astonishingly similar, even in their noise properties.

Chemical reactions (for example, the formation of water from hydrogen and oxygen) occur at a reasonable rate only if enough energy is available to lower the barriers that prevent them from occurring. A catalyst such as an enzyme can lower those barriers. Similarly, in a transistor, the input voltage supplies energy that lowers the barrier electrons must overcome to flow from the transistor’s source to its drain. Raising the input voltage lowers the barrier and increases the current, just as adding an enzyme to a chemical reaction speeds it up.
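The parallel can be made concrete: under ideal Boltzmann assumptions, both the Arrhenius law for reaction rates and the subthreshold current of a MOS transistor are exponentials in (barrier energy)/(thermal energy). A minimal numerical sketch, with all specific values chosen for illustration:

```python
import math

kB = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0            # room temperature, K
q = 1.602176634e-19  # electron charge, C

def reaction_rate(barrier_eV, prefactor=1.0):
    """Arrhenius law: rate falls exponentially with the energy barrier."""
    return prefactor * math.exp(-barrier_eV * q / (kB * T))

def subthreshold_current(v_gs, i0=1.0, n=1.0):
    """Subthreshold MOS current: rises exponentially as the gate voltage
    lowers the source-to-channel barrier (ideal slope factor n = 1)."""
    return i0 * math.exp(q * v_gs / (n * kB * T))

# Lowering a chemical barrier by 60 meV (as a catalyst might) and raising
# the gate voltage by 60 mV multiply the flow by the same factor (~10x).
speedup = reaction_rate(0.44) / reaction_rate(0.50)
gain = subthreshold_current(0.06) / subthreshold_current(0.0)
print(round(speedup, 1), round(gain, 1))
```

The two ratios come out identical because the same exponential physics governs both, which is the mathematical core of the analogy Sarpeshkar exploits.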

Essentially, cells may be viewed as circuits that use molecules, ions, proteins and DNA instead of electrons and transistors. That analogy suggests that it should be possible to build electronic chips — what Sarpeshkar calls “cellular chemical computers” — that mimic chemical reactions very efficiently and on a very fast timescale.

One potentially powerful application of such circuits is in modeling genetic networks — the interplay of genes and proteins that controls a cell’s function and fate. In a paper presented at the 2009 IEEE Symposium on Biological Circuits and Systems, Sarpeshkar designed a circuit that allows any genetic network reaction to be simulated on a chip. For example, circuits can simulate the interactions between genes involved in lactose metabolism and the transcription factors that regulate their expression in bacterial cells.
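As a rough illustration of the kind of dynamics such a chip would emulate, here is a toy software simulation of a lac-like gene that stays repressed until an inducer (lactose) appears. All parameter values and the Hill-function form are invented for illustration and are not from the paper:

```python
# Toy gene-repression model: protein production is throttled by an active
# repressor, and the inducer deactivates the repressor.
def simulate(inducer, steps=5000, dt=0.01):
    """Euler-integrate d[protein]/dt = production / (1 + (R/K)^n) - decay * protein."""
    K, n, production, decay = 1.0, 2.0, 10.0, 1.0
    repressor = 5.0 / (1.0 + inducer)  # inducer disables the repressor
    protein = 0.0
    for _ in range(steps):
        dpdt = production / (1.0 + (repressor / K) ** n) - decay * protein
        protein += dpdt * dt
    return protein

print(round(simulate(inducer=0.0), 2))   # repressed: low expression
print(round(simulate(inducer=50.0), 2))  # induced: near-full expression
```

An analog chip performs this same integration continuously in current and voltage rather than step by step in software, which is where the speed and efficiency gains come from.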

In the long term, Sarpeshkar plans to develop circuits that mimic interactions within entire cellular genomes, which are important in enabling scientists to understand and treat complex diseases such as cancer and diabetes. Eventually, researchers may be able to use such chips to simulate the entire human body, he believes. Such chips would be far faster than today’s computer simulations, which are highly inefficient at modeling the effects of noise in the large-scale nonlinear circuits within cells.

He is also investigating how circuit design principles can help genetically engineer cells to perform useful functions, for example, the robust and sensitive detection of toxins in the environment.

Sarpeshkar’s focus on modeling cells as analog rather than digital circuits offers a new approach that will expand the frontiers of synthetic biology, says James Collins, professor of biomedical engineering at Boston University. “Rahul has nicely laid a foundation that many of us in synthetic biology will be able to build on,” he says.

Provided by Massachusetts Institute of Technology

Basic quantum computing circuit built

In computer science, physics, science on February 26, 2010 at 3:01 pm

Exerting delicate control over a pair of atoms within a mere seven-millionths-of-a-second window of opportunity, physicists at the University of Wisconsin-Madison created an atomic circuit that may help quantum computing become a reality.

Quantum computing represents a new paradigm in information processing that may complement classical computers. Much of the dizzying rate of increase in traditional computing power has come as transistors shrink and pack more tightly onto chips — a trend that cannot continue indefinitely.

“At some point in time you get to the limit where a single transistor that makes up an integrated circuit is one atom, and then you can no longer predict how the transistor will work with classical methods,” explains UW-Madison physics professor Mark Saffman. “You have to use the physics that describes atoms — quantum mechanics.”

At that point, he says, “you open up completely new possibilities for processing information. There are certain calculational problems… that can be solved exponentially faster on a quantum computer than on any foreseeable classical computer.”

With fellow physics professor Thad Walker, Saffman successfully used neutral atoms to create what is known as a controlled-NOT (CNOT) gate, a basic type of circuit that will be an essential element of any quantum computer. Described in a paper published Jan. 8, the work is the first demonstration of a quantum gate between two uncharged atoms.

The use of neutral atoms rather than charged ions or other materials distinguishes the achievement from previous work. “The current gold standard in experimental quantum computing has been set by trapped ions… People can run small programs now with up to eight ions in traps,” says Saffman.

However, to be useful for computing applications, systems must contain enough quantum bits, or qubits, to be capable of running long programs and handling more complex calculations. An ion-based system presents challenges for scaling up because ions interact strongly with each other and with their environment, making them difficult to control.

“Neutral atoms have the advantage that in their ground state they don’t talk to each other, so you can put more of them in a small region without having them interact with each other and cause problems,” Saffman says. “This is a step forward toward creating larger systems.”

The team used a combination of lasers, extreme cold (a fraction of a degree above absolute zero), and a powerful vacuum to immobilize two rubidium atoms within “optical traps.” They used another laser to excite the atoms to a high-energy state to create the CNOT quantum gate between the two atoms, also achieving a property called entanglement in which the states of the two atoms are linked such that measuring one provides information about the other.
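In the abstract, the CNOT gate and the entanglement it produces can be sketched in a few lines of linear algebra (a mathematical idealization of the gate, not a model of the laser-and-trap apparatus):

```python
import numpy as np

# Two-qubit basis order: |00>, |01>, |10>, |11>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard on one qubit
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],   # flips the target qubit
                 [0, 1, 0, 0],   # when the control qubit is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start both qubits in |0>, put the control in superposition, apply CNOT.
state = np.kron(H, I2) @ np.array([1.0, 0.0, 0.0, 0.0])
bell = CNOT @ state
print(bell)  # equal amplitudes on |00> and |11>, zero elsewhere
```

The resulting state cannot be written as a product of two single-qubit states: measuring one atom immediately determines the other, which is exactly the entanglement property described above.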

Writing in the same journal issue, another team also entangled neutral atoms but without the CNOT gate. Creating the gate is advantageous because it allows more control over the states of the atoms, Saffman says, as well as demonstrating a fundamental aspect of an eventual quantum computer.

The Wisconsin group is now working toward arrays of up to 50 atoms to test the feasibility of scaling up their methods. They are also looking for ways to link qubits stored in atoms with qubits stored in light with an eye toward future communication applications, such as “quantum internets.”

Source: http://www.physorg.com/print186333950.html