
Archive for the ‘computational neuroscience’ Category

Ion channels are important drug targets. A young team of researchers led by pharmacologist Anna Stary-Weinzinger from the Department of Pharmacology and Toxicology, University of Vienna, investigated the opening and closing mechanisms of these channels. For the first time, the full energy landscape of such a large protein (more than 400 amino acids) could be calculated in atomic detail. The scientists identified a phenylalanine that plays a key role in the transition between the open and closed states. The time-consuming calculations were performed on the Vienna Scientific Cluster (VSC), currently the fastest computer in Austria.

Recently, the results were published in PLOS Computational Biology.

Every cell of our body is separated from its environment by a lipid bilayer. To maintain biological function and to transduce signals, special proteins, so-called ion channels, are embedded in this membrane. Anna Stary-Weinzinger and Tobias Linder from the University of Vienna and Bert de Groot from the Max Planck Institute for Biophysical Chemistry in Göttingen identified a key amino acid, phenylalanine 114, which plays an essential role in the opening and closing of these ion channels: a conformational change of this phenylalanine triggers channel opening.

“These proteins are highly selective: they can distinguish between different ions such as sodium, potassium or chloride, and allow ion flux rates of up to 100 million ions per second,” explains Stary-Weinzinger, leader of the research project and postdoc at the Department of Pharmacology and Toxicology of the University of Vienna. “These molecular switches regulate numerous essential body functions such as transduction of nerve signals, regulation of the heart rhythm or release of neurotransmitters. Slight changes in function, caused by the replacement of single amino acids, can lead to severe diseases such as arrhythmias, migraine, diabetes or cancer.”

Knowledge of ion channel function provides the basis for better drugs

About 10 percent of current pharmaceuticals target ion channels, so a detailed understanding of these proteins is essential for developing drugs with improved risk-benefit profiles. A detailed knowledge of the functional mechanisms of these channels is an important basis for drug development. However, many open questions remain; in particular, the energy profile and the pathway of channel opening and closure are far from understood.

Computer simulations visualize ion channel movements

To watch these fascinating proteins at work, molecular dynamics simulations are necessary. The computationally intensive calculations were performed with the help of the Vienna Scientific Cluster (VSC), the fastest high-performance computer in Austria, operated jointly by the University of Vienna, the Vienna University of Technology and the University of Natural Resources and Applied Life Sciences Vienna. With the help of the VSC, the free-energy landscape of ion channel gating could be investigated for the first time. The researchers discovered that the open and closed channel states are separated by two energy barriers of different height.

Phenylalanine triggers conformational changes

Surprisingly, the dynamics of a specific amino acid, phenylalanine 114, are coupled to the first, smaller energy barrier. “This side chain acts as a molecular switch to release the channel from the closed state,” explains Tobias Linder, PhD student at the University of Vienna. After these local changes, the channel undergoes large global rearrangements, leading to a fully open state. This second transition, from an intermediate to a fully open pore, is accompanied by a second, larger energy barrier.
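The two-barrier picture can be sketched numerically. Below is a toy one-dimensional free-energy profile along a hypothetical gating coordinate, with a small first barrier and a larger second one separating closed, intermediate and open states; the functional form and all numbers are illustrative assumptions, not values from the study.

```python
import numpy as np

# Toy 1D free-energy profile along a gating coordinate x: three wells
# (closed, intermediate, open) separated by two barriers of different
# height. All parameters are made up for illustration.
def free_energy(x):
    return (8.0 * np.exp(-((x - 0.25) ** 2) / 0.002)     # small first barrier
            + 20.0 * np.exp(-((x - 0.75) ** 2) / 0.002)  # large second barrier
            - 5.0 * np.exp(-((x - 0.5) ** 2) / 0.01))    # intermediate well

x = np.linspace(0.0, 1.0, 2001)
g = free_energy(x)

# Locate the two barrier tops (interior local maxima of the profile)
maxima = [i for i in range(1, len(x) - 1) if g[i] > g[i - 1] and g[i] > g[i + 1]]
barriers = sorted(g[i] for i in maxima)
print(f"first barrier ~{barriers[0]:.1f} kJ/mol, second ~{barriers[1]:.1f} kJ/mol")
```

In the paper itself the profile is computed from atomistic simulations rather than an analytic formula, but the analysis step, finding the heights of the maxima separating the states, is the same idea.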

This research project is financed by the FWF-doctoral program “Molecular Drug Targets” (MolTag), which is led by Steffen Hering, Head of the Department of Pharmacology and Toxicology of the Faculty of Life Sciences, University of Vienna.

Story Source:

The above story is based on materials provided by University of Vienna.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

 

Journal Reference:

  1. Tobias Linder, Bert L. de Groot, Anna Stary-Weinzinger. Probing the Energy Landscape of Activation Gating of the Bacterial Potassium Channel KcsA. PLoS Computational Biology, 2013; 9 (5): e1003058. DOI: 10.1371/journal.pcbi.1003058

Read Full Post »


At the recent meeting of the American Association for the Advancement of Science in Boston, neuroscientists outlined several lines of promising brain-computer interface research. Advances in microprocessors, computing, and materials science, for example, have facilitated the development of “epidermal electronics,” which combine wireless communications, neural sensors, and other medical sensors into patches small and flexible enough to serve as temporary tattoos.

Read more: http://goo.gl/97eki

Read Full Post »

Image credit: Inserm / P. Latron

This is possible through the development of a “simplified artificial brain” that reproduces certain types of so-called “recurrent” connections observed in the human brain. The artificial brain system enables the robot to learn, and subsequently understand, new sentences containing a new grammatical structure. It can link two sentences together and even predict how a sentence will end before it is uttered. More info: http://bit.ly/15soo00
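The “recurrent” connections mentioned here are implemented in the study via reservoir computing: a fixed, randomly connected recurrent network (the reservoir) is driven by input, and only a simple linear readout is trained. A minimal echo state network sketch, with an illustrative next-step sine-prediction task standing in for sentence processing (all sizes and parameters are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: fixed random recurrent weights, scaled so dynamics stay stable
n_res, leak = 100, 0.3
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1

u = np.sin(np.linspace(0, 20 * np.pi, 2000))  # input signal
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u)):
    # Leaky-integrator update: the recurrent term gives the network memory
    x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * u[t] + W @ x)
    states[t] = x

# Only the linear readout is trained (ridge regression), to predict u[t+1]
X, y = states[200:-1], u[201:]  # drop an initial washout period
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
pred = X @ W_out
print(f"mean squared prediction error: {np.mean((pred - y) ** 2):.2e}")
```

The appeal of this design, and presumably why it suits realtime sentence processing, is that the expensive recurrent part is never trained; learning touches only the readout.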

Journal article: Real-Time Parallel Processing of Grammatical Structure in the Fronto-Striatal System: A Recurrent Network Simulation Study Using Reservoir Computing. PLoS ONE, 2013 http://bit.ly/Xj3Zqz

Read Full Post »

Artificial neural networks (ANNs) are used today to learn solutions to parallel processing problems that have proved impossible to solve using conventional algorithms. From cloud-based voice-driven apps like Apple’s Siri, to realtime knowledge-mining apps like IBM’s Watson, to gaming apps like Electronic Arts’ SimCity, ANNs are powering voice-recognition, pattern-classification and function-optimization algorithms perfect for acceleration with Intel hyper-threading technology.

“Artificial neural networks and hyper-threading technologies are ideally suited for each other,” says Chuck Desylva, a support engineer for Intel performance primitives. “By functionally decomposing ANN workloads, dividing them among logical processors and employing additional optimization methods, you can achieve significant performance gains.”

Desylva recently tested several widely available open-source ANN algorithms on a Pentium 4 Extreme Edition to demonstrate how using its dual threads can achieve significant speed-ups. For the forthcoming massively parallel Xeon Phi, Desylva predicts even more significant acceleration of ANN algorithms, since Xeon Phi supports four threads for each of its 50+ cores.

“I think that Xeon Phi will be a perfect fit for ANNs,” Desylva believes.

Biological neurons (upper left) are emulated by artificial neural network (ANN) mapping concepts that sum inputs (upper right) then supply an output (bottom) filtered by an activation function. Source: Intel


Artificial neural networks (ANNs) emulate how the brain’s billions of neurons and trillions of synaptic connections divide and conquer tough combinatorial problems involving detection of features, perception of objects, and the cognitive functions of association, generalization and attention. By implementing multiple layers of virtual parallel processors, each simulating a layer of interconnected neurons like those found in the cerebral cortex, ANNs are capable of learning solutions to programming problems impossible to execute in realtime using conventional algorithms.
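The neuron model in the caption above (sum the weighted inputs, then filter the sum through an activation function) can be sketched directly; the weights, inputs and sigmoid choice here are illustrative assumptions, not Intel’s code.

```python
import math

def sigmoid(z):
    # Activation function that filters the summed input into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias  # weighted sum
    return sigmoid(z)                                       # filtered output

def layer(inputs, weight_matrix, biases):
    # One layer of parallel neurons, all reading the same inputs,
    # like a layer of interconnected cortical neurons
    return [neuron(inputs, w, b) for w, b in zip(weight_matrix, biases)]

out = layer([0.5, -1.0], [[0.8, 0.2], [-0.4, 0.9]], [0.1, 0.0])
print(out)
```

Each neuron in a layer is independent of its siblings, which is exactly the property that makes the workload easy to divide among hardware threads.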

For instance, ANNs enable voice-recognition systems to instantaneously match your voice against millions of stored samples, in contrast with standard algorithms that would have to serially compare your voice to each sample and then calculate the best match, a task too computationally intensive for realtime execution.

To evaluate how to accelerate ANNs, Desylva adapted for hyper-threading several popular algorithms, such as the back-propagation-of-error (BPE) learning algorithm that sends corrective feedback to previous layers in a multi-layer neural network until the desired real time response is achieved.
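The BPE idea described above can be sketched on a toy scale: the output error is propagated backward layer by layer, and each layer’s weights are nudged to reduce it. This hypothetical 2-2-1 network learns XOR, the classic problem a single layer cannot solve; initial weights are seeded near a working solution so the run is deterministic, and all numbers are illustrative rather than from Desylva’s tests.

```python
import math

def sig(z):
    return 1.0 / (1.0 + math.exp(-z))

# [w1, w2, bias] per neuron; hidden units start as rough OR/AND detectors,
# output as "OR and not AND", i.e. near an XOR solution
w_h = [[4.0, 4.0, -2.0], [4.0, 4.0, -6.0]]
w_o = [4.0, -8.0, -2.0]

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sig(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, o

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
lr = 0.5
for _ in range(2000):
    for x, t in data:
        h, o = forward(x)
        d_o = (o - t) * o * (1 - o)  # output-layer error signal
        # Corrective feedback sent backward to the hidden layer
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            w_o[j] -= lr * d_o * h[j]
            w_h[j][0] -= lr * d_h[j] * x[0]
            w_h[j][1] -= lr * d_h[j] * x[1]
            w_h[j][2] -= lr * d_h[j]
        w_o[2] -= lr * d_o

err = sum((forward(x)[1] - t) ** 2 for x, t in data)
print(f"summed squared error after training: {err:.4f}")
```

The per-neuron weight updates in the inner loop are independent of one another, which is what makes the algorithm amenable to the functional decomposition across threads discussed below.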

Testing of these neural-learning algorithms was applied to a virtual network of 10 million neurons. Performance boosts of over 10 percent were achieved immediately by using the Streaming SIMD Extensions 2 (SSE2) and thread-safe versions of the Microsoft Standard Template Library (STL). OpenMP pragmas were then used to direct the compiler to use threading, resulting in a 20 percent overall performance increase compared to the original source. VTune was then run to show a 3-to-4 times speedup in the commands OpenMP uses to synchronize threads.

Next the same OpenMP-based optimization technique was applied to the update function, which calculates the output of each neural-network layer before passing it to the next, resulting in double the average performance of several different ANN applets.

Finally, dissecting the BPE learning algorithm itself yielded as much as a 3.6-times speedup over the original, unmodified source.

Posted on October 16, 2012 by R. Colin Johnson, Geeknet Contributing Editor

Read Full Post »