
Archive for the ‘research’ Category

A three-dimensional, reconstructed magnetic resonance image (upper) shows a cavity caused by a spinal injury nearly filled with grafted neural stem cells, colored green. The lower image depicts neuronal outgrowth from transplanted human neurons (green) and development of putative contacts (yellow dots) with host neurons (blue).

A new study has found that a single injection of human neural stem cells produced neuronal regeneration and improvement of function and mobility in rats impaired by an acute spinal cord injury (SCI).
The human stem cells appeared to vigorously take root at the injury site and produced an array of therapeutic benefits.

Read more: http://bit.ly/16mQaOH
Image credit: UC San Diego School of Medicine
Journal article: Amelioration of motor/sensory dysfunction and spasticity in a rat model of acute lumbar spinal cord injury by human neural stem cell transplantation. Stem Cell Research & Therapy, 2013 DOI: 10.1186/scrt209

Read Full Post »

“There is this enormous mystery waiting to be unlocked, and The BRAIN Initiative will change that by giving scientists the tools they need to get a dynamic picture of the brain in action and better understand how we think and how we learn and how we remember.”

— Barack Obama

 

Read Full Post »

Image credit: Inserm / P. Latron

This is possible through the development of a “simplified artificial brain” that reproduces certain types of so-called “recurrent” connections observed in the human brain. The artificial brain system enables the robot to learn, and subsequently understand, new sentences containing a new grammatical structure. It can link two sentences together and even predict how a sentence will end before it is uttered. More info: http://bit.ly/15soo00

Journal article: Real-Time Parallel Processing of Grammatical Structure in the Fronto-Striatal System: A Recurrent Network Simulation Study Using Reservoir Computing. PLoS ONE, 2013 http://bit.ly/Xj3Zqz

Read Full Post »

A new finding by Harvard stem cell biologists turns one of the basics of neurobiology on its head — demonstrating that it is possible to turn one type of already differentiated neuron into another within the brain.

More info: http://bit.ly/Wd6hHh

Read Full Post »

Artificial neural networks (ANNs) are used today to learn solutions to parallel processing problems that have proved impossible to solve using conventional algorithms. From cloud-based voice-driven apps like Apple's Siri, to real-time knowledge-mining apps like IBM's Watson, to gaming apps like Electronic Arts' SimCity, ANNs are powering voice-recognition, pattern-classification and function-optimization algorithms perfect for acceleration with Intel hyper-threading technology.

“Artificial neural networks and hyper-threading technologies are ideally suited for each other,” says Chuck Desylva, a support engineer for Intel performance primitives. “By functionally decomposing ANN workloads, dividing them among logical processors and employing additional optimization methods, you can achieve significant performance gains.”

Desylva recently tested several widely available open-source ANN algorithms on a Pentium 4 Extreme Edition to demonstrate how using its dual threads can achieve significant speed-ups. For the forthcoming massively parallel Xeon Phi, Desylva predicts even more significant acceleration of ANN algorithms, since Xeon Phi supports four threads for each of its 50+ cores.

“I think that Xeon Phi will be a perfect fit for ANNs,” Desylva believes.

Biological neurons (upper left) are emulated by artificial neural network (ANN) mapping concepts that sum inputs (upper right) then supply an output (bottom) filtered by an activation function. Source: Intel

Artificial neural networks (ANNs) emulate how the brain's billions of neurons and trillions of synaptic connections divide and conquer tough combinatorial problems involving detection of features, perception of objects, and the cognitive functions of association, generalization and attention. By implementing multiple layers of virtual parallel processors, each simulating a layer of interconnected neurons like those found in the cerebral cortex, ANNs are capable of learning solutions to programming problems impossible to execute in real time using conventional algorithms.

For instance, ANNs enable voice-recognition systems to instantaneously match your voice against millions of stored samples, in contrast with standard algorithms that would have to serially compare your voice to each sample then calculate the best match, a task too computationally intensive for real-time execution.

To evaluate how to accelerate ANNs, Desylva adapted for hyper-threading several popular algorithms, such as the back-propagation-of-error (BPE) learning algorithm, which sends corrective feedback to previous layers in a multi-layer neural network until the desired real-time response is achieved.

Testing of these neural-learning algorithms was applied to a virtual network of 10 million neurons. Performance boosts of over 10 percent were achieved immediately by using the Streaming SIMD Extensions 2 (SSE2) and thread-safe versions of the Microsoft Standard Template Library (STL). OpenMP pragmas were then used to direct the compiler to use threading, resulting in a 20 percent overall performance increase compared to the original source. VTune was then run to show a 3-to-4-times speedup in the commands OpenMP uses to synchronize threads.

Next the same OpenMP-based optimization technique was applied to the update function, which calculates the output of each neural-network layer before passing it to the next, resulting in double the average performance of several different ANN applets.

Finally, dissecting the BPE learning algorithm itself resulted in as much as a 3.6-times speedup over the original, unmodified source.

Posted on October 16, 2012 by R. Colin Johnson, Geeknet Contributing Editor

Read Full Post »

A postdoctoral student has developed a technique for attaching thought-controlled robotic arms and their electrodes directly to the bones and nerves of amputees, a move he is calling “the future of artificial limbs”. The first volunteers will receive their new limbs early in 2013.

More info: http://bit.ly/SqVkQW

Read Full Post »

Introducing a light-sensitive protein in transgenic nerve cells … transplanting nerve cells into the brains of laboratory animals … inserting an optic fibre in the brain and using it to light up the nerve cells and stimulate them into releasing more dopamine to combat Parkinson’s disease. These things may sound like science fiction, but they are soon to become a reality in a research laboratory at Lund University in Sweden.

For more information: http://bit.ly/SthSQk

Read Full Post »

Older Posts »