Project:
animatlab
Code Location:

http://animatlab.googlecode.com/svn/trunk

Open Hub Project Analysis
Basic Information
Code Locations: 1
SCM Types: Subversion   
Files: 1,068
Lines Of Code: 201,355
Published On: May 30, 2013 (11:43 AM)
Why do we need AnimatLab?

A central goal of neuroscience is to understand how the nervous system is organized to control behavior. Behavior is controlled by neural circuits that link sensory inputs to decision networks, and decision elements to motor networks and muscles. The dynamics of this interaction are central to the functional control of behavior. Each movement generates its own sensory input and changes the animal's position and perspective in the world. To govern behavior correctly, the nervous system must both predict and respond to the consequences of the animal's own movements, and do so on a millisecond-to-second time scale. Despite the importance of this dynamic relationship between neural function and behavior, it is poorly understood because of technical limitations in our ability to record neural activity in freely behaving animals. The kinematics and dynamics of many behaviors are well understood, and the neural circuitry underlying behavior patterns in a variety of animals has been mapped and described in anesthetized or restrained animals, or in preparations where the nervous system has been isolated from the periphery. Investigators have then been left to imagine how the operation of these circuits might produce the behavior patterns observed in the intact animal, but without any way to test those imaginings.

How does AnimatLab help?

AnimatLab was written to address this problem. It provides a software environment in which models of the body and nervous system interact dynamically in a virtual physical world where all the relevant neural and physical parameters can be observed and manipulated. The program contains a 'body editor' that is used to assemble a model of the body of an animal (or part thereof) in Lego™-like fashion, by attaching different sorts of parts to each other through a variety of joint mechanisms. Muscle attachments, muscles, stretch receptors, touch sensors, and chemical sensors can then be added to provide sensory and motor capabilities. A 'neural editor' is used to assemble virtual neural circuits from a variety of model neuron and synapse types. Model sensory neurons can then be linked to the body's sensors, and motor neurons can be linked to the Hill-model muscles to complete the loop. The body is situated in a virtual physical world governed by Vortex™, a physics simulator licensed from CM-Labs, Inc. Simulations can then be run in which the animat's movements in the virtual environment are under neural control as it responds to simulated physical and experimental stimuli. The autonomous behavior of the animat is displayed graphically in 3-D alongside the time-series responses of any designated set of neural or physical parameters. This lets you close the sensory-motor feedback loop and test hypotheses about the neural control of behavior.

What can you do with AnimatLab?

AnimatLab currently provides two neural models: an abstract firing-rate model and a more realistic conductance-based integrate-and-fire model. You can also add new neural and mechanical models as plug-in modules. Several joint types and a host of body part types are available, and if none of the built-in body parts is exactly what you want, you can create that part as a mesh and use it directly in your simulations. This gives you complete control over specifying the body and the neural control system of your organism.
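As a rough illustration of the kind of computation the conductance-based integrate-and-fire model performs, here is a minimal leaky integrate-and-fire neuron in Python. This is a generic sketch of the model class, not AnimatLab's code or API; all names and parameter values are illustrative assumptions.

    # Minimal leaky integrate-and-fire neuron (illustrative sketch).
    # Parameter values are assumptions, not AnimatLab defaults.
    C_M = 1.0         # membrane capacitance (nF)
    G_LEAK = 0.05     # leak conductance (uS)
    E_LEAK = -60.0    # leak reversal potential (mV)
    V_THRESH = -50.0  # spike threshold (mV)
    V_RESET = -65.0   # post-spike reset potential (mV)
    DT = 0.1          # integration time step (ms)

    def simulate(i_inject, t_max=200.0):
        """Forward-Euler integration of C_M * dV/dt = -G_LEAK*(V - E_LEAK) + I."""
        steps = int(t_max / DT)
        v = E_LEAK
        trace, spike_times = [], []
        for i in range(steps):
            dv = (-G_LEAK * (v - E_LEAK) + i_inject) / C_M
            v += dv * DT
            if v >= V_THRESH:              # threshold crossing: record a spike...
                spike_times.append(i * DT)
                v = V_RESET                # ...and reset the membrane potential
            trace.append(v)
        return trace, spike_times

    trace, spikes = simulate(i_inject=0.6)  # 0.6 nA constant injected current
    print(f"{len(spikes)} spikes in 200 ms (~{len(spikes) / 0.2:.0f} Hz)")

Injecting a constant current drives the membrane potential toward threshold; each threshold crossing is recorded as a spike and the potential is reset. This integrate-threshold-reset loop, iterated at each time step alongside the physics engine, is the basic computation behind a conductance-based integrate-and-fire simulation.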
Below is a list of some of the organisms you can create using AnimatLab. Several of these examples have online video tutorials that walk you through building the system yourself in a simple, step-by-step process, so you can follow along as we build it.