Mechanisms and Perception of Vocalization in Canids
The work will start with a thorough exploration of canid vocal production (both source and filter components); the second step will be the construction of a dog vocal tract model and synthesizer. This tool will then be used in playback experiments to generate synthetic stimuli, allowing us to understand the various acoustic cues present in dog growls, barks, whines and howls, and how these are perceived and categorized by other listening dogs.
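As a purely illustrative sketch (the actual synthesizer, sampling rate, and formant values are all to be determined in the project), the source-filter idea behind such a model can be shown in a few lines of Python: a glottal pulse train serves as the source, and second-order resonators stand in for vocal-tract formants as the filter. Every parameter value below is hypothetical.

```python
import math

def pulse_train(f0, sr, dur):
    """Source: a simple glottal pulse train at fundamental frequency f0 (Hz)."""
    n = int(sr * dur)
    period = int(sr / f0)
    return [1.0 if i % period == 0 else 0.0 for i in range(n)]

def resonator(signal, freq, bandwidth, sr):
    """Filter: a two-pole resonator approximating one vocal-tract formant."""
    r = math.exp(-math.pi * bandwidth / sr)   # pole radius set by bandwidth
    theta = 2.0 * math.pi * freq / sr         # pole angle set by centre frequency
    a1 = -2.0 * r * math.cos(theta)
    a2 = r * r
    g = 1.0 - r                               # rough gain scaling, sketch only
    out, y1, y2 = [], 0.0, 0.0
    for x in signal:
        y = g * x - a1 * y1 - a2 * y2         # second-order IIR difference equation
        out.append(y)
        y1, y2 = y, y1
    return out

def growl(f0=90.0, formants=((450, 120), (1100, 180)), sr=16000, dur=0.25):
    """Cascade the source through one resonator per (hypothetical) formant."""
    out = pulse_train(f0, sr, dur)
    for freq, bw in formants:
        out = resonator(out, freq, bw, sr)
    return out
```

Varying `f0` independently of the formant list is exactly what makes such a synthesizer useful for playback experiments: source and filter cues can be manipulated separately.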
Acoustics of Individual Displays in Ravens
We will conduct a detailed analysis of the highly idiosyncratic and multimodal displays of adult ravens from both structural/acoustic and gestural perspectives. We will use principles of vocal production to analyze the use of source and filter components in these displays, and use repeated-measures analysis within birds to estimate the reliability of these different cues. We will then use touch-screen experiments to explore ravens' classification of calls whose acoustic and visual properties are experimentally varied. Finally, we will perform a combined environmental/genetic analysis of how these different aspects of displays are shaped by upbringing and by genetics, and thus how the uncovered acoustic features may provide reliable cues to genotype, provenance, and individual identity.
Fitch: Additional Projects
The two projects below will be funded by the ERC Advanced Grant SOMACCA, to Tecumseh Fitch. This grant is focused on the cognitive processes underlying pattern perception and rule learning in vision, speech and music, in both humans and other animals. We use the theoretical framework of formal language theory (e.g. the Chomsky hierarchy, context-free languages), studied experimentally using the paradigm of "artificial grammar learning".
The overarching goal of this grant is to understand the evolution of syntax, and the degree to which linguistic syntax is based on mechanisms shared with other cognitive domains (music, abstract visual art) and other species (focusing on primates and birds). We seek students with an interest in the evolution of language, music and art who wish to work with both humans and nonhuman animals; strong quantitative skills (e.g. computer science or computational linguistics) are a definite plus.
Comparative Pattern Perception in Humans, Birds and Monkeys
We will directly compare pattern perception in many species, using auditory and visual patterns that vary in complexity. The species studied include humans, chimpanzees and squirrel monkeys, and pigeons, ravens and keas (a large-brained New Zealand parrot species). Stimuli include visual abstract patterns, strings of spoken syllables, and the animals' own vocalizations. Patterns are generated, and their complexity is evaluated, using formal language theory. All experiments are conducted using custom touch-screen-based systems, so that all animals can make responses in a species-typical manner (fingers for primates, beak pecks for birds). Students with a background in animal experimentation and/or software design and theory will be especially suitable for this project (we use the programming language Python).
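To make the formal-language framing concrete, here is a minimal Python sketch of two stimulus grammars commonly contrasted in artificial grammar learning: (AB)^n strings, generable by a finite-state (regular) grammar, versus A^nB^n strings, which require a context-free grammar. The syllable sets are hypothetical placeholders; the project's actual stimuli and complexity measures are its own design decisions.

```python
import random

SYLLABLES_A = ["ba", "di", "gu"]   # hypothetical class-A syllables
SYLLABLES_B = ["ko", "pe", "to"]   # hypothetical class-B syllables

def regular_string(n):
    """(AB)^n: n alternating A-B pairs, a finite-state (regular) pattern."""
    return [s for _ in range(n)
            for s in (random.choice(SYLLABLES_A), random.choice(SYLLABLES_B))]

def context_free_string(n):
    """A^n B^n: n A-syllables followed by n B-syllables, a context-free pattern."""
    return ([random.choice(SYLLABLES_A) for _ in range(n)] +
            [random.choice(SYLLABLES_B) for _ in range(n)])
```

The same generators can emit strings of spoken syllables for auditory trials or index into image sets for visual trials, which is what allows complexity to be matched across modalities and species.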
Web-based Perceptual Experiments in Pattern Learning
We are interested in cognition in humans as well as other animals. For broad insights into humans generally, it would be inadvisable to limit our subject pool to undergraduate students from Europe. To ensure that our results are not culturally contingent, we will develop a web-based experimental site, open to people across the planet, who will participate by rating stimuli (attracted by a facility for generating musical and visual stimuli). This will provide a very broad human control group when we turn to questions of animals' preferences (or lack thereof). Internet-based experimentation promises a vast number of subjects and considerable cultural diversity, and will support far more robust generalizations about human capabilities and preferences. PhD students with a background in computer programming and web design are sought for this project. All software developed will be open-source and made freely available to other laboratories.
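One way such a site could test for cultural contingency is to aggregate submitted ratings by participant region and compare the per-region means. The sketch below is hypothetical in every detail (region labels, rating scale, and the demographic fields recorded would all depend on the study design and ethics protocol):

```python
from collections import defaultdict

def summarize_ratings(ratings):
    """Given (region, rating) pairs, return the mean rating per region.

    Region labels here are hypothetical placeholders; the real site would
    record whatever demographic fields the ethics protocol allows.
    """
    by_region = defaultdict(list)
    for region, value in ratings:
        by_region[region].append(value)
    return {region: sum(vs) / len(vs) for region, vs in by_region.items()}
```

Comparable per-region summaries are what would let the human data serve as a cross-cultural control group for the animal preference experiments.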