Project Proposals for William Flynn Scholarship:
Project Number: 1
Project Title: Bio-Inspired Systems to Aid The Human Senses (Sense-Maker)
Project Supervisor: Dr. Liam Maguire, Professor Martin McGinnity
1. Objectives
The aim of this project will be to conceive and implement electronic
architectures that are able to merge sensory information sampled
through different modalities into a unified perceptual representation
of our environment. The architectural design will be based on the
biological principles of sensory receptor and nervous system function.
At a higher cognitive level (e.g. categorical classification) this
representation of the proximal environmental space would be largely
independent of the selected sensory substrates. A major objective
of the project would be to enable a system to reconfigure itself,
forming supplementary cross-connections between the sensory receptor
level of a given type and the higher stages of processing specific
to another sensory modality. The ultimate ambition is to create
new senses to supplement the existing ones, e.g. allowing one to see
the thunder or hear the lightning.
2. Rationale and Methodology
Natural environments do not contain explicit labelling signals but
the brain is able to extract correlated information from sensory
representations elaborated simultaneously using different sensory
modalities. The "sense-maker" system would choose a minimal
set of sensory modalities from a predefined library, the combination
of which would lead to reliable object/environment discrimination
and identification. The natural senses to be emulated are vision,
audition or echolocation, electrolocation, haptic and proprioceptive
senses, and internal representation of motor command. The composite
perceptual system would adapt its computational architecture as
a function of unpredicted changes in the environment, or when faced
with the partial impairment of some of the specialised sensors.
The system could switch its spectrum of sensory competence, by selecting
the sensory channels where the signal/noise ratio is optimal at
any given point in time. It could also benefit from cross-modal expertise
to supervise the learning and calibration of artificial sensors
that might be plugged in at a later stage to complement the already
built-in bio-inspired modalities.
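To make the channel-selection idea concrete, the following is a minimal Python sketch of how recent sample windows from each modality could be ranked by a crude signal/noise estimate and the best-performing channels retained; the channel names, the SNR estimator, and the selection limit are illustrative assumptions rather than part of the proposal.

    import numpy as np

    def estimate_snr(window):
        # Crude windowed estimate: squared mean as "signal", variance as "noise".
        # A real sensor model would replace this with a calibrated estimator.
        return (np.mean(window) ** 2) / (np.var(window) + 1e-12)

    def select_modalities(channel_windows, max_channels=2):
        # Rank modalities by current SNR and keep the best max_channels of them.
        ranked = sorted(channel_windows,
                        key=lambda name: estimate_snr(channel_windows[name]),
                        reverse=True)
        return ranked[:max_channels]

    # Hypothetical situation: vision is degraded (noisy), echolocation is clean.
    rng = np.random.default_rng(0)
    windows = {
        "vision":          1.0 + rng.normal(0.0, 2.0, 200),
        "echolocation":    1.0 + rng.normal(0.0, 0.1, 200),
        "electrolocation": 0.5 + rng.normal(0.0, 0.5, 200),
    }
    print(select_modalities(windows))   # -> ['echolocation', 'electrolocation']
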
3. Approach
The sense-maker system would be composed of several stages, modelled
on biological architectural principles:
The lower level will correspond to different sensory input layers,
each associated with a single modality (vision, audition, electroreception,
etc.), working in parallel.
As in the brain, an intermediate stage will regulate input-output
transfer within each modality channel, depending on the rate of
change in the input message, on motor/probing activity, and on the
immediate need to allocate resources for internal processing.
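As a rough illustration of this regulation, the toy function below (Python; every constant is an assumption made for the example) scales a channel's transfer gain with the rate of change of its input, with whether a probing action is in progress, and with a global processing budget.

    def regulate_transfer(prev_input, new_input, probing_active, budget=1.0):
        # Rate of change of the input message: fast-changing (novel) signals
        # are passed through, static ones are attenuated.
        rate_of_change = abs(new_input - prev_input)
        gain = min(1.0, rate_of_change) * budget
        # An ongoing probing action raises the expected information content,
        # so the gate is opened a little further (factor is illustrative).
        if probing_active:
            gain = min(1.0, gain * 1.5)
        return gain * new_input

    print(regulate_transfer(prev_input=0.2, new_input=0.9, probing_active=True, budget=0.8))
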
Biology-inspired plasticity algorithms will be used to discriminate
novel information during motor/probing strategies planned by the
higher operational levels of the artificial analyser. For instance,
fast anti-Hebbian plasticity, based on negative correlation between
actual input and motor-command-derived internal prediction of the
expected sensory input, has been demonstrated to operate in the
electrosensory lobe of the electric fish. Similar hypotheses have
also been proposed in the visuo-oculomotor system of higher vertebrates.
The motor action exerted by the perceptive system here would be
an intermittent probing action of modulable frequency, designed to
change the spatial relationship between the artificial observer
and the environment in a predictable way. The resulting sensory
effect will be compared with that predicted on the basis of the
internal copy of the motor command and the knowledge of the recently
sampled environment. The expected result would be the selection
of whatever part of the sensory representation needs to be updated,
reducing the number of computational operations to be performed.
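The predict-and-compare loop described above can be sketched numerically as follows (Python). Here the negative-image mechanism is approximated by a simple error-correcting update of a linear predictor driven by the internal copy of the motor command; the learning rate, noise level, and update threshold are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n_sensors = 8
    w = np.zeros(n_sensors)                        # predictor: motor command -> expected sensory effect
    true_gain = rng.normal(0.0, 1.0, n_sensors)    # unknown sensory consequence of the probe
    eta = 0.1                                      # illustrative learning rate
    threshold = 0.2                                # only strongly unpredicted channels are updated

    for step in range(200):
        motor_command = rng.uniform(0.5, 1.5)                     # intermittent probing action
        prediction = w * motor_command                            # internal copy of the motor command
        actual = true_gain * motor_command + rng.normal(0.0, 0.05, n_sensors)
        error = actual - prediction                               # unpredicted part of the sensory input
        w += eta * error * motor_command                          # cancel the predictable component
        to_update = np.flatnonzero(np.abs(error) > threshold)     # representation entries needing update

    print("mean residual prediction error:", round(float(np.abs(error).mean()), 3))
    print("channels still flagged for update:", to_update.size)
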
The cerebral cortex competently classifies unimodal stimuli while
keeping the different modalities largely separate. An upper layer,
organised as a sensory neocortex, would be used for each modality
to achieve contrast gain adaptation and to extract the key features
needed to form a primal sketch.
Feedback connections from higher- to lower-order sensory areas
carry predictions that can be used to optimise the feedforward processing.
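A minimal sketch of such a unimodal cortical stage might combine the two ideas above: subtract the top-down prediction so that only the unexpected residual is propagated forward, and apply divisive normalisation as a simple stand-in for contrast gain adaptation (the normalisation constant and the example activations are assumptions).

    import numpy as np

    def cortical_stage(feedforward, topdown_prediction, sigma=0.1):
        # Feedback predictions optimise feedforward processing: only the
        # unpredicted residual is passed on.
        residual = feedforward - topdown_prediction
        # Divisive normalisation by the pooled residual activity acts as a
        # crude form of contrast gain adaptation.
        return residual / (sigma + np.abs(residual).mean())

    activations = np.array([0.9, 0.1, 0.05, 0.8])   # hypothetical feature-channel responses
    prediction  = np.array([0.8, 0.1, 0.0,  0.0])   # higher-area expectation of those responses
    print(np.round(cortical_stage(activations, prediction), 3))   # the unexpected channel dominates
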
Multimodality can also be used for the development of focused attention
and cross-modality interactions to provide the internal teacher
in self-supervised learning paradigms, for example where auditory
information (the "moo" of the cow) helps to recognise
the visual entity (shape of the cow). Accordingly, our artificial
system would include an additional layer in order to bind and store,
in a reversible manner, the most frequent cross-sensory associations
and the most predictable context gathered from recent statistics
taken from the environment. Associative plasticity rules would be
inspired by biological spike-timing dependent plasticity algorithms
studied in the mammalian neocortex, and used to store long-term memories.
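As an example of the kind of associative rule meant here, the sketch below implements the classic pairwise exponential STDP window and applies it to a hypothetical auditory-to-visual connection in which the "moo" reliably precedes the cow-shape response; the amplitudes, time constant, ceiling, and pairing interval are illustrative assumptions.

    import numpy as np

    def stdp_weight_change(t_pre, t_post, a_plus=0.05, a_minus=0.055, tau=20.0):
        # Classic exponential STDP window (times in ms): pre-before-post
        # potentiates the connection, post-before-pre depresses it.
        dt = t_post - t_pre
        if dt >= 0:
            return a_plus * np.exp(-dt / tau)
        return -a_minus * np.exp(dt / tau)

    # Hypothetical cross-modal pairing: the auditory 'moo' unit fires ~10 ms
    # before the visual cow-shape unit, over repeated encounters.
    w = 0.1
    for _ in range(50):
        w = min(1.0, w + stdp_weight_change(t_pre=0.0, t_post=10.0))   # clamp at an assumed ceiling
    print(round(w, 3))   # the association is strengthened towards its ceiling
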
The topmost layer will constitute the intelligent processor of
the analyser, and will decide (in a top-down way) how to distribute
the focus of attention (and the allocation of computational power) in the
processing. The system could thus decide
the number and locations of hot spots (or foveas) necessary to extract
the most relevant information. Its adaptive capacity will be used
to reconfigure the focus of processing on the receptor sheet in
an equivalent manner to the formation of a preferred substitute
pseudo-fovea in macular disease patients. A third major task for
the controller would be to define cognitive rules, which determine
the selection and the change in balance between modalities in the
sensory fusion process. Both processes (change in spatial focus
of attention, and change in sensory modality spectrum/balance) would
of course deal with environmental disturbances and receptor sheet
alterations, and would be updated at a frequency dictated by the
probing action.
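The two decisions assigned to this top layer could be expressed, very schematically, as in the Python sketch below: a few processing hot spots (foveas) are placed where a recent novelty map is largest, and the per-modality fusion weights are renormalised from current reliability scores; the novelty map and reliability values are invented for the example.

    import numpy as np

    def place_foveas(novelty_map, n_foveas=2):
        # Return the coordinates of the n_foveas most novel locations.
        flat = np.argsort(novelty_map, axis=None)[::-1][:n_foveas]
        return [tuple(map(int, np.unravel_index(i, novelty_map.shape))) for i in flat]

    def rebalance_modalities(reliabilities):
        # Normalise per-modality reliability scores into fusion weights.
        total = sum(reliabilities.values())
        return {name: score / total for name, score in reliabilities.items()}

    novelty = np.array([[0.1, 0.9, 0.2],
                        [0.0, 0.3, 0.8]])
    print(place_foveas(novelty))                     # -> [(0, 1), (1, 2)]
    print(rebalance_modalities({"vision": 0.2, "audition": 0.6, "electrolocation": 0.2}))
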
The Intelligent Systems Engineering Laboratory at the University
of Ulster will concentrate on an investigation of autonomously adaptable,
evolvable computing architectures to implement such a system. The
project will require input from biological neuroscientists currently
co-operating with the ISEL group.
4. Expected Benefits
The expected end-result is to define and evaluate prototypical cross-linked
architectures at the perceptual level, resulting in a stable integrated
environmental representation largely independent of the modalities
initially used to process the various input sources. The potential
applications are numerous, including: neuromorphic artificial intelligence;
sensory substitution wearable technology; hybrid technology to restore
sensory loss in humans; cross-modal learning; a better understanding
of information processing and function in the adult brain. A final
major benefit would be a step to a higher level of communication
and interaction between the engineering, physics, psychology, and
biological scientific communities.
If you are interested in being considered for a studentship, please contact
the Group Director, Professor T.M. McGinnity, by email:
tm.mcginnity@ulst.ac.uk
or by telephone: +44-(0)28-71375417.
See the current research section of this website
for details on research projects pursued by existing PhD students.