M. M. Iskarous, Z. Chaudhry, F. Li, S. Bello, S. Sankar, A. Slepyan, N. Chugh, C. L. Hunt, R. Greene, and N. V. Thakor
Advanced Intelligent Systems
Humans possess an exquisite sense of touch, which robotic and prosthetic systems aim to replicate. Algorithms are developed to create neuron-like (neuromorphic) spiking representations of texture that are invariant to the scanning speed and contact force applied in the sensing process. These spiking representations mimic the activity of mechanoreceptors in human skin and subsequent processing up to the brain. The algorithms are tested on a tactile texture dataset collected under 15 speed–force conditions. An offline texture classification system based on the invariant representations demonstrates higher classification accuracy, improved computational efficiency, and enhanced capability to identify textures explored in novel speed–force conditions. The speed-invariance algorithm is adapted for a real-time human-operated texture classification system. In this system, invariant representations again improve classification accuracy, computational efficiency, and the ability to identify textures encountered under novel conditions. The invariant representation is particularly critical in this context, as human imprecision can be perceived as a novel condition by the classification system. These results show that invariant neuromorphic representations enable superior performance in neurorobotic sensing systems. Additionally, because the neuromorphic representations are grounded in biological processing, this work can serve as the basis for naturalistic sensory feedback for upper limb amputees.
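The "neuron-like (neuromorphic) spiking representation" mentioned above can be illustrated with a simple spiking-neuron encoder driven by a tactile sensor trace. The sketch below uses an Izhikevich neuron with generic regular-spiking parameters; the neuron model, parameters, and input are illustrative assumptions, not the paper's actual encoding algorithm.

```python
import numpy as np

def izhikevich_encode(current, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Convert an input current trace (one sample per ms) into spike times.

    Generic Izhikevich regular-spiking parameters; illustrative only."""
    v, u = c, b * c          # membrane potential and recovery variable
    spikes = []
    for t, i in enumerate(current):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + i)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike threshold crossed
            spikes.append(t * dt)
            v, u = c, u + d  # reset after the spike
    return spikes

# A constant drive (standing in for steady contact force) yields tonic spiking.
spikes = izhikevich_encode(np.full(500, 10.0))
```

In the actual pipeline, the input current would come from a tactile sensor, and downstream processing would operate on the resulting spike trains rather than the raw analog signal.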
S. Sankar, W. Cheng, J. Zhang, A. Slepyan, M. M. Iskarous, R. Greene, R. DeBrabander, J. Chen, A. Gupta, and N. V. Thakor
Science Advances
The human hand’s hybrid structure combines soft and rigid anatomy to provide strength and compliance for versatile object grasping. Tactile sensing by skin mechanoreceptors enables precise and dynamic manipulation. Attempts to replicate the human hand have fallen short of a true biomimetic hybrid robotic hand with tactile sensing. We introduce a natural prosthetic hand composed of soft robotic joints and a rigid endoskeleton with three independent neuromorphic tactile sensing layers inspired by human physiology. Our innovative design capitalizes on the strengths of both soft and rigid robots, enabling the hybrid robotic hand to compliantly grasp numerous everyday objects of varying surface textures, weight, and compliance while differentiating them with 99.69% average classification accuracy. The hybrid robotic hand with multilayered tactile sensing achieved 98.38% average classification accuracy in a texture discrimination task, surpassing soft robotic and rigid prosthetic fingers. Controlled via electromyography, our transformative prosthetic hand allows individuals with upper-limb loss to grasp compliant objects with precise surface texture detection.
E. Rahiminejad, A. Parvizi-Fard, M. M. Iskarous, N. V. Thakor, and M. Amiri
Transactions on Neural Systems and Rehabilitation Engineering
One major challenge in upper limb prostheses is providing sensory feedback to amputees. Reproducing the spiking patterns of human primary tactile afferents can be considered a first step toward solving this challenging problem. In this study, a novel biomimetic circuit for SA-I and RA-I afferents is proposed to functionally replicate the spiking response of the biological tactile afferents to indentation stimuli. The circuit has been designed, laid out, and simulated in TSMC 180 nm CMOS technology with a 1.8 V supply voltage. A pair of SA-I and RA-I afferent circuits consumes 3.5 μW of power. The occupied silicon area is 180 μm × 220 μm for 32 afferents. To provide the inputs for circuit testing, a patch of skin with a grid of mechanoreceptors is simulated and tested with an edge stimulus presented at different orientations. Experimental data are collected by indenting 3D-printed edges at different orientations on a tactile sensor mounted on a robotic arm. Inspired by innervation patterns observed in biology, the artificial afferents are connected to several neighboring mechanoreceptors with different weights to form complex receptive fields that cover the entire mechanoreceptor grid. Machine learning algorithms are applied offline to classify the edge orientations based on the pattern of neural responses. Our results show that the complex receptive fields arising from the innervation pattern lead to smaller circuit area and lower power consumption while facilitating data encoding from high-resolution sensors. The proposed biomimetic circuit and tactile encoding example demonstrate potential applications in modern tactile sensing modules for developing novel bio-robotic and prosthetic technologies.
A. Parvizi-Fard, M. Amiri, D. Kumar, M. M. Iskarous, and N. V. Thakor.
Scientific Reports
To obtain deeper insights into the tactile processing pathway from a population-level point of view, we have modeled three stages of the tactile pathway from the periphery to the cortex in response to indentation and scanned edge stimuli at different orientations. The three stages of the tactile pathway are (1) the first-order neurons that innervate the cutaneous mechanoreceptors, (2) the cuneate nucleus in the brainstem, and (3) the cortical neurons of the somatosensory area. In the proposed network, the first layer mimics the spiking patterns generated by the primary afferents, which have complex skin receptive fields. In the second layer, the role of lateral inhibition on projection neurons in the cuneate nucleus is investigated. The third layer acts as a biomimetic decoder consisting of pyramidal and cortical interneurons that correspond to heterogeneous receptive fields with excitatory and inhibitory sub-regions on the skin. In this way, the activity of pyramidal neurons is tuned to specific edge orientations. By modifying afferent receptive field size, it is observed that larger receptive fields convey more information about edge orientation in the first spikes of cortical neurons when edge stimuli move across the patch of skin. In addition, the proposed spiking neural model can detect edge orientation at any location on the simulated mechanoreceptor grid with high accuracy. The results of this research advance our knowledge of tactile information processing and can be employed in prosthetic and bio-robotic applications.
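The lateral inhibition investigated in the second (cuneate) layer can be sketched as each projection neuron's drive minus a weighted sum of its neighbors' activity. The 1D ring topology and inhibition weight below are illustrative assumptions, not the model's actual connectivity.

```python
import numpy as np

def lateral_inhibition(activity, w_inhib=0.3):
    """Subtract scaled neighbor activity from each unit (1D ring, rectified)."""
    neighbors = np.roll(activity, 1) + np.roll(activity, -1)
    return np.maximum(activity - w_inhib * neighbors, 0.0)

# A localized bump of activity gets sharpened: weak flanking responses
# are suppressed while the peak survives.
out = lateral_inhibition(np.array([1.0, 0.5, 0.0, 0.0]))
```

This kind of winner-sharpening is what makes the projection-neuron layer more selective for edge location and orientation than the afferent layer alone.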
L. E. Osborn, M. A. Hays, R. Bose, A. Dragomir, M. M. Iskarous, Z. Tayeb, G. M. Levay, C. L. Hunt, M. S. Fifer, G. Cheng, A. Bezerianos, and N. V. Thakor
Journal of Neural Engineering
Objective. A major challenge for controlling a prosthetic arm is communication between the device and the user's phantom limb. We show the ability to enhance phantom limb perception and improve movement decoding through targeted transcutaneous electrical nerve stimulation in individuals with an arm amputation. Approach. Transcutaneous nerve stimulation experiments were performed with four participants with arm amputation to map phantom limb perception. We measured myoelectric signals during phantom hand movements before and after participants received sensory stimulation. Using electroencephalogram (EEG) monitoring, we measured the neural activity in sensorimotor regions during phantom movements and stimulation. In one participant, we also tracked sensory mapping over 2 years and movement decoding performance over 1 year. Main results. Results show improvements in the participants' ability to perceive and move the phantom hand as a result of sensory stimulation, which leads to improved movement decoding. In the extended study with one participant, we found that sensory mapping remains stable over 2 years. Sensory stimulation improves within-day movement decoding while performance remains stable over 1 year. From the EEG, we observed cortical correlates of sensorimotor integration and increased motor-related neural activity as a result of enhanced phantom limb perception. Significance. This work demonstrates that phantom limb perception influences prosthesis control and can benefit from targeted nerve stimulation. These findings have implications for improving prosthesis usability and function due to a heightened sense of the phantom hand.
S. Bello, M. M. Iskarous, S. Sankar, and N. V. Thakor
L. Osborn, M. Iskarous, and N. Thakor
Wearable Robotics
Recent research has led to advances in sophisticated prosthetic devices, and the associated scientific and technological challenges have opened up new avenues of exploration. Prosthetic hand research is largely driven by clinical needs to achieve higher user functionality and better outcome measures. The persistent questions at the forefront of both clinical and research applications are how to improve prosthesis control and how to provide sensory feedback. Restoring a sense of touch can help upper limb amputees make better use of their prosthetic hands. This chapter provides a brief overview of sensing and control technology for hand prostheses, specifically as it relates to clinical and research applications. We discuss signal processing techniques for decoding hand movement as well as efforts to provide natural sensory feedback. We address some of the major challenges faced by upper limb amputees and discuss how user needs have driven the technology forward.
M. M. Iskarous and N. V. Thakor
Proceedings of the IEEE
Today's prosthetic hands have an anthropomorphic, multi-fingered design. A common control method uses pattern recognition of electromyogram signals. However, these prostheses do not capture the human hand's sensory perception, which is critical for prosthesis embodiment and dexterous object manipulation. This problem can be addressed by a sensorized electronic skin (e-skin) composed of various sensors that transduce sensory percepts such as touch, pressure, temperature, and pain, just as human skin does. This review presents the physiology of the receptors that encode tactile, thermal, nociceptive, and proprioceptive information. The e-skin is designed to mimic these receptors and their responses. We review each sensor subtype and its design and performance when embedded in the e-skin. Next, we review the spiking response of the receptors, which is then relayed to sensory nerves and encoded by the brain as sensory percepts. The e-skin system is designed to produce neuromorphic, receptor-like spiking activity. Computational models that mimic these sensory nerve signals are presented, and then various methods to interface with the nervous system are explored and compared. We conclude the review with the state of the art in e-skin design and deployment in closed-loop applications that demonstrate the benefits of sensory feedback for amputees.
Z. Chaudhry, F. Li, M. Iskarous, and N. V. Thakor
11th International IEEE/EMBS Conference on Neural Engineering (NER)
Tactile sensing is an active area of research in robotics and neural engineering, particularly in relation to sensory feedback for neural prostheses. Sensory feedback relies on neuromorphic models for touch, which must be characterized and validated through tactile sensing experiments. Currently, no standardized, automated method exists for performing these experiments. Thus, there exists a need for new methods/workflows for providing tactile stimulation in neuromorphic tactile sensing. In this work, we describe a rotary-drum tactile stimulator that provides complete user control over force and velocity setpoints and applied textures using PID tuning and an interchangeable, snap-in 3D-printed texture plate system. We achieve high accuracy and precision closed-loop force control (3.4% average deviation in force between first and last ten trials with 4.2% standard deviation) and open-loop velocity control (4.6% average deviation from velocity setpoint with 2.6% standard deviation). Additionally, the apparatus features an automated data pipeline, which records analog tactile sensor readings at each experimental condition, automatically segments them into individual palpation trials, and transforms them into neuromorphic spiking activity. Though designed to develop neuromorphic models of touch for prostheses, the apparatus is generalizable to a wide array of neural engineering experiments, including characterizing tactile sensors and generating tactile sensing databases.
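The closed-loop force control described above relies on PID tuning. A minimal sketch of a discrete PID controller of the kind such a stimulator could use is below; the gains, time step, and interfaces are hypothetical, not the published implementation.

```python
class PID:
    """Discrete PID controller: output drives the actuator toward a setpoint."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # accumulate error
        derivative = (error - self.prev_error) / self.dt  # error slope
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Each control cycle: read the force sensor, compute the correction,
# and command the drum motor accordingly.
pid = PID(kp=1.0, ki=0.0, kd=0.0, dt=0.01)
correction = pid.update(setpoint=1.0, measurement=0.0)
```

In practice the integral term would also need anti-windup clamping when the actuator saturates, which is omitted here for brevity.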
Y. Tian, A. Slepyan, M. M. Iskarous, S. Sankar, C. L. Hunt and N. V. Thakor
IEEE Biomedical Circuits and Systems Conference (BioCAS)
Despite advances in motor control of current upper limb prosthetic devices, many prosthetic users still experience poor prosthesis embodiment due to a lack of sensory feedback. Previous studies have focused on novel tactile sensor development, neuron-like (neuromorphic) encoding, and sensory stimulation to develop sensorized prosthetics. However, the real-time dynamics of the tactile information and sensory feedback have not yet been addressed. In this work, we developed a neuromorphic closed-loop system that integrates tactile sensing and transcutaneous electrical nerve stimulation in real time. We demonstrate the dynamic characteristics of the sensory stimulation pulses that change in response to real-time neuromorphic tactile information, which is the foundation of enhanced sensory perception and dexterous motor control in upper limb amputees.
S. Sankar, A. Slepyan, M. M. Iskarous, W. Cheng, R. DeBrabander, J. Zhang, A. Gupta, and N. V. Thakor
IEEE Sensors Conference
Tactile sensing is an integral aspect of interacting with our environment, where textural information can be essential to properly manipulate objects. Soft robotic fingers are compliant, biomimetic solutions for upper-limb prostheses. Therefore, adding tactile sensing capabilities such as texture discrimination to a soft robotic finger can create a compliant and precise prosthetic hand for an amputee. In this study, we emulate a human fingertip by incorporating a flexible, biomimetic, multilayered tactile sensor into a soft robotic fingertip for a texture discrimination task in which thirteen textures were passively palpated. The sensor achieved a texture classification accuracy of 97.12% when its output was neuromorphically encoded to mimic the receptors in human skin. This work is a step toward a more human-like prosthetic hand that can enhance amputees' perception of and interaction with their environment.
M. M. Iskarous, S. Sankar, Q. Li, C. L. Hunt and N. V. Thakor
10th International IEEE/EMBS Conference on Neural Engineering (NER)
In this work, we develop an algorithm to optimally select a set of output spiking channels from a set of input sensing channels in order to deliver naturalistic sensory feedback to an amputee. The growing discrepancy between the number of sensing channels (~10³) and the number of stimulation channels (~10²) in neural interface devices leads to a combinatorially explosive search problem that quickly becomes intractable through brute force methods. This algorithm is tested using a dataset of 13 textures that was collected using a tactile sensor array with 9 tactile pixels (taxels). The sensor readings are transformed into neuron-like (neuromorphic) spiking activity that resembles the output of slowly adapting (SA) and rapidly adapting (RA) mechanoreceptors in the skin, which encode different features of tactile information. The channel selection algorithm uses a spike train distance metric to create an optimal set of output channels that are highly differentiated and complementary. The effectiveness of the channel selection algorithm is tested by comparing texture classification performance between the chosen set and a set of random channels. The selected channels have slightly better classification accuracy but a much smaller standard deviation. As the input channels become noisier and grow in number, the channel selection algorithm will provide even better accuracy than random selection at much lower computational cost than brute force search. This work paves the way for sensory feedback systems that deliver more informative stimulation patterns to amputees.
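The selection of highly differentiated, complementary channels can be sketched as greedy farthest-point selection over pairwise spike-train distances. Here a Euclidean distance between binned spike counts stands in for the paper's spike train distance metric, and the greedy rule is an illustrative assumption, not the published algorithm.

```python
import numpy as np

def select_channels(binned_spikes, k):
    """Greedily pick k mutually dissimilar channels.

    binned_spikes: (n_channels, n_bins) array of spike counts per time bin."""
    n = binned_spikes.shape[0]
    # Pairwise distances between all channels' binned spike trains.
    diff = binned_spikes[:, None, :] - binned_spikes[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Seed with the channel most distinct from all others overall.
    chosen = [int(np.argmax(dist.sum(axis=1)))]
    while len(chosen) < k:
        remaining = [i for i in range(n) if i not in chosen]
        # Add the channel farthest from its nearest already-chosen channel.
        best = max(remaining, key=lambda i: min(dist[i, j] for j in chosen))
        chosen.append(int(best))
    return sorted(chosen)
```

This greedy heuristic costs O(n²) distance evaluations rather than the combinatorial cost of exhaustively scoring every candidate subset.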
S. Sankar, A. Brown, D. Balamurugan, H. Nguyen, M. Iskarous, T. Simcox, D. Kumar, A. Nakagawa, and N. Thakor
IEEE Sensors Conference
Soft robotic fingers provide enhanced flexibility and dexterity when interacting with the environment. The capability of soft fingers can be further improved by integrating them with tactile sensors to discriminate various textured surfaces. In this work, a flexible 3×3 fabric-based tactile sensor array was integrated with a soft, biomimetic finger for a texture discrimination task. The finger palpated seven different textured plates and the corresponding tactile response was converted into neuromorphic spiking patterns, mimicking the firing pattern of mechanoreceptors in the skin. Spike-based feature metrics were used to classify different textures using the support vector machine (SVM) classifier. The sensor was able to achieve an accuracy of 99.21% when two features, mean spike rate and average inter-spike interval, from each taxel were used as inputs into the classifier. The experiment showed that an inexpensive, soft, biomimetic finger combined with the flexible tactile sensor array can potentially help users perceive their environment better.
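The two spike-based features named above, mean spike rate and average inter-spike interval, are straightforward to compute from a taxel's spike times. The sketch below is a minimal illustration with made-up spike times, not the paper's processing code; per-taxel features like these would then be concatenated and fed to an SVM classifier.

```python
import numpy as np

def spike_features(spike_times):
    """Return (mean spike rate in Hz, mean inter-spike interval in s)
    for one taxel's spike train, given spike times in seconds."""
    spike_times = np.asarray(spike_times, dtype=float)
    duration = spike_times[-1] - spike_times[0]
    rate = len(spike_times) / duration      # spikes per second
    mean_isi = np.diff(spike_times).mean()  # average gap between spikes
    return rate, mean_isi

rate, mean_isi = spike_features([0.0, 0.5, 1.0])
```

For a 3×3 array, the nine (rate, ISI) pairs give an 18-dimensional feature vector per palpation trial.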
M. Hays, L. Osborn, R. Ghosh, M. Iskarous, C. Hunt, and N. V. Thakor
9th International IEEE/EMBS Conference on Neural Engineering (NER)
A major issue with upper limb prostheses is the disconnect between sensory information perceived by the user and the information perceived by the prosthesis. Advances in prosthetic technology introduced tactile information for monitoring grasping activity, but visual information, a vital component in the human sensory system, is still not fully utilized as a form of feedback to the prosthesis. For able-bodied individuals, many of the decisions for grasping or manipulating an object, such as hand orientation and aperture, are made based on visual information before contact with the object. We show that inclusion of neuromorphic visual information, combined with tactile feedback, improves the ability and efficiency of both able-bodied and amputee subjects to pick up and manipulate everyday objects. We discovered that combining both visual and tactile information in a real-time closed loop feedback strategy generally decreased the completion time of a task involving picking up and manipulating objects compared to using a single modality for feedback. While the full benefit of the combined feedback was partially obscured by experimental inaccuracies of the visual classification system, we demonstrate that this fusion of neuromorphic signals from visual and tactile sensors can provide valuable feedback to a prosthetic arm for enhancing real-time function and usability.
M. M. Iskarous, H. H. Nguyen, L. E. Osborn, J. L. Betthauser, and N. V. Thakor
IEEE Biomedical Circuits and Systems Conference (BioCAS)
In this work, we investigated the classification of texture by neuromorphic tactile encoding and an unsupervised learning method. Additionally, we developed an adaptive classification algorithm to detect and characterize the presence of new texture data. The neuromorphic tactile encoding of textures from a multilayer tactile sensor was based on the physical structure and afferent spike signaling of human glabrous skin mechanoreceptors. We explored different neuromorphic spike pattern metrics and dimensionality reduction techniques in order to maximize classification accuracy while improving computational efficiency. Using a dataset composed of 3 textures, we showed that unsupervised learning of the neuromorphic tactile encoding data had high classification accuracy (mean = 86.46%, sd = 5.44%). Moreover, the adaptive classification algorithm was successful at determining that there were 3 underlying textures in the training dataset. In this work, tactile information is transformed into neuromorphic spiking activity that can be used as a stimulation pattern to elicit texture sensation for prosthesis users. Furthermore, we provide the basis for identifying new textures adaptively which can be used to actively modify stimulation patterns to improve texture discrimination for the user.
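The adaptive novelty detection implied above can be sketched as a distance-to-centroid check: a sample far from every known texture cluster is flagged as a new texture. The centroid representation and fixed threshold are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def classify_or_flag(sample, centroids, threshold):
    """Assign sample to the nearest known texture centroid, or return
    None to flag it as a candidate new texture."""
    d = np.linalg.norm(centroids - sample, axis=1)
    nearest = int(np.argmin(d))
    if d[nearest] > threshold:
        return None  # too far from every known texture: treat as novel
    return nearest

known = np.array([[0.0, 0.0], [10.0, 10.0]])  # centroids of 2 learned textures
label = classify_or_flag(np.array([0.5, 0.0]), known, threshold=2.0)
novel = classify_or_flag(np.array([5.0, 5.0]), known, threshold=2.0)
```

Flagged samples could then seed a new cluster, letting the classifier grow its texture set online.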
H. Nguyen, L. Osborn, M. Iskarous, C. Shallal, C. Hunt, J. Betthauser, and N. Thakor
IEEE Biomedical Circuits and Systems Conference (BioCAS)
Prosthetic limbs would benefit from tactile feedback to provide sensory information when interacting with the environment, such as adjusting grasps using force feedback or palpating texture. In this work, we demonstrate how a multilayer tactile sensor can be used for palpation, and enhance the ability to discriminate between touch interfaces. Inspired by mechanoreceptors in skin, the multilayer sensor consists of multiple textile force sensing elements. The novelty of this work lies in the application of a multilayer sensor, one that produces touch receptor like (neuromorphic) output, to texture classification by using a classifier based on sparse recovery. This approach is shown to be capable of palpation, achieving classification accuracies as high as 97% on a distinct texture set. Using compressed sensing and sparse recovery, the multilayer sensor can decode texture under dynamic conditions, potentially providing amputees the ability to perceive rich haptic information while using their prosthesis.
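The sparse-recovery style of classification described above can be sketched in simplified form: reconstruct the test sample from each texture class's training dictionary and pick the class with the smallest residual. This nearest-subspace version (least squares per class) is a stand-in for the paper's actual sparse recovery classifier, which enforces sparsity in the coefficients.

```python
import numpy as np

def src_classify(x, class_dicts):
    """Classify x by minimum reconstruction residual.

    class_dicts: one (n_features, n_train) dictionary per texture class."""
    residuals = []
    for D in class_dicts:
        # Least-squares coefficients reconstructing x from this class's samples.
        coeffs, *_ = np.linalg.lstsq(D, x, rcond=None)
        residuals.append(np.linalg.norm(x - D @ coeffs))
    return int(np.argmin(residuals))
```

A sample lying near one class's training subspace reconstructs well from that dictionary and poorly from the others, which is what makes the residual a usable decision statistic.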
M. M. Iskarous, L. J. Endsley, L. E. Miller, J. L. Collinger, N. G. Hatsopoulos, and J. E. Downey
11th International BCI Meeting
K. Ding, M. M. Iskarous, L. E. Osborn, B. P. Christie, D. D’Almeida, K. Yu, M. Fifer, P. Celnik, F. Tenore, B. Caffo, and N. V. Thakor
11th International BCI Meeting
K. Ding, M. M. Iskarous, L. E. Osborn, B. P. Christie, M. S. Fifer, P. A. Celnik, F. Tenore, and N. V. Thakor
Society for Neuroscience
K. Ding, M. M. Iskarous, L. E. Osborn, M. S. Fifer, B. P. Christie, P. A. Celnik, F. V. Tenore, and N. V. Thakor
Society for Neuroscience
K. Ding, M. M. Iskarous, L. E. Osborn, and N. V. Thakor
Society for Neuroscience
K. Quinn, M. M. Iskarous, C. Glass, A. L. Lowe, S. Tuffaha, and N. V. Thakor
Society for Neuroscience
C. L. Hunt, A. Sharma, M. M. Iskarous, and N. V. Thakor
IEEE Biomedical Circuits and Systems Conference (BioCAS)
The goal of this work is to improve the efficacy of upper-limb prosthesis training protocols using visual feedback based on tactile and proprioceptive sensorization in an augmented reality environment.
© 2025 Mark Iskarous
Built by Bailey Kane.