By Jillian Rose Lim, Nancy McCann, Barbara Drosey, and Sharlene George
Editor’s Note: Where Discovery Leads is a multimedia storytelling project that delves into key research themes at CHOP Research Institute. This is part one of a three-part series that focuses on novel diagnostic tools and approaches being developed under the leadership of the Center for Autism Research at CHOP. See part 2 and part 3 of the series.
The child’s facial movements are subtle but revealing. From barely a glance at his conversation partner to the hint of a furrowed brow as he pauses to answer how his morning is going, novel technology captures and analyzes the nuances of their interplay. A simple three-minute chat becomes a powerful window of insight to support more accurate assessments of the complexities of autism spectrum disorder (ASD).
Social synchrony is diminished in ASD, making it a key measure to track and analyze in individuals participating in ASD research. At the Technology and Innovation Lab within the Center for Autism Research (CAR) at Children’s Hospital of Philadelphia, Keith Bartley, a research scientist, is part of an “Ocean’s Eleven-style” team of computer scientists, clinician-researchers, and postdoctoral fellows who have developed a fully automated method to capture, digitize, and analyze social communication between two individuals.
“The social aspect [of ASD] is one of the biggest detriments to a child’s life,” said Bartley, the inventor of the device behind the team’s innovative digital phenotyping approach. “But being able to look at autism specifically through the lens of social behavior is something that hasn’t been done very well, and not done at all with this level of computer vision application.”
A biometric sensor camera nicknamed the “SensorTree” directly measures, with exquisite digital precision and objectivity, the outward manifestations of social interaction problems in children with ASD that clinicians can observe, such as atypical eye contact and reduced nonverbal communication. Then, machine learning algorithms turn those granular behaviors into time-synced data, giving autism researchers a novel and accurate way to quantify and characterize different dimensions of social impairment.
The National Institute of Mental Health awarded CAR’s Technology and Innovation Lab a grant to further test and enhance the digital phenotyping method, with the goal of achieving a more quantifiable way to characterize the constellation of behaviors that contribute to a diagnosis of autism. Every child with ASD presents differently, and until now, the diagnostic process has relied mostly on human judgments, which can be imprecise. The lack of scientifically reliable behavioral data also creates challenges when it comes to pinning down the essence or fundamental characteristics that define autism as a whole.
“I’ve been studying autism for almost 30 years, and during that time, the field has focused on understanding autism in one specific way within the framework of its definition in the [Diagnostic and Statistical Manual of Mental Disorders],” said Robert Schultz, PhD, scientific director of CAR and principal investigator of the new project. “We have come to understand autism. But we’ve never had a ‘dust bowl empiricism’ approach. We’ve never had a chance to collect large, unbiased samples with raw primary data of human behavior or analytic models to ask: What are ASD’s underlying dimensions? With emerging technologies, that opportunity is finally becoming available to us, and with it we have the potential to vastly improve the way we understand and treat ASD.”
Screening for Autism: Improving Current Methods and Developing New Ones
Because the Centers for Disease Control and Prevention estimates autism prevalence at one in 59, autism experts are constantly striving for earlier identification, earlier treatment, and better outcomes for children and adults on the autism spectrum. While screening tools exist, and the average age of diagnosis is declining, identifying autism very early in life remains challenging.
It’s only been a little over 10 years (2007) since the American Academy of Pediatrics (AAP) recommended universal screening for autism at all 18- and 24-month well-child visits. (See Timeline at end of story.) Susan Levy, MD, MPH, founder and director of the Regional Autism Center at CHOP, was co-author of the study that prompted the AAP’s recommendation. At CHOP, 91 percent of toddlers are screened at their 18- and/or 24-month well-child visit, and the results are entered immediately into the electronic health record.
“It’s because of these records that we know universal screening is incredibly feasible when you build it directly into the health record,” said Judith Miller, PhD, senior scientist and training director at CAR. Successful screening opens the door to both improving current practice and informing future breakthroughs. “Even though screening methods right now are not perfect, we know that the children identified through screening received an earlier diagnosis than others.”
CAR researchers are studying what happens after screening and what ultimately leads to the best outcomes. For example, Kate Wallis, MD, MPH, a developmental and behavioral pediatrician at CHOP, has been researching whether the children who screen positive get all the referrals recommended by the AAP. She is finding that referral rates are far lower than expected, suggesting that making the referral process easier and more automated would likely improve overall care.
“We’re also seeing the limitations of our current screening tools,” Dr. Miller said. “We’re able to look back, now that those kids have grown up, and see what their screening results were as toddlers. We found that a lot of them ended up with a diagnosis of autism later on, even though they screened negative. Today, even though screening seems to be helping some kids, we’re actually missing more kids early on (60 percent) than we’re catching.”
Dr. Miller is senior author on an in-press Pediatrics study led by Whitney Guthrie, PhD, a scientist and co-director of the Data and Statistical Core at CAR, which elaborates on these findings.
“A large limitation of our current screening tools is the yes-no nature of these instruments; they’re not granular enough,” Dr. Guthrie said. “The crux of autism is this: Kids with autism tend to show social behaviors less frequently, less appropriately, and in a less nuanced way than we would expect for their age. That’s pretty complex. To narrow that down to a yes or no answer is just too hard. But we think we can leverage technology to improve how we measure autism symptoms.”
One possible technology to improve screening is a simple phone app, created by Dr. Miller with help from the Department of Biomedical and Health Informatics at CHOP and Drexel University co-op students. (See Part 2 of this series.)
But another way to improve screening may be to incorporate more of the information pediatricians are already collecting during the well-child visits, and combine it with screening results to improve accuracy. For example, children with known risk factors for autism, like having an older sibling with autism, may need different screening cutoff scores.
“We know the best thing we can do is to start intervention under age 3 to capitalize on early brain development,” Dr. Guthrie said. “That’s the best strategy we have to help kids to reach their highest potential. The average age of diagnosis in the United States today is 4. We need to screen earlier, and we need to screen better.”
Moving Beyond the Science of Opinion
Using CAR’s digital phenotyping approach, Dr. Schultz hopes to build a new empirically derived taxonomy for ASD that can complement advances in genetics and brain imaging of ASD, and contribute to early detection, precision medicine, and assessing how autism evolves in an individual. Building that taxonomy, he said, begins with unbiased behavioral data collected by the SensorTree — and lots of it.
The most recent prototype of the SensorTree is small, portable, and light enough to be transported and placed anywhere, such as atop a psychologist’s or doctor’s desk. The device itself is smooth in texture and simple in aesthetic, designed to be as unobtrusive as possible when measuring the bidirectional timing of behaviors between partners in the most common and natural setting: a short, unscripted casual conversation.
As the two individuals converse, the camera employs markerless motion-capture technology to record the ultra-fine micro movements of the pair’s faces across 180 “bases,” or predefined spaces where motor activity occurs. These movements might include the direction of the eyes, the uptick of an eyebrow, or how far the mouth moves in response to the other person’s movements.
Specifically, the SensorTree captures interpersonal motor synchrony, or the tendency for social partners to instinctively and dynamically coordinate their movements. When two people converse, their facial movements, body language, and tone of voice tend to naturally respond and react in rhythmic coordination, much like birds flocking or fish schooling.
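The synchrony idea can be made concrete with a toy calculation. The sketch below is purely illustrative, not CAR’s actual pipeline: the function names, the Pearson-correlation measure, and the simple lag search are all assumptions. It scores two partners’ movement traces by the strongest correlation found over a small window of time offsets, so that either partner may lead or follow:

```python
def pearson(x, y):
    """Pearson correlation between two equal-length, non-constant sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def lagged_correlation(a, b, lag):
    """Correlation after offsetting the two signals by `lag` frames."""
    if lag > 0:
        a, b = a[lag:], b[:-lag]
    elif lag < 0:
        a, b = a[:lag], b[-lag:]
    return pearson(a, b)

def peak_synchrony(a, b, max_lag=5):
    """Best correlation over a window of lags, letting either partner lead."""
    return max(lagged_correlation(a, b, lag) for lag in range(-max_lag, max_lag + 1))
```

Perfectly mirrored movement at any tested lag scores 1.0; in practice a system like the SensorTree would track many facial channels at once and use far richer models than a single correlation.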
“Autism is fundamentally a disorder of social interaction; therefore, the most salient features of ASD occur within the interpersonal dynamics of interactions,” said Casey Zampella, PhD, a postdoctoral fellow at CAR who is using the SensorTree to study conversational patterns. “However, most research to date has focused exclusively on the behaviors of individuals with ASD, rather than on bidirectional social processes. The novelty of our approach is its direct measurement of behaviors that naturally unfold between social partners as they interact, allowing us to capture much more information about the core nature of ASD.”
Learning From Language
Julia Parish-Morris, PhD, director of CAR’s Quantitative Linguistics Lab, is in the process of analyzing speech samples collected during the same conversations that provided the computer vision scientists with video for analyses. Using high-dimensional computational approaches to quantify and analyze linguistic markers of neurodevelopmental variation is particularly important for understudied or underserved subgroups, such as girls with autism.
“I want to use this amazing biosensor and use computational linguistics as a method to detect subtle differences that might not be obvious to the naked ear,” Dr. Parish-Morris said. “Ultimately, I’d like to develop targeted social communication interventions that are personalized to the unique challenges that girls with autism face.” Learn more in the Breaking Through podcast.
As a language researcher, Dr. Parish-Morris studies everything from the acoustic properties of individuals’ voices, to the spacing between their words, and their word choices. Recently, she joined collaborators from the Linguistic Data Consortium at the University of Pennsylvania, the Department of Speech-Language Pathology at CHOP, and CAR to take a big look at the little word “um.”
Prior research shows that autistic children use a pragmatic marker — “um” — less than typically developing children during conversation, and that reduced “um” use correlates with greater social impairment. However, these studies rarely included girls. When girls were included in the research conducted at CAR, a sex difference emerged: Autistic girls produced “um” ratios similar to typically developing boys and girls. This suggests previous research that has shown an association between lower rates of “um” and poorer social skills may only hold true for autistic boys.
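For illustration only, a filler-word measure like the one described can be approximated as filler tokens per 100 words. The function name and the crude tokenizer below are assumptions; the published analyses are more sophisticated:

```python
import re

def um_ratio(transcript, fillers=("um",)):
    """Filler tokens per 100 words in a transcript (crude regex tokenizer)."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return 0.0
    return 100.0 * sum(w in fillers for w in words) / len(words)
```

A per-100-words rate lets researchers compare speakers fairly even when conversations differ in length, which matters when contrasting groups such as autistic girls and boys.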
In another study, Dr. Parish-Morris used the SensorTree technology to analyze whether autistic girls and autistic boys used personal pronouns in similar ways during a five-minute natural conversation. The results showed that overall, children with autism used fewer personal pronouns than typically developing children, especially “we.” But again, this difference was driven by males, not females.
“What I’m finding is that most of what historically has been known and accepted about language behaviors in autism is only true for autistic boys,” Dr. Parish-Morris said. “Verbal communication in autistic girls is not well understood.”
Tapping into the SensorTree as a quantitative assessment tool and compiling a larger pool of data about autistic girls than has been previously available could help to uncover how social communication works in autistic girls. It’s possible that verbally fluent girls who are as severely affected by autism as boys may appear less impaired “to the naked ear” in a variety of linguistic domains, which could impact girls’ likelihood of being referred, diagnosed, and treated for ASD.
In addition to focusing on sex differences in communication, Dr. Parish-Morris is capitalizing on the machine learning techniques being developed for use with the biosensor to analyze thousands of audio samples from hundreds of individuals ages 0 to 60 years. The overarching goal of this effort is to extract linguistic characteristics that constitute a person’s vocal signature.
For example, previous research has shown infants who go on to be diagnosed with ASD cry and babble differently than typically developing infants. New funding from the National Institutes of Health will allow CAR researchers to build on this foundational work and identify early “vocal biomarkers” of ASD.
Granular analysis of this vocal data could lead to powerful tools to evaluate, monitor, and ultimately inform real-world interventions that improve everyday functioning in autism and other neuropsychiatric conditions across the lifespan.
Creating Dynamic Digital ASD Profiles
The SensorTree collects and integrates the audiovisual data with information on the child’s heart rate, skin conductance, movements, and more. Once recorded, the data is offloaded into a machine learning pipeline to be analyzed using advanced algorithms. CAR Technology and Innovation Lab members Evangelos Sariyanidi, PhD; Bartley; and Birkan Tunç, PhD, created these algorithms out of reams of raw behavioral data collected at CAR, and they are the core of what makes the method unique.
“Our massive amounts of data are truly our most valuable intellectual property,” Bartley said. “That is what the NIH grant is really about. We are building a massive database with mountains of data from children with autism spectrum disorder in conversation, who have been diagnosed by experts with the highest degree of clinical training and research rigor.”
Dr. Sariyanidi and his colleagues found that the computer’s assessment of facial motor synchrony predicted autism diagnosis with 90 percent accuracy, remarkably higher than the accuracy achieved by highly trained autism clinical experts assessing the same video recordings. Motor synchrony features gleaned from analyses of adults also accurately predicted diagnosis in a child sample, as well as differences in severity for each person within the ASD group. These results suggest the SensorTree can successfully aid in diagnostic classification and in evaluating severity differences.
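As a loose illustration of how per-conversation synchrony scores might feed a diagnostic classifier: the nearest-centroid approach and every name below are hypothetical stand-ins, not the team’s actual models, which are far richer. Each conversation is summarized as a small feature vector, and a new conversation is assigned to the group whose average feature profile it most resembles:

```python
def nearest_centroid_fit(features, labels):
    """Average the feature vectors for each label into a centroid."""
    sums, counts = {}, {}
    for x, y in zip(features, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def nearest_centroid_predict(centroids, x):
    """Assign x to the label whose centroid is closest (squared Euclidean)."""
    return min(centroids, key=lambda y: sum((a - b) ** 2 for a, b in zip(centroids[y], x)))
```

Even this toy version makes the larger point: once behavior is reduced to numbers, diagnostic prediction becomes a data problem that improves as the database grows.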
The team is now working toward individual-level predictions, both at a given point in time and across time as individuals undergo new treatments. One of the real promises of digital phenotyping is its use as a clinical outcome measure in new treatment trials, and for ongoing care management by CHOP clinicians and beyond.
The team has no plans to stop at quantifying facial movements to assess autistic children’s social abilities. As the researchers collect more data on more individuals, the approach and its framework will become more robust. With support from the NIH grant, they are currently evaluating predictive facial features in non-ASD psychiatric disorders and measuring the interpersonal coordination of other body movements along with nonverbal facial expressions.
“We know that most information in a conversation is obviously coming from the face, but there’s also body movements and gestures to consider,” Bartley said. “And when you add in the dynamic of their body language interacting with another person, and fold in their language, you’ve created this unbelievably complex system of interplay.”
Changing the Landscape of ASD Research and Treatment
The concept of a digital phenotype has the potential to impact multiple areas of autism research and treatment. Both Bartley and Dr. Schultz emphasized that this new approach is in no way meant to replace a clinician’s expertise, only to offer diagnostic and therapeutic support. As the team continues to compile more behavioral data, refine its algorithms, and achieve sharper characterizations, digital phenotyping could usher in a new era in how we look at and define autism: quickly, rigorously, and precisely.
For example, digital phenotyping methods could improve early detection. In the first year of life, typical babies are already very synchronous with adults in their behavior; however, evidence suggests that early synchrony may be reduced in autism.
“We need to do a better job [detecting behavioral differences] in that first year of life,” Dr. Schultz said. “Using this novel [method], can we measure the lack of synchrony as a dimension and establish a threshold for synchrony that indicates a high probability that the baby will go on to fully develop autism?”
Furthermore, as a spectrum disorder, ASD can vary widely in the way it presents from one person to the next. Some individuals have co-occurring intellectual disability, some have learning challenges, and some have savant abilities. Others have seizures, attention-deficit/hyperactivity disorder, or anxiety. A digital phenotype gives clinicians and researchers greater power to find and map out associations along the spectrum of autism, as well as its intersections with other neuropsychiatric diagnoses.
“We believe that the deeper level of understanding we can gain from these new tools will help us better recognize individual differences across people,” Dr. Zampella said.
After the diagnosis, personalized treatment planning and monitoring is equally important and challenging. When a clinician places a patient on a new medication or therapy, she might not see that patient again for at least six months to a year. With the portable SensorTree, a nurse could potentially measure the efficacy of an intervention over more frequent intervals; or this can be done via telemedicine and screen-based communication devices already in the home. Both approaches could be used as part of remote screening and assessment to identify new intervention needs as they emerge, and track progress over time. (See Part 3 of this series.)
Dr. Schultz hopes digital phenotyping methods will move researchers closer to a precision medicine approach for ASD by marrying more accurate behavioral phenotypes with advances in genetics and imaging. Using digital phenotyping in very large groups of individuals diagnosed with autism and other mental health conditions will clarify the boundaries between disorders, as well as areas of overlap, allowing for a clear portrait of what autism is and what it shares with other conditions, he added.
“We’re building the foundation for something that will yield returns over time,” Dr. Schultz said. “And it’s a worthwhile foundation because it’s beginning to make sense that autism is so heterogeneous that it’s difficult to find specific treatments, and it’s also cluttered with dimensions that we don’t understand. But that’s what we’re trying to do here at CAR, and that’s why I’m so excited about all this.”
(Save the Date: Join Honorary Team Captain Madeline Bell and almost 150 CHOP colleagues participating in the Eagles Autism Challenge May 18! Use discount code 50CHOP for 50 percent off registration when you sign up to run, ride, or participate virtually as part of Team CHOP, and help reach the goal of raising more than $2.5M for autism research and care. This event will feature a one-day bike ride and family-friendly 5K run/walk as well as a Friday Night Kick-Off Party at Lincoln Financial Field.)
Sources for Timeline
Donvan J, Zucker C. Autism’s First Child. The Atlantic. 2010 October.
Fischbach GD. Leo Kanner’s 1943 paper on autism, opinion. Spectrum. 2007 December.
Sinclair E, Wilbraham D. History of Autism Treatment. The Applied Behavior Analysis Program Guide.
Wallis C. Autism linked to genes that govern how the brain is wired. Time. 2009 April.
Zeldovich L. The evolution of ‘autism’ as a diagnosis, explained. Spectrum. 2018 May.
Artwork courtesy of Center for Autism Research Art Gallery.