Laura is currently pursuing a Ph.D. in Computing Science under the supervision of Dr. Patrick Pilarski and Dr. Matthew E. Taylor. She received a B.Sc. with Honors in Computing Science from the University of Alberta in 2019 and an M.Sc. in Computing Science from the University of Alberta in 2022. Her research interests include reinforcement learning, human-robot interaction, biomechatronics, and assistive robotics. Drawing inspiration from her anatomical studies with Dr. Pierre Lemelin, Laura’s research aims to develop control methods for robotic manipulation with the goal of increased functionality, usability, reliability, and safety in the real world.
What initially attracted you to computer science?
So, when I came back to school, I actually wanted to be a doctor to begin with. Work-life balance is really important to me, so I could spend a lot of time with my family, and I've always wanted to help people. I ended up taking a computer science class in my first year of university, and it just felt like such an amazing tool that you could use to solve problems. I fell in love with problem solving and decided that the space of assistive technology, where I could put that problem solving to use and actually help people in the process, was where I wanted to be.
How did you figure out that assistive technology was your passion to get into?
I took a robotics class in my undergrad where we used Lego Mindstorms kits to learn the basics of mechatronics and robotics. It all clicked together: it was so much fun to work with, and you could write programs and then immediately see the results. So, it just made sense to merge my love of working with these robotics systems with my desire to help people. Assistive technology fits exactly into that space.
Could you define mechatronics for our audience?
It would be just like anything inside the robotics sphere where you have these hardware systems, and you can control them in order to enact change in the environment.
What are some of the different use cases you've worked on for that technology?
Right now, I'm working in the BLINC Lab (Bionic Limbs for Improved Natural Control) with Patrick (Pilarski), and the main use case there is upper limb prosthetics. We have these smart robotic devices that you can control through myoelectric signals, so the main question is how we control these prosthetic limbs, now attached to the human body, to do what the user wants.
How long have you been working on that specifically?
I just started my PhD in January. I did my master's in the robotics and computer vision group at the University of Alberta, where I worked on robotic manipulation with robotic arms.
I'm just now venturing into the world of prosthetics.
What are some safety concerns with the technology at the moment?
With these smart prosthetic devices, we always keep safety first and forefront. That's always what we think about first, because these devices are attached to a human. So, at the end of the day, the device doesn't have the final say. The human is always in complete control. We can make suggestions to the human, say, “I think that you want to do this,” but they always have the final control over what happens. So, we're always thinking about what is the safety of the individual that's going to be using these devices.
In a previous interview, Patrick was talking about how usually the brain has to learn to adapt to the device, but in this case the device uses machine learning to adapt to the brain. Could you discuss your views on this?
Yes. So, we want to build continual learning systems to run on these devices. It's all about mapping the signals from the user, which in our case would be EMG signals. With surface EMG, you put electrodes on the individual's residual muscles, and then the question is what to do with those input signals. We want to map them to robotic motion, right? Do you want hand open? Do you want hand closed? First we have to decide how we're doing that mapping, and that's where machine learning comes into play.
You have pattern recognition systems, so we can predict what is going on with a specific muscle activation. We want it to continually learn and adapt over time. Think of your muscles right now: if you were to go to the gym, your muscle shape would change. So, do we have to completely train a new machine learning model? No. We want the device and the machine learning components to adapt to the person over time as they go through changes or as their intent changes.
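The EMG-to-gesture mapping and continual adaptation described above can be sketched in a toy form. This is a hypothetical illustration, not the BLINC Lab's actual system: a nearest-centroid classifier over mean-absolute-value EMG features whose centroids keep drifting toward new examples, so the mapping tracks slow changes in the user's muscles. All names and signal values here are invented for illustration.

```python
class OnlineGestureClassifier:
    """Toy nearest-centroid classifier over EMG feature vectors that
    nudges its centroids toward new data (continual adaptation)."""

    def __init__(self, learning_rate=0.1):
        self.centroids = {}  # gesture label -> feature centroid
        self.lr = learning_rate

    @staticmethod
    def features(window):
        """Mean absolute value per channel -- a common EMG feature."""
        return [sum(abs(x) for x in ch) / len(ch) for ch in window]

    def fit_initial(self, labelled_windows):
        """Seed one centroid per gesture from labelled calibration data."""
        sums, counts = {}, {}
        for label, window in labelled_windows:
            f = self.features(window)
            if label not in sums:
                sums[label], counts[label] = [0.0] * len(f), 0
            sums[label] = [s + v for s, v in zip(sums[label], f)]
            counts[label] += 1
        self.centroids = {
            lab: [s / counts[lab] for s in sums[lab]] for lab in sums
        }

    def predict(self, window):
        """Return the gesture whose centroid is nearest the new window."""
        f = self.features(window)
        dist = lambda c: sum((a - b) ** 2 for a, b in zip(f, c))
        return min(self.centroids, key=lambda lab: dist(self.centroids[lab]))

    def adapt(self, label, window):
        """Continual update: move the centroid toward the new example,
        so the model tracks gradual changes in the user's signals."""
        f = self.features(window)
        c = self.centroids[label]
        self.centroids[label] = [
            ci + self.lr * (fi - ci) for ci, fi in zip(c, f)
        ]
```

In this sketch, a short calibration session would seed the centroids, and every confirmed prediction afterward could call `adapt`, so the model keeps tracking the user rather than staying frozen at calibration time.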
What kind of timeline do you think it'll be until we see these out in the real world?
I hope within my PhD. That is my goal: to solve this control problem for upper limb prostheses.
That would be amazing. And what's your vision for the future of assisted robotics, let's say in a 10-year timeline or 20-year timeline?
I envision a world where individuals that would like to have a prosthetic limb for use in their everyday life would be able to use it the way we use our arms. Make it reliable, make it intuitive, make it easy to use: that's what I would like.
And do you see a personalized future where a smaller person would have a smaller prosthetic? Or do you think they'd be all the same?
Oh, absolutely. I think for something as personal as this, you want these devices to feel like an extension of your own body. Right? You want it to be a part of you. So, that brings us to personalized and individualized healthcare. These prosthetic limbs would need to be shaped and adapted to the individual that's going to be using them, at least in my opinion. We already see this. Right now, when you go to, say, the Glenrose Rehabilitation Hospital here in Edmonton, and you're getting fit for a new prosthesis, they take 3D modeling of your residual limb, and they personalize the custom fit of your prosthetic limb to you. So, we're already seeing this happen, and for these smart devices, it'll be even more so.
So, it'll be custom 3D printed, probably, for the user.
Yeah, custom 3D printed, custom fit, and then custom personalized control systems, also.
And how long does it take for a user to learn how to use one of these systems?
Our goal is to train a more generalized machine learning model, and then be able to individualize it to the person within a five- to ten-minute training session. That would be a goal of ours.
Once you've achieved this, what would you like to work on next?
A big part of my research is also working with the anatomy department at the University of Alberta, and I am completely fascinated with how we, as humans, manipulate the world around us. So, everything to do with the human upper limb, our arms and our hands, is completely fascinating to me. I would like to focus all my efforts on upper limb prostheses and really making these devices usable for people in the real world.
In what ways is it more challenging than the lower limbs?
For the lower limb, the motion is very repetitive. If you think of our walking gait, it's an easier control problem to solve. Whereas, for the upper limb, you need to be able to manipulate objects in 3D space, and you have so many more degrees of freedom. If you think about it, if we close our eyes and we reach around, we can still see the world around us through our hands. We have these amazing sensory organs that we can use to explore the environment. So, to be able to give that back to the medical community through prosthetic limbs, I think, would be amazing.
Can you share how you draw inspiration from anatomical studies?
I've been working with Dr. Pierre Lemelin in the anatomy department at the University of Alberta for the past five years now. I'm hoping that through the study of human anatomy and understanding exactly how we manipulate objects in the environment, what our nerve pathways are, and what muscles are activated, we can use that knowledge to improve, first of all, the structure of prosthetic limbs.
Are there small places in the actual design of the prosthetic limb that we can change, a small tweak in mechanical design, that would yield a huge increase in function? But also, the underlying control systems, how we use it. If we can understand exactly which nerves are firing when we're thinking, “Oh, open my hand,” then can we use that to predict what the user wants to do with their prosthetic limb, and then enact that motion in the robotic device?
Thank you for the amazing interview. Readers who wish to learn more should visit the following resources:
- Dr. Patrick M. Pilarski – Unite.AI Interview
- BLINC Lab (Bionic Limbs for Improved Natural Control)
- Upper Bound AI Conference
- Amii (Alberta Machine Intelligence Institute)
The post Laura Petrich, PhD Student in Robotics & Machine Learning – Interview Series appeared first on Unite.AI.