Robots in schools, care homes next? This UK biz hopes to make that happen
Bot manufacturers should think outside the box of humanoid form – unless those makers are Engineered Arts
Interview The robotics business is booming, thanks to the hype surrounding artificial intelligence and the demonstrated capabilities of robotaxis like Waymo.
Amid the race to create robots capable of navigating public spaces and handling domestic chores as deftly as they sort packages and assemble cars in industrial settings, England's Engineered Arts remains focused on more easily attainable goals that fit into existing business models.
One of The Register's vultures in San Francisco spoke with Leo Chen, director of US operations, about robots intended for entertainment, education, and research.
Chen: Engineered Arts has been around for a little over 20 years now. We were founded in 2004 in Falmouth in the UK. I want you to imagine Falmouth as like the Santa Cruz of the UK. It's a small little fisherman's town, surfer town, college town, with a little bit of a hippie vibe, but not the first place you'd imagine a really cool robotics company.
Will Jackson, our founder and CEO, started this company 20 years ago in his garden shed in Cornwall. So a really cool rags-to-riches story in a way.
We've been around for 20 years, so we're not a startup, but we've always built humanoid robots. And we've been very particular about where we place those humanoid robots.

Boss Will Jackson ... Source for all images: Engineered Arts
Right now, until our technology is better and until the general technology of the robot ecosystem gets better in terms of hardware, we're not focused on things like factories or Amazon warehouses or even putting a robot in your home to help you wash your dishes or do your laundry or whatever it might be.
What we're focused on is what we like to call human-centric use cases. We're a human-centric robotics company that's focused on making sure that our robots can form a sense of human connection, create a sense of empathy with other human beings.
The current three verticals that we're in are research, education, and entertainment.
Now, we'll start with the most obvious one – especially because you're here in California – entertainment.
If you were to go down to the Computer History Museum, where you saw me speak at the Humanoids Summit, they actually have an Ameca desktop [robot] on exhibit in the back, where you can talk to it and interact with it. So that's one of the things that we consider to be entertainment, so to speak.
We are also in the Las Vegas Sphere. If you were to go to the Las Vegas Sphere, you would see five robots that look very similar to a full-body Ameca inside talking to people, engaging with people before a show. That's also considered entertainment.
Finally, if you were to go to the Rosicrucian Egyptian Museum and walk into their exhibit around alchemy, you'd actually see a large six-foot-tall Thoth figure. Thoth is the Egyptian god of wisdom.
And we've created a robot that looks like a human, like an animatronic almost, and that can interact with guests. It can speak to guests about the contents of the museum. They can ask it general questions. It'll always stay in character as the deity Thoth. So that's entertainment.
Education is something that I'm very passionate about. And we're going to be deploying robots to a school system down in San Diego County soon, where the robots will be acting as supplemental education.
So what do I mean by that?
Right now, if I had a child who wasn't doing so well in class, in a public school, odds are they'd be given extra homework, sat in the corner, sort of expected to pull themselves up by their own bootstraps.
However, what we're hoping is that instead of just dealing with the issue alone, a child will be able to engage with a robot and basically have the robot act as a tutor or a companion who will help this child learn, you know, the fundamentals of the themes of Romeo and Juliet or why calculus isn't actually that hard and derivatives are actually quite easy, so on and so forth.
So that's one sort of educational use case that we're seeing.
In a couple of weeks, we're actually deploying at Richard Bland College (RBC) as well. That's a college out in Virginia, where not only will they be using our robots in their STEM and engineering classes – learning how to code in Python and deploying it on our robots – but a robot can also act as a sort of career coach or career counselor.
Now RBC is a community college system that's in a more rural area of Virginia. We're extremely excited to be a part of this because we're hoping that we can help expose the students to new technology that they normally wouldn't have been exposed to in a typical community college sort of setting. Not only can we expose them to it, we can help them become a little more comfortable with it.
So we've covered entertainment and education. The final one is research. We're in research labs all over the world.
We're in the University of California, Merced; we're in Auburn. HRI – human-robot interaction – research is our bread and butter, especially with a robot like Ameca that has such an expressive face.
The human face is one of the highest bandwidth forms of communication available to us. Just by looking at me, you can tell what I'm feeling most likely. And that's something that we're very passionate about. That's why we focus on areas of deployment where communication is key.
The Register: What roles do your robots play in research labs?
Chen: They're the research platform. An example of research that we participated in is seeing how elderly patients react to having a robot in the room.
So you need a platform to engage with the patients, and our robot is a perfect example of that.
Another example: there's a school system that's looking at how different personalities will affect how humans engage with robots – how [the robot can present itself in] different ways to make itself more appealing, things like that.
The Register: Has the AI boom of the past two or three years changed the direction of your company? Has it opened up new modes of interaction or expanded the kinds of applications you're thinking about?
Chen: A hundred percent. We were extremely fortunate as a company: before OpenAI basically mic-dropped on everybody [in November 2022] by releasing ChatGPT to the world and making large language models super-accessible to the common person, we had already decided to focus on being a hardware platform company.
We didn't design our own AI, we weren't building our own models, we weren't trying to do a neural network, because frankly, even what we saw as the top-of-the-line examples of those weren't good enough [in terms of the user experience]. We focus on the human experience; our core values as a company are about bringing joy to people. And I don't mean to pooh-pooh any research done before OpenAI released GPT-3.5 to the world, but it frankly just wasn't there. The conversation wasn't there. It felt stilted. It was very mechanical. It felt like you were talking to a chatbot or a Mechanical Turk or something like that.
Now, when OpenAI released what they did, we were able to integrate it into our Tritium operating system fairly quickly, make the correct API calls, and suddenly, instead of just having what we call Tin Man operated – a telepresence-operated robot – we had a robot that could converse with somebody in a manner that was very compelling and drew people in, and created these moments of joy that we focus on bringing to the world. So it was a game changer.
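Engineered Arts has not published the details of its Tritium integration, but the pattern Chen describes – pinning a persona (like the Thoth character) in a system prompt and threading conversation history through a chat-completion-style API – can be sketched roughly like this. All names here are illustrative assumptions, not Tritium's actual API.

```python
# Hypothetical sketch of persona-pinned conversation state for a robot,
# in the message format most chat-completion APIs accept.
# Nothing here reflects Engineered Arts' real implementation.

PERSONA = (
    "You are Thoth, the Egyptian god of wisdom, exhibited in a museum. "
    "Always stay in character and answer guests' questions about the exhibit."
)

def build_messages(history, user_utterance, persona=PERSONA):
    """Assemble the message list for one chat-completion call."""
    messages = [{"role": "system", "content": persona}]  # persona stays pinned
    messages.extend(history)                             # prior turns, in order
    messages.append({"role": "user", "content": user_utterance})
    return messages

# A conversation loop would send `messages` to the model endpoint, then
# append the reply as {"role": "assistant", ...} to `history` before the
# next guest utterance.
```

Keeping the persona as the first system message on every call is what lets the robot "always stay in character," regardless of how the conversation wanders.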
Now, that being said, we haven't limited ourselves to just OpenAI products. We've implemented Google's Gemini, and we've implemented open source models like Vicuna and Llama. We're a very flexible system.
Now, the reason why we generally do deploy with OpenAI products right now in public is what we care about is latency. And right now they offer best-in-class latency.
However, we are still open to exploring and constantly experimenting with other models.
The Register: Looking at the robot industry in general, there's a lot of interest in acquiring more data so that robots can move around autonomously and navigate in the same way that autonomous vehicles do. Do you share those concerns? Is there a lot to be gained in terms of your applications from gathering massive amounts of data? Or is your focus on entertainment leading you in a different direction?
Chen: While we're firmly interested in entertainment, education, and research right now, that's not where we want to be all the time.
We have a keen eye on things like medical applications. Now, you're never going to see Ameca, well, anytime soon, act as a surgeon or anything like that. We're definitely not trying to go that direction.
However, there's some really, really sad statistics here in the United States. Ninety-four percent of our elderly care facilities are actually understaffed as of, I think, three years ago. And that's horrifying.
And that's general elder care facilities. That's not even specialty care. For things like Alzheimer's, dementia care, long-term memory loss care, that's not included, or that's not the focus of that number. That number actually gets worse because it's harder and harder to find the specialty staff required to staff those institutions.
We believe that our robots can help with that problem, help with that staffing shortage.
One of the things that the elderly need is companionship. And we've participated in studies where the elderly have reacted very positively to having robots act as companions.
So we have a keen eye on that. The other thing that we're trying to work towards is long-term memory loss care and the ability to augment the staff at a long-term memory loss care facility.
Now, one of the things that has to happen, unfortunately, when you have memory loss issues is a process called reorientation.
So what do I mean by reorientation? Oftentimes people forget who they are, where they are, even when they are, and they constantly have to be anchored, in a way.
Now, for humans like you and me, unfortunately, if this process has to happen multiple times within an hour – if you have to constantly remind someone who they are, or hear the same story 50 times – we get annoyed. A robot does not. So we feel like robots, especially humanoid robots that can form a sense of human connection and create a sense of empathy, have a place in those facilities.
So that's sort of future facing. The other [potential medical business opportunities include] receptionist roles, patient intake roles, things like that, where large hospital systems like the Mayo Clinic have staffing shortages and have internal programs right now trying to shore up those shortages. And we believe that automation can help augment the power of human beings.
That being said, foundational models like GROOT are mainly targeted at humanoid robots that are going to be navigating the world. Ameca is awesome, but right now, unlike the Shakira song, Ameca's hips do lie – we don't have ideal locomotion. We're working towards that, and we're working towards better dexterity and more general-purpose usefulness. But right here, right now, Ameca is a communication robot that is focused on creating a sense of human connection.
The Register: Do you find that people have differing attitudes about robots? Are there some people who are receptive to them and willing to work with them and others who are not? And if so, how do you navigate having part of the population that just doesn't really want to engage with robots on the level that you'd prefer?
Chen: Absolutely. I think the same problem exists with self-driving cars as well.
Statistically speaking, a Waymo is safer than me driving, or any other Uber driver driving. Yet there's this hesitation around [the technology] because of a sort of standoffishness that some humans feel towards technology, and it's something that we have to keep an eye on. Nothing is universal. There's no absolute in the world other than death and taxes, right? But in this case, especially when you're counting on humans to react a certain way to your technology, we do have to be aware and culturally sensitive.
Even just taking a step back and not looking at people who maybe outright hate technology or outright hate robots or whatever it might be, there are different sorts of approaches across cultural divides.
My parents are from Taiwan and interacting with more of an Eastern culture background [in terms of robotics], you find that some people are actually a lot more open over there.
That's because on the Western side, if you ask a room full of people to name a robot, Terminator is brought up, or more recently I, Robot or Ex Machina. And these are cultural examples where robots are threatening, or the antagonist. However, if you look at Eastern culture – I grew up with Xiao Ding Dang, this little robotic cat that was around to help people, or Astro Boy, or Gundams.
And these all have a little more of a positive twist to them. Whereas in Western culture, robots are out there trying to kill you, chase you down, or take over the world.
I hope that as robots become a little more commonplace in the world, people will become a little more accepting. I was actually just at Purdue University last week on campus and was surprised to see a little delivery robot roll by. And the students didn't take pictures. No one was surprised. It was just a part of life for them. That's exposure.
So one of the things that we're working on is bringing Ameca to areas where people can be exposed to a robot like Ameca and sort of soothe the nervousness, the negative emotions surrounding robots.
If you notice, anything that I've said around robots and helping people, it's about augmenting the workforce. We're not here to replace the workforce. We're here to augment the ability of someone to do their job. And I think that's very important to underscore.
The Register: Looking ahead over the next year or two, what are your hopes or expectations for the robotics industry?
Chen: Right now there is a lot of hype around humanoid robotics.
There is an extreme amount of VC money being put into companies like Figure. There are some old standbys that are doing amazing work like Boston Dynamics with their new Atlas. Agility Robotics is doing world-class work with their bipedal locomotion. So there's a lot of innovation and there's a lot of funding being infused into the scene.
And of course, Nvidia is doing amazing work with GROOT, the foundational model that Jim Fan spoke about while on stage with me at the Humanoids Summit. There's definitely a ton of work being put into this stuff.
But there's something that we say a lot in Engineered Arts as sort of part of our company culture: "Why be human when you can be superhuman?" I think what we're going to realize is that human form is actually quite limiting. Why are we limiting ourselves to this form factor? Is there actually any real benefit to the humanoid form in terms of deployed robots across the world?
At Engineered Arts, we believe the humanoid form is extremely valuable for one thing in particular and that's communication. As I said before, the human face is one of the highest bandwidth forms of communication available to us. And beyond just the face, the body – in terms of body language and stuff like that – is extremely important.
However, now let's say that you want a robot that washes your dishes. Why is it important for it to be shaped like a human? Or taking it a step further, let's say that you have an Amazon warehouse and you need robots to stack the boxes and sort the goods. Why does the robot have to be shaped like a human?
And the answer that we often get at Engineered Arts from other companies is, "Well, we want these robots to be able to work side by side with humans and be able to, like, slide in seamlessly as a co-worker."
But the reality is, right here, right now in early 2025, the technology isn't quite there yet. What do I mean by that? You and I can operate, I don't know, 10 or 12 hours a day on 1500 calories, give or take, right? You'd be hungry but you could do it. There's no robot on the market that can do that right now.
But then let's say, okay, let's take out the operating time [as a consideration]. Maybe [the robots are] tethered or maybe we have a battery swap or something like that so they can keep up with us in terms of our endurance. Okay, let's eliminate endurance as a thing. Let's talk about safety.
In order for a robot to be "useful," its motors have to be geared in such a way that it's powerful enough to do all these lifts that humans can do. A robot like that is not safe for a human being to be next to. So when [a robot] is too heavy or too powerful, there are going to be a lot of different layers of safety that need to be wrapped around it, so much so that the human really shouldn't be beside it. Look at Boston Dynamics' videos with Atlas – you have to kick it from a distance.
So now suddenly you can't work beside this robot that you've created in the image of a human being. Then what's the point of having human form if you're not going to have that co-working scenario with a humanoid robot yet?
Now, there are [variations on this theme].
For example, Unitree [has shrunk its G1 robot down to], I think, three and a half or four feet tall.
Maybe you'll have smaller [humanoid] robots, but even then, the G1 right now isn't that useful. You're never going to deploy that directly in a factory.
So as far as where I see the industry going in two to three years, what I hope is that we will begin seeing people and companies realize that we shouldn't limit ourselves to the human form for a lot of applications. I hope that we will begin to see a lot of creativity and people sort of breaking the mold a little bit and not limiting themselves to this very narrow restricting box that's the human form. ®