Blossom: A Handmade Approach to Social Robotics from Cornell and Google

IEEE Spectrum

Handmade out of organic materials, Blossom is a creative new take on social home robots

Photo: Michael Suguitan

As excited as we are about the forthcoming generation of social home robots (including Jibo, Kuri, and many others), it’s hard to ignore the fact that most of them look somewhat similar. They tend to feature lots of shiny white and black plasticky roundness. That’s for admittedly very good reasons, but it comes at the cost of both uniqueness and visual and tactile personality.

Guy Hoffman, who is well known for the fascinating creativity of his robot designs, has been working on a completely new kind of social robot in a collaboration between his lab at Cornell and Google ZOO’s creative technology team in APAC. The robot is called Blossom, and we’d describe it for you, except that it’s designed to be handmade out of warm natural materials like wool and wood so that every single one is a little bit different.

Blossom is not the first soft robot designed to interact with people, and also not the first to use materials that emphasize touch. Robots like Keepon, Tofu and Mochi, and Romibo all encourage tactile interaction through things like squishiness and fluffiness, deliberately avoiding hard plastics wherever possible. Blossom, however, is perhaps the first robot to be soft both inside and outside, using a compliant internal structure to enable movements that give the robot a somewhat imperfect (and therefore much more organic) personality.

The outside of Blossom can be equally organic and imperfect, especially if you’re not very good at crocheting or woodworking, since Blossom’s exterior is very much do-it-yourself. Most DIY-type robots rely on 3D printing, which is usually reasonable for the sorts of people who decide that they want a DIY-type robot, but Blossom is designed to be accessible and engaging for people who might be more comfortable with traditional crafts that don’t necessarily rely on the latest technology. As Guy Hoffman explained to us, “we were asking ourselves: ‘How can we involve the whole family in building technology for the home?’ And the idea of crafts like knitting, sewing, and traditional woodworking came out of that question.”

Blossom’s overall aesthetic is, in some ways, a response to the way that the design of home robots (and personal technology) has been trending recently. We’re surrounding ourselves with sterility embodied in metal and plastic, perhaps because of a perception that tech should be flawless. And I suppose when it comes to my phone or my computer, sterile flawlessness is good. But for personal home robots, it makes personality so much harder to achieve. As notoriously flawed humans, we have an easier time bonding with things that aren’t perfect, yet while we occasionally see this leveraged in the programming of a social robot, very rarely is it an integral part of the physical design. It’s this inherent imperfection that’s part of what we like so much about Blossom. We asked Guy Hoffman where he got the inspiration for it:


IEEE Spectrum: How did you conceptualize the design for Blossom?


Guy Hoffman:
Looking at the design of the huge number of social robots revealed in recent years, there are a lot of repetitive features: white shiny plastic with metal or black accents, glass screens, and smooth, rounded lines and edges. The overall shape and metaphor of these robots always remind me of miniature or child-sized astronauts. With Blossom, I wanted to reject almost all of this common wisdom of domestic robot design.

Interestingly, in the design world outside of robotics, as we buy more and more shiny plastic and glass devices, there is an opposite trend toward handcrafted objects and experiences. From craft beer to craft light bulbs, it seems that the more accelerated and digital our culture becomes, the more we appreciate the slow, inefficient, and one-of-a-kind processes of traditional crafts. I wanted to bring some of that sentiment to social robot design.

Can you explain what is so unique about Blossom’s aesthetic? 


Guy Hoffman: 
Blossom is made out of soft, handcrafted materials, so its external shape is neither sleek nor smooth. The robot’s shape is not even well-defined, and instead folds, creases, and shifts as the robot moves. The materials are warm and natural, including wool, cotton, and wood. When you look at Blossom and touch it, you are met with organic textures and even the scents of natural materials.

At one point, when I was crocheting one of the shells for the robot, a coworker of mine noticed me and said that she loves crocheting. She literally pulled the hook and yarn from my hands, and ended up finishing the robot for me, much faster and with a much nicer knot pattern than I could have ever done myself. And that’s another point of a handcrafted robot: people who would never consider building a robot can participate in the design of their own family robot. 

This also makes this personal robot more deeply personal. You can imagine someone making a robot for a loved one, just like people used to make ragdolls and pass them on between generations. In that sense, Blossom attempts something that’s often promised with social robots: “bringing people together.” But Blossom does that in an indirect way by having one person craft the robot for another. 

Is it intentional that your design for Blossom doesn’t have a face?


Guy Hoffman:
Personally, I am not a fan of robot faces, and in particular robot eyes. Eyes are a strong indicator of a sophisticated sensory organ and an even more sophisticated brain behind that organ. People who see eyes need to accept a proto-social illusion in which the robot can really see and understand them. There is something deceptive about robot eyes and faces, and that makes me uncomfortable.

However, Blossom having no eyes or face is one of the most common critiques I have heard about the design so far, and I am willing to accept that it might be a minority choice and a pet peeve of mine. The good news is that Blossom is customizable! Adding ‘eyes’ is as simple as stitching on two buttons or doll-eyes (it would freak me out if someone did that, though). That’s exactly the power of a handcrafted robot: you can really make it your own.

Blossom moves very organically. Can you describe what’s going on inside the robot to make that possible?


Guy Hoffman:
In the first few prototypes, the interior of Blossom was designed using standard practices of rigid links attached to servo motors. However, the soft exterior demanded an equally soft interior. My lab is next to Rob Shepherd’s Organic Robotics Lab, and I am continuously inspired by the advances in soft robotics.

The breakthrough came from my students Michael Suguitan and Greg Holman, who found the right balance between soft actuators and handmade/customizable mechanisms. The soft components give the robot a physical compliance that makes Blossom move in an imperfect, lifelike way that would be impossible to recreate with rigid components. Having worked on expressive robots for many years, I know that one of the biggest challenges of expressive social robots is making a rigid, hard, and digitally controlled device move in a way that seems lifelike to the viewer. Blossom achieves this goal in part through its physical and mechanical structure, with a lot of softness built into the materials used to drive the robot.


The Blossom project is a collaboration between Hoffman’s lab at Cornell and Google ZOO’s creative technology team in APAC. Miguel de Andrés-Clavera is the Head of Creative Technology at Google Asia Pacific, and he shared some details with us about the near-term goal for Blossom:

IEEE Spectrum: Why is Google interested in partnering with Cornell to build a new kind of social robot?



Miguel de Andrés-Clavera: 

The idea of Blossom is to provide developers with a platform they can use to create smart social companions. It’s still very early stages, but we’re excited about exploring meaningful and creative applications of machine learning together with Cornell. It has been great to work with Cornell and Guy’s research lab. He is at the leading edge of HCI [human-computer interaction] and has done incredible work in robotics. His mission of engineering empathy by creating more meaningful everyday interactions between people and machines is really exciting.

How will Blossom help you leverage machine learning to do something uniquely useful?


Miguel de Andrés-Clavera: 
Machine learning promises to improve people’s lives in many different ways. We are already using it in most of our products, and we are making AI accessible to developers, researchers, and companies through our Cloud Machine Learning APIs and TensorFlow, our open-source machine learning framework. Social robotics is an area that we believe can have a huge positive impact on fields like education or even therapy.

One project we’re working on is using Blossom to create a social companion for kids on the autism spectrum. Our research specifically explores how smart companions can help with social learning by showing empathetic responses while watching videos together. We’re excited about the results we’ve seen with Blossom so far, and are now looking to develop it further with partners that wish to make this social learning platform for children on the spectrum more widely available to schools and families.


Essentially, Blossom’s first job in research is as a ‘media companion.’ The robot will watch YouTube videos with you, physically reacting to their content, “adding another layer or dimension to the experience, pulling that experience out of the screen and into the real world,” says Hoffman. Think MST3K, except without the snarky commentary, but still offering an independent perspective of sorts that’s on the side of the viewer rather than something internal to the video.

This may not seem like it would accomplish much, but there’s been a substantial amount of research on the effects that co-watching can have on viewers: for example, people experience racially or gender-charged videos much differently depending on who they’re sitting next to. A robot viewing companion will of course elicit different reactions than a human one, but Hoffman’s research has shown that sharing an experience (like watching a video or listening to a song) with a robot can, in fact, shape your own experience: if the robot seems to like what it’s seeing or hearing, you’re more likely to enjoy it as well, even if the robot isn’t interacting with you directly. As it turns out, that shared experience also results in a more positive opinion of the robot.

The way that Blossom interacts with videos at the moment relies on a special type of caption file that must be hand-coded, but the broader concept is that eventually, TensorFlow will enable Blossom to automatically identify features like emotions that it sees or hears in a video and autonomously react to them in real time. This could be enormously helpful to children with autism, who may be able to use Blossom’s reactions to help them understand the social and emotional aspects of what they’re watching. To be clear, the researchers don’t know whether this will actually work or not, but Miguel de Andrés-Clavera tells us that Google is excited to develop Blossom further with partners that want to make it more widely available to children on the autism spectrum, their schools, and their families.
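The researchers haven’t published the details of their caption format, but the idea of a hand-coded reaction track can be sketched simply: a timestamped file maps moments in a video to named gestures, and during playback the robot performs the most recent cue. Everything in the sketch below (the file format, the gesture names, the function names) is an illustrative assumption, not the project’s actual implementation.

```python
def parse_reaction_captions(text):
    """Parse lines like '00:12 happy_wiggle' into (seconds, gesture) pairs.

    Hypothetical format: each line is a MM:SS timestamp followed by a
    gesture name, one cue per line.
    """
    cues = []
    for line in text.strip().splitlines():
        stamp, gesture = line.split(maxsplit=1)
        minutes, seconds = stamp.split(":")
        cues.append((int(minutes) * 60 + int(seconds), gesture))
    return cues


def gesture_at(cues, t):
    """Return the most recent gesture cue at playback time t (in seconds)."""
    current = None
    for when, gesture in cues:  # cues are in chronological order
        if when <= t:
            current = gesture
    return current


captions = """\
00:05 lean_in
00:12 happy_wiggle
01:03 slump
"""
cues = parse_reaction_captions(captions)
print(gesture_at(cues, 70))  # at 1:10, the most recent cue is "slump"
```

A learned version would replace the hand-written cue file with a model that emits the same (time, reaction) stream from the video itself, which is presumably where TensorFlow comes in.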

More generally, Blossom could use these video interpretation skills it’s developing to “provide commentary, emotional reactions, or even be an additional character outside of the screen,” Hoffman says. “Imagine how you would experience a football game with the robot rooting for the other team, or whether you might find the Emmy awards more satisfying with the robot providing a snobby commentary track to whatever is happening on the screen.”

No matter what functionality Blossom ends up with in the future, Hoffman hopes that its design will have a tangible influence on the way that roboticists (and consumers) think about what a robot can, and should, look like: “if robots are truly going to enter our day-to-day lives, we want a broader and more inclusive definition of their aesthetics.” It’s fortunate that many of those aesthetics are based on end-user crafting, which should make Blossom more accessible. The complicated and expensive bit is the core, but the researchers are working on redesigning it to make it as affordable as possible. If Cornell and Google can get Blossoms out there in the wild, that’s when we’ll begin to understand their true potential, Hoffman tells us: “I am really curious to see what people imagine Blossom to be like, look like, and move like, once it gets in the hands of designers of all ages and walks of life.”

Blossom is a collaboration between Cornell and Google ZOO’s creative technology team in APAC, with Guy Hoffman, Michael Suguitan, Greg Holman, James Redd, and Emma Cohn from Cornell; Miguel de Andrés Clavera, Rosa Uchima, Gene Brutty, Alex Chia, and Mandy Vu from Google.
