Safety 101: Human-Centered Robot Design

Thu Jun 02 2022 // Mary Ann Matias

An integral part of the daily grind for our engineers is not simply building robots, but developing the values and procedures needed to safely bring their ideas into the real world.

It’s just as much a matter of conceptualizing the world around a device as it is constructing the device itself. That’s why we’ve taken a collaborative approach to both human-machine interactions and our standards of operation, one that isn’t limited to a single user base or business model, and instead focuses on empathy, careful observation, and ultimately prioritizing the welfare of local neighborhoods.

So, in the first entry of a series that will provide an overview of Tiny Mile’s safety protocols and the measures we have put in place to introduce delivery robots to the city space, we’ll begin with a simple question: How do you bring a robot out of the laboratory and into the real world?

Take a moment to visualize any robot that you may have encountered, either real or fictional, and you’ll notice that a simple, yet vital, pattern emerges: their designs are inherently shaped by the environments in which they operate.

Autonomous agricultural vehicles, such as tractors, require considerable heft to effectively navigate vast farming fields; droids in Star Wars are robust and industrial, sometimes barely humanoid, in order to cope with the perils of space travel; even Ally Bank’s recently deployed security robot, leased from Knightscope, sports a dome-like appearance for sleek and durable performance on the streets of Charlotte, North Carolina.

In the case of shifting small parcel delivery to sidewalks, where the idea was to essentially create a new type of pedestrian entirely, our blueprint demanded similar considerations. This particular device needed to fit into the flow of daily living, not intrude upon it, and become a natural inclusion to pre-existing traffic.

Known colloquially amongst Tiny Mile staff as “Geoffrey,” our robot has undergone various iterations, from Tupperware-bot to the modern-day courier in pink, with hardware and software specifications altered in direct response to the urban space it was made to traverse. As a result, each of our robots incorporates several interconnected systems that allow it to function collaboratively with its environment and, in this way, interact safely with those around it.

Tiny Mile robots are designed and custom-built for public sidewalks, smaller and lighter than devices we can expect to encounter daily, such as shopping carts or baby carriages. A small, lightweight device reduces the risk of potential hazards or accessibility barriers by providing both pedestrians and our robot with the space needed for increased maneuverability. By accommodating others, we can more effectively introduce robots into bustling city centers without creating new obstacles.

To maximize visibility, however, and ensure that passersby are always aware of a robot’s presence, we’ve included additional features that strike a careful balance between distinct and unobtrusive.

The bright pink color is an immediately evident example, but seemingly innocuous additions, such as the placement of color-coded light fixtures, go one step further towards facilitating natural, intuitive communication between our robots and passersby.

An illuminated camera tower and omnidirectional lights, which are visible from a distance of 500ft (150m), notify pedestrians of the robot’s approach and help indicate its relative position much in the same way that one would rely on similar cues coming from cars when traveling on the road. Sound plays an important role in this communication scheme as well; a speaker emits warning sounds that alert others of either incoming or idle robots, and prevents possible collisions.

In this way, by integrating our own systems with those of traditional sidewalk traffic, we’ve built a robot that operates in close proximity to humans in an urban environment and actively cooperates with them without forcing them to adjust their own behavior. Instead, this robot is capable of initiating nuanced methods of communication with others, allowing both parties’ actions to be interpretable and easier to anticipate. It’s a strategy that reinforces safety and compliance, and places the wellbeing of our neighborhoods above that of the robot.

Human-Machine Interactions

In a laboratory, environments are easily controlled, calculated, and tested. It’s an effective way to assess a device’s performance against variables we can expect to encounter over the course of its use.

But compare that to the real world, your own city or neighborhood for example, where the unexpected is in abundance: cyclists shifting from roads to sidewalks, the tenuous flow of traffic at major intersections, pedestrians moving in and out of storefronts, and those too brief moments when, prior to a narrowly avoided collision, attention is elsewhere. There is only so much preparation that can be done through calculation and coding alone.

To counteract this, rather than relying solely on artificial intelligence or autonomous controls, we use A.I. and computer systems as forms of driving assistance. Our robots are remotely piloted by human drivers who safely facilitate interaction, cooperate with human pedestrians, and provide effective assistance in real time. By combining software and hardware assets with the dynamic, robust skill of human drivers, we’ve developed a layered safety system that’s as variable as the city streets.

This means that the quality of our pilots is as crucial as the hardware they’re operating; a comprehensive onboarding procedure requires that each candidate must be vetted and undergo extensive training from experienced staff members.

Training itself is a multi-faceted process comprising adherence to regional laws and regulations, methods of avoiding risk or injury to others, and practicing respect for potentially vulnerable individuals, such as people with disabilities, children, and elders. Combined with continued monitoring from our Dispatch and Support Center, these protocols ensure that only the most qualified drivers are permitted to interact with businesses, passersby, and customers alike, and that they are equipped to make the right decisions.

A supportive network of additional hardware and software therefore functions to complement the skill and range of our pilots, reinforce safety, and accurately render the environment in ways that human senses cannot. Here are two basic, yet core, units of this system:

Optics: our robots exchange traditional body-mounted visual systems for a unique periscopic camera tower that uses a curvilinear fisheye lens to create a 360-degree panoramic view of the environment. As both the robot and the area in which it operates are rendered, the user interface can increase situational awareness and provide active feedback.

Sensors: a specially calibrated array of devices, such as LiDAR, actively detects, measures, and responds to objects as they appear in the robot’s path. This provides a continuous layer of collision avoidance by detecting potential obstacles in real time and adjusting speed accordingly.

Each provides the valuable data required in order for pilots to accurately perceive their surroundings — situational awareness, immersion, and active feedback — creating multiple points of perception and control. Importantly, this form of driving assistance functions as a safety measure. Although our robots are not fully autonomous, in cases where pilots make an error in judgement or fail to respond in time to hazards, these features will enact automatic failsafe measures in order to prevent or mitigate harm to others.
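To make the idea concrete, here is a minimal sketch of how a sensor-driven failsafe layer like the one described above might sit between a pilot’s command and the motors. All names, distances, and thresholds are illustrative assumptions for this post, not Tiny Mile’s actual parameters.

```python
# Hypothetical failsafe layer: the pilot's requested speed passes through a
# filter that scales it down as an obstacle approaches and forces a full stop
# inside a hard safety zone, regardless of what the pilot commands.
# Thresholds below are invented for illustration.

SLOWDOWN_DISTANCE_M = 3.0  # begin scaling speed down below this range
STOP_DISTANCE_M = 0.5      # hard stop inside this range

def failsafe_speed(pilot_speed: float, nearest_obstacle_m: float) -> float:
    """Return the speed actually sent to the motors.

    The pilot's requested speed is honored when the path is clear,
    ramped down linearly as an obstacle approaches, and zeroed when
    the obstacle is inside the stop zone.
    """
    if nearest_obstacle_m <= STOP_DISTANCE_M:
        return 0.0
    if nearest_obstacle_m < SLOWDOWN_DISTANCE_M:
        # Linear ramp: full speed at SLOWDOWN_DISTANCE_M, zero at STOP_DISTANCE_M.
        scale = (nearest_obstacle_m - STOP_DISTANCE_M) / (
            SLOWDOWN_DISTANCE_M - STOP_DISTANCE_M
        )
        return pilot_speed * scale
    return pilot_speed
```

For example, a pilot requesting 1.5 m/s with a clear path gets 1.5 m/s; with an obstacle 0.3 m away, the output is 0.0 m/s no matter what the pilot does. The key design point is that the filter is always in the loop, so a late human reaction degrades gracefully instead of failing abruptly.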

Simply put, we train our pilots to avoid making mistakes. Our robot is designed to act in case they do.

Building the Right Robot For the Job

This constant rapport between pilot, robot, and environment keeps human-machine interactions safe, flexible, and balanced. It’s a key component in a robot design that works intuitively with the pilot and, in turn, encourages respect and cooperation with humans in the real world by effectively understanding, interpreting, and then responding to these more nuanced behaviors.

It’s an ongoing process, and one that we at Tiny Mile will never quite be satisfied with; we will continue to push boundaries in our safety systems, rigorous testing protocols, and standards of operation to ensure the wellbeing of those around us. Then, we will assume it’s not enough and push further.

Through working with regulators, manufacturing and safety experts, and seeking input from local and provincial officia