PostCity Highlight: The Mercedes-Benz F 015 Luxury in Motion
The F 015 Luxury in Motion
The F 015 not only represents the technical realization of autonomous driving; it also shows how self-driving cars will change our society as the automobile moves beyond its role as a means of transportation to become a mobile living space. This is exemplified by more than the F 015’s sleek proportions: a core concept of this research vehicle is the constant exchange of information among the car, the people inside it, and the outside world. Passengers can interact with the intelligent technology they’re riding in via gestures, eye tracking or high-definition touchscreens. And the F 015 Luxury in Motion uses laser projections and LED displays to share the streets considerately with pedestrians and other road users.
Ars Electronica Futurelab and Mercedes-Benz:
Transdisciplinary Research on Tomorrow’s Spaces for Mobility
No scenario of humankind’s urban future is complete without autonomous vehicles. Self-driving cars might predominate on our streets as early as 2030. That would not only make the steering wheel, gas pedal and gear shift lever things of the past; it would also put an end to fender-benders, traffic jams and searching for a parking spot. Until then, there are still a few hurdles to surmount—technical, infrastructural, legal. Perhaps the biggest challenge is for people to trust the machines. And here’s where the collaboration between Mercedes-Benz and Ars Electronica Futurelab comes in.
Trust presupposes smoothly functioning communication, so we’ve created proving grounds designed to shape such communication and interaction between human beings and mobile robots in a shared space. In contrast to virtual R&D environments, these so-called robotic experience spaces are distinguished above all by their hands-on quality and by the kinetic energy and tangible physical presence of the robots moving about in them.
At Mercedes-Benz’s 2014 Future Talk entitled “Robotics,” some of the Ars Electronica Futurelab’s LED-equipped quadcopters were deployed in such a proving ground. In an 8×8-meter, camera- and sensor-studded interaction space, the Spaxels, as these drones are called, depicted various traffic scenarios and communicated with the people in their immediate surroundings via light signals or predefined maneuvers. The human interactors, in turn, could talk to the quadcopters by means of a haptic interface called a “magic car key” or by physical gestures.
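The two-way signalling described above can be sketched as a simple command-to-response mapping. This is purely an illustrative assumption of how such an interaction space might be wired up; every command name, light signal and maneuver below is hypothetical, not the Futurelab’s actual implementation.

```python
# Hypothetical sketch: human inputs (gestures or the "magic car key"
# haptic interface) are mapped to quadcopter responses (a light signal
# plus a predefined maneuver). All names here are illustrative.

# Commands the interaction space is assumed to recognize.
HUMAN_COMMANDS = {"gesture_stop", "gesture_come", "key_summon"}

# Each recognized command triggers a (light_signal, maneuver) pair.
RESPONSES = {
    "gesture_stop": ("led_red_pulse", "hold_position"),
    "gesture_come": ("led_green_steady", "approach_slowly"),
    "key_summon": ("led_white_blink", "fly_to_operator"),
}

def respond(command: str) -> tuple:
    """Return the (light_signal, maneuver) pair for a recognized command."""
    if command not in HUMAN_COMMANDS:
        # Unknown input: signal uncertainty and keep a safe hover.
        return ("led_yellow_blink", "hover_in_place")
    return RESPONSES[command]

print(respond("gesture_stop"))  # ('led_red_pulse', 'hold_position')
print(respond("wave"))          # ('led_yellow_blink', 'hover_in_place')
```

The point of such a table-driven design is that every robot reaction is explicit and auditable, which matters when the goal is building human trust in the machines.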
Read more on the Ars Electronica Blog.
In a second test environment, engineers used so-called Shared Space Bots: six specially developed terrestrial robots that simulate situations arising at intersections. The approaching robots not only indicate whether and where they’ve identified a pedestrian; if need be, they also project a crosswalk, in the form of zebra stripes, right onto the street surface.
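The Shared Space Bot behavior described above amounts to a small per-cycle decision rule: signal whether and where a pedestrian was detected, and project zebra stripes when that pedestrian appears to want to cross. The following is a minimal sketch under those assumptions; the class, thresholds and action names are invented for illustration and are not the actual robot software.

```python
# Hypothetical sketch of a Shared Space Bot's decision cycle, based on
# the behavior described in the text. All names are illustrative.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Pedestrian:
    bearing_deg: float    # where the pedestrian was detected, relative to the bot
    wants_to_cross: bool  # e.g. inferred from position at the curb (assumption)

def react(pedestrian: Optional[Pedestrian]) -> list:
    """Return the list of actions the bot would take this cycle."""
    if pedestrian is None:
        return ["drive_on"]
    # Indicate that (and where) a pedestrian was identified.
    actions = ["signal_detection_at_%.0f_deg" % pedestrian.bearing_deg]
    if pedestrian.wants_to_cross:
        # If need be, project a crosswalk onto the street surface.
        actions += ["slow_down", "project_zebra_crossing"]
    return actions

print(react(None))
# ['drive_on']
print(react(Pedestrian(bearing_deg=15, wants_to_cross=True)))
# ['signal_detection_at_15_deg', 'slow_down', 'project_zebra_crossing']
```

Separating detection (the `Pedestrian` record) from reaction (the action list) mirrors the text’s emphasis on the robot making its awareness legible to the humans around it.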