Author: Anastasia Zagorni

It’s been a journey. Case Studies joined DesignFarm in 2017 with the aim of making the transition from fashion into interior design. They have tailored their techniques, found the right production partners, designed new patterns and emerged stronger with a wonderful collection of blankets and cushions that are available for purchase as of October 2019.

As colorists, Case Studies bring various shades and nuances into their pieces, making us feel warmer, lighter and happier.

Studio “LS301” at the Hannover Fair 2019

LS301, a young product and interaction design studio that develops interaction concepts for dealing with autonomous objects, will be a guest at the Hannover Messe 2019 from 1 to 5 April at the invitation of Berlin Partner. Under the motto “#Berlinproduziert digital inspiriert”, an outstanding platform for innovative products and services developed in Berlin will be presented. The two designers Valentin Lindau and Jonas Schneider attracted Berlin Partner’s attention with the “Sweep” project, a semi-autonomous street-cleaning robot they developed in cooperation with Berliner Stadtreinigung.

During a one-year sponsorship by DesignFarmBerlin, the two art college graduates developed a body language for autonomous technology. This unique approach to interaction design makes it possible to give robots a lively character and a biomimetic habitus. Alongside the companies Ape Unit, citkar, BigRep, botspot, i-mmersive, INURU, Moeco, MotionLab, Robot4Work, pi4, Sonic Robots, WISTA, Würth Elektronik eiSos and Yptokey, they will present their vision of the dialogue between man and machine to potential customers.

“The Hanover Fair offers us the opportunity to present our innovative approach to a broad audience and to network with the robotics scene,” says Jonas Schneider, one of the two founders. “We are focusing on a humane and empathic human-machine collaboration. Besides the research and analysis of existing products, our services cover the development of user-oriented interaction concepts, which we make tangible with physical prototypes,” explains Valentin Lindau.

The robotics industry is developing more rapidly than almost any other, not least thanks to artificial intelligence and increasingly powerful sensor technology. At least as interesting as technological progress in this field are design-oriented solutions for autonomous technology. The question is no longer whether things will move independently, but how they will do it.


»#Berlinproduziert digital inspiriert«

Hannover Messe, Halle 6 E39, www.hannovermesse.de/

More Studio information »LS301«: https://www.ls301.de

The interaction design studio LS301 focuses on the questions that arise in the implementation of autonomous technology and on the potential of interacting with collaborative machines. More and more smart products in our everyday life are able to perceive their surroundings, make their own decisions and move autonomously through our environment. Currently, the close interaction between man and machine can be experienced in the field of collaborative robots, so-called cobots.

The Hannover Messe is one of the most important industrial trade fairs worldwide. With this year’s slogan – connect & collaborate – it was the ideal place to get a detailed overview of current standards and future trends. However, after eight hours and 40 booths it became clear that, despite the proclamation of close teamwork with machines, there was hardly any innovation regarding interaction. In the case of service robots, for example, it rather appears that the developers got lost in old-fashioned stereotypes. Besides that, there were many examples of robotic production in combination with augmented and virtual reality, and assembly robots in particular played a huge role at the trade fair.

 

Cobots are still stuck in a futuristic marketing bubble and rarely break free of old-fashioned stereotypes.

 

In this context it is all about collaboration, but not in the sense of really working hand in hand. Teaching replaces hard coding: by moving the robot’s individual axes by hand, the operator lets it memorize the relevant positions, and from then on the machine repeats its routine autonomously. At this point every interaction is limited to external interfaces. This is no surprise, because despite their ability to perceive their surroundings, these smart machines are remarkably limited in their communication. You cannot really tell what their intentions are, and that leaves questions begging to be asked. How do we know that the robot is aware of our presence? How do we know in which direction it is about to move? Will it stop if I cross its path? As long as questions like these remain unanswered, you will inevitably feel insecure, and that makes it really difficult to work close to the robot.
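To make the teach-and-repeat workflow described above more concrete, here is a minimal sketch in Python. The TeachAndRepeatRobot class and its record/replay methods are purely hypothetical stand-ins for illustration, not any vendor’s API; a real controller would interpolate between poses and drive the motors instead of printing.

    from dataclasses import dataclass, field
    from typing import List

    JointPose = List[float]  # one angle per axis, e.g. in degrees

    @dataclass
    class TeachAndRepeatRobot:
        axes: int = 6
        waypoints: List[JointPose] = field(default_factory=list)

        def record(self, pose: JointPose) -> None:
            # "Teaching": store the pose the operator has guided the arm into.
            assert len(pose) == self.axes, "pose must have one value per axis"
            self.waypoints.append(list(pose))

        def replay(self) -> None:
            # Repeat the taught routine autonomously; no further interaction needed.
            for i, pose in enumerate(self.waypoints):
                print(f"moving to waypoint {i}: {pose}")

    robot = TeachAndRepeatRobot()
    robot.record([0, -45, 90, 0, 45, 0])   # operator hand-guides the arm to a pick pose
    robot.record([30, -30, 60, 0, 60, 0])  # ...and then to a place pose
    robot.replay()                         # from now on the routine simply repeats

The point of the sketch is how little the robot communicates: once the waypoints are stored, the routine runs without any signal about what comes next.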

 

How can we physically interact without the need for a tablet interface?

 

Some attempts to answer these questions were presented in the research and technology hall. One example was KIT’s physical interface, which was attached directly to the robot. The combination of visual feedback indicating upcoming motions and a control function based on capacitive sensors made for an intuitive interaction: in every situation the user is aware of the machine’s intentions and gets immediate feedback as soon as they approach the cobot.
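The underlying idea can be sketched roughly as follows. This is a conceptual illustration only, not KIT’s actual implementation; the threshold value, the light signals and the feedback_state function are invented for the example. A capacitive proximity reading is mapped to a visual signal, so the user sees both that they have been noticed and what the robot intends to do next.

    APPROACH_THRESHOLD = 0.6  # normalized capacitive reading: 0 = far away, 1 = touching

    def feedback_state(proximity: float, next_motion: str) -> str:
        # Map a proximity reading and the robot's planned motion to a light signal.
        if proximity >= APPROACH_THRESHOLD:
            return "pulse blue: presence acknowledged, motion paused"
        return f"sweep green towards '{next_motion}': indicating the intended direction"

    # Simulated readings as someone walks up to the cobot.
    for reading in (0.1, 0.3, 0.7):
        print(f"proximity={reading:.1f} -> {feedback_state(reading, 'left')}")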

At the end of the day, the claim “connect & collaborate” was often limited to a robot handshake printed on a promotional pamphlet. Nevertheless, the fair and the discussions with exhibitors showed that there is great potential in defining what the collaboration between man and machine should look like in the future. There is a need for innovative solutions and for an authentic human-machine collaboration. A comprehensible interaction with distinct indications of the machine’s intentions builds trust, and that is the key to a reliable collaboration.