Computers are good at abstract thinking; we are all too keen to delegate complex calculations to them to free ourselves from that chore. Yet there is also something threatening about the intelligence of machines. Robots and artificial intelligence (AI) force us to question our place in the world. What does it mean to be human? Where does the boundary lie between man and machine? ‘What is man?’ the Enlightenment philosopher Immanuel Kant pondered. Our moral views on in vitro fertilization (IVF) have evolved considerably over the past decades, even to the extent that many people would now find it unacceptable to refuse an eligible couple (within certain rules, such as age limits) an IVF procedure in the Netherlands or Belgium. In this context, reference is made to techno-moral change: the modification of moral beliefs as a consequence of technology.
Die as a cyborg
Machine and body will become ever more intertwined. Philosopher James Moor asserts that while we are born as human beings today, many of us will die as cyborgs. Cyborg stands for ‘cybernetic organism’: partly human, partly computer. Moor’s claim is justified, even though ‘cyborg’ may sound like science fiction. A good example is the pacemaker, which is in fact a miniature computer; some pacemakers are even connected to the internet. There are bionic limbs too, such as bionic arms for disabled veterans or people with congenital disabilities, as well as exoskeletons for patients with full paraplegia.
Knee and hip prostheses, for example, are implants in the body that we have been familiar with for some time. These are not computerized technologies, yet our human dignity and integrity have not been altered by them; over time we have accepted these implants without any problems. Further developments, as yet unknown to us, may likewise broaden our sense of human dignity. Consequently, we should not be ‘automatically’ opposed to them.
Thanks to science and technology, human beings have been improving themselves for centuries, and the results are clearly apparent: we are living longer and healthier lives. The debate must now focus on ethical boundaries and problems. What is desirable? And what kind of cyborgs do we want to be? AI implants, for example, should not be accessible only to the happy few who can afford them, so that only they enjoy the benefits. The principle of justice is important for ensuring fair, democratic access to technology, and damage or risk of harm to the patient and third parties obviously needs to be curtailed.
Are we expendable?
How unique is humankind? Can we be replaced by robots and AI systems? AI researcher Rodney Brooks thinks we should rid ourselves of the idea that we are special. We humans are ‘just’ machines with emotions. Not only can we build computers that recognize emotions, we could eventually build emotions into them. According to Brooks, it will at some point even be possible to design a computer with real emotions and a state of consciousness. Still, he remains cautious and avoids predicting when that will happen. That is wise, because the brain is extraordinarily complex: too little is known about its specific workings, or about the very long evolution that preceded it, let alone about how to replicate it just like that.
During the ‘Digital Design’ DDW Talk, Sony presented its vision of the future relationship between humans and robots. “When robots have evolved so far that we feel like they’re alive, humans will begin to feel an affinity toward them,” says Rikke Gertsen Constein. “We really need to learn to coexist.”
Affinity in Autonomy
Constein is Global Art Director at the Sony Creative Center, where she has developed the Affinity in Autonomy design project. In this project, Sony doesn’t look at robotics from a functional perspective, as is customary at present, but from a human viewpoint instead. “We want to experiment on an emotional level,” Constein explains. “The project offers a more abstract vision for AI and robotics. We will be living alongside each other in the near future, but we really need to learn to coexist.”
What led to this project was the anxiety that artificial intelligence elicits in some people, the designer states. “So we got to work on familiarizing ourselves with the unknown.” The result is an interactive exhibition made up of five installations, each depicting a step closer to an affinity with robots. It starts with the awakening of an intelligence, which subsequently learns to respond to people and its environment. This ultimately leads to an emotional connection, or in the words of Constein, a ‘symbiosis between humans and robots’. The exhibition was on display at the Milan Design Week last April.
The other side
The designer underlines the urgency of this issue. “Robots will be given an essential role in our society. I sincerely believe that robotics will help people with the most important things in our lives. We’ve tried to flesh that out.” As a way of imagining a genuine relationship between humans and robots, Constein and her team have also thought about the robot’s side of things. “In order to find any real affinity, we also need to think about how things look from the other side.”
Robots as superheroes
In conclusion, Constein notes that we in Europe are far more cautious with regard to these matters than in Asia. “We are often highly critical in Europe. On the one hand, this is a good thing, because we do need regulations for the application of this new technology. But in Japan, for example, it has already been warmly welcomed. People are delighted, as if someone is helping them to make their lives more efficient. Robots are seen as a kind of superhero. Our project has also looked for affinity and empathy, instead of just looking at the more terrifying aspects.”
“Your sneak preview of the future” is the slogan of Innovation Origins, and that is exactly what we highlight in our Start-up of the Week column. Over the past few days, five start-ups of the day have been featured, and on Saturday we will choose the week’s winner.
We weigh various factors, such as sustainability, stage of development, practical application, simplicity, originality and the extent to which the start-ups are in line with the United Nations’ Sustainable Development Goals. They all pass in review here, and at the end of the week the Start-up of the Week will be announced.
SatAgro – A satellite’s eye for precision farming
As new technologies emerge, it is becoming easier for agricultural businesses to keep an eye on their land. In the past, everything had to be checked by hand and consequently a lot was overlooked. A combination of GPS equipment and sensors under the ground, on the land surface and in the air allows us to keep a very close eye on the growing crops without the need for human eyes.
SatAgro is exactly what you would expect from the name – a satellite that looks after crops. The farmer saves a lot of time by outsourcing monitoring and always knows exactly how much fertilizer, pesticide or phytohormone is needed at any given time. This system might also be a solution for people without green fingers whose garden is a mess …
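The article does not describe SatAgro's actual algorithms, but satellite crop monitoring commonly rests on vegetation indices computed from spectral bands. The sketch below illustrates the standard NDVI formula as a generic example; the reflectance values are made up for illustration.

```python
# Generic illustration of NDVI (Normalized Difference Vegetation Index),
# a standard measure in satellite crop monitoring -- not SatAgro's
# proprietary method. Healthy vegetation reflects strongly in the
# near-infrared (NIR) band and absorbs red light.

def ndvi(nir: float, red: float) -> float:
    """NDVI = (NIR - Red) / (NIR + Red); ranges from -1 to 1.
    Values above roughly 0.6 typically indicate dense, healthy crops."""
    if nir + red == 0:
        return 0.0  # avoid division by zero on empty pixels
    return (nir - red) / (nir + red)

# Example reflectance values for two pixels of a field (illustrative)
healthy = ndvi(nir=0.50, red=0.08)   # dense canopy
stressed = ndvi(nir=0.30, red=0.20)  # sparse or stressed crop
print(f"healthy: {healthy:.2f}, stressed: {stressed:.2f}")
```

Mapping such an index over a whole field, pixel by pixel, is what lets a service flag exactly where more fertilizer or water is needed.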
Glowingplaces – Let forlorn city spots shine once more
Eindhoven is not generally known as a picturesque place with beautiful historic buildings. Most people associate the Brabant city with desolate, abandoned buildings and with Philips. In the 1970s, no fewer than 40,000 people worked for the electronics company. That was almost half a century ago, however, and Philips gradually disappeared from Eindhoven over time. What remained were abandoned factory sites with little to be done with them besides demolition.
Sandra Poelman has shown that things can be done differently. She was one of the architects behind the renovated Strijp-S, which over the past decade and a half has developed from a sad, abandoned mess into one of the most innovative hotspots in the Netherlands. Now her expertise and experience are available to everyone through her start-up Glowingplaces, which helps turn sites from hopeless into hotspot. Poelman’s track record speaks for itself. Let’s just say that a certain sensational, unspecified site that publishes about start-ups is located at Strijp-S … we are not naming names here.
Zwolle, Oss and Bergen op Zoom have already started working with her, and she has received so many requests that there is now a queue.
Tofmotion – Robot security
Following transport, logistics and administration, security should become the next sector where machines take over from people. Technological tools in the security sector are nothing new, but if Tofmotion has its way, video surveillance will in future be carried out without the intervention of a single person.
LIDAR technology itself is not new, but Tofmotion has made it more accurate. Conventional scanning LIDAR works a bit like looking through the visor of a knight’s helmet: it can only scan the environment in ‘stripes’, which was never very reliable. Not so with the cameras from this Austrian company. They use so-called Time-of-Flight (flash LIDAR) technology, which emits a kind of electromagnetic pulse cloud that is analyzed immediately in order to determine whether anything deviates from the normal situation.
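The principle behind any time-of-flight sensor is simple enough to show in a few lines: light is emitted, bounces off an object, and the distance follows from half the round-trip time multiplied by the speed of light. This is a generic illustration of that physics, not Tofmotion's implementation.

```python
# Generic time-of-flight distance calculation, the physics underlying
# flash-LIDAR cameras (not Tofmotion's proprietary processing).

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting object in metres: the light travels
    there and back, so we take half the round trip."""
    return C * round_trip_seconds / 2

# A ~66.7-nanosecond round trip corresponds to an object about 10 m away
print(f"{tof_distance(66.7e-9):.2f} m")
```

The tiny time scales involved (nanoseconds for room-sized distances) are why ToF cameras need very fast, specialized sensor electronics rather than ordinary camera chips.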
Tofmotion sees itself as a pioneer in this field: its camera has already received an official safety certificate, and the team is eager to keep exploring the uncharted world of robotics and security. Should the ubiquitous V insignia on security staff uniforms soon disappear from the streets, this trio just might have something to do with it …
Tangany – Extra security for blockchain
Blockchain was quite the buzzword in 2017. No one really knew what it was, yet every self-styled innovation guru declared it the future and urged everyone to jump on board. Even government agencies are convinced that blockchain is the future for them as well; although they still don’t know exactly how and in which areas, it does sound good. Blockchain … a wonderful word that appeals to the imagination, obviously. ‘Cyber security expert’ Rian van Rijbroek even built a whole revenue model around it and was able to amplify her mind-bending message on the Dutch national news television program Nieuwsuur.
However, blockchain is a technology that must be taken seriously, and it offers plenty of advantages. The decentralized storage of data across numerous interconnected servers certainly has merit. And even though nobody yet grasps all the possibilities of this technology, Tangany promises concrete solutions, as well as ready-made products for companies that want to work with blockchain but don’t know exactly how. The Germans are still looking for funding, but believe that, as pioneers, they will make the potential of blockchain technology more accessible and discover innovative new applications.
E-Bot7 – Automated customer service
The team behind E-Bot7 wants to help telephone customer service enter the future by using artificial intelligence to ensure that customers are served faster and more effectively. As a result, queues of up to 45 minutes and frustrating repeat calls (due to unsolved problems) may become a thing of the past.
It is a self-learning system designed to handle complaints or queries that can be resolved using standard procedures. On average, around 90 percent of incoming calls to a telecom provider are handled by a computer. What is this percentage based on? The personal experience of this author, who in a grey past was once a ‘customer expert’ at a really friendly call center. In cases where more customization and expertise are needed, it will still be possible to transfer the call to a skilled human customer service representative. E-Bot7 is very much in its infancy at the moment and can only speak English and German. However, the German company has big plans and wants to expand the software step by step with new technologies, specialties and languages.
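E-Bot7's models and interfaces are not described in the article, but the triage pattern it relies on can be sketched: standard queries get an automated answer, everything else is escalated to a human. The keyword matcher below is a deliberately crude stand-in for the AI component, with made-up answers for illustration.

```python
# Minimal sketch of automated customer-service triage (a keyword-based
# stand-in for the AI -- not E-Bot7's actual software). Standard queries
# get an instant answer; anything unrecognized goes to a human agent.

STANDARD_ANSWERS = {
    "invoice": "You can download invoices from your account page.",
    "password": "Use the 'forgot password' link on the login screen.",
    "opening hours": "Our automated service is available 24/7.",
}

def handle_query(query: str) -> str:
    q = query.lower()
    for keyword, answer in STANDARD_ANSWERS.items():
        if keyword in q:
            return answer
    # No standard procedure applies: hand over to a person
    return "ESCALATE: transferring you to a human representative."

print(handle_query("Where can I find my invoice?"))
print(handle_query("My router catches fire when it rains"))
```

A real system replaces the keyword loop with a trained intent classifier, but the surrounding logic, answer-or-escalate, stays the same.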
Now, as a reader, you are probably thinking: “an insensitive robot on the line that is nothing more than a talking procedure … that’s not much different from the current situation, is it?” This is partially true, especially for certain companies involved in parcel delivery and unnamed government agencies burdened with issues such as tax, benefits or driving ability.
However, it definitely makes a difference: artificial intelligence is not familiar with the phenomenon of ‘office hours’, which means that you are able to get your affairs in order even in the middle of the night. Ideal! And that’s why we want to crown E-Bot7 with the honor of calling itself Start-up of the Week! Even though personal attention is lost, E-Bot7 – or technology that resembles it – certainly seems to be the future. What’s more, companies are already working on this en masse, but not on the same general, universal scale as this start-up, which can roll it out across many more sectors.
The need for customer service is greater than ever, yet this technology makes it cheaper and more efficient than ever before. And that’s how you save on both personnel and office costs. It’s a pity though that this technology means that thousands of call center employees will have to look for new employment in the coming decade.
Generally speaking, most people find the idea of workers being replaced by robots or software worse than jobs being taken over by other workers. But when their own jobs are at stake, people would rather be replaced by robots than by another employee. That is the conclusion of a study by the Technical University of Munich (TUM) and Erasmus University Rotterdam.
Over the coming decades, millions of jobs will be threatened by robotics and artificial intelligence. Despite intensive academic debate on these developments, there has been little research on how workers react to being replaced by technology.
To find out, business researchers at TUM and Erasmus University Rotterdam conducted 11 scenario studies and surveys with over 2,000 people from several countries in Europe and North America. Their findings have now been published in the renowned journal Nature Human Behaviour.
The threat to the feeling of self-worth
The study shows that, in principle, most people view it more favourably when workers are replaced by other people than by robots or intelligent software. This preference reverses, however, when people’s own jobs are concerned: in that case, the majority of workers find it less upsetting to see their own jobs go to robots than to other employees. In the long term, however, the same people see machines as the greater threat to their future role in the workforce. These effects can also be observed among people who have recently become unemployed.
The researchers were able to identify the causes behind these seemingly paradoxical results, too: People tend to compare themselves less with machines than with other people. Consequently, being replaced by a robot or a piece of software poses less of a threat to their feeling of self-worth. This reduced self-threat could even be observed when participants assumed that they were being replaced by other employees who relied on technological abilities such as artificial intelligence in their work.
“Even when unemployment results from the introduction of new technologies, people still judge it in a social context,” says Christoph Fuchs, a professor of the TUM School of Management, one of the authors of the study. “It is important to understand these psychological effects when trying to manage the massive changes in the working world to minimize disruptions in society.”
For example, the insights could help to design better programs for the unemployed. “For people who have lost their job to a robot, boosting their self-esteem will be less of a priority,” says Fuchs. “In that case, it is more important to teach them new skills that will reduce their concerns about losing out to robots in the long term.” The study could also serve as a starting point for further research on other economic topics, says Fuchs.
Self-propelled robots are increasingly common in industry. RUVU makes software that allows a robot to work autonomously, and in this way helps companies across various industries develop their robots. CEO Rokus Ottervanger: “I think that robotics is being used for barely ten percent of its potential at the moment.” That is something he wants to change with his start-up.
What do you do precisely?
“We supply software that enables robots to operate autonomously. Think, for example, of self-propelled carts that transport tomatoes in greenhouses or other products in a distribution center. Traditionally, these robots ride over a floor-mounted induction line or other fixed infrastructure, which is very rigid and therefore expensive to adjust later on. More and more flexibility is required in these kinds of processes, particularly in terms of the number of robots, the routes and the conditions they have to be able to deal with. We supply the software that allows a robot to work autonomously to companies that build these types of robots. The software is always wired into one of our customer’s products: that company takes care of the hardware and the sales of the machines themselves.”
“We always build our software out of standardized components. For example, a robot must always be able to determine its own position, work out how to get from A to B, and detect obstacles along the way. This is the same for a robot that has to do inspections outdoors as it is for a robot that operates inside a factory or distribution center. But what works for one robot does not necessarily work for another. That is why we have several software modules for each of these tasks and choose the one that is most suitable for the conditions the customer’s robot has to operate under. On top of that, we often need some kind of customization in order to meet the customer’s needs, such as a user interface, for instance. This way, we deliver customized solutions from standard components.”
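The "get from A to B while avoiding obstacles" module Ottervanger mentions is, at its core, a path-planning problem. The sketch below is a generic breadth-first search on a grid map, a textbook illustration of the idea, not RUVU's actual software; the warehouse layout is invented.

```python
# Minimal grid path planner illustrating the "A to B around obstacles"
# module described above (generic BFS sketch, not RUVU's software).
from collections import deque

def plan(grid, start, goal):
    """Shortest path on a 4-connected grid; '#' cells are obstacles.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []  # walk the parent links back to the start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

warehouse = ["....",
             ".##.",
             "...."]
print(plan(warehouse, (0, 0), (2, 3)))
```

Production planners use weighted variants (A*, D*) on maps built from sensor data, but swapping in a different planning module while keeping the same interface is exactly the kind of standardized-component design described in the quote.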
“We work in three stages in collaboration with our customers. We start with a proof of concept in which we examine, together with the customer, what is technically required for their product. We then present a prototype. During the second stage, the results can be tested by the initial users, and we incorporate their feedback into our software. As a result, we are able to produce a fully-fledged product in just a few months. In the final phase, we provide software licenses and support; the customer can then install and maintain the product themselves.”
Why did you start this business?
“I think robots are just super cool in the first place. Besides that, I see a lot of possibilities that are not being exploited at the moment. There is still relatively little automation in most companies, while comparatively simple and inexpensive solutions exist for automating parts of their processes. They can save a lot of costs that way. I think it’s great that the technology we’ve really mastered can be of added value for the user.”
What is the biggest obstacle that you have had to overcome?
“We had it pretty easy in the beginning. We had a free workplace and soon found our first customers. One of them was a company that wanted to build an inspection robot that, among other things, had to detect gas leaks at oil and gas plants. The other wanted a robot that could move racks in a distribution center for plants.”
“We were very busy with those first two projects. We assumed that this would result in more work, but that proved not to be the case. After those projects, we were out of work. Although I was aware of this eventuality, I hadn’t paid enough attention back then to attracting new customers. A time came when we didn’t have much work to do. That was difficult.”
How did you eventually remedy that?
“By providing a clearer picture of what we are doing. That way, customers are better able to find us and immediately know what to expect from us. In addition, we have started to build partnerships with companies that supply components for mobile robots, such as sensors, for instance. Their customers are our customers, although from a commercial point of view we don’t get in each other’s way. And that’s a great way to work together. For example, we work with a company that supplies GPS systems. If a customer of this company wants to build a robot, they knock on our door. Both the customer and the GPS supplier benefit from the fact that the systems work well together. This is what we deliver with our software. We complement each other very well in this respect.”
What has been the best moment you have had so far?
“I am happy when the customer is happy. The most wonderful moment is when a robot does what the customer had in mind. That feels like appreciation of our expertise. There’s a big technical aspect to that, but I’ve noticed that I am able to relish the entrepreneurial side of myself more and more. I’m a genuinely technical person by nature. I did mechanical engineering at Eindhoven University of Technology and graduated in control systems technology. When I work with a client on the business case for their new product, and on how we fit in with it, I think that’s very nice too. Then, of course, it’s fantastic when they finally sign the deal.”
What can we expect from you in the coming year?
“I want to double the number of team members in the coming year. There are three of us now. That’s great fun, but with six we can build up more diversity and broader expertise. After that we can grow a bit more, although the team shouldn’t get too big at this point. If we are all working on the same core software, we have to make sure that communication is flawless. If the team gets too big, you often see islands start to emerge, and you run the risk of inefficiency that way. Together with the team, I want to work on at least three separate robots in three different market sectors. So that means three different collaborations with other companies.”
“In the long term, I would find it really amazing if there were thousands of robots driving around the world with our software, making people’s lives easier, more fun and safer.”
Danqing Liu, assistant professor at the Department of Chemical Engineering and Chemistry of TU Eindhoven, is receiving more than 400,000 euros to develop smart surfaces that can secrete fluids and reabsorb them in response to light or to electric fields from their environment. These surfaces will be used to study friction during motion, for self-cleaning systems, and for robotic and health care applications.
The Dutch Research Council (NWO) has awarded three million euros to seven early-stage researchers in physics and chemistry through the START-UP program. Liu is one of them.
Secretion is a common phenomenon in nature. Human skin secretes oil to defend our bodies against bacteria and sweat to regulate our body temperature. Fish secrete mucus from their skin so as to protect against parasites and reduce friction from water in order to swim faster. Inspired by the skins of living creatures, Danqing Liu develops smart surfaces that can repeatedly release and reabsorb substances under environmental stimuli such as light and electricity.
Controlled release of liquid from surfaces is important for self-cleaning systems, where the released lubricant modifies surface moisture and repels various contaminants. It can also be used for biomedical purposes such as skin patches, which control humidity and slowly release antibiotics to heal wounds. And, in the not too distant future, smart surfaces could even be used as ‘artificial skins’ for robots.
The body produces heat
Walking, exercising, lifting objects or simply standing still. Every time we use our muscles, they produce heat as a by-product. The more we use them, the more they have to be actively cooled down. This is why we sweat. By sweating, water is pumped out of our bodies, and as that water evaporates, it cools us down.
In robots, especially humanoid robots that place high torque demands on their motors, generated heat presents a major constraint on performance. Currently, engineers solve this problem with fans or bulky radiators, which take up space and add mass. In the future, the smart surfaces developed by Liu might be used to create artificial skins that could ‘make robots sweat, cool down and perform better’.
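A back-of-the-envelope calculation (our own illustrative numbers, not from Liu's research) shows why evaporation is such a potent cooling mechanism compared with fans and radiators:

```python
# Rough illustration of evaporative cooling power. Evaporating water
# absorbs roughly 2.4 MJ of heat per kilogram near skin temperature
# (an approximate textbook value used here for illustration).

LATENT_HEAT = 2.4e6  # J per kg of water evaporated

def cooling_power_watts(grams_per_minute: float) -> float:
    """Continuous cooling power from a given evaporation rate."""
    kg_per_second = grams_per_minute / 1000 / 60
    return kg_per_second * LATENT_HEAT

# Evaporating just 1 gram of water per minute dissipates about 40 W,
# comparable to a small fan-cooled heat sink, with no moving parts.
print(f"{cooling_power_watts(1.0):.0f} W")
```

This is the same budget a sweating human exploits: a modest trickle of secreted liquid carries away tens of watts of waste heat, which is why an 'artificial skin' could be an attractive alternative to bulky radiators.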
With a broad background in disciplines ranging from electrical to mechanical and chemical engineering, Liu attempts to bridge the gap between molecular sciences, such as synthetic organic chemistry, and materials science. “I develop new materials like silicones, hydrogels and liquid crystal polymers, on sub-micrometer scales,” she explains. These materials are ‘responsive’, meaning that they can sense external stimuli and adapt to them via built-in sensory systems, which are either intrinsically present in the material itself or integrated as optical, electrical or chemical sensors.
The World Economic Forum (WEF) asked a group of international technology experts to identify this year’s Top 10 Emerging Technologies. After soliciting nominations from additional experts around the globe, the group evaluated dozens of proposals against a number of criteria. Do the suggested technologies have the potential to provide major benefits to societies and economies? Could they alter established ways of doing things? Are they likely to make significant inroads in the next several years? “Technologies that are emerging today will soon be shaping the world tomorrow and well into the future – with impacts to economies and to society at large,” said Mariette DiChristina, Editor-in-Chief of Scientific American and chair of the Emerging Technologies Steering Committee. In our constant search for the origins of innovation, IO will present WEF’s top 10 emerging technologies day by day. Today: social robots.
Like most robots, social robots use artificial intelligence (AI) to decide how to act on information received through cameras and other sensors. Advances in AI have enabled designers to translate psychological and neuroscientific insights into algorithms that allow robots to recognize voices, faces and emotions, interpret speech and gestures, respond appropriately, make eye contact, speak conversationally, and adapt to people’s needs by learning from feedback, rewards and criticisms.
As a consequence, social robots are filling an ever-expanding variety of roles, WEF concludes, and the examples are abundant. Pepper (from SoftBank Robotics), for instance, a humanoid more than a meter tall, recognizes faces and basic human emotions and engages in conversations via a touch screen in its “chest”. About 15,000 Peppers worldwide perform services such as hotel check-ins, airport customer service, shopping assistance and fast-food checkout. Temi (from Temi USA) and Loomo (from Segway Robotics) are next-generation personal assistants that provide a new level of functionality; Loomo, for instance, is not only a companion but can also transform on command into a scooter for transport.
TU Eindhoven’s social robot HERO won the 2019 world title by performing ‘challenges’ such as “Find Josja in the living room” and “Take out the garbage”. Such tasks may seem simple, but they still pose many challenges for robots: the robot not only needs to make a digital map of the space, it also needs to understand the task well, recognize objects such as benches and cans, and finally devise optimal strategies for the different tasks.
Social robots have particular appeal for assisting the world’s growing elderly population. The PARO Therapeutic Robot (developed by Japan’s National Institute of Advanced Industrial Science and Technology), which looks like a cuddly baby seal, is meant to stimulate patients with Alzheimer’s disease and others in care facilities and to reduce their stress. It responds to its name by moving its head, and it cries to be petted. Mabu (Catalia Health) engages patients, particularly the elderly, as a wellness aide, reminding them to take walks and medication and to call family members. Social robots are also gaining traction with consumers as toys. Early attempts to incorporate social behaviour in toys, such as Hasbro’s Baby Alive and Sony’s AIBO robotic dog, had limited success, but both are resurging: the most recent version of AIBO has sophisticated voice and gesture recognition, can be taught tricks and develops new behaviours based on previous interactions.
According to WEF, worldwide sales of consumer robots reached an estimated $5.6 billion in 2018 and the market is expected to grow to $19 billion by the end of 2025, with more than 65 million robots sold a year. “This trend may seem surprising given that multiple well-funded consumer robot companies, such as Jibo and Anki, have failed. But a wave of robots is lining up to take the place of defunct robots.”
For the first time, both the soccer robots and the care robot of TU Eindhoven have become world champions. Robot football team Tech United beat Team Water from China in the final, and care robot HERO also scored the most points in its category. The RoboCup, held in Sydney this year, is the annual international tournament for autonomous (self-directed) robots.
Tech United had made it through the preliminaries without any noteworthy problems, but the final against Water from China was nerve-racking. Only in the last minute of the match did Tech United equalize, with a shot from Lieke Motors. Extra time remained exciting too: Tech United took a 5-4 lead through an action by Dominique Bluetooth, but it was not until a few moments before the end of the game that Lineth Rekenbrein made it 6-4. Team Water was strong in attack, and Tech United’s defence was put to the test several times. Vivianne Wielema proved a surprising trump card for the Eindhoven team: a new kind of soccer robot with not three but eight wheels.
Tech United also won the World Cup in the care robot category, finishing well clear of the runner-up. This is the first time the Eindhoven team has gone home with two world titles.
Special strategy per opponent
During the football matches, the students and researchers of Eindhoven University of Technology have no contact with their robots: they play completely autonomously. Team captain Wouter Kuijpers: “The team has trained on various strategies for game restarts, such as throw-ins and free kicks.” With a new piece of software, it is now possible to choose a strategy that is perfectly adapted to the opponent in question. In this way, a strategy tailored specifically to the Chinese rival Water could be used during the final; the two teams had met in the finals several times before, and both have won the championship several times.
Tech United also competed in the Domestic Standard Platform League for care robots. In this category all teams use the same robot, a Toyota HSR, but each team runs its own software. The Eindhoven HSR is called HERO. The robot excels in its ‘world model’, its digital representation of the world: it makes a 3D map of the walls, places all kinds of digital objects such as cabinets and benches in it, and a special piece of code captures, for example, that it is more convenient to stand in front of a cabinet than next to it.
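The "stand in front of the cabinet, not next to it" idea can be sketched in a few lines: store each object with a position and facing direction, and derive a navigation goal offset along that direction. This is a toy illustration of the concept, not Tech United's code; the furniture table is invented.

```python
# Toy sketch of a semantic world model (generic illustration, not
# Tech United's software): each object has a pose, and the robot
# derives a goal *in front of* the object rather than beside it.
import math

FURNITURE = {
    # name: (x, y, facing angle in radians) -- hypothetical map entries
    "cabinet": (2.0, 3.0, math.pi / 2),  # front face points along +y
    "bench":   (5.0, 1.0, math.pi),      # front face points along -x
}

def approach_pose(name: str, clearance: float = 0.6):
    """A point `clearance` metres in front of the object's face,
    with the robot turned around to look back at the object."""
    x, y, theta = FURNITURE[name]
    gx = x + clearance * math.cos(theta)
    gy = y + clearance * math.sin(theta)
    return gx, gy, theta + math.pi

print(approach_pose("cabinet"))
```

Encoding this kind of everyday knowledge explicitly in the map is what lets a task like "take out the garbage" turn into a concrete, reachable navigation goal.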
During the ‘challenges’, the robots received assignments via the messaging service Telegram, such as “Find Josja in the living room” and “Take out the garbage”. Such tasks may seem simple, but they still pose many challenges for robots: the robot not only needs to make a digital map of the space, it also needs to understand the task well, recognize objects such as benches and cans, and finally devise optimal strategies for the different tasks.
Until a few years ago, clothing served mainly to protect people, while also having its fashionable aspects. Meanwhile, our second skin can do more and more. Measuring body data such as pulse or calorie consumption with integrated sensors is old hat by now. Now, however, clothing is also taking on teaching functions through artificial intelligence: on the one hand as a trainer for humans, on the other as a programmer for robots.
The latest development comes from Turing Sense. For over three years, a team of 27 engineers and competitive athletes has worked on their vision of replacing complex video analyses of movements with digital technologies such as AI. Their goal: to teach complicated sports exercises in a timely, precise and effective manner. The result was officially launched recently. It is a yoga outfit with built-in sensors that connect to a virtual yoga studio through an app. Yoga videos by renowned instructors such as Brett Larkin, Kim Sin and Molly Grace are offered there. Almost as if the yogini were personally on site, she leads the selected yoga course. The i-Double scans the student’s execution of the asana, the yoga posture, and transmits it over Wi-Fi. It is then displayed as an avatar on the mobile or TV screen, so that users can compare their posture with the teacher’s while practising warrior, downward dog and the like. As an interactive app, the i-Yogini also reacts to voice commands such as “Freeze” or “Show me the camera”. But here is the clever part: when the user asks “How does this look?”, they receive a correction of the yoga position if necessary. The workout can thus be individually adapted to personal performance requirements.
Of course, the high-tech clothing also meets high demands in terms of comfort and functionality. It is even washable. The outfit, called Pivot Yoga and consisting of a shirt and pants for $99, is currently only available in the USA and Canada. The app currently works only in combination with iOS 11 and an iPhone 7 or higher. An Android app, as well as delivery to Europe, is planned.
Perhaps precisely because demanding yoga only has the desired effect on body and soul when executed precisely, another clothing manufacturer already specialised in smart yoga clothing in 2017: Wearable X. With its so-called Nadi X Pants, which are also connected to an app via Bluetooth, the yoga student receives haptic rather than visual feedback. Ten tiny, individually adjustable vibration units at the hips, knees and ankles indicate an incorrect position, and provide peace of mind when the position is correct. Wearable X’s smart pants are currently available in the USA, Canada, the EU (plus Switzerland and Norway) and Australia/New Zealand; they work under iOS and cost $249.
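The feedback loop behind both products can be pictured the same way: compare measured joint angles with a reference pose and signal the largest deviations. The following is a hypothetical sketch; the joint names, reference angles and tolerance are ours, not the vendors’.

```python
# Hypothetical pose-correction loop, loosely modelled on how smart yoga
# wear might work: compare measured joint angles (in degrees) with a
# reference pose and flag the joints that deviate too much.
REFERENCE_WARRIOR = {"hip": 90.0, "front_knee": 90.0, "back_ankle": 15.0}

def corrections(measured, reference, tolerance=10.0):
    """Return joints deviating more than `tolerance` degrees from reference.

    The value is the signed correction: positive means 'bend further'.
    """
    return {
        joint: reference[joint] - angle
        for joint, angle in measured.items()
        if abs(reference[joint] - angle) > tolerance
    }

# A pose with the front knee not bent enough triggers feedback at that joint,
# which a garment could translate into a vibration at the corresponding motor.
measured = {"hip": 92.0, "front_knee": 70.0, "back_ankle": 14.0}
```

In the visual variant the corrections would drive the on-screen avatar; in the haptic variant they would select which vibration units to activate.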
The Dresden-based start-up Wandelbots is currently working on an exactly opposite application of artificial intelligence. The company has developed software that enables robots to program themselves by imitating human movements, transmitted to them by, for example, a technician wearing smart clothes. This new technology should be 20 times faster and 10 times cheaper than conventional programming. The application can be seen, for example, in the Transparent Factory of VW in Dresden.
Wandelbots currently focuses on industrial robots. But if these prove themselves in the first practical tests, the technology could become a groundbreaking innovation: the application is so simple that in the future everyone could be able to program an individual robot, even without background knowledge. In addition to industrial assembly, conceivable areas of application include use at home and in nursing care.
BREMEN, December 15, 2018 – Robots have long been an important part of industrial production. They are also fulfilling more and more functions in logistics, especially in the internal logistics of industrial companies. In the past they were only found in stationary setups, but now there are more and more mobile and partly autonomous robots that transport individual components or search complex production plants for faults. This, however, also poses new challenges for the development of robots. The German Research Center for Artificial Intelligence (DFKI) is now investigating in a research project how users without expert knowledge can develop robot systems tailored to their requirements in the future.
Artificial intelligence helps to construct robots
Q-Rock is funded by the Federal Ministry of Education and Research with 3.17 million euros. The aim is to develop a solution with which even small and medium-sized companies can develop robots for their own purposes. “Q-Rock is an important step towards so-called ‘integrated AI solutions’. This approach will also enable people who are not AI or robotics experts to develop and deploy systems tailored to their own needs,” says Professor Frank Kirchner, who heads the DFKI Robotics Innovation Center.
Q-Rock uses artificial intelligence methods such as structural reasoning and machine learning. It also uses data from a previous project, which created a database for the development of robots. In addition to software modules, this database also contains hardware and behaviour models. The individual components are modularized and can, therefore, be combined within certain limits.
In the end, users should be able to access such a database and configure robots from the offered elements according to their specifications. In Q-Rock, the robot itself will be able to understand its capabilities based on its hardware structure.
Robots that understand themselves
Of course, the first thing the researchers have to do is develop special programs that can do this. They must first describe the capabilities of subcomponents, such as a sensor or a joint, before they can derive the capabilities of an overall system. Put simply, they need digital models of the individual robot components from which the entire machine is assembled. These components include not only hardware such as gripper arms, motors or sensors, but also modular software building blocks for controlling robot behaviour.
Q-Rock should thus lead to robots that are capable of understanding their own skills based on the hardware they are made of. The robot software would first determine the capabilities of individual components on the basis of a general description and then derive the capabilities of the entire system from this. This is how the robot learns what it can do. Conversely, these abilities can then also be stored as software modules, which are in turn added to the database.
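The core idea of deriving system capabilities from component capabilities can be illustrated in a few lines. This is a hypothetical sketch of the principle, not Q-Rock’s actual software; the component and capability names are our own.

```python
# Hypothetical illustration of Q-Rock's principle: a system's capabilities
# are derived from the capabilities of the components it is built from.
COMPONENT_CAPABILITIES = {
    "camera":     {"perceive"},
    "gripper":    {"grasp"},
    "wheel_base": {"move"},
    "arm_joint":  {"reach"},
}

def system_capabilities(components):
    """Union the capabilities of all mounted components."""
    caps = set()
    for component in components:
        caps |= COMPONENT_CAPABILITIES.get(component, set())
    return caps

def can_do(components, task_requirements):
    """A task is feasible if the assembled hardware covers every requirement."""
    return task_requirements <= system_capabilities(components)
```

A user’s requirements (say, `{"perceive", "grasp"}`) can then be checked against candidate configurations from the database, which is essentially how a non-expert could be guided to a suitable robot design.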
A user can now combine hardware and software components to create a complete robot system. Special prior knowledge is not required; it is sufficient to enter one’s requirements into the database. This way, artificial intelligence can help to build highly specialized robot systems with little effort in the future. These would normally be industrial robots, but the principles could also be applied to the construction of space probes or autonomous exploration robots.
HAMBURG, 9 December 2018 – Industry 4.0, or the fourth industrial revolution, is already changing not only industrial production but also the world of work. Digitalisation and the increasing use of adaptive robots and systems for production and logistics control make conventional job profiles superfluous and create new ones. However, there is still a lack of diagnostic tools to accurately describe the changes taking place and to make concrete suggestions for action.
Further training for job retention
The Scientific Association for Production Technology (WGP) wants to change this. The WGP is a national association of German professors of production technology. In its recently presented strategy paper Industrial Workplace 2025 (only in German), it presents an analysis model that examines the different levels of automation and shows where action is needed. “As an association of German professors of production technology, we want to contribute our know-how to make these changes as humane as possible,” says Professor Berend Denkena, chairman of the WGP and head of the Institute for Production Technology and Machine Tools IFW at the University of Hanover.
The aim is to prepare a company’s employees for innovations as early as possible by means of further training. “Companies can use this model to determine the degree of automation of their various production processes and to identify where action should be taken,” adds Professor Peter Groche, initiator of the WGP position paper and head of the Institute for Production Engineering and Forming Machines (PtU) at the Technical University of Darmstadt. Moreover, the analyses carried out with this model will allow conclusions to be drawn about what needs to change in educational institutions and training courses.
Five steps to Industry 4.0
For their analysis model, the scientists took the levels-of-autonomy model for autonomous driving as a template. The individual levels describe the degree of automation, i.e. the development stage towards a fully developed Industry 4.0. The material and information flows, the state of the plant and the respective production processes are examined.
At level 0, the production facility has no connection to other systems. The operators take care of the inflow and outflow of material and the information flow to the higher-level production system. At level 1, the system is already connected to a higher-level control system; however, material and information flows to other systems are still handled by the operators. At level 3, the production line is directly connected to other systems. The path taken by individual workpieces through the system is followed in real time by sensors, and the same applies to the flow of material from and to other plants. Delays are immediately taken into account in planning.
At level 4, the degree of networking of the production line is even higher: at times it can communicate autonomously with operators and other systems, and robots that work with and learn alongside people become part of it. Finally, level 5 describes a system that learns and interacts with the production and information systems inside and outside the factory and with the operating personnel. It would be the final stage of automation and digitisation.
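The step model lends itself to a compact encoding, for instance when a company wants to score its own production lines. The sketch below uses our own naming, based only on the level descriptions above (level 2 is not described in the text, so it is omitted here).

```python
from enum import IntEnum

# Compact encoding of the WGP step model as described above.
# Names are our own shorthand; level 2 is not described in the article.
class AutomationLevel(IntEnum):
    ISOLATED = 0      # no connection to other systems; flows handled manually
    CONNECTED = 1     # linked to a control system; flows still manual
    TRACKED = 3       # real-time sensor tracking of material flows
    INTERACTIVE = 4   # autonomous communication, collaborative robots
    LEARNING = 5      # self-learning, interacts inside and outside the factory

def needs_action(current, target=AutomationLevel.LEARNING):
    """A production process 'needs action' if it is below the target level."""
    return current < target
```

Scoring each production process this way would make it straightforward to list, per line, how far it still is from the targeted degree of automation.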
Conclusion: It doesn’t work without people
But even at this highest stage of development, people are still needed, although many qualified activities are also taken over by machines and self-learning computer systems. “Self-learning production systems must still be taught by skilled workers,” explains Professor Bernd-Arno Behrens, head of the Institute of Forming Technology and Machines (IFUM) at Leibniz Universität Hannover. “And autonomous subsystems of a production facility must be monitored and maintained. In addition, new business models based on data-supported services will emerge, as well as new professional profiles.”
Whether Industry 4.0 actually creates more jobs than it destroys remains to be seen
Industry 4.0 could make simple activities worthwhile again in this country. “It can be a considerable advantage for the entrepreneur to be able to oversee the entire process chain at one location,” continues Behrens. Up to now, domestic industry has kept only the high-quality jobs in the country and moved many other production steps to cheaper locations.
It remains to be seen whether Industry 4.0 actually creates more jobs than it does away with. But the WGP analysis model can help companies to identify areas where they need to further qualify their employees and thus maintain employment. This would have a positive effect on the quality of employment: in a global comparison, German companies can still score points with the high competence of their employees.
Effect of digitisation depending on the economic environment
Experts do not agree on the long-term consequences of digitisation and the introduction of intelligent machines for the labour market. While the WGP scientists are rather optimistic, others predict negative consequences. Last spring, Bitkom, the interest group of the German IT and telecommunications industry, warned that up to three million jobs could be lost in Germany by 2022. And in a 2013 study, the Oxford researchers Carl Benedikt Frey and Michael A. Osborne painted an even bleaker picture: according to their study, 47 percent of all jobs in the US are at risk of automation. The two refer to findings on the consequences of automation and mechanization. Today, not only manual work but also intellectual activities are being taken over by machines. Moreover, people’s qualifications in the future will probably be measured by their ability to work with intelligent machines.
And indeed, the WGP experts believe that this ability will be very important. At the same time, the much-discussed end of work is nowhere in sight. A study by the Mannheim Centre for European Economic Research (ZEW) even arrives at a positive employment balance. Although about five percent of the workforce was replaced by modernisation in production within five years, the companies also became more productive: they produced more cheaply and in larger quantities, and hired new staff elsewhere. This development also affected other sectors of the economy, where jobs were created. Ultimately, between 2011 and 2016, digitisation led to a one percent increase in the number of jobs.
However, this only works in a positive economic environment. In an environment characterised by stagnation and trade conflicts, the impact of digitisation is likely to be different.
There is an urgent need to prepare work-floor employees for working alongside a new generation of smart robots, says Josje Verbeeten, managing director and co-owner of Robot Academy, the training agency which was set up with the support of robot employment agency Smart Robotics and the province of Noord-Brabant. Ensuring that the province is robot-ready is a gradual process, though, and Verbeeten says the question remains whether employers are willing to allocate training budget to workers who in many cases are on temporary contracts.
In setting up Robot Academy 18 months ago, she commissioned research among 300 companies in the province to establish how far along they are in adopting robots in the production process, and to hear what their related needs are. One outcome of this research was evidence that companies need training to ensure that the transition involved in the innovation process is a healthy one, with sustainably positive outcomes.
Being mindful about adding robots
“I visited different factories and met stackers and packers from different countries and backgrounds who were unaware of their jobs being threatened. At the same time, it was clear that management saw robots as just another tool being introduced, without considering the impact on their most important asset, the human employee. All of these innovative changes have consequences for the workforce, and it is vital that HR consider what the changes will mean for the atmosphere on the work-floor and for organisational processes,” Verbeeten says.
She is full of optimism about the future of work with collaborative robots – cobots – and describes how the project captivated her from the start. “Eighteen months ago it was very new; I had an instant fascination and jumped into it. I believe in the power of robots – to improve working conditions, to take over increasingly complex work”.
Getting the best out of innovation
Preparing employees who may be resistant, uncertain or fearful about working with robots will be important in speeding up the innovation that the technology enables. “Robots can be a solution where it is difficult to find the right skills, or a way to replace dull and dirty jobs. Physically demanding work does not fit a time of increasing need for sustainable work. Besides, robots make it possible to include people with a physical disability.”
Robot Academy focuses its training on management as well as technical employees: For management and HR, it is about creating awareness and providing inspiration for company policy. For production employees, including operators, cleaners and administrative colleagues, it is about creating awareness, building experience and stimulating thinking about extending knowledge, Verbeeten explains.
As people in industrialised countries grow older and older, appropriate care is needed, but qualified personnel are scarce. The Technical University of Munich (TUM) wants to remedy this with a project: according to the researchers, robot assistants will support old people in their daily lives and thus enable a self-determined life.
Field studies will test whether robot assistants can actually provide the necessary assistance, and for this purpose the TUM opened the User and Research Centre Geriatronik in Garmisch-Partenkirchen. New robot assistants will be researched there on the premises of the former hotel management school. The scientists have two floors with about 700 square metres at their disposal. Initially, 15 researchers from the Munich School of Robotics and Machine Intelligence (MSRM) will work there; it is planned to increase the number of employees to up to 40 MSRM members. The Geriatronik centre is an important part of the MSRM Lighthouse Initiative, which focuses on the future of health.
Aid in everyday life
Geriatronik’s most important project is the two-armed robot assistant Garmi, which is to support elderly people in everyday activities such as getting up from an armchair. Garmi will also be used in telemedicine: during routine examinations or in emergencies, for example, physicians could act from a distance without delay, making remote treatments possible.
Garmisch-Partenkirchen model city
The opening of the research centre demonstrated how important the project is for the Free State of Bavaria and for Garmisch-Partenkirchen. State Minister Dr. Florian Herrmann, Head of the State Chancellery, and Dr. Sigrid Meierhofer, First Mayor of Garmisch-Partenkirchen, were present. The user and research centre is financed by subsidies from the Bavarian Ministry of Economic Affairs and by LongLeif GaPa gGmbH, which manages the assets the Garmisch-Partenkirchen market received from the Leifheit-Stiftung. The aim of the new research project is to turn the location into a model city for geriatrics.
LongLeif managing director Viktor Wohlmannstetter said about the project: “Our goal is to develop Garmisch-Partenkirchen into a model city for intelligent assistance robotics systems in old age on the basis of forward-looking senior care concepts”. Prof. Sami Haddadin, Director of MSRM, added: “The name says it all. We will carry out geriatric research here and bring the technology into contact with the people in Garmisch-Partenkirchen, who will benefit from our developments in the application.”
First eight years
The TUM project is initially planned to run for eight years. The hotel management school serves only as a temporary location, however; in the long term, the user and research centre will be housed on the planned LongLeif campus. The campus is to become a general training centre for nursing staff and house the new Caritas nursing home. LongLeif is currently still looking for a suitable site.
Photo: State Minister Dr. Florian Herrmann tested a shaving robot at the opening. In the background, Garmisch-Partenkirchen’s First Mayor Dr. Sigrid Meierhofer and Prof. Sami Haddadin are talking. (Picture: U. Benz / TUM)
Who is responsible when an autonomous car drives into a group of cyclists? No one has the answer to this yet. Is the manufacturer to blame? The team that wrote the software? The owner of the car? Or maybe the automatic system of the car itself? Who will decide?
With a court case against a robot, Robot Love tries to start a discussion about these questions. According to the case’s organisers, it is important that we as a society start thinking about the rights of robots, especially now that more and more forms of artificial intelligence are appearing that make human intervention superfluous. How far should we go in granting rights to robots?
The court case is part of the Robot Love event, which takes place from 15 September to 2 December, in the old Campina Factory in Eindhoven. More on the event here.
During the trial, which takes place on 4 October, a brothel keeper takes a sex robot to court. The robot threatens to post personal information about a violent visitor on social media; the brothel keeper fears reputational damage and asks the judge to prevent the robot from putting the story on the internet.
Hub Dohmen, a lawyer specializing in IT and intellectual property cases, defends the robot during the trial. Dohmen is trained as a technician and therefore ‘speaks’ two languages. He sees that the legal system is struggling with the rapid progress of various AI systems. According to him, it is impossible to predict where this technology will go, but that does not mean we should ignore the subject. How does he go about defending a robot? Dohmen: “For this, we need to apply a legal fiction in the proceedings: we assume that the robot has a separate legal personality.”
Dohmen explains that there are currently two kinds of legal subjects: natural persons (people) and legal persons, which include, for example, a BV (a Dutch private limited company). In the robot trial, a third form is added: that of robots, or as Dohmen calls it, “the artificial entity”. In this legal form, a robot has rights and duties just as people or companies have, but not necessarily the same ones. Robots should be able to order their own maintenance, for example, or be held liable if they cause damage. If a person breaks something, he has to pay; the decision-making power of a BV lies with the director, but the BV has to pay. So there is always a person to be identified who is ‘behind it’. But what about robots?
Dohmen: “Even with a BV you can’t say that a director decides everything. Sellers can sell things, HR takes on staff. They don’t have to ask the director for permission; that would create an unworkable situation. See what’s going on within ING now: a top executive steps down under great pressure from public opinion. But you could also look at the layer below him. Did they know what was going on? Can you identify them as people who were responsible? With artificial intelligence or robots, the software is written by a person. But because of all the data that the system collects, the program changes, that does not always involve a person. So the difference between a BV and a robot is not as big as it seems.”
Dohmen admits that robots are not (yet) able to make independent decisions. But he emphasizes that the robot trial is “not pie in the sky”. By this he refers to a proposal of the Legal Affairs Committee of the European Parliament to give robots some form of rights. “There is still a lot of disagreement about this proposal. There is a lot of fear of robots, and not unjustifiably so; after all, we do not know where it is going. It is not inconceivable that unskilled workers, but also judges or lawyers, will be replaced by robots in the future. Opponents sketch a science-fiction story based on emotion. We have to stay away from this and look at the facts. At the same time, this fear should be an incentive to be well informed in all areas: technical, philosophical and legal. By thinking about the different possibilities now, we can avoid some surprises.”
Dohmen does not think that robot rights will become a reality very soon: “It is not a black-and-white issue, where on one side of the scale you have human beings with all rights and duties and on the other side robots. It does not work that way; it is more complicated than that. With this court case, we want to take the debate to a higher level.”
The court case will be led by Mr Willem Korthals Altes. He worked as a lawyer, was a lecturer at the University of Amsterdam and has been active as a judge for over 22 years. For cases like this there is no legislation yet, so Korthals Altes has the difficult task of creating it himself.
After a very successful first round and a win in the quarterfinals, Team Rembrandts was stopped in the semifinals of the FIRST League in Florida. The Eindhoven-based team did win the Industrial Design Award, though.
Coen Claassen, Senior Designer at VanBerlo, and Martijn Baller, System Architect at VanBerlo, are visiting the Consumer Electronics Show (CES) in Las Vegas. In a daily diary, they share their vision and experiences at the world’s biggest gathering place for innovative products. Today part 4, by Martijn Baller.