Tomorrow is good: Human beings, machines with emotions?

Computers are good at abstract thinking; we are all too keen to delegate complex calculations to them in order to free ourselves from that chore. But there is also something threatening about the intelligence of machines. Robots and artificial intelligence (AI) force us to question our place in the world. What does it mean to be human? Where does the boundary lie between man and machine? “What is man?”, the Enlightenment philosopher Immanuel Kant pondered. Our moral views shift along with technology. Views on in vitro fertilization (IVF), for instance, have evolved considerably over the past decades – to the extent that many people would now find it unacceptable to refuse an eligible couple (within certain rules, such as age limits) an IVF procedure in the Netherlands or Belgium. In this context we speak of techno-moral change: the modification of moral beliefs as a consequence of technology.

Die as a cyborg

Machine and body will become more and more intertwined. Philosopher James Moor asserts that although we are born today as human beings, many of us will die as cyborgs. ‘Cyborg’ stands for ‘cybernetic organism’: partly human, partly computer. Moor’s claim is justified, even if ‘cyborg’ may sound like science fiction. A good example is the pacemaker, which is in fact a miniature computer; some pacemakers are even connected to the internet. There are bionic limbs too, such as bionic arms for disabled veterans or people with congenital disabilities, as well as exoskeletons for patients with complete paraplegia.

Knee and hip prostheses, for example, are implants in the body that we have been familiar with for some time. These are not computerized technologies, and our human dignity and integrity have not been diminished by them; over time we have accepted these implants without any problems. Further developments, as yet unknown to us, may likewise lead to a broader sense of human dignity. We should therefore not be ‘automatically’ opposed to them.

Thanks to science and technology, human beings have been improving themselves for centuries, and the results are clearly apparent: we are living longer and healthier lives. The debate must now focus on ethical boundaries and problems. What is desirable? And what kind of cyborgs do we want to be? AI implants, for example, should not only be accessible to the happy few who can afford them, so that only they enjoy the benefits. The principle of justice is important for ensuring fair, democratic access to technology. And damage, or the risk of harm, to the patient and third parties obviously needs to be curtailed.

Are we expendable?

How unique is humankind? Are we replaceable by robots and AI systems? AI researcher Rodney Brooks thinks we should rid ourselves of the idea that we are special. We humans are ‘just’ machines with emotions. Not only are we able to build computers that recognize emotions; eventually we could also build emotions into them. According to Brooks, it will at some point even be possible to design a computer with real emotions and a state of consciousness. Yet he remains cautious and avoids predicting when that will happen. That is wise, because the brain is extraordinarily complex. Too little is still known about how it actually works, or about the very long evolution that preceded it – let alone about how to replicate it just like that.

 

You decide: who will be our Start-up of the Month for October?

Innovation Origins also chose five more Start-ups of the Week last October. Now that November has arrived, we’re taking a moment to look back. After all, we still need to hand out our monthly trophy, and we depend on our readers to help us decide.

You and our editors will decide who will walk off with this wonderful honor. And to refresh your memory, here are the five weekly winners one more time!

Week 40: High Performance battery

Week 41: Etagrow

Week 42: Chakratec

Week 43: Hawa Dawa

Week 44: Vitibot

You can vote until 5 pm next Friday. The winner gets to be in the spotlight and earns eternal fame!


TU Munich: the embodiment of the H-1 robot

Scientists working with Prof. Gordon Cheng at the Technical University of Munich (TUM) recently gave the robot H-1 a biologically inspired artificial skin. With this skin (in humans, incidentally, the largest organ), the digital being is now able to feel its body and its environment for the first time. Whereas real human skin has around 5 million different receptors, H-1 has a total of just over 13,000 sensors. These can be found on the upper body, arms, legs and even on the soles of its feet. Their goal is to give the humanoid its own sense of a physical body. Thanks to the sensors on the soles of its feet, for example, H-1 is able to adapt to uneven ground and even balance on one leg.

But of far greater importance is the robot’s ability to safely embrace a human being. That is not as trivial as it sounds, as robots are capable of exerting forces that would seriously harm humans. During an embrace in particular, a robot comes into contact with a human being at several different points. From this complex data it must be able to quickly calculate the correct movements and the appropriate amount of force.

“This may be less important for industrial applications, but in areas such as healthcare, robots have to be designed for very close contact with people,” Cheng explains.

Biological models as a basis

The artificial skin is based on biological models in combination with algorithmic controls. The skin of H-1 is made up of hexagonal cells about the size of a €2 coin. The autonomous robot has a total of 1,260 of these cells, each equipped with a microprocessor and sensors that measure proximity, pressure, temperature and acceleration. Thanks to its artificial skin, H-1 perceives its environment in a much more detailed and responsive way. This not only helps it to move around safely; it also makes its interactions with people safer, as it is able to actively avoid accidents.

Event-driven programming frees up computing power

So far, the main obstacle in the development of robot skin has been computing power. Previous systems were already running at full capacity when evaluating data from several hundred sensors. Taking into account the tens of millions of human skin receptors, the limitations soon become clear.

To solve this problem, Gordon Cheng and his team chose a neuroengineering approach. They do not permanently monitor skin cells, but use event-driven programming. This allows the computational workload to be reduced by up to 90 percent. The key is that individual cells only pass on data from their sensors when measured values vary. Our nervous system works in a similar way. For example, we can feel a hat as soon as we put it on. Yet then we quickly get used to it and don’t need to give it any attention. We only tend to become aware of it again once we take it off or it gets blown away. Our nervous system is then able to concentrate on other, new impressions which the body has to react to.
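The event-driven idea can be sketched in a few lines. This is a hypothetical illustration, not TUM’s actual firmware: the cell class, the threshold value and the polling loop are all invented for the example. The point is simply that a cell stays silent unless its reading changes noticeably, so most cells generate no work on most cycles.

```python
# Sketch of event-driven skin cells: a cell only emits an event when
# its measured value differs from the last reported one by more than
# a threshold, instead of reporting on every sensing cycle.

class SkinCell:
    def __init__(self, cell_id, threshold=0.05):
        self.cell_id = cell_id
        self.threshold = threshold
        self.last_reported = None

    def read(self, value):
        """Return an event only if the value changed noticeably."""
        if self.last_reported is None or abs(value - self.last_reported) > self.threshold:
            self.last_reported = value
            return (self.cell_id, value)
        return None  # no event -> no bus traffic, no CPU work

def poll(cells, readings):
    """Collect events from all cells for one sensing cycle."""
    events = []
    for cell, value in zip(cells, readings):
        event = cell.read(value)
        if event is not None:
            events.append(event)
    return events

cells = [SkinCell(i) for i in range(4)]
# Cycle 1: everything is new, so every cell reports.
first = poll(cells, [0.2, 0.2, 0.2, 0.2])
# Cycle 2: only cell 3 is pressed harder; the others stay silent.
second = poll(cells, [0.2, 0.21, 0.2, 0.9])
```

With thousands of cells and mostly static contact, almost all readings fall inside the threshold, which is where the reported reduction of up to 90 percent in computational workload comes from.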

Prof. Gordon Cheng ©Astrid Eckert /TUM

Gordon Cheng, Professor of Cognitive Systems at TUM, designed the skin cells himself about ten years ago. However, the invention only reveals its full potential as part of a sophisticated system, which was recently featured in the specialist journal ‘Proceedings of the IEEE’.

More IO articles on this topic can be found here:

Top 10 Emerging Technologies (2): social robots

Could you love a robot?

Aquatic robot mimics motion of tuna to break speed record

A TU Delft graduate student in soft robotics at the faculty of Industrial Design Engineering has created the fastest swimming flexible robotic fish. Mimicking the movement of real fish, the prototype reaches speeds of up to 0.85 m/s – at least 27% faster than the previous record holder.

It all started when Sander van den Berg was looking for a graduation project that would make a significant contribution to robotics, but one that could also be carried out within the brief time span that he had for it.

He found that opportunity in the topic of oscillating fin propulsion (robotic fish). It is a very promising field that is still in its early stages of research, which meant that there was plenty of potential for an innovative design within a relatively short time frame.

“After reading a few papers, I soon saw that there was room for improvement, which I eventually managed to exploit using a single direct current (DC) propulsion system. This is the first system that uses a single DC motor to generate the precise swinging movements that a robotic fish needs in order to move faster. This got its top speed up to 0.85 m/s,” van den Berg told Innovation Origins.

The previous record was held by Jun Zhong and his associates, whose bionic robotic fish swam at a speed of 0.67 m/s back in 2017.

© Sander van den Berg

The flexible robot surpassed this record speed by swimming with a fluent S-shaped motion, similar to how a fish flaps its fins and tail. The part that is actively used for this is pulled from side to side by a single DC motor, while another section bends according to the resistance of the surrounding water.

Moreover, unlike conventional propulsion rotor blades, the whole system is watertight.

Building the underwater robot required 3D-printing rigid parts, a sheet of plastic for the compliant section that needed to be able to bend, and a soft silicone skin that gives the fish its hydrodynamic shape. In addition, computer modelling was used to program the precise movements of the fish.

According to the graduate student, this robot could achieve even faster speeds.

“The goal was speed, and a higher speed was accomplished. The system isn’t totally optimized yet, so an even faster speed might still be achieved, probably with the same prototype,” said van den Berg.

“It is important to note how important it is that our fish swims freely and that it is compliant. There are robotic fish out there that swim faster, but only when attached to a rig, for instance: they can’t turn their heads, which is why they can swim faster and better than if they were actually swimming around freely. The compliant design allows for a more fluent (more efficient) motion and uses a single motor (also more efficient, less costly and less complex). And, what’s really of paramount importance, it doesn’t harm the environment it swims in. If it hits something, it will just bend,” he added.

Van den Berg has now left the project, as he has since graduated. However, a new graduate student has been hired for the project and a new paper is on the way.

Moves Like a Fish

The way tuna swim served as the primary model in the creation of the robot.

“Thunniform swimming is well documented as one of the most efficient forms of swimming, if not the most efficient. It uses vortexes to create a peak thrust at the back instead of a simple reactionary force. This is called a reverse Kármán vortex street.”

© Sander van den Berg

What also appealed in this research was the fact that thunniform swimming activates only a small portion of the tail. This in turn allows a large portion of the body to carry a load. “This means that practical applications are possible. The gearbox system we designed is also quite wide, which turned out to be similar in width to that of a tuna’s torso. All in all, it was not designed like a tuna, but its capacity to carry a load, its efficiency and speed made it look and act like a tuna.”

What is its contribution?

When discussing exactly what this prototype has to offer, van den Berg said that “there are many short-term benefits. But the ultimate goal is to design a more efficient propulsion system as an alternative to most underwater rotor-blade propulsion systems.”

He does see potential in the route that this work is taking.

“Even before this can be achieved at a reasonable cost, there are already plenty of benefits to consider. I have already mentioned the reduced risk of harm to the environment. For instance, conventional rotor blades make a lot of noise and suck in debris. Sea life can easily be harmed by these fast and sharp rotating blades. Oscillation is much quieter, doesn’t harm anything it comes into contact with and doesn’t suck up any debris.” 
The way in which it respects wildlife makes it an ideal vehicle for research in this area.

On top of all that, he believes that oscillation systems can also benefit underwater drones or submarines, as they can increase efficiency in deep dives. This is mainly because the flexible fin oscillation system is completely watertight.

© Sander van den Berg

Yet what this machine offers is not limited to the underwater world. It just needs some extra research.

“The prediction model for how well the compliant section works, and the system’s design for higher speed, could perhaps also be applied to airborne drones (using an oscillation system). More uses that I haven’t yet thought of might be discovered by someone else.”

Consequently, the researcher states that it has major potential for efficiency in general. Even on water surfaces, fin oscillation can be 100% more efficient. This means that the possible areas of application are huge.

‘Europe must invest in a hub for collaborative robots in SMEs’

The European Union must have a robotics innovation hub where small and medium-sized enterprises will be able to test new collaborative robotic applications for their businesses. With this message, Bram Vanderborght of the Vrije Universiteit Brussel opened the expert discussion on the future of the collaborative robot industry during the European Innovation Days in Brussels last week.

It is expected that millions of euros from the Horizon Europe innovation fund will go towards the development of this hub. The Horizon Research Fund is worth a total of around 100 billion euros.

Not very many small companies are testing robots

According to Vanderborght, it is primarily large companies that are now taking part in European-funded innovative robotics projects with a view to improving their business processes. Smaller companies usually do not have the opportunities to do this. They miss the boat as a consequence.

The development of collaborative robots that enable companies to work faster, better or more comfortably is in reality still very much in its infancy here in Europe. There are no proper rules yet outlining what an autonomously operating robot may and may not do – such as what data it may or may not collect from its work environment and how it should handle that data.

Moreover, there is a problem with liability, Vanderborght states. Here in Brussels, an experiment is currently underway in a factory where robots are not allowed to do certain work because they cannot be insured against liability. Agreements have to be made with insurers on these matters.

‘Fear of robots is not warranted’

The chief technology officer of the Italian robot multinational Comau (part of FiatChrysler), Pietro Ottavis, states that collaborative robots will be widely used in the automotive industry in about ten years’ time. “By then they will be more accessible than they are today. They will be mobile, portable and intelligent,” he predicts. “Robots are tools for human beings. There’s nothing to be afraid of. After all, humankind has been using tools for more than 100,000 years.”

Examples of the kind of collaborative robots he is referring to are robots that help on the shop floor by packing products at temperatures below freezing. “Think about butchers. This is not a pleasant job for people. It is not healthy to have to work in a very cold room all the time.”

Lost suitcases at the airport

Another example that can relieve people of heavy work is a robot that helps sort and retrieve lost suitcases and bags at airports. “We are all familiar with that problem,” says Ottavis. Staff won’t have to do the heavy lifting any longer. What’s more, there’s a good chance that a robot will find something more quickly.

Professor Sigrid Brell-Cokcan of RWTH Aachen University in Germany, chairperson of the Association for Robots in Architecture, says that she has seen robots in China assisting in housing construction, and she anticipates this development for Europe as well. “The use of robots makes work in the construction industry safer. Almost one fifth of all accidents occur in the construction industry. It is safer to let robots do the heavy work. People who work there, such as construction workers, won’t have to suffer anymore from complaints with their lungs, hands or other body parts. They will no longer have to retire at the age of 50 because their body is worn out by heavy work.”

Robot construction workers

Robot construction workers should also make houses cheaper in the future. In addition, healthcare institutions will save money because they will no longer have to treat injured construction workers.

A major problem that European robot companies have yet to resolve is that there are no unequivocal rules governing the production of robots, nor the software used to program them. “We have to have these,” Vanderborght says. A robotic arm from one manufacturer must be able to connect to a robotic component from another manufacturer.

Uniform EU regulations needed for robots

The various robots must also be able to work together. “Their interfaces must be better connected so that they are able to work fast. At present, they are still working too slowly, which means that their productivity is not high enough,” says Minna Lanz, a Finnish professor of mechanical engineering at the University of Tampere. The robots also need to be able to comprehend the regulations themselves and work safely according to them. In the meantime, Europeans need to be trained in the use of robots and, if necessary, to overcome their fear of using them.

Competition from Asia

All parties concerned must be involved in drafting the regulations for robots: the companies that make robots, but also consumers who can thereby indicate what they find acceptable and what they do not consider to be acceptable. If these rules are not put in place in the near future, the EU could lose its leading position in the world of robotics to countries in Asia, EU senior official Lucilla Sioli says. According to Professor Vanderborght, the development of the European robotics market is therefore a top economic priority.

In search of more efficient agriculture using sensors and cameras

Photonics is a promising technology in almost all sectors, now and in the future. Numerous scientists are exploring the possibilities of this light technology. But what use is it to companies in other industries? How will they apply photonics? That’s what Photonics Applications Week is all about. This week will be dedicated to workshops, congresses, and exhibitions to show professionals from various disciplines what photonics can contribute in their sector.

Photonics can also be of great value to the agricultural and food industries. “In the agri-food sector, there is currently a looming threat of major shortages of production staff to do the work that has not yet been or cannot be automated,” says Ivo Ploegsma, who is responsible for scouting new technologies within the Food Tech Brainport innovation accelerator. The agri sector comprises agriculture – including arable and livestock farming – and horticulture. “That work is often outdoors or in a food factory where it is relatively cold. That means that people are more likely to choose another job.” This is one of the reasons for the sector to push ahead with automation alongside photonics.

Converting applications

Together with ZLTO, BOM, FME and REWIN, Food Tech Brainport is organizing an event during the Photonics Applications Week, supported by the agri-food innovation program of the province of North Brabant. “We are looking for applications in the agricultural sector that have been around for some time in the high-tech sector, such as more sensitive cameras or optical sensors,” says Ploegsma.

“We need to transform these into systems that work well in our sector. Like, for example, cameras that can withstand water and sand.” In order to achieve this, the agri sector must cooperate with the high-tech industry. “It’s just that these sectors speak totally different languages. As a result, the agri sector is unable to clarify their problems properly, and the companies in software, photonics and automation are unable to adequately explain their solutions. We want to make sure that the sectors understand each other better during this event and start looking for smart solutions together.”

Robotic eyes

The cameras or sensors, hence photonics, serve as a part of a robot or an automated system. “When you replace the hands of people who cut, pick or pack something, you also replace their eyes. A robot must be able to see what it has to pack or where it is going. That’s what the cameras are for.” These have many more possibilities than the human eye. “Using the latest camera technology, we can even look inside a tomato and measure certain values.”

A number of large farms are already well on the way to applying this technology. “A potato farmer from Brabant even has his own mini-airport and a permit to fly, which allows him to fly drones over his land,” Ploegsma explains. “But there are also still plenty of farms where people have to lie on the side platform of a slow-moving tractor in order to weed crops, for instance.”

Weed control and food production

The Photonics Applications Week event is also about weeding, among other things. “A robot or self-propelled tractor can easily weed when a field is empty. But if there are also potato plants in the field, the robot has to decide for itself what constitutes a weed and what does not.”

This makes the system complicated, according to Ploegsma. A camera takes pictures of the land, the robot analyzes them on site and decides whether each plant is a weed or not, and it then carries out a mechanical action: pulling the plant out or leaving it alone. “The system has to decide all this by itself. That’s why the machine has to be self-adaptive as well.”
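The sense-decide-act loop Ploegsma describes can be sketched as follows. This is a hypothetical illustration: the classifier here is a stand-in stub (a real system would run a trained vision model on the camera image), and all names and labels are invented for the example.

```python
# Sketch of the weed-control loop: classify each observed plant,
# then choose a mechanical action - pull weeds, leave crops alone.

def classify_plant(image):
    """Stand-in for a trained vision model (illustration only):
    here an 'image' is just a dict whose label is already known."""
    return image["label"]

def decide_action(image):
    """On-board decision the robot has to make by itself."""
    if classify_plant(image) == "weed":
        return "pull"
    return "leave"

# One pass over a (toy) row of plants in the field.
field = [{"label": "crop"}, {"label": "weed"}, {"label": "crop"}]
actions = [decide_action(plant) for plant in field]
```

The hard part in practice is the classifier, not the loop: swapping the stub for a model that works on real, variable outdoor images is exactly what makes the machine self-adaptive.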

The problems facing food processing companies are more or less the same. “Sometimes a robot with a camera system is already being used here. In many cases this is still difficult, because the products are not fixed in shape or move around freely during production. This makes it a lot harder for a robot,” explains Ploegsma. By making existing camera systems smarter or by developing new ones, a robot can respond better to variable products. According to him, there is still much to be gained in terms of quality, efficiency and data collection.

Bringing components together

This requires various companies with different expertise to work together. It takes one company to create the proper images, one company to provide software, analyze and classify data, and another company to build a machine or application that will eventually perform an action such as weed removal or relocating a food product. “The technical elements already exist. Now we need to combine them in the right way in order to be able to use the system on a large scale within the agri sector and the food industry.”

The participating organizations are looking for companies that will have developed a viable solution within six months to a year in order to make up for the shortage of production workers. “Research into new technological developments through universities, for example, remains necessary. Yet the agri-food sector is also looking for concrete applications that they will be able to use quickly,” Ploegsma says. “That is why we want to combine existing technologies so that we can create an affordable and reliable solution.”

Hunger for data

The demand for automation also has another driver: the hunger for data. “Data concerning food production is becoming increasingly important. More and more people have an allergy or a preference for certain foods. On top of that, consumers want to know more and more about the origin of their food,” says Ploegsma. All this has to be registered on the basis of data collected during the production process, which is often done during its automated stages.

Cooperation is essential

For the time being, the high-tech and agri-food sectors are two separate worlds. During Photonics Applications Week, Ploegsma wants to ensure that these parties have a better understanding of each other and will start working together on applications that they are able to implement quickly. “We hope that all the various companies will start projects together in order to achieve this. The organizing parties will supervise and support these projects where necessary. By doing this, we hope to be able to show the first results within just a few months.”

The Photonics Applications Week will run until October the 4th. You can find the program here.

Start-up of the day: artificial intelligence for safer rehabilitative therapy

Artificial intelligence in combination with sensors could be the future of telemedicine – this is how the mission of the start-up Aisens can be summed up. What do you do if you have invented an innovative technology, but the market suddenly changes and nobody wants your invention anymore? Adam, Jarosław and Piotr, the founders of the Polish start-up Aisens, faced exactly that kind of problem. All three of them studied automation and robotics together.

Later, their paths diverged. Adam ran his own business, while Jarosław and Piotr stayed at the university inventing new technologies. When they developed sensors for precise drone orientation in space, Adam joined them to help transform the invention into a business. However, when the market changed, the founders were left with interesting technology but without any idea of how to use it. Then Jarosław’s wife, a physiotherapist by profession, offered her help. From her, the founders learned all about the problems physiotherapists face every day. It turned out that the sensors, originally intended for drones, could be successfully used in the treatment of patients. That is how Orthyo was created: a system that supports safe, remote rehabilitation. The device recently entered the market after more than one and a half years of work.

What exactly is Orthyo?

Adam Woźniak, CEO: “Orthyo is a set of sensors and an application that is used in physiotherapy and rehabilitative therapy. It has two features. First of all, it supports diagnostics. Currently, physiotherapists, occupational therapists and orthopedists assess the mobility and range of motion of joints just by eye, or they use a goniometer, i.e. an adjustable protractor, whose accuracy depends on how precisely it is applied to the patient’s body. Orthyo makes these measurements precise. After placing sensors on the patient’s elbow or knee joint, the specialist collects parameters about the mobility of the joint. This type of examination can be carried out during the first visit to the clinic, but also during subsequent visits in order to evaluate the effectiveness of a therapy.”

And the second feature?

“We enable safe telerehabilitation. Nowadays patients often either don’t do their exercises at home at all, because they don’t remember them or are afraid of hurting themselves, or they exercise incorrectly and actually worsen their condition. Thanks to our device, it is possible to record a series of exercises for any given patient at the clinic – in a way, we personalize the therapy. The patient puts on the sensors at home and, while exercising, sees an avatar (e.g. a virtual hand or leg) on the screen. They can then compare their movements to the model movement and correct themselves if necessary. After completing a series of exercises, they can send the results to the server for the therapist.”
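One way such a comparison against the recorded model movement might look in code is sketched below. This is purely hypothetical: Orthyo’s actual algorithms are not public, and the function name, the joint-angle traces and the tolerance value are all invented for the illustration.

```python
# Hypothetical sketch: compare a patient's recorded joint angles
# (degrees, sampled at fixed intervals) against the therapist's
# model movement, flagging samples that drift outside a tolerance.

def compare_to_model(patient_angles, model_angles, tolerance_deg=10.0):
    """Return indices of samples that deviate too far from the model."""
    deviations = []
    for i, (p, m) in enumerate(zip(patient_angles, model_angles)):
        if abs(p - m) > tolerance_deg:
            deviations.append(i)
    return deviations

# Toy traces of one knee-bend repetition.
model = [0, 15, 30, 45, 60, 45, 30, 15, 0]
patient = [0, 14, 33, 60, 62, 44, 28, 16, 1]
flagged = compare_to_model(patient, model)
```

The flagged indices could then drive the on-screen avatar feedback, showing the patient exactly where in the movement they went off track.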

The best moments for the company have been …?

The first was when we received a grant of PLN 180,000 to develop the idea after just giving a PowerPoint presentation. The second was when we got into the Startupbootcamp Digital Health accelerator in Berlin – there were 3,000 entries from all over the world, and we made it into the top ten. That’s when we stopped having second thoughts. The third was when we signed a deal with an investment fund. Then other people started to believe that we were capable of building much more.

The most difficult time for the company?

There are always difficult moments. Sometimes the financing from investors didn’t keep up with our plans. The certification process for the medical device was not easy either, but finally we managed to complete it.

What are your plans for the next year?

By the end of the year we want to validate our business model. We expect to sell Orthyo Pro kits to clinics, and that patients will then rent Orthyo Home sensors for use at home. This year we would like to sell a few dozen sets to clinics in Poland and reach a few hundred patient rentals. In the next 12 months we would also like to approach foreign markets; we are already holding talks with foreign partners on this.

What do you want to achieve in 5 years?

We want to be a fully-fledged, self-financing company that operates on several continents and has its branches in several countries around the world, and have a portfolio which includes more products based on artificial intelligence and sensors.

 

All of our articles on start-ups can be found here.

 

 

Prep for RoboCup 2019: instead of training sessions – scan images

Two teams of five machines on two legs running after one ball, trying to kick it into the opponents’ goal – that is robot football. Even detecting the ball is a challenge for these players. Tim Laue from the University of Bremen explains the important role that deep learning plays in this, and what a training camp for robotic footballers looks like.

What role does Deep Learning play in robot football?

Deep learning is a technique that is good at identifying and classifying objects. Some mobile phones use it: if you search for “bicycle” in your picture gallery, for example, the program will find photos that show bicycles without you ever having tagged them. In robot football, it is essential to immediately and unambiguously identify your teammates, the playing field and, above all, the ball on the basis of the recorded video images. So we use deep learning for that.

Don’t the ball and the other players have a tracking device that makes recognition easier?

No. The ball used to be orange, and it was usually the only orange spot on the field. The robots simply had to look for this color value while processing images, which was comparatively easy. Today the ball has a black-and-white pattern and is easily confused with shadows on the ground or with parts of other robots. A classic image-processing approach, such as scanning for dark spots on a large white area, quickly reaches its limits here.

So, you are now teaching your robots the concept of the ‘ball’?

We are designing a so-called neural network and show it a large number of images that were previously categorized by humans. Along with each image, the network receives the information whether the image depicts a ball or not. This way, the network learns image by image what constitutes a ‘ball’. Normally you would define a list of properties that the software has to scan for.

With deep learning, we turn this process around and let the software derive from the images themselves the properties that make up a ball. There are two main influencing factors. On the one hand, there is the range of variation in the example material and the number of repeat runs with which we feed these images to the network. On the other, we also determine the depth of the network.
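The process Laue describes – labeled images in, learned features out – can be sketched in miniature. The following is an illustrative example only, not the Bremen team’s code: it trains the simplest possible one-layer ‘network’ (a logistic regression) on synthetic 8x8 patches in which a ‘ball’ is just a bright disc.

```python
import numpy as np

# Illustrative sketch only: the simplest possible one-layer "network"
# (a logistic regression) trained on synthetic 8x8 grayscale patches.
# A "ball" patch contains a bright disc; a negative patch is just noise.
rng = np.random.default_rng(0)

def make_patch(has_ball):
    patch = rng.uniform(0.0, 0.3, size=(8, 8))  # dim background noise
    if has_ball:
        yy, xx = np.mgrid[0:8, 0:8]
        patch[(yy - 3.5) ** 2 + (xx - 3.5) ** 2 < 6] += 0.7  # bright disc
    return patch.ravel()

# The labeled training set: as in the interview, humans supply the labels.
X = np.array([make_patch(i % 2 == 0) for i in range(400)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(400)])

# "Repeat runs" over the data: gradient-descent epochs.
w, b = np.zeros(64), 0.0
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted ball probability
    w -= 0.1 * (X.T @ (p - y)) / len(y)     # cross-entropy gradient step
    b -= 0.1 * (p - y).mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2f}")
```

A real ball detector uses a deep convolutional network and real camera images, but the ingredients are the same: human-labeled examples, repeated passes over the data, and weights nudged a little after each pass.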

How long does such training take?

In order to get reasonably satisfactory results, we need more than 20,000 images, of which only a fraction actually contain a ball, as well as hundreds or even thousands of repeat runs before the network captures the characteristics of a ball.

Nowadays, all teams in the RoboCup use this method because it produces pretty solid results, even when lighting conditions change and colors are different. However, you can still see robots running into the penalty area.

Why does it take so long to learn?

Computers are not as good at recognition as humans are. I can show a child a painting of a giraffe, and it is highly probable that, when it visits the zoo the next day, the child will recognize the unfamiliar creature with the long neck as a giraffe. A lot of processes happen when a child recognizes something.
The neural network has none of these abilities. That means you have to show it the ball in all its possible variations: far away, close up, half hidden by a teammate, shaded, brightly lit and so on. The more variations we are able to offer the system, the better. Always in the hope that the images cover as many of the positions on the playing field as possible where a ball can be placed. The network can then abstract and recognize ball appearances that lie somewhere between the examples shown.
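The variations Laue lists – far away, close up, half hidden, shaded, brightly lit – are exactly what data augmentation tries to cover automatically. A minimal sketch, with made-up parameters rather than anything the Bremen team actually uses:

```python
import numpy as np

rng = np.random.default_rng(1)

def augment(img):
    """Return a randomly varied copy of a grayscale image (values 0..1),
    mimicking the variations listed above: lighting, shading, occlusion."""
    out = img.copy()
    if rng.random() < 0.5:
        out = np.fliplr(out)  # the ball seen from the other side
    out = np.clip(out * rng.uniform(0.5, 1.5), 0.0, 1.0)  # darker / brighter
    if rng.random() < 0.3:
        out[:, : out.shape[1] // 2] = 0.0  # half hidden by a teammate
    return out

ball = rng.uniform(0.0, 1.0, size=(8, 8))
variants = [augment(ball) for _ in range(100)]
print(len(variants), "augmented variants generated")
```

Each original photo can thus be multiplied into many plausible views, which helps the network interpolate between the examples it has seen.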

Is the speed at which the image is processed during the game important as well?

The robot has to calculate: is the ball rolling, how fast is it rolling, which direction is it heading, where am I right now, what do I do now? After all that, it determines its course of action. If I really want to evaluate every single image, then I only have 16 milliseconds to do all of those calculations for each one. And we do want to process every single image. There may be a crucial piece of information hidden in an image, so missing even one is not a good idea.
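The 16-millisecond budget follows directly from the frame rate. Assuming the robot receives around 60 images per second in total (for instance, two cameras at 30 frames each – an assumption, not a figure from the interview), each image must be fully processed before the next arrives:

```python
# Assumed rate: ~60 images per second in total.
images_per_second = 60
budget_ms = 1000 / images_per_second  # time available per image
print(f"per-image budget: {budget_ms:.1f} ms")
```

Every algorithm in the pipeline – detection, tracking, decision making – has to fit inside that window.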

A next step could be to link the optical information with other properties.

That would open a whole new can of worms. A computer is good at calculating things, but first you have to convert everything into quantifiable information. You are actually able to do that quite well with images these days. At the moment there is no other way around this when it comes to robot soccer.

How does the robot decide where to look? Do NAOs not have a 360-degree camera?

That is the eternal question. Should all robots always have the ball in sight or should the tasks be distributed across the team? There are a few software programs that try to answer this question. In fact, you frequently see on the video recordings that robots sometimes miss something important.

Mr. Laue, do you play football yourself?

No, I watch football, but I have no talent whatsoever myself. At present these two areas are still too far removed from each other. With real football knowledge, you’d be more likely to get in the way than be able to help.

Yet the long-term goal for the robot community is to one day really get robots to compete against a human team?

Somebody set a goal for 2050. I can’t say whether this is realistic.

Nevertheless, one thing is clear: compared to other challenges, football is still a very simple discipline. We have an entire Collaborative Research Center at the University of Bremen that is working on how to let robots act sensibly in a domestic environment. This is highly complex because this environment is far more unstructured than football. As a human being, I can also cook in an unfamiliar kitchen. When a robot enters these kinds of environments, it gets really complicated.

Photos: Tim Laue, University of Bremen

Start-up of the day: cylindrical transmission from IMSystems for high-precision robotic arms

IMSystems is a promising start-up from Delft that is developing a new generation of transmission systems for robotic arms. As a result, robots will soon be able to work with much more precision, which means that the human hand will probably no longer be needed for the manufacture of highly sophisticated electronics such as mobile phones.

How is the IMSystems method put together?

“The working principle behind existing transmissions is that they convert the rotational movement of a fast-running motor into a slower but more powerful movement. Due to the mechanical leverage effect that takes place in this process, the power of a particular application, such as a robot arm, is increased. Existing transmissions usually use gears to achieve this effect. Our product does not contain gears but rather hollow cylinders of hard steel (with larger cylinders located on the ends) that rotate around an electrically driven solid cylinder, which also causes this leverage effect to occur. Our transmission system is called the Archimedes Drive, named after the ancient Greek mathematician and engineer Archimedes, who discovered the leverage effect. Archimedes said: ‘give me a lever that is long enough and I will be able to move the earth’. What we do with the Archimedes Drive is put a very large lever in a very small package. That’s what every transmission does. Our innovation is that we don’t use gears like existing transmissions do.

Gears can cause a lot of problems. For example, there is always some slack between the sprockets. This is unavoidable. No matter how small that amount of slack may be, a robotic arm is not able to move as accurately because of it. If you want to assemble a mobile phone, the computer needs to know where precisely the robotic arm should place the parts, down to the micrometer. Now the robot has to re-adjust itself each time in order to approach that level of precision. And in the end, they still do not achieve the result that the manufacturer wants. If you look at mobile phone development, you see that this creates a stumbling block in manufacturing. The phones have become thinner over the years, but not much more than that has actually changed, even though they have been around for a long time. You can also see that the hands of many people are still involved in the manufacturing process, because robots are as yet incapable of competing with human productivity.

With the Archimedes Drive, the rotating hollow cylinders are pressed up against each other under very high pressure. This creates so much grip between the parts that you get the same effect as with interlocking gears, except that there is no slack between the cylinder surfaces as there is between sprockets. This allows us to achieve the precision that is required for meticulous assembly.”
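The reduction such a friction drive achieves follows the same geometry as a planetary gearbox: smooth rollers take the place of toothed gears, so there are no teeth between which slack can arise. A rough sketch of the arithmetic, with hypothetical radii rather than IMSystems’ actual design data:

```python
# Planetary-style reduction: with the outer ring held fixed and the
# driven inner cylinder as input, ratio = 1 + R_ring / R_sun.
# These radii are made-up illustration values, not IMSystems' design.
sun_radius_mm = 5.0    # the electrically driven solid cylinder
ring_radius_mm = 45.0  # the outer hollow cylinder
reduction = 1 + ring_radius_mm / sun_radius_mm
print(f"reduction ratio: {reduction:.0f}:1")
```

The ‘very large lever in a very small package’ is this ratio: a fast, weak rotation at the center becomes a slow, strong one at the rim.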

What is the motivation behind the development of this new system?

“Our founder, Jack Schorsch, was once commissioned in the USA to produce lighter prostheses for active use. These included artificial arms that could move with the help of internal motors and transmissions. These transmissions were so heavy that a child could not cope with them. One of the causes was the presence of solid steel gears. Jack Schorsch realized that many machines suffer from overly heavy, inadequate gear transmissions. That’s why he devised a new system: the Archimedes Drive.”

What has been the greatest challenge for IMSystems?

“At the moment, this is to prove that the Archimedes Drive has a long lifespan. Customers prefer a 20-year lifespan whereby something is up and running 23 hours a day. Since there is always a time for maintenance or a time when the production line is at a standstill, 24 hours a day is unrealistic. Of course, you don’t have 20 years in which to find that out. You can speed up the testing process by running a test machine at maximum power that contains the Archimedes Drive, when it is actually intended for medium use with occasional peaks. This allows you to find out within a month whether the transmission will last for 20 years. But even then, testing the lifespan remains a challenge. If something breaks down, you have to look for the flaw in the design. Then the engineering team has to discuss how to solve the problem. There is a huge amount of thought, calculation and draftsmanship involved. It can take up to three months. After that, the manufacturer has to make the new components. That also takes another three months. After that, we need to test the new transmission for a further three months. In the meantime, we have to pay the rent and the employees.”

The IMSystems team. Photo: IMSystems

Which was the most rewarding moment for you?

“That was two and a half years ago, when we received the first proper transmission made of steel with which the working principle was proven. Before that, professors told us that our product was ‘theoretically not possible’ and ‘practically infeasible’, because they thought our transmission would get too hot and would slip. But that didn’t happen. From that moment on, of course, we knew that it would take years before we would have a fantastic, well-developed end product. But that’s just how it is in the transmission world. The last real breakthrough was sixty years ago. Today’s most advanced robots use the same technology as the cart that drove around on the moon decades back. Ridiculous really. If you look at the development of software, you see that computers have become much smarter. But robot hardware is still exactly the same.”

What can we expect from IMSystems in the future?

“Soon we will receive a large investment which should give us a boost in developing the Archimedes Drive further. And in 2020, we want to market a development kit that allows manufacturers to test whether they will be able to use the Archimedes Drive or not. That way they will be able to try out the technology in advance. If they are satisfied, we can see if a custom made transmission can be made in due time. This is how we want to build up a customer base. Many manufacturers of electric bicycles, robotic vacuum cleaners, windmills and mobile phones, for example, have their own research and design department that is always on the lookout for the latest technology. That’s what we’re focusing on now. We hope that the finished product of our Archimedes Drive will be ready within a few years. Then we will be able to deliver it on a large scale.”

IMSystems is one of the twenty start-ups that the pan-European network RobotUnion has nominated in July for a prize of up to 223,000 euros. The next round of this European start-up competition is in October.


Interested in start-ups? An overview of all our articles on this subject can be found here.

Start-Up of the Week: Play the piano and say bye bye to Parkinson’s?

“Your sneak preview of the future” is the slogan of Innovation Origins, and that’s just what we will highlight with our Start-up of the Week column. Over the past few days, five start-ups of the day have been featured and on Saturday we will choose the week’s winner.

Innovation Origins presents a Start-up of the Day each weekday

We shall consider various criteria, such as sustainability, developmental phase, practical application, simplicity, originality and the extent to which they are in line with the Sustainable Development Goals of the United Nations. All of them will be featured here and at the end of the week, the Start-up of the Week will be announced.


Up Stream Surfing – Surfing fun in every river

The coasts of the Bay of Biscay, Hawaii and Jeffreys Bay in South Africa are visited by people from all over the world so they can indulge in their favorite sport – surfing. Why do people from all parts of the world go there? Because that’s where the waves are the best! Although the team behind Up Stream Surfing can’t really bring Hawaii to the big city, they have developed a technology that recreates those waves.

This means that every river can be turned into a surfing zone, which is a great solution, especially in big cities with large rivers. Residents no longer have to travel far for their surfing experience and can simply paddle in the local waters. For really high waves you will still have to go to the hot spots, of course. The mobile system, consisting of a pulley block and an underwater sail connected to a bridge, pulls the surfer forward and allows them to practice their sport wherever they want to.

Sewts – Manufacturing clothing without any manual misery

Most garments travel around the world before they end up on your body. From cotton plantations in the United States to weaving mills in India. Subsequently, children’s hands are often used to make the final products under appalling conditions in Bangladesh. After that, the clothing items go on another long journey before ending up on shop shelves in Western countries.

This could all be done with far fewer airmiles and far less child labor; that’s what they thought at the German start-up Sewts. What they want to do is bring textile manufacturing back to Western countries so that machines can take over the work currently being outsourced to low-wage countries. This is a lot more sustainable and also ensures that the children in Bangladesh no longer have to work in sweatshops. They might simply finish school and later order a piece of clothing from Europe, which they can pay for because they are no longer part of the manufacturing process.

Ruvu – automation is something you can learn

Perhaps Sewts and Ruvu could work together, because they have something in common. So, what does this Eindhoven team actually do? They build robots! Well, there are many more robot builders around, but what the Brabanders behind Ruvu do differently is that they provide custom-made solutions for the logistics sector. Because every logistics process is different, there is no such thing as a one-size-fits-all solution for all of the companies that make things or process orders.

Their ultimate goal? Fully automated factories and distribution centers that ensure that the entire production chain will soon be made up of very tough high-tech workers who never get tired and who don’t need a collective labor agreement, vacation days or a salary. If this were to happen on a larger scale, manufacturing processes would become more efficient and economical. Plus supporters of a Universal Basic Income would have an additional argument that would strengthen their vision of the future.

Tripstix GmbH – Inflatable paddleboards

It seems as if this week is all about automation and surfing, because the Tripstix GmbH plan also fits perfectly into this theme. Although surfing is, of course, a popular pastime for many people, transporting surfboards from A to B is definitely not. The paddleboards are not exactly compact or handy in size and therefore are not easily carried around in a car.

However, a board is indispensable if you want to catch a few waves on the Hawaiian coast – or on a river in Zurich with Up Stream Surfing’s technology. Tripstix GmbH has developed an inflatable version for this reason, with technology that resembles the vacuum packaging sometimes used for coffee. Do you remember Tellsell’s Aerobed? Something like that. And not entirely insignificant either: according to the German makers, this inflatable feature does not come at the expense of quality.

Tripstix and Upstream Surfing should probably get together for a cup of coffee, because together, they could create the ultimate pop-up surf experience without the need to endlessly lug around surfboards – and even without the sea.

The Sáncal Method – musical medics

It is still a bizarre fact that people know more about the universe and galaxies that are millions of light-years away from us than about what exactly takes place in our own upstairs department. It is known that people and music go together like cookies and cream and that there is virtually no-one who does not care about music. Yet there is more: the Spanish start-up Método Sáncal developed a method for tackling neurological disorders such as Parkinson’s and Alzheimer’s disease with music. Yes, you heard that right!

How does the method work exactly? Playing a musical instrument stimulates certain parts of the brain which are susceptible to neurodegenerative diseases. It is actually a kind of brain gymnastics that can be used by young and old in order to prevent problems. This treatment is not only meant for elderly people who are able to play the piano; everyone can benefit from the healing tones of this Spanish method. The barrier to entry is low and everyone can participate for just a few bucks.

This start-up proves that you are never too old to learn and that there are alternatives to pills. It also shows that self-expression, neurology and technology form a very amazing bond. Although auditory medicine is still in its infancy, it would of course sound like music to your ears if a few clever piano lessons were able to make sure that no one would suffer from dementia anymore. This creative combination and its concrete application options mean that this week Método Sáncal may call itself *drum roll!!* Start-up of the Week according to Innovation Origins.

Start-up of the Day: ‘Autonomous robots make life easier, more fun and safer.’

Self-propelled robots are increasingly common in industry. RUVU makes software that allows a robot to work autonomously. That’s how they help various companies in various industries with the development of a robot. CEO Rokus Ottervanger: “I think that robotics has been used to barely ten percent of its potential at the moment.” That is something he wants to change with his start-up.

What do you do precisely?

“We supply software that enables robots to operate autonomously. For example, think about self-propelled carts for transporting tomatoes in greenhouses or other products in a distribution center. These robots traditionally ride over a floor-mounted induction line or other fixed infrastructure. These are very rigid and therefore expensive to adjust later on. More and more flexibility is required in these kinds of processes. Particularly in terms of the number of robots, routes and conditions that they have to be able to deal with. We supply software which allows a robot to work autonomously to companies that build these types of robots. The software is always wired into one of our customer’s products. That company takes care of the hardware and the sales of the machines themselves.”

“We always build our software out of standardized components. For example, a robot must always be able to determine its own position, be able to think how to get from A to B and be able to detect obstacles along the way. This is the same for a robot that has to do inspections outside, as it is for a robot which operates inside a factory or distribution center. But what works for one robot does not necessarily work for another. That is why we have several software modules for each of these tasks. We choose from these the one that is most suitable for the conditions which the customer’s robot has to operate under. As well as that, we often need some kind of customization in order to meet the customer’s needs. Such as an interface for users, for instance. This way, we deliver customized solutions with standard components.”
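The ‘think how to get from A to B’ module Ottervanger mentions can be illustrated with the most basic planner there is: breadth-first search on an occupancy grid. This is a generic textbook sketch, not RUVU’s software; real products use far more sophisticated planning and localization.

```python
from collections import deque

# Minimal illustration of the "get from A to B" task: breadth-first
# search on a small occupancy grid. S = start, G = goal, # = obstacle.
grid = [
    "S..#.",
    ".#.#.",
    ".#...",
    "...#G",
]

def shortest_path_length(grid):
    rows, cols = len(grid), len(grid[0])
    start = next((r, c) for r in range(rows) for c in range(cols)
                 if grid[r][c] == "S")
    frontier = deque([(start, 0)])
    seen = {start}
    while frontier:
        (r, c), dist = frontier.popleft()
        if grid[r][c] == "G":
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append(((nr, nc), dist + 1))
    return None  # no route around the obstacles

print(shortest_path_length(grid))
```

On this map the planner finds a 7-step route around the walls; swapping the hand-written grid for a map built from sensor data is where the real engineering begins.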

[youtube https://www.youtube.com/watch?v=JwD-hRu2ItA]

“We work in three stages in collaboration with our customers. We start with a proof of concept in which we examine, together with the customer, what is technically required for their product. We then present a prototype. During the second stage, the results can be tested by the initial users. We subsequently use this feedback in our software. As a result, we are able to produce a fully-fledged product in just a few months. In the final phase, we provide software licenses and support. The customer can then install and maintain the product themselves.”

Why did you start this business?

“I think robots are just super cool in the first place. Besides that, I see a lot of possibilities that are not being exploited at the moment. There is still relatively little automation in most companies. While there are comparatively simple and inexpensive solutions for automating parts of their process. They can save a lot of costs like that. I think it’s great that the technology we’ve really mastered can be of added value for the user.”

What is the biggest obstacle that you have had to overcome?

“We had it pretty easy in the beginning. We had a free workplace and soon found our first customers. One of them was a company that wanted to build an inspection robot. Among other things, it had to detect gas leaks in oil and gas factories. The other was a robot that could move racks in a distribution center for plants.”

“We were very busy with those first two projects. We assumed that this would result in more work, but that proved not to be the case. After those projects, we were out of work. Although I was aware of this eventuality, I hadn’t paid enough attention back then to attracting new customers. A time came when we didn’t have much work to do. That was difficult.”

How did you eventually remedy that?

“By providing a clearer picture of what we are doing. That way, customers are better able to find us and immediately know what to expect from us. In addition, we have started to build partnerships with companies that supply components for mobile robots, such as sensors, for instance. Their customers are our customers, although from a commercial point of view we don’t get in each other’s way. And that’s a great way to work together. For example, we work with a company that supplies GPS systems. If a customer of this company wants to build a robot, they knock on our door. Both the customer and the GPS supplier benefit from the fact that the systems work well together. This is what we deliver with our software. We complement each other very well in this respect.”

What has been the best moment you have had so far?

“I am happy when the customer is happy. The most wonderful moment is when a robot does what the customer had in mind. That feels like appreciation of our expertise. There’s a big technical aspect to that, but I’ve noticed that I am able to relish the entrepreneurial side of myself more and more. I’m a genuinely technical person by nature. I did mechanical engineering at Eindhoven University of Technology and graduated in control systems technology. When I work with a client on the business case for their new product, and on how we fit in with it, I think that’s very nice too. Then, of course, it’s fantastic when they finally sign the deal.”

What can we expect from you in the coming year?

“I want to double the number of team members in the coming year. There are three of us now. That’s great fun, but the six of us can build up more diversity and a broader expertise. After that we can grow a bit more, although the team shouldn’t get too big at this point in time. If we are all working on the same core of software, we have to make sure that communication is flawless. If the team gets too big, you often see that islands start emerging. You run the risk of inefficiency that way. Together with the team, I want to work on at least three separate robots in three different market sectors. So those are three different collaborations with other companies.”

“In the long term, I would find it really amazing if there were thousands of robots riding around the world with our software. They would make people’s lives easier, more fun and safer.”

20 robotics start-ups in with a chance to win 220,000 euros this autumn

Twenty start-ups in robotics have a chance this autumn to win a total of 220,000 euros from a European subsidy pot as part of the Horizon 2020 innovation program. This will enable them to pay for the further development of their start-up. Among them are the Delft-based company IMSystems, which is developing a new type of gearbox for production robots, and the Amsterdam-based company MX3D, which produces software for robotically-controlled steel printers.

The competition was set up by the pan-European network RobotUnion to encourage the development of innovative robotics. The network brings together partners from six EU member states (the Netherlands, Denmark, Finland, France, Spain and Poland) and Canada. Educational institutions such as the Industrial Research Institute for Automation and Measurements in Warsaw, the Danish Technological Institute in Odense and TU Delft are all partners of RobotUnion, as are a number of large companies such as Ferrovial, the Spanish urban infrastructure company.

A total of 206 start-ups from 33 countries registered for the RobotUnion competition. Of these, 44 were allowed to give a presentation in July. A jury of experts selected 20 of these, each of whom received 4300 euros in cash for the further development of their business plan, among other things.

The second selection round will take place in October. The ten best robotics start-ups will receive 120,000 euros each to invest into the further development of their product. Then there will be a final round from which four winners will be chosen. They will receive another 100,000 euros. The prize money comes from the budget of the European Union’s innovation program, Horizon 2020.

Google and Yahoo

According to the marketing manager of one of the nominated companies, Casper van Eersel from IMSystems, just participating in the RobotUnion competition provides a wealth of contacts. “For example, you will meet many other starters and hear what they are up against, such as how to attract good employees or how to arrange funding.”

Employees from large companies such as Google, Yahoo and other branches of industry will also be present during the presentations to scout for new technology, says Van Eersel. Who knows, this could result in customers or investors. “You notice that there is interest. Someone from the engineering office of one of these large companies can call later on just because he saw the presentation of our product at RobotUnion. They wouldn’t have done that otherwise, because they wouldn’t have seen us. The fact that we are among the last 20 makes them even more interested. They see that others also see something in our product. Then they’ll check it out a lot sooner. Such a competition puts our product in the spotlight much more than it usually is. That’s important.”

Asian robots on the rise

The fact that the development of robotics in Asia is on the rise makes RobotUnion’s European start-up network particularly relevant, says Van Eersel. “A stimulating environment strengthens the competitive position of European robotics companies. The RobotUnion network acts as an accelerator: it facilitates the development of robotics start-ups at a European level that would not otherwise exist. We already have a very good infrastructure for this in the EU whereby member states cooperate with each other. There is a subsidy that the European Commission has made available for this purpose. We are making good use of it this way.”


Some hypes and missed opportunities in robotics

Prof. Herman Bruyninckx (KU Leuven) presented a critical view on the robotics industry during the seminar in High Tech Campus Eindhoven. Bruyninckx delivered a talk on the hypes and missed opportunities in robotics pointing out the key issues that robotics should tackle nowadays.

Herman Bruyninckx has a background in mathematics, physics, computer science and “Philips” mechatronics. He has worked as a roboticist since 1988 and is now a professor at KU Leuven and TU/e. His research focus is robotics as the science of the integration of systems of systems.

Does more computing power lead to better robot systems?

Herman Bruyninckx believes that what robots lack nowadays is awareness of the users’ intentions and the ability to use abstraction. “Fifty years after the ‘first robot’ Shakey, you can still go to a robotics conference and understand everything there, because the context and the mathematics are exactly the same. Intention and abstraction in robots are still very underdeveloped. We need higher-order logic to make the robot aware of why it should be doing something, but there is no formal language to represent that. We cannot even formalize what we humans know about the intentional context,” says Bruyninckx.

Context is everything

“There are different levels of perception, and we switch between them all the time automatically,” says Bruyninckx. “If I’m walking along the stage and I’m not paying much attention to the edge, I’m not going to drop off here. Even if I do, it’s not so high. But I would move completely differently if I were holding a baby in my hands. The baby would have an impact on all of my control settings, on all of my perception settings. So the context of the task changes everything. But robots haven’t even started using that kind of context-dependency.”

Abstraction

“We don’t memorize how many people we saw in the cafeteria – lots of background noise is erased. All that we use in our education is abstraction. Humans use abstractions so much and so long that they don’t understand anymore how difficult it is. And we want the robots to learn how to use abstraction by showing them sensors’ data. Then why don’t we teach our kids maths by showing them 1000 million equations?”

Deep learning, according to Bruyninckx, is just a new buzzword. “It has nothing to do with deep and nothing to do with learning,” argues Bruyninckx. “It’s all about data reduction. Robots need hard human thinking, not ICT support.”

Robots and environment

“Now robots are kept away from touching the environment, but it shouldn’t be so,” says Bruyninckx. “If I need to take something from the table without looking, I will take it by touching. Touch is really important for most of the tasks.” Bruyninckx and his research group conducted tests on active sensing with people: a person was blindfolded, unable to hear and was wearing very thick clothes. It’s the best possible approximation of a human to a robot.

Open source

ROS (Robot Operating System) is the open-source framework that has conquered the world. In prof. Bruyninckx’s view, ROS has its bad practices. “If it is open-source, that doesn’t mean that it automatically gets improved. The first thing that people do when they need to use this open source is fork and fragment it.” According to Bruyninckx, ROS has become a monopoly and lacks the meritocratic credibility to prevent fragmentation.

Where is the state-of-the-art in robotics?

According to prof. Bruyninckx, making an academic career in robotics has become too easy. “Too many old simplistic ideas are coming back again and again. It’s popular to have simplistic solutions but they won’t work,” says the KU Leuven professor.

Another problem of robotics in the academic system is the lack of standardization. “Where are you going to look if you want to know the state-of-the-art in robotics? I don’t know where to find it. There are hundreds of thousands of papers claiming that they are state-of-the-art. We did an extremely bad job in robotics as an academic system. In many cases, Wikipedia can be by far your best source about the state-of-the-art in robotics.”

Access to knowledge problem

Bruyninckx points out that his work as a university professor is paid for 100% by tax money. “When I am hired by a university, that means that your kids only have access to my knowledge at that university, because universities don’t work together. I think people who pay their taxes should get a lot more value for their money.”

As Bruyninckx says, big IT companies are privatizing decades of public investment because of the lack of open-standard regulations. “Robotics is scary. They put things behind a login and all the public knowledge becomes private,” says the KU Leuven professor.

Technical conclusions

“Robots need a lot more formalization and standardization to allow the robots to cope with intention, abstraction and context.”

“Progress in value-added, robot-centric applications will profit more from advances in mechatronics and materials than from ICT and AI: our robots must touch the environment, including people and plants.”

Bruyninckx believes that materials matter greatly for robotics. “Look at yourself, how great these materials are! There is so little friction in our body. Robotics needs new materials. No more computers, networks, sensors: there is more than enough information and computing power already.”

“I was in Silicon Valley for a long time and I can say it’s better here, in the Netherlands and in Flanders. We have more potential in robotics than Silicon Valley, because we have not only the software but also the machines.”

 

What robot fish, bees and self-driving cars have to do with each other

“Don’t talk too loudly or move too fast,” whispers Professor Tim Landgraf from the Freie Universität Berlin. “Otherwise the fish will freak out.” He has just pulled up a large curtain surrounding a big tank with four small fish. Well, three really, because the fourth is actually a robot, moved back and forth with the help of a magnet underneath the aquarium.
The goal, Landgraf explains, is to make the robot fish as much as possible part of the group of real fish, in order to map the behaviour of the school. “We’re getting better and better at it,” he says. The robot fish is increasingly accepted as a group member. It has been a matter of small and bigger steps. When the fish got realistic large round eyes instead of painted dots, that proved to be an enormous leap forward. “That’s what the other fish had been scared to death of.”

Professor Tim Landgraf. Photo: FU Berlin

It is just one of the nature research projects that Landgraf is working on at the Berlin University. The other major project involves a bee population in which all bees have been tagged so that their behaviour can be observed over a lifetime.

But why? Why is that information of any use to us?

A bee with a traceable tag on its back. Photo: FU Berlin

Food exchange

Landgraf has given this a lot of thought. Animals and plants are often extremely good at particular things: the way in which bees collect and search for food, how they communicate with each other, and so on. These are processes that have been perfected over hundreds of thousands of years. If you can capture them in algorithms, that can be extremely instructive. “Everything we see in nature is enormously complex and is actually a highly developed technology from which we as human beings can learn, if we look closely enough.”
An example of this in bees is how they use their energy. Like humans, bees need food to carry out their work. But sometimes their energy reserve runs empty while the work is not yet finished and there are no flowers with nectar nearby. Bees have found a solution to this problem: they have a kind of second stomach with reserve food for friends in need. If one of the bees runs out of food, it can fill up from another bee.

Cars that refuel each other. Photo: FU Berlin

Auto-pilot driving

It’s a concept that Landgraf says could be applied to electric cars, especially once all cars drive on autopilot. You could give them a spare battery with which they could help other cars that have run flat. This would partly solve the problem of the limited range of e-cars. The refuelling could even be done while driving, says Landgraf; then you wouldn’t waste any time.
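The bee-inspired scheme can be pictured as a simple sharing rule: any car whose main battery runs low gets topped up from a convoy mate’s reserve. A toy simulation of that rule; the class, the figures and the convoy are invented purely for illustration:

```python
# Toy model of bee-style energy sharing between autonomous e-cars:
# each car carries a reserve (the "second stomach") that it can donate
# to a convoy mate whose main battery runs low. All figures are invented.

class Car:
    def __init__(self, name, main_kwh, reserve_kwh=10.0):
        self.name = name
        self.main = main_kwh          # main battery, in kWh
        self.reserve = reserve_kwh    # donatable reserve, in kWh

    def needs_help(self, low_mark_kwh=5.0):
        return self.main < low_mark_kwh

    def donate_to(self, other, amount_kwh=5.0):
        """Transfer up to amount_kwh from this car's reserve into the
        other car's main battery (conceivably even while driving)."""
        given = min(amount_kwh, self.reserve)
        self.reserve -= given
        other.main += given
        return given

def share_energy(convoy):
    """One sharing round: each low car is topped up by the convoy
    member that currently has the largest reserve."""
    for car in convoy:
        if car.needs_help():
            donor = max((c for c in convoy if c is not car),
                        key=lambda c: c.reserve)
            donor.donate_to(car)

convoy = [Car("A", 40.0), Car("B", 3.0), Car("C", 25.0)]
share_energy(convoy)
print(convoy[1].main)   # car B topped up from 3.0 to 8.0 kWh
```

Like the bees’ second stomach, the reserve never feeds the donor itself; it exists only so that the group as a whole keeps working.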

And that happens to be another field of research at the FU Berlin. Twenty metres from Landgraf’s office, other FU employees use robot cars in their quest to perfect autopilot driving. Docking is also being tested. Outside, of course, there is also a real car, to try out in the real world what has been tested in the laboratory set-up. Different fields of research intersect, says Landgraf. According to him, it’s the sort of research that Facebook and Google also do. They just have a little more money.

AI: Smart Clothes as instructors

The robot imitates the movement of the human and programs itself. Photo: Anne Schwerin

Until a few years ago, clothing served only to protect people, with some fashionable aspects thrown in. But meanwhile, our second skin can do more and more. The measurement of body data such as pulse or calorie consumption by means of integrated sensors is almost old hat by now. Now, however, clothing is also taking on teaching functions through artificial intelligence: on the one hand as a trainer for humans, on the other as a programmer for robots.

The latest development comes from Turing Sense. For over three years, a team of 27 engineers and competitive athletes has worked on its vision of replacing complex video analysis of movement with digital technologies such as AI. Their goal: to teach complicated sports exercises in a timely, precise and effective manner. The result was officially launched recently: a yoga outfit with built-in sensors that connect to a virtual yoga studio through an app. Yoga videos by renowned instructors such as Brett Larkin, Kim Sin and Molly Grace are offered there. Almost as if the yogini were personally on site, she leads the user through the selected yoga course. The i-Double scans the student’s execution of the asana, the yoga posture, via WLAN; this is then displayed as an avatar on the phone or TV screen, so that users can compare themselves with the teacher while practising warrior, dog and co. As an interactive app, the i-Yogini also reacts to voice commands such as “Freeze” or “Show me the camera”. But here comes the clever part: when users ask “How does this look?”, they receive a correction of their yoga position if necessary. The workout can thus be adapted individually to the user’s personal performance level.
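Under the hood, a correction like this presumably boils down to comparing the joint angles the garment’s sensors measure against a reference pose and reporting the largest deviations. Turing Sense has not published its method, so the sketch below is purely illustrative; the joint names, angles and tolerance are made up:

```python
# Illustrative pose correction: compare measured joint angles (degrees)
# against a reference pose and report the joints that are off by more
# than a tolerance. Joint names, angles and tolerance are invented.

REFERENCE_WARRIOR = {"front_knee": 90.0, "back_leg": 180.0, "arms": 180.0}

def pose_corrections(measured, reference, tolerance_deg=10.0):
    """Return (joint, deviation) pairs, worst deviation first."""
    deltas = [(joint, measured[joint] - reference[joint])
              for joint in reference]
    off = [(j, d) for j, d in deltas if abs(d) > tolerance_deg]
    return sorted(off, key=lambda jd: -abs(jd[1]))

measured = {"front_knee": 120.0, "back_leg": 175.0, "arms": 165.0}
for joint, delta in pose_corrections(measured, REFERENCE_WARRIOR):
    direction = "reduce" if delta > 0 else "increase"
    print(f"{joint}: {direction} the angle by {abs(delta):.0f} degrees")
```

The same deviation list could just as well drive haptics in the Wearable X style: instead of printing a hint, each out-of-tolerance joint would trigger the vibration motor nearest to it.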

Of course, the high-tech clothing also meets high demands in terms of comfort and functionality; it is even washable. The outfit, called Pivot Yoga and consisting of a shirt and pants for $99, is currently only available in the USA and Canada. The app currently only works in combination with iOS 11 and an iPhone 7 or higher. An Android app, as well as delivery to Europe, is planned.

Possibly precisely because demanding yoga only has the desired effect on body and soul when executed precisely, another clothing manufacturer, Wearable X, has specialised in smart clothing for yoga since 2017. With its Nadi X Pants, which also connect to an app via Bluetooth, the yoga student receives haptic rather than visual feedback: ten tiny, individually adjustable vibration motors at the hips, knees and ankles indicate an incorrect position, and provide peace of mind when the position is correct. The Nadi X Pants are currently available in the USA, Canada, the EU (plus Switzerland and Norway) and Australia/New Zealand; they work under iOS and cost $249.

The Dresden-based start-up Wandelbots is currently working on exactly the opposite application of artificial intelligence. The company has developed software that enables robots to program themselves by imitating human movements, demonstrated to them by, for example, a technician wearing smart clothes. This new technology is said to be 20 times faster and 10 times cheaper than conventional programming. The application can be seen, for example, in VW’s Transparent Factory in Dresden.
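The core idea of such programming by demonstration, turning a recorded human motion into robot waypoints, can be sketched in a few lines. Wandelbots’ actual software is proprietary; the pose stream and robot interface below are hypothetical stand-ins:

```python
# Illustrative programming by demonstration: sample hand poses from a
# worn sensor suit, thin them into waypoints, and replay them on the
# robot. The pose stream and motion command are hypothetical stand-ins.

def distance(a, b):
    """Euclidean distance between two (x, y, z) poses, in metres."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def record_demonstration(pose_stream, min_move_m=0.045):
    """Keep only poses that moved at least min_move_m since the last
    kept waypoint, so the recorded trajectory stays compact."""
    waypoints = []
    for pose in pose_stream:
        if not waypoints or distance(pose, waypoints[-1]) >= min_move_m:
            waypoints.append(pose)
    return waypoints

def replay(waypoints, move_to):
    """Send each recorded waypoint to the robot's motion command."""
    for wp in waypoints:
        move_to(wp)

# A demonstration: the technician's hand drifts sideways in ~1 cm steps.
stream = [(i * 0.01, 0.0, 0.2) for i in range(21)]
wps = record_demonstration(stream)
print(len(wps))   # 21 raw samples thinned to 5 waypoints
```

The appeal of the approach is visible even in this sketch: the “program” is just recorded motion, so no robot-programming knowledge is required of the demonstrator.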

The focus of Wandelbots is currently still on industrial robots. But once these have proven themselves in the first practical tests, the technology could become a groundbreaking innovation: the application is so simple that in the future anyone could program an individual robot, even without background knowledge. In addition to industrial assembly, conceivable areas of application include use at home and in nursing care.

Prague gets new European Institute for Artificial Intelligence

The Czech Republic will get a new Research Center for Artificial Intelligence (AI). Following two years of negotiations, the Prague-based Czech Institute of Informatics, Robotics and Cybernetics (CIIRC) was awarded the contract for the European RICAIP project (Research and Innovation Centre on Advanced Industrial Production). The center’s scientists will focus on the use of AI in industrial processes.

As the CIIRC announced at the end of last week, the contract is tied to an EU funding of 45 million Euros. The Czech and German governments will provide an additional 5 million Euros.

European Network for AI Research

According to Vladimir Marik, Scientific Director of the CIIRC, the new research center will be built in close cooperation with another Czech institute, CEITEC in Brno, and the German Centre for Mechatronics and Automation in Saarbrücken (ZeMa). As for ZeMa, this means the expansion of a partnership in the field of “Industry 4.0” that has existed since 2016.

In addition, the new research center will be virtually linked to a whole series of European scientific centers dealing with the use of AI in industrial processes. This is a prerequisite for the EU.

Robots per 10,000 employees. Source: IFR

“We have set ourselves the specific goal of having a center within six years that is capable of setting up experimental robotized production lines for interested companies from all over the world,” Marik told the CTK press agency. Marik expects the CIIRC to be financially independent within four years.

Trend towards more industrial robots

The new research center in Prague follows the worldwide trend towards more and more robots in the production process. Only last week, the International Federation of Robotics (IFR) announced that South Korea, with its large automotive industry and 710 robots per 10,000 employees, is the current leader.

With 658 robots, Singapore is in second place; Germany is third with 322. With about 160 robots, the Netherlands is mid-table. Across sectors, the automotive industry is the overall leader, with 1,200 robots per 10,000 employees.

Facebook’s head of AI delivers Holst Memorial Lecture, says open innovation is route to faster scientific progress

Yann LeCun. Source: Radio4Brainport

LeCun looks for an explanation of the underlying mechanisms of intelligence, just as the theory of thermodynamics only became clear after the construction of the steam engine

Yann LeCun, head of Facebook’s Artificial Intelligence Research Group (FAIR), and considered one of the doyens of AI, delivered this year’s Holst Lecture, as the recipient of the 2018 Holst Medal. The annual award is hosted by Philips Research, Signify and TU/e, and is in honour of the significant contribution to research made by Dr Gilles Holst, director of Philips’s NatLab from 1914 to 1946.

LeCun, who in 2013 was asked by Mark Zuckerberg to drive Facebook’s AI research programme, is a strong proponent of open innovation and multi-disciplinary research, much in the spirit of the approach taken by Dr Holst. He is known for his work in machine learning, computer vision, mobile robotics, and computational neuroscience; the handwriting-recognition technology that he developed is used by many banks worldwide, and his image compression technology, DjVu, is used extensively to access scanned documents online. His convolutional network model is applied in image recognition by companies such as Facebook, Google, Microsoft and Baidu.

(See also: Tomorrow is good: The ten commandments of Holst).

Having flown in overnight from his New York base, and dressed in the casual elegance more reminiscent of his Silicon Valley employer than of his engineering and academic profession, LeCun addressed students, academics and industry-based researchers in the TU/e Auditorium in Eindhoven. Along with an overview of the history of AI, he outlined some of the research questions that FAIR is addressing today, and listed capabilities that AI simply cannot provide yet, given the current state of the science: what we are still missing are machines with common sense, intelligent personal assistants, smart chatbots and household robots.

The reason why this is not yet possible, he said, is that machines do not yet have the ability to reason, nor can they react by planning suitable follow-up action. “For that, machines need a model of the world”.

The FAIR team, which has about 200 members worldwide, publishes all of its work, in the form of papers and source code, in the public domain. According to LeCun, this is in the interest of speeding up scientific innovation in AI. “The reason why we do this, is that we get people to use our tools, and to improve on our method, so that, essentially, it becomes much easier for us to move faster. We need to make progress faster”.

Listen to the Radio4Brainport podcast:


Wijnand IJsselsteijn, a professor at the TU/e and chair of the Holst Memorial Committee, described the selection process for the Holst Medal to Radio4Brainport: “The award is a great honour bestowed upon scientists who do relevant work in the areas of technology — areas that are relevant to Philips, Signify and TU/e. We make a shortlist of scientists based on the relevance of their research and on its impact on society, within the environment that is relevant to the topic. This year the topic was AI and data science”.

IJsselsteijn says some of the distinguishing aspects of LeCun’s work include his multi-disciplinarity. “This to me is a big appeal of his work. Also, there is great practical significance to his work: Much of what he does is immediately applied to machine learning and image recognition, and to compression algorithms. He has an amazing track record in this area. All of that together makes him a very good candidate for the Holst Medal”.

(See also: Why Europe should have its own AI centre).

Yann LeCun (c) TU/e

LeCun, who maintains strong ties with academia through his professorship at New York University, has several ambitions for scientific developments in AI, including discovering whether self-supervised learning – which he believes will be the future of AI – can lead to common sense. As he put it in his lecture, “a robot has less common sense than a house cat”.
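Self-supervised learning needs no human labels: the training signal is a hidden part of the input, predicted from the visible part. A deliberately tiny illustration of the principle, a one-parameter model that learns to predict the next value of a sequence; real systems apply the same idea to images, video and text at enormous scale:

```python
# Toy self-supervised learning: the "label" is simply the next element
# of the sequence itself, so no human annotation is needed. The model
# is a single parameter w in the rule x[t+1] ≈ w * x[t], fitted by
# plain gradient descent on the squared prediction error.

def fit_next_step(sequence, lr=0.01, epochs=200):
    """Learn w such that w * x[t] predicts x[t+1]."""
    w = 0.0
    for _ in range(epochs):
        for x_t, x_next in zip(sequence, sequence[1:]):
            error = w * x_t - x_next    # prediction error on the free label
            w -= lr * error * x_t       # gradient step
    return w

# A sequence that doubles at every step: the model discovers w ≈ 2.
series = [1.0, 2.0, 4.0, 8.0]
print(round(fit_next_step(series), 2))   # 2.0
```

The point of the toy is only the shape of the loop: no labels enter anywhere, yet the model ends up with a small predictive model of its world, which is the property LeCun argues must be scaled up before machines can acquire common sense.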

One of his specific focus areas for the scientific future of AI is to develop a theory that could explain the underlying mechanisms of intelligence, whether human or machine. Describing this as equivalent to developing the theory of thermodynamics after the steam engine was built, he says that, “in the history of science and technology, it is very much the case that the artefact was created before the science was created to explain how it works. In fact, the science was motivated by the fact that the artefact already existed. Now, what is the equivalent of thermodynamics for intelligence? That is the question that I am after”.

(Listen to the full recording of the Holst Memorial Lecture).

Tomorrow is Good: Building Bridges

Robosculpt. Jordan Bos / Maarten Steinbuch

In a weekly column, alternately written by Lucien Engelen, Mary Fiers, Maarten Steinbuch, Carlo van de Weijer, and Tessie Hartjes, E52 tries to find out what the future will look like. All five contributors – sometimes accompanied by guest bloggers – are working on solving the problems of our time. Everything to make Tomorrow Good. This Sunday, it‘s Maarten Steinbuch’s turn. Here are all the previously published columns.

Last week I attended an international medical congress of ENT doctors and brain surgeons. They often perform very complex operations, for which they practise for many years: reaching a malignant growth in a person’s head with pinpoint accuracy, or inserting an electrode into your ear canal to let you hear again, all without touching a nerve that could paralyse you or make you lose your sense of taste. They are proud of their profession, and at the congress they talk about their experiences and research to do even better. With their drilling and milling tools and other instruments, they work with their hands, very carefully, layer by layer.

And there we are, with our very latest robot, as engineers. Telling them that our robot is waaay better than they are. With our technical enthusiasm, we are going to make the world a better place! Change! The future begins today!

Fortunately, we are also able to listen. We ask the surgeons what they think of our robot, what possibilities they see. For which applications. Some people mainly see problems, think about safety, ask ‘who is responsible’, prefer to avoid any risk. However, most of them also see the advantages. Faster, more precise, more accurate, less tiring. And a few doctors think really out-of-the-box: they imagine that the robot is completely ready and tested and then come up with new treatments and treatment methods that really make use of the advantages of the robot and not the limitations of man. For example, by drilling along a different, completely new, channel with high accuracy into the head.

The best moment is when ‘our’ doctor tells the story of the robot to his colleagues. He takes them to our stand at the congress and talks enthusiastically about the robot, the navigation software, the benefits, and so on.

It is also cool to see how the surgeons in the audience react when I show the football robots in my speech, with the frenzied audience at the 2013 RoboCup World Cup in Eindhoven. The pictures showing the speed of technological development: Moore’s Law, the autonomous and connected cars, the eye-surgery and vascular-surgery robots that we have developed, and more recently the bone robot, the brain robot and the steerable catheter. And that they have to read Homo Deus. And that we are proud of ASML.

Innovation is increasingly about multi- and interdisciplinary work: bringing people together and being able to listen to each other. You need to understand each other’s field of expertise in order to do this, and to be able to explain your own. For me, this is at the heart of the future of education and research: the T-shaped, or pi-shaped, engineer, and also the Eindhoven Engine. It’s people’s work, and virtual networks are fun, but co-location is essential to really speed up innovation, beyond the boundaries of disciplines.

I’m sure I will go to a congress with doctors and specialists again. I think I will understand them better and better.

(Picture Bart van Overbeeke)