On 15 November 2018 at 11:40 CET, the mission team in the Biotechnology Space Support Center (BIOTESC) at Lucerne University of Applied Sciences and Arts watched with bated breath. After two and a half years of intensive preparations and countless testing and training sessions with CIMON (Crew Interactive Mobile CompanioN) on Earth, you could hear a pin drop – an atmosphere of total concentration and thrilled anticipation. After a software upload to the International Space Station, a software update for CIMON himself, an audio check and a test of the navigation camera, Alexander Gerst took a good look at his new robotic housemate and put him straight into operation. The world premiere lasted 90 minutes – the first 'rendezvous' between the German ESA astronaut and his autonomous mobile robot assistant.
Once Alexander Gerst had taken his man-made helper out of its box in the Columbus module of the ISS, the German astronaut woke him up with the words "Wake up, CIMON!" The answer came promptly: "What can I do for you?" After this initial exchange, Gerst allowed CIMON to float around freely – initially under remote control from Earth. This activated the guidance, navigation and control system.
Then came some autonomous navigation with multiple turns and movements in all directions. Once this was complete, CIMON was able to locate Gerst's face and make eye contact. To demonstrate its capabilities as an assistant, CIMON used its 'face' – a display at the centre of the sphere – to show the instructions for a student experiment on crystallisation, and it also played a song. It then recorded a video and took a photo of Alexander Gerst using its integrated cameras. Afterwards, Gerst brought CIMON back to its place in the Columbus module. "The voice communication worked perfectly and I am very relieved that the cooperation between CIMON and Alex ran so smoothly," says Gwendolyne Pascua, the BIOTESC project manager who spoke directly with Alexander Gerst during the commissioning phase to guide him through the experiment.
"It is an incredible feeling and an absolute delight to witness how CIMON is seeing, hearing, understanding and speaking. For us, this first real deployment in space is part of history and is hopefully just the beginning of its usage on the ISS," says Dr Christian Karrasch, CIMON project manager from the DLR Space Administration. "Interaction with artificial intelligence fascinates me. As a system, CIMON is unparalleled elsewhere in the world and was designed specifically for deployment on the ISS. We are entering uncharted territory here and broadening technological horizons in Germany."
"CIMON represents the embodiment of our vision," adds Till Eisenberg, project manager for CIMON at Airbus. "It is a huge step for human spaceflight and one that we are taking here as a team. In CIMON, we have laid the foundation for social assistance systems that can work even under extreme conditions."
CIMON used the Wi-Fi on the International Space Station for data transmission and established an Internet connection to the IBM Cloud via satellite link and ground stations. "When CIMON is asked a question or when it is addressed, the Watson AI first converts the audio signal into text that can be understood or interpreted by the AI," says Matthias Biniok, IBM project manager, describing the processes taking place in CIMON's 'brain'. "IBM Watson is thus able to grasp the underlying intention, as well as the context of the words. The result is a pinpoint response, which is then converted back into speech and beamed up to the ISS. This process enables a natural, dynamic spoken dialogue."
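The dialogue loop Biniok describes – audio to text, text to intent and response, response back to audio – can be sketched with IBM's publicly available Watson SDK for Python (`pip install ibm-watson`). This is an illustration of the flow only, not the actual CIMON flight or ground software; the service URLs, API keys, assistant ID, session ID and voice used here are placeholder assumptions.

```python
# Minimal sketch of an audio -> text -> intent -> response -> audio pipeline
# using IBM's public Watson SDK for Python. Not the actual CIMON software;
# endpoints, keys and IDs below are placeholders.
from ibm_watson import SpeechToTextV1, AssistantV2, TextToSpeechV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

def build(service_cls, api_key, url, **kwargs):
    """Instantiate a Watson service with IAM authentication."""
    service = service_cls(authenticator=IAMAuthenticator(api_key), **kwargs)
    service.set_service_url(url)
    return service

stt = build(SpeechToTextV1, "STT_API_KEY",
            "https://api.eu-de.speech-to-text.watson.cloud.ibm.com")
nlu = build(AssistantV2, "ASSISTANT_API_KEY",
            "https://api.eu-de.assistant.watson.cloud.ibm.com", version="2021-06-14")
tts = build(TextToSpeechV1, "TTS_API_KEY",
            "https://api.eu-de.text-to-speech.watson.cloud.ibm.com")

def answer(audio_wav: bytes, assistant_id: str, session_id: str) -> bytes:
    """Turn a spoken question (WAV audio) into a spoken answer (WAV audio)."""
    # 1. Speech to text: transcribe the crew member's question.
    stt_result = stt.recognize(audio=audio_wav, content_type="audio/wav").get_result()
    question = stt_result["results"][0]["alternatives"][0]["transcript"]

    # 2. Intent and context: Watson Assistant interprets the question
    #    and produces a text response.
    reply = nlu.message(
        assistant_id=assistant_id,
        session_id=session_id,
        input={"message_type": "text", "text": question},
    ).get_result()
    answer_text = reply["output"]["generic"][0]["text"]

    # 3. Text to speech: synthesize the response for playback on board.
    return tts.synthesize(answer_text, accept="audio/wav",
                          voice="en-US_MichaelV3Voice").get_result().content
```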
Bernd Rattenbacher, team leader at the BIOTESC Biotechnology Space Support Center of the Lucerne University of Applied Sciences and Arts, says: "The data connection to Earth runs via satellite to NASA and on to the Columbus Control Centre at the DLR site in Oberpfaffenhofen. From there, the signal travels to us – the CIMON ground station at BIOTESC in Lucerne, the Swiss User Support and Operations Center – which is connected to the IBM Cloud in Frankfurt via the Internet. The travel time for the signal via the satellites alone is 0.4 seconds in each direction. A large number of firewalls and VPN tunnels are in place to ensure data security."
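A rough latency budget shows why the dialogue can still feel natural despite the detour via Earth. Only the 0.4-second one-way satellite figure comes from Rattenbacher's description; the ground-network and cloud-processing times below are illustrative assumptions.

```python
# Back-of-the-envelope latency budget for one spoken exchange over the
# ISS -> satellite -> NASA -> Oberpfaffenhofen -> Lucerne -> Frankfurt route.
SATELLITE_ONE_WAY_S = 0.4        # from the text: signal travel time via satellites, one way
GROUND_NETWORK_ONE_WAY_S = 0.05  # assumed: terrestrial hops between ground stations and cloud
CLOUD_PROCESSING_S = 1.0         # assumed: speech-to-text, intent handling, text-to-speech

round_trip_s = 2 * (SATELLITE_ONE_WAY_S + GROUND_NETWORK_ONE_WAY_S) + CLOUD_PROCESSING_S
print(f"Estimated time from question to spoken answer: ~{round_trip_s:.1f} s")
# -> roughly 1.9 s under these assumptions
```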
CIMON also has scientific backing. Its consultants are Judith-Irina Buchheim and Professor Alexander Choukèr from the Department of Anaesthesiology at LMU Munich. "As an AI partner and companion, CIMON could support astronauts with their heavy workload of experiments, maintenance and repair work, thus reducing their exposure to stress," Buchheim says.
CIMON – The idea
Developed and built in Germany, CIMON is a technology experiment designed to support astronauts and increase the efficiency of their work. CIMON can display and explain information as well as instructions for scientific experiments and repairs. Voice-controlled access to documents and media is an advantage because the astronauts can keep both hands free. It can also serve as a mobile camera to save astronaut crew time. In particular, CIMON could take on routine tasks such as documenting experiments, searching for objects and taking inventory.
CIMON is also able to see, hear, understand and speak. Its 'eyes' are two cameras used for facial recognition, complemented by five further cameras for orientation and video documentation. Ultrasound sensors measure distances to detect potential collisions. Its 'ears' consist of eight microphones for identifying the direction of a sound source, plus an additional directional microphone to improve voice comprehension. Its 'mouth' is a loudspeaker used to speak or play music.
At the heart of its language understanding is the IBM Watson AI technology from the IBM Cloud. CIMON was not equipped with self-learning capabilities and requires active human instruction. The AI for autonomous navigation was contributed by Airbus and is designed for movement planning and object recognition.
Twelve internal fans allow CIMON to move and rotate freely in all directions, so it can turn toward the astronaut when addressed. It can also nod or shake its head and follow the astronaut through space, either autonomously or on command.
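The "turn toward the astronaut when addressed" behaviour can be illustrated with a small sketch that fuses a direction-of-arrival estimate from the microphone array with a camera-based face bearing and derives a yaw command for the fan-based propulsion. All names and numbers here are hypothetical; the real CIMON guidance, navigation and control software is not public.

```python
# Hypothetical sketch only: turn toward the speaker using the microphone-array
# bearing, preferring a camera-based face bearing when one is available.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Readout:
    voice_bearing_deg: float           # direction of arrival estimated from the 8 microphones
    face_bearing_deg: Optional[float]  # bearing of a recognised face, if any

def yaw_rate_command(r: Readout, deadband_deg: float = 5.0) -> float:
    """Return a yaw rate command in deg/s that turns the robot toward the speaker."""
    # Prefer the camera-based face bearing; fall back to the audio estimate.
    bearing = r.face_bearing_deg if r.face_bearing_deg is not None else r.voice_bearing_deg
    if abs(bearing) < deadband_deg:
        return 0.0                               # already facing the astronaut
    return max(-10.0, min(10.0, 0.5 * bearing))  # clamped proportional turn

# Example: a voice heard 40 degrees to the right, no face in view yet.
print(yaw_rate_command(Readout(voice_bearing_deg=40.0, face_bearing_deg=None)))  # -> 10.0 deg/s
```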