29-Jun-2018: CIMON brings AI to the International Space Station

In the immortal words sung by Elton John, “It’s lonely out in space.” But for astronauts on the International Space Station, the journey might be a little less lonely — and maybe a little more productive — thanks to Watson AI on the IBM Cloud and CIMON, the first free-flying AI assistant in space.

CIMON (short for Crew Interactive MObile CompanioN) is the result of a partnership between the German space agency DLR, Airbus and IBM. Matthias Biniok, the IBM project lead for CIMON, was first approached for the project in 2016. “Airbus proposed this idea they had to the German space agency: they wanted to build a robot and send it into space. When DLR commissioned them to build it, Airbus approached IBM about handling the AI components.”

The result was a roughly spherical, 11-pound robot that can converse with astronauts on the ISS. Facial-recognition software lets CIMON know who it’s talking to, and a deliberately simple visual design allows CIMON to show basic facial expressions. The astronaut bot can travel around the European Columbus Research Module of the ISS independently, and has proven to be a handy assistant.

How Watson Assistant translates to space

“The idea was to create an actual astronaut assistant, so the astronauts could do their work more efficiently,” says Biniok. “A secondary goal was to have kind of a companion in space that they could talk to. That was the original idea, but in the course of things, the project focused in more on getting the experiments done with greater efficiency.”

One way CIMON helps with that is by functioning as a floating brain. The predominant AI technology used by CIMON is IBM Watson Assistant, already in use by IBM clients worldwide. Watson Assistant helps customer service representatives surface relevant and accurate information quickly so questions can be answered faster.

“Imagine you’re an astronaut in space, and you’re at your station working on an experiment,” explains Biniok. “Your hands are busy, and you have a question about the project you’re working on. Normally, you would have to float over to your laptop to get the answer, then back to the experiment station. With CIMON, you can just say, ‘CIMON, what’s the next step?’ and you don’t have to interrupt your workflow.

“Another way CIMON provides assistance is in helping to document the experiments as they’re being done. Astronauts need to record and film everything that they do. With CIMON, they can just tell him, ‘CIMON, come here. Turn your camera 30 degrees to the right and record…’”
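The hands-free question-and-answer flow Biniok describes can be caricatured as a toy intent matcher. This is a loose illustration only, not the real Watson Assistant API: Watson uses trained intent classification rather than keyword lookup, and the intents and answers below are invented.

```python
# Toy intent matcher illustrating hands-free Q&A.
# Illustrative only: not the Watson Assistant API; intents and
# answers are invented for this sketch.

PROCEDURE_INTENTS = {
    "next step": "Step 4: seal the sample container and log the time.",
    "record": "Starting video recording of the experiment station.",
    "repeat": "Repeating the last instruction.",
}

def answer(utterance: str) -> str:
    """Match the spoken utterance against known intents by keyword."""
    text = utterance.lower()
    for keyword, response in PROCEDURE_INTENTS.items():
        if keyword in text:
            return response
    return "Sorry, I did not understand. Please rephrase."
```

The point of the sketch is the interaction pattern: the astronaut never leaves the experiment station, and the assistant resolves the request from speech alone.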

What the future holds for CIMON

Biniok stresses that CIMON is only a first step in bringing robotic AI into space. “We can’t really talk about the next steps yet, but I can tell you about our vision. In my mind, the goal is to create a real astronaut companion, a real assistant that is helping, not just on the ISS, but on other space stations, maybe on journeys to the moon and Mars and beyond — that’s the long-term vision. They will definitely need some AI to accomplish those journeys.”

He says he also hopes to see CIMON’s value as a companion grow over time. “When you go too far away from your mother planet, some interesting things happen from a psychological perspective. You get a little bit crazy, seeing earth only as a dot in the sky.”

Biniok suggests that CIMON could help in two ways — first by offering an objective viewpoint, unaffected by the stresses of space travel, and second by providing companionship. “We’re working toward using Watson Tone Analyzer to enable CIMON to recognize the astronauts’ emotions, so that will trigger responses that accord with those emotions.”
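The emotion-matching behaviour Biniok describes might be sketched as a simple mapping from tone scores to canned responses. This is purely illustrative: the real Watson Tone Analyzer returns richer, trained tone analyses, and the tone names and responses here are invented.

```python
# Toy sketch of emotion-aware responses.
# Hypothetical: the real Watson Tone Analyzer output is richer than
# this flat score dict, and these responses are invented.

TONE_RESPONSES = {
    "joy": "Great to hear! Shall we continue with the experiment?",
    "sadness": "You sound a bit down. Would you like some music?",
    "anger": "Let's take a short break before the next step.",
}

def respond_to_tones(tone_scores: dict) -> str:
    """Pick the highest-scoring tone and return a matching response."""
    dominant = max(tone_scores, key=tone_scores.get)
    return TONE_RESPONSES.get(dominant, "How can I help?")
```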

And when chit-chat is not desired, the astronauts can hit CIMON’s off button. There is also an offline button that doesn’t turn CIMON completely off, but does disconnect it from the Internet so nothing gets sent back to earth. “That feature comes with a very nice visualization,” Biniok says. “He will actually close his eyes as soon as he’s in offline mode.”

While there is a sense of play in having CIMON aboard the ISS, CIMON is a scientific experiment itself, so the project was not undertaken lightly. “The ISS is an extremely regulated environment,” Biniok says, “and we needed to confirm with all parties involved that CIMON cannot harm anybody or fly into anything. All of the content within CIMON was approved by a researcher who is responsible for the mental health of the astronauts.”

A testament to the power of AI and the cloud

Biniok proudly notes that all of the Watson services used in CIMON come from the IBM Cloud in Frankfurt. “That tells you how powerful our cloud is. If you can make it work in space, you can make it work anywhere.”

In addition to IBM’s expertise in AI and cloud, Biniok explains that a key factor in the DLR’s decision to choose IBM was data security. “We give the client the choice to opt out of data collection,” he says. “It’s their intellectual property, and we don’t use that data to train our general models unless the client gives us permission. The data belongs to the client, and the models belong to the client. Some other companies won’t give clients that choice.”

As of this writing, CIMON is still aboard the ISS, waiting for his next assignment. An identical version of CIMON resides in an Airbus lab in Germany, where it helps the team troubleshoot problems when they arise in space. A third, slightly lower-tech version is trotted out for events like media interviews and photo ops.

As a scientific experiment and technology demonstrator, CIMON is still in its infancy. “We’re breaking new ground with this,” says Biniok. “It’s far from a final product, but it’s a way to begin to understand how such systems should look in the future and how they can actually benefit the crew in space.”

16-Jan-2018: Airbus is developing the CIMON astronaut assistance system for the DLR Space Administration

Described by its creators as a “flying brain”, this 3D-printed artificial intelligence system will soon join the crew aboard the International Space Station (ISS) to assist astronauts. It will be tested during the European Space Agency’s Horizons mission between June and October this year.

CIMON will be the first AI-based mission and flight assistance system. The entire structure of CIMON is made up of plastic and metal, created using 3D printing. CIMON has a brain-like AI network and is designed to support astronauts in performing routine work, for example by displaying procedures or offering solutions to problems. With its face, voice and artificial intelligence, CIMON becomes a genuine ‘colleague’ on board.

Applications: With CIMON, crew members can do more than just work through a schematic view of prescribed checklists and procedures; they can also engage with their assistant. CIMON makes work easier for the astronauts when carrying out everyday routine tasks, helps to increase efficiency, facilitates mission success and improves security, as it can also serve as an early warning system for technical problems.

11-Nov-2018: World’s largest brain-like supercomputer

The world’s largest supercomputer designed to work in the same way as the human brain has begun operating for the first time. The newly formed million-processor-core Spiking Neural Network Architecture (SpiNNaker) machine is capable of completing more than 200 trillion (200 million million) actions per second, with each of its chips having 100 million transistors.

To reach this point, it has taken £15 million in funding, 20 years in conception and over 10 years in construction, with the initial build starting back in 2006. The SpiNNaker machine, designed and built at the University of Manchester in the UK, can model more biological neurons in real time than any other machine on the planet. Biological neurons are basic brain cells present in the nervous system that communicate primarily by emitting ‘spikes’ of pure electro-chemical energy. Neuromorphic computing uses large-scale computer systems containing electronic circuits to mimic these spikes in a machine.
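The kind of simplified spiking neuron such systems simulate in vast numbers can be sketched with a classic leaky integrate-and-fire model: the membrane potential integrates input current, leaks back toward rest, and emits a spike and resets whenever it crosses a threshold. The parameters below are illustrative, not SpiNNaker defaults.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the kind of simplified
# spiking model neuromorphic machines simulate at scale.
# Parameters are illustrative, not SpiNNaker defaults.

def simulate_lif(current, steps=1000, dt=0.1, tau=10.0,
                 v_reset=0.0, v_threshold=1.0):
    """Integrate input current; emit a 'spike' and reset at threshold."""
    v = v_reset
    spikes = []
    for step in range(steps):
        v += dt * (-v + current) / tau   # leaky integration toward input
        if v >= v_threshold:             # threshold crossed: fire
            spikes.append(step)
            v = v_reset                  # reset membrane potential
    return spikes
```

With a strong enough input the neuron fires periodically; with a weak input the leak wins and it never spikes, which is exactly the thresholded, event-driven behaviour that makes spiking networks cheap to communicate.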

SpiNNaker is unique because, unlike traditional computers, it does not communicate by sending large amounts of information from point A to B via a standard network. Instead it mimics the massively parallel communication architecture of the brain, sending billions of small amounts of information simultaneously to thousands of different destinations.
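That communication style can be caricatured in a few lines: a spike is a tiny packet carrying little more than its source neuron’s id, fanned out through a routing table to many destinations at once. This is a toy sketch; the real SpiNNaker performs this multicast routing in dedicated hardware, and the routing table below is invented.

```python
# Toy sketch of SpiNNaker-style multicast spike delivery.
# Illustrative only: real SpiNNaker routes packets in hardware;
# this routing table is invented.

ROUTING_TABLE = {
    0: [10, 11, 12],   # neuron 0 projects to neurons 10, 11 and 12
    1: [10, 13],       # neuron 1 projects to neurons 10 and 13
}

def deliver_spike(source_id, mailboxes):
    """Fan a one-word spike packet out to every destination neuron."""
    for dest in ROUTING_TABLE.get(source_id, []):
        mailboxes.setdefault(dest, []).append(source_id)
```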

SpiNNaker completely re-thinks the way conventional computers work. It's a machine that works more like a brain than a traditional computer.

Researchers eventually aim to model up to a billion biological neurons in real time and are now a step closer. To give an idea of scale, a mouse brain consists of around 100 million neurons and the human brain is 1,000 times bigger than that. One billion neurons is one per cent of the scale of the human brain, which consists of just under 100 billion neurons, all highly interconnected via approximately one quadrillion synapses.

One of the fundamental uses for the supercomputer is to help neuroscientists better understand how our own brain works. It does this by running extremely large-scale real-time simulations that simply aren’t possible on other machines. For example, SpiNNaker has been used to simulate high-level real-time processing in a range of isolated brain networks. This includes an 80,000-neuron model of a segment of the cortex, the outer layer of the brain that receives and processes information from the senses.

It has also simulated a region of the brain called the basal ganglia, an area affected in Parkinson’s disease, which gives it massive potential for breakthroughs in neuroscience, such as pharmaceutical testing.

The power of SpiNNaker has even recently been harnessed to control a robot, the SpOmnibot. This robot uses the SpiNNaker system to interpret real-time visual information and navigate towards certain objects while ignoring others.

26-Oct-2018: IIT Madras develops India’s first indigenous microprocessor

Researchers at the Indian Institute of Technology, Madras (IIT-M) have designed and booted up a microprocessor, India's first indigenously developed one, that can be used in mobile computing devices, embedded low-power wireless systems and networking systems. This microprocessor will help reduce reliance on imported microprocessors in the communications and defence sectors, and is on par with international standards.

The microprocessor, developed under the project 'Shakti', was fabricated at the Semi-Conductor Laboratory (SCL) of the Indian Space Research Organisation (ISRO) in Chandigarh, making it the first 'RISC-V microprocessor' to be completely designed and made in India. The processors developed under Shakti are based on an open, free instruction set architecture (ISA) called RISC-V, which provides software and hardware freedom for future developments.
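Because the RISC-V encoding is openly specified, even its field layout is easy to illustrate. Below is a minimal decoder for one 32-bit I-type instruction, a sketch for illustration and in no way part of the Shakti toolchain; the sign extension of the immediate is deliberately ignored to keep it short.

```python
# Minimal decoder for a 32-bit RISC-V I-type instruction, showing the
# fixed field layout defined by the open RISC-V ISA.
# Illustrative sketch only; not part of the Shakti toolchain.

def decode_itype(word):
    """Split a 32-bit I-type instruction word into its bit fields."""
    return {
        "opcode": word & 0x7F,            # bits 6..0
        "rd":     (word >> 7) & 0x1F,     # bits 11..7
        "funct3": (word >> 12) & 0x7,     # bits 14..12
        "rs1":    (word >> 15) & 0x1F,    # bits 19..15
        "imm":    (word >> 20) & 0xFFF,   # bits 31..20 (sign bit ignored here)
    }

# The canonical instruction "addi x1, x0, 5" encodes as 0x00500093.
fields = decode_itype(0x00500093)
```

The fixed positions of `opcode`, `rd`, `rs1` and the immediate across all instruction formats are part of what makes RISC-V implementations comparatively simple to design and verify.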

The indigenous design, development and fabrication approach reduces the risk of deploying systems that may be infected with backdoors and hardware Trojans. This will gain even more significance when systems based on Shakti processors are adopted by strategic sectors such as defence, nuclear power installations and government agencies and departments.

With the advent of Digital India, there are several applications that require customisable processor cores. The 180nm fabrication facility at SCL Chandigarh is crucial in getting these cores manufactured within the country.

The Shakti processor family targets clock speeds to suit various end-user application devices, such as consumer electronics, mobile computing devices, embedded low-power wireless systems and networking systems, among others. The project is funded by the Ministry of Electronics and Information Technology, Government of India.

IIT-M said that the impact of this completely indigenous fabrication is that India has now attained independence in designing, developing and fabricating end-to-end systems within the country, leading to self-sufficiency. With a large percentage of applications requiring sub-200 MHz processors, the current success paves the way to the productisation of many handheld and control application devices.

The front-end design of the Shakti processors is developed with Bluespec, an open-source High-Level Synthesis (HLS) language. Shakti and its eco-system with its modular design approach enables design and development of application-specific end-user computing and communicating systems.

In July 2018, an initial batch of 300 chips, named RISECREEK, was produced under Project Shakti and fabricated at multinational chip manufacturer Intel's facility in Oregon, US; these chips successfully booted the Linux operating system. Now, the fabrication has been done in India.