Google Introduces Multitasking Neural Network
Deep-learning systems tend to be one-trick ponies: they’re great at the task they’ve been trained to do, but pretty awful at everything else. Now a new neural network from Google suggests that AI can be taught to multitask after all.
Most deep-learning systems are built to solve specific problems, such as recognizing animals in photos from the Serengeti or translating between languages. But if you take, for instance, an image-recognition algorithm and then retrain it to do a completely different task, such as recognizing speech, it usually becomes worse at its original job.
Humans don’t have that issue. We naturally use our knowledge of one problem to solve new tasks, and we don’t usually forget how to use a skill when we start learning another. Google’s neural network takes a tiny step in this direction by learning to solve a range of different problems simultaneously, without specialising in any one area.
The neural network from Google Brain – one of the search giant’s deep-learning teams – learned how to perform eight tasks, including image and speech recognition, translation and sentence analysis. The system, called MultiModel, is made up of a central neural network surrounded by subnetworks that specialise in specific tasks relating to audio, images or text.
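The layout described above – modality-specific subnetworks feeding one shared central network, with per-task outputs – can be sketched in a few lines. This is a toy illustration only, not Google’s actual code: the dimensions, layer shapes and names (`encoders`, `trunk`, `heads`) are all assumptions chosen to show the wiring, with random matrices standing in for trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # size of the shared representation (assumed)

def linear(in_dim, out_dim):
    """Random weight matrix standing in for a trained layer."""
    return rng.standard_normal((in_dim, out_dim)) * 0.1

# Modality subnetworks: each input type -> shared DIM-dimensional space.
encoders = {
    "image": linear(64, DIM),   # e.g. flattened image features
    "audio": linear(32, DIM),   # e.g. a spectrogram frame
    "text":  linear(48, DIM),   # e.g. a token embedding
}

# One central network shared by every task.
trunk = linear(DIM, DIM)

# Task-specific output heads reading the shared representation.
heads = {
    "classify":  linear(DIM, 10),
    "translate": linear(DIM, 48),
}

def forward(x, modality, task):
    h = np.tanh(x @ encoders[modality])  # modality subnetwork
    h = np.tanh(h @ trunk)               # shared central network
    return h @ heads[task]               # task head

# The same trunk weights serve both tasks below.
img_logits = forward(rng.standard_normal(64), "image", "classify")
txt_out = forward(rng.standard_normal(48), "text", "translate")
print(img_logits.shape, txt_out.shape)  # (10,) (48,)
```

Because every task routes through the same `trunk`, training on one task’s data updates weights that every other task also uses – which is the mechanism behind the cross-task benefit the article goes on to describe.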
Although MultiModel did not break any records for the tasks it attempted, its performance was consistently high across the board. Its image-recognition accuracy of 86 per cent was only around 9 percentage points below the best specialised algorithms – matching the abilities of the best algorithms in use five years ago.
The system also showed other benefits. Deep-learning systems usually need to be trained on large amounts of data to perform a task well. But MultiModel seems to have come up with a neat way of sidestepping that, by learning from data relating to a completely different task.