Google’s new AI language model can comprehend entire books
Posted by: Brij Bhushan
Friday, 17 January 2020

One of the prime challenges for a language-based AI model is understanding the context of the surrounding content. To address this, Google has introduced a new model called Reformer, which can understand the context of 1 million lines using just 16GB of space. The company built it to fix the shortcomings of its older model, Transformer, a neural network that compares the words in a passage to one another to understand the relationships between them. Current models support understanding only a few lines or paragraphs before and after the text in focus. However, because it relies on pairwise matching, Transformer takes a…
This story continues at The Next Web
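
For readers curious what that pairwise matching looks like, here is a minimal sketch of standard dot-product self-attention, assuming nothing beyond the description above; the function name, array shapes, and random data are illustrative, not Google’s code. The point is that every token is scored against every other token, so the intermediate score matrix grows with the square of the input length, which is the kind of cost Reformer is designed to avoid.

```python
# Illustrative sketch only (not Google's implementation): plain dot-product
# self-attention, where each token is compared with every other token.
import numpy as np

def attention(queries, keys, values):
    # queries, keys, values: (n_tokens, d_model) arrays for one sequence
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])  # (n, n) pairwise scores
    # softmax over each row turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values  # each output token is a weighted mix of all tokens

# Toy example: 8 tokens with 4-dimensional embeddings (made-up sizes)
n_tokens, d_model = 8, 4
rng = np.random.default_rng(0)
x = rng.normal(size=(n_tokens, d_model))
out = attention(x, x, x)  # self-attention: the sequence attends to itself
print(out.shape)          # (8, 4); the intermediate score matrix was (8, 8)
```

With 8 tokens the score matrix is tiny, but at book length the same n x n comparison becomes the memory bottleneck the article is describing.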