WuDao 2.0, an AI ten times more powerful than GPT-3 that writes poems in Mandarin and English (and predicts protein structures)






GPT-3, the artificial intelligence developed by OpenAI and launched a year ago, caused a sensation among AI fans for its ability to process natural language and automatically generate texts that, in many cases, appeared to be written by humans.



OpenAI achieved this by endowing the model with 175 billion parameters, a milestone at the time and, broadly speaking, the feature that gave that 'artificial mind' its complexity. Well then: just one year later, we already have a similar AI that is ten times more complex.





Its name is WuDao 2.0, and it was presented two days ago at the Beijing Academy of Artificial Intelligence (BAAI) Conference, an event held both online and in person, with the participation of 30,000 professionals from the AI sector.



BAAI, the entity responsible for the development of WuDao 2.0 (with the collaboration of companies such as Xiaomi), announced that its more than 1.75 trillion parameters not only surpass the precedent set by the popular GPT-3, but also the current record held by Google's 1.6-trillion-parameter Switch Transformer.









The statements of Tang Jie, BAAI's deputy academic director, to China.org.cn, the Chinese government's official web portal, indicate that the plans for this new AI are anything but humble:




"WuDao 2.0 aims to enable machines to think like humans and achieve cognitive skills beyond the Turing test. […] The road to AI runs through big models and supercomputers. "




Originally, the objective behind developing WuDao (version 1.0 was launched last March) was less ambitious: to promote natural language processing research for the Chinese language, an area dominated by research on English (a language in which this AI is also competent).



Certainly, the capabilities of WuDao 2.0 go much further: it has proven to excel in fields such as image recognition and generation, as well as in the prediction of 3D protein structures. As for text, it has also shown itself competent at generating essays, poems and songs written in traditional Chinese.






WuDao owes all these abilities to having been trained on 4.9 TB of images and text (including 1.2 TB of documents in both Chinese and English). In the words of Blake Yan, a BAAI researcher:




"These sophisticated models, trained with gigantic datasets, only require a small amount of new data when used for a specific function because they can transfer already learned knowledge to new tasks, just like humans."




Via | South China Morning Post