
Google Makes NLP Model ALBERT Open Source

  • ALBERT has been released as an open-source implementation on top of TensorFlow
  • It reduces model size in two ways: by sharing parameters across the hidden layers of the network and by factorising the embedding layer

According to a report by i-programmer, Google has made ALBERT (A Lite BERT), a deep-learning natural language processing (NLP) model, available in an open-source version. The report said that ALBERT's developers claim it uses fewer parameters than BERT without compromising accuracy.

Bidirectional Encoder Representations from Transformers (BERT) is a self-supervised method that Google released in 2018. The report said that the method delivered strong results on a range of NLP tasks by relying on unannotated text drawn from the web.

Improved performance on 12 NLP tasks

According to the report, ALBERT is an upgrade to BERT that offers improved performance on 12 NLP tasks, including the Stanford Question Answering Dataset (SQuAD v2.0) and the SAT-style reading comprehension RACE benchmark. ALBERT has been released as an open-source implementation on top of TensorFlow and includes a number of ready-to-use pre-trained language representation models.
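
For developers who want to try the released models, a sketch along the following lines should work. Note that the TensorFlow Hub handle and the input/output signature shown here are assumptions based on how published ALBERT encoders are typically packaged, not details confirmed by the report.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Hypothetical TF Hub handle -- check tfhub.dev for the current ALBERT releases.
albert = hub.KerasLayer("https://tfhub.dev/tensorflow/albert_en_base/2")

# Toy batch of one already-tokenised sequence. Real use requires the
# SentencePiece tokeniser that ships alongside the model.
inputs = dict(
    input_word_ids=tf.constant([[2, 10975, 15, 51, 3]], dtype=tf.int32),
    input_mask=tf.constant([[1, 1, 1, 1, 1]], dtype=tf.int32),
    input_type_ids=tf.constant([[0, 0, 0, 0, 0]], dtype=tf.int32),
)
outputs = albert(inputs)
print(outputs["pooled_output"].shape)  # e.g. (1, 768) for the base model
```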

The report said that, according to a paper presented by its developers at the International Conference on Learning Representations (ICLR), ALBERT reduces model size in two ways: by sharing parameters across the hidden layers of the network and by factorising the embedding layer.
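
To make those two techniques concrete, here is a minimal Keras sketch under toy assumptions: the Dense block stands in for a full Transformer layer, and the sizes V, H and E are illustrative choices rather than ALBERT's published configuration.

```python
import tensorflow as tf

V, H, E = 30000, 768, 128  # vocab size, hidden size, factorised embedding size

# BERT-style embedding: one V x H lookup table.
print("Unfactorised embedding params:", V * H)          # 23,040,000

# ALBERT-style factorisation: a V x E lookup followed by an E x H projection.
print("Factorised embedding params:  ", V * E + E * H)  # 3,938,304

# Cross-layer parameter sharing: build one block and apply it repeatedly,
# so twelve "layers" of depth cost only one layer's worth of weights.
shared_block = tf.keras.layers.Dense(H, activation="gelu")
x = tf.keras.Input(shape=(H,))
h = x
for _ in range(12):
    h = shared_block(h)  # same object each time, so the weights are shared
model = tf.keras.Model(x, h)
model.summary()  # total parameter count reflects a single Dense layer
```

The factorisation saving grows with the hidden size, since the full V x H table scales linearly with H while the factorised pair adds only E parameters per extra hidden unit.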
