Disclaimer: this news selection is entirely subjective.
Some items are news only to me and may have appeared quite a while ago; I simply noticed them just now.
Each item comes with my commentary explaining why it caught my interest.
It looks like Transformers from Hugging Face is becoming the main repository of ready-made models for NLP (Natural Language Processing).
Previously this role was played by Google's TensorFlow Hub, which is tied to the Google ecosystem. Now NLP models from Microsoft and Facebook, not just Google, are published on the Hugging Face hub as well, and the Transformers library works with both PyTorch and TensorFlow.
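To illustrate why the hub is convenient, here is a minimal sketch of pulling a ready-made model through the Transformers `pipeline` API. It assumes the `transformers` package is installed; the model weights (the pipeline's default sentiment model) are downloaded from the hub on first use.

```python
# Load a ready-made sentiment-analysis model from the Hugging Face hub.
# Requires the `transformers` package; the default model for this task
# is fetched from the hub the first time the pipeline is created.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers makes pretrained NLP models easy to use.")[0]
print(result["label"], round(result["score"], 3))
```

The same one-liner works whether the underlying model was published by Google, Microsoft, or Facebook, which is exactly what makes a single hub attractive.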
sktime is a library for working with time series. For time series forecasting, Facebook Prophet and Amazon DeepAR are perhaps the best-known options; besides sktime, there are also PyTorch Forecasting and Amazon GluonTS. sktime itself is open source.
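To make the task concrete, here is a plain-Python sketch of the two simplest forecasting baselines that libraries like sktime implement out of the box: a naive forecast (repeat the last value) and a seasonal naive forecast (repeat the value from one season ago). The function names and data here are illustrative, not sktime's actual API.

```python
# Two classic forecasting baselines, written out by hand.
# Real libraries (sktime, GluonTS, etc.) provide these as ready-made
# forecasters alongside far more sophisticated models.

def naive_forecast(history, horizon):
    """Repeat the last observed value `horizon` steps ahead."""
    return [history[-1]] * horizon

def seasonal_naive_forecast(history, horizon, season_length):
    """Repeat the value observed one full season earlier."""
    return [history[-season_length + (h % season_length)] for h in range(horizon)]

# Illustrative monthly figures, one year of history.
monthly_sales = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]
print(naive_forecast(monthly_sales, 3))                # → [118, 118, 118]
print(seasonal_naive_forecast(monthly_sales, 3, 12))   # → [112, 118, 132]
```

A baseline like this is also the standard yardstick: a heavy model such as DeepAR is only worth deploying if it beats the seasonal naive forecast on held-out data.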
Google has made an interesting claim with the SMITH model. The previous leader in NLP is BERT. SMITH, according to its authors, can work with text four times longer than BERT (2K tokens versus 0.5K). The authors added a few tricks to the model's design and training, but these changes do not seem fundamentally new; similar tricks appear in large numbers in other researchers' work. Google has virtually unlimited resources and can train many such models to the state-of-the-art level.
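The length limit is why this matters in practice: with a 512-token model, a long document has to be split into windows and the per-window results combined afterwards. Here is a sketch of that chunking step, with whitespace tokenization standing in for a real subword tokenizer and the overlap size chosen arbitrarily for illustration.

```python
# Splitting a long token sequence into overlapping windows that fit a
# model's input limit (512 for BERT; reportedly 2048 for SMITH).
# Overlap keeps context that would otherwise be cut at window borders.

def chunk_tokens(tokens, max_len=512, overlap=64):
    """Split `tokens` into windows of at most `max_len`, overlapping by `overlap`."""
    step = max_len - overlap
    return [tokens[i:i + max_len] for i in range(0, max(len(tokens) - overlap, 1), step)]

doc = ("word " * 1200).split()          # a 1200-token "document"
chunks = chunk_tokens(doc)
print(len(chunks), [len(c) for c in chunks])   # → 3 [512, 512, 304]
```

A 2K-token model would take this document in one pass, avoiding both the bookkeeping and the loss of cross-window context that chunking introduces.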