Will Douglas Heaven / MIT Technology Review:
Meta’s AI lab creates Open Pretrained Transformer, a 175B-parameter language model matching GPT-3’s size, and gives it to researchers for free — Meta’s AI lab has created a massive new language model that shares both the remarkable abilities and the harmful flaws of OpenAI’s pioneering neural network GPT-3.