GPT-3, one of the largest artificial intelligence language models to date, has 175 billion parameters and was trained on an estimated 45 terabytes of text data. It can do more than autocomplete: it can generate code and write stories much as a human would, though, like a human, it can also make mistakes.