Introducing the Power of GPT-3

By Dhnesh on Oct. 19, 2021, 6:58 a.m.

"Attention is all you need", they said. But with the inclusion of GPT it is confirmed that we are here for an AI text Treat. Answering questions, completing text reading, comprehension summarization and more. This message was developed by scientists at openai and they called it gpt2. The goal was to be able to perform this task with as little supervision as possible. This means that they Unleashed this a great time to read the internet and the question was, what will the AI learn during this process? That is the tricky question and to be able to answer it, have a look at this paper from 2017. When a I was given a bunch of Amazon product reviews and the goal was to teach it to be able to generate new ones or continue review. When given one then something unexpected happened. Surprisingly few neurons were able to continue these reviews. They notice that the neural network has built up a knowledge of, not only language, but also be at the sentiment detector. Is this means that the AI recognized that, in order to be able to continue to review it, not only needs to learn English, but also needs to be able to detect whether the review seems positive or not. If we know that we have to complete review that seems positive from a small snippet. We have a match easier time doing it. And now back to GPT to as it was asked to predict the next character in sentences of not reviews, but of any kind we asked what neural network would learn. Well now we know that, of course, it learn, whatever it needs to learn to perform the sentence completion properly. And to do this, it needs to learn English by itself. And that's exactly what it did. It. Also learned about a lot of garbage .Not going to discuss them. It continued in a way that was not only coherent bath have quite a bit of truth to it, note that there was no explicit instruction for the AI apart from it, being a leash on the internet and reading it. And now the next version of peered by the name gpt-3, this version is now more than a hundred times bigger. So, our first question is, how much better can AI get if we increase the size of a neuron at work, let's have a look together. These are the results on a challenging reading comprehension test as a function of the number of parameters. As you see around one and a half billion parameters, which is roughly equivalent to gpt2, it has learned a great deal but it's understanding is nowhere near the level of human comprehension. However, as we grow the network, something incredible happens non-trivial capability start to appear as we approach. The hundred billion parameters. It nearly match the level of humans, my goodness. This was possible before, but only when you're on at work, that are specifically designed for a narrow task in comparison, gpt-3 is much more General. Let's test that generality and have a look at some of the practical applications together.Firstly, I made this AI accessible to a lucky few people. And it turns out, it has read a lot of things on the internet which contains a lot of cold. So it can generate Website Layout from a written description to it. Also learned how to generate properly formatted plot from a tiny prompt written in plain English. Not just one kind, many kinds, perhaps, to the joy of technical PhD students around the world and it can properly types that mathematical equations from a plain English description as well. And lastly, you can also translate a complex legal text into plain language or the other way around. 
And as you see here, it can do much, much more. However, of course, this iteration of GPT also has its limitations. We haven't seen the extent to which these examples are cherry-picked; in other words, for every good output that we marvel at, there might have been one, or a dozen, tries that did not come out well. We don't exactly know. But the main point is that working with GPT-3 is a really peculiar process: we know that a vast body of knowledge lies within it, but that knowledge only emerges if we can bring it out with a properly written prompt. It almost feels like a new kind of programming that is open to everyone, even people without any programming or technical background; one more sketch of what such a "program" can look like follows below. We have only scratched the surface of what it can do here, so make sure to have a look. I can only imagine what we will be able to do with GPT in the near future. What a time to be alive!
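As a parting example, and in the same hedged spirit as the sketch above, here is what that new kind of programming can look like for the website layout demo. Again, the prompt and parameters are illustrative assumptions, not the ones behind the original demo:

```python
# Same hypothetical setup as before: the prompt describes the page,
# and the model is left to write the markup.
import openai

openai.api_key = "YOUR_API_KEY"

prompt = (
    "Write the HTML for a simple landing page with a large centered "
    "title that says 'Hello, GPT-3', a short subtitle underneath, "
    "and a green 'Sign up' button.\n\n<!DOCTYPE html>"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=200,
    temperature=0.2,  # code generation benefits from low temperature
)

# The completion continues from "<!DOCTYPE html>", so prepend it back.
print("<!DOCTYPE html>" + response["choices"][0]["text"])
```

The only thing that changed between the two sketches is the prompt, which is exactly the point.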