OpenAI held its first developer conference a few hours ago and used it to announce important news, including GPT-4 Turbo. The new model stands out for two key characteristics: it is more powerful and it is cheaper to access. There is also important news about GPT-4 itself.
Over the past few months, AI assistants like GPT-4 have become the latest technological trend, and to a greater or lesser extent you have probably tried them. Now the company behind them has presented its new model, which is expected to pave the way for the many other AIs that continue to appear around it daily.
A much more capable AI
As you can imagine, one of the most important features in the presentation of GPT-4 Turbo, the new version of the chatbot, is its updated knowledge. It now covers events up to April 2023, which means it will be more flexible and its answers will be far more up to date with what has happened in the world.
But an updated knowledge cutoff could be taken for granted in a new release. Other features are more decisive, such as its ability to handle prompts of up to 128,000 tokens. To give an idea of the enormous capacity that figure represents, it is enough text to fill some 300 pages (the length of a book like Harry Potter and the Prisoner of Azkaban).
Two versions and their prices
Developers working with the paid version of GPT-4 can now access GPT-4 Turbo through the gpt-4-1106-preview model. OpenAI is expected to make it generally available soon, possibly once it has seen how early adopters are using it. According to the company, that will happen in the coming weeks.
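As a sketch, this is roughly what a call to the new model looks like with OpenAI's Chat Completions API. The model name gpt-4-1106-preview comes from the announcement; the helper below only assembles the request payload, so it runs without an API key, and the actual network call (shown in comments) assumes the standard openai Python client.

```python
# Minimal sketch of a GPT-4 Turbo chat request. Only the payload is built
# here, so no API key or network access is needed to run this file.

def build_chat_request(prompt: str, model: str = "gpt-4-1106-preview") -> dict:
    """Assemble the keyword arguments for a chat completion call."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }

if __name__ == "__main__":
    # Actually sending the request needs the `openai` package and a key:
    #   from openai import OpenAI
    #   client = OpenAI()  # reads OPENAI_API_KEY from the environment
    #   response = client.chat.completions.create(**build_chat_request("Hello!"))
    #   print(response.choices[0].message.content)
    print(build_chat_request("Hello!")["model"])  # prints "gpt-4-1106-preview"
```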
Keep in mind that GPT-4 Turbo is available in two versions. The first reflects the classic use of the chatbot and is compatible only with text requests; this is the version in the API update mentioned in the previous paragraph. But OpenAI also offers a second version that supports image analysis.
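For the image-capable version, requests mix text and image parts inside a single message. A minimal sketch, assuming the vision preview is exposed under the model name gpt-4-vision-preview; the builder below is illustrative and runs offline:

```python
# Sketch of a vision request: the user message carries a list of content
# parts, mixing a text question with an image URL to analyze.

def build_vision_request(question: str, image_url: str,
                         model: str = "gpt-4-vision-preview") -> dict:
    """Assemble a chat request that attaches an image to the question."""
    return {
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    }
```

The same payload would then be sent through the chat completions endpoint exactly like a text-only request.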
And the price? It all depends on the tokens you use. Tokens are the unit ChatGPT uses to measure both the text we feed into the chatbot and the text the program uses to respond to us. They are not equivalent to letters, nor to words; the closest comparison might be syllables, although that is not an exact definition either. The key idea is that tokens are the measurement system and that the count depends on several factors, though it is not hard to get a feel for it once you have worked with the tool for a while.
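Because the exact count depends on the tokenizer, a rough rule of thumb can be sketched from the figures in this article (roughly 730 words per 1,000 tokens). For exact counts you would use OpenAI's tiktoken library instead; the ratio below is an approximation, not an official figure.

```python
# Back-of-the-envelope token estimate from a word count, using the
# ~700-750 words per 1,000 tokens ratio quoted in the article.

def estimate_tokens(text: str, words_per_token: float = 0.73) -> int:
    """Approximate the token count of `text` from its word count."""
    words = len(text.split())
    return round(words / words_per_token)
```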
That said, GPT-4 Turbo costs $0.01 per 1,000 tokens in the requests you make (in other words, the text you type) and $0.03 per 1,000 tokens in the program's responses. As a rough approximation, 1,000 tokens corresponds to around 700-750 words written or received. With these figures, OpenAI says it has worked to make the AI much cheaper to use than GPT-4, which should encourage uses such as summarizing texts.

And the images? That price is harder to calculate. The company says it will depend on the image's size and, as an example, states that a photo with a resolution of 1080 x 1080 pixels will cost $0.00765.
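The text pricing above is simple enough to capture in a small helper, using the per-1,000-token figures quoted at the conference:

```python
# Estimate the cost of a GPT-4 Turbo text request from the announced
# prices: $0.01 per 1,000 input tokens, $0.03 per 1,000 output tokens.

INPUT_PRICE_PER_1K = 0.01   # USD per 1,000 prompt tokens
OUTPUT_PRICE_PER_1K = 0.03  # USD per 1,000 response tokens

def request_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost of one request at GPT-4 Turbo rates."""
    return (input_tokens / 1000 * INPUT_PRICE_PER_1K
            + output_tokens / 1000 * OUTPUT_PRICE_PER_1K)

# Example: filling the whole 128,000-token context window and getting a
# 500-token answer back would cost $1.28 + $0.015.
print(f"${request_cost_usd(128_000, 500):.3f}")  # prints "$1.295"
```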
The good news for the community is that OpenAI has not only focused on presenting GPT-4 Turbo, but has also announced news for GPT-4. The company says it has put a team to work reviewing the program's performance to iron out the areas where it is not performing at its best, a task for which it concluded a dedicated human team was necessary. Initially it seems OpenAI thought it could adjust GPT-4 just as it did with GPT-3.5, but it discovered the process was more demanding than anticipated.
On the other hand, although token prices for GPT-4 remain the same as before, the company has doubled the limit of tokens per minute that users can consume. The hope is that this makes the user experience more satisfying, although some thought a change like this could have been accompanied by some kind of rate reduction.
Other new features announced during the conference include JSON mode support for GPT-4 Turbo and GPT-3.5 Turbo, and the availability of GPTs, customizable versions of the chatbot aimed at companies and professional users. This system lets you build your own versions of ChatGPT without any programming knowledge, and since they can be created and used both privately and commercially, it may help AIs based on OpenAI technology multiply at high speed.
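As an illustration of JSON mode, the Chat Completions API accepts a response_format parameter that constrains the model to emit valid JSON. A minimal sketch of the request payload, runnable offline; note that the API also expects the word "JSON" to appear somewhere in the messages when this mode is enabled:

```python
# Sketch of a JSON-mode request: response_format tells the model to
# return syntactically valid JSON instead of free-form text.

def build_json_mode_request(prompt: str,
                            model: str = "gpt-4-1106-preview") -> dict:
    """Assemble a chat request with JSON mode enabled."""
    return {
        "model": model,
        "response_format": {"type": "json_object"},
        "messages": [
            # JSON mode requires mentioning JSON in the conversation.
            {"role": "system", "content": "Reply only with valid JSON."},
            {"role": "user", "content": prompt},
        ],
    }
```

The response's message content can then be parsed directly with json.loads, since the model is constrained to well-formed output.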