The GPT-3 neural network ran a motivational blog in English, and everyone liked it. How it threatens copywriters and writers





OpenAI, co-founded by Elon Musk a few years ago, released a new neural network algorithm, GPT-3, in June. Today it is the most advanced system for working with natural language.



Its capabilities are simply colossal. With GPT-3, you can create any type of text, including complex technical writing. For example, when the neural network was given the task of writing a text about itself, it produced an article with the loud headline "OpenAI's GPT-3 may be the biggest thing since bitcoin."



But can a neural network do more than just string words together in imitation of a person? Can it actually create texts with meaning: poems, stories, or even novels that are interesting to read? Let's figure it out.





GPT-3: a neural network algorithm an order of magnitude ahead of everything that came before



GPT-3 is the most complex language model ever created by man.



It models the probability of a given sequence of words. What sets GPT-3 apart from earlier models is its sheer scale.



It uses 175 billion parameters to generate text and was trained on over 1.5 trillion words. Moreover, the training texts were highly varied: everything from forum posts to classic literature.



The system predicts a continuation from the most likely blocks of text. The user only needs to supply a starting point, a prompt, from which the neural network takes over.
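To make the idea of "predicting the most likely continuation" concrete, here is a toy sketch: a bigram counter that always picks the most frequent next word. This is a drastic simplification of what GPT-3 does at vastly greater scale (the corpus, function names, and greedy decoding here are our illustrative choices, not GPT-3's actual mechanics):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count how often each word follows each other word."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, prompt_word, length=5):
    """Greedily append the most likely next word, starting from a prompt."""
    out = [prompt_word]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # no known continuation for this word
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran"
model = train_bigrams(corpus)
print(generate(model, "the"))  # continues with the most frequent follower of each word
```

GPT-3 works in the same spirit, choosing probable continuations, but it conditions on long stretches of context rather than a single preceding word, which is why a richer prompt yields a more accurate response.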







The better the system understands the context of the request, the more accurate the response will be.



One gets the impression that the neural network can really understand natural languages.



Sharif Shameem, co-founder and CEO of Debuild.co, posted the results of testing various capabilities of the neural network on his Twitter account. And they are really impressive.







In the video, you can see Sharif typing plain text into the input line, as if explaining to a designer or developer what he needed. The system interpreted it and returned the result.



I was especially amused by the "button that looks like a watermelon."







Neural network vs. human:



Texts from the neural network are practically indistinguishable from human ones. Liam Porr, a student at Berkeley, ran an experiment: for two weeks he posted articles generated by GPT-3 on his blog, Adolos.



The neural network's motivational articles were read by 26,000 people. And, as Liam says, only one of them guessed that the texts were actually written not by a person but by a machine. Even that comment was downvoted by other readers:







If you read the articles themselves, you get the impression that they really were written by some motivational trainer or coach: the style and phrasing fit, and the text structure is sound. Occasionally a sentence or phrase sounds slightly unnatural, but even that does not give the machine away. After all, people make the same minor mistakes.



This is one of the reasons GPT-3 has not been released to the general public. To get access to the OpenAI API, you need to fill out an application stating exactly what you plan to use the neural network for.



Even at the stage of creating GPT-2, the previous version of the algorithm, the developers recognized the potential danger: the system could become a weapon of information warfare. Such a neural network can generate fake news at a monstrous rate. Used maliciously, it would simply bury the internet under false content.



That is why OpenAI plans to sell access by subscription in the future, and not to everyone: only to those who can show they intend to use it "for peaceful purposes."



Neural networks, fiction, and poetry



A neural network may well rival news writers or authors of technical articles, but writing novels or poetry is much more complicated. The basic principles of fiction differ from those of technical writing. Teaching a neural network to pick rhymes and keep a rhythm is no problem, but when it comes to layered meaning, machines are still bad at it.



There is a Russian neural network, "Porfirevich," based on the GPT-2 algorithm. It "knows" Russian and was trained not only on prose but also on poetry. You can try it out in the "Neuropoet" Telegram bot. You only need to write the first line or two, and the system does the rest. We checked, and it turned out so-so.







The neural network turned Pushkin's first line into an incoherent jumble of text in which military motifs can be guessed at. And while some images might be considered successful ("Oak asks for bread from spring" or "Autumn flowers of spring"), others induce a facepalm.



An analogy can be drawn with the infinite monkey theorem: monkeys pressing random keys on a typewriter for an indefinitely long time will sooner or later type out "War and Peace."



Here the neural network acts as such a monkey, assembling words and images instead of individual letters, but most of them still come out random.
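The monkey analogy is easy to quantify. A rough back-of-envelope calculation (the alphabet size and target phrase below are our illustrative choices) shows why almost everything such a purely random process produces is noise:

```python
# Probability that uniformly random typing over a 27-symbol alphabet
# (26 letters plus space) reproduces a fixed target phrase in one try.
alphabet_size = 27
target = "war and peace"  # 13 symbols, a tiny fraction of the actual novel

p_one_try = (1 / alphabet_size) ** len(target)
expected_tries = alphabet_size ** len(target)

print(f"P(success in one try) = {p_one_try:.3e}")
print(f"Expected number of tries = {expected_tries:.3e}")
```

Even for a 13-character phrase, the expected number of tries is on the order of 10^18. A language model narrows this search enormously by choosing likely words, yet without real understanding, most of its combinations are still accidental.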




In fact, many developers have built GPT-2-based poetry generators. Here, for example, is an English lyrics generator that uses a lightweight database of 13,000 verses.



It, too, generated something not very intelligible:







As for fiction books, the situation is even worse. That was to be expected: a neural network cannot build a plot and has no idea what a storyline or a climax is. Still, some experiments produce such choice absurdity that one can only marvel.



How about some Bible-based erotic fiction? An essay titled "The Orange Erotic Bible" was posted by unknown developers... It was created by a neural network based on the same GPT-2, trained on erotic novels and stories and then given excerpts from the Bible as a starting point for generation. The story came out long, over 60,000 words. And the result is surprising, to put it mildly.







Here is a small excerpt from the story. And no, we will not comment on it.



There is even an online store for books written by AI. It looks pretty creepy, because the book covers are machine-made too.







True, its creators admit they have sold only 19 copies. If the quality is anything like the "Orange Bible," we are not surprised at all. Writers can sleep soundly for now.



Or not?



All the previous examples were only about GPT-2, and they were so-so.



But there is already a poet neural network based on the GPT-3 algorithm, and it shows some pretty interesting results.



The neural network can write poetry in a given style and on a given topic. Arram Sabeti, CEO of ZeroCater, has already run an experiment: he fed the network prompts specifying the subject and style of a poem, and the machine did the rest.



Here is one of the requests:



Here's a poem by Dr. Seuss about Elon Musk launching rockets with SpaceX, building electric cars at Tesla, selling flame throwers, dating Grimes, fighting the SEC, crushing short sellers, building brain chips to fight AI, and colonizing mars, and tweeting at all hours of day and night ... Musk wants to make humans interplanetary to prevent extinction. The poem rhymes every other line with an ABAB structure. The rhyme structure is typical of Dr. Seuss nursery rhymes.




The poem turned out to be amusing. Despite the obvious lapses in rhythm and flubs in rhyme, it makes sense. It is long, so we will only include part of it here.



Elon Musk

By Dr. Seuss



Once there was a man

who really was a Musk.

He liked to build robots

and rocket ships and such.



He said, “I’m building a car

that’s electric and cool.

I’ll bet it outsells those

Gasoline-burning clunkers soon!”



They sell cool cars

that are electric and clean.

And then he wanted to go

even farther, you see.



The first Tesla, Roadster,

Was just the start of a bigger dream.

He said, “I’m building a plane

that’s electric and green”.




The progress compared to the previous version of the algorithm is amazing. This is not a pile of incoherent phrases but a structurally decent poem. Yes, it is still far from ideal, the rhymes are weak, and the rhythm limps, but even at this stage it is startling that a machine came up with it.



Some images and phrases are simply beautiful, for example "gasoline-burning clunkers." If you did not know a machine wrote it, you might think it was the work of some schoolboy who decided to write a poem about his idol.



And damn it, that's cool.



You can read the full version of the poem and other creations of the neural network here.



***



There are still very few experiments with GPT-3 in poetry or fiction, so for now we have to be content with these. But if the neural network has advanced this far in creating meaningful texts, it may well soon be able to write quite readable works of fiction.



Perhaps it will not be GPT-3 but some GPT-4 or even GPT-5, but the trajectory of development is clearly there. Already the neural network can write technical articles, news, short stories, and poetry.



And as the algorithms develop further, the quality of the generated texts will only grow. So yes, AI can write like a human: at the level of a student for now, but let's see where things stand in 10-20 years. What do you think?



Online school EnglishDom.com: we inspire people to learn English through technology and human care







Only for Habr readers: the first lesson with a teacher on Skype is free! And when buying classes, you will receive up to 3 lessons as a gift!



Get a full month of ED Words premium subscription as a gift.

Enter the promo code neurowriter on this page or directly in the ED Words app. The promo code is valid until 09/29/2021.


