I found an open-source AI programme which can generate text from my conversational snippets (subject to approval). I think this might replace the balloons. Perhaps I could have envelopes with snippets inside; participants would take an addressed envelope, open it, and input the text into the AI, and, over the course of the exhibition, a narrative would be generated.

Starting from the first sentence of my short essay as the prompt:

She had to desecrate each book and underlined everything which could lead to explosive and amazing writing.

  1. She had to desecrate each book and underlined everything which could lead to explosive and amazing writing. The one, that ultimately led to it was Uru. (It was in my first writing book and I immediately told my husband: “GOD!!! THIS IS AN EPIC GENRE OR SHIT”). She finished Uru and sent it to me asking me to read the rest and finish it. But I never finished it! I did not finish all of my books for a long time and I spent way too many years trying to find one which was not boring

 

  1. She had to desecrate each book and underlined everything which could lead to explosive and amazing writing. She said “couldn’t spoil it for him by picking it up”, and I thought it was awesome. So as a favour, I’ll buy him a cup of cocoa for school today as a thank you! Thank you

 

  1. She had to desecrate each book and underlined everything which could lead to explosive and amazing writing.
    “Are you done yet?” asked Nora, “would it not be better if you had a little bit of this ice cream by now?”

 

  1. She had to desecrate each book and underlined everything which could lead to explosive and amazing writing. Her circle of friends to whom she showed the books was very small and only included a few writers: Alex Raymond, Phil Klay, Jonathan Lethem, Michael Chabon. Recently, I decided to give her a book recommendation and she called me up. “How do you like The Sexually Violent Thing About Brian Wilson” “I love it, I love it, I love it, I love it!”

    In the current era of everything being available

 

Some information about this programme, Talk to Transformer

This text-generator programme was developed by Adam King as a WYSIWYG platform, an easier way to play with OpenAI’s new machine-learning model. In February, OpenAI unveiled a language model called GPT-2 that generates coherent paragraphs of text one word at a time. King says that the site runs the full-sized GPT-2 model, called 1558M.

The programme is still in development, and King notes that while GPT-2 was only trained to predict the next word in a text, it surprisingly learned basic competence in tasks like translating between languages and answering questions, without ever being told that it would be evaluated on those tasks.
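That idea of building text one word at a time can be shown with a toy sketch. The code below is plain Python and emphatically not GPT-2 (which is a large neural network); it is a simple bigram model I made up for illustration, and the snippet text, function names, and seed are my own assumptions. Each next word is sampled from the words that followed the current word in the seed text, which is the same word-at-a-time loop in miniature.

```python
import random

def build_bigrams(text):
    """Map each word to the list of words that directly follow it."""
    words = text.split()
    follows = {}
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)
    return follows

def generate(follows, start, length=12, seed=0):
    """Generate text one word at a time: sample each next word from
    the words observed after the current one (a toy stand-in for
    GPT-2's next-word prediction)."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break  # no known continuation; stop generating
        out.append(rng.choice(options))
    return " ".join(out)

snippet = ("she had to desecrate each book and underlined everything "
           "which could lead to explosive and amazing writing")
model = build_bigrams(snippet)
print(generate(model, "she"))
```

With a single short snippet the toy model mostly parrots the source back; GPT-2, trained on a vast corpus, is what lets the real generator wander off into new narrative.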

To learn more, King suggests reading OpenAI’s blog post or following him on Twitter.

On the generator page, King thanks Hugging Face for their PyTorch implementation of GPT-2, which he modified to handle batching queries of mixed lengths.