Speel Chekin [RERITES Process]
RERITES is heading toward publication.
12 essays have been commissioned. (5 received so far).
Publication slated for December 2019.
Edition of 12 boxsets + one artist proof.
2 sold already. Tiered pricing. (Ah, the wistful poet tries to break even.)
Here’s a funny screenshot from the spell-checking process:
Ask an AI algorithm to spell-check an AI-created poem that contains the text-string: O0000OOOO … and it suggests: O0000 OOOO
Note: there will be lots of neologisms left in the RERITES — even after the spell-check — because the algorithms ask for them, create them, suggest, bind, blend, merge, and contort the arbitrary rigid temporary skin of syntax.
RERITES is finished!
What can I say? 12 books in 12 months.
I’m tired. Even though the computer did a significant amount of the labour, the process still entailed about 60 hours of writing-editing (or what I refer to as “carving the text”) every month. Two hours a day, 6-7 days a week, between 6:30 and 8:30am, in the quiet meditative emptiness before the internet awakes, with email closed and the web used only for searching, I carved.
Carving the text occurred inside the Sublime Text editor. Usually a rough first pass was made, then the entire manuscript was re-carved (re-edited) during the final days of each month, as can be seen here in the final footage of writing the RERITES:
All 12 RERITES books will be published by Anteism in Fall/Winter 2018.
RISD (Workshop + Class Materials)
I was generously sponsored to teach at Rhode Island School of Design [RISD] for 2 days (April 18-19th) by the exceptional poet and RISD faculty member Mairéad Byrne. The following materials were created for her Digital Poetics course.
Here’s the poster:
For the workshop I used the RERITES March corpus and numbered each verse.
Text:
generated-2018-04-16T22-00-37-Screencast-2018-04-16-222348_WORKSHOP
+ RISD corpus [contributed by RISD students] (+Jacket2)
Text:
generated-2018-04-16T22-25-20_Screencast-2018-04-16-223448_RISD_class
April RERITES Source
Too tired to mix a fresh corpus.
Revived the old ferment.
Changed the process.
Returned:
TXT: generated-2018-03-31T17-16-02_Screencast 2018-03-31 19:12:08
$ cd '/home/jhave/Documents/Github/awd-lstm-lm-master'
$ source activate awd-py36
$ python generate_April-2018_nocntrl_RELOAD.py --cuda --words=333 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=.95 --maxt=1.05
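The --mint and --maxt flags suggest that each generation pass samples with a temperature drawn from a range rather than a single fixed value. A minimal sketch of temperature-scaled sampling under that assumption (the flag semantics and function name are my inference from the command line, not the actual script):

```python
import math
import random

def sample_with_temperature(logits, mint=0.95, maxt=1.05):
    """Sample a token index after scaling logits by a temperature
    drawn uniformly from [mint, maxt] (assumed flag semantics)."""
    t = random.uniform(mint, maxt)
    scaled = [l / t for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

idx = sample_with_temperature([2.0, 1.0, 0.1])  # index weighted toward higher logits
```

Lower temperatures sharpen the distribution toward the model's favourite words; temperatures above 1 flatten it and let stranger words through, which is where many of the neologisms come from.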
Artificial Intelligence is Artistic Cognitive Enhancement
Exoadderal AIgorythms redefine poetic potential.
It’s as simple as that. The muse fueled by a million lines will lose to the muse capable of ingesting and contemplating and comprehending a billion lines: diverse, diffusive ecstatic anamnesis.
I’ve been making videos of the writing process. Here’s the first one:
3x speed didn’t seem fast enough for our temporal-performance-obsessed generation, so I moved to 4x speed.
More here:
“and farther heart, stuff’d the Arts supernova” | 173,901 lines made in 1h35m | Screencast 2018-03-08 21:10:55
Txt excerpt:
I can even hear your violent debates. Of prayer, weep, Whatever you say, я recognise me. paulist. That's right no. laughing?
Complete txt output:
generated-2018-03-08T19-35-32_Screencast 2018-03-08 21:10:55
Corpus retuned to reflect tech and other torque
100selected-archiveorg.txt
99 Terms You Need To Know When You’re New To Tech.txt
ABZ.txt
Aeon_autonomousrobots_science-and-metaphysics_enemies_biodiversity.txt
capa.txt
Cathay by LiBai pg50155.txt
celan.txt
citizen-an-american-lyric-claudia-rankine.txt
citylights.txt
deepmind.txt
evergreenreview_all_poem.html.txt
glossary-networking-terms.txt
glossary-neurological-terms.txt
glossary-posthuman-terms.txt
Grief.txt
harari_sapiens-summary_guardian-review.txt
I Feel It Is My Duty to Speak Out_SallyOReilly.txt
Jacket2_ALL.txt
jhavelikes_CLEANED-again.txt
jorie_graham.txt
literotica.txt
ndsu.edu-creativeWriting-323.txt
neurotransmitter.txt
nyt_china-technology-censorship-borders-expansion.txt
patchen.txt
Patti Smith - all songs lyrics.txt
poetryFoundation_CLEANED_lower.txt
rodneyJones.txt
swear.txt
Teen Love Meets the Internet.txt
tumblr_jhavelikes_de2107-feb2018.txt
tumblr_jhavelikes_EDITED.txt
twoRiver.txt
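Preparing a corpus for awd-lstm-lm amounts to concatenating source files into the train/valid/test split the repo's data loader expects. A minimal sketch of that prep step (the split fractions and function name are illustrative assumptions, not the script actually used here):

```python
from pathlib import Path

def build_corpus(src_dir, out_dir, valid_frac=0.05, test_frac=0.05):
    """Concatenate every .txt source into one line stream, then split it
    into the train.txt / valid.txt / test.txt files awd-lstm-lm reads."""
    lines = []
    for f in sorted(Path(src_dir).glob("*.txt")):
        lines.extend(f.read_text(encoding="utf-8", errors="ignore").splitlines())
    n = len(lines)
    n_valid = int(n * valid_frac)
    n_test = int(n * test_frac)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    # held-out slices come from the tail of the concatenated stream
    (out / "train.txt").write_text("\n".join(lines[: n - n_valid - n_test]))
    (out / "valid.txt").write_text("\n".join(lines[n - n_valid - n_test : n - n_test]))
    (out / "test.txt").write_text("\n".join(lines[n - n_test :]))
```

The resulting directory is what the --data flag in the commands below points at.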
Process left to run for 36 hours on Nvidia Titan X:
(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python -u main.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt
$ python -u finetune.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt
$ python pointer.py --lambdasm 0.1279 --theta 0.662 --window 3785 --bptt 2000 --data data/March-2018_16mb --model QRNN --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt
$ python generate_March-2018_nocntrl.py --cuda --words=444 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=0.75 --maxt=1.25
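The pointer.py step evaluates the model with a continuous-cache pointer (awd-lstm-lm follows Grave et al.): the final word distribution linearly mixes the softmax vocabulary distribution with a cache distribution built from recently seen hidden states, with --lambdasm as the mixture weight and --theta sharpening the cache similarities. A minimal sketch of just the mixing step, under that reading of the flags (variable names and toy inputs are illustrative):

```python
import math

def pointer_mix(p_vocab, cache_scores, cache_words, vocab_size,
                lambdasm=0.1279, theta=0.662):
    """Mix the model's vocabulary distribution with a cache distribution.
    cache_scores: similarity of the current hidden state to each cached
    hidden state; cache_words: the token id stored at each cache slot."""
    # softmax over theta-scaled similarities -> probability per cache slot
    exps = [math.exp(theta * s) for s in cache_scores]
    total = sum(exps)
    p_cache = [0.0] * vocab_size
    for e, w in zip(exps, cache_words):
        p_cache[w] += e / total  # slots holding the same word accumulate
    # linear interpolation of the two distributions
    return [(1 - lambdasm) * pv + lambdasm * pc
            for pv, pc in zip(p_vocab, p_cache)]
```

The cache term is what lets a recently generated word re-surface even when the base softmax has moved on, a useful bias for refrain-like repetition in verse.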
Training QRNN using awd-lstm exited manually after 252 epochs:
-----------------------------------------------------------------------------------------
| end of epoch 252 | time: 515.44s | valid loss 4.79 | valid ppl 120.25
-----------------------------------------------------------------------------------------
Saving Averaged!
| epoch 253 | 200/ 1568 batches | lr 30.00 | ms/batch 307.76 | loss 4.29 | ppl 73.12
^C
Exiting from training early
=========================================================================================
| End of training | test loss 3.99 | test ppl 54.11
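The perplexity columns in the log are simply the exponential of the reported cross-entropy loss (in nats), which is easy to verify against the figures above:

```python
import math

def perplexity(loss):
    """Perplexity is exp(cross-entropy loss in nats)."""
    return math.exp(loss)

print(perplexity(4.79))  # ~120.3, matching the logged valid ppl 120.25 up to rounding
print(perplexity(3.99))  # ~54.06, matching the logged test ppl 54.11 up to rounding
```

Roughly: at test perplexity 54, the model is as uncertain at each step as if choosing uniformly among 54 words.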
Complete training terminal output: TRAINING__Mid March Corpus | Screencast 2018-03-08 21:10:55
17,969 pseudo-poems of 12 lines of approx. 24 characters each generated at a rate of 6.1s per poem in 2h54m
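Constraining output to 12 lines of roughly 24 characters is essentially a wrapping pass over the generated token stream. The post doesn't show the actual constraint mechanism, so this is a hedged sketch of how such shaping could work:

```python
import textwrap

def shape_poem(tokens, n_lines=12, width=24):
    """Wrap a stream of generated tokens into a fixed-shape pseudo-poem:
    n_lines lines of at most `width` characters each."""
    lines = textwrap.wrap(" ".join(tokens), width=width)
    return "\n".join(lines[:n_lines])
```

Fixing the silhouette this way is what gives every pseudo-poem the same visual footprint on the page, whatever the words happen to be.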
So, as if there were any need: here is more.
Here’s the code:
jhave@jhave-Ubuntu:~$ cd '/home/jhave/Documents/Github/awd-lstm-lm-master'
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ source activate awd-py36
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_March-2018_PARAGRAPHS.py --cuda --words=333 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.95 --maxt=1.25
Here’s a video:
Here’s an excerpt:
Things, media, le openness, and tenacity.
Consigned to vivian for winked. Our mother
On a boat, a heaven. Mother no envied by also.
Papa you never suffered from the played. Cruz:
And, @ to a man that is rapaciousness, just
Another kind of fact, to say that again, i
Was medicine.’ some apartment. Now i brothers’
Often, the defeated of redemption her hair.
She was born, and months an wheelchair — old
Story, and her one—that makes it difficult
To get their basic writing in language and
Sons. Blood onto initial meadows she spoke
To the touch, and all found her a gray rearrangement
Collector referenda gone vita. I didn’t even
Want to save her with some recordings. For
They have dug for his life in crotch. Power
Here are ALL the poems in txt format:
TXT FILE: generated-2018-02-25T13-40-12_Screencast 2018-02-25 16:35:12
Here’s what they will be processed into:
This will be used as the source for March RERITES
Paragraph style: 16 lines of approx 42 chars each : Averaged Stochastic Gradient Descent : Screencast 2018-02-04 18:52:12
5038 poems of 16 lines of approx 42 chars each generated in 2h42m
I am now utterly convinced of the impossibility of neural nets ever producing coherent, contextually-sensitive verse; yet as language play, and as a window into the inherent biases within humans, it is impeccably intriguing and occasionally nourishing.
Text: Screencast 2018-02-04 18:52:12_PARAGRAPHS.txt.tar
(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_Feb4-2018_PARAGRAPHS.py --cuda --words=1111 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.75 --maxt=1
Averaged Stochastic Gradient Descent
with Weight Dropped QRNN
Poetry generation trained on 197,923 lines of poetry & pop lyrics.
Library: PyTorch
Model: QRNN
Embedding size: 400
Hidden layer size: 1550
Batch size: 20
Epoch: 478
Loss: 3.62
Perplexity: 37.16
Temperature range: 0.75 to 1.0
You disagree and have you to dream. We Are the bravest asleep open in the new undead
Easy reading? A few QRNN outputs with constrained verse lengths and randomized line lengths
Ode to simplicity:
May the poem sit in the centre of the page like a stone.
May its margins clear space in the mind.
May consistent structure
deliver semblance of an archetype.
This must be poetry;
it is shaped like a poem.
Text: generated-2017-12-24T16-29-55_Screencast 2017-12-24 18:17:36
Text : generated-2017-12-24T13-59-27_Screencast 2017-12-24 14:25:22
Text : generated-2017-12-23T15-24-44_Screencast 2017-12-23 16:36:56
Text : generated-2017-12-22T16-32-23_Screencast 2017-12-22 18:01:22