What can I say? 12 books in 12 months.
I’m tired. Even though the computer did a significant amount of the labour, the process still entailed about 60 hours of writing-editing (or what I refer to as “carving the text”) every month. Two hours a day, 6-7 days a week, between 6:30 and 8:30am, in the quiet meditative emptiness before the internet awakes, with email closed and the web used only for searching, I carved.
Carving the text occurred inside the Sublime Text editor. Usually a first rough pass was made, then the entire manuscript was re-carved (re-edited) during the final days of each month, as can be seen here in the final footage of writing the RERITES:
All 12 RERITES books will be published by Anteism in Fall/Winter 2018.
I was generously sponsored to teach at Rhode Island School of Design [RISD] for 2 days (April 18-19th) by the exceptional poet and RISD faculty member Mairéad Byrne. The following materials were created for her Digital Poetics course.
Here’s the poster:
For the workshop I used the Rerites March corpus
Numbered each verse.
+ RISD corpus [contributed by RISD students] (+Jacket2)
Too tired to mix a fresh corpus.
Revived the old ferment.
Changed the process.
TXT: generated-2018-03-31T17-16-02_Screencast 2018-03-31 19:12:08
$ cd '/home/jhave/Documents/Github/awd-lstm-lm-master'
$ source activate awd-py36
$ python generate_April-2018_nocntrl_RELOAD.py --cuda --words=333 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=.95 --maxt=1.05
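The --mint and --maxt flags set the lower and upper bounds of a sampling-temperature range. As a minimal sketch of what temperature does to sampling (pure Python with a made-up three-logit vocabulary, not the generation script's actual code):

```python
import math, random

def sample_with_temperature(logits, temperature):
    # Divide logits by the temperature, then softmax and sample:
    # T < 1 sharpens the distribution, T > 1 flattens it toward uniform.
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    return random.choices(range(len(logits)), weights=weights, k=1)[0]

toy_logits = [2.0, 1.0, 0.1]             # hypothetical scores for a 3-word vocab
t = random.uniform(0.95, 1.05)           # a draw from the --mint/--maxt range
print(sample_with_temperature(toy_logits, t))
```

With a range as narrow as 0.95-1.05 the output stays close to the model's raw distribution; wider ranges (like the 0.75-1.25 used elsewhere on this page) swing between cautious and hallucinatory.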
Exoadderal AIgorythms redefine poetic potential.
It’s as simple as that. The muse fueled by a million lines will lose to the muse capable of ingesting and contemplating and comprehending a billion lines: diverse, diffusive, ecstatic anamnesis.
I’ve been making videos of the writing process. Here’s the first one:
3x speed didn’t seem fast enough for our temporal-performance-obsessed generation, so I moved to 4x speed.
I can even hear your violent debates. Of prayer,
weep, Whatever you say, я recognise me. paulist. That's
Complete txt output:
generated-2018-03-08T19-35-32_Screencast 2018-03-08 21:10:55
Corpus retuned to reflect tech and other torque
99 Terms You Need To Know When You’re New To Tech.txt
jhavelikes_CLEANED-again.txt
Cathay by LiBai pg50155.txt
neurotransmitter.txt
Patti Smith - all songs lyrics.txt
Teen Love Meets the Internet.txt
I Feel It Is My Duty to Speak Out_SallyOReilly.txt
Process left to run for 36 hours on Nvidia Titan X:
(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python -u main.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt
$ python -u finetune.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt
$ python pointer.py --lambdasm 0.1279 --theta 0.662 --window 3785 --bptt 2000 --data data/March-2018_16mb --model QRNN --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt
$ python generate_March-2018_nocntrl.py --cuda --words=444 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=0.75 --maxt=1.25
Training the QRNN with awd-lstm-lm was exited manually after 252 epochs:
| end of epoch 252 | time: 515.44s | valid loss 4.79 | valid ppl 120.25
| epoch 253 | 200/ 1568 batches | lr 30.00 | ms/batch 307.76 | loss 4.29 | ppl 73.12
Exiting from training early
| End of training | test loss 3.99 | test ppl 54.11
Complete training terminal output: TRAINING__Mid March Corpus | Screencast 2018-03-08 21:10:55
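For reference, the ppl column in those log lines is just the exponential of the loss; the printed losses are rounded, hence the tiny mismatches:

```python
import math

# Perplexity = exp(cross-entropy loss), so the log's pairs line up:
print(round(math.exp(4.79), 2))   # ≈ 120.3 (log: valid ppl 120.25)
print(round(math.exp(3.99), 2))   # ≈ 54.05 (log: test ppl 54.11)
```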
So, as if there were any need: here is more.
Here’s the code:
jhave@jhave-Ubuntu:~$ cd '/home/jhave/Documents/Github/awd-lstm-lm-master'
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ source activate awd-py36
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_March-2018_PARAGRAPHS.py --cuda --words=333 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.95 --maxt=1.25
Here’s a video:
Here’s an excerpt:
Things, media, le openness, and tenacity.
Consigned to vivian for winked. Our mother
On a boat, a heaven. Mother no envied by also.
Papa you never suffered from the played. Cruz:
And, @ to a man that is rapaciousness, just
Another kind of fact, to say that again, i
Was medicine.’ some apartment. Now i brothers’
Often, the defeated of redemption her hair.
She was born, and months an wheelchair — old
Story, and her one—that makes it difficult
To get their basic writing in language and
Sons. Blood onto initial meadows she spoke
To the touch, and all found her a gray rearrangement
Collector referenda gone vita. I didn’t even
Want to save her with some recordings. For
They have dug for his life in crotch. Power
Here are ALL the poems in txt format:
TXT FILE: generated-2018-02-25T13-40-12_Screencast 2018-02-25 16:35:12
Here’s what they will be processed into:
This will be used as the source for March RERITES
5038 poems of 16 lines of approx 42 chars each generated in 2h42m
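Taking those figures at face value, the scale of the run works out as follows (a quick arithmetic check, nothing more):

```python
# 5038 poems × 16 lines × ~42 chars, generated in 2h42m:
seconds = 2 * 3600 + 42 * 60          # 2h42m = 9720 s
total_lines = 5038 * 16               # lines generated
total_chars = total_lines * 42        # rough character count
print(total_lines, total_chars, round(5038 / seconds, 2))
```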
I am now utterly convinced of the impossibility of neural nets ever producing coherent, contextually-sensitive verse; yet as language play, and as a window into the inherent biases within humans, it is impeccably intriguing and occasionally nourishing.
Text: Screencast 2018-02-04 18:52:12_PARAGRAPHS.txt.tar
(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_Feb4-2018_PARAGRAPHS.py --cuda --words=1111 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.75 --maxt=1
Averaged Stochastic Gradient Descent
with Weight Dropped QRNN
Trained on 197,923 lines of poetry & pop lyrics.
Embedding size: 400
Hidden Layers: 1550
Batch size: 20
Temperature range: 0.75 to 1.0
You disagree and have you to dream. We
Are the bravest asleep open in the new undead
Text: Screencast 2018-02-03 19:25:34
Ode to simplicity:
May the poem sit in the centre of the page like a stone.
May its margins clear space in the mind.
May consistent structure
deliver semblance of an archetype.
This must be poetry;
it is shaped like a poem.
Text: generated-2017-12-24T16-29-55_Screencast 2017-12-24 18:17:36
Text: generated-2017-12-24T13-59-27_Screencast 2017-12-24 14:25:22
Text: generated-2017-12-23T15-24-44_Screencast 2017-12-23 16:36:56
Text: generated-2017-12-22T16-32-23_Screencast 2017-12-22 18:01:22
Got to messing around a bit more with awd-lstm-lm … and, by modifying the poetry generation loop so that it reloads the model with a new seed each iteration, used the reload time as a tiny organic delay. The result is like a poem flick: every 300 ms (on average, on this machine) the system generates 222 words.
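The shape of that loop can be caricatured in a few lines of plain Python; generate_burst and its toy vocabulary are hypothetical stand-ins for the real checkpoint reload and QRNN sampler:

```python
import random, time

def generate_burst(seed, n_words=222):
    # Stand-in for the real loop: reseeding here replaces the model
    # reload, and the toy vocabulary replaces the trained sampler.
    random.seed(seed)
    vocab = ["stone", "mother", "rain", "glass", "hour"]
    return " ".join(random.choice(vocab) for _ in range(n_words))

for i in range(3):
    t0 = time.time()
    burst = generate_burst(seed=i)   # new seed each iteration
    elapsed = time.time() - t0       # in the real loop, the ~300 ms reload
    print(len(burst.split()), "words in", f"{elapsed:.3f}s")
```

Each reseed yields a different deterministic burst, which is what makes the stream flicker rather than repeat.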
Is it presumptuous to call these 222-word bursts poetry?
A talented intellectual friend refers to them as nonsense. It’s an assessment I can accept. But within these nonsensical rantings are such lucid hallucinatory fragments that the act of writing poetry under such circumstances (rather than waiting for wings of inspiration, or the tickle of some conceptual tongue) becomes more geological: an act of patient sifting, a weaving dexterity applied to the excess, an editing/panning for nuggets among an avalanche of disconnected debris.
If some nuance of intimacy is buried in the process so be it; the muses are (often) indifferent to the sufferings of those who sing the songs; these epic sessions in all their incoherence signal an emergent rupture in the continuum of creativity.
Yet the lack of coherence does also signal limit-case challenges to so-called deep learning: context and embodiment. Poems are creatures that encompass extreme defiant agility in terms of symbolic affinities, yet they also demonstrate embodied coherence, swirl into finales, comprehend the reader. Without the construction of a functional digital emulation of emotional reasoning (as posited by Rosalind Picard and Aaron Sloman among others) that is trained on a corpus derived from embodied experience, such poetry will remain gibberish, inert until massaged by the human heart. So it is.
These poems will be used as the source text for January’s RERITES.
generated-2017-12-15T12-29-56_Screencast 2017-12-15 14:21:42_CL
Trying out a new algorithm, https://github.com/salesforce/awd-lstm-lm, using another mildly revised training corpus.
Sources: a subset of Poetry Magazine, Jacket2, Bob Marley, Bob Dylan, David Bowie, Tom Waits, Patti Smith, Radiohead, 2 River, Capa, Evergreen Review, jhavelikes.tumblr.com, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites.
Same crappy results. But boy does it churn out verses quickly: 2651 poems in 16m09s (approx 2.75 poems per second; each poem is 88 words long).
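The quoted rate checks out, give or take a rounding hair:

```python
# 2651 poems in 16m09s, 88 words per poem:
seconds = 16 * 60 + 9        # 969 s
rate = 2651 / seconds        # poems per second
print(round(rate, 2))        # → 2.74, a shade under the ~2.75 quoted
print(2651 * 88)             # → 233288 words in the session
```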
Pause video below to read.
TEXT file: generated-2017-12-14T15-07-07_Screencast 2017-12-14 15:23:19