RISD (Workshop + Class Materials)

I was generously sponsored to teach at Rhode Island School of Design [RISD] for 2 days (April 18-19th) by the exceptional poet and RISD faculty member Mairéad Byrne. The following materials were created for her Digital Poetics course. 

Here’s the poster:

For the workshop I used the Rerites March corpus, with each verse numbered.

Text:

generated-2018-04-16T22-00-37-Screencast-2018-04-16-222348_WORKSHOP

+ RISD corpus [contributed by RISD students]  (+Jacket2)

 

Text:

generated-2018-04-16T22-25-20_Screencast-2018-04-16-223448_RISD_class

“and farther heart, stuff’d the Arts supernova” | 173,901 lines made in 1h35m | Screencast 2018-03-08 21:10:55


Txt excerpt:

 I can even hear your violent debates. Of prayer,
 weep, Whatever you say, я recognise me. paulist. That's
 right no.
 laughing?

Complete txt output:

generated-2018-03-08T19-35-32_Screencast 2018-03-08 21:10:55


Corpus re-tuned to reflect tech and other torque:

100selected-archiveorg.txt
99 Terms You Need To Know When You’re New To Tech.txt
ABZ.txt
Aeon_autonomousrobots_science-and-metaphysics_enemies_biodiversity.txt
capa.txt
Cathay by LiBai pg50155.txt
celan.txt
citizen-an-american-lyric-claudia-rankine.txt
citylights.txt
deepmind.txt
evergreenreview_all_poem.html.txt
glossary-networking-terms.txt
glossary-neurological-terms.txt
glossary-posthuman-terms.txt
Grief.txt
harari_sapiens-summary_guardian-review.txt
I Feel It Is My Duty to Speak Out_SallyOReilly.txt
Jacket2_ALL.txt
jhavelikes_CLEANED-again.txt
jorie_graham.txt
literotica.txt
ndsu.edu-creativeWriting-323.txt
neurotransmitter.txt
nyt_china-technology-censorship-borders-expansion.txt
patchen.txt
Patti Smith - all songs lyrics.txt
poetryFoundation_CLEANED_lower.txt
rodneyJones.txt
swear.txt
Teen Love Meets the Internet.txt
tumblr_jhavelikes_de2107-feb2018.txt
tumblr_jhavelikes_EDITED.txt
twoRiver.txt

The process was left to run for 36 hours on an Nvidia Titan X:

(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python -u main.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python -u finetune.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python pointer.py --lambdasm 0.1279 --theta 0.662 --window 3785 --bptt 2000 --data data/March-2018_16mb --model QRNN --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python generate_March-2018_nocntrl.py --cuda --words=444 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=0.75 --maxt=1.25
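The --mint/--maxt flags belong to the custom generate script rather than stock awd-lstm-lm; they set the range from which a sampling temperature is drawn. A minimal sketch of how such a per-burst temperature draw might sit inside a standard word-sampling loop (the function name sample_burst and the loop details are assumptions on my part, not the actual generate_March-2018_nocntrl.py):

import random
import torch

def sample_burst(model, corpus, words=444, mint=0.75, maxt=1.25, device='cuda'):
    # Draw one temperature per burst from the --mint/--maxt range (assumption about the custom script).
    temperature = random.uniform(mint, maxt)
    ntokens = len(corpus.dictionary)
    hidden = model.init_hidden(1)
    inp = torch.randint(ntokens, (1, 1), dtype=torch.long, device=device)
    out = []
    with torch.no_grad():
        for _ in range(words):
            output, hidden = model(inp, hidden)
            # Higher temperature flattens the distribution and lets stranger words through;
            # lower temperature keeps the sampling conservative.
            word_weights = model.decoder(output).squeeze().div(temperature).exp().cpu()
            word_idx = torch.multinomial(word_weights, 1)[0].item()
            inp.fill_(word_idx)
            out.append(corpus.dictionary.idx2word[word_idx])
    return ' '.join(out)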

Training the QRNN with awd-lstm-lm was exited manually after 252 epochs:

-----------------------------------------------------------------------------------------
| end of epoch 252 | time: 515.44s | valid loss 4.79 | valid ppl 120.25
-----------------------------------------------------------------------------------------
Saving Averaged!
| epoch 253 | 200/ 1568 batches | lr 30.00 | ms/batch 307.76 | loss 4.29 | ppl 73.12
^C-----------------------------------------------------------------------------------------
Exiting from training early
=========================================================================================
| End of training | test loss 3.99 | test ppl 54.11
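The two columns in that log track each other because the perplexity awd-lstm-lm reports is simply the exponential of the cross-entropy loss:

import math
math.exp(3.99)   # ≈ 54.1, the reported test perplexity
math.exp(4.79)   # ≈ 120.3, the reported validation perplexity at epoch 252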

Complete training terminal output: TRAINING__Mid March Corpus | Screencast 2018-03-08 21:10:55

 

17,969 pseudo-poems of 12 lines of approx. 24 characters each generated at a rate of 6.1s per poem in 2h54m

 

So, as if there were any need: here is more.

 

Here’s the code:

jhave@jhave-Ubuntu:~$ cd '/home/jhave/Documents/Github/awd-lstm-lm-master' 
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ source activate awd-py36
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_March-2018_PARAGRAPHS.py --cuda --words=333 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.95 --maxt=1.25
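The _PARAGRAPHS variant differs from the earlier generator mainly in folding the sampled word stream into short fixed-width lines. A rough sketch of that kind of greedy wrapping, using the 12-lines-of-roughly-24-characters shape mentioned above (wrap_poem and its cut-off rule are placeholders, not the actual script):

def wrap_poem(words, lines=12, width=24):
    # Greedily pack sampled words into `lines` lines of roughly `width` characters each.
    poem, line = [], ''
    for w in words:
        if line and len(line) + 1 + len(w) > width:
            poem.append(line)
            line = w
            if len(poem) == lines:   # stop once the poem is full
                break
        else:
            line = (line + ' ' + w).strip()
    if line and len(poem) < lines:
        poem.append(line)
    return '\n'.join(poem)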

 

Here’s a video:

 

Here’s an excerpt:

Things, media, le openness, and tenacity.
Consigned to vivian for winked. Our mother
On a boat, a heaven. Mother no envied by also.
Papa you never suffered from the played. Cruz:
And, @ to a man that is rapaciousness, just
Another kind of fact, to say that again, i
Was medicine.’ some apartment. Now i brothers’
Often, the defeated of redemption her hair.
She was born, and months an wheelchair — old
Story, and her one—that makes it difficult
To get their basic writing in language and
Sons. Blood onto initial meadows she spoke
To the touch, and all found her a gray rearrangement
Collector referenda gone vita. I didn’t even
Want to save her with some recordings. For
They have dug for his life in crotch. Power

 

Here are ALL the poems in txt format:

TXT FILE: generated-2018-02-25T13-40-12_Screencast 2018-02-25 16:35:12

 

Here’s what they will be processed into:

This will be used as the source for March RERITES

 

Paragraph style: 16 lines of approx 42 chars each : Averaged Stochastic Gradient Descent : Screencast 2018-02-04 18:52:12

5038 poems of 16 lines of approx 42 chars each generated in 2h42m

I am now utterly convinced of the impossibility of neural nets ever producing coherent, contextually-sensitive verse; yet as language play, and as a window into the inherent biases within humans, it is impeccably intriguing and occasionally nourishing.

Text: Screencast 2018-02-04 18:52:12_PARAGRAPHS.txt.tar 

(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_Feb4-2018_PARAGRAPHS.py --cuda --words=1111 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.75 --maxt=1

Averaged Stochastic Gradient Descent
with Weight Dropped QRNN
Poetry Generation

Trained on 197,923 lines of poetry & pop lyrics.

Library: PyTorch
Model: QRNN

Embedding size: 400
Hidden units: 1550
Batch size: 20
Epoch: 478
Loss: 3.62
Perplexity: 37.16

Temperature range: 0.75 to 1.0

 You disagree and have you to dream. We
 Are the bravest asleep open in the new undead

Text: Screencast 2018-02-03 19:25:34

Easy reading? A few QRNN generations with constrained verse lengths and randomized line lengths.

Ode to simplicity:

May the poem sit in the centre of the page like a stone.
May its margins clear space in the mind.
May consistent structure
deliver semblance of an archetype.
This must be poetry;
it is shaped like a poem.


Text: generated-2017-12-24T16-29-55_Screencast 2017-12-24 18:17:36


Text : generated-2017-12-24T13-59-27_Screencast 2017-12-24 14:25:22


Text : generated-2017-12-23T15-24-44_Screencast 2017-12-23 16:36:56


Text :  generated-2017-12-22T16-32-23_Screencast 2017-12-22 18:01:22

Screencast 2017-12-15 14:21:42 [ Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation with Random Seed ]

Got to messing around a bit more with awd-lstm-lm … by modifying the poetry generation loop so that it reloads the model with a new seed on each iteration, the reload time becomes an organic tiny delay. The result is like a poem flick: every 300 ms (on average, on this machine) the system generates 222 words.
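A minimal sketch of that reload-per-burst loop, assuming the stock awd-lstm-lm checkpoint loading and a sampling routine like the sample_burst sketched earlier (the checkpoint path, the corpus object, and the loop shape are placeholders, not the actual modified script):

import random
import torch

while True:
    # New random seed each iteration, so every burst starts from a different state.
    seed = random.randint(0, 2**31 - 1)
    torch.manual_seed(seed)
    torch.cuda.manual_seed(seed)
    # Reloading the checkpoint each time is what produces the ~300 ms pause between bursts.
    with open('models/rerites_QRNN.pt', 'rb') as f:   # placeholder checkpoint path
        model = torch.load(f)
    model.eval()
    print(sample_burst(model, corpus, words=222))     # corpus loaded as in generate.py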

Is it presumptuous to call these 222 word bursts, poetry?

A talented intellectual friend refers to them as nonsense. It’s an assessment I can accept. But within these nonsensical rantings are such lucid hallucinatory fragments that the act of writing poetry under such circumstances (rather than waiting for wings of inspiration, or the tickle of some conceptual tongue) becomes more geological: an act of patient sifting, a weaving dexterity applied to the excess, editing/panning for nuggets among an avalanche of disconnected debris.

If some nuance of intimacy is buried in the process so be it; the muses are (often) indifferent to the sufferings of those who sing the songs; these epic sessions in all their incoherence signal an emergent rupture in the continuum of creativity.

Yet the lack of coherence does also signal limit-case challenges to so-called deep learning: context and embodiment. Poems are creatures that encompass extreme defiant agility in terms of symbolic affinities, yet they also demonstrate embodied coherence, swirl into finales, comprehend the reader. Without the construction of a functional digital emulation of emotional reasoning (as posited by Rosalind Picard and Aaron Sloman among others) that is trained on a corpus derived from embodied experience, such poetry will remain gibberish, inert until massaged by the human heart. So it is.

These poems will be used as the source text for January’s RERITES.

Video:

Text:

generated-2017-12-15T12-29-56_Screencast 2017-12-15 14:21:42_CL

Description:


Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation [Screencast 2017-12-14 15:23:19]

Trying out a new algorithm, https://github.com/salesforce/awd-lstm-lm, using another mildly revised training corpus.

Sources: a subset of Poetry Magazine, Jacket2, Bob Marley, Bob Dylan, David Bowie, Tom Waits, Patti Smith, Radiohead, 2 River, Capa, Evergreen Review, jhavelikes.tumblr.com, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites.

Same crappy results. But boy does it churn out verses quickly: 2651 poems in 16m09s (approx 2.75 poems per second, each poem is 88 words long).
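(That rate follows from the totals: 2651 poems / 969 seconds ≈ 2.74 poems per second, i.e. around 240 words per second at 88 words per poem.)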

Pause video below to read.

TEXT file: generated-2017-12-14T15-07-07_Screencast 2017-12-14 15:23:19