Category Archives: RERITES

Paragraph style: 16 lines of approx 42 chars each : Averaged Stochastic Gradient Descent : Screencast 2018-02-04 18:52:12

5038 poems of 16 lines of approx 42 chars each generated in 2h42m

I am now utterly convinced of the impossibility of neural nets ever producing coherent contextually-sensitive verse, yet as language play, and as a window into the inherent biases within humans, it is impeccably intriguing and occasionally nourishing.

Text: Screencast 2018-02-04 18:52:12_PARAGRAPHS.txt.tar 

(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_Feb4-2018_PARAGRAPHS.py --cuda --words=1111 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.75 --maxt=1
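
As a sketch of what the modified sampling loop inside generate_Feb4-2018_PARAGRAPHS.py might look like: the divide-by-temperature sampling follows the stock awd-lstm-lm generate.py pattern, but the function below is an assumption rather than the actual script, and it takes the loaded model and corpus objects as given.

import random
import torch

MINT, MAXT = 0.75, 1.0   # the --mint / --maxt flags above
LINES, CHARS = 16, 42    # target paragraph shape

def sample_poem(model, corpus, device="cuda"):
    # draw one temperature per poem from the configured range
    temperature = random.uniform(MINT, MAXT)
    hidden = model.init_hidden(1)
    inp = torch.randint(len(corpus.dictionary), (1, 1), device=device)
    lines, line = [], ""
    with torch.no_grad():
        while len(lines) < LINES:
            # assumes the forward pass returns vocabulary logits
            output, hidden = model(inp, hidden)
            weights = output.squeeze().div(temperature).exp()
            word_idx = torch.multinomial(weights, 1).item()
            inp = torch.tensor([[word_idx]], device=device)
            line += corpus.dictionary.idx2word[word_idx] + " "
            if len(line) >= CHARS:   # wrap at ~42 characters
                lines.append(line.rstrip())
                line = ""
    return "\n".join(lines)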

Averaged Stochastic Gradient Descent
with Weight Dropped QRNN
Poetry Generation

Trained on 197,923 lines of poetry & pop lyrics.

Library: PyTorch
Model: QRNN

Embedding size: 400
Hidden units: 1550
Batch size: 20
Epoch: 478
Loss: 3.62
Perplexity: 37.16

Temperature range: 0.75 to 1.0
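
(The loss and perplexity above are consistent: perplexity is just the exponential of the average cross-entropy loss, which one line of Python confirms.)

import math
print(math.exp(3.62))   # ~37.34; the reported 37.16 comes from the unrounded loss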

 You disagree and have you to dream. We
 Are the bravest asleep open in the new undead

Text: Screencast 2018-02-03 19:25:34

Screencast 2017-12-15 14:21:42 [ Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation with Random Seed ]

Got to messing around a bit more with awd-lstm-lm … and by modifying the poetry generation loop so that it reloads the model with a new seed each iteration, I used the reload time as an organic tiny delay. The result is like a poem flick: every 300 ms (on average, on this machine) the system generates 222 words.
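
In outline, the modified loop is something like the sketch below; the checkpoint path and the generate_words helper are hypothetical stand-ins, and the clock-based re-seeding is just one plausible way to vary the seed per reload.

import time
import torch

def poem_flick(checkpoint="models/checkpoint.pt"):  # illustrative path
    while True:
        t0 = time.time()
        # a fresh seed each pass; the reload itself supplies the ~300 ms pause
        torch.manual_seed(int(t0 * 1000) % 2**31)
        with open(checkpoint, "rb") as f:
            model = torch.load(f)
        model.eval()
        print(generate_words(model, n_words=222))  # hypothetical sampling helper
        print("reload: %.0f ms" % ((time.time() - t0) * 1000))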

Is it presumptuous to call these 222-word bursts poetry?

A talented intellectual friend refers to them as nonsense. It’s an assessment I can accept. But within these nonsensical rantings are such lucid hallucinatory fragments that the act of writing poetry under such circumstances (rather than waiting for wings of inspiration, or the tickle of some conceptual tongue) becomes more geological: an act of patient sifting, weaving dexterity applied to the excess, editing/panning for nuggets among an avalanche of disconnected debris.

If some nuance of intimacy is buried in the process so be it; the muses are (often) indifferent to the sufferings of those who sing the songs; these epic sessions in all their incoherence signal an emergent rupture in the continuum of creativity.

Yet the lack of coherence does also signal limit-case challenges to so-called deep learning: context and embodiment. Poems are creatures that encompass extreme defiant agility in terms of symbolic affinities, yet they also demonstrate embodied coherence, swirl into finales, comprehend the reader. Without the construction of a functional digital emulation of emotional reasoning (as posited by Rosalind Picard and Aaron Sloman among others) that is trained on a corpus derived from embodied experience, such poetry will remain gibberish, inert until massaged by the human heart. So it is.

These poems will be used as the source text for January’s RERITES.


Text: generated-2017-12-15T12-29-56_Screencast 2017-12-15 14:21:42_CL


SUBSETS: Small Data (Repeat Mimesis provoked by lack of Dropout)

So after last week’s disastrous expansion, I decided to get austere: implement strategic pruning, and reduce the corpus.

Selections were integrated from Bob Dylan, Tom Waits, Patti Smith, 2 River, Capa, Evergreen Review, Tumblr, Cathay (Ezra Pound’s versions of Li Bai), Kenneth Patchen, Maurice Blanchot, and previous Rerites.

Because the training occurred without dropout, the model is overfit and a lot of direct quotations seep into the output.
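
For reference, this is roughly what training with no regularization looks like as an awd-lstm-lm invocation, assuming the stock main.py flags (the data path is illustrative): with every dropout term zeroed, nothing perturbs the weights between epochs, so the network is free to memorize whole passages.

python main.py --model QRNN --data data/subset_corpus \
    --dropout 0 --dropouth 0 --dropouti 0 --dropoute 0 --wdrop 0 \
    --epochs 500 --save SUBSET.pt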

Text: Screencast 2017-12-10 14:22:56_SUBSET2

Text: Screencast 2017-12-09 19:31:37

Rerites (6 months! 6 books!)

Poetry books.  One a month.

Generated by a computer. Edited by a human.

Project duration: 05.2017 – 05.2018

6 books completed in 6 months.

Rerites October 2017 was just published today: preview it online (entire text) or order a copy on Blurb.


162 pages, 8000 words.

Excerpt:

All That I Know For All That Is Left

I know your blue throat, 
 writing as you sleep. 
 
Smoke-wreaths howling 
 over the hot sea 
 to see my voice 
 fall so near your feet.

To see the wild light
 as rain breaks
 into thin snow 
 as i pause
 on the stairs 
 above the kitchen.

These are minutes to worry 
 about. The rain caught 
 rushing at windows, begging 
 to hear the garden in the sun.

Upcoming

Added 5 months of Rerites (http://glia.ca/2017/rerites/) into the training corpus.

Will use as source text for November Rerites.

pytorch-poet_Screencast 2017-10-30 23:38:22_OCTOBER-Rerites_source

To read other Rerite months, visit http://glia.ca/2017/rerites/

Incremental TEMPERATURE Increases (Seed-text for Sept RERITES)

In the egg, the child is simple. A wet light. Lurching.

Body is wind. The sun in the sea.

Then as if more, motions, the shadows of trees.

The ineluctable diffusion of randomized complexity.

Drainfgdl gsod. Trainins spekcled!

 

Poetry evolves as language in organisms, from simple to complex, from simile and homily to histrionics. Increments in the temperature of a neural net model simulate time.

For high temperatures (τ → ∞), all actions have nearly the same probability, and the lower the temperature, the more expected rewards affect the probability. For a low temperature (τ → 0⁺), the probability of the action with the highest expected reward tends to 1.

https://en.wikipedia.org/wiki/Softmax_function
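
A few lines of Python make those two limits concrete (the logits are arbitrary illustrative values):

import math

def softmax(logits, tau):
    # p_i = exp(z_i / tau) / sum_j exp(z_j / tau)
    z = [x / tau for x in logits]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(x - m) for x in z]
    s = sum(e)
    return [x / s for x in e]

logits = [2.0, 1.0, 0.1]
print(softmax(logits, 100.0))  # tau -> infinity: nearly uniform
print(softmax(logits, 1.0))    # tau = 1: ordinary softmax
print(softmax(logits, 0.05))   # tau -> 0+: the argmax gets probability ~1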

~ + ~

~ + ~

Text: Screencast 2017-08-30 00:00:40_incrementalTEMPERATURE_PoetryPytorch

~ + ~

This source will become the first seed-text for September’s RERITES.