Category: awd-lstm-lm

Easy reading? A few QRNN poems with constrained verse lengths and randomized line lengths

Ode to simplicity:

May the poem sit in the centre of the page like a stone.
May its margins clear space in the mind.
May consistent structure
deliver semblance of an archetype.
This must be poetry;
it is shaped like a poem.


Text: generated-2017-12-24T16-29-55_Screencast 2017-12-24 18:17:36

Text: generated-2017-12-24T13-59-27_Screencast 2017-12-24 14:25:22

Text: generated-2017-12-23T15-24-44_Screencast 2017-12-23 16:36:56

Text: generated-2017-12-22T16-32-23_Screencast 2017-12-22 18:01:22
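For the curious, a minimal sketch (Python) of how constrained verse lengths with randomized line lengths might be imposed on a raw generated word stream; the verse size and line bounds are illustrative assumptions, not the settings behind the texts above.

import random

def lineate(words, verse_len=33, min_line=2, max_line=7):
    """Yield verses of about verse_len words, each broken into random-length lines."""
    for start in range(0, len(words), verse_len):
        verse = words[start:start + verse_len]
        lines, i = [], 0
        while i < len(verse):
            n = random.randint(min_line, max_line)  # randomized line length
            lines.append(' '.join(verse[i:i + n]))
            i += n
        yield '\n'.join(lines)

print('\n\n'.join(lineate('may the poem sit in the centre of the page like a stone'.split())))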


Screencast 2017-12-15 14:21:42 [ Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation with Random Seed ]

Got to messing around a bit more with awd-lstm-lm. By modifying the poetry generation loop so that it reloads the model with a new seed on each iteration, the reload time becomes an organic tiny delay. The result is like a poem flick: roughly every 300 ms (on this machine) the system generates 222 words.
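A minimal sketch of that loop, assuming the stock generate.py shipped with salesforce/awd-lstm-lm (its flags follow the PyTorch word-language-model example it adapts); the corpus directory, checkpoint path, and output naming are placeholders.

# Re-run generation with a fresh seed on each pass; reloading the checkpoint
# inside generate.py supplies the short organic delay between bursts.
import random
import subprocess
import time

while True:
    seed = random.randrange(1, 2**31)
    t0 = time.time()
    subprocess.run([
        'python', 'generate.py',
        '--data', 'data/rerites',      # placeholder corpus directory
        '--checkpoint', 'model.pt',    # placeholder checkpoint path
        '--words', '222',              # one 222-word burst
        '--seed', str(seed),
        '--outf', f'generated-{seed}.txt',
    ], check=True)
    print(f'seed {seed}: burst in {time.time() - t0:.2f}s')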

Is it presumptuous to call these 222-word bursts poetry?

A talented intellectual friend refers to them as nonsense. It’s an assessment I can accept. But within these nonsensical rantings are such lucid hallucinatory fragments that the act of writing poetry under such circumstances (rather than waiting for wings of inspiration, or the tickle of some conceptual tongue) becomes more geological: an act of patient sifting, weaving dexterity applied to the excess, editing/panning for nuggets among an avalanche of disconnected debris.

If some nuance of intimacy is buried in the process, so be it; the muses are (often) indifferent to the sufferings of those who sing the songs; these epic sessions, in all their incoherence, signal an emergent rupture in the continuum of creativity.

Yet the lack of coherence does also signal limit-case challenges to so-called deep learning: context and embodiment. Poems are creatures that encompass extreme defiant agility in terms of symbolic affinities, yet they also demonstrate embodied coherence, swirl into finales, comprehend the reader. Without the construction of a functional digital emulation of emotional reasoning (as posited by Rosalind Picard and Aaron Sloman among others) that is trained on a corpus derived from embodied experience, such poetry will remain gibberish, inert until massaged by the human heart. So it is.

These poems will be used as the source text for January’s RERITES.

Video: Screencast 2017-12-15 14:21:42

Text: generated-2017-12-15T12-29-56_Screencast 2017-12-15 14:21:42_CL



Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation [Screencast 2017-12-14 15:23:19]

Trying out a new algorithm, https://github.com/salesforce/awd-lstm-lm, using another mildly revised training corpus.

Sources: a subset of Poetry Magazine, Jacket2, Bob Marley, Bob Dylan, David Bowie, Tom Waits, Patti Smith, Radiohead, 2 River, Capa, Evergreen Review, jhavelikes.tumblr.com, Ezra Pound’s Cathay (translations of Li Bai), Kenneth Patchen, Maurice Blanchot, and previous Rerites.
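For reference, a hedged sketch of pointing the repo’s QRNN training entry point at a custom corpus; the directory layout and hyperparameters below are assumptions adapted from the project README, not the settings actually used here.

# Train the QRNN variant of awd-lstm-lm on a corpus laid out as
# train.txt / valid.txt / test.txt inside data/rerites (placeholder path).
import subprocess

subprocess.run([
    'python', 'main.py',
    '--model', 'QRNN',           # QRNN instead of the default LSTM
    '--data', 'data/rerites',    # placeholder corpus directory
    '--epochs', '500',
    '--nhid', '1550',
    '--nlayers', '4',
    '--wdrop', '0.1',            # weight drop on the recurrent matrices
    '--batch_size', '20',
    '--save', 'model.pt',        # checkpoint later consumed by generate.py
], check=True)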

Same crappy results. But boy does it churn out verses quickly: 2651 poems in 16m09s, roughly 2.7 poems per second; at 88 words per poem, that is about 240 words per second.

Pause video below to read.

Text: generated-2017-12-14T15-07-07_Screencast 2017-12-14 15:23:19
