Category Archives: poems

Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation [Screencast 2017-12-14 15:23:19]

Trying out a new algorithm, https://github.com/salesforce/awd-lstm-lm, using another mildly revised training corpus.

Sources: a subset of Poetry Magazine, Jacket2, Bob Marley, Bob Dylan, David Bowie, Tom Waits, Patti Smith, Radiohead, 2 River, Capa, Evergreen Review, jhavelikes.tumblr.com, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites.

Same crappy results. But boy does it churn out verses quickly: 2651 poems in 16m09s (approx 2.75 poems per second, each poem is 88 words long).
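(For scale, that is 2651 × 88 ≈ 233,000 words in just over sixteen minutes.) A batch like this can be produced with the repo's stock generate.py, inherited from PyTorch's word_language_model example; the exact flags below are my assumption about this run, not a record of it:

$ python generate.py --cuda --checkpoint model.pt --words 88 --temperature 1.0

One 88-word poem per invocation; looping a call like this would give a batch of the size above.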

Pause video below to read.

TEXT file: generated-2017-12-14T15-07-07_Screencast 2017-12-14 15:23:19


SUBSETS: Small Data (Repeat Mimesis provoked by lack of Dropout)

So after last week’s disastrous expansion, I decided to get austere: implement strategic pruning and reduce the corpus.

Selections were integrated from Bob Dylan, Tom Waits, Patti Smith, 2 River, Capa, Evergreen Review, Tumblr, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites.

Because the training occurred without dropout, the model is overfit and a lot of direct quotations seep into the output.
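For context, dropout between stacked LSTM layers is a one-argument change in PyTorch. A minimal sketch (the layer sizes here are hypothetical, not this run's actual settings):

import torch.nn as nn

# Dropout is applied between stacked layers whenever num_layers > 1.
# Training with dropout left at 0.0, as in this run, lets the model
# memorize (and later regurgitate) stretches of the corpus.
model = nn.LSTM(input_size=650, hidden_size=650, num_layers=2, dropout=0.5)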

Text: Screencast 2017-12-10 14:22:56_SUBSET2

Text: Screencast 2017-12-09 19:31:37

Rerites (6 months! 6 books!)

Poetry books. One a month.

Generated by a computer. Edited by a human.

Project duration: 05.2017 – 05.2018

6 books completed in 6 months.

Rerites October 2017 was just published today: preview it online (entire text) or order a copy on Blurb.


162 pages, 8000 words.

Excerpt:

All That I Know For All That Is Left

I know your blue throat, 
 writing as you sleep. 
 
Smoke-wreaths howling 
 over the hot sea 
 to see my voice 
 fall so near your feet.

To see the wild light
 as rain breaks
 into thin snow 
 as i pause
 on the stairs 
 above the kitchen.

These are minutes to worry 
 about. The rain caught 
 rushing at windows, begging 
 to hear the garden in the sun.

upcoming

Added 5 months of Rerites (http://glia.ca/2017/rerites/) into the training corpus.

Will use as source text for November Rerites.

pytorch-poet_Screencast 2017-10-30 23:38:22_OCTOBER-Rerites_source

To read other Rerite months, visit http://glia.ca/2017/rerites/

Incremental TEMPERATURE Increases (Seed-text for Sept RERITES)

In the egg, the child is simple. A wet light. Lurching.

Body is wind. The sun in the sea.

Then as if more, motions, the shadows of trees.

The ineluctable diffusion of randomized complexity.

Drainfgdl gsod. Trainins spekcled!


Poetry evolves as language does in organisms, from simple to complex, from simile and homily to histrionics. Increments in the temperature of a neural net model simulate time.

For high temperatures (τ → ∞), all actions have nearly the same probability, and the lower the temperature, the more expected rewards affect the probability. For a low temperature (τ → 0+), the probability of the action with the highest expected reward tends to 1.

https://en.wikipedia.org/wiki/Softmax_function
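As a minimal sketch of the mechanism (an illustration, not the project's actual sampling code), temperature-scaled softmax sampling in PyTorch looks like:

import torch

def sample_with_temperature(logits, temperature=1.0):
    # Scaling the logits reshapes the softmax distribution:
    # temperature -> 0+ approaches argmax (the likeliest token always wins),
    # temperature -> infinity approaches a uniform distribution.
    probs = torch.softmax(logits / temperature, dim=-1)
    return torch.multinomial(probs, num_samples=1).item()

Stepping the temperature upward over a run yields exactly the drift sampled above: near-corpus phrases at low τ, speckled nonsense at high τ.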

~ + ~

~ + ~

Text: Screencast 2017-08-30 00:00:40_incrementalTEMPERATURE_PoetryPytorch

~ + ~

This source will become the first seed-text for September’s RERITES.

O what the heck. Why not one more last deranged excessive epic deep learning poetry binge, courtesy of pytorch-for-poetry-generation.

Personally I like the coherence of PyTorch: its capacity to hold the disembodied recalcitrant veil of absurdity over a somewhat stoic normative syntactical model.

Text:

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

Code:

https://github.com/jhave/pytorch-poetry-generation
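Reading the hyperparameters back out of that filename, and assuming the scripts keep the flag names of the PyTorch word_language_model example they derive from (an assumption, not a documented command), the training invocation was roughly:

$ python main.py --cuda --model LSTM --emsize 1500 --nhid 1500 --nlayers 2 --batch_size 20 --epochs 7

The perplexity in the filename is just the exponentiated loss: e^6.02 ≈ 412.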

Excerpt:

Jumble, Rub Up The Him-Whose-Penis-Stretches-Down-To-His-Knees. 

 
 The slow-wheeling white Thing withheld in the light of the Whitman? 
 The roof kisses the wounds of blues and yellow species. 
 Far and cold in the soft cornfields bending to the gravel, 
 Or showing diapers in disclosure, Atlantic, Raymond 
 Protract the serried ofercomon, — the throats "I've used to make been sustene, 
 Fanny, the inner man clutched to the keep; 
 Who meant me to sing one step at graves. 

A few SLOW excerpts

from here

accepting tall any flowers, forever with one question
of boots, neural, dead forgotten the glass
of cloud, start and more, who studied legends
and wanted to ascend
Every inch alone you and this desire
tulips of sounds
watching the witness
On the intensity lolling it.

Summer up warmishment
The girls crack our hearts
she set and quickens, swarms at the edge.
where they could wake? It begins
how much design, anthounces are taught her.
Illusion, mimicry a chalk.
when you’re lonely, black large in the calico,
where the bachelo forking genes
might in dusty confidently,
ignore and suck with the main grove,
the dream in the darkness, found hear
if these wobbles, silver for the man.

Deaf-soot he’s an edging of ships
a border that is the court.
The soul,
aroused, breakfast

who wants shapelesse lava
So long as nothing moved oversized no mountains of an eternity

I fed him into oned
There is to say something a fog and mask at writing
minild, the moon and cair of his screens

It is there learning, loving down, screeching.

mystery, painted Spring.
wings
as mid-afternoon, fetlocks

uncurledding cheaping full of pale
eternal, grabs us. Flowers try

migrating every idea
and whispered at the
morning machine

looking at her skin, burst

from her will.

WVNT regamed (blind, still bland, but intriguing)

The opposite of scarcity is not abundance: saturation inevitably damages, or perhaps it just corrodes or blunts the impassioned pavlov puppy, delivering a dent of tiny deliberate delirious wonder.

Yet technical momentum now propels and impels continuance: how can the incoherence be tamed? Set some new hyperparameters and let the wvnt train for 26 hours.

Over this weekend, I’ve churned out about 100,000 lines. Generating reckless amounts of incoherent poetry threatens more than perspective or contemplation; it totters sanity on the whim of a machine. Teeming bacteria, every epiphany a whiff of redundancy.

$ python train_2017_py3p5_150k_low6_sample4096_SAVE-STARTS_100k.py \
    --wavenet_params=wavenet_params_ORIG_dilations2048_skipChannels8192_qc2048_dc32.json \
    --data_dir=data/2017

Using default logdir: ./logdir/train/2017-06-01T08-38-45 

_______________________________________________________________________________________________

dilations: 2048 filter_width: 2 residual_channels: 32
dilation_channels: 32 skip_channels: 8192 quantization_channels: 2048
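Those six values are presumably just the contents of the --wavenet_params JSON echoed back at startup; reassembled from what the log prints (the exact field layout of the file is an assumption):

{
  "dilations": 2048,
  "filter_width": 2,
  "residual_channels": 32,
  "dilation_channels": 32,
  "skip_channels": 8192,
  "quantization_channels": 2048
}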

(tf0p1-py3.5-wvnt) jhave@jhave-UbuntuScreencast 2017-06-02 11:08:14_2017-06-01T08-38-45

and faithful wound 
To fruit white, the dread 
One by one another, Image saved-- 
Ay of the visit. What pursued my heart to brink. 


Such the curse of hopes fraught memory;

tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-06-02 11:08:14_2017-06-01T08-38-45

Caught a new light bulb,   
All the heart is grown.

TXTs generated in SLOW MODE

There’s a way of calculating the matrices that taxes the strength of even a magnificent GPU, making production crawl and the computer difficult to use. Each of the following txt files (4444 letters in each) took about 40-60 minutes to generate (roughly 1.2 to 1.9 characters per second) on an Nvidia Maxwell TitanX using CUDA 8 on Ubuntu 16.04.

Txts generated slow seem somehow thicker, as if issued from a more calibrated mentation, yet at the same time it’s math scat, glitch flow. Glisses from disintegrating encyclopedias.

Here are some samples:

I found myself within us wonder.
   You purchase as ease with water over events,
   because the straightforward that I miximally, she
   Don't sports commentation with its ruffled story

tf0p1-py3.5-wvnt_jhave-Ubuntu_SLOW_4444charsx4_2017-06-02-16T08_2017-06-01T08-38-45_model.ckpt-117396

Wvnt Tamed

So the abrasive brash gutter voice of the neural net seemed maybe due to lack of long-term exposure, wider horizons, deeper reading into the glands of embodied organisms. So I set the hyperparameters higher, waited 26 hours, and watched the Ubuntu HD fill up with models to the point where the OS crashed on reboot and I found myself entering a mysterious command-line universe called GRUB… which is to say, apprenticing a digital poet is not without perils.


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 15:19:36_Wavenet_2017-05-30T10-36-56


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 11:49:04_Wavenet_2017-05-30T10-36-56