PyTorch 1860 hidden layers 31 epochs (July 2017 RERITES Source)

 

The second video here became the source-text for July 2017 RERITES  http://glia.ca/rerites/

+~+

PyTorch Poetry Language Model.
Trained on approx 600,000 lines of poetry
http://bdp.glia.ca

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_30-loss_6.00-ppl_405.43.pt'

Mode: LSTM
Embedding size: 1860
Hidden Layers: 1860
Batch size: 20
Epoch: 30
Loss: 6.00
Perplexity: 405.43

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39.pt'

Mode: LSTM
Embedding size: 1860
Hidden Layers: 1860
Batch size: 20
Epoch: 31
Loss: 6.00
Perplexity: 405.39
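The perplexity reported in these checkpoints is simply the exponential of the cross-entropy loss (the printed loss is rounded to two decimals), which a quick sanity check confirms:

```python
import math

# Perplexity is exp(cross-entropy loss); the printed loss (6.00) is rounded.
def perplexity(loss):
    return math.exp(loss)

# Recovering the loss from the reported perplexity lands back near 6.00:
recovered_loss = math.log(405.39)
assert abs(recovered_loss - 6.00) < 0.01
```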

+~+

Ridges— Ourselves?

4
K-Town: ideality;
The train lost, “Aye man!
O old beggar, O perfect friend;
The bath-tub before the Bo’s’n resentments; pissing
rimed metaphors in the white pincers
scratching and whiten each of a clarity
in the sky the sacred hoof of eastward,
arc of the pestle through sobered the cliffs to the own world.

+~+

TXT Version:

jhave@jhave-Ubuntu_Screencast 2017-06-26 09_45_14_2017-06-17T09-22-17_model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

+~+

O what the heck. Why not one more last deranged excessive epic deep learning poetry binge courtesy of pytorch-for-poetry-generation

Personally I like the coherence of PyTorch, its capacity to hold the disembodied recalcitrant veil of absurdity over a somewhat stoic normative syntactical model.

Text:

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

Code:

https://github.com/jhave/pytorch-poetry-generation

Excerpt:

Jumble, Rub Up The Him-Whose-Penis-Stretches-Down-To-His-Knees. 

 
 The slow-wheeling white Thing withheld in the light of the Whitman? 
 The roof kisses the wounds of blues and yellow species. 
 Far and cold in the soft cornfields bending to the gravel, 
 Or showing diapers in disclosure, Atlantic, Raymond 
 Protract the serried ofercomon, — the throats "I've used to make been sustene, 
 Fanny, the inner man clutched to the keep; 
 Who meant me to sing one step at graves. 

Fermentation & Neural Nets

Mead recipe: dilute honey with water, stir twice a day, wait.  

Fermentation begins after 24-48 hours. After a week, the fermented honey-wine (mead) can be enjoyed green, low in alcohol yet lively with essence. Or you can let it continue.

Generative-poetry recipe: text-corpus analysed by neural net, wait.

After each reading of the corpus (a.k.a. a ‘training epoch’), a neural net can produce and save a model. Think of the model as an ecosystem produced by fermentation: an idea, a bubbling contingency, a knot of flavours, a succulent reservoir, a tangy substrate that represents the contours, topology and intricacies of the source corpus. Early models (after 4-7 epochs, the models are still young) may be obscure but are often buoyant with energy (like kids mimicking modern dance).
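This one-model-per-epoch rhythm can be sketched in a few lines; the training itself is elided, but the filename convention below mirrors the checkpoints quoted throughout this post, and the per-epoch losses are invented for illustration:

```python
import math

def checkpoint_name(epoch, loss, emsize=1860, nhid=1860, nlayers=2, batch=20):
    """Bottle one model per epoch, named the way the checkpoints in this post are."""
    ppl = math.exp(loss)  # perplexity is exp(loss)
    return (f"model-LSTM-emsize-{emsize}-nhid_{nhid}-nlayers_{nlayers}"
            f"-batch_size_{batch}-epoch_{epoch}-loss_{loss:.2f}-ppl_{ppl:.2f}.pt")

# Invented losses: young models (epochs 2, 6) ferment down to a mature one (30).
saved = [checkpoint_name(e, l) for e, l in [(2, 7.10), (6, 6.40), (30, 6.0049)]]
```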

Here’s an example from a model that is only 2 epochs old:

Daft and bagel, of sycamore.
“Now thou so sex–sheer if I see Conquerours.

So he is beneath the lake, where they pass together,
Amid the twister that squeeze-box fire:

Here’s an example from a model that is 6 epochs old:

Yellow cornbread and yellow gems
and all spring to eat.

Owls of the sun: the oldest worm of light
robed in the advances of a spark.

Later models (after 20 epochs), held in vats (epoch after epoch), exhibit more refined microbial (i.e. algorithmic) calibrations:

Stale nose of empty lair

The bright lush walls are fresh and lively and weeping,
Whirling into the night as I stand

unicorn raging on the serenade
Of floors of sunset after evil-spirits,
adagio.

Eventually fermentation processes halt. In the case of mead, this may occur after a month; with neural nets, a process called annealing intentionally decreases the learning rate as training proceeds, so the system begins by exploring large features and then focuses on details. Eventually the learning rate diminishes toward zero. Learning (fermentation) stops.
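A minimal sketch of that cooling schedule, following the stock PyTorch word_language_model convention of dividing the learning rate whenever validation loss stops improving (the loss values here are invented):

```python
def anneal(lr, val_losses, factor=4.0):
    """Divide the learning rate whenever validation loss fails to improve."""
    best = float("inf")
    history = []
    for loss in val_losses:
        if loss >= best:      # no improvement: cool the system down
            lr /= factor
        else:
            best = loss
        history.append(lr)
    return history

# Invented losses: improvement early, then a plateau that drives lr toward zero.
rates = anneal(20.0, [7.0, 6.5, 6.2, 6.2, 6.21, 6.2, 6.2])
```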


TEXT FILES

2017-04-30 09:49:10_EPOCH-6
2017-04-30 13:27:15_EPOCH-16
2017-04-30 23:03:42_EPOCH-4
2017-05-03 09:41:35_2017-05-02T06-19-17_EPOCH22
2017-05-03 12:54:57_2017-05-02T06-19-17_EPOCH-6
2017-05-12 12:24:54_2017-05-02T06-19-17_EPOCH22
2017-05-13 21:24:06_2017-05-02T06-19-17_-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27
2017-05-14 23:51:14_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_8-loss_6.03-ppl_417.49
2017-05-15 11:00:03_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_3-loss_6.09-ppl_439.57
2017-05-15 11:28:24_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_4-loss_6.04-ppl_421.71
2017-05-15 11:53:40_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_5-loss_6.02-ppl_413.64
2017-05-15 21:08:56_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_9-loss_6.02-ppl_409.87
2017-05-15 21:48:55_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_10-loss_6.03-ppl_416.36
2017-05-15 22:21:11_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_11-loss_6.02-ppl_413.61
2017-05-15 22:48:53_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_12-loss_6.03-ppl_414.69
2017-05-15 23:24:25_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_13-loss_6.03-ppl_417.08
2017-05-15 23:53:21_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_14-loss_6.03-ppl_416.68
pytorch-poet_2017-05-02T06-19-17_models-6-7-8-9-10-11-12-13-14
2017-05-16 09:26:18_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_15-loss_6.03-ppl_416.60
2017-05-16 10:08:15_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_16-loss_6.03-ppl_414.91
2017-05-16 10:45:58_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_17-loss_6.03-ppl_415.35
2017-05-18 23:48:36_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_18-loss_6.03-ppl_414.10
2017-05-20 10:35:17_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_19-loss_6.03-ppl_413.68
2017-05-18 22:10:36_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_17-loss_6.03-ppl_415.35
2017-05-20 22:01:22_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_20-loss_6.02-ppl_413.08
2017-05-21 12:58:03_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_21-loss_6.02-ppl_413.07
2017-05-21 14:38:29_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_22-loss_6.02-ppl_412.68

Nameless

It’s strange, but I sometimes want to give the mathematical models (created by the neural nets) names. I think of them as having personalities, like Bob or Eliza or Abnor Malo or Isa Phren, and I want to know them by name, because names convey spirit and character. Names encompass (or tolerate) the eerie uncanny simulacrum personality evoked by lines that seem real.

these things are

long reflecting gray

like pleasure to love the river hard

people are sketched through the streets

and it is all so green

in the impartial spiral

a cloud of art

a light of lovers

to speak of the education of salt

And if these lines can be written by a machine (that has read many lines written by humans), I wonder if existence is not just an extended copy machine. Maybe personality is also programmed, programmable; and the sweet radiant wonderful gift of human creativity is just a reflection of evolution, a glint in the universe’s code.

or the skull whose form is of the secret truth

and in that tender place gets still


Tonight

I decided to try another model from the most recent PyTorch for Poetry Generation. Model: “2017-02-15T11-07-50/model-LSTM-emsize-512-nhid_512-nlayers_2-batch_size_20-epoch_15-loss_6.50-ppl_664.33.pt”


dream-racked love-squinting

ground where the onion and musk is lost

I drink you across the gardens ford

I worked as it played, so there are several moments in the screengrab where my interface interrupts for a second. Then I showered. Then I lay on the couch, twisting the screen to face me, in my housecoat under a quilt, watching the poems scroll by.

though the body opened with silence

the skeletons of trees filled with poison

Each of the poems is an ephemeral vision, a house seen from the window of a train, partially glimpsed then gone, blurred, a flock of birds, a boy under the autumn mantle star with its deep shadow threshing the luckless dead.

to be hurt and will not

i push a step and begin to come alone, back from it, after winter

i did not wear the beat of my fingers

i knew where the peace loves me at last

I do not know what to call this model, but I do know it speaks:

The soul is Woven view

The body of a life with words


View the 2 hour run at


Read it all here.

 

4 hours of Pytorch + 2 hours and 29m of Wavenet for Poetry Generation [SILENT 04-03-2017]

PyTorch word-language-model poetry is more stable and sane than Wavenet. PyTorch is regal, educated, less prone to misspellings or massive neologisms. Wavenet is edgy, erratic, clumped; its visual dilation more contrite.

Yet reading each of these films is like witnessing a collage of avalanched literary modes and moods drift by, icons, tropes, techniques, incandescent, eerie, somnolent and deranged.

~

Warning: Vocabularies archive ideological debris. Monotheist, racist, and misogynist terms clot like toxic ore amid iridescent love proclamations, stoic iron, clouds that marry the ocean.

~

17,000 lines output into a single text file

PyTorch Poetry Generation [Pre-WordHack : Epoch 16 Video]

Another day of testing before going to NYC to perform neural-net poems at WordHack [NYC (Thursday 2/16/2017 @ Babycastles . 7-10pm) w. Sarah Rothberg, John Cayley and Theadora Walsh]

HOPE 

 In the cold weather going out of the snow, 
 She down the lawn. 
 
 The air moves and grows, while she walks smooth, 
 When a swan is born, 
 And it's almost happening 
 
 Who knows what to say 
 The change has brought 
 Throwing the first blood in its face.

It’s clear:

Never will this mode of randomized pattern-reasoning replicate the nuanced human heart. More robust ensemble methods that simulate embodied experience, temporal reflexes, and nested community idioms will be required.

Deep learning is still shallow. The cloud does not understand honey, home or heart. Yet in the short-term, this is the future of writing: a computational assistant for an engaged imagination intent on exploring the topological feature-space of potential phrases.

Done:

Modulated the parameters: raised both embedding size and hidden layers to 512. And did a bit more data mining and parsing to increase the corpus size by a third, to 20 MB of .txt.

Mode: LSTM
Embedding size: 512
Hidden Layers: 512
Batch size: 20
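For a rough sense of scale, the parameter count this configuration implies can be tallied by hand. A sketch, assuming a vocabulary of 50,000 words (an assumption, not a figure from this post) and counting one bias vector per LSTM gate (PyTorch's LSTM actually keeps two, a negligible difference):

```python
def lstm_params(input_size, hidden):
    # 4 gates, each with input weights, recurrent weights, and one bias vector
    return 4 * (hidden * (input_size + hidden) + hidden)

vocab, emsize, nhid = 50_000, 512, 512   # vocabulary size is an assumption

embedding = vocab * emsize                                        # input lookup table
recurrent = lstm_params(emsize, nhid) + lstm_params(nhid, nhid)   # 2 LSTM layers
decoder = nhid * vocab + vocab                                    # output projection + bias
total = embedding + recurrent + decoder                           # roughly 55 million weights
```

Most of the weight lives in the vocabulary-facing embedding and decoder matrices, not in the recurrent core.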

Expanded Corpus to over 600,000 lines

639,813 lines of poetry from 5 websites.

Poetry Foundation
Jacket2
Capa - Contemporary American Poetry Archive
Evergreen Review
Shampoo Poetry

Continue reading “PyTorch Poetry Generation [Pre-WordHack : Epoch 16 Video]”

40 Minutes of PyTorch Poetry Generation [Real-time SILENT]

Promising results that reflect the limits of a machine without empathy, skilled as a mimic of pattern, lacking long-term memory, emulating cadence and inflections, yet indifferent to context, experience and continuity.

Code: github.com/jhave/pytorch-poetry-generation

60 minutes of poetry output below the break:

A LAND IN SEASON 

 so much a child is up, 
 so much what he cannot feel 
 has found no knowledg more 
 of age, or of much friends 
 
 which, nothing thinks himself. spok'n 
 not knowing what is being 
 
 doing? or else wanting as 
 that 


Continue reading “40 Minutes of PyTorch Poetry Generation [Real-time SILENT]”

PyTorch LSTM Day 2 : Killed (after only 40 epochs)

My dream of an immaculate mesmerizing machine to replace all human imagination and absorb it into an engaging perpetual torrent of linguistic cleverness dissipated.

Yesterday, I let the GPU run overnight, expecting to return to 120 epochs and a stunning result.

Instead, on waking the computer in the morning:

-----------------------------------------
| end of epoch  40 | time: 452.85s 
| valid loss  5.84 | valid ppl   344.72
----------------------------------------
SAVING: models/2017-02-06T17-39-04/model-LSTM-epoch_40-loss_5.84-ppl_344.72.pt
Killed

 

The simulacrum had miscarried. The entire thread had been killed (automatically? by what clause?). Considering the results in glum melancholy, I realized it had been killed because 5 epochs had passed without improvement.
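That kill clause is ordinary early stopping. A minimal sketch, with the patience of 5 epochs taken from the paragraph above and the validation losses invented for illustration:

```python
def train_until_stalled(val_losses, patience=5):
    """Return the epoch at which training is 'Killed': `patience`
    consecutive epochs pass without a new best validation loss."""
    best = float("inf")
    stale = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, stale = loss, 0    # improvement resets the clock
        else:
            stale += 1
            if stale >= patience:
                return epoch         # the thread is killed here
    return None  # ran to completion without stalling

# Invented run: steady improvement for 35 epochs, then a plateau.
losses = [7.0 - 0.033 * e for e in range(1, 36)] + [5.85] * 5
killed_at = train_until_stalled(losses)   # the plateau triggers the kill at epoch 40
```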

Yet, after dusting off the 40 models that existed, many intriguing gems emerged; spliced, they suggest a latent lucidity:

without regret,
 played with a smooth
 raid of soiled petals, the color
 of rage and blood away--
 pinched your nose
the unwavering wind brushed the crystal edge from the stack,
 it came in the mirror adam's--
 eleven miles from the unholy relic
 and i set off
 into the absence of old themes,
 ... looking for the wreck of the rare summers
dark silks and soft blonde feather

on pink sky that hid a blue sun
 where it became dwelling pointing dead
 its lip rattled its green pride, thread-bare

 

Code on Github: https://github.com/jhave/pytorch-poetry-generation

Read the entire UNEDITED batch of 40 generated poems of 111 words after the break:

Continue reading “PyTorch LSTM Day 2 : Killed (after only 40 epochs)”