Category: digital poetry

PyTorch 1860 hidden units, 31 epochs

PyTorch Poetry Language Model.
Trained on approx 600,000 lines of poetry
http://bdp.glia.ca

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_30-loss_6.00-ppl_405.43.pt'

Mode: LSTM
Embedding size: 1860
Hidden units: 1860
Batch size: 20
Epoch: 30
Loss: 6.00
Perplexity: 405.43

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39.pt'

Mode: LSTM
Embedding size: 1860
Hidden units: 1860
Batch size: 20
Epoch: 31
Loss: 6.00
Perplexity: 405.39
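What the generate script does, stripped to its loop: load the pickled model, then sample one word at a time, feeding each choice back in. A minimal sketch, assuming the script keeps the shape of the upstream pytorch/examples word_language_model generate.py it extends (the init_hidden/forward API and the idx2word mapping are assumptions from that upstream example), rewritten against current PyTorch for legibility:

    import torch
    import torch.nn.functional as F

    def sample_words(model, idx2word, n_words=100, temperature=1.0, device="cuda"):
        model.eval()
        hidden = model.init_hidden(1)                              # batch of one
        inp = torch.randint(len(idx2word), (1, 1), device=device)  # random seed word
        words = []
        with torch.no_grad():
            for _ in range(n_words):
                logits, hidden = model(inp, hidden)                # one step at a time
                probs = F.softmax(logits.squeeze() / temperature, dim=-1)
                idx = torch.multinomial(probs, 1)                  # sample, don't argmax: poetry wants variance
                inp = idx.view(1, 1)
                words.append(idx2word[idx.item()])
        return " ".join(words)

Sampling from the softmax (rather than taking the most likely word) is what keeps the output from looping on high-frequency phrases.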

+~+

Ridges— Ourselves?

4
K-Town: ideality;
The train lost, “Aye man!
O old beggar, O perfect friend;
The bath-tub before the Bo’s’n resentments; pissing
rimed metaphors in the white pincers
scratching and whiten each of a clarity
in the sky the sacred hoof of eastward,
arc of the pestle through sobered the cliffs to the own world.

+~+

TXT Version:

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

+~+

VIDEO Version:

WVNT regamed (blind, still bland, but intriguing)

The opposite of scarcity is not abundance: saturation inevitably damages, or perhaps it just corrodes or blunts the impassioned pavlov puppy, delivering a dent of tiny deliberate delirious wonder.

Yet technical momentum now propels and impels continuance: how can the incoherence be tamed? Set some new hyperparameters and let the wvnt train for 26 hours.

Over this weekend, I’ve churned out about 100,000 lines. Generating reckless amounts of incoherent poetry threatens more than perspective or contemplation; it sets sanity tottering on the whim of a machine. Teeming bacteria, every epiphany a whiff of redundancy.

$ python train_2017_py3p5_150k_low6_sample4096_SAVE-STARTS_100k.py \
    --wavenet_params=wavenet_params_ORIG_dilations2048_skipChannels8192_qc2048_dc32.json \
    --data_dir=data/2017

Using default logdir: ./logdir/train/2017-06-01T08-38-45 

_______________________________________________________________________________________________

dilations: 2048
filter_width: 2
residual_channels: 32
dilation_channels: 32
skip_channels: 8192
quantization_channels: 2048
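For reference, a hedged reconstruction of the hyperparameter file named in that command. The key names follow the wavenet_params.json convention of ibab/tensorflow-wavenet; reading "dilations: 2048" as the top of a doubling dilation stack is my assumption, as is the exact file layout:

    import json

    params = {
        "filter_width": 2,
        "dilations": [2 ** i for i in range(12)],  # 1, 2, 4, ... 2048
        "residual_channels": 32,
        "dilation_channels": 32,
        "skip_channels": 8192,
        "quantization_channels": 2048,
    }

    with open("wavenet_params_ORIG_dilations2048_skipChannels8192_qc2048_dc32.json", "w") as f:
        json.dump(params, f, indent=4)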

(tf0p1-py3.5-wvnt) jhave@jhave-UbuntuScreencast 2017-06-02 11:08:14_2017-06-01T08-38-45

and faithful wound 
To fruit white, the dread 
One by one another, Image saved-- 
Ay of the visit. What pursued my heart to brink. 


Such the curse of hopes fraught memory;

tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-06-02 11:08:14_2017-06-01T08-38-45

Caught a new light bulb,   
All the heart is grown.

TXTs generated in SLOW MODE

There’s a way of calculating the matrices that taxes the strength of even a magnificent GPU, making production crawl and the computer difficult to use. Each of the following txt files (4444 letters in each) took about 40-60 minutes to generate on an Nvidia Maxwell TitanX using CUDA 8 on Ubuntu 16.04.
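The slowness is structural, not accidental: character-level sampling is autoregressive, one full forward pass through all those dilated layers per character, so a 4444-character file costs 4444 network evaluations. Back-of-envelope, from the figures above:

    chars_per_file = 4444
    minutes = 50                                          # midpoint of the observed 40-60
    print(f"{minutes * 60 / chars_per_file:.2f} s/char")  # ~0.67 seconds per character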

Txts generated slowly seem somehow thicker, as if issued from a more calibrated mentation; yet at the same time it’s math scat, glitch flow. Glisses from disintegrating encyclopedias.

Here are some samples:

I found myself within us wonder.
   You purchase as ease with water over events,
   because the straightforward that I miximally, she
   Don't sports commentation with its ruffled story

tf0p1-py3.5-wvnt_jhave-Ubuntu_SLOW_4444charsx4_2017-06-02-16T08_2017-06-01T08-38-45_model.ckpt-117396

Wvnt Tamed

The abrasive, brash gutter voice of the neural net seemed maybe due to a lack of long-term exposure, wider horizons, deeper reading into the glands of embodied organisms. So I set the hyperparameters higher, waited 26 hours, and watched the Ubuntu HD fill up with models to the point where the OS crashed on reboot and I found myself entering a mysterious cmd-line universe called GRUB… which is to say, apprenticing a digital poet is not without perils.


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 15:19:36_Wavenet_2017-05-30T10-36-56


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 11:49:04_Wavenet_2017-05-30T10-36-56

Fermentation & Neural Nets

Mead recipe: dilute honey with water, stir twice a day, wait.  

Fermentation begins after 24-48 hours. After a week, the fermented honey-wine (mead) can be enjoyed green, low in alcohol yet lively with essence. Or you can let it continue.

Generative-poetry recipe: text-corpus analysed by neural net, wait.

After each reading of the corpus (a.k.a. a ‘training epoch’), a neural net can produce/save a model. Think of the model as an ecosystem produced by fermentation, an idea, a bubbling contingency, a knot of flavours, a succulent reservoir, a tangy substrate that represents the contours, topology or intricacies of the source corpus. Early models (after 4-7 epochs they are still young) may be obscure but are often buoyant with energy (like kids mimicking modern dance).
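In code, the bottling step is tiny. A sketch of the per-epoch save, with the filename format copied from the checkpoints quoted throughout this page; pickling the whole module follows the upstream word_language_model habit, which is an assumption here:

    import math
    import torch

    def bottle(model, epoch, loss, emsize=1500, nhid=1500, nlayers=2, batch_size=20):
        ppl = math.exp(loss)  # perplexity is exp of the cross-entropy loss
        name = (f"model-LSTM-emsize-{emsize}-nhid_{nhid}-nlayers_{nlayers}"
                f"-batch_size_{batch_size}-epoch_{epoch}-loss_{loss:.2f}-ppl_{ppl:.2f}.pt")
        torch.save(model, name)  # save the whole module, one bottle per epoch
        return name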

Here’s an example from a model that is only 2 epochs old:

Daft and bagel, of sycamore.
“Now thou so sex–sheer if I see Conquerours.

So he is beneath the lake, where they pass together,
Amid the twister that squeeze-box fire:

Here’s an example from a model that is 6 epochs old:

Yellow cornbread and yellow gems
and all spring to eat.

Owls of the sun: the oldest worm of light
robed in the advances of a spark.

Later models (after 20 epochs), held in vats (epoch after epoch), exhibit more refined microbial (i.e. algorithmic) calibrations:

Stale nose of empty lair

The bright lush walls are fresh and lively and weeping,
Whirling into the night as I stand

unicorn raging on the serenade
Of floors of sunset after evil-spirits,
adagio.

Eventually fermentation processes halt. In the case of mead, this may occur after a month; with neural nets, annealing (a schedule named on the model of simulated annealing) intentionally decreases the learning rate as training proceeds, so the system begins by exploring large features, then focuses on details. Eventually the learning rate diminishes to zero. Learning (fermentation) stops.
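The same idea in miniature, and hedged: the 20.0 start and the divide-by-4 schedule are the upstream word_language_model defaults, not necessarily this repo’s, and train_one_epoch is a hypothetical stand-in for a full pass over the corpus:

    import random

    def train_one_epoch(lr):
        return 6.0 + random.uniform(-0.05, 0.05)  # simulated validation loss

    lr, best = 20.0, float("inf")
    for epoch in range(1, 41):
        val_loss = train_one_epoch(lr)
        if val_loss >= best:
            lr /= 4.0            # no improvement: cool the ferment
        best = min(best, val_loss)
        if lr < 1e-3:
            break                # learning (fermentation) stops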


TEXT FILES

2017-04-30 09:49:10_EPOCH-6
2017-04-30 13:27:15_EPOCH-16
2017-04-30 23:03:42_EPOCH-4
2017-05-03 09:41:35_2017-05-02T06-19-17_EPOCH22
2017-05-03 12:54:57_2017-05-02T06-19-17_EPOCH-6
2017-05-12 12:24:54_2017-05-02T06-19-17_EPOCH22

2017-05-13 21:24:06_2017-05-02T06-19-17_-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27
2017-05-14 23:51:14_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_8-loss_6.03-ppl_417.49
2017-05-15 11:00:03_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_3-loss_6.09-ppl_439.57
2017-05-15 11:28:24_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_4-loss_6.04-ppl_421.71
2017-05-15 11:53:40_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_5-loss_6.02-ppl_413.64
2017-05-15 21:08:56_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_9-loss_6.02-ppl_409.87
2017-05-15 21:48:55_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_10-loss_6.03-ppl_416.36
2017-05-15 22:21:11_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_11-loss_6.02-ppl_413.61
2017-05-15 22:48:53_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_12-loss_6.03-ppl_414.69
2017-05-15 23:24:25_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_13-loss_6.03-ppl_417.08
2017-05-15 23:53:21_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_14-loss_6.03-ppl_416.68
pytorch-poet_2017-05-02T06-19-17_models-6-7-8-9-10-11-12-13-14
2017-05-16 09:26:18_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_15-loss_6.03-ppl_416.60
2017-05-16 10:08:15_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_16-loss_6.03-ppl_414.91
2017-05-16 10:45:58_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_17-loss_6.03-ppl_415.35
2017-05-18 23:48:36_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_18-loss_6.03-ppl_414.10
2017-05-20 10:35:17_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_19-loss_6.03-ppl_413.68
2017-05-18 22:10:36_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_17-loss_6.03-ppl_415.35
2017-05-20 22:01:22_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_20-loss_6.02-ppl_413.08
2017-05-21 12:58:03_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_21-loss_6.02-ppl_413.07
2017-05-21 14:38:29_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_22-loss_6.02-ppl_412.68

RERITES Archives

RERITES are poems
written by neural-nets
then human-edited.

For the complete output (often updated daily) visit the RERITES archives.

Here are a few samples:


Pray For A Moment.

In her soft
the hill is strewn

Across the chaff swift door,
That’s her.

The steady level calling.
The ash, similarly still.

 

~ + ~

 

Corpses Let Them Stoop

To carry a neighbor’s time

And once again the world,
seeing whoever it will be,
is not
moulded

and ceases to cry
straying in
flames

 

~ + ~

Few retrieve the sad rocks, that have defined their time

The mind
Blows Back

The lapping waves
a mossy hatred’s stubbing on,
the summer’s sky

distant hair
gone thin and soft
in every living stillness.

 

~ + ~

 

III

Out of the hail
Where the population of illusions
Sing strong in the sky,

Where dawning ships
pray at shaken rocks
in the stupor dive of market-place rigging

And homeward-drawing star-showers,
Are in a body.

~ + ~

Nameless

It’s strange, but I sometimes want to give the mathematical models (created by the neural nets) names. I think of them as having personalities, like Bob or Eliza or Abnor Malo or Isa Phren, and I want to know them by name, because names convey spirit and character. Names encompass (or tolerate) the eerie uncanny simulacrum personality evoked by lines that seem real.

these things are

long reflecting gray

like pleasure to love the river hard

people are sketched through the streets

and it is all so green

in the impartial spiral

a cloud of art

a light of lovers

to speak of the education of salt

And if these lines can be written by a machine (that has read many lines written by humans), I wonder if existence is not just an extended copy machine. Maybe personality is also programmed, programmable; and the sweet radiant wonderful gift of human creativity is just a reflection of evolution, a glint in the universe’s code.

or the skull whose form is of the secret truth

and in that tender place gets still


Tonight

I decided to try another model from the most recent PyTorch for Poetry Generation. Model: "2017-02-15T11-07-50/model-LSTM-emsize-512-nhid_512-nlayers_2-batch_size_20-epoch_15-loss_6.50-ppl_664.33.pt"


dream-racked love-squinting

ground where the onion and musk is lost

I drink you across the gardens ford

I worked as it played, so there are several moments in the screengrab where my interface interrupts for a second. Then I showered. Then I lay on the couch, twisting the screen to face me, in my housecoat under a quilt, watching the poems scroll by.

though the body opened with silence

the skeletons of trees filled with poison

Each of the poems is an ephemeral vision, a house seen from the window of a train, partially glimpsed then gone, blurred, a flock of birds, a boy under the autumn mantle star with its deep shadow threshing the luckless dead.

to be hurt and will not

i push a step and begin to come alone, back from it, after winter

i did not wear the beat of my fingers

i knew where the peace loves me at last

I do not know what to call this model, but I do know it speaks:

The soul is Woven view

The body of a life with words



 

4 hours of Pytorch + 2 hours and 29m of Wavenet for Poetry Generation [SILENT 04-03-2017]

PyTorch word-language-model poetry is more stable and sane than Wavenet. PyTorch is regal, educated, less prone to misspellings or massive neologisms. Wavenet is edgy, erratic, clumped; its visual dilation more contrite.

Yet reading each of these films is like witnessing a collage of avalanched literary modes and moods drift by, icons, tropes, techniques, incandescent, eerie, somnolent and deranged.

~

Warning: Vocabularies archive ideological debris. Monotheist, racist, and misogynist terms clot like toxic ore amid iridescent love proclamations, stoic iron, clouds that marry the ocean.

~

17,000 lines output into a single text file

PyTorch Poetry Generation [Pre-WordHack : Epoch 16 Video]

Another day of testing before going to NYC to perform neural-net poems at WordHack (Thursday 2/16/2017 @ Babycastles, 7-10pm) with Sarah Rothberg, John Cayley and Theadora Walsh.

HOPE 

 In the cold weather going out of the snow, 
 She down the lawn. 
 
 The air moves and grows, while she walks smooth, 
 When a swan is born, 
 And it's almost happening 
 
 Who knows what to say 
 The change has brought 
 Throwing the first blood in its face.

It’s clear:

Never will this mode of randomized pattern-reasoning replicate the nuanced human heart. More robust ensemble methods that simulate embodied experience, temporal reflexes, and nested community idioms will be required.

Deep learning is still shallow. The cloud does not understand honey, home or heart. Yet in the short-term, this is the future of writing: a computational assistant for an engaged imagination intent on exploring the topological feature-space of potential phrases.

Done:

Modulated the parameters: raised both the embedding size and the hidden-unit count to 512. And did a bit more data mining and parsing to increase the corpus size by a third, to 20MB of .txt.

Mode: LSTM
Embedding size: 512
Hidden units: 512
Batch size: 20
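In PyTorch terms the resized network is small to state. A sketch matching the stats above (two-layer word-level LSTM, embedding and hidden sizes both 512); the class name and dropout value are mine, not the repo’s:

    import torch.nn as nn

    class PoetryLM(nn.Module):
        def __init__(self, ntokens, emsize=512, nhid=512, nlayers=2, dropout=0.2):
            super().__init__()
            self.encoder = nn.Embedding(ntokens, emsize)            # word -> vector
            self.rnn = nn.LSTM(emsize, nhid, nlayers, dropout=dropout)
            self.decoder = nn.Linear(nhid, ntokens)                 # vector -> vocabulary

        def forward(self, x, hidden):
            out, hidden = self.rnn(self.encoder(x), hidden)
            return self.decoder(out), hidden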

Expanded Corpus to over 600,000 lines

639,813 lines of poetry from 5 websites.

Poetry Foundation
Jacket2
CAPA - Contemporary American Poetry Archive
Evergreen Review
Shampoo Poetry


40 Minutes of PyTorch Poetry Generation [Real-time SILENT]

Promising results that reflect the limits of a machine without empathy: skilled as a mimic of pattern, lacking long-term memory, emulating cadence and inflections, yet indifferent to context, experience and continuity.

Code: github.com/jhave/pytorch-poetry-generation

60 minutes of poetry output below the break:

A LAND IN SEASON 

 so much a child is up, 
 so much what he cannot feel 
 has found no knowledg more 
 of age, or of much friends 
 
 which, nothing thinks himself. spok'n 
 not knowing what is being 
 
 doing? or else wanting as 
 that 

