“and farther heart, stuff’d the Arts supernova” | 173,901 lines made in 1h35m | Screencast 2018-03-08 21:10:55


Txt excerpt:

 I can even hear your violent debates. Of prayer,
 weep, Whatever you say, я recognise me. paulist. That's
 right no.
 laughing?

Complete txt output:

generated-2018-03-08T19-35-32_Screencast 2018-03-08 21:10:55


Corpus retuned to reflect tech and other torque

100selected-archiveorg.txt
99 Terms You Need To Know When You’re New To Tech.txt
ABZ.txt
Aeon_autonomousrobots_science-and-metaphysics_enemies_biodiversity.txt
capa.txt
Cathay by LiBai pg50155.txt
celan.txt
citizen-an-american-lyric-claudia-rankine.txt
citylights.txt
deepmind.txt
evergreenreview_all_poem.html.txt
glossary-networking-terms.txt
glossary-neurological-terms.txt
glossary-posthuman-terms.txt
Grief.txt
harari_sapiens-summary_guardian-review.txt
I Feel It Is My Duty to Speak Out_SallyOReilly.txt
Jacket2_ALL.txt
jhavelikes_CLEANED-again.txt
jorie_graham.txt
literotica.txt
ndsu.edu-creativeWriting-323.txt
neurotransmitter.txt
nyt_china-technology-censorship-borders-expansion.txt
patchen.txt
Patti Smith - all songs lyrics.txt
poetryFoundation_CLEANED_lower.txt
rodneyJones.txt
swear.txt
Teen Love Meets the Internet.txt
tumblr_jhavelikes_de2107-feb2018.txt
tumblr_jhavelikes_EDITED.txt
twoRiver.txt

Process left to run for 36 hours on an Nvidia Titan X:

(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python -u main.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python -u finetune.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python pointer.py --lambdasm 0.1279 --theta 0.662 --window 3785 --bptt 2000 --data data/March-2018_16mb --model QRNN --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python generate_March-2018_nocntrl.py --cuda --words=444 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=0.75 --maxt=1.25
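The --mint/--maxt flags presumably set lower and upper bounds on the sampling temperature (an assumption from the flag names; generate_March-2018_nocntrl.py is a custom script, so this is a sketch, not its actual code):

import numpy as np

# Hypothetical: one temperature per generated word, climbing linearly
# from mint (cautious) to maxt (chaotic) across the 444-word run.
mint, maxt, n_words = 0.75, 1.25, 444
temperatures = np.linspace(mint, maxt, n_words)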

QRNN training (via awd-lstm-lm) was exited manually after 252 epochs:

-----------------------------------------------------------------------------------------
| end of epoch 252 | time: 515.44s | valid loss 4.79 | valid ppl 120.25
-----------------------------------------------------------------------------------------
Saving Averaged!
| epoch 253 | 200/ 1568 batches | lr 30.00 | ms/batch 307.76 | loss 4.29 | ppl 73.12
^C-----------------------------------------------------------------------------------------
Exiting from training early
=========================================================================================
| End of training | test loss 3.99 | test ppl 54.11
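
Perplexity is just the exponential of the cross-entropy loss, which the log above confirms:

import math
print(math.exp(4.79))  # ≈ 120.3, the valid ppl at epoch 252
print(math.exp(3.99))  # ≈ 54.1, the final test ppl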

Complete training terminal output: TRAINING__Mid March Corpus | Screencast 2018-03-08 21:10:55

 

2017-12-11 11:52:45 [SUBEST4+JACKET2+LYRICS]

Using a mildly revised (cleaner, leaner) corpus

AND … dropout=0.65 (to prevent overfitting)

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M_October.py --checkpoint=models/2017-12-11T06-42-23_dec_rerites_SUBEST4+JACKET2+LYRICS/model-LSTM-emsize-2400-nhid_2400-nlayers_2-batch_size_20-epoch_21-loss_3.58-ppl_36.04.pt --cuda --words=600 --data=data/dec_rerites_SUBEST4+JACKET2+LYRICS
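The dropout=0.65 noted above zeroes a random 65% of activations at each training step, which discourages the net from memorizing the corpus. A minimal PyTorch sketch of the mechanism (illustrative only, not the training script):

import torch
from torch import nn

drop = nn.Dropout(p=0.65)   # each forward pass zeroes ~65% of units
x = torch.randn(20, 2400)   # a hypothetical batch of hidden activations
y = drop(x)                 # survivors are rescaled by 1/(1-p) during training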

Generated: Screencast 2017-12-11 11:52:45_SUBEST4+JACKET2+LYRICS

Incremental TEMPERATURE Increases (Seed-text for Sept RERITES)

In the egg, the child is simple. A wet light. Lurching.

Body is wind. The sun in the sea.

Then as if more, motions, the shadows of trees.

The ineluctable diffusion of randomized complexity.

Drainfgdl gsod. Trainins spekcled!

 

Poetry evolves as language in organisms, from simple to complex, from simile and homily to histrionics. Increments in the temperature of a neural net model simulate time.

For high temperatures (τ → ∞), all actions have nearly the same probability; the lower the temperature, the more the expected rewards affect the probability. For a low temperature (τ → 0⁺), the probability of the action with the highest expected reward tends to 1.

https://en.wikipedia.org/wiki/Softmax_function
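
In code, temperature is simply a divisor applied to the logits before the softmax; a minimal sketch (numpy, not the generation script itself):

import numpy as np

def sample_with_temperature(logits, tau):
    """Sample a token index from logits flattened/sharpened by temperature tau."""
    scaled = np.asarray(logits, dtype=np.float64) / tau
    scaled -= scaled.max()                        # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return np.random.choice(len(probs), p=probs)

# tau >> 1 flattens the distribution toward uniform babble;
# tau -> 0+ collapses it onto the single most probable word.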

~ + ~

~ + ~

Text: Screencast 2017-08-30 00:00:40_incrementalTEMPERATURE_PoetryPytorch

~ + ~

This source will become the first seed-text for September’s RERITES.

PyTorch, 1860 hidden units, 31 epochs (July 2017 RERITES Source)

 

The second video here became the source-text for July 2017 RERITES: http://glia.ca/rerites/

+~+

PyTorch Poetry Language Model.
Trained on approx 600,000 lines of poetry
http://bdp.glia.ca

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_30-loss_6.00-ppl_405.43.pt'

Mode: LSTM
Embedding size: 1860
Hidden Layers: 1860
Batch size: 20
Epoch: 30
Loss: 6.00
Perplexity: 405.43

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39.pt'

Mode: LSTM
Embedding size: 1860
Hidden Layers: 1860
Batch size: 20
Epoch: 31
Loss: 6.00
Perplexity: 405.39
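
Both runs load a saved checkpoint and then sample word by word. An abridged sketch of the sampling loop in the stock pytorch/examples word_language_model generate.py, which the INFINITE scripts presumably extend (ntokens and temperature are placeholders here):

import torch

ntokens, temperature = 10000, 1.0        # assumed vocab size and τ
with open("model.pt", "rb") as f:        # checkpoint path abridged
    model = torch.load(f)
model.eval()                             # disable dropout while sampling

hidden = model.init_hidden(1)
inp = torch.randint(ntokens, (1, 1))
for _ in range(600):                     # --words=600
    output, hidden = model(inp, hidden)
    weights = output.squeeze().div(temperature).exp()
    inp = torch.multinomial(weights, 1).view(1, 1)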

+~+

Ridges— Ourselves?

4
K-Town: ideality;
The train lost, “Aye man!
O old beggar, O perfect friend;
The bath-tub before the Bo’s’n resentments; pissing
rimed metaphors in the white pincers
scratching and whiten each of a clarity
in the sky the sacred hoof of eastward,
arc of the pestle through sobered the cliffs to the own world.

+~+

TXT Version:

jhave@jhave-Ubuntu_Screencast 2017-06-26 09_45_14_2017-06-17T09-22-17_model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

+~+

WVNT regamed (blind, still bland, but intriguing)

The opposite of scarcity is not abundance: saturation inevitably damages, or perhaps it just corrodes or blunts the impassioned Pavlov puppy, delivering a dent of tiny deliberate delirious wonder.

Yet technical momentum now propels and impels continuance: how can the incoherence be tamed? Set some new hyperparameters and let the wvnt train for 26 hours.

Over this weekend, I’ve churned out about 100,000 lines. Generating reckless amounts of incoherent poetry threatens more than perspective or contemplation; it totters sanity on the whim of a machine. Teeming bacteria, every epiphany a whiff of redundancy.

$ python train_2017_py3p5_150k_low6_sample4096_SAVE-STARTS_100k.py \
  --wavenet_params=wavenet_params_ORIG_dilations2048_skipChannels8192_qc2048_dc32.json \
  --data_dir=data/2017

Using default logdir: ./logdir/train/2017-06-01T08-38-45 

_______________________________________________________________________________________________

dilations: 2048
filter_width: 2
residual_channels: 32
dilation_channels: 32
skip_channels: 8192
quantization_channels: 2048
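
These hyperparameters determine how far back the net can “hear”. For a stack of dilated causal convolutions the receptive field grows as roughly (filter_width − 1) · Σ dilations + 1; a quick check, assuming “dilations: 2048” means the per-layer dilation doubles from 1 up to 2048 (an assumption about the json):

filter_width = 2
dilations = [2 ** i for i in range(12)]   # 1, 2, 4, ..., 2048
receptive_field = (filter_width - 1) * sum(dilations) + 1
print(receptive_field)                    # 4096 characters of context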

(tf0p1-py3.5-wvnt) jhave@jhave-Ubuntu_Screencast 2017-06-02 11:08:14_2017-06-01T08-38-45

and faithful wound 
To fruit white, the dread 
One by one another, Image saved-- 
Ay of the visit. What pursued my heart to brink. 


Such the curse of hopes fraught memory;

tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-06-02 11:08:14_2017-06-01T08-38-45

Caught a new light bulb,   
All the heart is grown.

TXTs generated in SLOW MODE

There’s a way of calculating the matrices that taxes the strength of even a magnificent GPU, making production crawl and the computer difficult to use. Each of the following txt files (4444 letters in each) took about 40-60 minutes to generate on an Nvidia Maxwell TitanX using CUDA 8 on Ubuntu 16.04.

Txts generated slowly seem somehow thicker, as if issued from a more calibrated mentation, yet at the same time they’re math scat, glitch flow. Glisses from disintegrating encyclopedias.

Here are some samples:

I found myself within us wonder.
   You purchase as ease with water over events,
   because the straightforward that I miximally, she
   Don't sports commentation with its ruffled story

tf0p1-py3.5-wvnt_jhave-Ubuntu_SLOW_4444charsx4_2017-06-02-16T08_2017-06-01T08-38-45_model.ckpt-117396

Wvnt Tamed

So the abrasive brash gutter voice of the neural net seemed maybe due to lack of long-term exposure, wider horizons, deeper reading into the glands of embodied organisms. So I set the hyperparameters higher, waited 26 hours, and watched the Ubuntu HD fill up with models to the point where the OS crashed on reboot and I found myself entering a mysterious command-line universe called GRUB… which is to say, apprenticing a digital poet is not without perils.


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 15:19:36_Wavenet_2017-05-30T10-36-56


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 11:49:04_Wavenet_2017-05-30T10-36-56

Fermentation & Neural Nets

Mead recipe: dilute honey with water, stir twice a day, wait.  

Fermentation begins after 24-48 hours. After a week, the fermented honey-wine (mead) can be enjoyed green, low in alcohol yet lively with essence. Or you can let it continue.

Generative-poetry recipe: feed a text corpus to a neural net, wait.

After each reading of the corpus (a.k.a. a ‘training epoch’), a neural net can produce/save a model. Think of the model as an ecosystem produced by fermentation: an idea, a bubbling contingency, a knot of flavours, a succulent reservoir, a tangy substrate that represents the contours, topology or intricacies of the source corpus. Early models (after 4-7 epochs they are still young) may be obscure but are often buoyant with energy (like kids mimicking modern dance).
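
In code, the vats are just checkpoints, one saved per epoch; a hypothetical sketch (the sizes echo the model filenames listed further below, and the actual training step is elided):

import torch
from torch import nn

model = nn.LSTM(input_size=1500, hidden_size=1500, num_layers=2)
for epoch in range(1, 23):
    # ... one full reading of the corpus (the actual training) goes here ...
    torch.save(model.state_dict(), f"model-LSTM-epoch_{epoch}.pt")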

Here’s an example from a model that is only 2 epochs old:

Daft and bagel, of sycamore.
“Now thou so sex–sheer if I see Conquerours.

So he is beneath the lake, where they pass together,
Amid the twister that squeeze-box fire:

Here’s an example from a model that is 6 epochs old:

Yellow cornbread and yellow gems
and all spring to eat.

Owls of the sun: the oldest worm of light
robed in the advances of a spark.

Later models (after 20 epochs), held in vats (epoch after epoch), exhibit more refined microbial (i.e., algorithmic) calibrations:

Stale nose of empty lair

The bright lush walls are fresh and lively and weeping,
Whirling into the night as I stand

unicorn raging on the serenade
Of floors of sunset after evil-spirits,
adagio.

Eventually fermentation processes halt. In the case of mead, this may occur after a month; with neural nets, an annealing schedule (analogous to simulated annealing) intentionally decreases the learning rate at every iteration, so the system begins by exploring large features and then focuses on details. Eventually the learning rate diminishes to zero. Learning (fermentation) stops.
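
A minimal sketch of such an annealing schedule in PyTorch (illustrative; the actual scripts manage their learning rates their own way, and gamma=0.95 is an arbitrary decay chosen for the example):

import torch
from torch import nn, optim

model = nn.LSTM(input_size=400, hidden_size=1500, num_layers=2)
optimizer = optim.SGD(model.parameters(), lr=30.0)   # lr 30.00, as in the training log above
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)

for epoch in range(500):
    # ... train for one epoch ...
    scheduler.step()   # learning rate shrinks by 5% per epoch, toward zero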

TEXT FILES

2017-04-30 09:49:10_EPOCH-6
2017-04-30 13:27:15_EPOCH-16
2017-04-30 23:03:42_EPOCH-4
2017-05-03 09:41:35_2017-05-02T06-19-17_EPOCH22
2017-05-03 12:54:57_2017-05-02T06-19-17_EPOCH-6
2017-05-12 12:24:54_2017-05-02T06-19-17_EPOCH22

2017-05-13 21:24:06_2017-05-02T06-19-17_-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27
2017-05-14 23:51:14_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_8-loss_6.03-ppl_417.49
2017-05-15 11:00:03_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_3-loss_6.09-ppl_439.57
2017-05-15 11:28:24_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_4-loss_6.04-ppl_421.71
2017-05-15 11:53:40_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_5-loss_6.02-ppl_413.64
2017-05-15 21:08:56_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_9-loss_6.02-ppl_409.87
2017-05-15 21:48:55_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_10-loss_6.03-ppl_416.36
2017-05-15 22:21:11_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_11-loss_6.02-ppl_413.61
2017-05-15 22:48:53_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_12-loss_6.03-ppl_414.69
2017-05-15 23:24:25_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_13-loss_6.03-ppl_417.08
2017-05-15 23:53:21_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_14-loss_6.03-ppl_416.68
pytorch-poet_2017-05-02T06-19-17_models-6-7-8-9-10-11-12-13-14
2017-05-16 09:26:18_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_15-loss_6.03-ppl_416.60
2017-05-16 10:08:15_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_16-loss_6.03-ppl_414.91
2017-05-16 10:45:58_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_17-loss_6.03-ppl_415.35
2017-05-18 23:48:36_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_18-loss_6.03-ppl_414.10
2017-05-20 10:35:17_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_19-loss_6.03-ppl_413.68
2017-05-18 22:10:36_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_17-loss_6.03-ppl_415.35
2017-05-20 22:01:22_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_20-loss_6.02-ppl_413.08
2017-05-21 12:58:03_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_21-loss_6.02-ppl_413.07
2017-05-21 14:38:29_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_22-loss_6.02-ppl_412.68

RERITES Archives

RERITES are poems
written by neural-nets
then human-edited.


http://glia.ca/2017/rerites/

THE RERITES SITE IS NOW AT http://glia.ca/rerites/

 


For the complete output (often updated daily) visit the RERITES archives.

Here are a few samples:


Pray For A Moment.

In her soft
the hill is strewn

Across the chaff swift door,
That’s her.

The steady level calling.
The ash, similarly still.

 

~ + ~

 

Corpses Let Them Stoop

To carry a neighbor’s time

And once again the world,
seeing whoever it will be,
is not
moulded

and ceases to cry
straying in
flames

 

~ + ~

Few retrieve the sad rocks, that have defined their time

The mind
Blows Back

The lapping waves
a mossy hatred’s stubbing on,
the summer’s sky

distant hair
gone thin and soft
in every living stillness.

 

~ + ~

 

III

Out of the hail
Where the population of illusions
Sing strong in the sky,

Where dawning ships
pray at shaken rocks
in the stupor dive of market-place rigging

And homeward-drawing star-showers,
Are in a body.

~ + ~

Nameless

It’s strange but I sometimes want to give the mathematical models (created by the neural nets) names. I think of them as having personalities like Bob or Eliza or Abnor Malo or Isa Phren, and I want to know them by name, because names convey spirit and character. Names encompass (or tolerate) the eerie uncanny simulacrum personality evoked by lines that seem real.

these things are

long reflecting gray

like pleasure to love the river hard

people are sketched through the streets

and it is all so green

in the impartial spiral

a cloud of art

a light of lovers

to speak of the education of salt

And if these lines can be written by a machine (that has read many lines written by humans) I wonder if existence is not just an extended copy machine. Maybe personality is also programmed, programmable; and the sweet radiant wonderful gift of human creativity is just a reflection of evolution, a glint in the universe’s code.

or the skull whose form is of the secret truth

and in that tender place gets still


Tonight

I decided to try another model from the most recent PyTorch for Poetry Generation. Model: “2017-02-15T11-07-50/model-LSTM-emsize-512-nhid_512-nlayers_2-batch_size_20-epoch_15-loss_6.50-ppl_664.33.pt”


dream-racked love-squinting

ground where the onion and musk is lost

I drink you across the gardens ford

I worked as it played, so there are several moments in the screengrab where my interface interrupts for a second. Then I showered. Then I lay on the couch, twisting the screen to face me, in my housecoat under a quilt, watching the poems scroll by.

though the body opened with silence

the skeletons of trees filled with poison

Each of the poems is an ephemeral vision, a house seen from the window of a train, partially glimpsed then gone, blurred, a flock of birds, a boy under the autumn mantle star with its deep shadow threshing the luckless dead.

to be hurt and will not

i push a step and begin to come alone, back from it, after winter

i did not wear the beat of my fingers

i knew where the peace loves me at last

I do not know what to call this model but I do know it speaks:

The soul is Woven view

The body of a life with words


View the 2 hour run at


Read it all here.

 

4 hours of Pytorch + 2 hours and 29m of Wavenet for Poetry Generation [SILENT 04-03-2017]

PyTorch word-language-model poetry is more stable and sane than Wavenet. PyTorch is regal, educated, less prone to misspellings or massive neologisms. Wavenet is edgy, erratic, clumped; its visual dilation more contrite.

Yet reading each of these films is like witnessing a collage of avalanched literary modes and moods drift by, icons, tropes, techniques, incandescent, eerie, somnolent and deranged.

~

Warning: Vocabularies archive ideological debris. Monotheist, racist, and misogynist terms clot like toxic ore amid iridescent love proclamations, stoic iron, clouds that marry the ocean.

~

17,000 lines of output in a single text file