Wvnt Tamed

So the abrasive, brash gutter-voice of the neural net seemed maybe due to lack of long-term exposure, wider horizons, deeper reading into the glands of embodied organisms. So I set the hyperparameters higher, waited 26 hours, and watched the Ubuntu hard drive fill up with models to the point where the OS crashed on reboot, and I found myself entering a mysterious command-line universe called GRUB… thus to say: apprenticing a digital poet is not without perils.


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 15:19:36_Wavenet_2017-05-30T10-36-56


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 11:49:04_Wavenet_2017-05-30T10-36-56

THAT ETERNAL HEAVY COLOR OF SOFT

The title of this post is the title of the first poem generated by a re-installed Wavenet for Poetry Generation (still using TensorFlow 0.1, but now on Python 3.5), trained on an expanded corpus of approximately 600,000 lines of poetry. The first model was Model: 7365  |  Loss: 0.641  |  Trained on: 2017-05-27T14-19-11 (full txt here).

Wavenet is a rowdy poet, jolting neologisms, careening rhythms, petulant insolence, even the occasional glaring politically-incorrect genetic smut. Tangents codified into contingent unstable incoherence. 

Compared to PyTorch, which aspires to a refined smooth reservoir of cadenced candy, Wavenet is a drunken street brawler: rude, slurring blursh meru crosm nanotagonisms across rumpled insovite starpets.

PyTorch is Wallace Stevens; Wavenet is Bukowski (if he’d been born a mathematician-neologist).

Here’s a random poem:

____________________________________________
Model: 118740  |  Loss: 0.641  |  2017-05-28T00-35-33

Eyes calm, nor or something cases.

from a wall coat hardware as it times, a fuckermarket
in my meat by the heart, earth signs: a pupil, breaths &

stops children, pretended. But they were.

Case study: Folder 2017-05-28T12-16-50 contains 171 models (each saved because its loss was under the 0.7 threshold). But what does loss really mean? In principle it measures the gap between the model’s predictions and the validation text (how close is it?); yet however many different schemas proliferate, loss (like pain) cannot be measured by instrumentality.
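A minimal sketch (not the Wavenet code itself; the names here are hypothetical) of what that number measures: the average of -log(probability the model assigned to the character that actually came next) over a stretch of validation text.

import numpy as np

def cross_entropy(predicted_probs, target_ids):
    # predicted_probs: (N, vocab) rows of next-character probabilities
    # target_ids: (N,) the characters that actually occurred next
    picked = predicted_probs[np.arange(len(target_ids)), target_ids]
    return float(-np.log(picked).mean())

# a checkpoint survives only when this dips under the threshold:
# if cross_entropy(probs, targets) < 0.7: save_model()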

Here’s another random poem:

____________________________________________
Model: 93286  |  Loss: 0.355  |   2017-05-28T12-16-50

would destroying through the horizon. The poor
Sequel creation rose sky.

So we do not how you bastard, grew,
there is no populously, despite bet.
Trees me that he went back
on tune parts.

I will set
a girl of sunsets in the glass,

and no one even on the floral came

Training

I’m slowly learning the hard way to wrap each install in its own virtual environment. Without that, as upgrades happen, code splinters and breaks, leaking a fine luminous goo of errors.

The current virtual environment was created using

$ sudo apt-get install python3-pip python3-dev python-virtualenv
$ virtualenv -p python3.5 ~/tf0p1-py3.5-wvnt

After that, I followed the install instructions:

$ TF_BINARY_URL=https://storage.googleapis.com/tensorflow/mac/gpu/tensorflow-0.10.0-py3-none-any.whl
$ pip3 install --upgrade $TF_BINARY_URL

 

then got snarled into a long terrible struggle with error messages cluttering the output, resolved by inserting the following near the top of generate_Poems_2017-wsuppressed.py (it has to run before TensorFlow is imported):

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # suppress TensorFlow INFO and WARNING messages

And to train on Ubuntu, using the lovely NVIDIA Titan X GPU so generously donated by NVIDIA under their academic grant program:

$ cd Documents/Github/Wavenet-for-Poem-Generation/
$ source ~/tf0p1-py3.5-wvnt/bin/activate
(tf0p1-py3.5-wvnt)$ python train_2017_py3p5.py --data_dir=data/2017 --wavenet_params=wavenet_params_ORIG_dilations1024_skipChannels4096_qc1024_dc32.json
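That long params filename compresses the architecture into its name. Unpacked, it presumably encodes something like the following (a hedged guess in Python-dict form, mirroring the shape of ibab-style wavenet_params JSON; the authoritative keys and values live in the JSON file itself):

wavenet_params = {
    "dilations": [2 ** i for i in range(11)],  # 1, 2, 4, ... 1024
    "skip_channels": 4096,                     # "skipChannels4096"
    "quantization_channels": 1024,             # "qc1024"
    "dilation_channels": 32,                   # "dc32"
}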

Text Files

tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-28 11:31:40_TrainedOn_2017-05-28T00-35-33
tf0p1-py3.5-wvnt_jhave-Ubuntu_2017-05-28 09:18:00_TRAINED_2017-05-28T00-35-33
tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast-2017-05-27 23:50:14_basedon_2017-05-27T14-19-11
tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-28 22:36:35_TrainedOn_2017-05-28T12-16-50.txt

RERITES Archives

RERITES are poems
written by neural-nets
then human-edited.


http://glia.ca/2017/rerites/

RERITES SITE IS NOW HERE

 


For the complete output (often updated daily) visit the RERITES archives.

Here are a few samples:


Pray For A Moment.

In her soft
the hill is strewn

Across the chaff swift door,
That’s her.

The steady level calling.
The ash, similarly still.

 

~ + ~

 

Corpses Let Them Stoop

To carry a neighbor’s time

And once again the world,
seeing whoever it will be,
is not
moulded

and ceases to cry
straying in
flames

 

~ + ~

Few retrieve the sad rocks, that have defined their time

The mind
Blows Back

The lapping waves
a mossy hatred’s stubbing on,
the summer’s sky

distant hair
gone thin and soft
in every living stillness.

 

~ + ~

 

III

Out of the hail
Where the population of illusions
Sing strong in the sky,

Where dawning ships
pray at shaken rocks
in the stupor dive of market-place rigging

And homeward-drawing star-showers,
Are in a body.

~ + ~

Nameless

It’s strange, but I sometimes want to give the mathematical models (created by the neural nets) names. I think of them as having personalities, like Bob or Eliza or Abnor Malo or Isa Phren, and I want to know them by name, because names convey spirit and character. Names encompass (or tolerate) the eerie uncanny simulacrum personality evoked by lines that seem real.

these things are

long reflecting gray

like pleasure to love the river hard

people are sketched through the streets

and it is all so green

in the impartial spiral

a cloud of art

a light of lovers

to speak of the education of salt

And if these lines can be written by a machine (that has read many lines written by humans), I wonder if existence is not just an extended copy machine. Maybe personality is also programmed, programmable; and the sweet radiant wonderful gift of human creativity is just a reflection of evolution, a glint in the universe’s code.

or the skull whose form is of the secret truth

and in that tender place gets still


Tonight

I decided to try another model from the most recent PyTorch for Poetry Generation. Model: “2017-02-15T11-07-50/model-LSTM-emsize-512-nhid_512-nlayers_2-batch_size_20-epoch_15-loss_6.50-ppl_664.33.pt”
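Generation from such a checkpoint works roughly like this, in the style of the word_language_model example’s generate script (2017-era PyTorch; the exact names below are assumptions, not the repo’s actual code):

import torch
from torch.autograd import Variable  # 2017-era PyTorch API

def sample(checkpoint_path, idx2word, n_words=100, temperature=1.0):
    model = torch.load(checkpoint_path)  # the whole pickled model
    model.eval()
    hidden = model.init_hidden(1)        # batch of one
    inp = Variable(torch.LongTensor([[0]]), volatile=True)  # seed token
    words = []
    for _ in range(n_words):
        output, hidden = model(inp, hidden)
        weights = output.squeeze().data.div(temperature).exp()
        idx = torch.multinomial(weights, 1)[0]
        inp.data.fill_(idx)
        words.append(idx2word[idx])
    return ' '.join(words)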


dream-racked love-squinting

ground where the onion and musk is lost

I drink you across the gardens ford

I worked while it played, so there are several moments in the screengrab where my interface interrupts for a second. Then I showered. Then I lay on the couch, twisting the screen to face me, in my housecoat under a quilt, watching the poems scroll by.

though the body opened with silence

the skeletons of trees filled with poison

Each of the poems is an ephemeral vision, a house seen from the window of a train, partially glimpsed then gone, blurred, a flock of birds, a boy under the autumn mantle star with its deep shadow threshing the luckless dead.

to be hurt and will not

i push a step and begin to come alone, back from it, after winter

i did not wear the beat of my fingers

i knew where the peace loves me at last

I do not know what to call this model, but I do know it speaks:

The soul is Woven view

The body of a life with words


View the 2 hour run at


Read it all here.

 

4 hours of PyTorch + 2 hours and 29 minutes of Wavenet for Poetry Generation [SILENT 04-03-2017]

PyTorch word-language-model poetry is more stable and sane than Wavenet’s. PyTorch is regal, educated, less prone to misspellings or massive neologisms. Wavenet is edgy, erratic, clumped; its visual dilation more contrite.

Yet reading each of these films is like witnessing a collage of avalanched literary modes and moods drift by, icons, tropes, techniques, incandescent, eerie, somnolent and deranged.

~

Warning: Vocabularies archive ideological debris. Monotheist, racist, and misogynist terms clot like toxic ore amid iridescent love proclamations, stoic iron, clouds that marry the ocean.

~

17,000 lines of output in a single text file

40 Minutes of PyTorch Poetry Generation [Real-time SILENT]

Promising results that reflect the limits of a machine without empathy: skilled as a mimic of pattern, lacking long-term memory, emulating cadence and inflections, yet indifferent to context, experience and continuity.

Code: github.com/jhave/pytorch-poetry-generation

60 minutes of poetry output below the break:

A LAND IN SEASON 

 so much a child is up, 
 so much what he cannot feel 
 has found no knowledg more 
 of age, or of much friends 
 
 which, nothing thinks himself. spok'n 
 not knowing what is being 
 
 doing? or else wanting as 
 that 



PyTorch LSTM Day 2: Killed (after only 40 epochs)

My dream of an immaculate mesmerizing machine to replace all human imagination and absorb it into an engaging perpetual torrent of linguistic cleverness dissipated.

Yesterday, I let the GPU run overnight, expecting to return to 120 epochs and a stunning result.

Instead, on waking the computer in the morning:

-----------------------------------------
| end of epoch  40 | time: 452.85s 
| valid loss  5.84 | valid ppl   344.72
----------------------------------------
SAVING: models/2017-02-06T17-39-04/model-LSTM-epoch_40-loss_5.84-ppl_344.72.pt
Killed

 

The simulacrum had miscarried. The entire thread had been killed (automatically? by what clause?). Considering the results in glum melancholy, I realized it had been killed because 5 epochs had passed without improvement.
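Whatever the clause’s exact form, its shape is something like this minimal sketch (hypothetical names, not the example’s actual code): train until the validation loss stops improving, then give up.

def train_with_patience(run_epoch, validate, save, max_epochs=120, patience=5):
    # run_epoch, validate and save are callables supplied by the trainer
    best, stale = float('inf'), 0
    for epoch in range(1, max_epochs + 1):
        run_epoch()
        val_loss = validate()
        if val_loss < best:
            best, stale = val_loss, 0
            save(epoch, val_loss)  # e.g. models/.../model-LSTM-epoch_40-...pt
        else:
            stale += 1
            if stale >= patience:  # five epochs without improvement
                return epoch       # the run dies here
    return max_epochs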

Yet, after dusting off the 40 models that did exist, many intriguing gems emerged; spliced together, they suggest a latent lucidity:

without regret,
 played with a smooth
 raid of soiled petals, the color
 of rage and blood away--
 pinched your nose
the unwavering wind brushed the crystal edge from the stack,
 it came in the mirror adam's--
 eleven miles from the unholy relic
 and i set off
 into the absence of old themes,
 ... looking for the wreck of the rare summers
dark silks and soft blonde feather

on pink sky that hid a blue sun
 where it became dwelling pointing dead
 its lip rattled its green pride, thread-bare

 

Code on GitHub: https://github.com/jhave/pytorch-poetry-generation

Read the entire UNEDITED batch of 40 generated poems of 111 words after the break:


Testing PyTorch on Poems (Preliminary Results)

PyTorch is early-release beta software (developed by a consortium led by Facebook and NVIDIA), “deep learning software that puts Python first.”

So: since I luckily received an NVIDIA GTX Titan X (Maxwell) before leaving Hong Kong under the generous NVIDIA academic GPU Grant program, and having last week finally bought a custom build to house it, and 2 days ago having finally gotten Ubuntu installed with CUDA and CUDNN drivers, and having found that Tensorflow 0.11 no longer runs under Python 3.6 Anaconda, I decided to give a PyTorch example a try, specifically Word-level language modeling RNN.

This example trains a multi-layer RNN (Elman, GRU, or LSTM) on a language modeling task…The trained model can then be used by the generate script to generate new text.
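For reference, the example’s two entry points are invoked something like this (flag names as of early 2017; they may differ by version):

$ python main.py --cuda --data ./data/poems --epochs 40
$ python generate.py --data ./data/poems --checkpoint ./model.pt --words 1000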

And after only an hour of training on an 11k-poem corpus, using the default settings, the results announced “End of training | test loss  5.99 | test ppl   398.41”. Perplexity is just the exponential of the loss (e^5.99 ≈ 398), which means the loss is bad and the perplexity sits at the seemingly terrible level of 398….

Then I ran the generate script, and the 1000-word text below got generated in less than 30 seconds. I find it stunning. If this is what PyTorch is capable of with a tiny corpus, default settings and a minimal run, language generation is entering a renaissance. Ok, so it’s veering toward the incomprehensible and has little lived evocative phenomenological resonance, but its grasp on idiomatic cadence is creepily accurate. It’s as if it absorbed several semesters of graduate seminars on romantic and post-modern verse:

the embankment
and your face sad like a nest, grew sorry
when your cold work made of snow
broken
and left a thousand magnifies.

a little cold, you plant but hold it
and seems
the slight arts? face, and ends
with such prayer as the fingers do,
this reorganizing contest is how
to be murdered
throwing it
into the arteries obscurity goes disc whispering whole
affairs, now your instinct
does a case,
defense. on her eye, you do not know that every homelands
is didn’t at the
risk very psychiatrists, just under bay.

by the living of life’s melancholy grate.
i have found a
wild orange in eden, eight hazy years guzzles
her neck at the grave turn into every mythological orbit of
distances,
person’s there–see then are we told what we understand
won’t take the slightest danger
or the
size of what it means to take up if you can,
tongue. only your eye exultant whitens again will
happen.
i think that the four-oared clouded of one stick in flowerpot
is part of an antique little
register on a hiatus
till i try for you.
i wash up the door my knee will be
high.
if i refuse a limits as i can lift my hand rubicon.

i can see her
above the stove tide
hip. orange as a breaking sty.


3 Survivors: 1397 Models, 16,548 txt files, 8+ hrs of video (& no poems yet): Wavenet for Poem Generation: Secondary Results (After training for 6+ weeks continuously)

From 26-11-2016 to 11-12-2016, Wavenet-for-Poem-Generation (code on github) trained on an 11k-poem corpus simultaneously in 7 different tabs of a terminal window (on an 8-core G5, each tab occupied a core of the CPU); each tab used different parameter settings.

In the end only 3 settings exceeded 100k training epochs before succumbing to the exploding gradient dilemma (detailed here).
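(For context: the usual remedy, whether or not these runs used it, is to rescale the gradients whenever their joint norm blows up. A minimal numpy sketch:)

import numpy as np

def clip_by_global_norm(grads, max_norm=5.0):
    # rescale a list of gradient arrays so their joint norm stays bounded
    total = float(np.sqrt(sum(float((g ** 2).sum()) for g in grads)))
    if total > max_norm:
        grads = [g * (max_norm / total) for g in grads]
    return grads, total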

The 3 surviving threads were known as 26-03, 38-59, and 39-18; each folder name references its time of birth, the time it began receiving models from its thread, the neural network learning as it groped its way through the corpus. These threads alone (of myriad attempts) lived longest and saved out hundreds of models with loss under 0.7.


SILENT VIDEOS of REALTIME POEM GENERATION

Warning: these videos are long! Total viewing time: 8+ hours.

Each is a silent realtime screen-capture of neural net models generating poems.

Poems from the same model are generated side-by-side to allow for comparative viewing. Note how young models create poems that rampage logic, merge less. Mature models from 50k-110k begin to emulate deflections and balance, concealing and revealing. And ancient models (after they suffer an exploding gradient data hemorrhage) create poems full of fragments and silences, aphasia and lack, demented seeking.

Suggested viewing: put on an extra monitor and let run. Consult occasionally as if the computer were a clever oracle with a debilitating lack of narrative cohesion.


SAMPLE OUTPUT

16,548 text file poems on github


PARAMETER SETTINGS

Common to each survivor were the following parameters:

  • Dilations = 1024
  • SkipChannels = 4096
  • Quantization Channels = 1024

Dilation channels differed for each survivor: 8, 16, 32.

Training process: complete terminal output of the training runs.


FOLDER DETAILS

A subset of the models used in demo readings can be found online at github.

39-18 (2016-10-26T18-39-18)

Dilation Channels : 8

Born: 26 October 2016 at 03:29
Died: Sunday, 11 December 2016 at 11:28
Models: 458
Epochs: 145070
Size: 80.37GB

 

38-59 (2016-10-27T10-38-59)

Dilation Channels : 16

Born: 26 October 2016 at 03:29
Died: Sunday, 11 December 2016 at 8:03
Models: 475
Epochs: 150000
Size: 130.68GB

 

26-03 (2016-10-26T15-26-03)

Dilation Channels : 32

Born: 26 October 2016 at 03:29
Died: Sunday, 11 December 2016 at 11:28
Models: 464
Epochs: 145070
Size: 98.1GB