
RERITES : May 2017 (Amazon/Blurb/PDF)

RERITES

  • May 2017
  • By Neural Net Software, Jhave

All poems in this book were written by a computer, then edited by a human.
All poems were written between April 25th and May 25th, 2017.
The algorithms used were based on the PyTorch word-language-model neural network.


Download PDF

Or buy on Amazon or Blurb



O what the heck. Why not one more last deranged excessive epic deep learning poetry binge courtesy of pytorch-for-poetry-generation

Personally, I like the coherence of Pytorch: its capacity to hold the disembodied recalcitrant veil of absurdity over a somewhat stoic normative syntactical model.

Text:

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

Code:

https://github.com/jhave/pytorch-poetry-generation
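The filename above encodes the architecture: a 2-layer word-level LSTM with 1500-dimensional embeddings and hidden states, trained for 7 epochs to a loss of 6.02 (perplexity = e^6.02 ≈ 412). A minimal sketch of that kind of model, following the structure of the PyTorch word_language_model example this repo builds on (illustrative, not the repo's exact code):

import torch.nn as nn

class RNNModel(nn.Module):
    """Word-level LSTM language model: embed, recur, decode to vocabulary logits."""
    def __init__(self, ntoken, emsize=1500, nhid=1500, nlayers=2, dropout=0.5):
        super().__init__()
        self.drop = nn.Dropout(dropout)
        self.encoder = nn.Embedding(ntoken, emsize)   # word id -> vector
        self.rnn = nn.LSTM(emsize, nhid, nlayers, dropout=dropout)
        self.decoder = nn.Linear(nhid, ntoken)        # hidden state -> next-word logits

    def forward(self, input, hidden):
        emb = self.drop(self.encoder(input))
        output, hidden = self.rnn(emb, hidden)
        return self.decoder(self.drop(output)), hidden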

Excerpt:

Jumble, Rub Up The Him-Whose-Penis-Stretches-Down-To-His-Knees. 

 
 The slow-wheeling white Thing withheld in the light of the Whitman? 
 The roof kisses the wounds of blues and yellow species. 
 Far and cold in the soft cornfields bending to the gravel, 
 Or showing diapers in disclosure, Atlantic, Raymond 
 Protract the serried ofercomon, — the throats "I've used to make been sustene, 
 Fanny, the inner man clutched to the keep; 
 Who meant me to sing one step at graves. 

Ok that’s it: i’ve had enough. This is the last Wvnt Epic Video. Until a juicy code variant arises.

TEXT:

(tf0p1-py3.5-wvnt) jhave@jhave-Ubuntu_Screencast 2017-06-07 10:58:11_2017-06-07_08-12-11

Excerpt:

weather, 
Gold apply thy unndmight hour. 
    And neither would endure 
Meet excel understood) 
                             

I once had declared clay. 
    Be lines once my God 
      

A few SLOW excerpts

from here

accepting tall any flowers, forever with one question
of boots, neural, dead forgotten the glass
of cloud, start and more, who studied legends
and wanted to ascend
Every inch alone you and this desire
tulips of sounds
watching the witness
On the intensity lolling it.

Summer up warmishment
The girls crack our hearts
she set and quickens, swarms at the edge.
where they could wake? It begins
how much design, anthounces are taught her.
Illusion, mimicry a chalk.
when you’re lonely, black large in the calico,
where the bachelo forking genes
might in dusty confidently,
ignore and suck with the main grove,
the dream in the darkness, found hear
if these wobbles, silver for the man.

Deaf-soot he’s an edging of ships
a border that is the court.
The soul,
aroused, breakfast

who wants shapelesse lava
So long as nothing moved oversized no mountains of an eternity

I fed him into oned
There is to say something a fog and mask at writing
minild, the moon and cair of his screens

It is there learning, loving down, screeching.

mystery, painted Spring.
wings
as mid-afternoon, fetlocks

uncurledding cheaping full of pale
eternal, grabs us. Flowers try

migrating every idea
and whispered at the
morning machine

looking at her skin, burst

from her will.

WVNT regamed (blind, still bland, but intriguing)

The opposite of scarcity is not abundance: saturation inevitably damages, or perhaps it just corrodes or blunts the impassioned Pavlov puppy, delivering a dent of tiny deliberate delirious wonder.

Yet technical momentum now propels and impels continuance: how can the incoherence be tamed? Set some new hyperparameters and let the wvnt train for 26 hours.

Over this weekend, I’ve churned out about 100,000 lines. Generating reckless amounts of incoherent poetry threatens more than perspective or contemplation, it totters sanity on the whim of a machine. Teeming bacteria, every epiphany a whiff of redundancy.

$ python train_2017_py3p5_150k_low6_sample4096_SAVE-STARTS_100k.py \
    --wavenet_params=wavenet_params_ORIG_dilations2048_skipChannels8192_qc2048_dc32.json \
    --data_dir=data/2017

Using default logdir: ./logdir/train/2017-06-01T08-38-45 

_______________________________________________________________________________________________

dilations: 2048 filter_width: 2 residual_channels: 32
dilation_channels: 32 skip_channels: 8192 quantization_channels: 2048
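The params file named in the command above bundles exactly these numbers. I don't have the file itself to hand, so this is only a guess at its shape, following the tensorflow-wavenet JSON convention (the dilation list is illustrative; the channel counts are the ones logged above):

{
    "filter_width": 2,
    "dilations": [1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048],
    "residual_channels": 32,
    "dilation_channels": 32,
    "skip_channels": 8192,
    "quantization_channels": 2048,
    "use_biases": true
}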

(tf0p1-py3.5-wvnt) jhave@jhave-UbuntuScreencast 2017-06-02 11:08:14_2017-06-01T08-38-45

and faithful wound 
To fruit white, the dread 
One by one another, Image saved-- 
Ay of the visit. What pursued my heart to brink. 


Such the curse of hopes fraught memory;

tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-06-02 11:08:14_2017-06-01T08-38-45

Caught a new light bulb,   
All the heart is grown.

TXTs generated in SLOW MODE

There’s a way of calculating the matrices that taxes the strength of even a magnificent GPU, making production crawl and the computer difficult to use. Each of the following txt files (4444 letters in each) took about 40-60 minutes to generate on an Nvidia Maxwell Titan X using CUDA 8 on Ubuntu 16.04.

Txts generated slow seem somehow thicker, as if issued from a more calibrated mentation, yet at the same time it’s math scat, glitch flow. Glisses from disintegrating encyclopedias.

Here are some samples:

I found myself within us wonder.
   You purchase as ease with water over events,
   because the straightforward that I miximally, she
   Don't sports commentation with its ruffled story

tf0p1-py3.5-wvnt_jhave-Ubuntu_SLOW_4444charsx4_2017-06-02-16T08_2017-06-01T08-38-45_model.ckpt-117396

Wvnt Tamed

So the abrasive brash gutter voice of the neural net seemed maybe due to lack of long-term exposure, wider horizons, deeper reading into the glands of embodied organisms. So I set the hyperparameters higher, waited 26 hours, and watched the Ubuntu HD fill up with models to the point where the OS crashed on reboot and I found myself entering a mysterious command-line universe called GRUB… which is to say, apprenticing a digital poet is not without perils.


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 15:19:36_Wavenet_2017-05-30T10-36-56


Text: tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-31 11:49:04_Wavenet_2017-05-30T10-36-56

THAT ETERNAL HEAVY COLOR OF SOFT

The title of this post is the title of the first poem generated by a re-installed Wavenet for Poetry Generation (still using TensorFlow 0.1, but now on Python 3.5), working on an expanded corpus of approximately 600,000 lines of poetry. The first model was Model: 7365  |  Loss: 0.641  |  Trained on: 2017-05-27T14-19-11 (full txt here).

Wavenet is a rowdy poet, jolting neologisms, careening rhythms, petulant insolence, even the occasional glaring politically-incorrect genetic smut. Tangents codified into contingent unstable incoherence. 

Compared to Pytorch, which aspires to a refined smooth reservoir of cadenced candy, Wavenet is a drunken street brawler: rude, slurring blursh meru crosm nanotagonisms across rumpled insovite starpets.

Pytorch is Wallace Stevens; Wavenet is Bukowski (if he’d been born a mathematician neologist).

Here’s a random poem:

____________________________________________
Model: 118740  |  Loss: 0.641  |  2017-05-28T00-35-33

Eyes calm, nor or something cases.

from a wall coat hardware as it times, a fuckermarket
in my meat by the heart, earth signs: a pupil, breaths &

stops children, pretended. But they were.

Case study: Folder 2017-05-28T12-16-50 contains 171 models (each saved because its loss was under the 0.7 threshold). But what does loss really mean? In principle it is a measurement of the gap between the generated text and the validation text (how close is it?); yet however many different schemas proliferate, loss (like pain) cannot be measured by instrumentality.
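Concretely, the number is a cross-entropy: the average negative log-probability the model assigns to the token that actually comes next in the validation text. A toy sketch of the computation (illustrative only, not the training code):

import math

def cross_entropy(predicted_dists, targets):
    """Average negative log-probability assigned to each true next token.

    predicted_dists[t] maps token ids to the model's predicted probability
    at step t; targets[t] is the token that actually came next."""
    total = 0.0
    for dist, target in zip(predicted_dists, targets):
        total -= math.log(dist[target])  # low probability on the truth -> high loss
    return total / len(targets)

By that yardstick, a model at loss 0.355 simply bets more confidently on the validation text than one at 0.641; whether its lines read any better is, as the samples here suggest, another question entirely.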

Here’s another random poem:

____________________________________________
Model: 93286  |  Loss: 0.355  |   2017-05-28T12-16-50

would destroying through the horizon. The poor
Sequel creation rose sky.

So we do not how you bastard, grew,
there is no populously, despite bet.
Trees me that he went back
on tune parts.

I will set
a girl of sunsets in the glass,

and no one even on the floral came

Training

I’m slowly learning the hard way to wrap each install in a virtual environment. Without that, as upgrades happen, code splinters and breaks, leaking a fine luminous goo of errors.

The current virtual environment was created using

$ sudo apt-get install python3-pip python3-dev python-virtualenv
$ virtualenv -p python3.5 ~/tf0p1-py3.5-wvnt

After that, I followed the install instructions:

$ TF_BINARY_URL=https://storage.googleapis.com/tensorflow/mac/gpu/tensorflow-0.10.0-py3-none-any.whl
$ pip3 install --upgrade $TF_BINARY_URL

 

Then I got snarled into a long terrible struggle with error messages messing up the output, resolved by inserting the following near the top of generate_Poems_2017-wsuppressed.py (it must run before TensorFlow is imported, or it has no effect):

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # suppress TensorFlow's INFO and WARNING log spam

And to generate on Ubuntu, using the lovely Nvidia Titan X GPU so generously donated by Nvidia under their academic grant program:

$ cd Documents/Github/Wavenet-for-Poem-Generation/
$ source ~/tf0p1-py3.5-wvnt/bin/activate
(tf0p1-py3.5-wvnt)$ python train_2017_py3p5.py --data_dir=data/2017 --wavenet_params=wavenet_params_ORIG_dilations1024_skipChannels4096_qc1024_dc32.json

Text Files

  • tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-28 11:31:40_TrainedOn_2017-05-28T00-35-33
  • tf0p1-py3.5-wvnt_jhave-Ubuntu_2017-05-28 09:18:00_TRAINED_2017-05-28T00-35-33
  • tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast-2017-05-27 23:50:14_basedon_2017-05-27T14-19-11
  • tf0p1-py3.5-wvnt_jhave-Ubuntu_Screencast 2017-05-28 22:36:35_TrainedOn_2017-05-28T12-16-50.txt

RERITES Archives

RERITES are poems
written by neural-nets
then human-edited.

For the complete output (often updated daily) visit the RERITES archives.

Here are a few samples:


Pray For A Moment.

In her soft
the hill is strewn

Across the chaff swift door,
That’s her.

The steady level calling.
The ash, similarly still.

 

~ + ~

 

Corpses Let Them Stoop

To carry a neighbor’s time

And once again the world,
seeing whoever it will be,
is not
moulded

and ceases to cry
straying in
flames

 

~ + ~

Few retrieve the sad rocks, that have defined their time

The mind
Blows Back

The lapping waves
a mossy hatred’s stubbing on,
the summer’s sky

distant hair
gone thin and soft
in every living stillness.

 

~ + ~

 

III

Out of the hail
Where the population of illusions
Sing strong in the sky,

Where dawning ships
pray at shaken rocks
in the stupor dive of market-place rigging

And homeward-drawing star-showers,
Are in a body.

~ + ~

Nameless

It’s strange but i sometimes want to give the mathematical models (created by the neural nets) names. I think of them as having personalities like Bob or Eliza or Abnor Malo or Isa Phren, and i want to know them by name, because names convey spirit and character. Names encompass (or tolerate) the eerie uncanny simulacrum personality evoked by lines that seem real.

these things are

long reflecting gray

like pleasure to love the river hard

people are sketched through the streets

and it is all so green

in the impartial spiral

a cloud of art

a light of lovers

to speak of the education of salt

And if these lines can be written by a machine (that has read many lines written by humans) I wonder if existence is not just an extended copy machine. Maybe personality is also programmed, programmable; and the sweet radiant wonderful gift of human creativity is just a reflection of evolution, a glint in the universe’s code.

or the skull whose form is of the secret truth

and in that tender place gets still


Tonight

I decided to try another model from the most recent PyTorch for Poetry Generation. Model: “2017-02-15T11-07-50/model-LSTM-emsize-512-nhid_512-nlayers_2-batch_size_20-epoch_15-loss_6.50-ppl_664.33.pt”
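Generation from such a checkpoint follows the pattern of the word_language_model example's generate script: load the .pt file, then repeatedly sample a word from the temperature-scaled output distribution and feed it back in. A rough sketch in the PyTorch 0.x idiom of the time (temperature and seed are placeholders):

import torch
from torch.autograd import Variable  # pre-0.4 PyTorch idiom

model = torch.load('2017-02-15T11-07-50/model-LSTM-emsize-512-nhid_512-nlayers_2-batch_size_20-epoch_15-loss_6.50-ppl_664.33.pt')
model.eval()

hidden = model.init_hidden(1)              # batch of one poem
input = Variable(torch.LongTensor([[0]]))  # arbitrary seed word id

for _ in range(100):
    output, hidden = model(input, hidden)
    word_weights = output.squeeze().data.div(0.8).exp()  # temperature 0.8
    word_idx = torch.multinomial(word_weights, 1)[0]     # sample the next word
    input.data.fill_(word_idx)
    # corpus.dictionary.idx2word[word_idx] recovers the word itself (lookup not shown)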


dream-racked love-squinting

ground where the onion and musk is lost

I drink you across the gardens ford

I worked as it played, so there are several moments in the screengrab where my interface interrupts for a second. Then I showered. Then I lay on the couch, twisting the screen to face me, in my housecoat under a quilt, watching the poems scroll by.

though the body opened with silence

the skeletons of trees filled with poison

Each of the poems is an ephemeral vision, a house seen from the window of a train, partially glimpsed then gone, blurred, a flock of birds, a boy under the autumn mantle star with its deep shadow threshing the luckless dead.

to be hurt and will not

i push a step and begin to come alone, back from it, after winter

i did not wear the beat of my fingers

i knew where the peace loves me at last

I do not know what to call this model but i do know it speaks:

The soul is Woven view

The body of a life with words


View the 2 hour run at


Read it all here.

 

4 hours of Pytorch + 2 hours and 29m of Wavenet for Poetry Generation [SILENT 04-03-2017]

PyTorch word-language-model poetry is more stable and sane than Wavenet. PyTorch is regal, educated, less prone to misspellings or massive neologisms. Wavenet is edgy, erratic, clumped; its visual dilation more contrite.

Yet reading each of these films is like witnessing a collage of avalanched literary modes and moods drift by, icons, tropes, techniques, incandescent, eerie, somnolent and deranged.

~

Warning: Vocabularies archive ideological debris. Monotheist, racist, and misogynist terms clot like toxic ore amid iridescent love proclamations, stoic iron, clouds that marry the ocean.

~

17,000 lines of output in a single text file