BRERIN: A PhilosoBot (at random temperatures for 2 hours)

BRERIN : A Philosobot: Trained on the collected book-length works of Erin Manning and Brian Massumi: Producing single sentences for 2 hours and 2 minutes at random temperatures: Temperature is a hyperparameter of neural nets that influences randomness: Think of it as complexity fluctuation.

~ + ~

BRERIN is a homage to a sustained diligent fertile intellectual oeuvre.

Erin Manning and Brian Massumi are thinkers; they operate within diverse terrains (radical empiricism, speculative pragmatism, process philosophy); they utilize language to explore cultural thought as a process; they co-direct the SenseLab. I am grateful to them for their generosity in inviting me to explore their work with machine learning, and for donating their writings to this process. As they write:

“The SenseLab does not exist as such. It is not an organization. It is not an institution. It is not a collective identity. It is an event-generating machine, a processual field of research-creation whose mission is to turn itself inside out. Its job is to generate outside prolongations of its activity that ripple into distant pools of potential.” — Thought in the Act

~ + ~

BRERIN generates text that reflects the vocabulary and cadence of its origin. It operates as a container for modes of idiomatic discourse. Yet it is also an artefact of contemporary deep learning, utterly lacking in subtle contextuality or genuine cognition.

+~+

TECH DETAILS

Library: PyTorch

Mode: GRU
Embedding size: 2500
Hidden size: 2500 (2 layers)
Batch size: 20

Epoch: 69
Loss: 0.71
Perplexity: 2.03

Temperature range: 0.25 to 1.25
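
A note on the Loss and Perplexity figures above: perplexity is simply the exponential of the cross-entropy loss, so e^0.71 ≈ 2.03. The same relation holds for every model listed below (e.g. e^1.59 ≈ 4.90).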

~ + ~

TEXT: BRERIN_2h02m_07092017_longSentence

~ + ~

~ + ~

CODE: https://github.com/jhave/pytorch-poetry-generation/tree/master/word_language_model

Incremental TEMPERATURE Increases (Seed-text for Sept RERITES)

In the egg, the child is simple. A wet light. Lurching.

Body is wind. The sun in the sea.

Then as if more, motions, the shadows of trees.

The ineluctable diffusion of randomized complexity.

Drainfgdl gsod. Trainins spekcled!

 

Poetry evolves as language in organisms, from simple to complex, from simile and homily to histrionics. Increments in the temperature of a neural net model simulate time.

For high temperatures (τ → ∞), all actions have nearly the same probability, and the lower the temperature, the more expected rewards affect the probability. For a low temperature (τ → 0⁺), the probability of the action with the highest expected reward tends to 1.

https://en.wikipedia.org/wiki/Softmax_function
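
In code, temperature is just a divisor applied to the model's logits before the softmax. A minimal generic PyTorch sketch (illustrative only, not BRERIN's actual generation script):

import torch

def sample_word(logits, temperature=1.0):
    # tau -> 0+  : distribution collapses onto the argmax (predictable)
    # tau -> inf : distribution flattens toward uniform (random)
    probs = torch.softmax(logits / temperature, dim=-1)
    return torch.multinomial(probs, num_samples=1).item()

# Sweeping BRERIN's range, 0.25 to 1.25, over dummy logits:
logits = torch.randn(10000)
for t in (0.25, 0.75, 1.25):
    print(t, sample_word(logits, t))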

~ + ~

~ + ~

Text: Screencast 2017-08-30 00:00:40_incrementalTEMPERATURE_PoetryPytorch

~ + ~

This source will become the first seed-text for September’s RERITES.

BRERIN (SenseLab Philosobot – Ep69)

BRERIN

A Philosobot:
Trained on the collected book-length works
of Erin Manning and Brian Massumi

Neural nets learn how to write by reading.
Each reading of the corpus is called an epoch.

This neural net read all the collected book-length works
of Erin Manning and Brian Massumi
69 times (in approx 8 hours
using a TitanX GPU).
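
An epoch, concretely, is one full pass of the optimizer over the corpus. A toy sketch with shrunken dimensions (BRERIN's actual settings were embedding 2500, hidden 2500, 2 layers, 69 epochs; all names here are illustrative, not the actual training script):

import torch
import torch.nn as nn

vocab, emsize, nhid, nlayers = 100, 32, 32, 2
bptt, bsz = 10, 4

class WordGRU(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab, emsize)
        self.gru = nn.GRU(emsize, nhid, nlayers)
        self.decode = nn.Linear(nhid, vocab)
    def forward(self, x, h):
        out, h = self.gru(self.embed(x), h)
        return self.decode(out), h

model = WordGRU()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1.0)
data = torch.randint(vocab, (1001,))              # stand-in corpus of word ids

for epoch in range(3):                            # BRERIN read its corpus 69 times
    h = torch.zeros(nlayers, bsz, nhid)
    for i in range(0, len(data) - bptt - 1, bptt):
        x = data[i : i + bptt].unsqueeze(1).expand(-1, bsz)
        y = data[i + 1 : i + 1 + bptt].unsqueeze(1).expand(-1, bsz)
        h = h.detach()                            # truncated backprop through time
        out, h = model(x, h)
        loss = criterion(out.reshape(-1, vocab), y.reshape(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch}: loss {loss.item():.2f}, ppl {loss.exp().item():.2f}')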

+ ~ +

Now it writes 70-word segments (that end in a sentence),
matching as best it can the vocabulary and cadence of the corpus.
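
A hypothetical sketch of that stopping rule, reusing the toy WordGRU and dimensions from the sketch above (idx2word, the id-to-word list, is assumed; this is not the actual generation script):

import torch

def write_segment(model, idx2word, temperature=0.95, min_words=70,
                  nlayers=2, nhid=32):            # toy dims from the sketch above
    # Sample word by word; end the segment at the first sentence-final
    # token once at least 70 words have been emitted.
    words, h = [], torch.zeros(nlayers, 1, nhid)
    x = torch.randint(len(idx2word), (1, 1))      # random seed word
    with torch.no_grad():
        while True:
            logits, h = model(x, h)
            probs = torch.softmax(logits.squeeze() / temperature, dim=-1)
            x = torch.multinomial(probs, 1).reshape(1, 1)
            words.append(idx2word[x.item()])
            if len(words) >= min_words and words[-1][-1] in '.!?':
                return ' '.join(words)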

It cannot match the thought, but reflects a simulacrum of thought:
the thought inherent within language, within reading, within writing.

+~+

Library: PyTorch

+~+

Mode: GRU
Embedding size: 2500
Hidden size: 2500 (2 layers)

Batch size: 20
Epoch: 69

Loss: 0.71
Perplexity: 2.03

~ + ~

Text: 25-08-2017_Epoch69_Temperature0p95_1h39m

~ + ~

BRERIN (Epoch 39)

Epoch 39 is a roughly fermented gated recurrent unit (GRU) network that exemplifies the rough parabolic deflection contours of SenseLab discourse.

jhav:~ jhave$ cd /Users/jhave/Desktop/github/pytorch-poetry-generation/word_language_model

jhav:word_language_model jhave$ source activate ~/py36 
(/Users/jhave/py36) jhav:word_language_model jhave$ python generate_2017-SL-BE_LaptopOPTIMIZED.py --checkpoint=/Users/jhave/Desktop/github/pytorch-poetry-generation/word_language_model/models/2017-08-22T12-35-49/model-GRU-emsize-2500-nhid_2500-nlayers_2-batch_size_20-epoch_39-loss_1.59-ppl_4.90.pt

The system will generate 88-word bursts, perpetually, until stopped.

BRERIN 

A Philosobot:
Trained on the collected book-length works of Erin Manning and Brian Massumi

+~+

Library: PyTorch

+~+

Mode: GRU
Embedding size: 2500
Hidden size: 2500 (2 layers)
Batch size: 20
Epoch: 39
Loss: 1.59
Perplexity: 4.90

Initializing.
Please be patient.

Text: Screencast_SL_BE_Epoch39_24-08-2017_16h12_1h04m_model-GRU-emsize-2500-nhid_2500-nlayers_2-batch_size_20-epoch_39-loss_1.59-ppl_4.90


For the tech-minded, let it be noted: this is an overfit model. While overfitting is taboo in science, it is a creator of blossoms in natural language generation. The texture of actual units of source text is sutured into a collagen of authenticity.

Specifically: I used all the text sources as training data, and basically did not care about the relevance or size of the test or validation sets. The embedding size is made as large as the GPU will tolerate. Dropout is high, so it gets confused.

Basically, for a deep learning expert, the loss and perplexity values are invalid, to put it crudely: bullshit. Yet the texture of the language generated is superior.
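
Concretely, the "invalid" numbers come from a setup like the following. The underlying word_language_model example's data loader expects train/valid/test splits; pointing all three at the full corpus means "validation" just re-measures the training fit. A hedged sketch of that deliberate non-split (hypothetical paths, not my actual commands):

import os
import shutil

# Hypothetical layout: the example's data loader reads data/train.txt,
# data/valid.txt and data/test.txt. Copying the whole corpus into all
# three makes loss and perplexity stop being honest generalization metrics.
corpus = 'corpus/manning_massumi_collected.txt'   # hypothetical path
os.makedirs('data', exist_ok=True)
for split in ('train.txt', 'valid.txt', 'test.txt'):
    shutil.copy(corpus, f'data/{split}')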

Consider the analogy of training a child to read and write: does the wise teacher keep back part of the corpus of knowledge, or does the teacher give all to the student?

BRERIN may have many moments of spasmodic incoherence, yet at the level of idiomatic cadence and vocabulary the texts recreate the dexterity and delirium intensities of the source fields, in essence reflecting the vast variational presence of both Erin and Brian. This bot is a homage to their massive resilient oeuvre.

SenseLab (BRERIN – beta testing)

BRERIN 

A Philosobot:
Trained on the collected book-length works 
of Erin Manning and Brian Massumi

+~+

Library: PyTorch

+~+

Mode: GRU
Embedding size: 2500
Hidden size: 2500 (2 layers)
Batch size: 20
Epoch: 69
Loss: 0.71
Perplexity: 2.03


Formatted run inside Cathode.

Text file: BRERIN-2h22m-Aug23-2017_TERMINAL


Preliminary run (unformatted)

Txt file output of a 1h44m run of the 952 MB PyTorch GRU model: here


PyTorch: 1860 hidden units, 31 epochs

PyTorch Poetry Language Model.
Trained on approx 600,000 lines of poetry
http://bdp.glia.ca

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_30-loss_6.00-ppl_405.43.pt'

Mode: LSTM
Embedding size: 1860
Hidden size: 1860 (2 layers)
Batch size: 20
Epoch: 30
Loss: 6.00
Perplexity: 405.43

+~+

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39.pt'

Mode: LSTM
Embedding size: 1860
Hidden size: 1860 (2 layers)
Batch size: 20
Epoch: 31
Loss: 6.00
Perplexity: 405.39

+~+

Ridges— Ourselves?

4
K-Town: ideality;
The train lost, “Aye man!
O old beggar, O perfect friend;
The bath-tub before the Bo’s’n resentments; pissing
rimed metaphors in the white pincers
scratching and whiten each of a clarity
in the sky the sacred hoof of eastward,
arc of the pestle through sobered the cliffs to the own world.

+~+

TXT Version:

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

+~+

VIDEO Version:

RERITES : May 2017 (Amazon/Blurb/PDF)

RERITES

  • May 2017
  • By Neural Net Software, Jhave

All poems in this book were written by a computer, then edited by a human.
All poems were written between April 25th and May 25th, 2017.
The algorithms used were based on the PyTorch word language model neural network.


Download PDF

Or buy on Amazon or Blurb



O what the heck. Why not one more last deranged excessive epic deep learning poetry binge courtesy of pytorch-for-poetry-generation

Personally I like the coherence of PyTorch, its capacity to hold the disembodied recalcitrant veil of absurdity over a somewhat stoic normative syntactical model.

Text:

jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27

Code:

https://github.com/jhave/pytorch-poetry-generation

Excerpt:

Jumble, Rub Up The Him-Whose-Penis-Stretches-Down-To-His-Knees. 

 
 The slow-wheeling white Thing withheld in the light of the Whitman? 
 The roof kisses the wounds of blues and yellow species. 
 Far and cold in the soft cornfields bending to the gravel, 
 Or showing diapers in disclosure, Atlantic, Raymond 
 Protract the serried ofercomon, — the throats "I've used to make been sustene, 
 Fanny, the inner man clutched to the keep; 
 Who meant me to sing one step at graves. 

Ok, that’s it: I’ve had enough. This is the last Wvnt Epic Video. Until a juicy code variant arises.

TEXT:

(tf0p1-py3.5-wvnt) jhave@jhave-Ubuntu_Screencast 2017-06-07 10:58:11_2017-06-07_08-12-11

Excerpt:

weather, 
Gold apply thy unndmight hour. 
    And neither would endure 
Meet excel understood) 
                             

I once had declared clay. 
    Be lines once my God 
      