“and farther heart, stuff’d the Arts supernova” | 173,901 lines made in 1h35m | Screencast 2018-03-08 21:10:55

Txt excerpt: I can even hear your violent debates. Of prayer, weep, Whatever you say, я recognise me. paulist. That’s right no. laughing? Complete txt output: generated-2018-03-08T19-35-32_Screencast 2018-03-08 21:10:55 Corpus retuned to reflect tech and other torque 100selected-archiveorg.txt Jacket2_ALL.txt 99 Terms You Need To Know When You’re New To Tech.txt jhavelikes_CLEANED-again.txt ABZ.txt jorie_graham.txt Aeon_autonomousrobots_science-and-metaphysics_enemies_biodiversity.txt literotica.txt […]

17,969 pseudo-poems of 12 lines of approx. 24 characters each generated at a rate of 6.1s per poem in 2h54m

So as if there was any need: here is more.

Here’s the code:

jhave@jhave-Ubuntu:~$ cd '/home/jhave/Documents/Github/awd-lstm-lm-master'
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ source activate awd-py36
jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python generate_March-2018_PARAGRAPHS.py --cuda --words=333 --checkpoint="models/SUBEST4+JACKET2+LYRICS_QRNN-PBT_Dec11_FineTune+Pointer.pt" --model=QRNN --data='data/dec_rerites_SUBEST4+JACKET2+LYRICS' --mint=0.95 --maxt=1.25

Here’s a video:

Here’s an excerpt: Things, media, le openness, and tenacity. Consigned to vivian for winked. Our mother On a […]
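Judging by their names, the --mint and --maxt flags bound a temperature range, with each burst decoded at a single value drawn from [0.95, 1.25]. A minimal sketch of that pattern; generate_burst and its internals are illustrative stand-ins, not the script’s actual code:

import random
import torch

def generate_burst(model, corpus, words=333, mint=0.95, maxt=1.25, device="cuda"):
    """Decode one burst at a single temperature drawn from [mint, maxt]."""
    temperature = random.uniform(mint, maxt)     # fixed for the whole burst
    hidden = model.init_hidden(1)
    inp = torch.randint(len(corpus.dictionary), (1, 1), device=device)
    tokens = []
    with torch.no_grad():
        for _ in range(words):
            output, hidden = model(inp, hidden)
            # dividing logits by temperature: <1 sharpens the distribution,
            # >1 flattens it toward uniform randomness
            probs = torch.softmax(output.squeeze() / temperature, dim=-1)
            idx = torch.multinomial(probs, 1)
            inp.fill_(idx.item())
            tokens.append(corpus.dictionary.idx2word[idx.item()])
    return " ".join(tokens), temperature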

Paragraph style: 16 lines of approx 42 chars each : Averaged Stochastic Gradient Descent : Screencast 2018-02-04 18:52:12

5038 poems of 16 lines of approx 42 chars each generated in 2h42m. I am now utterly convinced of the impossibility of neural nets ever producing coherent contextually-sensitive verse; yet as language play, and as a window into the inherent biases within humans, it is impeccably intriguing and occasionally nourishing. Text: Screencast 2018-02-04 18:52:12_PARAGRAPHS.txt.tar (awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ […]
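The “Averaged Stochastic Gradient Descent” in the title is the optimizer family the awd-lstm-lm codebase trains with: plain SGD that switches to an averaged variant (NT-ASGD) once validation loss stops improving. A simplified sketch of that switch using PyTorch’s built-in torch.optim.ASGD; the stand-in model, learning rate, and nonmono window are illustrative:

import torch

model = torch.nn.LSTM(input_size=400, hidden_size=1150, num_layers=3)  # stand-in net
optimizer = torch.optim.SGD(model.parameters(), lr=30.0)
val_losses = []

def maybe_switch_to_asgd(val_loss, nonmono=5):
    """Non-monotonic trigger: switch to ASGD when validation loss has not
    improved on the best result seen more than `nonmono` checks ago."""
    global optimizer
    if len(val_losses) > nonmono and val_loss > min(val_losses[:-nonmono]):
        # t0=0 starts weight averaging immediately after the switch
        optimizer = torch.optim.ASGD(model.parameters(), lr=30.0, t0=0, lambd=0.0)
    val_losses.append(val_loss)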

2017-12-11 11:52:45 [SUBEST4+JACKET2+LYRICS]

Using a mildly revised (cleaner, leaner) corpus AND … dropout=0.65 (to prevent overfitting):

jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M_October.py --checkpoint=models/2017-12-11T06-42-23_dec_rerites_SUBEST4+JACKET2+LYRICS/model-LSTM-emsize-2400-nhid_2400-nlayers_2-batch_size_20-epoch_21-loss_3.58-ppl_36.04.pt --cuda --words=600 --data=data/dec_rerites_SUBEST4+JACKET2+LYRICS

Generated: Screencast 2017-12-11 11:52:45_SUBEST4+JACKET2+LYRICS
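As a sanity check, the two numbers baked into that checkpoint filename agree up to rounding, since word-level perplexity is just the exponential of the average per-word cross-entropy loss:

import math

loss = 3.58                  # per-word cross-entropy (nats) from the filename
print(math.exp(loss))        # ≈ 35.87; the filename's ppl_36.04 implies the
                             # unrounded loss was ≈ ln(36.04) ≈ 3.585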

SUBSETS: Small Data (Repeat Mimesis provoked by lack of Dropout)

So after last week’s disastrous expansion, I decided to get austere: implement strategic pruning, and reduce the corpus. Selections were integrated from Bob Dylan, Tom Waits, Patti Smith, 2 River, Capa, Evergreen Review, Tumblr, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites. Because the training occurred without dropout, the model is overfit […]
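The overfitting mechanism here is concrete: with no dropout, nothing stops the net from memorizing small-corpus lines verbatim, hence the repeat mimesis. A minimal sketch of where the dropout=0.65 from the previous entry would act, in a word_language_model-style PyTorch net whose sizes match the checkpoint name above; the class and its names are illustrative:

import torch.nn as nn

class PoetryLSTM(nn.Module):
    def __init__(self, vocab_size, emsize=2400, nhid=2400, nlayers=2, dropout=0.65):
        super().__init__()
        self.drop = nn.Dropout(dropout)          # randomly zeroes activations
        self.encoder = nn.Embedding(vocab_size, emsize)
        self.rnn = nn.LSTM(emsize, nhid, nlayers, dropout=dropout)
        self.decoder = nn.Linear(nhid, vocab_size)

    def forward(self, x, hidden):
        emb = self.drop(self.encoder(x))         # dropout on embeddings
        out, hidden = self.rnn(emb, hidden)
        out = self.drop(out)                     # dropout before decoding
        return self.decoder(out), hidden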

Errors on the Path

What follows documents futility. Effort expended for nothing. It began when I (greedy for a more diverse vocabulary) expanded the corpus somewhat erratically. Added my tumblr blog jhavelikes.tumblr.com (for contemporary tech words), then Krasznahorkai’s War and War, Maurice Blanchot’s The Space of Literature, some Kenneth Patchen, and a set of contemporary poetry syllabus packages for […]

BRERIN: A PhilosoBot (at random temperatures for 2 hours)

BRERIN: A Philosobot: Trained on the collected book-length works of Erin Manning and Brian Massumi: Producing single sentences for 2 hours and 2 minutes at random temperatures: Temperature is a hyperparameter of neural nets that influences randomness: Think of it as complexity fluctuation. ~ + ~ BRERIN is a homage to a sustained diligent […]

Incremental TEMPERATURE Increases (Seed-text for Sept RERITES)

In the egg, the child is simple. A wet light. Lurching. Body is wind. The sun in the sea. Then as if more, motions, the shadows of trees. The ineluctable diffusion of randomized complexity. Drainfgdl gsod. Trainins spekcled!   Poetry evolves as language in organisms, from simple to complex, from simile and homile to histrionics. […]

BRERIN (Sense Lab Philosobot – Ep69)

BRERIN A Philosobot: Trained on the collected book-length works of Erin Manning and Brian Massumi. Neural nets learn how to write by reading. Each reading of the corpus is called an epoch. This neural net read all the collected book-length works of Erin Manning and Brian Massumi 69 times (in approx 8 hours using a TitanX GPU). […]
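“Each reading of the corpus is called an epoch” means one full pass over the training tokens. Schematically, the 69 readings look like the loop below, assuming helpers along the lines of get_batch and repackage_hidden from the PyTorch word_language_model example; signatures and details are illustrative:

def train_epochs(model, train_data, criterion, optimizer,
                 batch_size=20, bptt=35, epochs=69):
    """One epoch = one full pass over the corpus; BRERIN made 69 such passes."""
    for epoch in range(1, epochs + 1):
        hidden = model.init_hidden(batch_size)
        for i in range(0, train_data.size(0) - 1, bptt):
            data, targets = get_batch(train_data, i, bptt)  # next text slice
            hidden = repackage_hidden(hidden)   # detach history between slices
            output, hidden = model(data, hidden)
            loss = criterion(output.view(-1, output.size(-1)), targets)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()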

BRERIN (Epoch 39)

Epoch 39 is a roughly fermented gated recurrent network (GRU) that exemplifies the rough parabolic deflection contours of Sense Lab discourse.

jhav:~ jhave$ cd /Users/jhave/Desktop/github/pytorch-poetry-generation/word_language_model
jhav:word_language_model jhave$ source activate ~/py36
(/Users/jhave/py36) jhav:word_language_model jhave$ python generate_2017-SL-BE_LaptopOPTIMIZED.py --checkpoint=/Users/jhave/Desktop/github/pytorch-poetry-generation/word_language_model/models/2017-08-22T12-35-49/model-GRU-emsize-2500-nhid_2500-nlayers_2-batch_size_20-epoch_39-loss_1.59-ppl_4.90.pt

System will generate 88 word bursts, perpetually, until stopped. BRERIN A Philosobot: Trained on the collected book-length works of […]
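“Perpetually, until stopped” amounts to an unbounded loop around the sampler. A sketch, reusing the illustrative generate_burst helper from the March 2018 entry above, cut down to 88-word bursts; the pause length is an assumption:

import time

try:
    while True:                                  # runs until Ctrl-C
        burst, temp = generate_burst(model, corpus, words=88)
        print(f"~ t={temp:.2f} ~\n{burst}\n")
        time.sleep(2)                            # illustrative pause between bursts
except KeyboardInterrupt:
    pass                                         # stopped by hand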