Author: jhave

  • Screencast 2017-12-15 14:21:42 [Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation with Random Seed]

    Got to messing around a bit more with awd-lstm-lm … and, by modifying the poetry generation loop so that it reloads the model with a new seed on each iteration, I used the reload time as an organic tiny delay. The result is like a poem flick: every 300 ms (on average, on this…
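
    A minimal sketch of that loop, assuming the whole model was serialized with torch.save as the 2017 scripts did; the checkpoint path is illustrative, and the actual sampling step is elided:

        import random
        import torch

        MODEL_PATH = 'poetry_model.pt'   # illustrative checkpoint path

        while True:
            torch.manual_seed(random.randint(1, 10**6))  # fresh seed each iteration
            with open(MODEL_PATH, 'rb') as f:
                model = torch.load(f)    # the ~300 ms reload doubles as the pause between poems
            model.eval()
            # ... sample one burst of words here, as in the repo's generate.py ...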

  • Averaged Stochastic Gradient Descent with Weight Dropped QRNN Poetry Generation [Screencast 2017-12-14 15:23:19]

    Trying out a new algorithm, https://github.com/salesforce/awd-lstm-lm, using another mildly revised training corpus. Sources: a subset of Poetry Magazine, Jacket2, Bob Marley, Bob Dylan, David Bowie, Tom Waits, Patti Smith, Radiohead, 2 River, Capa, Evergreen Review, jhavelikes.tumblr.com, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites. Same crappy results. But boy does it churn out verses…
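
    For reference, training and sampling with that repo look roughly like this (flag names from its 2017 README; the corpus path and hyperparameter values here are illustrative, not the ones used above):

        # train a QRNN language model on a local corpus
        python -u main.py --model QRNN --data data/poetry \
            --emsize 400 --nhid 1550 --nlayers 4 --epochs 550 --save poetry_qrnn.pt

        # sample words from the trained checkpoint
        python generate.py --data data/poetry --checkpoint poetry_qrnn.pt --words 600 --cuda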

  • 2017-12-11 11:52:45 [SUBEST4+JACKET2+LYRICS]

    Using a mildly revised (cleaner, leaner) corpus AND … dropout=0.65 (to prevent overfitting).

        jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M_October.py --checkpoint=models/2017-12-11T06-42-23_dec_rerites_SUBEST4+JACKET2+LYRICS/model-LSTM-emsize-2400-nhid_2400-nlayers_2-batch_size_20-epoch_21-loss_3.58-ppl_36.04.pt --cuda --words=600 --data=data/dec_rerites_SUBEST4+JACKET2+LYRICS

    Generated: Screencast 2017-12-11 11:52:45_SUBEST4+JACKET2+LYRICS
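
    For context, in the word_language_model code that dropout value lands in the model constructor; a trimmed sketch of the architecture (sizes taken from the checkpoint name above):

        import torch.nn as nn

        class RNNModel(nn.Module):
            # trimmed sketch of PyTorch's word_language_model network
            def __init__(self, ntoken, emsize=2400, nhid=2400, nlayers=2, dropout=0.65):
                super().__init__()
                self.drop = nn.Dropout(dropout)   # randomly zeroes 65% of activations in training
                self.encoder = nn.Embedding(ntoken, emsize)
                self.rnn = nn.LSTM(emsize, nhid, nlayers, dropout=dropout)
                self.decoder = nn.Linear(nhid, ntoken)

            def forward(self, inp, hidden):
                emb = self.drop(self.encoder(inp))
                output, hidden = self.rnn(emb, hidden)
                return self.decoder(self.drop(output)), hidden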

  • SUBSETS: Small Data (Repeat Mimemesis provoked by lack of Dropout)

    So after last week’s disastrous expansion, I decided to get austere: implement strategic pruning, and reduce the corpus. Selections were integrated from Bob Dylan, Tom Waits, Patti Smith, 2 River, Capa, Evergreen Review, Tumblr, Cathay by Li Bai, Kenneth Patchen, Maurice Blanchot, and previous Rerites. Because the training occurred without dropout, the model is overfit…
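
    Overfitting here means the net memorizes its small corpus and parrots it back; the usual tell is validation loss climbing while training loss keeps falling. A schematic guard against it (train_one_epoch and evaluate are stand-ins, not functions from the repo):

        import torch

        best_val = float('inf')
        for epoch in range(1, 21):
            train_loss = train_one_epoch(model, train_data)   # stand-in helpers
            val_loss = evaluate(model, val_data)
            if val_loss < best_val:                           # keep the least-overfit weights
                best_val = val_loss
                torch.save(model, 'best.pt')
            print(f'epoch {epoch}: train {train_loss:.2f} | val {val_loss:.2f}')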

  • Errors on the Path

    What follows documents futility. Effort expended for nothing. It began when I (greedy for a more diverse vocabulary) expanded the corpus somewhat erratically. Added my tumblr blog jhavelikes.tumblr.com (for contemporary tech words), then Krasznahorkai’s War and War, Maurice Blanchot’s The Space of Literature, some Kenneth Patchen, and a set of contemporary poetry syllabus packages for…

  • Rerites (6 months! 6 books!)

    Poetry books. One a month. Generated by a computer. Edited by a human. Project duration: 05.2017 – 05.2018. 6 books completed in 6 months. Rerites October 2017 just published today: preview online (entire text) or order a copy on Blurb. 162 pages, 8000 words. Excerpt: All That I Know For All That Is Left I know…

  • BRERIN: A PhilosoBot (at random temperatures for 2 hours)

    BRERIN: A Philosobot: Trained on the collected book-length works of Erin Manning and Brian Massumi: Producing single sentences for 2 hours and 2 minutes at random temperatures: Temperature is a hyperparameter of neural nets that influences randomness: Think of it as complexity fluctuation. ~ + ~ BRERIN is an homage to a sustained diligent…
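
    Concretely, temperature divides the logits before the softmax: low values sharpen the distribution toward safe words, high values flatten it toward strangeness. A minimal sketch, not BRERIN’s actual code:

        import torch
        import torch.nn.functional as F

        def sample_word(logits, temperature=1.0):
            # low T -> sharper distribution (predictable); high T -> flatter (strange)
            probs = F.softmax(logits / temperature, dim=-1)
            return torch.multinomial(probs, num_samples=1).item()

        # 'random temperatures': draw a fresh T for each sentence
        temperature = 0.5 + torch.rand(1).item()   # uniform in [0.5, 1.5], illustrative range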

  • Incremental TEMPERATURE Increases (Seed-text for Sept RERITES)

    In the egg, the child is simple. A wet light. Lurching. Body is wind. The sun in the sea. Then as if more, motions, the shadows of trees. The ineluctable diffusion of randomized complexity. Drainfgdl gsod. Trainins spekcled!   Poetry evolves as language in organisms, from simple to complex, from simile and homile to histrionics.…
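
    The drift in that seed-text, from plain statement toward garbled exuberance, is what an incremental temperature ramp does to generation; a sketch, reusing the sample_word idea above (generate_burst is a hypothetical helper):

        n_bursts = 40
        for step in range(n_bursts):
            temperature = 0.3 + step * (1.2 / n_bursts)   # ramp 0.3 -> 1.5, illustrative
            # one burst per step: simple early, histrionic late
            print(generate_burst(model, temperature))     # hypothetical helper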

  • BRERIN (Sense Lab Philosobot – Ep69)

    BRERIN A Philosobot: Trained on the collected book-length works of Erin Manning and Brian Massumi. Neural nets learn how to write by reading. Each reading of the corpus is called an epoch. This neural net read all the collected book-length works of Erin Manning and Brian Massumi 69 times (in approx 8 hours using a TitanX GPU).…
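
    In training-loop terms, an epoch is one full pass over the corpus, so 69 readings means the loop below ran 69 times (schematic; the batch iterator, model, optimizer, and criterion are stand-ins):

        for epoch in range(1, 70):                   # 69 readings of the corpus
            for inputs, targets in corpus_batches:   # stand-in batch iterator
                optimizer.zero_grad()
                logits = model(inputs)               # schematic forward pass
                loss = criterion(logits, targets)    # how wrong was each predicted word?
                loss.backward()
                optimizer.step()                     # nudge weights before the next batch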

  • BRERIN (Epoch 39)

    Epoch 39 is a roughly fermented gated recurrent network (GRU) that exemplifies the rough parabolic deflection contours of Sense Lab discourse.

        jhav:~ jhave$ cd /Users/jhave/Desktop/github/pytorch-poetry-generation/word_language_model
        jhav:word_language_model jhave$ source activate ~/py36
        (/Users/jhave/py36) jhav:word_language_model jhave$ python generate_2017-SL-BE_LaptopOPTIMIZED.py --checkpoint=/Users/jhave/Desktop/github/pytorch-poetry-generation/word_language_model/models/2017-08-22T12-35-49/model-GRU-emsize-2500-nhid_2500-nlayers_2-batch_size_20-epoch_39-loss_1.59-ppl_4.90.pt

    System will generate 88 word bursts, perpetually, until stopped. BRERIN A Philosobot: Trained on the collected book-length works of…
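
    Incidentally, the loss and ppl in that checkpoint filename are the same measurement in two forms: perplexity is the exponential of the average cross-entropy loss. A quick check:

        import math

        loss = 1.59                      # from the checkpoint filename
        print(f'{math.exp(loss):.2f}')   # 4.90 -> matches ppl_4.90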