Author: jhave

  • SenseLab (BRERIN – beta testing)

    BRERIN, a Philosobot: trained on the collected book-length works of Erin Manning and Brian Massumi.
    Library: PyTorch. Mode: GRU. Embedding size: 2500. Hidden layers: 2500. Batch size: 20. Epoch: 69. Loss: 0.71. Perplexity: 2.03.
    Formatted run inside Cathode. Text file: BRERIN-2h22m-Aug23-2017_TERMINAL
    Preliminary run (unformatted): txt file output of a 1h44m run of a 952 MB PyTorch GRU model:…
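
    Loss here is average per-word cross-entropy, so the listed perplexity is just exp(loss): exp(0.71) ≈ 2.03, matching the figures above. As a rough sketch only (not the actual BRERIN training code), a two-layer GRU word language model with these settings might look like this in PyTorch, following the structure of the stock word_language_model example; ntoken, the vocabulary size, is an assumption:

        import torch
        import torch.nn as nn

        class PhilosobotSketch(nn.Module):
            # Illustrative GRU word language model with the settings listed above:
            # embedding size 2500, hidden size 2500, 2 layers, batch size 20.
            def __init__(self, ntoken, emsize=2500, nhid=2500, nlayers=2, dropout=0.5):
                super().__init__()
                self.encoder = nn.Embedding(ntoken, emsize)
                self.rnn = nn.GRU(emsize, nhid, nlayers, dropout=dropout)
                self.decoder = nn.Linear(nhid, ntoken)
                self.nhid, self.nlayers = nhid, nlayers

            def forward(self, input, hidden):
                emb = self.encoder(input)               # (seq_len, batch, emsize)
                output, hidden = self.rnn(emb, hidden)  # (seq_len, batch, nhid)
                return self.decoder(output), hidden     # logits over the vocabulary

            def init_hidden(self, batch_size=20):
                weight = next(self.parameters())
                return weight.new_zeros(self.nlayers, batch_size, self.nhid)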

  • RERITES: June 2017

    Read or buy on Blurb or Amazon, or download the PDF.

  • PyTorch, 1800 hidden layers, 31 epochs (July 2017 RERITES source)

    The second video here became the source text for the July 2017 RERITES: http://glia.ca/rerites/
    PyTorch Poetry Language Model, trained on approx. 600,000 lines of poetry: http://bdp.glia.ca
    jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_30-loss_6.00-ppl_405.43.pt'
    Mode: LSTM. Embedding size: 1860. Hidden layers: 1860. Batch size: 20. Epoch: 30. Loss: 6.00. Perplexity: 405.43.
    jhave@jhave-Ubuntu:~/Documents/Github/pytorch-poetry-generation/word_language_model$ python generate_2017-INFINITE-1M.py --cuda --checkpoint='/home/jhave/Documents/Github/pytorch-poetry-generation/word_language_model/models/2017-06-17T09-22-17/model-LSTM-emsize-1860-nhid_1860-nlayers_2-batch_size_20-epoch_31-loss_6.00-ppl_405.39.pt'…
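
    Again exp(loss) gives the perplexity: exp(6.00) ≈ 403, consistent with the reported 405.43 once rounding of the loss is accounted for. The generate script invoked above loads a saved checkpoint and samples one word at a time; a minimal sketch of such a loop, modeled on PyTorch's word_language_model generate.py (the `corpus` word/index mapping and the 100-word length are assumptions):

        import torch

        device = torch.device('cuda')
        checkpoint = ('models/2017-06-17T09-22-17/'
                      'model-LSTM-emsize-1860-nhid_1860-nlayers_2-'
                      'batch_size_20-epoch_30-loss_6.00-ppl_405.43.pt')
        with open(checkpoint, 'rb') as f:
            model = torch.load(f).to(device)    # load the trained LSTM
        model.eval()

        ntokens = len(corpus.dictionary)        # corpus: assumed word<->index mapping
        hidden = model.init_hidden(1)
        inp = torch.randint(ntokens, (1, 1), dtype=torch.long, device=device)

        words = []
        with torch.no_grad():
            for _ in range(100):                # sample 100 words
                output, hidden = model(inp, hidden)
                word_weights = output.squeeze().exp()   # unnormalized probs, temperature 1.0
                word_idx = torch.multinomial(word_weights, 1)[0]
                inp.fill_(word_idx)
                words.append(corpus.dictionary.idx2word[word_idx])
        print(' '.join(words))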

  • RERITES: May 2017 (Amazon/Blurb/PDF)

    RERITES May 2017. By neural net software + Jhave. All poems in this book were written by a computer, then edited by a human. All poems were written between April 25th and May 25th, 2017. The algorithms used were based on the PyTorch word language model neural network. Download the PDF, or buy on Amazon or Blurb.

  • O wht the heck. Why not one more last deranged excessive epic deep learning poetry binge courtesy of pytorch-for-poetry-generation

    Personally I like the coherence of PyTorch, its capacity to hold the disembodied recalcitrant veil of absurdity over a somewhat stoic, normative syntactical model.
    Text: jhave@jhave-Ubuntu-pytorch-poet_Screencast 2017-06-07 20:16:49_model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27
    Code: https://github.com/jhave/pytorch-poetry-generation
    Excerpt: Jumble, Rub Up The Him-Whose-Penis-Stretches-Down-To-His-Knees. The slow-wheeling white Thing withheld in the light of the Whitman? The roof kisses the wounds of blues and…
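
    Those long checkpoint names encode each run's hyperparameters. A hypothetical helper (not part of the repo) that parses them back out:

        import re

        def parse_checkpoint_name(name):
            # Parse names like:
            # model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-epoch_7-loss_6.02-ppl_412.27
            pattern = (r'model-(?P<mode>\w+)-emsize-(?P<emsize>\d+)-nhid_(?P<nhid>\d+)'
                       r'-nlayers_(?P<nlayers>\d+)-batch_size_(?P<batch_size>\d+)'
                       r'-epoch_(?P<epoch>\d+)-loss_(?P<loss>[\d.]+)-ppl_(?P<ppl>[\d.]+)')
            m = re.search(pattern, name)
            if m is None:
                return None
            def coerce(v):
                # Cast numeric fields, leave strings (e.g. 'LSTM') alone.
                for cast in (int, float):
                    try:
                        return cast(v)
                    except ValueError:
                        pass
                return v
            return {k: coerce(v) for k, v in m.groupdict().items()}

        print(parse_checkpoint_name(
            'model-LSTM-emsize-1500-nhid_1500-nlayers_2-batch_size_20-'
            'epoch_7-loss_6.02-ppl_412.27'))
        # {'mode': 'LSTM', 'emsize': 1500, 'nhid': 1500, 'nlayers': 2,
        #  'batch_size': 20, 'epoch': 7, 'loss': 6.02, 'ppl': 412.27}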

  • OK, that’s it: I’ve had enough. This is the last Wvnt Epic Video. Until a juicy code variant arises.

    TEXT: (tf0p1-py3.5-wvnt) jhave@jhave-Ubuntu_Screencast 2017-06-07 10:58:11_2017-06-07_08-12-11
    Excerpt: weather, Gold apply thy unndmight hour. And neither would endure Meet excel understood) I once had declared clay. Be lines once my God

  • A few SLOW excerpts

    from here accepting tall any flowers, forever with one question of boots, neural, dead forgotten the glass of cloud, start and more, who studied legends and wanted to ascend Every inch alone you and this desire tulips of sounds watching the witness On the intensity lolling it. Summer up warmishment The girls crack our hearts…

  • WVNT regamed (blind, still bland, but intriguing)

    The opposite of scarcity is not abundance: saturation inevitably damages, or perhaps it just corrodes or blunts the impassioned Pavlov puppy, delivering a dent of tiny deliberate delirious wonder. Yet technical momentum now propels and impels continuance: how can the incoherence be tamed? Set some new hyperparameters and let the wvnt train for 26 hours…

  • Wvnt Tamed

    The abrasive, brash gutter voice of the neural net seemed perhaps due to a lack of long-term exposure, wider horizons, deeper reading into the glands of embodied organisms. So I set the hyperparameters higher, waited 26 hours, and watched the Ubuntu hard drive fill up with models to the point where the OS crashed on reboot…
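
    A hedged guard against that failure mode would be to prune all but the most recent checkpoints as training runs; the directory name and keep count below are assumptions:

        import glob
        import os

        def prune_checkpoints(model_dir, keep=5):
            # Keep only the `keep` most recently written .pt checkpoints,
            # deleting older ones so the disk does not fill up mid-training.
            paths = sorted(glob.glob(os.path.join(model_dir, '*.pt')),
                           key=os.path.getmtime)
            for path in paths[:-keep]:
                os.remove(path)

        prune_checkpoints('models/2017-06-17T09-22-17', keep=5)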

  • THAT ETERNAL HEAVY COLOR OF SOFT

    The title of this post is the title of the first poem generated by a re-installed Wavenet for Poetry Generation (still using TensorFlow 0.1, but now on Python 3.5), working on an expanded corpus of approx. 600,000 lines of poetry. The first model was Model: 7365 | Loss: 0.641 | Trained on: 2017-05-27T14-19-11 (full…