Author: jhave

  • Markov Bern

    Markov chains are one of the traditional tricks in the NLP playbook. They are the apple pie-chart of text-generation. Basic process: given a source text, find words that are neighbours. If you know the neighbours of a word, you can form a chain if you wish: [(“you”), (“know”, “can”, “wish”)]. Then reconstruct a text which contains pairs (bigrams) from…
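The excerpt's neighbour-pair example can be sketched in a few lines of Python (a minimal illustration of a bigram Markov chain, not the author's actual code):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it (bigrams)."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain: at each word, pick one of its observed neighbours."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        neighbours = chain.get(out[-1])
        if not neighbours:
            break
        out.append(rng.choice(neighbours))
    return " ".join(out)

chain = build_chain("if you know the neighbours of a word you can form a chain if you wish")
print(chain["you"])  # ['know', 'can', 'wish'] -- the pair from the excerpt
```

Run on the excerpt's own sentence, the neighbours of “you” come out as exactly the tuple quoted above; `generate` then stitches such bigrams back into new text.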

  • SYN-SCI-RAP

    I think I have begun to develop a mild form of insanity that often strikes those who fiddle around with computationally-generated text. After reading thousands of lines of dense incomprehensible gibberish, it clarifies and makes sense, often more sense than any mere linear thought. The brain acclimatises to syntactic pressure. Recipe for mildly insane word-salad:…

  • RSERVOIRD

    Reservoirs are where I put unwanted words. These orphan words are later fed back into society whenever the next orphan appears. Thus words swap circumstances, exchange semantic situations, live out different meanings. Click on an image to visit a reservoir.  
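The reservoir idea described above could be sketched like this (a hypothetical reading of the post, with invented names; the actual mechanism isn't shown):

```python
import random

class Reservoir:
    """Hold discarded ("orphan") words; when a new orphan arrives,
    release a previously held word in its place, so words swap
    semantic situations. A speculative sketch, not jhave's code."""

    def __init__(self, seed=0):
        self.words = []
        self.rng = random.Random(seed)

    def swap(self, orphan):
        if not self.words:
            self.words.append(orphan)
            return orphan            # nothing to trade yet: keep the first orphan
        i = self.rng.randrange(len(self.words))
        released = self.words[i]
        self.words[i] = orphan       # the new orphan takes the old one's place
        return released              # the old orphan re-enters circulation

r = Reservoir()
r.swap("umbrella")        # first orphan is simply stored
print(r.swap("orchard"))  # "orchard" enters; "umbrella" is released
```

Each call trades the incoming word for a resident one, so every word eventually lives out a different context than the one it was removed from.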

  • Smaller Words (shrink-gapped at 64ppm)

    Words disconnected from their primary communicative intent operate as lesions/lessons within the psyche. Today, I generated another 10,120 poems using a very mild modification of the alchemy-synset algorithm with the average word-length constrained even shorter. Speed decreased to 64 poems-per-minute (ppm). This reduction in word-length seems (to me) to make some of the absurd illogical elliptical generated…
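The alchemy-synset algorithm itself isn't shown in the excerpt, but an average-word-length constraint of the kind described might look like this (a hypothetical filter, names invented for illustration):

```python
def avg_word_length(line):
    """Mean character count per word in a line; 0.0 for an empty line."""
    words = line.split()
    return sum(len(w) for w in words) / len(words) if words else 0.0

def constrain(lines, max_avg=4.0):
    """Keep only candidate lines whose average word length stays short.
    Rejecting candidates is what would slow generation, as the post notes."""
    return [l for l in lines if avg_word_length(l) <= max_avg]

candidates = ["ash elm fir oak", "phenomenological debris into revelations"]
print(constrain(candidates))  # only the short-worded line survives
```

Tightening `max_avg` discards more candidates per accepted poem, which is consistent with the throughput dropping to 64 ppm.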

  • Small words (a homage)

    I can’t stop. It’s addictive. The ceaseless generative churn. It’s like planting seeds that germinate and blossom as you watch, then go to seed, ripen fall germinate ripen fall germinate, fields filling space to the horizon, blocking out both sun and moon, and again… I was thinking that after reading the rich thick dense multi-syllable outputs of…

  • 4,704 Swan Songs & 1 Opinion

    The code is now at a stage where, if I set it to loop and send the 57k rap songs I have in archive from ohhla to alchemy, I could generate an unfathomable amount of unreadable crap (also known as c-rap: computational rap). But I think I have come to the end of the synset…

  • Another 10k day

    I’m beginning to understand the exultation of spam-lords, the rapturous power narcotic that arises from watching thousands of words of perhaps-dubious quality arise & spew in a rapid unreadable scrawl across a screen. Beyond semantics, words like sperm procreate incessantly in abundant sementics. Quality in this inverted world is a quantity. On the technical side: today, I…

  • Hatching (trance bug poem set)

    Another day, another 10k. I received an email today from a friend, the poet Ian Hatcher; his email included an MP3 of himself reading a poem I had generated using code that was a bit broken. Ian Hatcher reads “woodland_pattern”. I sent this particular poem to Ian because it included a lot of…

  • Collocation: a poem culled from cruft

    Data-art involves days of futile searches. Redundant processes, meandering through archives trying to retrieve relevant results. In this case, a collocation search resulted (as usual) in 99% pure cruft. Cruft may be a synonym for poetry: the craft of recycling phenomenological debris into revelations. Process: a trigram collocation search [in other words, a search for 3…
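A trigram collocation search of the kind the post describes can be approximated with bare word-frequency counting (a minimal stand-in; the original process likely used a proper collocation measure such as those in NLTK):

```python
from collections import Counter

def trigram_collocations(text, top=3):
    """Count every run of 3 consecutive words and rank by raw frequency --
    a bare-bones substitute for a statistical collocation score."""
    words = text.lower().split()
    trigrams = Counter(zip(words, words[1:], words[2:]))
    return trigrams.most_common(top)

corpus = "out of the cruft out of the debris out of the archive"
print(trigram_collocations(corpus, top=1))  # [(('out', 'of', 'the'), 3)]
```

On a real archive, most of what such a search returns is exactly the 99% cruft the post describes; the remaining 1% is where the culled poem comes from.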

  • Writing 10,118 poems in 5 hours

    In the same way that climate change signals something irrevocable in the atmosphere, machine-learning constitutes an erosion of the known habitat of culture. Today I wrote 12,000 poems. Most of them are crap. But if even 1% are halfway decent, that’s 120 poems. Numbers aren’t everything. We love what we love. Quantification does not measure…