Markov Bern

Markov chains are one of the traditional tricks in the NLP playbook. They are the apple pie-chart of text-generation.

Basic process: given a source text, find each word’s neighbours; if you know the neighbours of a word, you can form a chain if you wish. (In that sentence, for instance, the word “you” is followed by “know”, “can”, and “wish”, giving the mapping [(“you”), (“know”, “can”, “wish”)].) Walking that chain (start at a word, hop to a random recorded neighbour, repeat) reconstructs a text in which every pair of adjacent words (bigram) occurs somewhere in the source.
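Here is a minimal sketch of that process in Python. This is my illustration of the technique, not the code from the Github repo, and the filename in the usage line is hypothetical:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the source."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, length=30):
    """Walk the chain: start anywhere, then repeatedly hop to a random recorded successor."""
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        successors = chain.get(word)
        # Dead end (the word only occurs at the very end of the source): restart anywhere.
        word = random.choice(successors) if successors else random.choice(list(chain))
        out.append(word)
    return " ".join(out)

# Hypothetical usage: feed it a plain-text file of source poems.
print(generate(build_chain(open("dark_city.txt").read())))
```

Because every adjacent pair in the output is a bigram from the source, the results sound convincingly like the source author while wandering somewhere the source never went.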

So I did that, using Charles Bernstein’s Dark City and Rough Trades as source texts (found on Bernstein’s EPC author page).

The result is an example of what Charles Hartman might refer to as a newbie-augmented-cyborg-poet (dead simple technically, but satisfying artistically, since the volume of generated text from which new verses can be hand-crafted is massive). This sort of auto-suggest-from-a-corpus technique radically shifts the dimensions of creativity: in the first ‘modified’ example I edited the output, adding words, disguising some obvious quotations from Bernstein, truncating verses, changing lines, modulating rhythms. In the raw output below, it’s just the computer (fueled by Bernstein’s berning phrases); it could go on indefinitely given a large enough corpus.



Poetry is both the easiest and the hardest kind of text to generate. Since non-linear deflections and word-riffs are an aspect of contemporary poetry, slamming together ripe, fertile conjunctions is easy. Migrating toward a sensitive, complex, experiential and contextual lived poetry is the real challenge (I didn’t even begin to touch it here).


Code on GitHub
Made by Glia.ca  


what is there if
An exit is as
clear as dead.

Billboards poster our losses.
Better a barber
than a thousand one-line
sweat glands.