Markov chains are one of the traditional tricks in the NLP playbook. They are the apple pie-chart of text generation.
Basic process: given a source text, record each word's neighbours. In the previous sentence of this paragraph, for example, "you" is followed by "know", "can", and "wish" — [("you"),("know","can","wish")] — so from any word you can hop to one of its recorded successors and reconstruct a text made entirely of pairs (bigrams) that occur in the source.
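The whole trick fits in a few lines of Python. This is a minimal sketch assuming naive whitespace tokenization; build_chain is an illustrative name, not code from the actual repo:

```python
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

chain = build_chain("if you know the neighbours of a word, "
                    "you can form a chain if you wish.")
print(chain["you"])  # ['know', 'can', 'wish.']
```

Punctuation rides along with the words here (hence 'wish.' with its period), which is partly where the happy accidents come from.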
So I did that, using as source texts Charles Bernstein's Dark City and Rough Trades (found on Bernstein's EPC author page).
The result is an example of what Charles Hartman might call a newbie-augmented-cyborg-poet: dead simple technically, but satisfying artistically, since the volume of generated text from which new verses can be hand-crafted is massive. This sort of auto-suggest-from-a-corpus technique radically shifts the dimensions of creativity. In the first 'modified' example I edited the output: adding words, disguising some obvious quotations from Bernstein, truncating verses, changing lines, modulating rhythms. In the raw output below it's just the computer (fueled by Bernstein's berning phrases), and it could go on indefinitely given a large enough corpus (a sketch of the generation loop follows the list below).
- Modified output: 1,114 lines; 4,019 words; generated in 0.3 seconds on 2014-08-17 at 9:05am. HERE
- RAW output: 27,735 lines; 95,734 words; generated in 0.8 seconds on 2014-08-17 at 16:24. HERE
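The generation loop itself is just a random walk over the successor map. Again a hedged sketch, not the repo's actual code, continuing from build_chain above:

```python
import random

def generate(chain, n_words=50):
    """Random-walk the successor map to emit n_words of text."""
    word = random.choice(list(chain))       # seed anywhere in the corpus
    out = [word]
    for _ in range(n_words - 1):
        successors = chain.get(word)
        if not successors:                  # dead end: reseed the walk
            word = random.choice(list(chain))
        else:
            word = random.choice(successors)
        out.append(word)
    return " ".join(out)

print(generate(chain, n_words=40))
```

Nothing in the loop ever has to stop, which is why the raw output can run as long as you let it.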
Poetry is both the easiest and the hardest text to generate. Since non-linear deflections and word-riffs are an aspect of contemporary poetry, slamming together ripe, fertile conjunctions is easy. Migrating toward a sensitive, complex, experiential, and contextual lived poetry is the real challenge (I didn't even begin to touch it here).
Code on GitHub
Made by Glia.ca
what is there if
An exit is as
clear as dead.
Billboards poster our losses.
Better a barber
than a thousand one-line
sweat glands.