ELO Performance (Brief Reproduction)

Read a screen where code is rapidly producing poems.
Find a path through the words: construct a poem from machinic intuition.

The following recreates a performance made at
the Electronic Literature Organization conference
in Bergen, Norway on Aug. 4th 2015.

Details: http://bdp.glia.ca/smaller-words-shrink-gapped
Code: https://github.com/jhave/Big-Data-Poetry

Technical process: the following poems were produced using a corpus of 10,000+ poems as templates. Each poem was sent to the Alchemy API to produce entity-recognition, part-of-speech, and sentiment reports. That analysis influences the replacement algorithms. Replacement uses NLTK synsets, Pattern.en, and a reservoir of words found in the corpus that do not have synonyms.
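A toy sketch of the replacement step, with a stand-in synonym table in place of the actual NLTK/pattern.en/Alchemy pipeline (the words and lists below are illustrative, not from the real corpus):

```python
import random

# Stand-in synonym table; the actual pipeline queries NLTK WordNet synsets
# (and pattern.en for re-conjugation). These entries are illustrative only.
SYNONYMS = {
    "road": ["path", "route", "track"],
    "dark": ["dim", "shadowy", "unlit"],
}

# Words with no synonyms go into a "reservoir" and are swapped among themselves.
RESERVOIR = ["spreedr", "skid"]

def replace_word(word, rng):
    """Replace a word with a synonym, or with a reservoir word if it has none."""
    if word in SYNONYMS:
        return rng.choice(SYNONYMS[word])
    if word in RESERVOIR:
        return rng.choice(RESERVOIR)
    return word  # leave all other words untouched

def rewrite(template, rng=None):
    """Rewrite a template poem line word by word."""
    rng = rng or random.Random(0)
    return " ".join(replace_word(w, rng) for w in template.split())

print(rewrite("the dark road home"))
```

Run over 10,000 templates, a loop like this is what turns one corpus into millions of variant poems.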

In short, 10000 poems are
transformed by algorithms
into millions of poems
at an extremely rapid rate.

The reader must then
find a way to convert this
spew into spoken word.

ELO 2015 – Bergen – Performance

One of the ends
of digital literature
is an external intuition.

External intuition
is an engineering problem.

Intuition
in this case
is me.

Skidding thru the
generated poems
as they augment
my imagination.

I call this act of
augmented imagination:
cyborg/ skid/ spreedr poetry.

####

For the performance
at the ELO conference in Bergen, Norway on Aug. 4th 2015,
I generated 100 poems, pausing every 5 poems for 20 seconds,
and tried to weave spoken poetry from the output.

Unfortunately I forgot to record the audio/screen-video,
so my actual spoken poem is gone!

####

If you want to try it yourself:
Download attached code and data corpus
— this includes the poems generated during the performance.

Unzip, then run from the command line:
>> cd code/poetryFoundation/ELO_July2015/
>> python ELO2015_PERF_Creeley-Aug4th.py

This code runs under the Anaconda Python 2.7 distribution.

####

Code will continue to be updated
on github: https://github.com/jhave/Big-Data-Poetry.

Spreeder: the feature film (EPC 20th Anniversary Celebration)

Loss Pequeño Glazier is celebrating the 20th anniversary of the Electronic Poetry Centre along with Charles Bernstein, cris cheek, Tony Conrad, Steve McCaffery, Myung Mi Kim, Tammy McGovern, Joan Retallack, Laura Shackelford, Danny Snelson, Dennis Tedlock, Cecilia Vicuña, Elizabeth Willis, & Wooden Cities with Ethan Hayden. Along with exhibitions by: “Abra” (Amaranth Borsuk, Kate Durbin & Ian Hatcher); Pry: iPad-based Novella (Samantha Gorman & Danny Cannizzaro); “Enter:in’ Wodies” (Zuzana Husárová & Lubomír Panák) & myself.  http://epc.buffalo.edu/e-poetry/2014/EPC-at-20/

So I made a future-feature film of a computer writing in real time:
Spreeder (approx. output: 8,100 poems; 2 hour-long real-time episodes),
implemented in Python with the Alchemy API, NLTK, and pattern.en.

t-SNE: Classification of 10,557 poems

Once again: there is much magic in the math. The era of numeration discloses a field of stippled language. Songlines, meridians, tectonics, the soft shelled crab, a manta ray, a flock of starlings.

In the image below, each dot is a poem. Its position is calculated by an algorithm called t-SNE (t-distributed Stochastic Neighbor Embedding).

[Screenshot, 2014-08-23: t-SNE map of the poems]

The image above is beautiful, but it’s impossible to know what is actually going on. So I built an interactive version (it’s a bit slow, but it functions…) where rolling over a dot reveals all the poems by that author.

Screengrabs (below) of the patterns suggest that poets do have characteristic forms discernible by algorithms. Position is far from random; note that the algorithm was never told the author of any poem; it was fed only the poems themselves. This is the equivalent of blind taste-testing.

Still, these images don’t tell us much about the poems themselves, except that they exist in communities. That the core of poetry is a spine. That some poets migrate across styles, while others define themselves by a style. The real insights will emerge as algorithms like t-SNE are applied to larger corpora, allowing nuanced investigation of the features extracted: on what criteria exactly did the probabilities grow? What are the 2 or 3 core dimensions?

What is t-SNE?

My very basic non-math-poet comprehension of how it works: t-SNE performs dimensionality reduction: it reduces the number of dimensions considered. Dimensionality reduction is useful when visualizing data; think about trying to graph 20 different parameters (dimensions). Another technique that does this is PCA: principal component analysis. Dimensionality reduction is in a sense a distillation process; it simplifies. In this case, it converts ‘pairwise similarities’ between poems into probability distributions, then uses gradient descent to minimize the (mysterious) Kullback-Leibler divergence between the high-dimensional and low-dimensional distributions.
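A minimal sketch of that reduction, using scikit-learn’s TSNE on random stand-in feature vectors (the shapes and parameters are illustrative; the real run used features derived from the 10,557 scraped poems):

```python
import numpy as np
from sklearn.manifold import TSNE

# Pretend feature vectors: 50 "poems", each described by 20 features.
rng = np.random.RandomState(0)
features = rng.rand(50, 20)

# Reduce 20 dimensions to 2 for plotting; perplexity acts roughly
# like the number of effective nearest neighbours per point.
embedding = TSNE(n_components=2, perplexity=5, random_state=0).fit_transform(features)

print(embedding.shape)  # one (x, y) point per poem
```

Each row of `embedding` becomes one dot in a scatter plot like the map above.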

To know more about the Python version of t-SNE bundled into sklearn, read Alexander Fabisch.

One of the few parameters I bothered tweaking over numerous runs is (appropriately) named perplexity. In the FAQ, L.J.P. van der Maaten (who created t-SNE) wrote:

 What is perplexity anyway?

Perplexity is a measure of information that is defined as 2 to the power of the Shannon entropy. The perplexity of a fair die with k sides is equal to k. In t-SNE, the perplexity may be viewed as a knob that sets the number of effective nearest neighbors. It is comparable with the number of nearest neighbors k that is employed in many manifold learners.
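Van der Maaten’s definition is small enough to compute directly; the fair-die example checks out:

```python
import math

def perplexity(probs):
    """Perplexity = 2 ** Shannon entropy (in bits) of a distribution."""
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return 2 ** entropy

# A fair six-sided die has perplexity 6, as van der Maaten notes.
print(round(perplexity([1/6] * 6)))  # → 6
```

A skewed distribution has lower perplexity: a loaded die that almost always lands on one face is, informationally, closer to a one-sided die.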

SPREED : Speed Screen Reading : One Hour Real-Time Poetry Generation ScreenGrab

Using Python (Anaconda), NLTK, WordNet, Alchemy, pattern.en, and pyenchant
to analyze and perform word replacement
on a corpus of 10,119 poems scraped from the PoetryFoundation
and generate 7,769 poems in approx. 2 hours and 30 minutes.

This is a real-time hour-long screen-grab output
of the trace window in SublimeText
as the poetry-gen program runs.



Code on Github
Made by Glia.ca  


And here is another episode of “Crawling Toward Creeley” (in this episode: a variation on the ‘Gnomic Verses’, generated then modified).

Markov Bern

Markov chains are one of the traditional tricks in the NLP playbook. They are the apple pie-chart of text-generation.

Basic process: given a source text, find the words that are neighbours. If you know the neighbours of each word, e.g. [(“you”), (“know”,”can”,”wish”)], you can form a chain and reconstruct a text which contains only pairs (bigrams) from the source.
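The neighbour-pair idea can be sketched as a tiny bigram chain (a generic illustration with made-up source text, not the script used for the Bernstein books):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the source."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length, rng=None):
    """Walk the chain: every adjacent pair in the output is a source bigram."""
    rng = rng or random.Random(0)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: the last source word has no recorded neighbour
        out.append(rng.choice(followers))
    return " ".join(out)

source = "you know the dark you can see the dark you wish"
chain = build_chain(source)
print(generate(chain, "you", 8))
```

Because repeated words accumulate multiple followers, frequent words become branch points, which is where the recombinant surprise comes from.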

So I did that using as source texts: Charles Bernstein Dark City and Rough Trades. (Found on Bernstein’s EPC author page).

The result is an example of what Charles Hartman might refer to as newbie-augmented-cyborg-poetry (dead simple technically, but satisfying artistically, since the volume of generated text from which new verses can be hand-crafted is massive). This sort of auto-suggest-based-upon-a-corpus technique radically shifts the dimensions of creativity: in the first ‘modified’ example I edited the output, adding words, disguising some obvious quotations from Bernstein, truncating verses, changing lines, modulating rhythms. In the raw output below, it’s just the computer (fueled by Bernstein’s berning phrases); it could go on infinitely given a large enough corpus.



Poetry is both the easiest and the hardest to generate. Since non-linear deflections and word-riffs are an aspect of contemporary poetry, slamming together ripe fertile conjunctions is easy. Migrating toward a sensitive, complex, experiential and contextual lived poetry is the real challenge (I didn’t even begin to touch it here).


Code on Github
Made by Glia.ca  


what is there if
An exit is as
clear as dead.
 
Billboards poster our losses.
Better a barber
than a thousand one-line
sweat glands.

SYN-SCI-RAP

I think I have begun to develop a mild form of insanity that often strikes those who fiddle around with computationally generated text. After reading thousands of lines of dense incomprehensible gibberish, it clarifies and makes sense, often more sense than any mere linear thought. The brain acclimatises to syntactic pressure.


Recipe for mildly insane word-salad:

  • take 57,000 rap songs input by fans
    • extract all words that return no results from a WordNet synset search and put them into the Reservoir
  • one list of scientific terminology (for sombre intellectual tone)
    • chop off “-ology” wherever it occurs
  • one list of swear words (for spice)
  • a call to the WordNet synset algorithm (for fibre and continuity)
  • pattern.en to do conjugation (for a tiny bit of coherence)
  • NLTK part-of-speech tagging
  • Alchemy for entity (people, places, etc.) replacement
  • 10,000 or more poems

Mix all ingredients together using replacement algorithms.
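The first two ingredients can be sketched like this, with a stub standing in for the WordNet synset lookup (the word lists here are hypothetical):

```python
# Stand-in synset check: the real recipe queries NLTK WordNet; words with
# no synsets (typos, slang, neologisms) are diverted into the Reservoir.
KNOWN = {"heart", "light", "burial"}  # hypothetical dictionary words

def has_synset(word):
    return word in KNOWN

def sift(words):
    """Split lyrics into dictionary words and Reservoir orphans."""
    reservoir = [w for w in words if not has_synset(w)]
    usable = [w for w in words if has_synset(w)]
    return usable, reservoir

def chop_ology(term):
    """Trim a trailing '-ology' from a scientific term, per the recipe."""
    return term[:-5] if term.endswith("ology") else term

print(sift(["heart", "skitso", "skypager", "light"]))
print(chop_ology("phenomenology"))  # → 'phenomen'
```

The sifted reservoir is what later gets fed back as replacement material wherever a poem word has no synonyms of its own.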


To read 10116 poems (simple style) (in a single 24-mb html page) generated in 10356.4216051 seconds (2.88 hours, ~3517 pph [poems per hour], ~59 ppm [poems per minute]) on 2014-08-14 at 02:54, click here


Read a selection of just a few poems 

Read the RAP Reservoir: 33,150 words extracted from 56k user-input rap songs that did not return any usable results from a WordNet synset search. If you are looking for the evolution of language that occurs through mutation (typos, misspellings, pop-cruft), this is it.


Code on Github
Made by Glia.ca  

 

 

RSERVOIRD

Reservoirs are where I put unwanted words. These orphan words are later fed back into society whenever the next orphan appears. Thus words swap circumstances, exchange semantic situations, live out different meanings.

Click on an image to visit a reservoir.

[Screenshot, 2014-08-04: a reservoir word-list]

 

Smaller Words (shrink-gapped at 64ppm)

Words disconnected from their primary communicative intent operate as lesions/lessons within the psyche.

Today, I generated another 10120 poems using a very mild modification of the alchemy-synset algorithm, with the average word-length constrained to be even shorter. Speed decreased to 64 ppm (poems per minute). This reduction in word-length seems (to me) to make some of the absurd, illogical, elliptical generated fragments a bit more legible, taut, elusive and rare. It comes at a cost of coherence. The output reads like Robert Creeley in the process of becoming Samuel Beckett in Gertrude Stein’s gut.


To read 10120 poems (simple shrink-gapped style) (in a single 20-mb html page) generated in 9500.10482717 seconds (2.63 hours total, 3847 poems per hour, 64 ppm) on 2014-08-04 at 12:02, click here


Code on Github
Made by Glia.ca  


Edited fragments:

Let me give a robot aging
And as it rains tag the sun with ‘almost’
while within the green fog
a tree that is a dot
Softly opens after the burn.
………
Gaza masked
as me masked
each heavy’s heart out
crying at halo’s burial
making a meal of soil
the city a scar
…..
enthusiasm’s ice. We have
Walked on the bore all nite.
Now in the light
We exchange smells and snot.
By dawn we will have buried our lack
And glued wet to the army of being.
…………

 


Small words (a homage)

I can’t stop. It’s addictive. The ceaseless generative churn. It’s like planting seeds that germinate and blossom as you watch, then go to seed, ripen fall germinate ripen fall germinate, fields filling space to the horizon, blocking out both sun and moon, and again….

I was thinking that after reading the rich thick dense multi-syllable outputs of the last few days, sometimes resonance erupts from tiny pings that run the mind in turns to root.

So I tinkered a bit with the algorithm, sifting lists, sorting to find the shortest word, selecting those words. Seeded in with the rap reservoir (misspelled gheto slank). And let it fly.

Simple.
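That sift-sort-select tweak can be sketched in a few lines (the candidate words below are hypothetical, not from the actual synonym lists):

```python
def shortest_candidate(candidates, fallback):
    """Sort replacement candidates by length and keep the shortest,
    mimicking the 'sift lists, sort, select the shortest word' tweak."""
    if not candidates:
        return fallback  # no candidates: keep the original word
    return sorted(candidates, key=len)[0]

# Hypothetical synonym candidates for 'illumination':
print(shortest_candidate(["radiance", "glow", "lighting"], "illumination"))  # → 'glow'
```

Biasing every replacement toward the shortest option is what drags the average word-length of the whole output down.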

Excerpts: 

Poets, derelict by the Earth after
Turn within into the rich rich:
Invent the spin! forge the trope!
cutting cut
I genetic dawn, mourning …

and I can dock my pity and my bread.

“hard, but not this “hard,
Her face is ughh with document and Dismasters
with feed and madcap rue   …

closely let her own worms
without holes or end
unvoiced
she stand laudry in the ruin of her hints
and a man with an executioner’s face
pulls her away. 
… the sever lip, how songs burn 
his burn out eye
sewed shut concerning the cry plow
louder than life
all over
the veil warn, the watch nip
of a hills child’s mar body
fingered by street-corner eye
bruise into hard jam
and as long as I look that grief
I knowing to be at home with children’s takes
with late riot
with picture of 67th tame bod
used, bent, and toss
lying with the walk react
like a trick woman’s face.
Violet as veins are, love knows where.
Fine coral as the shy and wild tonguetip,
Undersea coral, rich as inner lip.
There was a stone to build on!
                                              Friezes ran
In strong chorales that where they closed began;
And statues: each a wrung or ringing phrase
In the soul’s passionate cadence of her days.
Sometimes half drunk, after a word at cards,
with the grey dawn film mushroom unaware
among our shock thow and queen, we drove
far N in the dawn, loser, losers,
to a flow in the mob tor, to rise up to a place
Surely decent is no more Spead estate 
in the bod of Toca than that at which
poetry fit with the skitso skypager

Based on ‘Fanny’ by Carolyn Kizer

I come home to a grow world: cacao, dish squash.
The squash speaks was act, and act, dillz blue.
The spirit spirit spirit spirit off the spirit cat’s toe.

Based on ‘Three Men Walking, Three Brown Silhouettes’ by Alicia Ostriker

They naw the sedgy who blow in the action.
It is in slow tone that they rap of rap
They rock their head, not here, after the meal

Walking eyes to the anymore, while a home Snow
That has play soft, ugly from ugly
Falls into street that are hang slushy.

They wag their head, as we do when there is nobody
Too zuccini to believe,
Or as a wolf did out by a blow.

Based on Lawrence Ferlinghetti’s ‘Queens Cemetery, Setting Sun’

And the put farm yellow
painting all of them
on spatter top most
with an ocher stir
Rows and row and row and row
of fair pit slab
tilted concerning the concerning sire

Based on John Donne “The Bait”

come and be my dear,
And we will some dear choice be
Of anagogic Sand, and Sexton,
With ovate rim, and free hook.

 


This homage is really to Creeley


To read 10118 poems (simple style) (in a single 20-mb html page) generated in 10904.6857641 seconds (3.03 hours, ~56 poems a minute) on 2014-08-03 at 23:11, click here


Code on Github
Made by Glia.ca