The simplest first step in this quest is to write a Markov chain text generator. A Markov chain (in layman's terms) is a process in which state changes are probabilistic and the next state depends only on the present state, independent of all past states.
So, how will this work?
The first step is to get a collection of poems (or any text, for that matter) to serve as the corpus. Using this training corpus, we scan for all triplets (three words in succession) and build a map from each pair of successive words to every word that can follow that pair. The current state of the Markov chain is represented by the last two words of the sentence generated so far.
To begin, choose two successive words from the corpus at random. Look up that pair in the map to see which words can follow it, pick one of them at random, append it, and repeat with the new last two words.
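The two steps above can be sketched in a few lines of Python (a minimal sketch; the function names `build_map` and `generate` are my own, not from the original project):

```python
import random

def build_map(words):
    """Map each pair of successive words to the words seen to follow it."""
    chain = {}
    for w1, w2, w3 in zip(words, words[1:], words[2:]):
        chain.setdefault((w1, w2), []).append(w3)
    return chain

def generate(chain, length=20):
    """Start from a random pair and walk the chain, one word at a time."""
    w1, w2 = random.choice(list(chain))
    out = [w1, w2]
    for _ in range(length):
        followers = chain.get((w1, w2))
        if not followers:  # dead end: this pair only occurs at the very end of the corpus
            break
        w1, w2 = w2, random.choice(followers)
        out.append(w2)
    return " ".join(out)
```

Because followers are appended once per occurrence, pairs that are followed by the same word more often are proportionally more likely to produce it again, which is exactly the probabilistic state transition described above.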
An example of the method follows.
Input text: Hope is a good thing, maybe the best of things, and no good thing ever dies.
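To make the map concrete, here is what it looks like for that input (a hypothetical illustration with punctuation stripped and text lowercased, so that the two occurrences of "good thing" collapse into one state):

```python
# Build the trigram map for the example sentence.
text = "hope is a good thing maybe the best of things and no good thing ever dies"
words = text.split()
chain = {}
for w1, w2, w3 in zip(words, words[1:], words[2:]):
    chain.setdefault((w1, w2), []).append(w3)

# Every pair maps to exactly one follower, except the pair that occurs twice:
print(chain[("good", "thing")])  # ['maybe', 'ever']
```

So a walk that reaches the state ("good", "thing") can continue with either "maybe" or "ever", and that single branching point is where this tiny corpus can already produce a sentence that was never in the input.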
As I did not have a large collection of my own poems, I used Robert Frost's poetry collection North of Boston as the corpus. It generates gibberish most of the time but produces a sensible statement once in a while. A somewhat sensible one generated by the algorithm follows.
Who else will harbour him At his age for
the pair, you know. We sha'n't have the
art of what
I mean by home. Of
course the easy job For the next forty
summers--call it forty. But
I'm not so drunk I can't
stay here: Estelle's
to take it as much
as wishing him good-night. He went on:
'I'm sure--I'm sure'--as polite as
could be. He spoke to his door.
Yes, that's how a drunk Robert Frost writes.
Further work includes using Backus–Naur form (context-free) grammars and natural language processing techniques to make the output more meaningful.
Fork on GitHub.