Txt excerpt:
I can even hear your violent debates. Of prayer, weep, Whatever you say, я recognise me. paulist. That's right no. laughing?
Complete txt output:
generated-2018-03-08T19-35-32_Screencast 2018-03-08 21:10:55
Corpus retuned to reflect tech and other torque
100selected-archiveorg.txt
99 Terms You Need To Know When You’re New To Tech.txt
ABZ.txt
Aeon_autonomousrobots_science-and-metaphysics_enemies_biodiversity.txt
capa.txt
Cathay by LiBai pg50155.txt
celan.txt
citizen-an-american-lyric-claudia-rankine.txt
citylights.txt
deepmind.txt
evergreenreview_all_poem.html.txt
glossary-networking-terms.txt
glossary-neurological-terms.txt
glossary-posthuman-terms.txt
Grief.txt
harari_sapiens-summary_guardian-review.txt
I Feel It Is My Duty to Speak Out_SallyOReilly.txt
Jacket2_ALL.txt
jhavelikes_CLEANED-again.txt
jorie_graham.txt
literotica.txt
ndsu.edu-creativeWriting-323.txt
neurotransmitter.txt
nyt_china-technology-censorship-borders-expansion.txt
patchen.txt
Patti Smith - all songs lyrics.txt
poetryFoundation_CLEANED_lower.txt
rodneyJones.txt
swear.txt
Teen Love Meets the Internet.txt
tumblr_jhavelikes_de2107-feb2018.txt
tumblr_jhavelikes_EDITED.txt
twoRiver.txt
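As a rough sketch of the prep step: awd-lstm-lm expects a data folder containing train.txt, valid.txt and test.txt, so the source files above would be concatenated and cut into splits along these lines. The 90/5/5 ratio, the line-level shuffle, and the reuse of the training seed (4002) are assumptions for illustration, not the project's actual script.

```python
import random

def split_corpus(lines, train=0.9, valid=0.05, seed=4002):
    """Shuffle corpus lines and cut them into train/valid/test splits.

    Hypothetical sketch: awd-lstm-lm reads data/<name>/train.txt,
    valid.txt and test.txt; the listed .txt files would be read,
    concatenated line-by-line, shuffled, and split roughly like this.
    """
    rng = random.Random(seed)
    lines = list(lines)
    rng.shuffle(lines)
    a = int(len(lines) * train)
    b = int(len(lines) * (train + valid))
    return lines[:a], lines[a:b], lines[b:]
```

Each split would then be written out with `"\n".join(...)` into the corpus folder passed to `--data`.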
Process left to run for 36 hours on an Nvidia Titan X:
(awd-py36) jhave@jhave-Ubuntu:~/Documents/Github/awd-lstm-lm-master$ python -u main.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python -u finetune.py --epochs 500 --data data/March-2018_16mb --clip 0.25 --dropouti 0.4 --dropouth 0.2 --nhid 1500 --nlayers 4 --seed 4002 --model QRNN --wdrop 0.1 --batch_size 20 --emsize=400 --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python pointer.py --lambdasm 0.1279 --theta 0.662 --window 3785 --bptt 2000 --data data/March-2018_16mb --model QRNN --save models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt

$ python generate_March-2018_nocntrl.py --cuda --words=444 --checkpoint="models/March-2018_16mb_QRNN_nhid1500_batch20_nlayers4_emsize400.pt" --model=QRNN --data='data/March-2018_16mb' --mint=0.75 --maxt=1.25
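The `--mint=0.75 --maxt=1.25` flags on the custom generate script suggest that each word is sampled at a temperature drawn from that range, so the output drifts between safer (cooler) and stranger (hotter) word choices. A minimal sketch of that idea, in plain Python rather than PyTorch (the function name and per-word temperature draw are assumptions, not the script's actual code):

```python
import math
import random

def sample_word(logits, mint=0.75, maxt=1.25):
    """Sample a vocabulary index from raw logits at a random temperature.

    Hypothetical reading of --mint/--maxt: a temperature is chosen
    uniformly in [mint, maxt] for each word, then the logits are
    softmaxed at that temperature and an index is drawn.
    """
    t = random.uniform(mint, maxt)
    # Softmax over temperature-scaled logits (subtract max for stability).
    m = max(x / t for x in logits)
    exps = [math.exp(x / t - m) for x in logits]
    total = sum(exps)
    # Inverse-CDF draw from the resulting distribution.
    r = random.random() * total
    acc = 0.0
    for i, e in enumerate(exps):
        acc += e
        if acc >= r:
            return i
    return len(logits) - 1
```

Lower temperatures sharpen the distribution toward the most probable word; higher ones flatten it, which is where the weirder lines come from.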
Training of the QRNN using awd-lstm was exited manually after 252 epochs:
-----------------------------------------------------------------------------------------
| end of epoch 252 | time: 515.44s | valid loss  4.79 | valid ppl   120.25
-----------------------------------------------------------------------------------------
Saving Averaged!
| epoch 253 |   200/ 1568 batches | lr 30.00 | ms/batch 307.76 | loss  4.29 | ppl    73.12
^C-----------------------------------------------------------------------------------------
Exiting from training early
=========================================================================================
| End of training | test loss  3.99 | test ppl    54.11
=========================================================================================
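The "ppl" column in the log is just the exponential of the cross-entropy loss, which is why a modest-looking drop in loss (4.79 valid vs 3.99 test) halves the perplexity:

```python
import math

def perplexity(loss):
    # Perplexity is e raised to the per-word cross-entropy loss.
    return math.exp(loss)

valid_ppl = perplexity(4.79)  # close to the logged 120.25 (loss is rounded)
test_ppl = perplexity(3.99)   # close to the logged 54.11
```

Loosely, a perplexity of ~54 means the model is, on average, about as uncertain as if it were choosing uniformly among 54 words at each step.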
Complete training terminal output: TRAINING__Mid March Corpus | Screencast 2018-03-08 21:10:55