Commit Graph

10 Commits

Author SHA1 Message Date
Robyn Speer
969a024dea actually use the results of language-detection on Reddit
Former-commit-id: 75a4a92110
2016-03-24 16:27:24 -04:00
Joshua Chin
eb9add9d71 removed unused scripts
Former-commit-id: 39f01b0485
2015-07-17 14:53:18 -04:00
Robyn Speer
deed2f767c remove wiki2tokens and tokenize_wikipedia
These components are no longer necessary. Wikipedia output can and
should be tokenized with the standard tokenizer, instead of the
almost-equivalent one in the Nim code.
2015-06-30 15:28:01 -04:00
Joshua Chin
e57a88b548 added pycld2 dependency
2015-06-16 15:06:22 -04:00
Robyn Speer
a46f1af4b8 fix dependency
2015-05-07 23:55:57 -04:00
Robyn Speer
a5f6113824 a reasonably complete build process
2015-05-07 19:38:33 -04:00
Robyn Speer
04bde8d617 WIP on more build steps
2015-05-07 16:49:53 -04:00
Robyn Speer
59409266ca add and adjust some build steps
- more build steps for Wikipedia
- rename 'tokenize_twitter' to 'pretokenize_twitter' to indicate that
  the results are preliminary
2015-05-05 13:59:21 -04:00
Robyn Speer
efcf436112 WIP on new build system
2015-04-30 16:24:28 -04:00
Robyn Speer
8b322ce534 Initial commit
2015-02-04 20:19:36 -05:00