whytepaper

Why so many projects? From an ecoinomist

As an economics and statistics student researching cryptocurrencies for my senior thesis, I've run into the problem that there are far more coins and projects out there than I will ever have time to research. So I used Python NLP libraries, file conversions, and a lot of dirty processing to distill terribly long whitepapers into key stats: sentiment scores, most popular words, how biased the text reads, average word length, average sentence length, and more!
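Here's a minimal sketch of the kind of stats extraction described above, assuming the whitepaper has already been converted to plain text. It uses TextBlob for sentiment and subjectivity and `collections.Counter` for word counts; the library choice, the `paper_stats` helper, and the `whitepaper.txt` filename are illustrative assumptions, not necessarily what this repo does under the hood.

```python
from collections import Counter

from textblob import TextBlob


def paper_stats(text: str, top_n: int = 10) -> dict:
    """Distill a whitepaper's text into a handful of summary statistics."""
    blob = TextBlob(text)
    words = [w.lower() for w in blob.words]
    sentences = blob.sentences

    return {
        # Polarity in [-1, 1]: how positive/negative the language is.
        "sentiment": blob.sentiment.polarity,
        # Subjectivity in [0, 1]: a rough proxy for how "biased" the text reads.
        "subjectivity": blob.sentiment.subjectivity,
        # Most frequent words (no stop-word filtering in this sketch).
        "top_words": Counter(words).most_common(top_n),
        "avg_word_length": sum(len(w) for w in words) / len(words),
        "avg_sentence_length": len(words) / len(sentences),
    }


if __name__ == "__main__":
    # Hypothetical pre-converted text file; PDF-to-text conversion happens upstream.
    with open("whitepaper.txt") as f:
        print(paper_stats(f.read()))
```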

From here, the next steps would be to do more visualization of this data to find the sweet spots of what a good paper's content should look like, using genuine academic papers that aren't trying to sell people anything as the benchmark. If it turns out that impostors either have overly simple sentence structure or are wildly verbose and full of BS, we can use these tools to skip reading those terrible projects and actually spend time with our Cryptokitties, as in the rough filter sketched below.
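As a toy example of that filtering idea, building on the `paper_stats` sketch above: the thresholds here are made-up placeholders, since the real cutoffs would come from benchmarking against genuine academic papers.

```python
def looks_like_bs(stats: dict,
                  min_sentence_len: float = 10.0,
                  max_sentence_len: float = 35.0,
                  max_subjectivity: float = 0.5) -> bool:
    """Flag a paper whose stats fall outside a plausible 'academic' range.

    The threshold values are hypothetical and only illustrate the approach.
    """
    too_simple = stats["avg_sentence_length"] < min_sentence_len
    too_verbose = stats["avg_sentence_length"] > max_sentence_len
    too_salesy = stats["subjectivity"] > max_subjectivity
    return too_simple or too_verbose or too_salesy
```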