Tokenizer is a tool that uses regular expressions to split a given string into tokens. What the hell is that good for, you might ask? Well, you can create your own languages! Tokenizer is used in Latte, for example.
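To make the idea concrete, here is a minimal sketch of the technique in Python (purely illustrative, not the Tokenizer library's API): each token type gets a regular expression, the patterns are combined into one alternation, and the input is scanned from left to right. The pattern names and the `tokenize` helper are assumptions for the example.

```python
import re

# Hypothetical token types and patterns for the illustration.
TOKEN_PATTERNS = {
    "number": r"\d+",
    "word": r"[a-zA-Z]+",
    "whitespace": r"\s+",
}

def tokenize(text):
    # Combine the patterns into one alternation with named groups.
    combined = "|".join(
        f"(?P<{name}>{pattern})" for name, pattern in TOKEN_PATTERNS.items()
    )
    tokens = []
    pos = 0
    for match in re.finditer(combined, text):
        if match.start() != pos:
            # A gap means some character matched no pattern at all.
            raise ValueError(f"unexpected character at offset {pos}")
        tokens.append((match.lastgroup, match.group()))
        pos = match.end()
    if pos != len(text):
        raise ValueError(f"unexpected character at offset {pos}")
    return tokens

print(tokenize("say 123"))
# → [('word', 'say'), ('whitespace', ' '), ('number', '123')]
```

A real tokenizer built this way typically also records each token's offset, so a later parsing stage can report precise error positions.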
To use this library, add it to your project with Composer:

composer require nette/tokenizer