Tokenizer

Tokenizing a text means segmenting it into manipulable linguistic units such as words, punctuation marks, and numbers. Each element corresponds to a token that will be useful for the analysis. One might think it is enough to detect the spaces between words, but it is not always that easy, especially in French!
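To make this concrete, here is a minimal sketch of a regex-based tokenizer in Python. It is illustrative only, not Lettria's implementation: the `tokenize` function and its pattern are assumptions made for this example. Note how a naive rule for elided forms like l' also splits Aujourd'hui in two, one of the pitfalls that makes French harder than simple whitespace splitting.

```python
import re

# A minimal regex-based tokenizer, for illustration only (not Lettria's
# implementation). Each alternative matches one kind of token:
TOKEN_PATTERN = re.compile(
    r"[a-zA-ZÀ-ÿ]+'"       # elided forms: l', d', qu', ...
    r"|[a-zA-ZÀ-ÿ]+"       # words, including accented letters
    r"|\d+(?:[.,]\d+)?"    # numbers, with , or . as decimal separator
    r"|[^\w\s]"            # any remaining punctuation mark
)

def tokenize(text: str) -> list[str]:
    """Split `text` into a list of tokens."""
    return TOKEN_PATTERN.findall(text)

print(tokenize("L'analyse coûte 3,50 euros !"))
# ["L'", 'analyse', 'coûte', '3,50', 'euros', '!']

# The naive elision rule also splits words that should stay whole:
print(tokenize("Aujourd'hui"))
# ["Aujourd'", 'hui']  <- should remain a single token
```

A production tokenizer would layer exception lists and context-sensitive rules on top of patterns like these, which is exactly why whitespace alone is not enough.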
