An implementation of n-gram language models with various smoothing techniques for natural-language text generation and analysis, including tokenization and perplexity calculation.
Updated Feb 26, 2025 - Python
A language model that generates text based on a given prompt.
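The components named above (n-gram counts, smoothing, and perplexity) fit together in a standard way. As a minimal sketch, not the repository's actual implementation, the following shows a bigram model with add-one (Laplace) smoothing and the perplexity of a held-out token sequence; all function names and the toy corpus are illustrative assumptions:

```python
import math
from collections import Counter

def train_bigram(tokens):
    """Collect unigram and bigram counts from a token sequence (illustrative)."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w1, w2, unigrams, bigrams, vocab_size):
    """P(w2 | w1) with add-one (Laplace) smoothing:
    (count(w1, w2) + 1) / (count(w1) + |V|)."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

def perplexity(tokens, unigrams, bigrams, vocab_size):
    """Perplexity = exp(-(1/N) * sum_i log P(w_i | w_{i-1}))."""
    log_sum = 0.0
    n = 0
    for w1, w2 in zip(tokens, tokens[1:]):
        log_sum += math.log(bigram_prob(w1, w2, unigrams, bigrams, vocab_size))
        n += 1
    return math.exp(-log_sum / n)

# Toy corpus; whitespace tokenization stands in for a real tokenizer.
train_tokens = "the cat sat on the mat".split()
uni, bi = train_bigram(train_tokens)
V = len(uni)  # vocabulary size

pp = perplexity("the cat sat".split(), uni, bi, V)
```

Lower perplexity means the model assigns higher probability to the held-out text; smoothing keeps unseen bigrams from producing zero probabilities (and hence infinite perplexity).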