• Tokenizer

    Origin

    tokenize + -er

    Full definition of tokenizer

    Noun

    tokenizer

    (plural tokenizers)
    1. (computing) A system that parses an input stream into its component tokens.
    © Wiktionary
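
    A minimal sketch of this definition in Python (the names and token categories below are hypothetical and illustrative, not taken from any particular library): a regular expression splits an input stream into its component tokens, tagging each with a type.

        import re

        # Token patterns; each named group becomes the token's type.
        # The categories here are illustrative only: a real tokenizer
        # defines whatever token types its grammar requires.
        TOKEN_SPEC = re.compile(r"""
            (?P<NUMBER>\d+(?:\.\d+)?)   # integer or decimal literal
          | (?P<WORD>[A-Za-z_]\w*)      # identifier-like word
          | (?P<PUNCT>[^\w\s])          # any single punctuation character
        """, re.VERBOSE)

        def tokenize(stream: str):
            """Yield (type, text) pairs for each token in the input stream."""
            for match in TOKEN_SPEC.finditer(stream):
                yield match.lastgroup, match.group()

        # Example: the stream "x = 42;" parses into four component tokens:
        # [('WORD', 'x'), ('PUNCT', '='), ('NUMBER', '42'), ('PUNCT', ';')]
        print(list(tokenize("x = 42;")))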