Description

Split `text` into (tokentype, text) pairs. If `context` is given, use this lexer context instead.
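
The wording and the `context` parameter read like the docstring of a Pygments-style `get_tokens_unprocessed(text, context=None)` lexer method. Below is a minimal usage sketch under that assumption; the choice of `PythonLexer` and the sample source are illustrative, not part of the original, and note that in current Pygments this method actually yields `(index, tokentype, value)` triples, with the `(tokentype, value)` portion being the pairs the description refers to.

```python
# A minimal sketch, assuming this documents Pygments'
# get_tokens_unprocessed(). PythonLexer here is an arbitrary
# concrete lexer chosen for illustration.
from pygments.lexers import PythonLexer

lexer = PythonLexer()
source = "x = 42\n"

# Each yielded item is (index, tokentype, value); the
# tokentype/value portion forms the (tokentype, text) pairs.
# The optional `context` argument (on ExtendedRegexLexer
# subclasses) lets a caller resume lexing from a saved
# lexer context instead of the default starting state.
for index, tokentype, value in lexer.get_tokens_unprocessed(source):
    print(index, tokentype, repr(value))
```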