func Split(s string) ([]string, error)
Split partitions a string into a slice of strings.
Lexer turns an input stream into a sequence of tokens. Whitespace and comments are skipped.
type Lexer Tokenizer
func NewLexer(r io.Reader) *Lexer
NewLexer creates a new lexer from an input stream.
func (l *Lexer) Next() (string, error)
Next returns the next word, or an error. When there are no more words, the error is io.EOF.
Token is a (type, value) pair representing a lexical token.
type Token struct {
// contains filtered or unexported fields
}
func (a *Token) Equal(b *Token) bool
Equal reports whether tokens a and b are equal. Two tokens are equal if both their types and values are equal. A nil token can never be equal to another token.
TokenType is a top-level token classification: a word, space, comment, or unknown.
type TokenType int
Classes of lexical token.
const (
	UnknownToken TokenType = iota
	WordToken
	SpaceToken
	CommentToken
)
Tokenizer turns an input stream into a sequence of typed tokens.
type Tokenizer struct {
// contains filtered or unexported fields
}
func NewTokenizer(r io.Reader) *Tokenizer
NewTokenizer creates a new tokenizer from an input stream.
func (t *Tokenizer) Next() (*Token, error)
Next returns the next token in the stream.