Lexer turns GraphQL request and schema strings into tokens.
type Lexer struct {
	*ast.Source
	// contains filtered or unexported fields
}
func New(src *ast.Source) Lexer
func (s *Lexer) ReadToken() (Token, error)
ReadToken reads the next token from the source, starting at the lexer's current position.
It skips over whitespace and comments until it finds the next lexable token, then lexes punctuators immediately or calls the appropriate helper function for more complicated tokens.
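A minimal sketch of driving the lexer: build an *ast.Source, create a Lexer with New, and call ReadToken in a loop until an EOF token or an error is returned. The import paths below assume the gqlparser/v2 module layout, and the query string is only illustrative; neither is named in the listing above.

	package main

	import (
		"fmt"

		"github.com/vektah/gqlparser/v2/ast"   // assumed import path
		"github.com/vektah/gqlparser/v2/lexer" // assumed import path
	)

	func main() {
		src := &ast.Source{Name: "query.graphql", Input: `query { hero { name } }`}
		lex := lexer.New(src)

		for {
			tok, err := lex.ReadToken()
			if err != nil {
				fmt.Println("lex error:", err)
				return
			}
			if tok.Kind == lexer.EOF {
				break
			}
			// Print the kind name, literal value, and source line of each token.
			fmt.Printf("%-8s %q (line %d)\n", tok.Kind.Name(), tok.Value, tok.Pos.Line)
		}
	}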
type Token struct {
	Kind  Type         // The token type.
	Value string       // The literal value consumed.
	Pos   ast.Position // The file and line this token was read from.
}
func (t Token) String() string
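For illustration, assuming tok is a Token returned by ReadToken, its fields can be inspected directly, while String gives a compact human-readable form (the exact output shown in the comments is an assumption):

	fmt.Println(tok.Kind.Name()) // e.g. "Name"
	fmt.Println(tok.Value)       // e.g. "hero"
	fmt.Println(tok.Pos.Line)    // e.g. 1
	fmt.Println(tok.String())    // a combined form such as: Name "hero"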
Type represents the kind of a token. The kinds are predefined as constants.
type Type int
const (
	Invalid Type = iota
	EOF
	Bang
	Dollar
	Amp
	ParenL
	ParenR
	Spread
	Colon
	Equals
	At
	BracketL
	BracketR
	BraceL
	BraceR
	Pipe
	Name
	Int
	Float
	String
	BlockString
	Comment
)
func (t Type) Name() string
func (t Type) String() string
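A sketch of dispatching on Token.Kind using the constants above. classify is a hypothetical helper, and the lexer import path is assumed as in the earlier example.

	// classify groups a token into a coarse category by comparing its Kind
	// against the predefined Type constants.
	func classify(tok lexer.Token) string {
		switch tok.Kind {
		case lexer.Int, lexer.Float:
			return "numeric literal"
		case lexer.String, lexer.BlockString:
			return "string literal"
		case lexer.Name:
			return "name: " + tok.Value
		case lexer.Comment:
			return "comment"
		case lexer.EOF:
			return "end of input"
		default:
			// Punctuators and anything else: report the kind's name.
			return tok.Kind.Name()
		}
	}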