...

Package lexer

import "github.com/vektah/gqlparser/v2/lexer"

type Lexer

Lexer turns GraphQL request and schema strings into tokens.

type Lexer struct {
    *ast.Source
    // contains filtered or unexported fields
}

func New

func New(src *ast.Source) Lexer

func (*Lexer) ReadToken

func (s *Lexer) ReadToken() (Token, error)

ReadToken gets the next token from the source starting at the given position.

This skips over whitespace and comments until it finds the next lexable token, then lexes punctuators immediately or calls the appropriate helper function for more complicated tokens.
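
A minimal sketch of driving the lexer by hand: build a Lexer with New and call ReadToken in a loop until it returns an EOF token or an error. The source name and query string below are illustrative.

package main

import (
    "fmt"

    "github.com/vektah/gqlparser/v2/ast"
    "github.com/vektah/gqlparser/v2/lexer"
)

func main() {
    src := &ast.Source{Name: "query.graphql", Input: `{ user(id: 4) { name } }`}
    lex := lexer.New(src)

    for {
        tok, err := lex.ReadToken()
        if err != nil {
            fmt.Println("lex error:", err)
            return
        }
        if tok.Kind == lexer.EOF {
            break
        }
        // Print each token's kind and the literal text it consumed.
        fmt.Printf("%-12s %q\n", tok.Kind.Name(), tok.Value)
    }
}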

type Token

type Token struct {
    Kind  Type         // The token type.
    Value string       // The literal value consumed.
    Pos   ast.Position // The file and line this token was read from.
}
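
Pos makes positional diagnostics straightforward. A sketch of a hypothetical helper, reusing the fmt and lexer imports from the example above and assuming ast.Position carries Line and Column fields:

// expectName reads the next token and reports where an unexpected
// token was found. The error format is illustrative only.
func expectName(lex *lexer.Lexer) (lexer.Token, error) {
    tok, err := lex.ReadToken()
    if err != nil {
        return tok, err
    }
    if tok.Kind != lexer.Name {
        return tok, fmt.Errorf("line %d, column %d: expected a name, got %s",
            tok.Pos.Line, tok.Pos.Column, tok.Kind.Name())
    }
    return tok, nil
}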

func (Token) String

func (t Token) String() string

type Type

Type represents a kind of token. The kinds are predefined as constants.

type Type int

const (
    Invalid Type = iota
    EOF
    Bang
    Dollar
    Amp
    ParenL
    ParenR
    Spread
    Colon
    Equals
    At
    BracketL
    BracketR
    BraceL
    BraceR
    Pipe
    Name
    Int
    Float
    String
    BlockString
    Comment
)

func (Type) Name

func (t Type) Name() string

func (Type) String

func (t Type) String() string
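
Name and String both render a Type for diagnostics: Name is typically the constant's identifier (a stable key for logs or tests), while String is a human-readable rendering. The exact strings are defined by the package and not asserted here; the fragment below, reusing the fmt and lexer imports from the first sketch, simply prints both forms for a few kinds.

// Compare the two renderings; the output depends on the package's own tables.
for _, k := range []lexer.Type{lexer.Bang, lexer.Spread, lexer.Name, lexer.Int} {
    fmt.Printf("Name=%q  String=%q\n", k.Name(), k.String())
}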