Lexer is a JSON lexer: it iterates over JSON tokens in a byte slice.
type Lexer struct {
    Data              []byte // Input data given to the lexer.
    UseMultipleErrors bool   // If we want to use multiple errors.
    // contains filtered or unexported fields
}
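A minimal usage sketch, assuming this is easyjson's jlexer package imported from github.com/mailru/easyjson/jlexer (the import path is an assumption, not stated in this listing): a Lexer is constructed by filling the Data field directly, values are read with the typed accessors below, and Error reports any failure encountered along the way.

    package main

    import (
        "fmt"

        "github.com/mailru/easyjson/jlexer" // assumed import path
    )

    func main() {
        // Point the lexer at the raw JSON input; no constructor is needed.
        l := jlexer.Lexer{Data: []byte(`"hello"`)}

        s := l.String() // reads the string literal
        if err := l.Error(); err != nil {
            panic(err)
        }
        fmt.Println(s) // hello
    }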
func (r *Lexer) AddError(e error)
func (r *Lexer) AddNonFatalError(e error)
func (r *Lexer) Bool() bool
Bool reads a true or false boolean keyword.
func (r *Lexer) Bytes() []byte
Bytes reads a string literal and base64 decodes it into a byte slice.
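For example, a sketch (assuming the jlexer import from the example above, and standard base64 encoding of the input):

    l := jlexer.Lexer{Data: []byte(`"aGVsbG8="`)}
    b := l.Bytes() // []byte("hello"), base64-decoded
    _ = b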
func (r *Lexer) Consumed()
Consumed reads all remaining bytes from the input, publishing an error if there is anything but whitespace remaining.
func (r *Lexer) Delim(c byte)
Delim consumes a token and verifies that it is the given delimiter.
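Delim and IsDelim are typically paired to walk a composite value. A sketch that reads a JSON array of integers (again assuming the jlexer import from above):

    l := jlexer.Lexer{Data: []byte(`[1, 2, 3]`)}
    var nums []int
    l.Delim('[')          // consume the opening bracket
    for !l.IsDelim(']') { // stop once the closing bracket is next
        nums = append(nums, l.Int())
        l.WantComma()     // require a comma before the next element
    }
    l.Delim(']')          // consume the closing bracket
    if err := l.Error(); err != nil {
        // handle malformed input
    }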
func (r *Lexer) Error() error
func (r *Lexer) FetchToken()
FetchToken scans the input for the next token.
func (r *Lexer) Float32() float32
func (r *Lexer) Float32Str() float32
func (r *Lexer) Float64() float64
func (r *Lexer) Float64Str() float64
func (r *Lexer) GetNonFatalErrors() []*LexerError
func (r *Lexer) GetPos() int
func (r *Lexer) Int() int
func (r *Lexer) Int16() int16
func (r *Lexer) Int16Str() int16
func (r *Lexer) Int32() int32
func (r *Lexer) Int32Str() int32
func (r *Lexer) Int64() int64
func (r *Lexer) Int64Str() int64
func (r *Lexer) Int8() int8
func (r *Lexer) Int8Str() int8
func (r *Lexer) IntStr() int
func (r *Lexer) Interface() interface{}
Interface fetches the next value as an interface{}, analogous to how the 'encoding/json' package decodes into an interface{}.
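A sketch (same assumed import): objects come back as map[string]interface{} and arrays as []interface{}, mirroring encoding/json's behavior for untyped decoding.

    l := jlexer.Lexer{Data: []byte(`{"id": 1, "tags": ["a", "b"]}`)}
    v := l.Interface() // map[string]interface{} with a nested []interface{}
    _ = v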
func (r *Lexer) IsDelim(c byte) bool
IsDelim returns true if there was no scanning error and the next token is the given delimiter.
func (r *Lexer) IsNull() bool
IsNull returns true if the next token is a null keyword.
func (r *Lexer) IsStart() bool
IsStart returns whether the lexer is positioned at the start of an input string.
func (r *Lexer) JsonNumber() json.Number
JsonNumber fetches a json.Number from the 'encoding/json' package. Int, float, and string tokens are all valid values.
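A sketch (same assumed import), deferring numeric conversion exactly as encoding/json does with json.Number:

    l := jlexer.Lexer{Data: []byte(`12.5`)}
    n := l.JsonNumber()   // json.Number("12.5")
    f, err := n.Float64() // convert lazily when the caller needs a float
    _, _ = f, err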
func (r *Lexer) Null()
Null verifies that the next token is null and consumes it.
func (r *Lexer) Ok() bool
Ok returns true if no error (including io.EOF) was encountered during scanning.
func (r *Lexer) Raw() []byte
Raw fetches the next item recursively as a data slice.
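Raw is convenient for capturing a subtree unparsed, much like json.RawMessage. A sketch (same assumed import; the exact whitespace of the returned slice follows the input):

    l := jlexer.Lexer{Data: []byte(`{"config": {"retries": 3}}`)}
    l.Delim('{')
    _ = l.UnsafeFieldName(false) // "config"
    l.WantColon()
    raw := l.Raw() // the nested object's bytes, left unparsed
    l.WantComma()
    l.Delim('}')
    _ = raw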
func (r *Lexer) Skip()
Skip skips a single token.
func (r *Lexer) SkipRecursive()
SkipRecursive skips the next array or object completely, or just skips a single token if it is not an array/object.
Note: no syntax validation is performed on the skipped data.
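A sketch (same assumed import) that jumps over a nested value the caller does not need:

    l := jlexer.Lexer{Data: []byte(`[{"ignored": [1, 2, 3]}, 42]`)}
    l.Delim('[')
    l.SkipRecursive() // skips the entire nested object in one call
    l.WantComma()
    answer := l.Int() // 42
    l.Delim(']')
    _ = answer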
func (r *Lexer) String() string
String reads a string literal.
func (r *Lexer) StringIntern() string
StringIntern reads a string literal, and performs string interning on it.
func (r *Lexer) Uint() uint
func (r *Lexer) Uint16() uint16
func (r *Lexer) Uint16Str() uint16
func (r *Lexer) Uint32() uint32
func (r *Lexer) Uint32Str() uint32
func (r *Lexer) Uint64() uint64
func (r *Lexer) Uint64Str() uint64
func (r *Lexer) Uint8() uint8
func (r *Lexer) Uint8Str() uint8
func (r *Lexer) UintStr() uint
func (r *Lexer) UintptrStr() uintptr
func (r *Lexer) UnsafeBytes() []byte
UnsafeBytes returns the byte slice if the token is a string literal.
func (r *Lexer) UnsafeFieldName(skipUnescape bool) string
UnsafeFieldName returns the current member name string token.
func (r *Lexer) UnsafeString() string
UnsafeString returns the string value if the token is a string literal.
Warning: the returned string may point into the input buffer, so it should not outlive the input buffer. The intended usage pattern is as an argument to a switch statement.
func (r *Lexer) WantColon()
WantColon requires a colon to be present before fetching next token.
func (r *Lexer) WantComma()
WantComma requires a comma to be present before fetching next token.
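Putting the pieces together, a hedged sketch of the field-dispatch loop that easyjson-style generated code builds from these primitives (the user struct and its field names are illustrative only; same assumed import):

    type user struct {
        ID   int
        Name string
    }

    func unmarshalUser(data []byte) (user, error) {
        var u user
        l := jlexer.Lexer{Data: data}
        l.Delim('{')
        for !l.IsDelim('}') {
            key := l.UnsafeFieldName(false) // valid only until the next token is fetched
            l.WantColon()
            switch key {
            case "id":
                u.ID = l.Int()
            case "name":
                u.Name = l.String()
            default:
                l.SkipRecursive() // ignore unknown members
            }
            l.WantComma()
        }
        l.Delim('}')
        l.Consumed() // records an error unless only whitespace remains
        return u, l.Error()
    }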
LexerError implements the error interface and represents all possible errors that can occur while parsing JSON data.
type LexerError struct {
    Reason string
    Offset int
    Data   string
}
func (l *LexerError) Error() string