package chroma
Import Path
github.com/alecthomas/chroma/v2 (on go.dev)
Dependency Relation
imports 19 packages and is imported by 7 packages
Package chroma takes source code and other structured text and converts it into syntax highlighted HTML, ANSI-coloured text, etc.
Chroma is based heavily on Pygments, and includes translators for Pygments lexers and styles.
For more information, see https://github.com/alecthomas/chroma
Involved Source Files
coalesce.go
colour.go
delegate.go
emitters.go
formatter.go
iterator.go
lexer.go
mutators.go
regexp.go
registry.go
remap.go
serialise.go
style.go
tokentype_enumer.go
types.go
Package-Level Type Names (total 36)
AnalyseConfig defines the list of regex analysers.
If true, the first matching score is returned.
Regexes []RegexConfig
Analyser determines how appropriate this lexer is for the given text.
( Analyser) AnalyseText(text string) float32
Lexer (interface)
*RegexLexer
Colour represents an RGB colour.
Blue component of colour.
Brighten returns a copy of this colour with its brightness adjusted.
If factor is negative, the colour is darkened.
Uses approach described here (http://www.pvladov.com/2012/09/make-color-lighter-or-darker.html).
BrightenOrDarken brightens a colour if it is < 0.5 brightness or darkens if > 0.5 brightness.
Brightness of the colour (roughly) in the range 0.0 to 1.0.
ClampBrightness returns a copy of this colour with its brightness adjusted such that
it falls within the range [min, max] (or very close to it due to rounding errors).
The supplied values use the same [0.0, 1.0] range as Brightness.
Distance between this colour and another.
This uses the approach described here (https://www.compuphase.com/cmetric.htm).
This is not as accurate as LAB, et al., but is *vastly* simpler and sufficient for our needs.
( Colour) GoString() string
Green component of colour.
IsSet returns true if the colour is set.
Red component of colour.
( Colour) String() string
Colour : expvar.Var
Colour : fmt.GoStringer
Colour : fmt.Stringer
func MustParseColour(colour string) Colour
func NewColour(r, g, b uint8) Colour
func ParseColour(colour string) Colour
func Colour.Brighten(factor float64) Colour
func Colour.BrightenOrDarken(factor float64) Colour
func Colour.ClampBrightness(min, max float64) Colour
func Colour.Distance(e2 Colour) float64
Colours is an orderable set of colours.
( Colours) Len() int
( Colours) Less(i, j int) bool
( Colours) Swap(i, j int)
Colours : sort.Interface
A CompiledRule is a Rule with a pre-compiled regex.
Note that regular expressions are lazily compiled on first use of the lexer.
Regexp *regexp2.Regexp
Rule Rule
Rule.Mutator Mutator
Rule.Pattern string
Rule.Type Emitter
( CompiledRule) MarshalXML(e *xml.Encoder, _ xml.StartElement) error
(*CompiledRule) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error
CompiledRule : encoding/xml.Marshaler
*CompiledRule : encoding/xml.Unmarshaler
CompiledRules maps a state name to the sequence of compiled rules in that state.
func LexerMutator.MutateLexer(rules CompiledRules, state string, rule int) error
Config for a lexer.
Secondary file name globs
Shortcuts for the lexer
Analyse is a list of regexes to match against the input.
If a match is found, the score is returned immediately if the single attribute is set to true;
otherwise the sum of the scores of all matching patterns is
used as the final score.
Regex matching is case-insensitive.
Regex matches all characters.
Make sure that the input ends with a newline. This
is required for some lexers that consume input linewise.
File name globs
MIME types
Name of the lexer.
Regex does not match across lines ($ matches EOL).
Defaults to multiline.
Priority of lexer.
If this is 0 it will be treated as a default of 1.
func Lexer.Config() *Config
func (*RegexLexer).Config() *Config
func MustNewLexer(config *Config, rules func() Rules) *RegexLexer
func NewLexer(config *Config, rulesFunc func() Rules) (*RegexLexer, error)
func (*RegexLexer).SetConfig(config *Config) *RegexLexer
An Emitter takes group matches and returns tokens.
Emit tokens for the given regex groups.
EmitterFunc
SerialisableEmitter (interface)
TokenType
func ByGroupNames(emitters map[string]Emitter) Emitter
func ByGroups(emitters ...Emitter) Emitter
func Using(lexer string) Emitter
func UsingByGroup(sublexerNameGroup, codeGroup int, emitters ...Emitter) Emitter
func UsingLexer(lexer Lexer) Emitter
func UsingSelf(stateName string) Emitter
func ByGroupNames(emitters map[string]Emitter) Emitter
func ByGroups(emitters ...Emitter) Emitter
func UsingByGroup(sublexerNameGroup, codeGroup int, emitters ...Emitter) Emitter
EmitterFunc is a function that is an Emitter.
Emit tokens for groups.
EmitterFunc : Emitter
( Emitters) MarshalXML(e *xml.Encoder, start xml.StartElement) error
(*Emitters) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error
Emitters : encoding/xml.Marshaler
*Emitters : encoding/xml.Unmarshaler
A Formatter for Chroma lexers.
Format returns a formatting function for tokens.
If the iterator panics, the Formatter should recover.
FormatterFunc
*github.com/alecthomas/chroma/v2/formatters/html.Formatter
*github.com/alecthomas/chroma/v2/formatters/svg.Formatter
func RecoveringFormatter(formatter Formatter) Formatter
func github.com/alecthomas/chroma/v2/formatters.Get(name string) Formatter
func github.com/alecthomas/chroma/v2/formatters.Register(name string, formatter Formatter) Formatter
func RecoveringFormatter(formatter Formatter) Formatter
func github.com/alecthomas/chroma/v2/formatters.Register(name string, formatter Formatter) Formatter
var github.com/alecthomas/chroma/v2/formatters.Fallback
var github.com/alecthomas/chroma/v2/formatters.JSON
var github.com/alecthomas/chroma/v2/formatters.NoOp
var github.com/alecthomas/chroma/v2/formatters.SVG
var github.com/alecthomas/chroma/v2/formatters.Tokens
var github.com/alecthomas/chroma/v2/formatters.TTY
var github.com/alecthomas/chroma/v2/formatters.TTY16
var github.com/alecthomas/chroma/v2/formatters.TTY16m
var github.com/alecthomas/chroma/v2/formatters.TTY256
var github.com/alecthomas/chroma/v2/formatters.TTY8
A FormatterFunc is a Formatter implemented as a function.
Guards against iterator panics.
( FormatterFunc) Format(w io.Writer, s *Style, it Iterator) (err error)
FormatterFunc : Formatter
An Iterator across tokens.
EOF will be returned at the end of the Token stream.
If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover.
Tokens consumes all tokens from the iterator and returns them as a slice.
func Concaterator(iterators ...Iterator) Iterator
func Literator(tokens ...Token) Iterator
func Emitter.Emit(groups []string, state *LexerState) Iterator
func EmitterFunc.Emit(groups []string, state *LexerState) Iterator
func Lexer.Tokenise(options *TokeniseOptions, text string) (Iterator, error)
func (*RegexLexer).Tokenise(options *TokeniseOptions, text string) (Iterator, error)
func SerialisableEmitter.Emit(groups []string, state *LexerState) Iterator
func TokenType.Emit(groups []string, _ *LexerState) Iterator
func Concaterator(iterators ...Iterator) Iterator
func Formatter.Format(w io.Writer, style *Style, iterator Iterator) error
func FormatterFunc.Format(w io.Writer, s *Style, it Iterator) (err error)
func github.com/alecthomas/chroma/v2/formatters/html.(*Formatter).Format(w io.Writer, style *Style, iterator Iterator) (err error)
func github.com/alecthomas/chroma/v2/formatters/svg.(*Formatter).Format(w io.Writer, style *Style, iterator Iterator) (err error)
A Lexer for tokenising source code.
AnalyseText scores how likely a fragment of text is to match
this lexer, between 0.0 and 1.0. A value of 1 indicates high confidence.
Config describing the features of the Lexer.
SetAnalyser sets a function the Lexer should use for scoring how
likely a fragment of text is to match this lexer, between 0.0 and 1.0.
A value of 1 indicates high confidence.
Lexers may ignore this if they implement their own analysers.
SetRegistry sets the registry this Lexer is associated with.
The registry should be used by the Lexer if it needs to look up other
lexers.
Tokenise returns an Iterator over tokens in text.
*RegexLexer
Lexer : Analyser
func Coalesce(lexer Lexer) Lexer
func DelegatingLexer(root Lexer, language Lexer) Lexer
func RemappingLexer(lexer Lexer, mapper func(Token) []Token) Lexer
func TypeRemappingLexer(lexer Lexer, mapping TypeMapping) Lexer
func Lexer.SetAnalyser(analyser func(text string) float32) Lexer
func Lexer.SetRegistry(registry *LexerRegistry) Lexer
func (*LexerRegistry).Analyse(text string) Lexer
func (*LexerRegistry).Get(name string) Lexer
func (*LexerRegistry).Match(filename string) Lexer
func (*LexerRegistry).MatchMimeType(mimeType string) Lexer
func (*LexerRegistry).Register(lexer Lexer) Lexer
func (*RegexLexer).SetAnalyser(analyser func(text string) float32) Lexer
func (*RegexLexer).SetRegistry(registry *LexerRegistry) Lexer
func github.com/alecthomas/chroma/v2/lexers.Analyse(text string) Lexer
func github.com/alecthomas/chroma/v2/lexers.Get(name string) Lexer
func github.com/alecthomas/chroma/v2/lexers.Match(filename string) Lexer
func github.com/alecthomas/chroma/v2/lexers.MatchMimeType(mimeType string) Lexer
func github.com/alecthomas/chroma/v2/lexers.Register(lexer Lexer) Lexer
func Coalesce(lexer Lexer) Lexer
func DelegatingLexer(root Lexer, language Lexer) Lexer
func DelegatingLexer(root Lexer, language Lexer) Lexer
func RemappingLexer(lexer Lexer, mapper func(Token) []Token) Lexer
func Tokenise(lexer Lexer, options *TokeniseOptions, text string) ([]Token, error)
func TypeRemappingLexer(lexer Lexer, mapping TypeMapping) Lexer
func UsingLexer(lexer Lexer) Emitter
func (*LexerRegistry).Register(lexer Lexer) Lexer
func github.com/alecthomas/chroma/v2/lexers.Register(lexer Lexer) Lexer
var github.com/alecthomas/chroma/v2/lexers.Caddyfile
var github.com/alecthomas/chroma/v2/lexers.CaddyfileDirectives
var github.com/alecthomas/chroma/v2/lexers.CommonLisp
var github.com/alecthomas/chroma/v2/lexers.EmacsLisp
var github.com/alecthomas/chroma/v2/lexers.Fallback
var github.com/alecthomas/chroma/v2/lexers.Genshi
var github.com/alecthomas/chroma/v2/lexers.GenshiHTMLTemplate
var github.com/alecthomas/chroma/v2/lexers.GenshiText
var github.com/alecthomas/chroma/v2/lexers.Go
var github.com/alecthomas/chroma/v2/lexers.GoHTMLTemplate
var github.com/alecthomas/chroma/v2/lexers.GoTextTemplate
var github.com/alecthomas/chroma/v2/lexers.Haxe
var github.com/alecthomas/chroma/v2/lexers.HTTP
var github.com/alecthomas/chroma/v2/lexers.Markdown
var github.com/alecthomas/chroma/v2/lexers.Raku
var github.com/alecthomas/chroma/v2/lexers.Restructuredtext
var github.com/alecthomas/chroma/v2/lexers.Svelte
var github.com/alecthomas/chroma/v2/lexers.Typoscript
A LexerMutator is an additional interface that a Mutator can implement
to modify the lexer when it is compiled.
MutateLexer can be implemented to mutate the lexer itself.
Rules are the lexer rules, state is the state key for the rule the mutator is associated with.
LexerRegistry is a registry of Lexers.
Lexers Lexers
Analyse text content and return the "best" lexer.
Get a Lexer by name, alias or file extension.
Match returns the first lexer matching filename.
Note that this iterates over all file patterns in all lexers, so is not fast.
MatchMimeType attempts to find a lexer for the given MIME type.
Names of all lexers, optionally including aliases.
Register a Lexer with the LexerRegistry. If the lexer is already registered
it will be replaced.
func NewLexerRegistry() *LexerRegistry
func Lexer.SetRegistry(registry *LexerRegistry) Lexer
func (*RegexLexer).SetRegistry(registry *LexerRegistry) Lexer
var github.com/alecthomas/chroma/v2/lexers.GlobalLexerRegistry *LexerRegistry
Lexers is a slice of lexers sortable by name.
( Lexers) Len() int
( Lexers) Less(i, j int) bool
( Lexers) Swap(i, j int)
Lexers : sort.Interface
LexerState contains the state for a single lex.
Group matches.
Lexer *RegexLexer
Custom context for mutators.
Named Group matches.
Pos int
Registry *LexerRegistry
Rule int
Rules CompiledRules
Stack []string
State string
Text []rune
Get mutator context.
Iterator returns the next Token from the lexer.
Set mutator context.
func Emitter.Emit(groups []string, state *LexerState) Iterator
func EmitterFunc.Emit(groups []string, state *LexerState) Iterator
func Mutator.Mutate(state *LexerState) error
func MutatorFunc.Mutate(state *LexerState) error
func SerialisableEmitter.Emit(groups []string, state *LexerState) Iterator
func SerialisableMutator.Mutate(state *LexerState) error
func TokenType.Emit(groups []string, _ *LexerState) Iterator
A Mutator modifies the behaviour of the lexer.
Mutate the lexer state machine as it is processing.
MutatorFunc
SerialisableMutator (interface)
func Combined(states ...string) Mutator
func Mutators(modifiers ...Mutator) Mutator
func Pop(n int) Mutator
func Push(states ...string) Mutator
func Default(mutators ...Mutator) Rule
func Mutators(modifiers ...Mutator) Mutator
A MutatorFunc is a Mutator that mutates the lexer state machine as it is processing.
( MutatorFunc) Mutate(state *LexerState) error
MutatorFunc : Mutator
PrioritisedLexers is a slice of lexers sortable by priority.
( PrioritisedLexers) Len() int
( PrioritisedLexers) Less(i, j int) bool
( PrioritisedLexers) Swap(i, j int)
PrioritisedLexers : sort.Interface
RegexConfig defines a single regex pattern and its score in case of match.
Pattern string
Score float32
RegexLexer is the default lexer implementation used in Chroma.
AnalyseText scores how likely a fragment of text is to match this lexer, between 0.0 and 1.0.
Config returns the Config for this Lexer.
MustRules is like Rules() but will panic on error.
Rules in the Lexer.
SetAnalyser sets the analyser function used to perform content inspection.
SetConfig replaces the Config for this Lexer.
SetRegistry sets the registry the lexer will use to look up other lexers if necessary.
(*RegexLexer) String() string
Tokenise text using lexer, returning an iterator.
Trace enables debug tracing.
*RegexLexer : Analyser
*RegexLexer : Lexer
*RegexLexer : expvar.Var
*RegexLexer : fmt.Stringer
func MustNewLexer(config *Config, rules func() Rules) *RegexLexer
func MustNewXMLLexer(from fs.FS, path string) *RegexLexer
func NewLexer(config *Config, rulesFunc func() Rules) (*RegexLexer, error)
func NewXMLLexer(from fs.FS, path string) (*RegexLexer, error)
func Unmarshal(data []byte) (*RegexLexer, error)
func (*RegexLexer).SetConfig(config *Config) *RegexLexer
func (*RegexLexer).Trace(trace bool) *RegexLexer
func Marshal(l *RegexLexer) ([]byte, error)
var github.com/alecthomas/chroma/v2/lexers.HTML *RegexLexer
A Rule is the fundamental matching unit of the Regex lexer state machine.
Mutator Mutator
Pattern string
Type Emitter
( Rule) MarshalXML(e *xml.Encoder, _ xml.StartElement) error
(*Rule) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error
Rule : encoding/xml.Marshaler
*Rule : encoding/xml.Unmarshaler
func Default(mutators ...Mutator) Rule
func Include(state string) Rule
Rules maps from state to a sequence of Rules.
Clone returns a clone of the Rules.
( Rules) MarshalXML(e *xml.Encoder, _ xml.StartElement) error
Merge creates a clone of "r" then merges "rules" into the clone.
Rename returns a clone of the Rules with the state oldRule renamed to newRule.
(*Rules) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error
Rules : encoding/xml.Marshaler
*Rules : encoding/xml.Unmarshaler
func (*RegexLexer).MustRules() Rules
func (*RegexLexer).Rules() (Rules, error)
func Rules.Clone() Rules
func Rules.Merge(rules Rules) Rules
func Rules.Rename(oldRule, newRule string) Rules
func github.com/alecthomas/chroma/v2/lexers.PlaintextRules() Rules
func Rules.Merge(rules Rules) Rules
SerialisableEmitter is an Emitter that can be serialised and deserialised to/from JSON.
Emit tokens for the given regex groups.
( SerialisableEmitter) EmitterKind() string
TokenType
SerialisableEmitter : Emitter
SerialisableMutator is a Mutator that can be serialised and deserialised.
Mutate the lexer state machine as it is processing.
( SerialisableMutator) MutatorKind() string
SerialisableMutator : Mutator
A Style definition.
See http://pygments.org/docs/styles/ for details. Semantics are intended to be identical.
Name string
Builder creates a mutable builder from this Style.
The builder can then be safely modified. This is a cheap operation.
Get a style entry. Will try sub-category or category if an exact match is not found, and
finally return the Background.
Has checks if an exact style entry match exists for a token type.
This is distinct from Get() which will merge parent tokens.
(*Style) MarshalXML(e *xml.Encoder, start xml.StartElement) error
Types that are styled.
(*Style) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error
*Style : encoding/xml.Marshaler
*Style : encoding/xml.Unmarshaler
func MustNewStyle(name string, entries StyleEntries) *Style
func MustNewXMLStyle(r io.Reader) *Style
func NewStyle(name string, entries StyleEntries) (*Style, error)
func NewXMLStyle(r io.Reader) (*Style, error)
func (*StyleBuilder).Build() (*Style, error)
func github.com/alecthomas/chroma/v2/styles.Get(name string) *Style
func github.com/alecthomas/chroma/v2/styles.Register(style *Style) *Style
func Formatter.Format(w io.Writer, style *Style, iterator Iterator) error
func FormatterFunc.Format(w io.Writer, s *Style, it Iterator) (err error)
func github.com/alecthomas/chroma/v2/formatters/html.(*Formatter).Format(w io.Writer, style *Style, iterator Iterator) (err error)
func github.com/alecthomas/chroma/v2/formatters/html.(*Formatter).WriteCSS(w io.Writer, style *Style) error
func github.com/alecthomas/chroma/v2/formatters/svg.(*Formatter).Format(w io.Writer, style *Style, iterator Iterator) (err error)
func github.com/alecthomas/chroma/v2/styles.Register(style *Style) *Style
var github.com/alecthomas/chroma/v2/styles.Abap *Style
var github.com/alecthomas/chroma/v2/styles.Algol *Style
var github.com/alecthomas/chroma/v2/styles.AlgolNu *Style
var github.com/alecthomas/chroma/v2/styles.Arduino *Style
var github.com/alecthomas/chroma/v2/styles.Autumn *Style
var github.com/alecthomas/chroma/v2/styles.Average *Style
var github.com/alecthomas/chroma/v2/styles.Base16Snazzy *Style
var github.com/alecthomas/chroma/v2/styles.BlackWhite *Style
var github.com/alecthomas/chroma/v2/styles.Borland *Style
var github.com/alecthomas/chroma/v2/styles.CatppuccinFrappe *Style
var github.com/alecthomas/chroma/v2/styles.CatppuccinLatte *Style
var github.com/alecthomas/chroma/v2/styles.CatppuccinMacchiato *Style
var github.com/alecthomas/chroma/v2/styles.CatppuccinMocha *Style
var github.com/alecthomas/chroma/v2/styles.Colorful *Style
var github.com/alecthomas/chroma/v2/styles.DoomOne *Style
var github.com/alecthomas/chroma/v2/styles.DoomOne2 *Style
var github.com/alecthomas/chroma/v2/styles.Dracula *Style
var github.com/alecthomas/chroma/v2/styles.Emacs *Style
var github.com/alecthomas/chroma/v2/styles.Fallback *Style
var github.com/alecthomas/chroma/v2/styles.Friendly *Style
var github.com/alecthomas/chroma/v2/styles.Fruity *Style
var github.com/alecthomas/chroma/v2/styles.GitHub *Style
var github.com/alecthomas/chroma/v2/styles.GitHubDark *Style
var github.com/alecthomas/chroma/v2/styles.Gruvbox *Style
var github.com/alecthomas/chroma/v2/styles.GruvboxLight *Style
var github.com/alecthomas/chroma/v2/styles.HrDark *Style
var github.com/alecthomas/chroma/v2/styles.HrHighContrast *Style
var github.com/alecthomas/chroma/v2/styles.Igor *Style
var github.com/alecthomas/chroma/v2/styles.Lovelace *Style
var github.com/alecthomas/chroma/v2/styles.Manni *Style
var github.com/alecthomas/chroma/v2/styles.ModusOperandi *Style
var github.com/alecthomas/chroma/v2/styles.ModusVivendi *Style
var github.com/alecthomas/chroma/v2/styles.Monokai *Style
var github.com/alecthomas/chroma/v2/styles.MonokaiLight *Style
var github.com/alecthomas/chroma/v2/styles.Murphy *Style
var github.com/alecthomas/chroma/v2/styles.Native *Style
var github.com/alecthomas/chroma/v2/styles.Nord *Style
var github.com/alecthomas/chroma/v2/styles.OnesEnterprise *Style
var github.com/alecthomas/chroma/v2/styles.ParaisoDark *Style
var github.com/alecthomas/chroma/v2/styles.ParaisoLight *Style
var github.com/alecthomas/chroma/v2/styles.Pastie *Style
var github.com/alecthomas/chroma/v2/styles.Perldoc *Style
var github.com/alecthomas/chroma/v2/styles.Pygments *Style
var github.com/alecthomas/chroma/v2/styles.RainbowDash *Style
var github.com/alecthomas/chroma/v2/styles.RosePine *Style
var github.com/alecthomas/chroma/v2/styles.RosePineDawn *Style
var github.com/alecthomas/chroma/v2/styles.RosePineMoon *Style
var github.com/alecthomas/chroma/v2/styles.Rrt *Style
var github.com/alecthomas/chroma/v2/styles.SolarizedDark *Style
var github.com/alecthomas/chroma/v2/styles.SolarizedDark256 *Style
var github.com/alecthomas/chroma/v2/styles.SolarizedLight *Style
var github.com/alecthomas/chroma/v2/styles.SwapOff *Style
var github.com/alecthomas/chroma/v2/styles.Tango *Style
var github.com/alecthomas/chroma/v2/styles.Trac *Style
var github.com/alecthomas/chroma/v2/styles.Vim *Style
var github.com/alecthomas/chroma/v2/styles.VisualStudio *Style
var github.com/alecthomas/chroma/v2/styles.Vulcan *Style
var github.com/alecthomas/chroma/v2/styles.WitchHazel *Style
var github.com/alecthomas/chroma/v2/styles.Xcode *Style
var github.com/alecthomas/chroma/v2/styles.XcodeDark *Style
A StyleBuilder is a mutable structure for building styles.
Once built, a Style is immutable.
Add an entry to the Style map.
See http://pygments.org/docs/styles/#style-rules for details.
(*StyleBuilder) AddAll(entries StyleEntries) *StyleBuilder
(*StyleBuilder) AddEntry(ttype TokenType, entry StyleEntry) *StyleBuilder
(*StyleBuilder) Build() (*Style, error)
(*StyleBuilder) Get(ttype TokenType) StyleEntry
Transform passes each style entry currently defined in the builder to the supplied
function and saves the returned value. This can be used to adjust a style's colours;
see Colour's ClampBrightness function, for example.
func NewStyleBuilder(name string) *StyleBuilder
func (*Style).Builder() *StyleBuilder
func (*StyleBuilder).Add(ttype TokenType, entry string) *StyleBuilder
func (*StyleBuilder).AddAll(entries StyleEntries) *StyleBuilder
func (*StyleBuilder).AddEntry(ttype TokenType, entry StyleEntry) *StyleBuilder
func (*StyleBuilder).Transform(transform func(StyleEntry) StyleEntry) *StyleBuilder
StyleEntries mapping TokenType to colour definition.
func MustNewStyle(name string, entries StyleEntries) *Style
func NewStyle(name string, entries StyleEntries) (*Style, error)
func (*StyleBuilder).AddAll(entries StyleEntries) *StyleBuilder
A StyleEntry in the Style map.
Background Colour
Bold Trilean
Border Colour
Hex colours.
Italic Trilean
NoInherit bool
Underline Trilean
Inherit styles from ancestors.
Ancestors should be provided from oldest to newest.
( StyleEntry) IsZero() bool
( StyleEntry) MarshalText() ([]byte, error)
( StyleEntry) String() string
Sub subtracts e from s where elements match.
StyleEntry : encoding.TextMarshaler
StyleEntry : expvar.Var
StyleEntry : fmt.Stringer
StyleEntry : gopkg.in/yaml.v3.IsZeroer
func MustParseStyleEntry(entry string) StyleEntry
func ParseStyleEntry(entry string) (StyleEntry, error)
func (*Style).Get(ttype TokenType) StyleEntry
func (*StyleBuilder).Get(ttype TokenType) StyleEntry
func StyleEntry.Inherit(ancestors ...StyleEntry) StyleEntry
func StyleEntry.Sub(e StyleEntry) StyleEntry
func (*StyleBuilder).AddEntry(ttype TokenType, entry StyleEntry) *StyleBuilder
func StyleEntry.Inherit(ancestors ...StyleEntry) StyleEntry
func StyleEntry.Sub(e StyleEntry) StyleEntry
func github.com/alecthomas/chroma/v2/formatters/html.StyleEntryToCSS(e StyleEntry) string
func github.com/alecthomas/chroma/v2/formatters/svg.StyleEntryToSVG(e StyleEntry) string
Token output to formatter.
Type TokenType
Value string
Clone returns a clone of the Token.
(*Token) GoString() string
(*Token) String() string
*Token : expvar.Var
*Token : fmt.GoStringer
*Token : fmt.Stringer
func SplitTokensIntoLines(tokens []Token) (out [][]Token)
func Tokenise(lexer Lexer, options *TokeniseOptions, text string) ([]Token, error)
func Iterator.Tokens() []Token
func (*LexerState).Iterator() Token
func (*Token).Clone() Token
func Literator(tokens ...Token) Iterator
func SplitTokensIntoLines(tokens []Token) (out [][]Token)
func Stringify(tokens ...Token) string
var EOF
TokeniseOptions contains options for tokenisers.
If true, all EOLs are converted into LF by replacing CRLF and CR.
Nested tokenisation.
State to start tokenisation in. Defaults to "root".
func Tokenise(lexer Lexer, options *TokeniseOptions, text string) ([]Token, error)
func Lexer.Tokenise(options *TokeniseOptions, text string) (Iterator, error)
func (*RegexLexer).Tokenise(options *TokeniseOptions, text string) (Iterator, error)
TokenType is the type of token to highlight.
It is also an Emitter, emitting a single token of itself.
( TokenType) Category() TokenType
( TokenType) Emit(groups []string, _ *LexerState) Iterator
( TokenType) EmitterKind() string
( TokenType) InCategory(other TokenType) bool
( TokenType) InSubCategory(other TokenType) bool
IsATokenType returns true if the value is listed in the enum definition, and false otherwise.
MarshalText implements the encoding.TextMarshaler interface for TokenType
( TokenType) MarshalXML(e *xml.Encoder, start xml.StartElement) error
( TokenType) Parent() TokenType
( TokenType) String() string
( TokenType) SubCategory() TokenType
UnmarshalText implements the encoding.TextUnmarshaler interface for TokenType
(*TokenType) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error
TokenType : Emitter
TokenType : SerialisableEmitter
TokenType : encoding.TextMarshaler
*TokenType : encoding.TextUnmarshaler
TokenType : encoding/xml.Marshaler
*TokenType : encoding/xml.Unmarshaler
TokenType : expvar.Var
TokenType : fmt.Stringer
func TokenTypeString(s string) (TokenType, error)
func TokenTypeValues() []TokenType
func (*Style).Types() []TokenType
func TokenType.Category() TokenType
func TokenType.Parent() TokenType
func TokenType.SubCategory() TokenType
func (*Style).Get(ttype TokenType) StyleEntry
func (*Style).Has(ttype TokenType) bool
func (*StyleBuilder).Add(ttype TokenType, entry string) *StyleBuilder
func (*StyleBuilder).AddEntry(ttype TokenType, entry StyleEntry) *StyleBuilder
func (*StyleBuilder).Get(ttype TokenType) StyleEntry
func TokenType.InCategory(other TokenType) bool
func TokenType.InSubCategory(other TokenType) bool
func github.com/alecthomas/chroma/v2/formatters/html.WithCustomCSS(css map[TokenType]string) html.Option
const Background
const CodeLine
const Comment
const CommentHashbang
const CommentMultiline
const CommentPreproc
const CommentPreprocFile
const CommentSingle
const CommentSpecial
const Date
const EOFType
const Error
const Generic
const GenericDeleted
const GenericEmph
const GenericError
const GenericHeading
const GenericInserted
const GenericOutput
const GenericPrompt
const GenericStrong
const GenericSubheading
const GenericTraceback
const GenericUnderline
const Keyword
const KeywordConstant
const KeywordDeclaration
const KeywordNamespace
const KeywordPseudo
const KeywordReserved
const KeywordType
const Line
const LineHighlight
const LineLink
const LineNumbers
const LineNumbersTable
const LineTable
const LineTableTD
const Literal
const LiteralDate
const LiteralNumber
const LiteralNumberBin
const LiteralNumberFloat
const LiteralNumberHex
const LiteralNumberInteger
const LiteralNumberIntegerLong
const LiteralNumberOct
const LiteralOther
const LiteralString
const LiteralStringAffix
const LiteralStringAtom
const LiteralStringBacktick
const LiteralStringBoolean
const LiteralStringChar
const LiteralStringDelimiter
const LiteralStringDoc
const LiteralStringDouble
const LiteralStringEscape
const LiteralStringHeredoc
const LiteralStringInterpol
const LiteralStringName
const LiteralStringOther
const LiteralStringRegex
const LiteralStringSingle
const LiteralStringSymbol
const Name
const NameAttribute
const NameBuiltin
const NameBuiltinPseudo
const NameClass
const NameConstant
const NameDecorator
const NameEntity
const NameException
const NameFunction
const NameFunctionMagic
const NameKeyword
const NameLabel
const NameNamespace
const NameOperator
const NameOther
const NameProperty
const NamePseudo
const NameTag
const NameVariable
const NameVariableAnonymous
const NameVariableClass
const NameVariableGlobal
const NameVariableInstance
const NameVariableMagic
const None
const Number
const NumberBin
const NumberFloat
const NumberHex
const NumberInteger
const NumberIntegerLong
const NumberOct
const Operator
const OperatorWord
const Other
const PreWrapper
const Punctuation
const String
const StringAffix
const StringBacktick
const StringChar
const StringDelimiter
const StringDoc
const StringDouble
const StringEscape
const StringHeredoc
const StringInterpol
const StringOther
const StringRegex
const StringSingle
const StringSymbol
const Text
const TextPunctuation
const TextSymbol
const TextWhitespace
const Whitespace
Trilean value for StyleEntry value inheritance.
Prefix returns s with "no" as a prefix if Trilean is no.
( Trilean) String() string
Trilean : expvar.Var
Trilean : fmt.Stringer
const No
const Pass
const Yes
TypeMapping defines type maps for the TypeRemappingLexer.
func TypeRemappingLexer(lexer Lexer, mapping TypeMapping) Lexer
Package-Level Functions (total 43)
ByGroupNames emits a token for each named matching group in the rule's regex.
ByGroups emits a token for each matching group in the rule's regex.
Coalesce is a Lexer interceptor that collapses runs of common types into a single token.
Combined creates a new anonymous state from the given states, and pushes that state.
Concaterator concatenates tokens from a series of iterators.
Default returns a Rule that applies a set of Mutators.
DelegatingLexer combines two lexers to handle the common case of a language embedded inside another, such as PHP
inside HTML or PHP inside plain text.
It takes two lexers as arguments: a root lexer and a language lexer. First everything is scanned using the language
lexer, which must return "Other" for unrecognised tokens. Then all "Other" tokens are lexed using the root lexer.
Finally, these two sets of tokens are merged.
The lexers from the template lexer package use this base lexer.
Include the given state.
Literator converts a sequence of literal Tokens into an Iterator.
Marshal a RegexLexer to XML.
MustNewLexer creates a new Lexer with deferred rules generation or panics.
MustNewStyle creates a new style or panics.
MustNewXMLLexer constructs a new RegexLexer from an XML file or panics.
MustNewXMLStyle is like NewXMLStyle but panics on error.
MustParseColour is like ParseColour except it panics if the colour is in an invalid format.
MustParseStyleEntry parses a Pygments style entry or panics.
Mutators applies a set of Mutators in order.
NewColour creates a Colour directly from RGB values.
NewLexer creates a new regex-based Lexer.
"rules" is a state machine transition map. Each key is a state. Values are sets of rules
that match input, optionally modify lexer state, and output tokens.
NewLexerRegistry creates a new LexerRegistry of Lexers.
NewStyle creates a new style definition.
func NewStyleBuilder(name string) *StyleBuilder
NewXMLLexer creates a new RegexLexer from a serialised RegexLexer.
NewXMLStyle parses an XML style definition.
ParseColour in the forms #rgb, #rrggbb, #ansi<colour>, or #<colour>.
Will return an "unset" colour if invalid.
ParseStyleEntry parses a Pygments style entry.
Pop state from the stack when rule matches.
Push states onto the stack.
RecoveringFormatter wraps a formatter with panic recovery.
RemappingLexer remaps a token to a set of, potentially empty, tokens.
SplitTokensIntoLines splits tokens containing newlines in two.
Stringify returns the raw string for a set of tokens.
Tokenise text using lexer, returning tokens as a slice.
TokenTypeString retrieves an enum value from the enum constants string name.
It returns an error if the string is not part of the enum.
TokenTypeStrings returns a slice of all String values of the enum
TokenTypeValues returns all values of the enum
TypeRemappingLexer remaps types of tokens coming from a parent Lexer.
e.g. to map "defvaralias" tokens of type NameVariable to NameFunction:

	mapping := TypeMapping{
		{NameVariable, NameFunction, []string{"defvaralias"}},
	}
	lexer = TypeRemappingLexer(lexer, mapping)
Unmarshal a RegexLexer from XML.
Using returns an Emitter that uses a given Lexer reference for parsing and emitting.
The referenced lexer must be stored in the same LexerRegistry.
UsingByGroup emits tokens for the matched groups in the regex using a
sublexer. Used when lexing code blocks where the name of a sublexer is
contained within the block, for example on a Markdown text block or SQL
language block.
An attempt to load the sublexer will be made using the captured value from
the text of the matched sublexerNameGroup. If a sublexer matching the
sublexerNameGroup is available, then tokens for the matched codeGroup will
be emitted using the sublexer. Otherwise, if no sublexer is available, then
tokens will be emitted from the passed emitter.
Example:
	var Markdown = internal.Register(MustNewLexer(
		&Config{
			Name:      "markdown",
			Aliases:   []string{"md", "mkd"},
			Filenames: []string{"*.md", "*.mkd", "*.markdown"},
			MimeTypes: []string{"text/x-markdown"},
		},
		Rules{
			"root": {
				{"^(```)(\\w+)(\\n)([\\w\\W]*?)(^```$)",
					UsingByGroup(
						2, 4,
						String, String, String, Text, String,
					),
					nil,
				},
			},
		},
	))
See lexers/markdown.go for the complete example.
Note: panics if the number of emitters does not equal the number of matched
groups in the regex.
UsingLexer returns an Emitter that uses a given Lexer for parsing and emitting.
This Emitter is not serialisable.
UsingSelf is like Using, but uses the current Lexer.
Words creates a regex that matches any of the given literal words.
Package-Level Variables (total 4)
ANSI2RGB maps ANSI colour names, as supported by Chroma, to hex RGB values.
EOF is returned by lexers at the end of input.
ErrNotSerialisable is returned if a lexer contains Rules that cannot be serialised.
var StandardTypes map[TokenType]string
Package-Level Constants (total 125)
Default background style.
Code line wrapper style.
Comments.
Comments.
Comments.
Preprocessor "comments".
Preprocessor "comments".
Comments.
Comments.
Aliases.
Used as an EOF marker / nil token
Input that could not be tokenised.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Generic tokens.
Keywords.
Keywords.
Keywords.
Keywords.
Keywords.
Keywords.
Keywords.
Line style.
Line highlight style.
Line number links.
Line numbers in output.
Line numbers in output when in table.
Line numbers table wrapper style.
Line numbers table TD wrapper style.
Literals.
Literals.
Literals.
Literals.
Literals.
Literals.
Literals.
Literals.
Literals.
Literals.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Strings.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Names.
Trilean states.
No highlighting.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Operators.
Operators.
Other is used by the Delegate lexer to indicate which tokens should be handled by the delegate.
Trilean states.
PreWrapper style.
Punctuation.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Aliases.
Text.
Text.
Text.
Text.
Aliases.
Trilean states.
The pages are generated with Golds v0.8.2 (GOOS=linux GOARCH=amd64).