API Reference#
This reference is organised into sections relating to the different things you’re likely to be doing when using Sybil.
Sybils#
- class sybil.Sybil(parsers: Sequence[Callable[[sybil.Document], Iterable[sybil.Region]]], pattern: str | None = None, patterns: Sequence[str] = (), exclude: str | None = None, excludes: Sequence[str] = (), filenames: Collection[str] = (), path: str = '.', setup: Callable[[Dict[str, Any]], None] | None = None, teardown: Callable[[Dict[str, Any]], None] | None = None, fixtures: Sequence[str] = (), encoding: str = 'utf-8', document_types: Mapping[str | None, Type[Document]] | None = None)#
An object to provide test runner integration for discovering examples in documentation and ensuring they are correct.
- Parameters:
parsers – A sequence of callables. See Parsers.
path –
The path in which source files are found, relative to the path of the Python source file in which this class is instantiated. Absolute paths can also be passed.
Note
This is ignored when using the pytest integration.
pattern – An optional pattern used to match source files that will be parsed for examples.
patterns – An optional sequence of patterns used to match source paths that will be parsed for examples.
exclude – An optional pattern for source file names that will be excluded when looking for examples.
excludes – An optional sequence of patterns for source paths that will be excluded when looking for examples.
filenames – An optional collection of file names that, if found anywhere within the root path or its sub-directories, will be parsed for examples.
setup – An optional callable that will be called once before any examples from a Document are evaluated. If provided, it is called with the document’s namespace.
teardown – An optional callable that will be called after all the examples from a Document have been evaluated. If provided, it is called with the document’s namespace.
fixtures – An optional sequence of strings specifying the names of fixtures to be requested when using the pytest integration. The fixtures will be inserted into the document’s namespace before any examples for that document are evaluated. All scopes of fixture are supported.
encoding – An optional string specifying the encoding to be used when decoding documentation source files.
document_types – A mapping of file extension to Document subclass such that custom evaluation can be performed per document type.
- __add__(other: Sybil) → SybilCollection #
Sybil instances can be concatenated into a SybilCollection.
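As a concrete illustration, a Sybil instance is typically created in a project’s conftest.py and handed to pytest via its pytest() method. The parsers, pattern, and path below are placeholder choices for this sketch, not requirements:

```python
from sybil import Sybil
from sybil.parsers.rest import DocTestParser, PythonCodeBlockParser

# Collect doctest and Python code-block examples from ReST files found
# under the given path; the chosen parsers and pattern are illustrative.
pytest_collect_file = Sybil(
    parsers=[DocTestParser(), PythonCodeBlockParser()],
    patterns=['*.rst'],
    path='.',
).pytest()
```

Two configured instances covering different file types can also be combined with `+`, giving a SybilCollection as described above.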
Documents#
- class sybil.Document(text: str, path: str)#
This is Sybil’s representation of a documentation source file. It will be instantiated by Sybil and provided to each parser in turn.
Different types of document can be handled by subclassing to provide the required evaluation. The required file extensions, such as '.py', can then be mapped to these subclasses using Sybil’s document_types parameter.
- namespace: Dict[str, Any]#
This dictionary is the namespace in which all examples parsed from this document will be evaluated.
- classmethod parse(path: str, *parsers: Callable[[sybil.Document], Iterable[sybil.Region]], encoding: str = 'utf-8') → Document #
Read the text from the supplied path and parse it into a document using the supplied parsers.
- line_column(position: int) → str #
Return a line and column location in this document based on a character position.
- find_region_sources(start_pattern: Pattern[str], end_pattern: Pattern[str]) → Iterator[Tuple[Match[str], Match[str], str]] #
This helper method can be used to extract source text for regions based on the two regular expressions provided.
It will yield a tuple of (start_match, end_match, source) for each occurrence of start_pattern in the document’s text that is followed by an occurrence of end_pattern. The matches will be provided as match objects, while the source is provided as a string.
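The pairing behaviour can be sketched with the standard library’s re module. This is an illustration of the described mechanism, not Sybil’s implementation, and the START/END markers are invented for the example:

```python
import re

# Invented block markers for this sketch; real lexers would use
# directive or fence patterns instead.
start_pattern = re.compile(r'^START\n', re.MULTILINE)
end_pattern = re.compile(r'^END\n', re.MULTILINE)

text = "prose\nSTART\nprint('hi')\nEND\nmore prose\n"

def find_region_sources(text, start_pattern, end_pattern):
    # For each start match, the source is the text between the end of
    # the start match and the start of the following end match.
    for start_match in start_pattern.finditer(text):
        end_match = end_pattern.search(text, start_match.end())
        if end_match is None:
            return
        yield start_match, end_match, text[start_match.end():end_match.start()]

regions = list(find_region_sources(text, start_pattern, end_pattern))
```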
- push_evaluator(evaluator: Callable[[sybil.Example], str | None]) → None #
Push an Evaluator onto this document’s stack of evaluators if it is not already in that stack.
When evaluating an Example, any evaluators in the stack will be tried in order, starting with the most recently pushed. If an evaluator raises a NotEvaluated exception, the next evaluator in the stack will be attempted.
If the stack is empty, or all evaluators present raise NotEvaluated, then the example’s own evaluator will be used; this is the most common case.
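The stack behaviour described above can be sketched in plain Python. This is a minimal stand-in to show the fall-through logic, not Sybil’s implementation:

```python
class NotEvaluated(Exception):
    """Stand-in for sybil.example.NotEvaluated."""

class EvaluatorStack:
    # A minimal stand-in for the Document behaviour described above.
    def __init__(self):
        self.evaluators = []

    def push_evaluator(self, evaluator):
        # An evaluator already in the stack is not pushed again.
        if evaluator not in self.evaluators:
            self.evaluators.append(evaluator)

    def evaluate(self, example, region_evaluator):
        # Try the most recently pushed evaluator first, falling through
        # on NotEvaluated; fall back to the region's own evaluator.
        for evaluator in reversed(self.evaluators):
            try:
                return evaluator(example)
            except NotEvaluated:
                continue
        return region_evaluator(example)

def declines(example):
    raise NotEvaluated()

stack = EvaluatorStack()
stack.push_evaluator(declines)
stack.push_evaluator(declines)  # no duplicate is added
result = stack.evaluate('example', lambda example: 'handled by region')
```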
- class sybil.document.PythonDocument(text: str, path: str)#
A Document type that imports the document’s source file as a Python module, making names within it available in the document’s namespace.
- class sybil.document.PythonDocStringDocument(text: str, path: str)#
A PythonDocument subclass that only considers the text of docstrings in the document’s source.
- classmethod parse(path: str, *parsers: Callable[[sybil.Document], Iterable[sybil.Region]], encoding: str = 'utf-8') → Document #
Read the text from the supplied path to a Python source file and parse any docstrings it contains into a document using the supplied parsers.
Regions#
- class sybil.Region(start: int, end: int, parsed: Any = None, evaluator: Callable[[sybil.Example], str | None] | None = None, lexemes: Dict[str, Any] | None = None)#
Parsers should yield instances of this class for each example they discover in a documentation source file.
- lexemes: LexemeMapping#
The lexemes extracted from the region.
Lexing#
- sybil.typing.Lexer#
The signature for a lexer. See Developing your own parsers. Lexers must not set parsed or evaluator on the Region instances they return.
alias of Callable[[sybil.Document], Iterable[sybil.Region]]
- class sybil.parsers.abstract.lexers.BlockLexer(start_pattern: Pattern[str], end_pattern_template: str, mapping: Dict[str, str] | None = None)#
This is a base class useful for any Lexer that must handle block-style languages such as ReStructuredText or Markdown.
It yields a sequence of Region objects for each case where the start_pattern matches. A source Lexeme is created from the text between the end of the start pattern and the start of the end pattern.
- Parameters:
start_pattern – This is used to match the start of the block. Any named groups will be returned in the lexemes dict of resulting Region objects. If a prefix named group forms part of the match, it will be template-substituted into the end_pattern_template before it is compiled.
end_pattern_template – This is used to match the end of any block found by the start_pattern. It is templated with any prefix group from the start_pattern match and len_prefix, the length of that prefix, before being compiled into a Pattern.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
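How a prefix named group can feed into the end pattern is illustrated below with plain re. The template syntax here is invented for the example and is not necessarily the substitution syntax Sybil uses:

```python
import re

# A start pattern whose named 'prefix' group captures leading indentation.
start_pattern = re.compile(r'(?P<prefix>[ \t]*)BEGIN\n')
# Hypothetical template syntax, for illustration only.
end_pattern_template = r'{prefix}END'

text = "  BEGIN\n  body line\n  END\n"
start_match = start_pattern.search(text)
prefix = start_match.group('prefix')

# Substitute the captured prefix into the template before compiling,
# so the end pattern must carry the same indentation as the start.
end_pattern = re.compile(end_pattern_template.format(prefix=re.escape(prefix)))
end_match = end_pattern.search(text, start_match.end())

# The source lexeme is the text between start and end patterns.
source = text[start_match.end():end_match.start()]
```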
Parsing#
- sybil.typing.Parser#
The signature for a parser. See Developing your own parsers.
alias of Callable[[sybil.Document], Iterable[sybil.Region]]
- class sybil.parsers.abstract.codeblock.AbstractCodeBlockParser(lexers: Sequence[Callable[[sybil.Document], Iterable[sybil.Region]]], language: str | None = None, evaluator: Callable[[sybil.Example], str | None] | None = None)#
An abstract parser for use when evaluating blocks of code.
- Parameters:
lexers – A sequence of Lexer objects that will be applied in turn to each Document that is parsed. The Region objects returned by these lexers must have both an arguments string, containing the language of the lexed region, and a source Lexeme containing the source code of the lexed region.
language – The language that this parser should look for. Lexed regions which don’t have this language in their arguments lexeme will be ignored.
evaluator – The evaluator to use for evaluating code blocks in the specified language. You can also override the evaluate() method below.
- class sybil.parsers.abstract.doctest.DocTestStringParser(evaluator: ~sybil.evaluators.doctest.DocTestEvaluator = <sybil.evaluators.doctest.DocTestEvaluator object>)#
This isn’t a true Parser in that it must be called with a str containing the doctest example’s source and the file name that the example came from.
- evaluator: DocTestEvaluator#
The evaluator to use for any doctests found in the supplied source string.
- class sybil.parsers.abstract.skip.AbstractSkipParser(lexers: Sequence[Callable[[sybil.Document], Iterable[sybil.Region]]])#
An abstract parser for skipping subsequent examples.
- class sybil.parsers.abstract.clear.AbstractClearNamespaceParser(lexers: Sequence[Callable[[sybil.Document], Iterable[sybil.Region]]])#
An abstract parser for clearing the namespace.
ReST Parsing and Lexing#
- class sybil.parsers.rest.lexers.DirectiveLexer(directive: str, arguments: str = '', mapping: Dict[str, str] | None = None)#
A BlockLexer for ReST directives that extracts the following lexemes:
- Parameters:
directive – a str containing a regular expression pattern to match directive names.
arguments – a str containing a regular expression pattern to match directive arguments.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
- class sybil.parsers.rest.lexers.DirectiveInCommentLexer(directive: str, arguments: str = '', mapping: Dict[str, str] | None = None)#
A BlockLexer for faux ReST directives in comments such as:

.. not-really-a-directive: some-argument

   Source here...

It extracts the following lexemes:
- Parameters:
directive – a str containing a regular expression pattern to match directive names.
arguments – a str containing a regular expression pattern to match directive arguments.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
- class sybil.parsers.rest.CodeBlockParser(language: str | None = None, evaluator: Callable[[sybil.Example], str | None] | None = None)#
A Parser for Code blocks examples.
- Parameters:
language – The language that this parser should look for.
evaluator – The evaluator to use for evaluating code blocks in the specified language. You can also override the evaluate() method below.
- class sybil.parsers.rest.PythonCodeBlockParser(future_imports: Sequence[str] = ())#
A Parser for Python Code blocks examples.
- Parameters:
future_imports – An optional sequence of strings that will be turned into from __future__ import ... statements and prepended to the code in each of the examples found by this parser.
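The effect of future_imports can be sketched as follows. This shows what the prepended statements amount to, not Sybil’s internal handling, and the example source is invented:

```python
# A sketch of the effect of future_imports: each name becomes a
# __future__ import prepended before the example source is compiled.
future_imports = ('annotations',)
example_source = "x: SomeForwardRef = None\n"  # hypothetical example source

header = ''.join(f'from __future__ import {name}\n' for name in future_imports)
namespace = {}
# With the annotations future, SomeForwardRef is stored as a string
# rather than evaluated, so this executes without a NameError.
exec(compile(header + example_source, '<example>', 'exec'), namespace)
```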
- class sybil.parsers.rest.DocTestParser(optionflags: int = 0)#
A Parser for doctest examples.
- Parameters:
optionflags – doctest option flags to use when evaluating the examples found by this parser.
- class sybil.parsers.rest.DocTestDirectiveParser(optionflags: int = 0)#
A Parser for doctest directives.
- Parameters:
optionflags – doctest option flags to use when evaluating the examples found by this parser.
- class sybil.parsers.rest.ClearNamespaceParser#
A Parser for clear-namespace instructions.
Markdown Parsing and Lexing#
- class sybil.parsers.markdown.lexers.RawFencedCodeBlockLexer(info_pattern: Pattern[str] = re.compile('$\\n', re.MULTILINE), mapping: Dict[str, str] | None = None)#
A Lexer for Markdown fenced code blocks, allowing flexible lexing of the whole info line along with more complicated prefixes.
The following lexemes are extracted:
- Parameters:
info_pattern – a re.Pattern to match the info line and any required prefix that follows it.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
- class sybil.parsers.markdown.lexers.FencedCodeBlockLexer(language: str, mapping: Dict[str, str] | None = None)#
A Lexer for Markdown fenced code blocks where a language is specified. RawFencedCodeBlockLexer can be used if the whole info line, or a more complicated prefix, is required.
The following lexemes are extracted:
- class sybil.parsers.markdown.lexers.DirectiveInHTMLCommentLexer(directive: str, arguments: str = '.*?', mapping: Dict[str, str] | None = None)#
A BlockLexer for faux directives in HTML-style Markdown comments such as:

<!--- not-really-a-directive: some-argument

Source here...

--->

It extracts the following lexemes:
- Parameters:
directive – a str containing a regular expression pattern to match directive names.
arguments – a str containing a regular expression pattern to match directive arguments.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
- class sybil.parsers.markdown.CodeBlockParser(language: str | None = None, evaluator: Callable[[sybil.Example], str | None] | None = None)#
A Parser for Code blocks examples.
- Parameters:
language – The language that this parser should look for.
evaluator – The evaluator to use for evaluating code blocks in the specified language. You can also override the evaluate() method below.
- class sybil.parsers.markdown.PythonCodeBlockParser(future_imports: Sequence[str] = (), doctest_optionflags: int = 0)#
A Parser for Python Code blocks examples.
- Parameters:
future_imports – An optional list of strings that will be turned into from __future__ import ... statements and prepended to the code in each of the examples found by this parser.
doctest_optionflags – doctest option flags to use when evaluating the doctest examples found by this parser.
- class sybil.parsers.markdown.ClearNamespaceParser#
A Parser for clear-namespace instructions.
MyST Parsing and Lexing#
- class sybil.parsers.myst.lexers.DirectiveLexer(directive: str, arguments: str = '.*', mapping: Dict[str, str] | None = None)#
A Lexer for MyST directives such as:

```{directivename} arguments
---
key1: val1
key2: val2
---
This is directive content
```

The following lexemes are extracted:
- Parameters:
directive – a str containing a regular expression pattern to match directive names.
arguments – a str containing a regular expression pattern to match directive arguments.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
- class sybil.parsers.myst.lexers.DirectiveInPercentCommentLexer(directive: str, arguments: str = '.*', mapping: Dict[str, str] | None = None)#
A BlockLexer for faux MyST directives in %-style Markdown comments such as:

% not-really-a-directive: some-argument
%
% Source here...

It extracts the following lexemes:
- Parameters:
directive – a str containing a regular expression pattern to match directive names.
arguments – a str containing a regular expression pattern to match directive arguments.
mapping – If provided, this is used to rename lexemes from the keys in the mapping to their values. Only mapped lexemes will be returned in any Region objects.
- class sybil.parsers.myst.CodeBlockParser(language: str | None = None, evaluator: Callable[[sybil.Example], str | None] | None = None)#
A Parser for Code blocks examples.
- Parameters:
language – The language that this parser should look for.
evaluator – The evaluator to use for evaluating code blocks in the specified language. You can also override the evaluate() method below.
- class sybil.parsers.myst.PythonCodeBlockParser(future_imports: Sequence[str] = (), doctest_optionflags: int = 0)#
A Parser for Python Code blocks examples.
- Parameters:
future_imports – An optional list of strings that will be turned into from __future__ import ... statements and prepended to the code in each of the examples found by this parser.
doctest_optionflags – doctest option flags to use when evaluating the doctest examples found by this parser.
- class sybil.parsers.myst.DocTestDirectiveParser(optionflags: int = 0)#
A Parser for doctest directive examples.
- Parameters:
optionflags – doctest option flags to use when evaluating the examples found by this parser.
- class sybil.parsers.myst.ClearNamespaceParser#
A Parser for clear-namespace instructions.
Evaluation#
- class sybil.Example(document: Document, line: int, column: int, region: Region, namespace: Dict[str, Any])#
This represents a particular example from a documentation source file. It is assembled from the Document and Region the example comes from and is passed to the region’s evaluator.
- class sybil.example.NotEvaluated#
An exception that can be raised by an Evaluator previously pushed onto the document to indicate that it is not evaluating the current example, and that a previously pushed evaluator, or the Region’s evaluator if no others have been pushed, should be used to evaluate the Example instead.
- sybil.typing.Evaluator#
The signature for an evaluator. See Developing your own parsers.
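A minimal evaluator sketch matching this signature: it receives an Example and returns None on success, or a string describing the failure. The evaluate_addition function and the (expression, expected) shape of parsed are hypothetical, invented for this illustration:

```python
from types import SimpleNamespace

def evaluate_addition(example):
    # Hypothetical evaluator: example.parsed is assumed here to be an
    # (expression, expected_value) pair.
    expression, expected = example.parsed
    actual = eval(expression, example.namespace)
    if actual != expected:
        return f'{expression} == {actual!r}, expected {expected!r}'
    return None  # None signals success

# Stand-ins for sybil.Example, carrying just the attributes used above:
passing = SimpleNamespace(parsed=('1 + 1', 2), namespace={})
failing = SimpleNamespace(parsed=('1 + 1', 3), namespace={})
```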
- class sybil.evaluators.doctest.DocTestEvaluator(optionflags: int = 0)#
The Evaluator to use for Regions yielded by a DocTestStringParser.
- Parameters:
optionflags – doctest option flags to use when evaluating examples.
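Sybil’s doctest support builds on the standard library’s doctest module. As a rough illustration of the machinery involved (not Sybil’s exact usage of it), the stdlib parser splits a doctest string into examples with sources and expected outputs:

```python
import doctest

# Parse a doctest string into its examples using the standard library;
# a rough illustration only, not Sybil's exact internals.
parser = doctest.DocTestParser()
examples = parser.get_examples(">>> 1 + 1\n2\n")
```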