new PegTokenizer(env, options)
Methods
process(text, sol)
Process text. The text is tokenized in chunks, and control is yielded to the event loop after each top-level block is tokenized, enabling the tokenized chunks to be processed at the earliest possible opportunity.
Parameters:

| Name | Type | Description |
|---|---|---|
| text | string | |
| sol | boolean | Whether text should be processed in start-of-line context. |
setPipelineId()
Debugging aid: Set pipeline id.
setSourceOffsets()
Set start and end offsets of the source that generated this DOM.
tokenizeAs()
Tokenizes a string as a rule; otherwise returns an Error.
tokenizeAsync(text, sol)
The main worker. Sets up event emission ('chunk' and 'end' events). Consumers are supposed to register with PegTokenizer before calling process().
Parameters:

| Name | Type | Description |
|---|---|---|
| text | string | |
| sol | boolean | Whether text should be processed in start-of-line context. |
tokenizeExtlink(text, sol)
Tokenize an extlink.
Parameters:

| Name | Type | Description |
|---|---|---|
| text | string | |
| sol | boolean | |
tokenizesAsURL(text) → {boolean}
Determine whether a string tokenizes as a URL.
Parameters:

| Name | Type | Description |
|---|---|---|
| text | string | |
Returns:

Type: boolean
tokenizeSync(text, [args]) → {Array}
tokenizeTableCellAttributes()
Tokenize table cell attributes.