That isn't a fenced code block; it's a code span within a <p> with a single newline as the contents. It's valid Markdown (which doesn't support fenced code blocks).
Once you split this into multiple paragraphs, e.g. with two newlines, it becomes a fenced code block in MultiMarkdown and others (but not Markdown).
This can be difficult to interpret correctly in the parser, and has to be handled after the fact. (This is one reason I partially regret supporting fenced code blocks, since they break the rules established by the rest of the "pure" Markdown syntax.)
You have to look at the HTML output to fully understand what is happening here.
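For illustration, assuming CommonMark-style handling of code spans and fences, the two cases render along these lines (MultiMarkdown's exact output may differ slightly):

```html
<!-- Input: ``` followed by a single newline and the closing ``` -->
<!-- Parsed as one paragraph containing a code span whose content is the newline -->
<p><code> </code></p>

<!-- Input: ``` followed by a blank line (two newlines) and the closing ``` -->
<!-- With fenced code block support enabled, parsed as an empty fenced code block -->
<pre><code></code></pre>
```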
Hey Fletcher, hope things are fine!
I found a weird token thing again today.
Given a document that has this string content:
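Reconstructing from the description that follows, the document is just an opening triple-backtick fence, a single newline, and the closing fence:

````markdown
```
```
````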
I.e., with opening and closing code block triple backticks separated by a single newline, only the closing backticks are tokenized as generic fenced code block lines. A printout of the tree with ranges and token types:
Once you add anything in the code block, even another line break, the lineFenceBacktick3 will be "downgraded" to the generic one; the variant shown below, for example, produces a tree where both fence lines get the generic type.
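Presumably that variant is just the same two fences with one extra line break between them:

````markdown
```

```
````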
I'm using this information for syntax highlighting of course, and I wonder how you deal with these inconsistencies in behavior in your own apps. Given previous questions about this were not received with the same level of 'OCD' that I show, I guess you don't mind this in practice somehow :)