Virtually every modern programming language capable of writing non-trivial programs is expected to support lexical scoping by default. Some legacy programming languages, and even a few modern scripting languages, still make use of dynamic scoping (Perl…
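To make the distinction concrete, here's a minimal sketch in Python (my own illustration, not code from the post): under lexical scoping a free variable resolves to the scope where the function was *defined*, while under dynamic scoping it would resolve by walking the call stack at runtime.

```python
def make_reader():
    x = "enclosing"
    def read():
        # Lexical scoping: 'x' resolves to make_reader's scope,
        # where read() was defined.
        return x
    return read

def caller():
    x = "caller"  # under dynamic scoping, read() would find this binding
    return make_reader()()

print(caller())  # prints "enclosing" (lexical), not "caller" (dynamic)
```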
For today's post I'm going to go in a bit of a different direction than I normally do. After watching a pretty interesting video (in my humble opinion, anyway) on Pascal's Triangle, I found myself surfing through Wikipedia (as one does) and ultimately ar…
MGCLex is a "Lexer Generator" in the spirit of Flex or ScanGen. Input is in the form of a specification file containing a list of token-rule pairs, one per line. Each pair consists of a regular expression pattern and an identifier for the pattern. MGCLex…
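Going off that description alone, a specification file might look something like the following. To be clear, the exact syntax here is my guess at the general shape, not taken from MGCLex itself:

```
# hypothetical MGCLex-style spec: <regex pattern>  <token identifier>
[0-9]+                   NUMBER
[a-zA-Z_][a-zA-Z0-9_]*   IDENTIFIER
==                       EQ_OP
[ \t\n]+                 WHITESPACE
```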
I've mentioned before that when it comes to the implementation of regular expression matching, the conversation as it appears in the literature tends to begin with NFAs, give a quick rundown of Thompson's Construction, and end at DFAs with…
When it comes to implementing finite state automata, picking a data structure with which to model the machine is an interesting problem. On the one hand the choice is obvious: finite state machines, be they DFAs or NFAs, are generally viewed as d…
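Since state machines are conventionally viewed as directed graphs, the natural starting point is an adjacency-list representation. Here's a minimal sketch of that idea in Python (mine, not the post's actual data structure), using the common convention of None for epsilon edges:

```python
from collections import defaultdict

class NFA:
    """An NFA stored as a directed graph in adjacency-list form."""
    def __init__(self, start, accept):
        self.start = start
        self.accept = accept
        self.edges = defaultdict(list)  # state -> list of (symbol, next_state)

    def add_edge(self, src, symbol, dst):
        self.edges[src].append((symbol, dst))

# The NFA for the regex 'a|b', wired the Thompson-style way:
nfa = NFA(start=0, accept=5)
nfa.add_edge(0, None, 1)   # epsilon split into the 'a' branch
nfa.add_edge(0, None, 3)   # epsilon split into the 'b' branch
nfa.add_edge(1, 'a', 2)
nfa.add_edge(3, 'b', 4)
nfa.add_edge(2, None, 5)   # both branches join at the accept state
nfa.add_edge(4, None, 5)
```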
- Access Links, Activation Records, And Bytecode: Implementing Lexical Scoping & Closures
- Pascal & Bernoulli & Floyd: Triangles
- A Quick Tour of MGCLex
- Compiling Regular Expressions for "The VM Approach"
- Composable Linked Digraphs: An Efficient NFA Data Structure for Thompson's Construction
- Improving the Space Efficiency of Suffix Arrays
- Augmenting B+ Trees For Order Statistics
- Top-Down AST Construction of Regular Expressions with Recursive Descent
- Balanced Deletion for In-Memory B+ Trees
- Building an AST from a Regular Expression Bottom-Up