Theses

Persistent link for this collection: https://hdl.handle.net/11299/206560

Recent Submissions

  • Item
    Reliably composable language extensions
    (2017-05) Kaminski, Ted
    Many programming tasks are dramatically simpler when an appropriate domain-specific language can be used to accomplish them. These languages offer a variety of potential advantages, including programming at a higher level of abstraction, custom analyses specific to the problem domain, and the ability to generate very efficient code. But they also suffer many disadvantages as a result of their implementation techniques. Fully separate languages (such as YACC or SQL) are quite flexible, but they are distinct monolithic entities, and so we are unable to draw on the features of several in combination to accomplish a single task. That is, we cannot compose their domain-specific features. "Embedded" DSLs (such as parser combinators) offer something like a different language, but are actually implemented simply as libraries within a flexible host language (a minimal parser-combinator sketch appears after this listing). This approach allows different libraries to be imported and used together, enabling composition, but such DSLs are limited in their analysis and translation capabilities by the host language they are embedded in. A promising combination of these two approaches is to allow a host language to be directly extended with new features (syntactic and semantic). However, while there are plausible ways to attempt to compose language extensions, they can easily fail, making this approach unreliable. Previous methods of assuring reliable composition impose onerous restrictions, such as throwing out entirely the ability to introduce new analyses. This thesis introduces reliably composable language extensions as a technique for the implementation of DSLs. This technique preserves most of the advantages of both separate and "embedded" DSLs. Unlike many prior approaches to language extension, it ensures that the composition of multiple language extensions will succeed, and it preserves strong properties about the behavior of the resulting composed compiler. We define an analysis on language extensions that guarantees that the composition of several extensions will be well-defined, and we further define a set of testable properties that ensure the resulting compiler will behave as expected, along with a principle that assigns "blame" for bugs that may ultimately appear as a result of composition. Finally, to concretely compare our approach against our original goals for reliably composable language extension, we use these techniques to develop an extensible C compiler front-end, together with several example composable language extensions.
  • Item
    Composable semantics using higher-order attribute grammars
    (2012-11) Manjacheri, Lijesh Krishnan
    Ideally, programmers could make use of domain-specific knowledge, program using constructs that implement abstractions in their problem domains, and obtain domain-appropriate feedback. Further, new functionality could be added to existing languages incrementally, so that programmers could choose features appropriate to their problem domains, retrieve their specifications, and compose them automatically with an existing host language to generate a compiler for an extended version of the language. The library model of language extensibility separates out the stages of host language specification, extension development, and extension composition, making the process of language extension modular. The model is targeted at writing composable semantic specifications, and it emphasizes the importance of static analyses performed at extension development time to flag potential conflicts between extensions during composition. Silver is a general-purpose, high-level framework that implements the library model of extension development. The declarative Silver specification language can be used to specify the host language's concrete syntax and semantics and, independently, each extension's syntax and semantics, the latter often in terms of the host language's semantics. The problem of brittle concrete syntax specifications is handled by the associated Copper tool, which uses context-aware scanning to generate a parser for the extended language. The problem of specifying host language and extension semantics (such as error checking and source-level transformations) is handled by evaluating, on syntax tree nodes, functions written in the higher-order attribute grammar formalism. Designing the semantics of a non-trivial host language (such as types, environments, and error reporting) so that extension writers can write useful extensions that compose with other, independently written extensions poses its own challenges. A further issue in generating sound, terminating compilers in a higher-order attribute grammar framework such as Silver is the potential for attribute evaluation not to terminate during the execution of the generated compiler. This dissertation makes two contributions toward writing composable specifications for programming language semantics. It first looks at how a declarative, attribute grammar-based tool (extended with features such as forwarding, aspect productions, and collection attributes; a small sketch of forwarding appears after this listing) can be used to write host language specifications whose type systems and environments can be conveniently accessed and modified by extensions. To this end, it describes ableJ, an extensible host language specification for Java 1.4 that generates front-end translators from extended code to valid Java 1.4 code. Second, it presents a static analysis on higher-order attribute grammars that detects non-terminating tree creation during attribute evaluation. Combined with the higher-order circularity test, this analysis provides extension writers and programmers with a guarantee that the generated compiler will not fail to terminate on account of improper attribute evaluation. We have run the analysis on the ableJ host language and its extension specifications to demonstrate that it is powerful enough to show termination of non-trivial grammars.
  • Item
    Context-aware scanning and determinism-preserving grammar composition, in theory and practice
    (2010-07) Schwerdfeger, August
    This thesis documents several new developments in the theory of parsing, as well as the practical value of their implementation in the Copper parser generator. The most widely used apparatus for syntactic analysis of programming languages consists of a scanner based on deterministic finite automata, built from a set of regular expressions called the lexical syntax, and a separate parser, operating on the output of this scanner, built from a context-free grammar and utilizing the LALR(1) algorithm. While the LALR(1) algorithm has the advantage of guaranteeing a non-ambiguous parse, and keeping the scanner and parser separate makes the compilation process cleaner and more modular, it is also a brittle approach. The class of grammars that can be parsed with an LALR(1) parser is not closed under grammar composition, and small changes to an LALR(1) grammar can remove it from the LALR(1) class. Also, the separation of scanner and parser prevents the use, in any organized way, of parser context to resolve ambiguities in the lexical syntax. One area in which both of these drawbacks pose a particular problem is the parsing of embedded and extensible languages. In particular, they form one of the last major impediments to the development of an extensible compiler in which language extensions are imported and composed by the end user (programmer) in a manner analogous to the way libraries are presently imported. This is due not only to the LALR(1) grammar class not being closed under composition, but also to the very real possibility that the lexical syntax of two different extensions will clash, making it difficult or impossible to construct a scanner without extensive manual resolution of ambiguities. This thesis presents three innovations that are a large step towards eliminating parsing as an Achilles’ heel in this application. Firstly, it describes a new scanning algorithm called context-aware scanning, in which the scanner at each scan is made aware of which sorts of tokens the parser recognizes as valid at that point (a small sketch appears after this listing). By allowing the use of parser context in the scanner to disambiguate, context-aware scanning makes the specification of embedded languages much simpler: instead of specifying a scanner that must explicitly switch “modes” to scan the different embedded languages, one simply compiles a context-aware scanner from a single lexical specification, which has implicit “modes” that scan properly on each embedded language. Similarly, in an extensible language, this puts a degree of separation between the lexical syntax of different language extensions, so that clashes of this sort are not an issue. Secondly, the thesis describes an analysis that operates on grammar extensions of a certain form and can recognize their membership in a class of extensions that can be composed with each other and still produce a deterministic parser, enabling end users to compose extensions written by different language developers with this guarantee of determinism. The analysis is made practical by context-aware scanning, which ensures a lack of lexical issues to go along with the lack of syntactic nondeterminism. This analysis, the first of its kind, is the largest step toward realizing the sort of extensible compiler described above, as the writer of each extension can test it independently using the analysis and thus resolve any lexical or syntactic issues with the extension before the end user ever sees it.
    Thirdly, the thesis describes a corollary of this analysis, which allows extensions that have passed the analysis to be distributed in parse table form and then composed on the fly by the end users, with the same guarantee of determinism. Besides expediting the operation of composition, this also enables the use of the analysis in situations where the writer of a language or language extension does not want to release its grammar to the public. Finally, the thesis discusses how these approaches have been implemented and made practical in Copper, including runtime tests, implementations, and analyses of real-world grammars and extensions.
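
The Kaminski abstract above contrasts fully separate DSLs with "embedded" DSLs that are just libraries in a host language. The following is a minimal sketch of that embedded style, using parser combinators written in Python purely for illustration; the thesis itself works with Silver and an extensible C front-end, and every name below is hypothetical. The sketch shows why embedded DSLs compose so readily (parsers are ordinary values combined by ordinary functions) and also why their analysis is limited to what the host language can express: nothing here could, for example, statically reject an ambiguous grammar.

```python
# Minimal parser combinators: each combinator returns a function from
# (text, position) to a list of (value, new_position) pairs; an empty
# list means the parse failed at that position.

def char(c):
    """Parser that matches a single literal character."""
    def parse(text, pos):
        if pos < len(text) and text[pos] == c:
            return [(c, pos + 1)]
        return []
    return parse

def seq(p, q):
    """Parser that matches p followed by q, pairing their results."""
    def parse(text, pos):
        return [((v1, v2), pos2)
                for v1, pos1 in p(text, pos)
                for v2, pos2 in q(text, pos1)]
    return parse

def alt(p, q):
    """Parser that tries p and q (ordered choice with backtracking)."""
    def parse(text, pos):
        return p(text, pos) + q(text, pos)
    return parse

# Because parsers are plain values, independently written "extensions"
# (new combinators or grammars) compose by ordinary function application.
ab_or_ac = alt(seq(char("a"), char("b")), seq(char("a"), char("c")))
print(ab_or_ac("ac", 0))   # [(('a', 'c'), 2)]
```

The Krishnan abstract above relies on higher-order attribute grammars and, in particular, on forwarding, by which an extension construct translates ("forwards") to a host-language tree and takes from that tree any attribute the extension does not define itself. Below is a small sketch of that idea in Python, used only to make the mechanism concrete; Silver's actual specifications are declarative attribute grammars, and the classes and attribute names here are hypothetical.

```python
# A sketch of forwarding: attribute lookup tries the node itself first,
# then falls through to the tree the node forwards to.

class Node:
    def attribute(self, name):
        if hasattr(self, name):          # attribute defined on this node
            return getattr(self, name)()
        fwd = self.forward()
        if fwd is not None:              # otherwise use the forwarded tree
            return fwd.attribute(name)
        raise AttributeError(name)

    def forward(self):
        return None                      # host constructs do not forward

# Host-language constructs define attributes such as 'errors' and 'code'.
class IntLit(Node):
    def __init__(self, value): self.value = value
    def errors(self): return []
    def code(self): return str(self.value)

class Add(Node):
    def __init__(self, left, right): self.left, self.right = left, right
    def errors(self):
        return self.left.attribute("errors") + self.right.attribute("errors")
    def code(self):
        return f"({self.left.attribute('code')} + {self.right.attribute('code')})"

# An extension construct 'double e' forwards to the host tree 'e + e' and
# defines neither 'code' nor 'errors', so both come from the forwarded tree.
class Double(Node):
    def __init__(self, expr): self.expr = expr
    def forward(self): return Add(self.expr, self.expr)

tree = Double(IntLit(21))
print(tree.attribute("code"))    # (21 + 21)
print(tree.attribute("errors"))  # []
```

Because the extension construct leaves its attributes undefined, they are computed on the forwarded host-language tree; an extension can still override any individual attribute (for instance, to report a domain-specific error) without modifying the host specification.

The Schwerdfeger abstract above describes context-aware scanning, in which the scanner consults the parser about which terminals are valid before matching. The sketch below, in Python and with hypothetical terminal names, illustrates only that core idea; Copper's real implementation derives the valid-terminal sets from the states of an LALR(1) parse table.

```python
# A sketch of context-aware scanning: the scanner matches only the
# terminals that the parser's current state considers valid, so lexical
# overlap between a host language and an embedded or extension language
# never needs a hand-written mode-switching scanner.
import re

TERMINALS = {
    "SQL_SELECT": r"SELECT\b",      # keyword of an embedded SQL extension
    "IDENT":      r"[A-Za-z_]\w*",  # host-language identifier
    "NUMBER":     r"\d+",
}

def scan(text, pos, valid_terminals):
    """Return (terminal, lexeme) for the longest match among the terminals
    currently valid, rather than among all terminals in the grammar."""
    best = None
    for term in valid_terminals:
        m = re.match(TERMINALS[term], text[pos:])
        if m and (best is None or len(m.group()) > len(best[1])):
            best = (term, m.group())
    return best

# In a host-language state, SELECT scans as an identifier; in a state
# inside the embedded SQL extension, the same characters scan as the keyword.
print(scan("SELECT", 0, {"IDENT", "NUMBER"}))       # ('IDENT', 'SELECT')
print(scan("SELECT", 0, {"SQL_SELECT", "NUMBER"}))  # ('SQL_SELECT', 'SELECT')
```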
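
These sketches only gesture at the mechanisms the three theses develop; the modular well-definedness and determinism analyses they describe have no counterpart in the toy code above and are documented in the theses themselves.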
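For the authoritative specifications and tools (Silver, Copper, ableJ, and the extensible C front-end), consult the theses linked from this collection.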