
Case 7: Frames vs. Grammar in First Gen AI

From the 1950s through the 1970s, computer scientists began programming their own text generators. Like the writers’ aids of the early 20th century, these programs relied on structural analysis, genre classification, and combinatorial methods. Two conceptually divergent approaches emerged as the leading methods of computational text generation: frame-based programs and grammar-based programs. Frame-based programs used generic constraints to structure a top-down approach to text generation. Conversely, grammar-based programs took a bottom-up approach, establishing grammatical rules that could be applied to any context. Some programs, such as Margaret Masterman’s haiku generator, combined these two approaches.

Like Wycliffe Hill’s Plot Genie, with its dozens of genre-specific templates, or Arthur Blanchard’s Movie-writer, with its combination of pre-defined “groups,” frame-based programs worked within a specific context, or “frame.” These frames drew on the constrained vocabulary of “real-world” literary genres or the norms of common social situations, such as birthday parties, baseball, or psychiatric journals. Because these programs were limited to their specific frames, they produced funny or catastrophic mistakes when deployed outside their constrained contexts.
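To make the distinction concrete, here is a minimal, hypothetical sketch of a frame-based generator in Python. The frame, its slots, and its word lists (BIRTHDAY_FRAME, generate) are invented for illustration and do not reproduce any historical program; the point is only that every output is assembled top-down from a fixed template and a vocabulary confined to one social situation.

```python
import random

# A hypothetical "birthday party" frame: a fixed template plus a
# constrained vocabulary for each slot. Illustrative only; not a
# reconstruction of any historical program.
BIRTHDAY_FRAME = {
    "template": "{guest} gave {celebrant} a {gift}, and everyone sang before the {dessert}.",
    "slots": {
        "guest": ["Aunt May", "the neighbor", "her roommate"],
        "celebrant": ["the birthday girl", "the birthday boy"],
        "gift": ["scarf", "toy train", "book of riddles"],
        "dessert": ["cake", "ice cream"],
    },
}

def generate(frame):
    """Fill each slot in the frame's template from its constrained vocabulary."""
    filled = {slot: random.choice(words) for slot, words in frame["slots"].items()}
    return frame["template"].format(**filled)

if __name__ == "__main__":
    for _ in range(3):
        print(generate(BIRTHDAY_FRAME))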

Like William Wallace Cook’s formulaic approach to literary production (Plotto), or Vladimir Propp’s structural analysis of literature (Morphology of the Folktale), grammar-based programs relied on structural rules that could be applied to any context. These programs were related to contemporary linguistic studies that sought the universal rules underlying all languages (such as the work of Noam Chomsky). But because these programs were not restricted to limited vocabularies or deployed in specific contexts, their output could frequently dissolve into nonsense.
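A grammar-based generator can be sketched just as briefly. The toy context-free grammar below (GRAMMAR, expand) is likewise invented for illustration: it applies purely structural rewrite rules bottom-up with no notion of context, so its sentences are grammatical by construction but can easily read as nonsense.

```python
import random

# A toy context-free grammar: purely structural rules, no notion of context.
# The rules and vocabulary are invented for this sketch.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "Adj", "N"], ["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "Adj": [["colorless"], ["furious"], ["green"]],
    "N":   [["idea"], ["folktale"], ["telephone"]],
    "V":   [["sleeps"], ["translates"], ["devours"]],
}

def expand(symbol):
    """Recursively expand a symbol until only terminal words remain."""
    if symbol not in GRAMMAR:
        return [symbol]  # terminal word
    production = random.choice(GRAMMAR[symbol])
    return [word for part in production for word in expand(part)]

if __name__ == "__main__":
    for _ in range(3):
        # Grammatical by construction, but often semantically nonsensical.
        print(" ".join(expand("S")))
```

Where the frame sketch fills slots in a fixed template, this one builds sentences by recursive expansion, which is exactly why it can wander into well-formed gibberish.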

Margaret Masterman’s work lies at the intersection of frame-based and grammar-based approaches. Masterman dismissed rule-based or probabilistic text generators (such as the grammar-based programs) as likely to produce “idiot translations” and “idiot poetry.” Instead, she argued, any grammatical model of language should be grounded in real-world contexts. Masterman’s “Haiku-Generator,” for example, used both semantic schemas and domain-specific vocabulary (the typical language of haikus).
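The sketch below gestures at that combination without reproducing Masterman’s actual schema or word lists: a fixed multi-line semantic schema (SCHEMA) is filled from a small domain lexicon (LEXICON), both invented here for illustration. Syllable counts are not enforced; the example only shows how a semantic frame and a constrained, genre-specific vocabulary work together.

```python
import random

# A loose, hypothetical sketch of a schema-plus-vocabulary haiku generator.
# The schema and word lists are invented, not Masterman's actual ones.
SCHEMA = [
    "all {nature} in the {place}",
    "I {verb} the {object} {adverb}",
    "{season} the {nature} has {ended}",
]

LEXICON = {
    "nature": ["blossoms", "shadows", "cranes"],
    "place": ["temple", "valley", "pond"],
    "verb": ["watch", "recall", "release"],
    "object": ["lantern", "moon", "river"],
    "adverb": ["slowly", "in silence", "at dusk"],
    "season": ["in autumn", "by winter"],
    "ended": ["fallen", "faded", "gone"],
}

def haiku():
    """Fill the three-line semantic schema from the domain-specific lexicon."""
    lines = []
    for line in SCHEMA:
        filled = {key: random.choice(words) for key, words in LEXICON.items()}
        lines.append(line.format(**filled))
    return "\n".join(lines)

if __name__ == "__main__":
    print(haiku())
```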