'Accidental' CrossCompiler - Computerphile
**Title: The Evolution of Cross-Compilers and Intermediate Codes: A Journey Through Hardware Challenges**
---
### Introduction to Cross-Compilation and Hardware Bridging
Today, we delve into a fascinating narrative about cross-compilation and the challenges of bridging hardware components in an era when compatibility was nowhere near as seamless as it is today. This is the story of an "accidental" cross-compiler: a compiler that runs on one machine but generates code for a different one. The journey began while working on a project involving a Z80 chip and a Linotron 202 typesetter, a device known for its non-standard parallel port.
The challenge was significant: the Linotron 202 expected data over its own proprietary parallel interface, which was a problem for systems that could not talk to it directly. To address this, a single-board Z80 computer was commissioned. The board would accept data as serial input and re-emit it in parallel form, making it compatible with the Linotron's port.
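To make that data path concrete, here is a minimal sketch of what such bridging firmware might look like in C. Everything hardware-specific is an assumption made for illustration: the register addresses, the status bit, and the strobe handling are invented, and the real board was of course programmed in Z80 assembler rather than C.

```c
#include <stdint.h>

/* Hypothetical memory-mapped I/O addresses -- not the real board's layout. */
#define SERIAL_STATUS   (*(volatile uint8_t *)0x8000)  /* UART status register    */
#define SERIAL_DATA     (*(volatile uint8_t *)0x8001)  /* UART receive register   */
#define PARALLEL_DATA   (*(volatile uint8_t *)0x8002)  /* latch feeding the 202   */
#define PARALLEL_STROBE (*(volatile uint8_t *)0x8003)  /* pulses a data-ready line */

#define RX_READY 0x01   /* assumed "byte received" flag in the status register */

/* Forever: take each byte arriving on the serial side and present it on the
   parallel side, strobing to tell the typesetter the byte is valid. */
void bridge_loop(void)
{
    for (;;) {
        while (!(SERIAL_STATUS & RX_READY))
            ;                        /* busy-wait for the next serial byte */
        PARALLEL_DATA = SERIAL_DATA; /* copy it across to the parallel latch */
        PARALLEL_STROBE = 1;         /* assert the strobe */
        PARALLEL_STROBE = 0;         /* and release it    */
    }
}
```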
### The Role of Steve Marchant and His Bridging Board
Steve Marchant, an electronics engineer, played a crucial role in this endeavor by building the bridging board. Despite being "hardly high tech," the board was essential for getting the Z80 and the Linotron 202 talking to each other. Steve held firmly to the philosophy that work at this level should be done in assembler; C, in his view, was an unnecessary luxury for such a low-level task.
Steve's contributions extended beyond the hardware itself; he also provided a monitor port for debugging. Hooked up to a VDU (visual display unit), essentially a dumb terminal, this port let developers watch the flow of characters into and out of the system, a lifeline during debugging. His generosity also showed in the 2k of RAM he provided, much of which ended up holding error messages; that was a significant amount of memory for a board like this at the time.
### Memory Constraints and Their Implications
The limited memory on the Z80 board posed a challenge, with only 2k of RAM available. This restriction forced developers to be resourceful. Julian, who took over the project later, found himself in a bind when error messages filled up most of the available RAM, leaving barely 40 bytes free. The solution was to request more memory from Steve, who responded by adding another 2k of RAM. This split the memory into two sections: one for character buffering and another for storing error messages.
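As a rough sketch of what that split might have looked like, the definitions below carve a 4k address space into a character-buffering half and an error-message half. Only the 2k-plus-2k division comes from the story; the base address and exact boundaries are assumptions made for illustration.

```c
/* Hypothetical layout of the board's 4k of RAM: the original 2k for
   buffering characters in flight, the added 2k for error-message text.
   RAM_BASE and the precise boundaries are assumed, not documented. */
#define RAM_BASE        0x2000
#define CHAR_BUF_BASE   RAM_BASE             /* first 2k: character buffering */
#define CHAR_BUF_SIZE   0x0800
#define ERR_MSG_BASE    (RAM_BASE + 0x0800)  /* second 2k: error messages */
#define ERR_MSG_SIZE    0x0800
```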
### Programming Challenges and Solutions
The programming environment was far from ideal. The Whitesmiths C compiler compiled C down to assembler and let developers drop assembler instructions directly into their C code. This hybrid approach meant they could write high-level code while still handling the low-level details, and the end result was a large assembler-level program whose build was driven by a shell script.
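The flavor of that hybrid style is sketched below. The `asm()` spelling, the port number, the staging variable, and the underscore-prefixed symbol name are all assumptions for illustration; the exact escape-to-assembler syntax and calling convention of the Whitesmiths compiler are not shown in the video.

```c
/* Hybrid style: high-level control flow in C, hardware access dropped
   into embedded Z80 assembler.  Details below are illustrative only. */
static unsigned char out_byte;      /* staging location the assembler can name */

void send_byte(unsigned char c)
{
    if (c == '\n')                  /* high-level logic stays in C: */
        send_byte('\r');            /* expand newline to CR before LF */

    out_byte = c;
    asm("ld   a, (_out_byte)");     /* fetch the staged byte (assumed symbol name) */
    asm("out  (0x42), a");          /* write it to a hypothetical output port      */
}
```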
### Physical Transportation of Code
The process of transferring the compiled code to the target board involved a physical step. The code was blown onto an EPROM (erasable programmable read-only memory), which was then plugged into the target board. This method, while it worked, was laborious and error-prone, and it meant physically walking over to the machine every time new code needed testing.
### Philosophical Reflections on Cross-Compilation
Reflecting on this setup, the developer pondered the limitations of 4k of memory on an 8-bit microprocessor like the Z80. He imagined how much easier life would have been with more memory, but such luxury simply was not available at the time. This led to broader questions about cross-compilation and whether a compiler could be hosted directly on the Z80 itself: theoretically possible, but almost certainly far too slow.
### Ken Thompson's Approach: Invading Enemy Territory
The story then shifts to Ken Thompson, who faced a similar challenge with the Linotron 202. Instead of creating a bridging board, he chose a different path: invading the "enemy territory" by taking over the Naked Mini computer that controlled the typesetter. Ken's approach was to port his B interpreter onto this machine, allowing him to run his cross-compiler's output there and eventually bypass the intermediate hardware entirely.
### Steve Bourne's Contribution: Algol 68C and Z Code
Steve Bourne, known for his work on the Bourne shell in Unix, shared insights from his own experience at Cambridge. He described the challenges of generating code for Z80 boards from an IBM 360 mainframe. His solution was a stripped-down version of the Algol 68C compiler that could fit into limited memory, although the full compiler had to balloon in size temporarily before producing that final, compact output.
Steve emphasized the difficulties of porting compilers to new architectures and proposed an intermediate code approach. He introduced "Zed Code," a low-level intermediate representation that allowed compilers to generate Z code instead of machine-specific binary. This approach reduced the need for rewriting entire compilers for each new architecture, requiring only an interpreter for the target machine.
### The Quest for a Universal Intermediate Language
Steve's vision of a universal intermediate language was ambitious but ran up against real architectural differences. There was even a transatlantic divide over the name: his colleagues at Bell Labs pronounced it "zee code," while Steve, loyal to his British roots, stuck with "zed code." Despite its promise, the quest for a one-size-fits-all intermediate language proved elusive; architectures were simply too varied for a single universal solution.
### Conclusion: The Role of Intermediate Codes
Reflecting on this journey, we see how intermediate codes play a crucial role in cross-compilation. By emitting an intermediate representation like Z code, developers can focus on writing the front-end (syntax analysis) once and only need to write an interpreter for the new architecture to complete the process. This approach significantly simplifies the task of porting compilers across different machines.
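As a toy illustration of the idea, consider the sketch below. It is not the actual Zed Code instruction set (which the video does not detail); it is a made-up three-instruction stack machine, standing in for an intermediate code. The point it demonstrates is that the front-end's output is identical for every target, and porting amounts to writing a small interpreter like this one for each new machine.

```c
#include <stdio.h>

/* A made-up stack-machine intermediate code, far simpler than the real
   Zed Code, used only to show why porting is "just write the interpreter". */
enum op { PUSH, ADD, PRINT, HALT };

struct instr { enum op op; int arg; };

void run(const struct instr *code)
{
    int stack[64];
    int sp = 0;                                        /* stack pointer */

    for (const struct instr *ip = code; ; ip++) {
        switch (ip->op) {
        case PUSH:  stack[sp++] = ip->arg;             break;
        case ADD:   sp--; stack[sp - 1] += stack[sp];  break;
        case PRINT: printf("%d\n", stack[--sp]);       break;
        case HALT:  return;
        }
    }
}

int main(void)
{
    /* Hypothetical front-end output for "print 2 + 3", the same
       for every target architecture; only run() changes per machine. */
    struct instr program[] = {
        { PUSH, 2 }, { PUSH, 3 }, { ADD, 0 }, { PRINT, 0 }, { HALT, 0 }
    };
    run(program);
    return 0;
}
```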
### Final Thoughts
As we conclude this exploration into cross-compilation and intermediate codes, it's clear that while universal solutions are challenging, the lessons learned from pioneers like Steve Marchant, Ken Thompson, and Steve Bourne continue to shape how we approach compiler development. The challenges faced in the early days of computing remind us of the importance of innovation and adaptability in overcoming technical limitations.