The 1970s were a fast-moving time for Dubner Computer Systems, the company my dad formed in 1970. Around 1976 we had developed a powerful general-purpose computer based on the Intel 8080. At that time, though, we were still creating binaries for that computer using a cross-assembler running on a CDC 6600 at a Control Data Corporation time-sharing center a few blocks across town in Manhattan. We created the Intel code on IBM 026 keypunches and fed the cards into our CDC remote terminal. The resulting binary would, at the data center, be cut onto one-inch punched paper tape. The punched tape was then couriered to our office, where we used a minicomputer to read the tape and then write the binary back out to a 1/8-inch cassette recorder using a 1200-baud modem. We then took the cassette to one of our Intel systems, where we were using cassette players for program storage.
The problem was the time that process ate up. With so many manual steps, including getting the tape punched at the data center (where the operators didn't share our sense of urgency about getting those binaries!), it was usually twenty-four hours, sometimes forty-eight, between compiling and debugging. That delay became frustratingly long.
One fine day, one of the company's Honchos (not my dad; one of his chief lieutenants) came to my lowly twenty-three-year-old self and said, "Speed it up." My response was an eager, "Sure! How should I do that?"
His answer, in essence, was: "How should I know?"
This was an interesting turning point in my short professional life. I had already gotten used to doing things that were hard. But this was my first assignment that was believed to be impossible. That I was given the task was, basically, a Hail Mary play. The people I was working with and learning from were smart, and they'd already thought about it. Nobody thought it could be done, but the Honcho figured it wouldn't hurt for me to take a swing at it.
So I started working the problem. Specifically, I looked hard at the CDC remote terminal we had in our office. It was straightforward enough: A keyboard. A green-screen alphanumeric CRT display. An IBM card reader. A line printer. The terminal was connected to the CDC data center via a dedicated 9600-baud line, and you don't want to think about what that cost in the mid-1970s. You just don't.
No data ports. No access. No way to get at data going in, or out. It was a self-contained bubble: Data went from the keyboard and the card reader to the data center. Data came from the data center and went to the CRT monitor and the line printer.
The line printer. The line printer?
I popped open the back of the line printer. Hey! Look at that: A wire-wrapped backplane.
Okay. I understood wire wrap; our Intel computers were built on hand-wired Augat wire-wrap boards and backplanes. And, as was common back in that golden age, we had manuals for the CDC equipment, including schematics for the printer electronics. Okay, hmm, hmm, it's a chain printer. There is the data bus for the character codes, and there is the decoder that sends the GO NOW! signals to the various hammer solenoids. Let's see; there are the chain timing signals. And that signal line there is how the character data gets latched up into that register...
Data bus? Latch? Hot dog!
I attached an oscilloscope to see what those signals were doing, and experienced that rush when you realize that you have everything you need to create a solution.
It wasn't long before I diverted one of our production Intel 8080 computers to sit next to the line printer. In violation of all that is holy regarding TTL signals, I had a couple of feet of wire-wrap wire running from the general-purpose I/O board of our system to the data bus and data-latch pins on the printer backplane. I don't know how many terms of our agreement with CDC I violated by making that connection. In my defense I point out that I didn't modify their equipment. I just added some wire to the top of about eight of their wire-wrap posts.
The upshot is that I was able to monitor the signals on the backplane of the printer. I could see the characters being sent.
So, I wrote the code accordingly. After the Intel code was compiled, the resulting binary was sent to a program I wrote that created a special kind of gibberish: First, a bunch of rows of asterisks were sent, which created a distinct BRAP BRAP BRAP noise as the hammers rippled through the asterisks on the printer's chains. When the Intel saw those, it knew that a binary was coming, and it started the cassette recorder's motor. (The computer controlled the Panasonic cassette recorder using a footswitch input intended for transcription control.)
Then the binary would be sent: four bits in each of CDC's six-bit characters. (This was a time of great weirdness in the computer universe. CDC used sixty-bit words and packed ten characters into each word. A number of computers had eighteen- and thirty-six-bit words. Seems very odd now.) With data that much more random, the printer switched from BRAP to CHUCKA-CHUCKA-CHUCKA. With 66 lines of 132 characters, a page of characters represented about 4K of binary. In general, most of our binary modules – this was the dark ages, after all – were no more than about 16K.
And the Intel would buffer up that binary and send it to the cassette recorder.
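The original programs are long gone, but here is a minimal sketch, in C, of the general idea: announce a binary with rows of asterisks, then carry each byte as two nibbles, one per printable character, so the data survives the printer's character path. The preamble length, the '0'-based nibble alphabet, and every name here are assumptions for illustration; the real code dealt with CDC display code and the 8080 hardware, not ASCII and stdio.

```c
/*
 * Sketch of the packing scheme described above (illustrative, not the
 * original code): asterisk rows as a preamble, then one printable
 * character per 4-bit nibble, reassembled into bytes on the other end.
 */
#include <stdio.h>
#include <string.h>

#define PREAMBLE_ROWS 3        /* rows of asterisks announcing a binary */
#define LINE_WIDTH    132      /* printer line width mentioned above    */

/* Encoder (data-center side): emit preamble, then nibbles as characters. */
static void encode(const unsigned char *bin, size_t len, FILE *out)
{
    for (int r = 0; r < PREAMBLE_ROWS; r++) {
        for (int c = 0; c < LINE_WIDTH; c++) putc('*', out);
        putc('\n', out);
    }
    int col = 0;
    for (size_t i = 0; i < len; i++) {
        putc('0' + (bin[i] >> 4), out);    /* high nibble */
        putc('0' + (bin[i] & 0x0F), out);  /* low nibble  */
        if ((col += 2) >= LINE_WIDTH) { putc('\n', out); col = 0; }
    }
    if (col) putc('\n', out);
}

/* Decoder (Intel side): skip preamble and line breaks, rebuild the bytes. */
static size_t decode(FILE *in, unsigned char *bin, size_t max)
{
    size_t n = 0;
    int hi = -1, ch;
    while (n < max && (ch = getc(in)) != EOF) {
        if (ch == '*' || ch == '\n') continue;
        if (hi < 0) hi = ch - '0';
        else { bin[n++] = (unsigned char)((hi << 4) | (ch - '0')); hi = -1; }
    }
    return n;
}

int main(void)
{
    unsigned char module[] = { 0xC3, 0x00, 0x10, 0x76 };  /* toy 8080 bytes */
    FILE *f = tmpfile();
    encode(module, sizeof module, f);
    rewind(f);

    unsigned char back[16];
    size_t n = decode(f, back, sizeof back);
    printf("round-tripped %zu bytes: %s\n", n,
           memcmp(module, back, n) == 0 ? "match" : "MISMATCH");
    return 0;
}
```

Two nibble-characters per byte is also why the arithmetic above works out: 66 lines of 132 characters is 8,712 characters, or roughly 4K of binary per page.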
That whole process was audibly very distinct, which was useful in its own way. There was no question what was going on when an Intel binary was coming down the pike.
It worked great. You'd go in with your edit deck of IBM cards. You'd feed the batch job into the terminal. You'd put the cassette into the recorder and press the RECORD button. And then you could go do something else. When your job was done, that BRAP-BRAP-BRAP CHUCKA-CHUCKA-CHUCKA could be heard all over the office.
We used that setup for a couple of years through some major projects.
And forever after I've been able to say I once used a line printer as an input device.