Huge PDP-11 in a lorry: How I drove computers into schools

Mobile technology, the mainframe way


This Old Box Computers in classrooms are so common today that we may forget this was once inconceivably difficult. Computers were very expensive, and so large they needed a huge truck to transport them. Nearly 35 years ago, I worked on an ambitious but ill-fated project to bring a minicomputer to rural Iowa schools: a classroom on wheels.

This old box

In the autumn of 1977, I was a sophomore at the University of Iowa. I’d had access to the university's computers for the past six years, under their initiative to teach computer science to students as young as 12. I had recently built a SOL-20 microcomputer from a kit, was enrolled in CS classes, and worked part-time as a junior programmer for the university.

I usually worked on experimental educational projects, and one day I was offered an exceptional project. The university had another initiative, to bring Computer Assisted Instruction (CAI) to schools across Iowa. In the late 1970s, even a simple 300 baud modem was exotic equipment, and computers in schools were unheard of. There was an active CAI program on campus, but it was impractical to bring teachers from across the state to the computer facilities. So the university would bring the computers to the schools.

The Mobile Instructional Classroom, as it was called, would put a DEC PDP-11 minicomputer and 16 terminals into a massive semi-trailer and drive it to schools throughout Iowa.

It was a very ambitious setup – especially as the trailer had expanding wings to double the room size. I suggested to the project director that since schools already had classrooms, it would be easier and cheaper to buy some inexpensive microcomputers and install them in a spare classroom or the teacher's lounge.

The project's thrilling brochure

I suggested a newly released computer called the Apple II. My idea was immediately dismissed. The director insisted that Apple computers were only suitable for playing Pong and Breakout; they were incapable of "serious" computing. So I suggested something more "serious": an IMSAI, or even a SOL. The director insisted that since they had already purchased the trailer and ordered the PDP-11, they were committed to the project as designed.

Writing software the old-fashioned way: with printouts

The project's software content was intended to help teachers recognise learning disabilities like dyslexia, and give them ongoing education and improved teaching certification.

The software originally ran on an IBM 1500 Instructional System, designed exclusively for CAI courses and written in a unique language, Coursewriter. But the IBM 1500 was obsolete and had been decommissioned.

The last act of decommissioning was to print out all the programs, resulting in a stack of green-bar paper about 6 feet tall. My job was to learn that dead language, then rewrite the software in BASIC so it would run on the PDP-11. The framework was simple: the courses presented a section of instructional text, some multiple-choice questions, and a scoring system.

So, I found myself the senior programmer on the project. I helped write some of the new BASIC software, experimenting on my SOL at home. I borrowed a Carterfone 300 baud acoustic coupler modem from the project and wired it up to my SOL so I could write code from home.

The framework was designed so that typists untrained in programming could read the content from the printout and type the BASIC code around it without having to understand how it worked.
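The course framework described above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code (which was BASIC): each course is a list of sections, and each section carries its instructional text, a multiple-choice question, and the index of the correct answer, with the scoring logic kept separate from the content just as the framework kept the BASIC scaffolding separate from the typed-in material.

```python
# Illustrative sketch of the CAI course framework: instructional text,
# multiple-choice questions, and a scoring system. All names and the
# sample content are invented for this example.

COURSE = [
    {
        "text": "Dyslexia is a learning disability that affects reading.",
        "question": "Dyslexia primarily affects which skill?",
        "choices": ["Reading", "Hearing", "Balance"],
        "answer": 0,  # index of the correct choice
    },
]

def run_course(course, get_choice):
    """Present each section, ask its question, and tally the score.

    `get_choice` supplies the learner's answer as a choice index,
    standing in for terminal input.
    """
    score = 0
    for section in course:
        print(section["text"])
        print(section["question"])
        for i, choice in enumerate(section["choices"]):
            print(f"  {i + 1}) {choice}")
        if get_choice() == section["answer"]:
            score += 1
    return score
```

The point of the separation is that the per-section dictionaries are pure content: someone with no programming knowledge could fill them in from a printout without touching the presentation or scoring logic.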

This is where the troubles began

The PDP-11 hadn't shipped yet, so we wrote the software in an ANSI-compatible version of BASIC on a CDC Cyber mainframe. A dozen typists were hired and put to work on dumb terminals in the computer science building.

The software was written using a primitive line editor, a word processor that showed only one line at a time. The results were predictably disastrous. There were so many typos and errors that it took more time to fix the typists' code than it would have taken to write it all myself. So I did.

I had my SOL and a modem so I could work at home, but the typists only worked an hour or two each day. I was supposed to work in the office, but that would just make me available to fix everyone's bugs. So I started taking an inch or two of paper off the printout stack home with me, spending late nights typing it in.

The director didn't seem to notice the stack getting shorter, or the stored files getting larger. He questioned why I was never in the office, yet my time cards showed many hours of overtime. When I showed him my file storage, he couldn't believe I was doing more work than the rest of the team. Over the next few months, I coded about 90 per cent of the project by myself.

Your laptop? Show me your car park-top

Throughout the cold winter months, we wrote the code and tested it on the Cyber mainframe, consistently falling behind the director's optimistic schedule. Cyber BASIC was compatible with the DEC's. I thought it was clever that we'd used an ASCII system instead of the university's IBM/360, which used IBM's proprietary (and incompatible) EBCDIC character encoding; that would have been a problem. But nobody noticed that we had no way to move the software from the Cyber to the PDP-11. The director planned on dumping the code to a DEC disk pack, but the format was incompatible with the Cyber. Eventually we found a contractor that had a DEC and a CDC computer on the same network and could convert our tapes. But these unexpected delays put the project further behind schedule.
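The ASCII/EBCDIC clash mentioned above is easy to demonstrate: the two encodings map the same characters to entirely different byte values, so a file moved between systems without translation comes out as gibberish. A small sketch, using Python's `cp037` codec as a stand-in for one common EBCDIC code page (the 360's exact code page is an assumption here):

```python
# The same text encodes to completely different bytes under ASCII
# and EBCDIC. "cp037" is one common EBCDIC code page, used here
# purely for illustration.

text = "10 PRINT"

ascii_bytes = text.encode("ascii")   # what the Cyber/DEC world expects
ebcdic_bytes = text.encode("cp037")  # what an IBM/360 would store

print(ascii_bytes.hex())
print(ebcdic_bytes.hex())

# Not a single byte matches: e.g. 'A' is 0x41 in ASCII but 0xC1 in EBCDIC,
# so raw tapes from one system are unreadable on the other without conversion.
```

This is why staying within ASCII systems avoided one conversion problem, even though the disk-pack and tape formats still had to be converted by a third party.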
