Classic telly FX tech: How the Tardis flew before the CGI era

How the special effects boys made magic in the 1960s, 70s and 80s


Doctor Who @ 50

These days it’s all done with computers, of course.

CGI – short for Computer-Generated Images, or Imagery – was a well-established visual effects technique long before Doctor Who was rebooted in 2005, so it was never in doubt that on-set mechanical effects would be duly combined with CGI visuals during post-production. Both the new series’ CGI and its picture compositing work were handled by The Mill until it closed its UK TV branch in April 2013.

Doctor Who McCoy titles

A rare use of CGI in classic Who: Sylvester McCoy’s opening titles
Source: BBC

TV production, composition and storage are now entirely digital, so computers are a necessary and inherent part of the production process. Not so in the 1960s and 70s, during the classic series’ lifetime. Back then the final product was analogue: two-inch Quad videotape masters made from edited videotaped studio footage and telecined 16mm film from location work and model shots.

By the end of Doctor Who’s initial run, computers were already being used in TV graphic design, model photography and video effects. Think, respectively, of Oliver Elmes’ title sequence for Sylvester McCoy’s Doctor, of the motion-control opening Time Lord space station model shot of the mammoth Trial of a Time Lord season, and of the various pink skies and blue rocks applied to extraterrestrial environments during the Colin Baker and McCoy eras.

Yet all these computer applications ultimately still resulted in analogue footage. A sequence shot on analogue videotape would be digitised, tweaked in a gadget like Quantel’s Paintbox rig, and then converted back into the analogue domain to be edited into the rest of the (also analogue) material. During the early 1980s (1983 in the case of Doctor Who) the BBC moved from two-inch Quad tapes to the more compact, more sophisticated one-inch C Format tape, but it was still analogue.
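
That round trip is easy to model in miniature. Here’s a hedged Python/NumPy sketch – a sine wave standing in for one scan line of video, and nothing like Quantel’s actual signal path:

```python
import numpy as np

# Toy model: one scan line of analogue video as a continuous waveform
t = np.linspace(0, 1, 1000)
analogue_in = 0.5 + 0.5 * np.sin(2 * np.pi * 5 * t)     # brightness, 0..1

digital = np.round(analogue_in * 255).astype(np.uint8)  # digitise: 8-bit ADC
digital = 255 - digital                                 # the digital "tweak": invert
analogue_out = digital / 255.0                          # DAC: back to analogue

# The effect survives the round trip into the analogue domain,
# give or take one quantisation step
assert abs(analogue_out[0] - (1 - analogue_in[0])) < 1 / 255
```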

Leisure Hive

Want to pull the Doctor apart? You’ll be wanting Paintbox then
Source: BBC

Paintbox was introduced in 1981 and didn’t achieve widespread use until the decade’s middle years. Then as now, doing effects digitally was relatively easy – it’s just a matter of altering or combining pixel colour values. If one of Image A’s pixels in the framestore has a certain RGB value, write the output pixel value from stored Image B instead. Creating a good “green screen” shot is a little more complicated than that, of course, but that is essentially the algorithm for, say, superimposing a shot of the Doctor onto Raxacoricofallapatorius or one of RTD’s other worlds with outlandish names.
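
That pixel-swap rule is short enough to sketch in code. The following Python/NumPy toy illustrates the principle only – the images, key colour and tolerance are invented for the example, and it bears no relation to The Mill’s actual tooling:

```python
import numpy as np

def chroma_key(image_a, image_b, key_rgb, tolerance=40):
    """Where Image A is close to the key colour, output Image B's pixel."""
    diff = image_a.astype(int) - np.array(key_rgb)
    distance = np.sqrt((diff ** 2).sum(axis=-1))  # per-pixel colour distance
    matte = distance < tolerance                  # True where the key shows
    composite = image_a.copy()
    composite[matte] = image_b[matte]             # write Image B's value instead
    return composite

# Image A: the Doctor against a green backdrop; Image B: a pink alien sky
doctor = np.zeros((576, 768, 3), dtype=np.uint8)
doctor[:] = (0, 255, 0)                           # all green for the demo
sky = np.full((576, 768, 3), (255, 105, 180), dtype=np.uint8)
print(chroma_key(doctor, sky, key_rgb=(0, 255, 0))[0, 0])  # -> [255 105 180]
```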

Creating the same shot entirely with analogue kit was something else, however. So how did Doctor Who’s special effects technicians make the Tardis and all those silver-sprayed washing-up liquid bottle spacecraft seem to fly through the stars, and make the Doctor appear to be slugging it out with little green men on alien sands when yet another Buckinghamshire quarry shot simply would not do?

Back in the 1960s, even the basics of colour video signal manipulation were unavailable to the special effects boys. As tight as Doctor Who’s budget was, its designers and the model makers who turned blueprints into physical objects for photography were able to come up with incredibly detailed work.

The Daleks

Existing footage doesn’t do the Dalek city model work justice. Here it’s filmed alongside actors in a false-perspective shot
Source: BBC

Looking back at early stories on DVD, most of which, though restored, still derive from low-quality film or video duplicates many stages removed from the original footage, it can be hard to appreciate how good the original imagery was – though back then tellies used a mere 377 lines to display the picture.

The first Doctor Who stories, recorded in 1963, were transmitted using the System A format. Devised by EMI, System A streamed television pictures as sequential fields of alternating lines, two fields interlaced together forming a single frame. The moving picture was transmitted at a rate of 50 fields every second – so 25 frames a second – to harmonise it with the frequency of the mains electric current driving studio lighting. To have used a different frequency would have introduced strobing picture interference.
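
The weave itself is simple to picture in code. A hedged Python/NumPy illustration follows – the field dimensions are invented round numbers, and real 405-line fields actually split the odd line count with half-lines, which this ignores:

```python
import numpy as np

FIELD_RATE = 50   # fields per second, locked to the 50Hz mains

def weave(odd_field, even_field):
    """Interlace two fields of alternating lines into one frame."""
    frame = np.empty((odd_field.shape[0] * 2, odd_field.shape[1]),
                     dtype=odd_field.dtype)
    frame[0::2] = odd_field    # lines 1, 3, 5, ...
    frame[1::2] = even_field   # lines 2, 4, 6, ...
    return frame

frame = weave(np.zeros((188, 512)), np.ones((188, 512)))
print(frame.shape[0], "lines per frame,", FIELD_RATE // 2, "frames a second")
```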

System A actually supported 405 scan lines in a frame, but only 377 were used for the picture, the rest being left blank to give slow display circuitry time to catch up with the incoming signal at the start of each new field. System A not only had just 65 per cent of the vertical resolution of later standards, it also couldn’t do colour.
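
The arithmetic behind those figures, assuming the 65 per cent is measured against the 576 visible lines of the later 625-line standard:

```python
ACTIVE_LINES_A = 377      # System A picture lines
TOTAL_LINES_A = 405       # including vertical blanking
ACTIVE_LINES_625 = 576    # visible lines on later 625-line sets

print(TOTAL_LINES_A - ACTIVE_LINES_A, "blanking lines")            # 28
print(round(100 * ACTIVE_LINES_A / ACTIVE_LINES_625), "per cent")  # 65
```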

The Space Pirates

There’s some nice model work in The Space Pirates – but no stars
Source: BBC

Video might be fine for capturing actors’ performances on brightly lit studio sets, but it wasn’t really any good for model work. Videotape made models look even more like models than they did anyway – some of the model shots in Frontier in Space, Carnival of Monsters and, later, Full Circle demonstrate this flaw perfectly.

So effects shots were filmed instead, allowing photographers to use more subtle lighting; the footage was then telecined to videotape and edited into the master tape. Rockets, say, were mounted in front of a suitably starry backdrop, typically a back-lit black cloth with tiny circles cut where the photographer wanted the stars to appear. Sometimes the technicians didn’t even bother with stars. Quite a few of the later Patrick Troughton stories, most notably The Space Pirates, are full of ship models moving across the screen against a plain black background.

