ARPANET pioneer Jack Haverty says the internet was never finished

When he retired with stuff left on his to-do list, he expected fixes would flow. They haven't

Early-internet pioneer Jack Haverty has described the early structure of the internet as experimental – and said not much has changed since.

Haverty was a protégé of Professor JCR Licklider in the early '70s, when he worked on the then brand-new ARPANET. He is widely credited with developing the File Transfer Protocol, the RFC format still used for internet standards, and one of the world's first email systems. A contemporary of the likes of Vint Cerf and Bob Kahn, Haverty later joined Oracle in the '90s, working alongside folks like Tim Berners-Lee to connect web servers to databases.

TCP had become a standard. Our immediate reaction was 'Wait: it's not done yet'

The way Haverty told it, the process of turning those early experiments into standards was deeply inelegant, with many efforts abandoned and others set in stone without much notice.

Yesterday he delivered a keynote speech at the Asia Pacific Regional Internet Conference on Operational Technologies (APRICOT), in which he reminisced that the internet as we know it came to be sometime around the end of 1981, when attention shifted to network operation and to making internetworking technology reliable enough for 24/7 service.

"At one of the quarterly meetings Vint Cerf came in and dropped a bombshell on us: he said TCP had become a standard. Our immediate reaction, or at least my reaction, was 'Wait: it's not done yet. We have this long list of things we still have to figure out'," Haverty recalled in his speech.

The technology was out of developers' hands before they felt ready to let it go.

Haverty said the teams he worked with always expected to fix and polish their work – not just build on top of what he referred to as an "experiment."

"There's all sorts of operational issues that we went through and developed but they haven't made it into the real world,” he said.

In a subsequent role as internet architect at Oracle, Haverty found himself working to add resilience to TCP, which at the time interpreted slow data transfers as unwanted duplicates.
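One mechanism behind that kind of symptom is TCP's retransmission timer: a sender whose round-trip-time estimate is too low for a slow path resends segments prematurely, so the receiver sees a stream of apparent duplicates. As an illustration only, and not Haverty's or Oracle's actual fix, here is a minimal sketch of the retransmission-timeout calculation later standardized in RFC 6298; the smoothing constants come from the RFC, while the traffic values are invented:

```python
# Sketch of the RFC 6298 retransmission-timeout (RTO) calculation.
# A sender that underestimates RTT on a slow path times out too
# eagerly and retransmits data the receiver already has.

ALPHA, BETA = 1 / 8, 1 / 4   # smoothing factors from RFC 6298
K, MIN_RTO = 4, 1.0          # variance multiplier and RTO floor (seconds)

def update_rto(srtt, rttvar, sample):
    """Fold one RTT measurement (seconds) into the smoothed state."""
    if srtt is None:                      # first measurement seeds the state
        srtt, rttvar = sample, sample / 2
    else:
        rttvar = (1 - BETA) * rttvar + BETA * abs(srtt - sample)
        srtt = (1 - ALPHA) * srtt + ALPHA * sample
    rto = max(MIN_RTO, srtt + K * rttvar)
    return srtt, rttvar, rto

# A path whose RTT swings between 50 ms and 900 ms keeps the
# estimator chasing a moving target:
srtt = rttvar = None
for sample in [0.05, 0.9, 0.05, 0.9]:
    srtt, rttvar, rto = update_rto(srtt, rttvar, sample)
```

Whenever a segment takes longer than the computed RTO to be acknowledged, the sender retransmits it, which is how a merely slow transfer can turn into duplicate traffic.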

Since leaving Oracle, he's become, by his own description, just a user.

"I'm now just one of billions of users. So I don't really know what's going on inside the networks now and I haven't really paid much attention literally for several decades," said Haverty.

After a friend enlisted his help with a poorly functioning app, he started looking at how the modern internet works.

"We found, much to my surprise, that things like TCP dump, Wireshark, Ping and Traceroute and all those kind of tools that I used to use back in the '80s and '90s still work," said Haverty.

And his friend's problem? Data travelling long distances arrived at varying speeds, wreaking havoc on the app's ability to function – the same problem he encountered at Oracle.
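The symptom here, packets arriving at irregular intervals, is what real-time applications measure as jitter. As an illustration only, here is a minimal sketch of the interarrival-jitter estimator defined in RFC 3550 (the RTP specification), with invented packet timings:

```python
# Sketch of RFC 3550's interarrival jitter estimator: a running,
# smoothed average of how much packet spacing at the receiver
# deviates from the spacing at the sender. Large values are what
# break latency-sensitive apps like the one described above.

def update_jitter(jitter, sent_gap, recv_gap):
    """Fold one packet's spacing deviation (seconds) into the estimate."""
    d = abs(recv_gap - sent_gap)          # deviation for this packet
    return jitter + (d - jitter) / 16     # 1/16 gain, per RFC 3550

# Packets sent every 20 ms, but arriving 20 ms, 60 ms, then 5 ms apart:
jitter = 0.0
for recv_gap in [0.020, 0.060, 0.005]:
    jitter = update_jitter(jitter, 0.020, recv_gap)
```

The 1/16 gain deliberately makes the estimate react slowly, so a single late packet nudges it rather than spiking it; sustained variation over long paths is what drives it up.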

"The conclusion I draw from all this is the internet is amazing. None of us ever thought that it could possibly last this long or grow this big," said Haverty, who then added his major point: that just like 50 years ago, the internet should all just work, but the reality is it still doesn't.

"There's still a long list of things that have to get done," he said. Some, such as refinements to TCP he was keen on in the 1990s, remain placeholders.

The entire speech can be viewed below. ®

YouTube Video
