Net neutrality is heading to the courts (again): So will the current rules stand or be overturned (again)?

We dig in and find a surprising answer


The rules governing what companies selling internet access to folks are allowed to do are heading to America's law courts yet again.

Yep, it's net neutrality round three – and with all the legal briefs now filed, we have dug through the competing arguments to try to discern which way it will go.

The Washington DC Court of Appeals will hear arguments for and against the Federal Communications Commission (FCC) Open Internet Order on February 1 and it could well see the current 2018 rules thrown out, with America's internet returning to the previous 2015 rules.

The main argument against the current rules is two-fold: first, that scrapping the previous rules didn't come with a decent explanation or well-founded reasons – it was purely ideological; and second, that the new rules effectively remove the FCC from overseeing a critical service for millions of US citizens, meaning the regulator has abdicated its role and the decision is therefore illegal.

The first point should get an audience. The 2015 rules – introduced during the Obama Administration and under FCC chair Tom Wheeler – were put in place following a lengthy policy process. And they were created out of necessity after the court threw out another set of rules covering internet access – the 2011 rules.

There were lots of people who did not like those 2015 rules (ourselves included) but they were instituted properly and through the long-established processes that are designed to dig into the issues, develop various proposals, incorporate feedback, and provide reasonable explanations for the decisions made.

By comparison, the current 2018 rules – introduced in the Trump Administration under FCC chair Ajit Pai – were an embarrassment to policy making. The decision was made before any deliberations were carried out and the regulator actively downplayed or ignored concerns or criticisms. Little or no effort was made to dig into the issues: the sole focus was to scrap the previous rules.

What's more, because the ideology behind the new rules was unlikely to stand up to scrutiny, the FCC took the peculiar decision to simply write itself out of regulating the internet access market at all, handing the reins to its sister regulator, the FTC.

Pay the price?

There should be a price to pay for this sort of lazy, unprincipled, blinkered approach to policy making, and the courts are, of course, the best place to exact it.

But, it is very far from certain that the DC Appeals Court – which, by the way, was the same court that tore up the 2011 rules and got this whole mess started – will agree with the petitioners and scrap the current FCC rules just because the process itself was a shit-show.

And the reason for that is quite simple: because the courts follow legal precedents and try their absolute best not to make judgments about how well something has been done, just whether it has been done according to the rules.

And on that point, as pathetic as the policy process the FCC followed in developing the current rules was, the regulator did, in the strictest sense, dot all the 'i's and cross all the 't's.

In addition, legal precedent points to the federal regulator being allowed to develop its own rules for issues under its own remit. In short, the courts have repeatedly granted federal regulators the autonomy they need to do their job. That's a good thing most of the time because it fends off efforts by Congress and others to pressure regulators into deciding things their way.

That support of autonomy is likely to see the DC Appeals Court come down in favor of the FCC. It may be a mess, but it's their mess. And if it doesn't work out, then the next FCC will come in and fix it.

That is of course a major headache for everyone involved – the risk of net neutrality rules bouncing back-and-forth every four or eight years as elections are held and the composition of the regulator changes. But the alternative could be equally unpleasant: a rebalancing of power between the three branches of government, with the judiciary gaining an edge.

While the executive (the White House) currently seems determined to make itself the dominant force, and the legislature (Congress) is too busy with in-fighting to even notice, America's sanity check is the judiciary (the courts). If they start arguing for greater power, then issues like net neutrality are going to pale in comparison.

Transparent problems

So, on the balance of probability, the FCC is going to win the argument. It may have embarrassed itself with its flawed and distorted policy process to come up with new net neutrality rules but that will be for the next FCC administration to sort out.

However, there is also a dark horse legal argument in front of the DC Appeals Court which could unravel the entire setup and see the court scrap the 2018 rules in their entirety. And it concerns the one regulatory thing that the FCC did actually introduce: a transparency requirement.

When the FCC decided in the 2018 rules to effectively throw out any rules covering constraints on internet access, including removing itself from carrying out any oversight, it argued that its approach wasn't going to cause major problems because it was going to require all ISPs to announce publicly what they were doing.

So if an ISP like Comcast does decide to start throttling, say, Netflix because Netflix refused to pay it the amount of money it demanded, then Comcast will be required to state that publicly on its website.

The idea then is that the market will determine what is acceptable or not. If people love Netflix and hate that Comcast is throttling it, they will simply drop the ISP and give their money to a different ISP.

Aside from the fact that this approach presumes there is a competitive market in the US for internet access – which there most definitely is not – the free-market argument and the transparency rule actually put the FCC in very tricky legal waters.

In the legal filing [PDF] from the Internet Association (which was the same organization that successfully pushed for the most controversial aspect of the 2015 rules – a legal reclassification of internet access), the trade group has dug into the issue and argues that the transparency requirement undermines the entire Open Internet Order (the 2018 rules).

Oh dear

Here is its logic: The FCC doesn't have the authority to issue the transparency rules in the first place.

How come? Because Congress repealed the very statute that the FCC relies on to make the rules in the first place. That statute – section 257(c) of the Communications Act – expressly said that any regulations the FCC came up with to "identify and eliminate market barriers" would be authorized under section 257.

So if you remove section 257 – which Congress has since done – the FCC's authority to make regulations covering market barriers also goes. In short, the overzealous push to deregulate everything may have undermined the FCC's ability to set any rules.

And the transparency rule is critical to the broader decision to scrap the 2015 net neutrality rules – something that the FCC has itself repeatedly noted.

Of course the FCC has a counter-argument: the transparency rule only "identifies" market barriers and is not designed to "eliminate" any of them, and so it doesn't need to rely on that grant of Congressional authority.

It's not a great argument, at least not in legal terms. Authority is granted very specifically and you don't get to pick and choose.

Not only is the FCC's argument that it needs authority to impose a rule only if that rule "eliminates" market barriers questionable, but its assertion that the transparency rule only "identifies" barriers is also suspect. After all, the rule was specifically added to make sure that ISPs didn't run amok. In that sense, its desired impact is to eliminate some practices and barriers.

It is worth noting that the Washington DC Appeals Court decision that threw out the original 2011 rules – something that many ISPs now wish had never happened – focused on a similar issue of whether the FCC had the authority to impose the rules it did.

So there is a significant likelihood that the FCC's new 2018 rules could collapse under their own weight. And we would return, again, to the 2015 rules.

Bigger problem

All of which is an indicator of a much bigger problem with American public institutions right now: the refusal to compromise and the determination to jam through your preferred approach instead cause endless stalemates and battles without stability.

If the current rules are scrapped, the FCC under Pai would likely embark on another effort to do much the same thing but working around the legal problems. And then that approach would likely be scrapped by the next FCC. And on and on and on.

The reality is that the problem doesn't even lie with the FCC: it is stuck working within a legal framework that was developed in 1996 and is wholly inadequate for the modern world we live in.

The truth is that the most controversial aspect of the 2015 rules – that internet access was legally redefined to be equivalent to telephone networks (so-called Title II) – doesn't actually make any sense, because the internet is nothing like the telephone.

But the FCC under Wheeler felt it had no choice but to go for that classification because it was the only one that gave it the authority to impose the other things it felt were essential, such as preventing ISPs from blocking or throttling content from third parties.

Defining the internet as Title II is just bad policy. But the FCC felt it had no choice because the ISP industry wasn't open to a compromise solution and it had to come up with some set of rules after the 2011 rules were thrown out.

Likewise, the current legal classification – Title I – gives ISPs way too much leeway for a service that is increasingly essential to life in the digital world. Pretty much the only people who think giving ISPs free rein over internet content is a good idea are the ISPs themselves and the people in government who receive their patronage. There is a reason the FCC exists – and the biggest is ensuring that the enormous power, influence and profit that comes with telecommunications isn't abused, and that Americans overall benefit.

Title trouble

The only viable solution is of course for Congress to develop a new law that adequately accounts for the modern world. It would be "Title VIII: Broadband networks." And it would cover the reality of internet access and incorporate all the things that everyone agrees with (or says they agree with) such as no blocking; throttling only under specific circumstances; network management; and so on.

(FYI: Title I covers "general provisions"; Title II is "common carriers" i.e. telephones; Title III is radio; Titles IV and V are bureaucratic provisions; Title VI covers cable; and Title VII is miscellaneous and covers everything from captions to disabilities.)

Such an approach would bring out Big Cable's greatest fear: the ability of the FCC to set prices. Because, for better or worse, the internet access market has been closely tied to the cable industry's preferred model of bundling a huge array of different options at widely varying prices that shift according to where you live.

There is room for a big compromise where cable companies let go of the idea of making billions from squeezing content companies and in return get the ability to set their own prices.

But it will only happen when Congress shows itself capable of agreeing to compromise for the betterment of all. And there are no signs of that happening any time soon, so net neutrality will have to continue to act as a proxy for the wider failure of the US government.

But who knows? There is seemingly broad agreement right now that the internet giants – Facebook, Google, Amazon et al – need new regulations imposed on them. It is possible – possible – that the issue of internet access could be tackled at the same time, and that Congress could create a new set of laws that would allow the United States to thrive for the next couple of decades without being dragged down by regulatory arguments.

Here's hoping. ®
