Faster Python: Mark Shannon, author of newly endorsed plan, speaks to The Register

The biggest challenge? 'Backwards compatibility of features that we might not even know we have'

Interview Python creator Guido van Rossum last week introduced a project to make CPython, the official implementation, five times faster in four years. Now Mark Shannon – one of the three initial members of the project – has opened up about the why and the how.

Shannon, formerly a research engineer at Semmle, a code security analysis company which is part of Microsoft-owned GitHub, is now contracting full-time for Microsoft on CPython. At the Python Language Summit last week, Van Rossum said that he would be part of a new team, funded by Microsoft, alongside Shannon and Eric Snow (a senior software engineer also at Microsoft), with the aim of speeding up CPython. He referred to the "Shannon plan" as the basis for achieving a "2x speedup in 3.11", with a hope of 5x in four years. Version 3.11 of CPython is likely to be released around October 2022.

Until now Van Rossum had suggested that speeding up CPython was not critical because there are many other ways to get better performance, such as writing extensions in C or using the JIT-compiled PyPy. Why the change of heart?

"You'd need to ask Guido about that, but I think it is because the popularity of Python in machine learning has exploded over the last few years... it means we can invest money in performance without worrying about it undermining reliability and such because there's more resources available," Shannon tells The Reg. "That's my take on it."

Shannon worked on a project called HotPy that was part of his PhD, "implementing parts of the language generally regarded as hard to optimize to prove that it would work," then later on HotPy2 which was "taking those ideas and applying them to CPython, which sort of worked, but that was no longer viable," he says. Then "it all went on the back burner," he adds. What changed? Last year he was able to spend more time on it and posted his plan for faster Python, then "Guido decided when he started working for Microsoft that he would look into the performance," which led to the new project.

What about the argument that the CPython interpreter should be kept simple? "I think as an open-source project, that sort of accessibility to the code is important. That doesn't mean it has to be trivial, just that it needs to be well organised and modular and well documented," Shannon says. "We want to keep that."

Does Van Rossum's involvement in the new group mean that its work will definitely make it into official CPython? "It doesn't absolutely guarantee it. It's following the open-source development model so everything gets committed, reviewed, onto the main development branch of CPython, incrementally. So we're always working on improving CPython rather than having these separate big projects. That does put some constraints on the way we can develop it, or often more in the order in which we do stuff… but it's pretty much guaranteed to be going in. Obviously, Guido's status is only going to help," Shannon says.

How does the group's work fit with all the other projects that speed up Python, such as PyPy, Facebook's Cinder, Pyston and others? "I've been working on Python optimisation since I did my PhD a decade ago," Shannon says. "I've been in close communication with the PyPy people, less so lately. There's always this exchange of ideas, we look through the other projects, see what they've done… often direct use of code is awkward but for a developed and tested idea, rewriting the code for that is usually not such a large amount of work… where they've tried things and they don't work, that's often valuable information to us." It's up to other projects whether they upstream their code, he adds.

There is also the question of whether the new project will reduce interest in the other options. In the case of Cinder, "I don't think we will particularly obsolete it, because it's their thing to be as fast as possible on their code and to have no interest in anything else," Shannon tells us. "As for Pyston, I think Pyston's goals are quite similar… happy to talk to them, happy to learn from them."

We discussed whether the work will focus more on Python for data science or on general-purpose use cases such as websites. "Our interest is to improve the performance of Python in general, both for the web and the data science community. We probably will focus on the web for the moment but that's more to do with what we can optimize first and what's practical… the idea is that we shouldn't privilege a workload, so we shouldn't do anything that will harm one sort of use case over another."

The ML case is unusual, he says, because "for things like deep matrix processing it has to be written in custom languages that [are] tailored for things like Tensor Processing Units and so on… the Python is often there to do interaction at a higher level, loading models and that sort of stuff, and that workload is more 'normal'. So we can benefit that."

The big challenge: Backward compatibility

What is the biggest challenge with speeding up Python? "Backwards compatibility of features that we might not even know we have. The C API just evolved over time, so for bits of it, it's very awkward to change the underlying behaviour," he says. "There are other parts where we don't have a sufficient API so people have to rummage around the internals of CPython, so things like tools, debuggers and profilers, and Cython, which is an important one, and the NumPy stack, use some of the internals of CPython which we are not necessarily aware of," says Shannon. "Consequently, it's tricky to change those… the implicit contract with users of CPython as to what can change and what can't is not terribly well defined. It's getting better, and if we tried this say five years ago it would have been much more of a problem."
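
The kind of internals-rummaging he describes is easy to demonstrate. The following toy sketch – our illustration, not anything from the interview – uses ctypes to read an object's reference count straight out of the CPython object header, relying on implementation details (id() returning a memory address, the refcount being the first header field) that no documented API promises:

    import ctypes
    import sys

    # In CPython, id(obj) is the object's memory address, and the first
    # field of every object header is its reference count (a Py_ssize_t).
    # Both are implementation details, not guarantees of the language.
    x = []
    refcount = ctypes.c_ssize_t.from_address(id(x)).value

    print(refcount)            # read directly from the object header
    print(sys.getrefcount(x))  # the supported route; one higher, because
                               # it counts its own argument

Code that pokes at CPython this way keeps working only for as long as the layout it assumes does – which is precisely the ill-defined implicit contract Shannon is talking about.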

The community did not enjoy the backward compatibility break of the Python 2-to-3 transition, so the idea of a new, fast but incompatible version is a non-starter. "If anything is compared to 2 to 3, it's pretty much dead, because no one wants to go through that," he says.

Why has the use of a JIT compiler been deferred, in Shannon's plan, to after 3.11? "People tend to think it's this sort of magic cure-all. It isn't. It's a piece of engineering and if it's not well designed and implemented, it's not going to work. The way optimisation for dynamic languages works is you have to specialise first. Specialisation is saying: we'll rewrite this piece of code for the values and types that we expect to see. You can do that in the interpreter, or you can do it in the compiler, but if you do it in the interpreter then it is already done for the compiler, and the interpreter gets faster, so it makes sense to do it first," he claims.
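
A rough sketch of what specialisation means – ours, not CPython's actual machinery – is a generic instruction in a toy stack machine that rewrites itself into a type-specific fast path once it has seen its operands, with a guard to de-specialise if the guess goes stale:

    # Toy stack VM: the generic add instruction specialises itself for
    # ints after observing its operands; the specialised form falls back
    # if that assumption ever stops holding.
    def add_generic(vm, i):
        b, a = vm.stack.pop(), vm.stack.pop()
        if type(a) is int and type(b) is int:
            vm.code[i] = add_int        # rewrite this instruction in place
        vm.stack.append(a + b)          # full dynamic-dispatch protocol

    def add_int(vm, i):
        b, a = vm.stack.pop(), vm.stack.pop()
        if type(a) is not int or type(b) is not int:
            vm.code[i] = add_generic    # guard failed: de-specialise
            vm.stack.append(a + b)
            return
        vm.stack.append(int.__add__(a, b))  # int-only fast path

    class VM:
        def __init__(self, code):
            self.code = list(code)
        def run(self, *args):
            self.stack = list(args)
            for i, op in enumerate(self.code):
                op(self, i)
            return self.stack.pop()

    vm = VM([add_generic])
    print(vm.run(2, 3))  # first run observes ints and specialises
    print(vm.run(4, 5))  # subsequent runs take the fast path

The real thing operates on CPython bytecode and caches far more state, but the shape – observe, rewrite, guard, fall back – is the same, and it is exactly the information a later JIT can consume.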

There are also issues with minor platforms. "We're not going to produce a JIT compiler for every physical architecture," he says. "Whereas any interpreter speed-ups or virtual machine improvements just work on all architectures.

"There's broadly three things we can do. One is to speed up the interpreter itself with specialisation. One is to improve the data structures and the memory layouts, so things are more cache-efficient and better suited to modern architectures; and the third one is this translation to machine code, the just-in-time compiler. That third one is dependent on the others."

Why has it taken so long to focus on performance in CPython? "In practice it often isn't that slow and it doesn't matter," Shannon tells The Reg. One factor is that Python is often not the bottleneck, being much faster than the network. Another factor is that "Python does a lot of basic operations faster because the libraries are very well tuned and work well with the rest of the language," he adds. "If you've got to write some program and you need to read some files off disk and write them out somewhere or send them over the network, if you write your program in Python and you write one in C, you end up with probably a faster Python program because it's trivial to write it, and writing in C takes a lot of effort and you end up rewriting stuff which is already well written in the Python libraries which are themselves C. So it depends on your use case."
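
His file-copying example is easy to make concrete. In the sketch below (our illustration; the filenames are hypothetical), the buffering and byte-shuffling all happen in the standard library's C-implemented I/O layer, with Python acting only as glue:

    import shutil

    # The read/write loop inside copyfileobj runs over buffered binary
    # streams implemented in C, so the Python-level overhead is a handful
    # of calls rather than a per-byte cost.
    with open("input.bin", "rb") as src, open("output.bin", "wb") as dst:
        shutil.copyfileobj(src, dst)

A hand-rolled C version has to get its own buffering, error handling and copy loop right just to match this – which is Shannon's point.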

The counterexample is that sometimes what starts as a small project, where performance matters less than productivity, can grow into a huge code base where performance is critical – leading to projects like those at Facebook/Instagram (Cinder) and at Dropbox (Pyston, then abandoned in favour of Go).

"We'd like people not to have to make that choice so much," says Shannon. ®
