Pair programming? That's so 2017. Try out this deep-learning AI bot that autocompletes lines of source code for you
OpenAI's GPT-2 language model has been tweaked to help you code faster
Talk about working smarter, not harder. A computer-science student has the right idea, building an intriguing code-completion tool that uses deep-learning software to finish lines of source.
And while, yes, there are already a ton of source-code autocomplete tools available, this one, dubbed Deep TabNine, is said to be based on OpenAI’s impressive GPT-2 text-spewing engine, which makes it interesting in our book. GPT-2 features a trained neural network that can be given a writing prompt, such as an opening sentence of a news article, or a novel, or a line of code, and predict what should follow next. It can make these predictions because it has studied millions of webpages to get an idea of how humans tie topics, ideas, and words together.
Deep TabNine was developed by Jacob Jackson, a fourth-year compsci undergrad at the University of Waterloo, Canada, who previously produced the non-AI code completion plugin TabNine.
Deep TabNine, once it is installed in a code editor, analyzes each line of source as it's typed in by a human and suggests ways to complete each statement, kinda like a pair-programming partner. The coder can then pick from the list of suggestions to complete the unfinished line without having to type it all out. It is, essentially, like crafting an email with Google’s Smart Compose feature, or using one of the many non-AI autocomplete tools out there. Of course, if Deep TabNine makes a silly suggestion, the developer can just ignore it.
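The editor-side loop is, in essence: take the partial line, fetch candidate completions, show a ranked list, and let the coder accept one or ignore the lot. A minimal sketch of that plumbing, using a hard-coded candidate list where a real plugin would query the trained model:

```python
def suggest(partial_line, candidates, limit=3):
    """Return up to `limit` candidates that extend the partial line."""
    matches = [c for c in candidates
               if c.startswith(partial_line) and c != partial_line]
    return matches[:limit]

# Hypothetical completions a model might have ranked for this file.
ranked = [
    "self.assertEqual(result, expected)",
    "self.assertEqual(len(items), 0)",
    "self.assertTrue(flag)",
]

# The coder has typed "self.assertEq" so far; show matching suggestions.
print(suggest("self.assertEq", ranked))
```

The interesting part, of course, is producing and ranking the candidates in the first place; the model does that, and the editor plugin just filters and displays.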
“Autocompletion is a great use case for this type of technology,” Jackson told The Register. “All programmers spend a lot of time writing code that can plausibly be sped up using AI. In comparison to language generation, auto completion in code is technically an easier problem.”
Although Deep TabNine is based on the architecture of GPT-2, it has been slightly modified, we're told. Jackson was hesitant to reveal too many details since he’s trying to commercialize the project. He didn’t say which version of GPT-2 he used to develop Deep TabNine, but did say that using a model with more parameters isn’t always beneficial.
“Using more parameters isn’t necessarily better,” he explained. “The suggestions it makes would get smarter, but it would also be slower and less responsive when you’re typing.”
Deep TabNine is supposed to be aware of the context of the source code, and therefore smarter than typical code-completion toolkits. For example, if you're writing code within a mathematically focused function, it may suggest mathematical operations rather than methods that handle text strings. You can watch demos of Deep TabNine working in various programming languages here.
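One crude way to picture context-awareness, a guess at the general idea rather than how Deep TabNine actually works internally, is to rank candidate identifiers by how much they overlap with tokens already present in the surrounding code:

```python
import re

def rank_by_context(candidates, context):
    """Score each candidate by how many of its word-parts already occur
    in the surrounding code, so math-flavoured names float to the top
    inside math-heavy functions."""
    seen = set(re.findall(r"[a-z]+", context.lower()))

    def score(name):
        return sum(tok in seen for tok in re.findall(r"[a-z]+", name.lower()))

    # Stable sort: equal scores keep their original (model-ranked) order.
    return sorted(candidates, key=score, reverse=True)

context = "def mean(values): total = sum(values)"
candidates = ["sum_of_squares", "strip_whitespace", "total_count"]
print(rank_by_context(candidates, context))
```

A neural model does something far richer, learning which names and constructs co-occur across millions of files, but the effect seen by the coder is similar: suggestions that fit the function being written.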
At the moment, Jackson's tool is pretty computationally intensive, and requires GPUs to accelerate the task of crunching through as many as 10 billion floating-point calculations a second to come up with suggestions as you type. Your laptop may not be powerful enough to run the neural network effectively; instead, you can apply to use Deep TabNine via a beta-grade cloud service, which does all the predictions on a backend and beams the suggestions to your code editor over the internet.
Jackson is hoping to create a version that he can license out to companies so that they can run the software on their own servers to ensure their code is kept private, and not shuttled to and from a remote service for analysis and suggestions.
Coders interested in using Deep TabNine can sign up for a beta release here. ®