The company released Copilot in beta in June 2021 and describes the tool as an “AI pair programmer.” Copilot aims to help developers by proposing the next line of code as they type in popular development environments such as JetBrains IDEs, Neovim and Microsoft Visual Studio Code. Beyond single lines, it can also suggest complete methods and more complex blocks of code when required.

In a blog post, GitHub CEO Thomas Dohmke said that GitHub Copilot was designed as an editor extension to ensure that nothing interferes with what developers are doing. “GitHub Copilot distills the collective knowledge of the world’s developers into a real-time code extension that helps you stay focused on what matters most: creating great software,” he explained.

According to Dohmke, about 1.2 million developers tried Copilot during its preview phase. It apparently proved quite useful too, with Dohmke claiming that Copilot wrote up to 40% of the code in popular languages such as Python. “Just like the rise of compilers and open source, we believe that AI-assisted coding will fundamentally change the nature of software development, giving developers a new tool to write code more easily and quickly,” Dohmke said.

Code automation could well become the next competitive battleground in software development. Last year, DeepMind, a subsidiary of Google LLC parent company Alphabet Inc., introduced an artificial intelligence system called AlphaCode that is also quite capable of writing software code. DeepMind tested AlphaCode on a third-party coding platform called Codeforces, where it achieved an estimated ranking within the top 54% of human coders – not perfect, but a notable achievement nonetheless.

That said, the use of AI in coding has stirred some controversy.
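To illustrate the kind of suggestion described above (a hypothetical example for this article, not actual Copilot output): given only a comment or docstring and a function signature, an AI pair programmer can propose a complete function body.

```python
# Hypothetical illustration: the developer types the docstring and
# signature, and the tool suggests the body that follows.

def average(numbers):
    """Return the arithmetic mean of a list of numbers."""
    # --- suggested completion begins here ---
    if not numbers:
        raise ValueError("numbers must be non-empty")
    return sum(numbers) / len(numbers)


print(average([2, 4, 6]))  # 4.0
```

The developer can then accept, edit or reject the suggestion, which is how the tool stays out of the way rather than interrupting the normal editing flow.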
In Copilot’s case, the tool is powered by OpenAI Codex, a language model trained on billions of lines of publicly available source code and natural-language text, including code in public GitHub repositories.

That reliance on open-source training data has angered the Free Software Foundation, which has described Copilot as “unacceptable and unfair.” The foundation has questioned whether training Copilot on freely licensed code constitutes “fair use.” This is a problem, the foundation said, because Copilot is not itself free software, but rather a paid service that acts as a software substitute.

Copyright issues aside, a study published in December found that Copilot can pose a security risk, with up to 40% of its code output containing vulnerabilities.

GitHub did not address any of these complaints today, but it did at least acknowledge its debt to open source. “GitHub Copilot would not be possible without the vibrant community of students and creators on GitHub,” Dohmke said. “To support and reward these communities, we’re making GitHub Copilot free for verified students and maintainers of popular open-source projects.”

Image: GitHub