GitHub Copilot: Could This Be The End of Developer Jobs?

Published 29 Jul 2021

By William Feng

We all thought that pursuing a degree in computer science would be safe. As we enter a digital era driven by technological advancements worldwide, an alarming number of occupations are becoming outdated, automated or simply rendered redundant. Surely a career in a STEM-related field would be immune to unemployment? But now we’ve gone so far as to develop something that will write code for us. Could this be the beginning of how “artificial intelligence could spell the end of the human race”?

What is Copilot?

Described as “Your AI pair programmer”, GitHub Copilot is a service that helps developers write better code by providing suggestions. Powered by Codex – a deep neural network language model created by OpenAI – Copilot was trained on billions of lines of open-source code from GitHub.

The official website explains how GitHub Copilot works.

All you have to do is provide some meaningful comments (sometimes a clear function name is sufficient) about what you want to achieve, and Copilot will produce a series of suggestions that you can choose from. Conveniently available as an extension for Visual Studio Code, it is also compatible with many different programming languages.
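To see what that looks like in practice, here is a hand-written sketch of the kind of prompt and completion involved: a descriptive comment plus a function signature as the context, and a function body of the sort Copilot might suggest. The body below was written by hand for illustration, not produced by Copilot itself.

```python
# Prompt: a descriptive comment and a clear function name give the
# tool enough context to suggest a body.
# (The implementation below is hand-written for illustration,
# not actual Copilot output.)

def is_palindrome(text: str) -> bool:
    """Check whether a string reads the same forwards and backwards,
    ignoring case and non-alphanumeric characters."""
    cleaned = "".join(ch.lower() for ch in text if ch.isalnum())
    return cleaned == cleaned[::-1]

print(is_palindrome("A man, a plan, a canal: Panama"))  # True
print(is_palindrome("hello world"))                     # False
```

Even for a toy task like this, you still choose between suggestions and verify the result, which is exactly the workflow the official site describes.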


GitHub Copilot generates code recommendations by drawing context from the file that you’re working on, increasing your overall efficiency. With its smart autocomplete feature, it will streamline the process of writing boilerplate code and tests, reducing the need to search for answers or examples on the Internet. In addition, the fact that it can show alternatives rather than just a single suggestion means that you can evaluate the provided solutions and make adjustments accordingly. This breakthrough can lower the barrier for beginners entering the industry, and really highlights how machine learning can help developers build software.

An example of GitHub Copilot’s autocomplete functionality from its official website.


Unfortunately, major issues also arise with the development of such services. For starters: copyright. As mentioned in this article, “the user has no way of knowing if the algorithm made a particular piece of code up by itself or stole it from a code repository protected by a license”. There have been many reported cases where the code generated by Copilot was identical to an original open-source repository. Just check out this link, where the user wanted an “About me” page and Copilot literally reproduced an actual person’s webpage. A phenomenal tool? Or a copycat exploiting other programmers’ hard work?

On its official website, GitHub tries to address this issue by describing the software as “a code synthesizer, not a search engine”. Stating that Copilot was trained on data from publicly available sources, it also says that we don’t need to credit the service and that all of its suggested code belongs to us. Too good to be true? Well, it means that you or your company could be sued if the autocompleted code in your final program breached copyright law. Just imagine the severe consequences that could cause…

Armin Ronacher, creator of the Flask framework for Python, unimpressed with the potential copyright violation as he demonstrates an example of plagiarism here.

Now, was Microsoft really acting with philanthropic interests when it invested $1 billion in OpenAI two years ago, with the aim of making the lives of programmers easier? Its underlying objective was still to assert dominance in the industry, potentially even to create a monopoly. Is it legal for Microsoft to monetise this tool if it reproduces code snippets from licensed repositories? While it says that no private code is shared with other users, Microsoft also states that your code “is used to improve future versions of the AI system”. Who knows what they are doing with your work? In contrast, if you are interested in how such data mining might not be copyright infringement, take a look at these two articles here and here.

On another note: inaccuracy. Why do you think Copilot offers alternative suggestions? Because it’s nowhere close to being accurate, and you still need to check and modify the code to ensure that it fits the context. According to OpenAI’s research, Copilot only gives the correct solution 43% of the time on its first try. Reading code can be harder than writing code, and it is essential that programmers understand every line that has been autocompleted rather than blindly using it. Note that Copilot doesn’t compile the code or check that it actually does what it’s supposed to, so a lot of effort may still be required to debug and refactor its suggestions. Otherwise, insecure coding patterns, failure to account for edge cases or references to deprecated libraries could all become the root cause of a corporation’s downfall.
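To make the edge-case point concrete, here is a hand-written sketch (not actual Copilot output) of a plausible-looking suggestion that hides a bug a careless reader would miss, alongside the version a careful review would produce:

```python
# A plausible-looking "autocompleted" average function (hand-written
# for illustration). It looks correct, but crashes with
# ZeroDivisionError when given an empty list.
def average_naive(numbers):
    return sum(numbers) / len(numbers)

# The reviewed version handles the empty-list edge case explicitly.
def average(numbers):
    if not numbers:
        return 0.0
    return sum(numbers) / len(numbers)

print(average([1, 2, 3]))  # 2.0
print(average([]))         # 0.0 instead of a crash
```

The two functions are identical on every non-empty input, which is exactly why this class of bug survives a quick glance: you only find it by reading the code critically, not by accepting the suggestion.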

Final Verdict

If you’re worried about your career being at risk: you shouldn’t be. Even if we ignore the aforementioned issues of copyright and inaccuracy, GitHub Copilot is only supposed to improve the speed and convenience of writing code – but not replace your worth as a programmer. Just because it helps us solve problems does not mean that it has the capability of making our jobs obsolete. After all, that’s like saying “What’s the point of learning to code if I can just copy and paste from Stack Overflow?” As you may have realised after all those painful hours of debugging, it’s not as simple as that.

Remember that software engineers aren’t only responsible for programming either – defining the problem and setting up the initial structure aren’t achievable by Copilot. It can enhance our productivity at best; it’s simply unable to complete complex tasks like interpreting customer requirements or magically building a whole website.

Such a tool also lacks the human factor of innovation in software development. This is particularly true of this AI, since its suggestions are based on existing code from its training data. Of course, Copilot needs to be strictly prohibited for university students, as it obviously undermines the integrity of assignments and exams. If you take the easy way out during the early stage of your fledgling career, you might really struggle to remain competitive in the future.

Regardless, it’s incredible how far artificial intelligence has come, and how GitHub Copilot could realistically become integral to the daily workflow of programmers. Who knows what we can achieve in, say, the next 20 years? Maybe there will eventually come a point where there is an AI takeover… Luckily for us, that won’t be happening anytime soon.

Tags: Machine Learning, Deep Learning and Neural Networks, Skills and Guidance, Opinion