Ethical and productivity implications of intelligent code creation
- Posted on March 2, 2021
- Estimated reading time 4 minutes
Intelligent code creation (ICC) uses machine learning models and embedded intelligence to provide developer support for writing secure, best-in-class code. This can involve a developer co-pilot or an ‘AI pair programming’ approach that flags poor development practices or proposes alternative ways to solve a problem.
What is it?
This approach provides additional support for existing developers, suggesting ways to write complex algorithms based on training from other “best of breed” codebases. It also enables new developers to become productive on domain-specific code, or closed, enterprise-owned codebases. In short, this is not just an update to automatic code generation; it is about providing support in deciding what to develop next and how. AI becomes your assistant, rather than replacing human ingenuity.
Where is this going?
With the development of new machine learning models such as GPT-3, and the inevitable creation of newer models, we can expect this type of automated code generation and refactoring to improve over time and become a standard tool for developers.
We can see this already in Microsoft’s embedding of AI into existing auto-suggest tools like IntelliSense or IntelliCode. This will mean that a developer can hone their problem-solving and solution-design skills, rather than just the implementation and writing of code. You can already upload your own codebase to generate patterns for supporting custom application development.
While ICC is primarily a way to gain efficiency, incorporating these techniques into your software development lifecycle can introduce ethical issues worth considering. To begin with, it's helpful to focus on the ethically positive outcomes that can help make the business case for ICC. For example, it has the potential to make the software delivery process more inclusive for people with less coding experience as well as more accessible for people with certain disabilities or neurodiversities. By reducing the time needed to write mundane code, ICC may also help empower developers to be more creative and take more control of their work/life balance.
That said, there are also ethical concerns that should be addressed for any implementation of ICC. Considering its potentially positive impact on individuals, it’s important to ask whether ICC will give a select group of already-privileged developers more power, and whether expectations of greater efficiency will ultimately erode users’ space for creativity and balance. The scale of implementation may also determine whether there’s an outsized environmental impact, as the energy consumption for such models can be substantial.
To mitigate ethical risks, make sure your use of ICC aligns with organizational values (such as inclusivity, employee well-being, and environmental responsibility). Review your compensation model to avoid penalizing developers who use ICC as well. From a security perspective, while intelligent code may reduce common human errors, it also has the potential to introduce less common errors or unknown vulnerabilities at a systemic scale, so security and risk reviews are still essential.
Transparency is also key, so developers and managers know when ICC is in use and can monitor its short-term and long-term impacts on the SDLC accordingly. Finally, documentation requirements for intelligent code should be just as stringent as for manual code, and development teams should still be accountable for the quality, security, and compliance of the code they deliver.
As with any emerging technology, the ethical implications will largely depend on use cases and implementations, but careful consideration and treatment should help reduce potential harm and ensure positive outcomes.
Kite: Diving into one example of an ICC tool
For the purposes of demonstrating how an ICC tool works, we’ll discuss Kite. Kite provides AI-powered code completions in many different languages, allowing developers to quickly complete words, lines, or even multiple lines of code in a single keystroke. Kite has a free entry-level service as well as a paid version that provides access to Kite’s deep learning models for better multi-word suggestions.
Kite includes a “co-pilot” window that launches next to your IDE to display related documentation. All features are powered by a ‘Kite Engine’ developed to provide autocompletions and code examples embedded within the syntax at hand. Documentation lookup is only supported for Python at this time.
The documentation lookup is a great example of ICC, as it analyses your own codebase to provide relevant suggestions for the project you’re currently working on, in the local development context.
Using Kite is a learning process in itself, especially for experienced developers, but surfacing these useful insights within the IDE and the co-pilot window helps reduce context switching, improving productivity. Kite also uses its dataset of well-structured codebases to suggest completions that improve code quality, and can suggest efficient alternatives to code you may be writing.
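To make this concrete, here is a purely illustrative sketch of the kind of multi-line completion an ICC tool might propose: the developer types only the signature and docstring, and the tool offers the remaining body based on common patterns it has seen. The function name and behavior here are hypothetical examples of our own, not an actual Kite suggestion.

```python
import json

# The developer types just the signature and docstring; an ICC tool,
# trained on common codebase patterns, might propose the body below
# in a single keystroke. (Hypothetical illustration, not a real
# suggestion from any specific product.)
def read_json_config(path):
    """Load a JSON configuration file and return it as a dict."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)
```

The value is less in the code itself, which is routine, than in not having to leave the editor to recall the idiom: the completion, and the related documentation in the co-pilot window, arrive in the local development context.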
Numerous ICC tools in the market
Avanade doesn’t endorse any particular intelligent code creation product, and there are many other organizations out there trying to solve the problem of AI-assisted code development; this is certainly a field to watch. A non-exhaustive list of similar products includes:
- Visual Studio IntelliCode – Delivers semantic & context aware line completions – suggesting the “best” APIs to use, rather than listing all of the APIs available. Supports C#, XAML, and previewing support for C++, JS, TS, and VB.
- Kite – Provides a “co-pilot” feature to display documentation on screen, as well as surfacing smart snippets and suggestions trained on your own codebase. Supports most major languages.
Ultimately, intelligent code creation is one signal of a broader trend we’ve been seeing: the rise of thinking machines – part of the journey to artificial general intelligence, where systems are assigned a task, work out how to solve it, and then carry it out, all supported and supervised by a human.
Find out more about the trends we’re seeing in Emerging Technology, including the rise of the (service) robots.
Chuck Charbeneau @cacharbe
I would think that there are a few more ethical (and equity) concerns in this context as well. There is already a significant amount of coded bias in our tech implementations globally, and the use of learning algorithms that are taught with datasets containing already-embedded bias will create an abstraction over those biases AND over the understanding of their implications, as well as a codification of those underlying biases into new work.