How to get started with digital ethics by design

  • Posted on July 8, 2019
  • Estimated reading time 3 minutes

This article was originally published in MoneyInc.

Just a few weeks ago, Facebook, Google and Microsoft each shared their vision of the world at their developer conferences. Each CEO’s keynote included a focus on privacy (and I’ll expand that to include digital ethics). Here are a few snippets:

  • Mark Zuckerberg, CEO Facebook: “Over time, I believe that a private social platform will be even more important to our lives than our digital town squares. So today, we’re going to start talking about what this could look like as a product, what it means to have your social experience be more intimate, and how we need to change the way we run this company in order to build this.”
  • Sundar Pichai, CEO Google: “We feel so privileged to be developing products for billions of users, and with that scale comes a deep sense of responsibility to create things that improve people’s lives. By focusing on these fundamental attributes, we can empower individuals and benefit society as a whole.”
  • Satya Nadella, CEO Microsoft: “We have to work collectively to instill trust in technology across everything we do. These aren’t just words we use or claims we make, we must incorporate trust into the design process of every product, always prioritizing privacy, cybersecurity and responsible AI.”

While each of these technology companies has a different view of how their products shape our world, there is a tremendous amount of talk about doing what’s right with technology. That is easier said than done, because digital ethics is an emerging field that most organizations aren’t yet prepared to address in the way they design and build products.

Avanade research finds that 81% of executives don’t think their organizations are adequately prepared to address ethical issues related to AI and other emerging technologies. Just as companies in the early 2000s struggled to stay ahead of security breaches, digital ethics will follow the path security took in the enterprise, but at a faster rate. Security now sits on every boardroom agenda, and digital ethics will get there too. This report looks at what companies are doing about digital ethics and how others can start taking action.

As companies gain access to more intimate personal data, including employee productivity metrics and data from wearables, business leaders will need to make increasingly difficult decisions that involve ethical considerations. Many leaders have started with a traditional approach, categorizing digital ethics under risk and compliance. This is a good start, but it limits the true organizational change required to tackle the issue effectively.

Digital ethics impacts product development, marketing, brand and reputation management, and corporate citizenship. Designers and engineers need to design products according to company ethics frameworks and playbooks that explicitly define ethical values and principles. Marketing needs to think about how digital products are described, positioned and promoted. And companies need to think beyond the intended use of digital products to unintended uses with negative consequences. If organizations buy and implement third-party technology, they need to think through how that technology reflects their values, as well as the values of their customers, employees and communities.

Steering committees, advisory councils and centers of excellence are great, but digital ethics needs to be the responsibility of everyone in a company. Exactly how digital ethics plays out as a specific employee’s responsibility depends, of course, on that employee’s role. It’s still early, meaning frameworks and tools aren’t yet readily available, but they are maturing. Microsoft Research recently released a machine learning tool that helps interpret the black-box nature of algorithms, which in turn allows companies to better understand the ethical implications of those algorithms.
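The article doesn’t name the tool, but assuming it refers to Microsoft’s open-source InterpretML package (my assumption), here is a minimal sketch of how a team might inspect what drives a model’s predictions; the data file and column names are hypothetical:

```python
# A minimal sketch, assuming the unnamed tool is Microsoft's open-source
# InterpretML package (pip install interpret). The data file and column
# names below are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from interpret.glassbox import ExplainableBoostingClassifier
from interpret import show

# Hypothetical loan-approval dataset with a binary "approved" label
df = pd.read_csv("loan_applications.csv")
X, y = df.drop(columns=["approved"]), df["approved"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train an Explainable Boosting Machine: a glass-box model whose
# per-feature contributions can be inspected directly
ebm = ExplainableBoostingClassifier()
ebm.fit(X_train, y_train)

# Global view: which features drive predictions overall
show(ebm.explain_global())

# Local view: why individual applications were scored the way they were
show(ebm.explain_local(X_test[:5], y_test[:5]))
```

The point is less the specific library than the practice: any interpretability tool that surfaces per-feature contributions gives product and ethics teams something concrete to review.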

Here are some cross-industry questions that can help organizations develop digital ethics frameworks and implementations. They are a good starting point before getting into industry- and market-specific questions.

  • Organizational values and governance: Do your organization’s values address ethical behaviors? Does your organization’s ethics agenda cover technologies and innovations?
  • Data sources and use: Do you know where all your data comes from and how you’ll use what you collect? Have you studied and understood possible sources of bias in your data and insights? (A minimal bias-check sketch follows this list.)
  • Product innovation and ethics-by-design: Are your product teams made up of diverse backgrounds and opinions that will help identify bias and ethical issues at the earliest stage of design? Do you have a data ethics impact-assessment capability to know how to respond to ethics issues? Can you shut down software if an ethics lapse is exposed?
  • Employee stakeholders and internal enablement: Do employees have access to routine training on digital and algorithm-based technologies? Do they know how to engage with and manage those technologies ethically?
  • Customer stakeholders and external transparency: How frequently are digital ethics best practices and customer expectations assessed? In the event of unintended consequences, how are customers and employees alerted?
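To make the bias question above concrete, here is a minimal first-pass sketch (my own illustration, not from the article or the Avanade report) that compares favorable-outcome rates across groups; the column names and the four-fifths (80%) threshold are assumptions:

```python
# A minimal first-pass bias check. "gender" and "approved" are hypothetical
# column names; the 0.8 threshold is the common four-fifths heuristic.
import pandas as pd

df = pd.read_csv("loan_applications.csv")

# Favorable-outcome (approval) rate for each group
rates = df.groupby("gender")["approved"].mean()
print(rates)

# Disparate-impact ratio: lowest group rate divided by highest group rate
ratio = rates.min() / rates.max()
print(f"Disparate impact ratio: {ratio:.2f}")

if ratio < 0.8:
    print("Selection rates differ by more than the 80% heuristic;")
    print("investigate data sources and features before deploying.")
```

A check like this is only a starting point: understanding why rates differ still requires reviewing data sources, proxy variables and the decision process itself.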

Doing ethics by design needs to fit into the organization’s culture. It’s about finding and adopting a framework, and eventually leveraging emerging tools to help. Implementing digital ethics is a change-management process; treat it like one, with multiple forms of training, incentives and reinforcement of behavioral change. Because there are no playbooks yet, join a consortium and start taking action to bring best practices back to your organization.

Get your free copy of Avanade’s latest “Trendlines: Digital Ethics” research report.
