A Crisis of Ethics in Technology Innovation

Posted: March 10, 2020 at 9:41 am

Cambridge Analytica has become a household name, synonymous with invasion of privacy. Its controversial entanglement with Facebook was a wake-up call about how we share information online. Of course, Cambridge Analytica is gone now, and Mark Zuckerberg has survived so far. But the fallout for Facebook feels never-ending: the initial stock drop, the congressional testimony, a record-breaking $5 billion fine from the Federal Trade Commission, a class-action suit approved by a federal judge,1 and another uncomfortable grilling in Congress.

The Facebook scandal is a cautionary tale for executives and consumers alike. But the lesson is much bigger than one about so-called fake news. The hasty reconstruction of value chains around new technologies is introducing and exacerbating ethical concerns across industries. It's a free-for-all race as companies compete to impress users with new capabilities, and what's at stake isn't just which ones survive but whether we are able to sustain a civilized society or end up in a high-tech Wild West.

Facebook ushered in a new era of publishing by building the world's largest content creation and distribution network, amassing billions of users. It invited content makers and advertisers to subsidize those users on a platform that many people feel they can't live without. No longer was the media value chain being orchestrated by a few large organizations; Facebook was opening up markets by enabling anyone with a keyboard and an internet connection to effortlessly plug into the world's largest distribution system. In effect, Facebook broke apart the media value chain and simultaneously re-created it around the company's application programming interfaces (APIs).
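To see just how low the barrier to "plugging in" became, consider what a third-party publisher's integration looks like in code. The sketch below posts a message to a page feed in the style of Facebook's Graph API; the version string, page ID, and token are placeholders, and the endpoint shape is a simplified assumption based on the public documentation rather than a complete integration.

```python
import requests

# Placeholder credentials -- in practice these come from the
# platform's developer onboarding flow.
PAGE_ID = "1234567890"
PAGE_ACCESS_TOKEN = "EAAB-redacted"

def publish_post(message: str) -> dict:
    """Publish a text post to a page feed via a Graph-style REST endpoint."""
    url = f"https://graph.facebook.com/v16.0/{PAGE_ID}/feed"
    response = requests.post(
        url,
        data={"message": message, "access_token": PAGE_ACCESS_TOKEN},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"id": "<page_id>_<post_id>"}

if __name__ == "__main__":
    result = publish_post("Hello from an independent content maker.")
    print("Created post:", result["id"])
```

A few lines of HTTP are enough to reach an audience of billions, which is precisely the point of the value-chain argument that follows.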

But as Facebook helped transform an industry ecosystem, it didn't concern itself with editorial ethics. It sold access to its user base to companies like Cambridge Analytica while maintaining distance from anything posted on its own platform. Content creators could tap into end-user data to precisely target their messaging, whether the information they were putting out was false, misleading, or true. Driven by demand from billions of users, Facebook focused only on ensuring that the content on its network amassed clicks.

In this new world of publishing, where authors, editors, and distributors are separate entities pursuing their own interests, the scandalous consequences may seem predictable. After all, accountability also splinters with the rest of the value chain. But when no one steps up to maintain ethical standards across the system, we all suffer in the end.

Facebook is just one example of the evolving and murky world of self-defining ethics in technology. In this article, we argue that as technological systems rapidly restructure, ethical dilemmas will become more common and that well-understood theories can help us predict when and where problems may arise. Executives across industries find it enticing to democratize access to cumbersome markets like health care, lending, and publishing. But if you're the executive who happens to decouple consumer protection from mortgage lending, all the positive intentions in the world won't protect you from the unavoidable backlash.

Bottom line: Predicting where your industry will stumble within this new world can make the difference in ensuring your business flourishes with its reputation intact.

To be clear, this is not about a few software bugs resulting from a "move fast and break things" mentality. This is about leaders, acting in the best interest of markets and consumers, enabled by the ubiquity of the internet, who unintentionally sidestep the ethical protections that underpin society as we know it. To understand the imminent ethical crisis and why current circumstances are so different, we need to understand how value chains emerge and why even responsible technology companies may overlook their ethical obligations.

In 2001, Clayton Christensen, Michael Raynor, and Matthew Verlinden published a lauded article in Harvard Business Review, "Skate to Where the Money Will Be."2 It explained what they called the Theory of Interdependence and Modularity. The theory holds that when new technologies emerge, they tend to be tightly integrated in their design because dependence among components exists across the entire system. To combat this fragility, one entity must take tight control of the system's overall design to ensure performance.

Consider the early iPhone. Apple controlled the software, hardware, and even the network to give users the best experience. There was one size, one browser, and one carrier. Features were eliminated to support battery life, capacitive touch, and call quality. In Christensen's language, the design's interdependence was critical, as the phone itself struggled with basic performance issues related to its core function of voice communication. Only Apple's unequivocal control made the product reasonably competitive.

Christensen and coauthors argued that, over time, the connections among different parts of complex systems become well understood. Each element's role is defined. Standards are developed. To use Christensen's term, the industry becomes modular, and an array of companies can optimize and commercialize small, specific components with no meaningful impact on overall system performance. Today's iPhone consumers can choose their screen size and phone thickness, the app store is filled with tools and games from millions of different developers, and phones are available on any network. An entire smartphone industry now exists whereby consumers can pick and choose practically everything about their phones, and the software on them, to meet their individual needs.

For any new technology industry, modularization is the end state; it benefits consumers and grows the pie. Since one company no longer needs to take responsibility for the entire system, every company is free to focus on whichever elements it deems strategically advantageous. Christensen, Raynor, and Verlinden counseled companies to anticipate how their markets would become modular and to compete in the places most difficult to master. In the smartphone arena, chipset makers and mobile app companies gobble up all the profit in the system as they tackle the most differentiated parts of it. Playing off a famous hockey tip from Wayne Gretzky in their HBR article title, the authors coached strategists to head to where the money will be, not where it is today.

But modularization is a double-edged sword: The disaggregation of development responsibility also means the diffusion of responsibility for ethical outcomes.

And todays reality is that modularization is accelerating across industries. The internet standardized communication, architecture, and information exchange in every function, allowing new businesses to turn a profit by perfecting ever-more-narrow slices of a value chain.

Consider Lyft. When the company went public in March 2019, its filings recognized the risk that it relied on critical third parties for payments, financing, web infrastructure, background checks, and other significant technology components. It is a massively successful business, but many of its core processes are delivered through the combined services of other vendors. We'd expect similar risks to be identified in the filings of almost every forthcoming IPO.

The rise of companies focused on simple components of complex systems has created a virtual à la carte menu from which would-be disrupters can tailor new, complex products according to customer demands. The result: a virtuous cycle that has caused a whirlwind reconstruction of value chains in every industry.

In our increasingly modular world, companies can quickly tailor products to user demands; innovation and opportunity flourish, but so do the potential risks, not just to a company's bottom line and reputation but also to society at large. Innovation might be able to move with lightning speed, but our user protections do not.

The danger of trusting the pull of user demand to shape an industry is that users' short-term desires don't always account for long-term societal needs. Think of the personal choice of smoking versus its secondhand effects on other people, or the short-term savings of not carrying personal health insurance versus the long-term impact on public health, or the convenience of driving your own car to work versus the societal benefit of public transportation. In many situations, a user makes a choice and society bears the burden of it.

Now let's expand this dilemma to a uniquely modern one. Imagine you're a parent who wants to educate your child about technology, given the increasing need for young people to understand engineering concepts and have some familiarity with design. You purchase a cheap 3D printer and use it to impart lessons around technology, software, and manufacturing processes. You've brought into your home an amazing tool, but also a potentially dangerous one.

For context, 3D printing (or additive manufacturing) is the process whereby a physical object is constructed using a 3D computer model and a standard machine that extrudes material to build the object, often layer by layer. These machines are extremely affordable for small-batch production relative to the manufacturing equipment we've relied on until now. Most 3D printers can't yet create objects at the speed required for commercial scale, but flexibility was designed into their architectures from the beginning. Whereas the injection-mold manufacturing used in the last paradigm required specialized configuration, 3D printers are designed to enable someone to make almost any design a reality.
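To make "layer by layer" concrete, here is a toy slicer that emits simplified G-code (the command language most desktop printers consume), tracing a square perimeter at successive layer heights. This is an illustration under heavy simplifying assumptions: real slicers also compute extrusion amounts (the E values omitted here), temperatures, infill, and supports.

```python
def square_layer_gcode(size_mm, z_mm, feed=1200):
    """G-code moves tracing one square perimeter at height z_mm."""
    x0, y0 = 50.0, 50.0  # arbitrary origin on the print bed
    corners = [(x0, y0), (x0 + size_mm, y0),
               (x0 + size_mm, y0 + size_mm), (x0, y0 + size_mm), (x0, y0)]
    lines = [f"G1 Z{z_mm:.2f} F{feed}"]  # move the nozzle to this layer height
    lines.append(f"G0 X{corners[0][0]:.2f} Y{corners[0][1]:.2f}")  # travel move
    for x, y in corners[1:]:
        # A real slicer would add an E parameter here to meter extrusion.
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")
    return lines

def toy_slice(size_mm, height_mm, layer_mm=0.2):
    """Stack square perimeters from the bed up to height_mm."""
    gcode = ["G28", "G21", "G90"]  # home axes; millimeter units; absolute moves
    z = layer_mm
    while z <= height_mm + 1e-9:
        gcode.extend(square_layer_gcode(size_mm, z))
        z += layer_mm
    return "\n".join(gcode)

if __name__ == "__main__":
    print(toy_slice(size_mm=20, height_mm=1.0))  # a 20 mm square, five layers tall
```

The point of the sketch is that the machine is fully general: it executes whatever moves it is fed, constrained only by the model file behind them.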

Today, 3D-printable items already range from the mundane, like plastic trinkets, to the life-changing, like affordable housing. The first airplane with a 3D-printed part took flight in 2014. And the world's first 3D-printed heart was announced in April 2019. Simply put, 3D printing will democratize the production of anything.

On its face, this is amazing. Imagine completely eliminating the organ-transplant waiting list or not having to run to a hardware store when you need a nail. It's no wonder that hundreds of thousands of households have already invested in 3D printers. The world of home-printing critical goods is imminent.

Unfortunately, putting a modular manufacturing device in every household drives the same type of value-chain disruption that Facebook enabled with its publishing API. Customers are no longer beholden to the large companies that were once responsible for producing and distributing products. Instead, any amateur designer can use inexpensive computer design software to create models for production and then distribute their designs to millions of eager consumers by leveraging the distribution networks of 3D-printer makers. With a simple download, end users can fire those designs off to their 3D printers.

Such modularization in manufacturing allows us to bypass the controls that have existed for generations in supply chains, regulated industries, and intellectual property. Relatively benign examples abound: Your child wants a new action figure, so do you pay for it or just print an illegal replica? Much more serious: What if your driving-age teen puts a faulty home-printed part in your car? Even worse, consider firearms. Gun regulations vary across countries and U.S. states, but they do exist, and many are enforced at the point of sale: What types of arms and ammunition can be sold, and to whom? If anyone can download a model from the internet and print a weapon at home, much of our approach to gun control will be rendered moot.

Of course, most consumers bringing desktop 3D printers into their homes simply wish to take advantage of the flexibility of the new systems, not to forecast every potential use and failure of them. Users pull technology into their lives to scratch an itch: Facebook to entertain themselves and socialize, Lyft to get from point A to point B, 3D printers to educate their kids or get simple tasks done faster. Consumers aren't (and shouldn't be) responsible for thinking about the implications of introducing new systems on the back of modular innovations.

As executives, if we rely on users to guide our ethical responsibilities, we are destined to be at best reactive and, at worst, too late to chart the right course.

Luckily, if you believe that the internet will continue to enable rapid modularization in every industry, there are clear ways to navigate this compelling future.

Around the time news feeds debuted, Anne Wojcicki's 23andMe began offering direct-to-consumer DNA testing: Simply spit in a vial, and 23andMe would analyze more than 600,000 genetic markers to send you information about your health risks and ancestry. Time named it the Best Invention of 2008 for pioneering retail genomics. And it was possible only because of the modularization in intellectual property related to genomics and gains in cloud computing that enabled high-volume storage, search, and processing. Of course, this modularization also created ethical gray areas.
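Before turning to those gray areas, it is worth seeing how accessible this data actually is. Customers can export their raw results as a plain text file (comment lines beginning with "#", then tab-separated columns of rsid, chromosome, position, and genotype), and a few lines of code suffice to read it. In the sketch below, the file name is hypothetical, and rs4988235 (a marker commonly discussed in connection with lactase persistence) appears purely as an illustration.

```python
import csv

def load_genotypes(path):
    """Parse a 23andMe-style raw data export into {rsid: (chrom, pos, genotype)}.

    The export is tab-separated with '#' comment lines, then rows like:
    rs4988235    2    135851076    AG
    """
    genotypes = {}
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            if not row or row[0].startswith("#"):
                continue  # skip the commented header block
            rsid, chrom, pos, genotype = row[:4]
            genotypes[rsid] = (chrom, pos, genotype)
    return genotypes

if __name__ == "__main__":
    data = load_genotypes("genome_export.txt")  # hypothetical file name
    print(f"Loaded {len(data):,} markers")
    if "rs4988235" in data:  # illustrative marker only
        print("rs4988235 genotype:", data["rs4988235"][2])
```

That ease of access is exactly what makes the questions below so pressing: the same file a curious customer skims can be uploaded, shared, and cross-referenced anywhere.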

Beyond empowering individuals with easy access to their health indicators, Wojcicki maintained a vision to accelerate and simplify medical research. The cost and time required to bring new treatments to market could be slashed with access to a sufficiently large, diverse database of consenting participants. It's easy to get caught up in the extraordinary possibilities. It's harder to consider tough questions about things like test validity, unexpected parentage discoveries, and the role of primary care providers in understanding results. It's tougher still to imagine all the new ways that access to this information might upend our existing social systems: What are your obligations to report a genetic marker for a disease to your health insurer? Can health insurers buy access to this information? What access should law enforcement have? What if you choose not to participate but your information can be easily inferred from that of a relative? And who's responsible for considering all of these questions and others?

Ownership and accountability are messy in the age of modularity.

Considering all possible societal implications is a big ask for people merely curious about their ancestry. And consumer genetic testing falls somewhere between the Centers for Medicare and Medicaid Services' regulations of clinical research (consumer DNA testing is not a clinical trial) and the Food and Drug Administration's regulations of drugs, biological products, and medical devices (the FDA now lumps consumer genetic tests in with medical devices).

Wojcicki spoke about this topic for four consecutive years at Stanford's Graduate School of Business. Her take is that, despite its challenges, trust is still crucial to keeping the health care system functioning. Therefore, if individuals couldn't contemplate the wide-ranging effects, and if regulators couldn't keep up with the breadth and pace of change, Wojcicki had to take responsibility to deliver that trust. Borrowing a proven concept from the existing health industry, she engaged an independent institutional review board to serve as ethical adviser on all of 23andMe's activities.

The fact is, 23andMe's data can be used for earth-changing research and, at the same time, have unexpected destructive effects. Skipping the middlemen of primary care providers in ordering genetic tests or of clinical research organizations in collecting data is not a question of morality but of how we as a society maximize the benefits while controlling costs. Pertinent applications of 23andMe's data will be debated, probably for years, before something like public consensus develops.

We've already seen that modularity enables businesses to quickly scale to entire populations after discovering and delivering what users want, and that this speed shortcuts our long-standing approaches to public scrutiny. By seeking out third-party advisers to review the use of its data, Wojcicki has created a countervailing power to represent the societal viewpoint, just as any traditional research institution would maintain.

In redefining the way we access medical information and participate in research, direct-to-consumer genetic testing is another area where modular innovations could fail us without thoughtful action. The FDA, and certainly an individual consumer, cannot possibly consider all the positive and negative implications of merely spitting in a cup. The companies that find enormous value in this act must take on some of the ethical onus, as 23andMe has set out to do.

Christensen et al.'s Theory of Interdependence and Modularity is a powerful explanation of how value chains evolve and of the influence of consumer demand. As value chains split apart, innovators can reassemble them in response to customers' desires, in ways that take advantage of new technological options. Executives who embrace these changes should also seek to conscientiously address the often less-than-obvious ethical issues that arise. We suggest three courses of action:

1. Assume you become the standard-bearer. Most innovators are comfortable playing on the margin. As disrupters who embrace modularity come up from below, it's easy for them to point to traditional businesses and defer to their ability to fulfill complex needs in the market. But success as a disrupter should come with a sense of obligation to change the paradigm, particularly when the upstart turns into the dominant platform. So instead of focusing only on the outcomes of your initial attack, work backward. Assume you become dominant. Then ask what is most likely to break, what can be done to prevent breaks, and how to handle them when they occur.

2. Document the safeguards that would have prevented such failure in the current system. Borrow a page from lean process improvement and start by mapping the complete value chain for the service you're providing as it existed before your company arrived. Next, chart out the future state in which you're dominant. Chances are, you've created an efficiency by removing or reducing the scope of some step. Learn the history of how this step evolved, and consider the safeguards ingrained within it: Are they regulatory? Are they related to standards? Are they social constructs? Consider the protections inherent in restricted access: What education or training did those with access have? If it helps, imagine how a horde of naive teenagers might misuse or misunderstand your service. Definitely contemplate how it may be used by malicious actors. Safeguards have protected consumers as well as the market. Know them, and plan for how they will be addressed in the future state (see the sketch after this list).

3. Identify who is responsible for delivering these capabilities. In some cases, it will be crystal clear: Ride-share services could not survive without trust in drivers, so Lyft and Uber must ensure background checks are done, even if they don't conduct them directly. In other cases, it won't be obvious: Are 3D printers just a platform facilitating exchange between model designers and consumers? Leaders need to anticipate that they'll be held accountable for the failures of the changes they usher in.
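One lightweight way to put these three actions into practice is to write the value-chain map down as data, so that every step removed in the future state and the safeguards orphaned with it become explicit and assignable. The sketch below is a generic illustration rather than a prescribed methodology; the mortgage-lending entries are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Step:
    """One link in a value chain and the protections embedded in it."""
    name: str
    safeguards: list = field(default_factory=list)
    present_in_future_state: bool = True
    # Who covers the safeguards if this step is removed or reduced.
    safeguard_owner: Optional[str] = None

def orphaned_safeguards(chain):
    """List safeguards whose step disappears with no named new owner."""
    return [
        (step.name, guard)
        for step in chain
        if not step.present_in_future_state and step.safeguard_owner is None
        for guard in step.safeguards
    ]

# Hypothetical mortgage-lending example: the broker step disappears in
# the disrupted future state, taking its embedded safeguards with it.
chain = [
    Step("Loan application intake", ["identity verification"]),
    Step("Human mortgage broker",
         ["affordability counseling", "suitability judgment"],
         present_in_future_state=False),
    Step("Underwriting", ["credit and income checks"]),
]

for step_name, guard in orphaned_safeguards(chain):
    print(f"UNOWNED SAFEGUARD: '{guard}' (was embedded in '{step_name}')")
```

Any safeguard the script flags as unowned is one your future state silently drops; assigning each an owner is step 3 made concrete.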

To put these recommendations into practice, it's important to assume success, understand the gaps, and take responsibility for the future that will be created. The particulars of implementation will vary by industry and company, of course. But we believe strongly that these three actions are key to recognizing where ethical uncertainties may arise from modularity and how to responsibly navigate that change.

The model for the Liberator, a 3D-printable plastic gun, was downloaded more than 100,000 times before a federal judge blocked the posting of 3D gun blueprints online.3 Lucky for us all, not every household has a 3D printer; the printed parts have to be meticulously assembled; and, even when built correctly, the gun produced is unreliable (it's just as likely to misfire on its owner as on the intended target). In time, these complications will be worked out. But that also means there's time for regulators to plan for the obvious threat.

In other arenas, we should be more concerned. Industries such as lending, media, employment, and health care as we know them have evolved over the course of decades; their protections were sometimes hard-won and sometimes inherent in the very nature of the previous operators or target audiences. Faster than ever, disrupters and large corporations alike are reforming these value chains to take advantage of blazing-fast transfer of information, the application of artificial intelligence, and the creation of marketplaces and networks that distribute low-margin work. It's optimistic and reckless to assume that the existing protections will automatically port over to the newly modular systems.

Strict compliance with the law, while crucial, is also insufficient to avoid the ethical pitfalls. In a piece for CNN Business, the former COO of Cambridge Analytica, Julian Wheatland, reflected on the scandal: "Cambridge Analytica made many mistakes on the path to notoriety. But its biggest mistake was believing that complying with government regulations was enough and thereby ignoring broader questions of data ethics and public perception."4

Lesson: The only rational solution is to embrace new ethical paradigms in a thoughtful way. Every executive should imagine the future that is bound to arrive and consider both the path toward consumer delight and the systemic protections that will be required.

Max Wessel (@maxwellelliot) is chief innovation officer at SAP, responsible for technology research and product incubation efforts. Nicole Helmer (@nikkihelmer) is a decision scientist at SAP, working at the intersection of customer experience, emerging technologies, and new product development.

1. J. Stempel, "Judge Lets Facebook Privacy Class Action Proceed, Calls Company's Views 'So Wrong,'" Reuters, Sept. 9, 2019, http://www.reuters.com.

2. Building on a theory popularized by Kim B. Clark: C.M. Christensen, M.E. Raynor, and M. Verlinden, "Skate to Where the Money Will Be," Harvard Business Review 79, no. 10 (November 2001): 72-83.

3. C. Domonoske, "Federal Judge Extends Order Blocking 3D Gun Blueprints From Internet," NPR, Aug. 27, 2018, http://www.npr.org.

4. J. Wheatland, "I Was a Top Executive at Cambridge Analytica. It Taught Me a Tough Lesson About Public Trust," Perspectives, CNN Business, Aug. 19, 2019, http://www.cnn.com.
