Don’t Build Software You Will Regret

New years are for resolutions, right? And then February finds them discarded like last season’s must-have toy. Resolutions most often fail because people make wishes, not plans. With that in mind, this new year, we at InApps — with the help of some at the forefront of modern tech ethics — will not only help you resolve to work more ethically but also offer you the beginnings of a framework for achieving it.

After all, we should be working not just to pay the bills, but to make sure we don’t create software that we will one day regret. And we need to be clearer about what we are building and how.

Buzzword Alert: What Is Tech Ethics, Anyway?

At InApps, we talk a lot about avoiding technical debt, but what about ethical debt? Let’s begin by attempting to define what ethical technical delivery even is. Speaking at Good Tech Conf — a conference focused on technology for social good — Black Pepper Software’s Sam Warner boiled this sprawling philosophy-seminar topic down, saying ethical software:

  • causes no negative social impact
  • doesn’t make the world worse to live in

At Coed Ethics, another conference dedicated to tech ethics that InApps covered earlier this year, Doteveryone’s Sam Brown echoed Warner, saying, “Responsible technology considers the social impact it creates and seeks to understand and minimize its potential unintended consequences.” Doteveryone itself is dedicated to supporting responsible technology as a key business driver for positive and inclusive growth, innovation, and trust in technology.

But should those of us building the future’s code feel obligated to contribute something toward social good? Warner argues we should go even further and contribute to work that benefits the greatest number of people in a significant way.

So, if this is our objective, where do we begin?

There’s no doubt that privacy is a hot ethics topic, and it makes a good tech ethics use case. With increasing data regulations in the European Union and California, plus the constant risk of a very embarrassing public breach, data processing needs to be at the front of our minds. Someone within each company should be appointed to document, and share with the rest of the company, how personal data, like mailing addresses and credit information, flows through the organization. Everyone within an organization is now responsible for the data being processed through it, and everyone must be aware of that data flow.
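The article stops at “document and share,” so here is a minimal sketch of what a machine-readable data-flow register could look like. The schema, field names, and example entry are all invented for illustration, not a format anyone quoted here prescribes:

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    """One entry in a personal-data flow register (hypothetical schema)."""
    data_category: str    # e.g. "mailing address", "credit information"
    source: str           # where the data enters the system
    destinations: list    # internal services and third parties it reaches
    purpose: str          # why the data is processed at all
    retention: str        # how long it is kept, and why
    owner: str            # the person accountable for this flow

# Illustrative entry only; every value here is made up.
register = [
    DataFlow(
        data_category="mailing address",
        source="checkout form",
        destinations=["orders-db", "shipping-partner-api"],
        purpose="order fulfillment",
        retention="24 months after last order",
        owner="data-protection@example.com",
    ),
]

for flow in register:
    print(f"{flow.data_category}: {flow.source} -> {', '.join(flow.destinations)}")
```

Checking a file like this into version control gives everyone in the organization one place to see, and to question, where personal data actually goes.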

While we are only beginning to address flows of information within and across organizations, data handling makes a good starting point for examining your company’s tech ethics more broadly. The questions companies should be asking about data usage apply to the entire software lifecycle and user experience, particularly at the design stage.


Beyond foreseeing any obvious negative social impact, we have to constantly ask ourselves the following questions (one way to bake them into a review process is sketched after the list):

  1. What is our code connecting to?
  2. Is it necessary that they connect?
  3. Is it necessary that particular data flow through it?
  4. How long are we storing that data? Why do we need to store it?
  5. Should we be sharing that data with that entity?
  6. Who does this marginalize? Who is not included in this software?
  7. What is the worst possible use case for this code?
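Software can’t answer these questions, but a team can at least make them hard to skip. As a minimal sketch (the function and structure are invented for illustration, not a named tool), the checklist might live in the repository and block a release review until every question has a recorded answer:

```python
ETHICS_CHECKLIST = [
    "What is our code connecting to?",
    "Is it necessary that they connect?",
    "Is it necessary that particular data flow through it?",
    "How long are we storing that data, and why do we need to store it?",
    "Should we be sharing that data with that entity?",
    "Who does this marginalize? Who is not included in this software?",
    "What is the worst possible use case for this code?",
]

def review_is_complete(answers: dict) -> bool:
    """Pass the review only when every question has a non-empty answer.
    The point is an audit trail of deliberate decisions, not automation."""
    missing = [q for q in ETHICS_CHECKLIST if not answers.get(q, "").strip()]
    for question in missing:
        print(f"Unanswered: {question}")
    return not missing

# An empty review should fail; answers get filled in during design review.
assert not review_is_complete({})
```

The value is not in the code itself but in forcing a written answer, which is exactly the kind of documented ethical consideration that builds customer trust.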

In this interconnected world, it’s not just about trusting our own team, it’s about trusting the ethics of the systems we are connecting to, and it’s about documenting our ethical considerations so we enhance customer trust.

Warner said we need to move past discussions of fuzzy terms like good versus bad because “We all make mistakes. And sometimes we don’t even realize we are making a mistake before it is too late.”

This is especially the case with open source — often considered more honorable than its closed-source counterparts — when you cannot always predict how people are going to use and abuse your code down the line. The first step to any ethical development is to consider your worst-case scenarios.

And then you have to be prepared to learn from your mistakes, often publicly acknowledging them — particularly in customer data breaches — and explaining how you are trying to fix them.

“What is criminal is not learning from your mistakes and trying to move forward with that,” Warner continued. “Facebook is a good use case — when Mark Zuckerberg was making Facebook in his dorm room, he probably wasn’t trying to make this dopamine-inducing tool.”

The remedies may not always be obvious, but we must work toward them.

Warner said awareness is important, but as an industry we also need to understand why we make mistakes in the first place. Medicine has had centuries to learn from its mistakes, but the much less mature software industry has had only about 60 years of study, and only about 20 with people carrying computers in their pockets.

In short, there’s still a lot we have to learn.

Designing Ethically: Should We Even Be Building This?

We now have the opportunity to leverage education and awareness to mitigate against the mistakes we might make. Every major data leak or emissions scandal shouldn’t just be a headline, but a learning opportunity for reflection.

A culture of openness is a good start.

Also at Coed Ethics, Container Solutions’ Andrea Dobson addressed why people often make these bad decisions in the first place:

  • Conformity
  • Obedience
  • They don’t know what else can be done

How can we counteract this? It’s all about building a culture of psychological safety, Dobson says.

“Not so easy to do, but it mainly comes down to educating people who have roles of perceived power and make them know psychological safety is necessary to grow. You can’t learn without making mistakes.”

Echoing the open-communication theme that runs through tech ethics, she continued that this safety comes from “speaking to people one on one and asking them questions.”

International development agency Aptivate practices and teaches consensus decision-making as a way to set aside ego, consider every voice, and share ownership of any decision made. If individual opinions are valued, someone is more likely to object to ethically risky behavior early. Other companies, like BookingGo, Booking’s UK-based rental car arm, address ethics through a volunteer committee that sees about ten percent of the 1,200-person team actively and openly discussing ethics-based decision-making. Google famously created ethical principles for its artificial intelligence after an outcry from four percent of its staff, committing not to continue with Project Maven or any other weaponization of AI.


Tech companies — and companies in general — can start with brown bag discussions about ethics. Next, perform an ethical audit of internal processes and actively signpost your commitment to ethical development, sharing good and bad real-life examples.

The Institute for the Future and the Tech and Society Solutions Lab together created the Ethical OS Toolkit, a collection of PDFs that helps tech teams anticipate the future impact of today’s technology. For anyone delegated organizational ethical considerations — and, again, everyone should be considering them — the toolkit lays out eight zones to help you identify emerging areas of risk and social harm in your software’s future (a sketch of tracking them in a review follows the list):

  1. Truth, disinformation, and propaganda
  2. Addiction and dopamine economy
  3. Economic and asset inequalities
  4. Machine ethics and algorithmic biases
  5. Surveillance state
  6. Data control and monetization
  7. Implicit trust and user understanding
  8. Hateful and criminal actors
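As a hedged sketch of how a team might fold these zones into a recurring design review, the eight zones can be encoded as an enumeration that every review record is checked against. The enum and record shape below are invented for illustration; the toolkit itself is a set of PDFs and worksheets, not code:

```python
from enum import Enum

class RiskZone(Enum):
    """The Ethical OS Toolkit's eight risk zones, encoded for review records."""
    DISINFORMATION = "Truth, disinformation, and propaganda"
    ADDICTION = "Addiction and dopamine economy"
    INEQUALITY = "Economic and asset inequalities"
    ALGORITHMIC_BIAS = "Machine ethics and algorithmic biases"
    SURVEILLANCE = "Surveillance state"
    DATA_MONETIZATION = "Data control and monetization"
    IMPLICIT_TRUST = "Implicit trust and user understanding"
    BAD_ACTORS = "Hateful and criminal actors"

# One illustrative record: for each zone, the worst plausible misuse and a mitigation.
review = {
    RiskZone.ADDICTION: {
        "worst_case": "Engagement metrics end up rewarding compulsive checking.",
        "mitigation": "Cap notification frequency; report time-in-app honestly.",
    },
}

# Flag any zone the review has not yet considered.
for zone in RiskZone:
    if zone not in review:
        print(f"Not yet assessed: {zone.value}")
```

A zone with no entry is not a zone with no risk; it is simply a question the team has not yet asked.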

Sure, you wouldn’t deliberately build anything to facilitate any of this, but could someone use what you’re building to achieve any of the above anti-goals? Warner argues ethical feasibility should be factored into the design stage of agile and Scrum workflows.

Apply consensus-based design processes, repeatedly allowing anyone to offer responses to:

  • Should we be building this?
  • Should we be releasing this? (Just because you’ve invested time and money into building it doesn’t mean you should be launching it onto the world.)

There are also many activities available to help your team assess the ethics of its code, including the Open Data Institute’s data ethics canvas.

Ethical UX Is Human-Centric Design

Sometimes ethical software development is as simple as building user-first, or designing for humanity rather than just your immediate human users. At Good Tech Conf, Fjord interaction designer Hollie Lubbock talked about ethical user experience design. She has created her own set of boundaries to determine what she will and won’t work on, something all freelancers and contractors should do.

Why is UX design so important for this? As Tristan Harris of the Center for Humane Technology put it: “Whoever controls the menu, controls the choices.”

Or as Lubbock put it, “Design enforces design on other people,” citing a tricky Facebook pop-up as an example.

Lubbock says design involves practical ethics.

“Every time we make a design decision, we are actually making a choice with how people interact” with our tools.

She reminds us that tech is “making us as a society lonelier, less satisfied, less productive. Screen time is linked to that loneliness,” as automation doesn’t make us more productive so much as it removes the boundaries of the 9-to-5 workday.

We need to be thinking about human-centered design at scale. Lubbock says these essential questions belong in every design decision:

  • What if what we are building is super popular and everyone’s using it?
  • How will it infiltrate people’s lives?
  • Who will be left out?

The problem is often that performance targets, like clicks and signups, push teams to manipulate the user experience to reach company goals, which can inadvertently steer our code into some of those eight risk zones.

Despite top-down influence, Lubbock argues each team should prioritize the following:

  1. Respecting human rights: decentralized, private, open, interoperable, accessible, secure, and sustainable
  2. Respecting human effort: functional, convenient, and reliable
  3. Respecting the human experience: delighting your users

Ethics for Designers offers a moral value map to help designers prioritize enhancing human potential while acknowledging how human-machine interaction is constantly increasing.

Ethical Design Rule No. 1: Design for Transparency — Lubbock said, “Tech isn’t magic. People can’t see behind the curtain and understand what’s going on, so make sure that people understand how the system works.” This can involve simple design cues, like TransferWise’s no-hidden-fees pricing or how Netflix shows the percentage its suggestions match your viewing habits.

Ethical Design Rule No. 2: Design Mindfully — despite our rapid release cycles, we must assess the impact of what we’re building. We are constantly aiming for frictionless and seamless services, but “by making it as easy as possible, do we do it at the expense of giving them a choice? With gambling tools, perhaps we should get more friction.”

Finally, for all sorts of consulting work, Lubbock says we need to take time to pause and think:

  1. Who is the project for? Do you like that brand? Do you believe in the people and mission?
  2. What is the project? Is it for good?
  3. Can you create a positive impact? (Even if the first two questions are a No, maybe the project is still worth it if it yields a positive change.)
  4. How do you feel?

A lot about ethics is trusting your gut.

Finally, ThinkNation’s Lizzie Hodgson challenges everyone to ask every day: “Is the world better for me being in it?” She reminds us, “We need to change the narrative around young people and hope and technology.”

“We all have silos,” Hodgson continued. “I don’t think we should be hiding behind our technology and our algorithms, telling the next generations contradictory things: ‘Don’t trust tech, but use it.’”

She says “We have a responsibility to those young people to help them understand the world.”

But maybe we need to understand it first? In the end, we must remember that this narrative can be written. And so can our code.

Why the Tech Industry Needs to Take Ownership of Ethics (and Soon!)

Finally, on a writer’s note, it seems important to mention why it has to be the tech industry that puts tech ethics first.

First, anyone who watched the U.S. Congress interrogate the leaders of Facebook and Google recently can have little confidence that those representatives understand technology well enough to regulate it. And after the revocation of net neutrality, many of us would agree that perhaps they shouldn’t have that power anyway.

Of course, there’s also the motivation not to go to jail. The Volkswagen emissions scandal was a good reminder that we are indeed individually responsible for what we code, no matter who told us to write it.

But we must also remember that taking both collective and individual responsibility for what we create is right in line with other cultural transformations underway in the industry. The continued spread of DevOps and agile software development pushes individuals to take greater ownership of what they build — from design to testing to releasing and maintaining — while also staying connected to company-wide business goals. Tech ethics should follow suit.

To compound the urgency of everything above, we are reaching a point in human advancement — from the rapid rise of AI to genetically engineered embryos and talk of dinosaur resurrection — where we really have to stop and think about the future ethical implications of each of our jobs.

So why shouldn’t we resolve to build more ethical software in 2019? It is certainly starting to feel like it is now or never.

Feature image via Pixabay.



