
Quick Notes on EthicalOS by Institute For The Future
I was looking into the work of Jane McGonigal, or at least work she was a part of, and came across EthicalOS. Having “operating system” in its name instantly drew me in; you can find more about it on its web page here.
While taking a Humanitarian and Free Open Source Software class at RIT, my research led me to dig deep into what this framework is all about. Everything below this paragraph is a set of notes from researching and trying to understand it. There are also these slides I found, which offer scenarios that paint a picture of what the framework is actually trying to explain and what you should be looking out for.
Web Source: https://ethicalos.org/wp-content/uploads/2018/08/Ethical-OS-Toolkit.pdf
Background
- Created by the Institute for the Future. Research led by Jane McGonigal (I’m a big fan of her work and books!).
- Not an actual operating system. It aims to create a path to analyze ideas and predict the future impacts of these ideas. Don’t get blindsided by a future you created.
- Three important questions:
- Could the tech being built now be used in unexpected ways, now or in the future?
- What categories of risk should you pay special attention to now?
- And which design, team, or business model choices can actively safeguard users, communities, society, and your company from future risk?
Ethical OS
- The aim is to help makers, creators, and others foresee problems before they occur.
- Better product development. Faster deployment. Impactful innovation.
- Prevent possible downsides of your tech.
Risk Zones
- Truth, Disinformation, Propaganda
- Addiction and Dopamine Economy
- Economic and Asset Inequalities
- Machine Ethics and Algorithmic Biases
- Surveillance State
- Data Control and Monetization
- Implicit Trust and User Understanding
- Hateful, Criminal Actors
How does EthicalOS work? (The Tools)
Tool 1: Problems you may see down the road are things to think about now. Tomorrow’s risks and problems are always important; don’t push them away. Train yourself to envision far-off risk.
Signals: Current specific examples that might influence or shape the future of the project.
DON’T GET CAUGHT UP IN WHETHER A SCENARIO IS LIKELY OR EVEN POSSIBLE. PICK ONE AND GO WITH IT.
Watch out for:
- Greatest worry in the scenario
- How different users get affected differently in the future
- Actions to safeguard users and their communities
- Actions now to get ready for the risky future
Tool 2: Different and new technologies always have their own risks. Check what should concern you.
Be aware of the 8 risk zones.
What to do:
- Choose: The technology, product, or feature you are currently working on.
- Read: Check 8 risk zones.
- Check out the signals: Real examples of risks identified and mitigated.
- Identify: Checklist questions relevant to the tech chosen.
- Think: Plan how to prevent the problems/risks identified.
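The Tool 2 workflow above (choose a feature, walk the risk zones, identify the relevant questions) can be sketched as a simple checklist runner. This is my own illustration, not part of the toolkit: the eight zone names come from the list above, but the sample questions are paraphrased stand-ins for the toolkit’s actual checklist questions.

```python
# Sketch of Tool 2 as a checklist runner. The eight risk zones are from
# the EthicalOS toolkit; the questions below are my paraphrased
# illustrations, not the toolkit's exact wording.

RISK_ZONES = {
    "Truth, Disinformation, Propaganda":
        "Could this be used to spread false or misleading content?",
    "Addiction and Dopamine Economy":
        "Does the engagement model reward compulsive use?",
    "Economic and Asset Inequalities":
        "Who is priced out of, or disadvantaged by, this feature?",
    "Machine Ethics and Algorithmic Biases":
        "Could the underlying data or models encode bias?",
    "Surveillance State":
        "Could the data collected enable surveillance of users?",
    "Data Control and Monetization":
        "Do users understand and control how their data is monetized?",
    "Implicit Trust and User Understanding":
        "Are the defaults and terms transparent to a typical user?",
    "Hateful, Criminal Actors":
        "How could bad actors abuse this feature to harm others?",
}

def review(feature: str) -> list[str]:
    """Walk every risk zone and collect the questions to discuss for a feature."""
    return [f"{feature}: [{zone}] {question}"
            for zone, question in RISK_ZONES.items()]

for item in review("public comment feed"):
    print(item)
```

Running this just prints one discussion prompt per risk zone for the chosen feature; the real work of the toolkit happens in the team discussion each question triggers.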
Tool 3: Strategies to future-proof against the problems you foresee.
What to do:
- Pick: A strategy.
- Consider: What could it accomplish?
- Discover: What could go wrong?
Make a list of all of these strategies and the steps that follow, and note how they relate. Which potential failure modes matter most and keep coming up across multiple strategies?
Strategies:
- Tech Ethics: What if degree programs had a course or training focused on ethics, specifically technology-focused ethics for engineers and designers?
- Working under oath with data: Not a simple contract, but something stronger, where violating it carries more severe consequences for you.
- Bug Bounties, no! Ethical Bounty Hunters, yes: Start paying people to find and identify boundary-breaking tech where the ethics are in question, taking social, individual, and community impacts into account.
- Red Flags: Companies already have rules and metrics they follow when using third-party open-source packages; maybe those could focus on social impact as well. Same idea, different application.
- Pulse of Platforms: How healthy is the platform? Health measures shouldn’t be limited to business status. How do you monitor these potential societal metrics of the health of platform users and other stakeholders?
- License to Design: A more formal way of becoming a technology designer/developer. Don’t move fast and break things just to see the outcomes; limit the speed, and focus on more ethical and responsible work.
Takeaways
When starting up a new project, community, or organization, EthicalOS sounds like an amazing starting point for planning through the possible hurdles instead of “just dealing with them later.” The real question is: if you already have an ongoing process and plan, how do you adapt to the guide that EthicalOS offers?
Societal impacts can be quite severe when you don’t plan before building and anticipate what is coming in the future. Look at Instagram, where they knew what was happening and decided to ignore it. Quite crazy. The presentation I used for this post points out something that could be amazing if implemented: Ethical Bounty Hunters who independently work on finding these possible outcomes could be a major prevention method, or at least a starting point. This is something everyone has to go over, and maybe that is where change can start to minimize negative societal impact.
Thanks for reading this post!
If you’d like to contact me about anything, send feedback, or want to chat feel free to:
Send an email: andy@serra.us