Together we make human and machine behavior more ethical.
Embedding human values into requirements engineering to make machine systems more ethical.
ETHICAL RISK IS (ALSO) DEFINED BY THE BEHOLDER, NOT THE SYSTEM'S ARCHITECT.
Adapted heuristic from The Art of Systems Architecting by Maier & Rechtin, 2009
A global SAS survey of 2,200 business leaders and managers shows broad awareness of the risks inherent in using AI, yet few practitioners have created policies and processes to manage those risks, whether ethical, legal, reputational, or financial. Managing ethical risk is a particular area of opportunity. Companies with more advanced AI practices are establishing processes and policies for ethical software engineering, data governance, and risk management, including ways to explain how algorithms work and deliver results. These leaders note that understanding how AI systems reach their conclusions is both an emerging best practice and an ethical necessity, ensuring that the human intelligence feeding and nurturing AI systems keeps pace with the machines' advancements. (Adapted from MIT SMR Connections/SAS (2020), How AI Changes the Rules: New Imperatives for Intelligent Organizations.)
Engineering Hearts™ engages digital engineers in considering ethics and human values in the design and development of socio-technical machine systems. Human and machine systems must be evaluated critically together: they are symbiotic and cannot be addressed separately in the quest to design with ethical considerations in mind. Global efforts to layer ethical principles on top of design-dev regimens have largely fallen short of changing how engineers design machines and software systems. Embedding human values into software and systems engineering must include the engineers themselves, and, most importantly, engage their Engineering Hearts™.
Engineering Hearts™ is a design research method that helps organizations with three processes:
(1) audit design choices & scope through an ethical & human values lens, using participatory research,
(2) elicit ethical values through human-centered design,
(3) engineer non-functional requirements based on the priorities derived in (1) & (2) above.
Future step: (4) critically evaluate steps (1)–(3) against relevant prior use cases.