In Standardizing Ethical Design for Artificial Intelligence and Autonomous Systems, Bryson discusses the evolution and future of AI in our society.
First, she defines AI and the characteristics of intelligence, which include the capacity to perceive contexts for action, the capacity to act, and the capacity to associate contexts with actions. However, challenges and concerns arise as AI progresses and becomes established in our society. The fears associated with AI include its ability to outcompete humans, its ability to undermine societal stability, and its potential to harm privacy, personal liberty, and autonomy. As a result, professionals are working to prevent these issues from becoming real-life problems by establishing standards for ethics in AI. If these issues aren't moderated with standards, then we are left vulnerable to our own innovations. Bryson argues that transparency and safety are necessary for the existence of AI. Specifically, transparency provides users with insight into why a robot acts as it does in different situations, and safety provides engineers and scientists with insight into accidents, issues, and technical details.
I agree with Bryson’s argument and her beliefs about the future of AI. Throughout this class, we’ve discussed our own issues and concerns with the implementation of AI in our lives, as well as popular concerns raised by society, so I was already familiar with the topic. However, I had never heard of the IEEE (Institute of Electrical and Electronics Engineers) or its Initiative for Ethical Considerations in Artificial Intelligence Systems. I’m curious to see how engineers and scientists tackle these concerns and the future of AI in our world.
I think that everyone will have a hard time agreeing on how much AI they are comfortable with as it continues to develop. Personally, I do not want that much technology in my household or in my life in the future. Although AI does have its benefits, I think it can be extremely detrimental to people’s mental health. In addition, there are also many privacy concerns, as you stated in your discussion.