9/21 Tesla Bot – Talia Feinberg

In this reading, a researcher whose job is to study the unforeseen consequences of emerging technology discusses the Tesla Bot, a 125-pound humanoid robot announced by Elon Musk and designed to eventually take over the repetitive, boring tasks that people despise doing. Musk, best known for the Tesla automobile, plans to apply the AI technology developed for his vehicles so the robot can navigate around obstacles just as his cars do. His goal is to blend technology and biology to “[transcend] our evolutionary heritage through technologies that are beyond-human…”

While it seems great to have a robot take over the boring everyday tasks humans do, in reality it poses many risks to society, including threats to privacy and autonomy, job security, and even equality. The article then raises the question “just because we can, should we?”, something that has been brought up continuously in this course. I agree with the author, as the Tesla Bot seems to pose more risks than benefits. Regarding the article’s point about equality, new and presumably expensive AI technologies such as this one will likely further tilt an already uneven playing field in the classroom, the workforce, and beyond.


One thought on “9/21 Tesla Bot – Talia Feinberg”

  1. I read this article as well, and I find the question Maynard raises to be very important. It could just as easily be applied to other technologies and recent innovations available to consumers. As much as technology can improve our lives, it can also ruin or complicate them in ways we would never imagine. The ‘orphan risks’ associated with the Bot reveal the dangers of these technologies and invite greater transparency from Tesla and other big companies. I think innovators like Elon Musk are far more concerned with personal triumph than with actually helping people.
