Gurcan argues that a model using semantic content analysis can uncover how the topics of Human-Computer Interaction (HCI) have changed since the 1960s. This method differs from previous studies in that it uses natural language processing (NLP) to “efficiently” detect trends across more than 60 years of data, which can then be analyzed to uncover themes and temporal patterns.
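To make the idea concrete, here is a minimal sketch of temporal trend detection: count theme terms within each era of a dated corpus and compare the top terms per period. This is a toy illustration only, not the study's actual pipeline (which applies far more sophisticated semantic analysis to a full publication corpus); the documents below are invented, while the era boundaries are taken from the study's periodization.

```python
from collections import Counter

# Hypothetical (year, abstract snippet) pairs -- invented for illustration.
docs = [
    (1975, "control systems operator interface design"),
    (1985, "intelligent systems expert knowledge interface"),
    (1995, "internet web usability interface"),
    (2005, "online social communication web"),
    (2015, "social media mobile feature recognition"),
]

# Era boundaries taken from the study's three-age periodization.
eras = {
    "Legacy Systems Age": (1959, 1989),
    "Internet Age": (1989, 2009),
    "Pervasive Age": (2009, 2019),
}

def trend_counts(docs, eras):
    """Tally term frequencies within each era to surface shifting themes."""
    counts = {label: Counter() for label in eras}
    for year, text in docs:
        for label, (start, end) in eras.items():
            if start <= year < end:
                counts[label].update(text.split())
    return counts

for label, counter in trend_counts(docs, eras).items():
    print(label, counter.most_common(3))
```

Even on this tiny sample, the per-era counts show "systems" dominating the earliest period and "social" emerging in the latest, mirroring the kind of thematic shift the study reports at scale.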
One conclusion drawn from this project, which I found to be quite important, is that there was a strong shift in 2009, when the most common themes moved from topics like “Control Systems” and “Intelligent Systems” to “Online Social Communication” and “Feature Recognition”. The primary result of this study, however, is that the field's history can be divided into three ages: “Legacy Systems Age (1959–1989), Internet Age (1989–2009), and Pervasive Age (2009–2019).”
I think the idea behind this study is interesting: because NLP is a fairly new topic area, it would have been extremely difficult to reproduce these results in previous decades, which makes the conclusion a unique one. I also think it is important to use the results of this study for prediction, as the authors briefly did, because by predicting the next age of HCI development we can better prepare ourselves for what’s to come. For example, the study predicted a future topic of “Human-Robot Interaction” which, based on current Tesla and Boston Dynamics technology, is spot on (pun intended). Using that data could be helpful even to small companies that need to stay proactive in transitioning to human-robot-oriented systems.