- University News Archive - UA Little Rock
/news-archive/tag/zachary-stine/

UA Little Rock research shows that bots may have less influence on people than previously thought
/news-archive/2018/10/22/cosmos-bot-influence-research/
Mon, 22 Oct 2018 15:31:25 +0000

New research at the University of Arkansas at Little Rock digs into assumptions about the influence of bots on people's opinions. It is often assumed that disinformation campaigns carried out on social media by bots (computer programs that automate social media messaging and content engagement) are highly effective at changing people's opinions through repetition tactics. However, Zachary Stine, a UA Little Rock doctoral student in computer and information science, tested this assumption with an experiment to determine how easily a group of artificial agents could be influenced under three computational models. Artificial agents are simulated objects in a computer program that represent simplified versions of real-world things, in this case people. Stine is also a researcher at COSMOS (Collaboratorium for Social Media and Behavioral Studies), a research group led by Dr. Nitin Agarwal, Jerry L. Maulden-Entergy Endowed Chair and Distinguished Professor of Information Science. Agarwal is a co-author of the study. The researchers examined a strategy called amplification, commonly employed by bots on social media, treating it as a simple agent strategy situated within three models of opinion dynamics, each using a different mechanism of social influence. Although many studies have been published showing how bots propagate misinformation on social media, very few show how the bots affect a population's opinions.
Three broad classes of social influence models were used in this study: assimilative influence, similarity-biased influence, and repulsive influence. Each mechanism is a set of rules that govern how the artificial agents change their opinions. In assimilative influence, the artificial agents always compromise; they change their opinions to be more similar to each other. In similarity-biased influence, artificial agents will only compromise if their opinions are already similar enough to the other agents' opinions. In repulsive influence, artificial agents likewise compromise if their opinions are already similar enough; however, if their opinions are dissimilar, the agents will not compromise and will instead change their opinions to become even more different. A total of 91 unique sets of conditions were tested for this study. For each of these, 500 simulation runs were performed and analyzed, totaling 45,500 simulation runs. "It is often assumed that when bots on social media amplify some opinion, that inevitably more people will adopt the opinion being amplified," Stine said. "Our findings suggest that this assumption only holds under very specific and rigid assumptions about how people influence each other." The researchers employed agent-based models (a class of computational models that simulate the actions and interactions of autonomous agents to measure their impact on the system) of opinion dynamics, which provide a useful environment for understanding and assessing social influence strategies. This approach allowed the researchers to build theory about the efficacy of various influence strategies and to highlight potential gaps in the existing models. Stine and Agarwal observed that, in models where artificial agents are inherently polarizing, it is very difficult to sway the majority of the population's opinions. Only under more complex strategies did the researchers find that the agents could be influenced.
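The three influence mechanisms described above can be sketched as simple pairwise update rules. The code below is an illustrative toy model, not the authors' implementation; the opinion scale of [-1, 1], the convergence rate `mu`, and the similarity threshold `epsilon` are assumptions chosen only for this example.

```python
# Toy sketch of the three influence mechanisms (not the study's actual code).
# Opinions are real numbers in [-1, 1]; `mu` is a convergence rate and
# `epsilon` a similarity threshold -- both names are illustrative.

def clamp(x):
    """Keep opinions on the [-1, 1] scale."""
    return max(-1.0, min(1.0, x))

def assimilative(x, y, mu=0.3):
    """Agents always compromise: x moves toward y's opinion."""
    return clamp(x + mu * (y - x))

def similarity_biased(x, y, mu=0.3, epsilon=0.5):
    """Compromise only if the two opinions are already close enough."""
    if abs(x - y) <= epsilon:
        return clamp(x + mu * (y - x))
    return x  # too dissimilar: no influence at all

def repulsive(x, y, mu=0.3, epsilon=0.5):
    """Compromise when similar, but move away when dissimilar."""
    if abs(x - y) <= epsilon:
        return clamp(x + mu * (y - x))
    return clamp(x - mu * (y - x))  # pushed toward greater disagreement

# One interaction between a strongly dissimilar pair under each rule:
x, y = -0.8, 0.8
print(assimilative(x, y))       # moves toward y
print(similarity_biased(x, y))  # unchanged: gap exceeds epsilon
print(repulsive(x, y))          # moves further away (clamped at -1.0)
```

Under the similarity-biased and repulsive rules, a bot that simply broadcasts a distant opinion either has no effect or backfires, which is consistent with the finding that amplification by itself rarely sways a polarizing population.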
In other words, agents with strong inherent opinions are much less likely to be influenced. "Instead of simply repeating the same opinion over and over again, the complex strategies work by amplifying an initial opinion and then gradually shifting that opinion until it reaches the target opinion that the bot actually wants the population to adopt," Stine said. "While the findings presented in this paper are theoretical, they illustrate how small changes in our assumptions about how people influence each other's opinions can dramatically affect the success or failure of a campaign that tries to manipulate a population's opinions." In conclusion, the researchers theorize that it would be extremely challenging for bots to influence a real audience through repetition tactics alone. "Examining social influence strategies of bots from a theoretical perspective of agent-based models is not just timely and relevant, but also foundational in advancing our understanding of sociotechnical behaviors, their evolution, and effects on society and democratic values," Agarwal said. Stine presented the findings at the International Conference on Complex Systems (ICCS 2018), held July 22-27 in Cambridge, Massachusetts. This research is funded in part by the U.S. National Science Foundation, U.S. Office of Naval Research, U.S. Air Force Research Lab, U.S. Army Research Office, U.S. Defense Advanced Research Projects Agency, and the Jerry L. Maulden/Entergy Endowment at the University of Arkansas at Little Rock. In the upper right photo, Zachary Stine's research involves examining the influence of bots on users of social media. Photo by Ben Krain.

UA Little Rock student researching how opinions are formed, manipulated
/news-archive/2017/11/01/zachary-stine-conference/
Wed, 01 Nov 2017 14:57:57 +0000
A University of Arkansas at Little Rock doctoral student has presented research on how opinions can be formed, influenced, and changed at a conference held Oct. 19-22 in Santa Fe, New Mexico. Zachary Stine, who is pursuing a Ph.D. in computer and information sciences, presented his research study, "An Agent-Based Approach to Studying the Relationship between Ingroup Bias, Signaling, and Ideological Change." Computational social science is a scientific discipline in which computational methods, data analysis, and simulation models of social dynamics are employed to offer new insights into social phenomena beyond what is available through traditional social science methods. In the first part of the research, Stine and his advisor, Dr. Nitin Agarwal, UA Little Rock's Jerry L. Maulden-Entergy Endowed Chair and professor of information science, set up a simulation demonstrating how the opinions of an artificial population are driven by intergroup dynamics. In the simulation, the subjects modify their opinions based on whether the person who gives the opinion is part of the ingroup (us) or the outgroup (them). Stine is working to understand how cognitive biases affect the difficulty of changing a person's opinion. The next phase of the research involves developing strategies to manipulate the opinions of the artificial population in the simulation by exploiting their group dynamics, or us-vs.-them mentality. "Now, these experiments are taking place within an artificial society that is much less complex than actual human societies," Stine said. "However, there is still big potential for this simulation to have applications in the real world.
By observing the opinion dynamics in this worst-case-scenario population, we can draw conclusions about the opinion dynamics in our own societies and identify how they might be manipulated." At UA Little Rock, Stine works as a graduate research assistant in the Information Science department and previously worked as a research associate in the Office of Institutional Research. After he graduates in 2021, Stine plans to work as a professor researching information theory, network science, and agent-based modeling and simulation. He is thankful to his mentors for helping him pursue his education and research goals. "I have several mentors here at UA Little Rock and elsewhere, including my advisor, Dr. Nitin Agarwal," Stine said. "Though they have each helped me in unique ways, they have all pushed me to challenge my assumptions and preconceptions about some subject, which has been invaluable as I do research where assumptions can be dangerous."
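The ingroup/outgroup dynamic Stine describes can be illustrated with a minimal update rule in which a listener weighs a speaker's opinion according to shared group membership. The function name and the weights below are hypothetical, chosen only to show the bias at work; they are not values from the study.

```python
# Hypothetical sketch of ingroup-biased opinion updating: a listener
# weighs a speaker's opinion more heavily when the speaker shares the
# listener's group label. Weights are illustrative, not from the study.

def biased_update(opinion, speaker_opinion, same_group,
                  w_ingroup=0.4, w_outgroup=0.05):
    """Shift `opinion` toward `speaker_opinion`, weighted by group membership."""
    weight = w_ingroup if same_group else w_outgroup
    return opinion + weight * (speaker_opinion - opinion)

# The same message moves an ingroup listener much more than an outgroup one:
print(biased_update(0.0, 1.0, same_group=True))   # prints 0.4
print(biased_update(0.0, 1.0, same_group=False))  # prints 0.05
```

In a sketch like this, the us-vs.-them asymmetry alone is enough to make a population's opinions track its group structure, which is the intuition behind the manipulation strategies the next phase of the research explores.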