How can a university curriculum stay current when international affairs develop by the day? This semester, criminal justice professor Lew Nescott stayed ahead of the curve with students enrolled in his seminar-style course on the role of Artificial Intelligence (AI) in some of today's largest international conflicts.
Nescott served as a mentor for two teams of students who spent the semester taking deep dives into current applications of AI in those conflicts.
The first team focused on AI in the Russia-Ukraine war and was made up of three seniors: Arsenio Garcia, Ethan Reeder and Anthony Schenk.
The second team was tasked with researching AI’s role in the Israel-Hamas war and was made up of three graduate students: Oluwatobi Akintola, Stefan Dimitrijevic Meyer and Nkeiruka Origa.
The two teams presented their findings in the Bergami Center on Dec. 6. Nescott prefaced the presentations by noting that all of the information was “unclassified and from open sources,” and that with something as fast-moving as AI in international affairs, the picture can change daily.
The Russia-Ukraine team examined a range of applications of AI in the war, from drone development, to software able to detect enemies and identify camouflage, to large-scale information processing.
The team said that “AI has gained an edge in its geospatial intelligence” and that Ukraine has made notable strides in its access to and use of AI technology. Facial recognition software is also used in Ukraine to identify dead Russian soldiers so that their families can be notified, a practice that has been said to contribute to Ukraine’s public image.
The same software is also used for geolocation, not only to identify enemies but to determine which weapons will be most effective at a specific time and place. This is done in real time, allowing Ukrainian forces to make informed decisions on the fly.
Ukraine also uses AI for language translation, a tool that Nescott’s students said helps bridge communication across the many languages currently spoken in the country.
The team of seniors also broke down some of Russia’s use of AI, despite the country being “relatively behind” its peers. Russia fields drones that use neural networks to process data in a manner loosely modeled on human reasoning, but at a much faster rate, mirroring the technology used by its adversaries in Ukraine.
Beyond these battlefield applications, students from both teams broke down how Russia and Israel use AI in comparable ways to drive misinformation campaigns, primarily through the generation of deepfakes.
Deepfakes are AI-generated images intended to mislead their viewers. Akintola explained their power, displaying images of children in distress that he said “appear very real at first glance,” but are used to evoke feelings of sympathy and anger without depicting the faces of any actual people.
These campaigns are amplified by fake accounts created in bulk and bots programmed to spread misinformation in these countries’ favor.
Israel’s use of AI was said to include systems such as the Iron Dome and the Habsora (which translates to “Gospel” in English), among a number of others covered by the graduate team. The Iron Dome is used by Israel to shoot down incoming projectiles, relying on a ground-based command and control system. The Gospel, one of the country’s most significant tools in this war, is used to plan precise strikes on Hamas infrastructure and to generate targeting recommendations for those working with it.
The students’ presentations included diagrams of these tools’ structures, along with thorough explanations of how they function.
Origa worked through a number of other systems used by Israel, breaking down the power of AI and highlighting its impact on the battlefield. She said that “AI has the power to revolutionize the defense industry,” but that we need to be mindful of its impacts and “ensure that it is used mindfully and ethically.”
Following his students’ presentations, Nescott reflected on the course, its goals and how the findings apply to students’ broader studies. He said he would be interested in teaching further courses focused heavily on AI, since almost any field of study will soon include the technology as a “critical component piece.”
He also addressed the university community, clarifying what those within the institution can take away from his students’ findings. Nescott said, “Similar to the nuclear age, AI will disrupt the way we live, and the way in which we will wage war and peace. But unlike the nuclear age, AI is not limited to a small club of nation states that have exclusive access to material, money, and expertise. Geopolitically, you may not be able to control the pace of development, but you might be able to manage it.”