'Big Data', drones & intelligence seminar

February 21st, 2020, 11:00 AM – 1:00 PM in Room 8.61 NB, with lunch to follow

Please RSVP to Maggie Smith if you plan to attend this seminar - msmith@jjay.cuny.edu or mlsjjc@gmail.com.

Ross Bellaby returns to John Jay on Friday, February 21st, to participate in a seminar following his Thursday night lecture. The specific focus of his presentation will be the capacity, implications and potential limits of surveillance using drone technology. In his published work, Ross Bellaby has used the principles of the just war tradition - just cause, legitimate authority, last resort, proportionality and discrimination - to balance the need to protect one community against the potential harms to those in other communities.

Abstract: Ethically Autonomous Drones

The ability of machines to make truly independent and autonomous decisions is a goal for many, not least military leaders who wish to take the human out of the loop as much as possible, claiming that autonomous military weaponry – most notably drones – can make decisions more quickly and with greater accuracy. However, not only is there much disagreement over which ethical framework should form the basis of these autonomous weapons, but there is no clear understanding of the implications of true autonomy for a weapon of war, or of who would ultimately be morally responsible for its actions. This paper will therefore argue that if any ethical framework were to be developed for autonomous drones, the just war tradition should act as its basis. It will also argue, however, that while it is theoretically possible to have autonomous drones, the ethical implications of true autonomy would mean the weapon could and would never fire a shot, and that any mistakes made would fall upon the shoulders of the most senior officers, meaning that they would never authorise their use.

Going Dark - abstract from "Going dark: Anonymising technology in cyberspace"

Anonymising technologies are cyber-tools that protect people from online surveillance, hiding who they are, what information they have stored and what websites they are looking at. Whether it is anonymising online activity through 'TOR' and its onion routing, 256-bit encryption on communications sent or smartphone auto-deletes, the user's identity and activity are protected from the watchful eyes of the intelligence community. This represents a clear challenge to intelligence actors, as it denies them access to information that many would argue plays a vital part in locating and preventing threats from being realised. Moreover, such technology offers more than ordinary information protection, as it erects 'warrant-proof' spaces: technological black boxes that, no matter what some authority might deem legitimately searchable, are protected to the extent that there are very limited or non-existent means of forcing one's way in. However, it will be argued here that not only is using such anonymising technology and its extra layer of protection people's right, but it is ethically mandatory. That is, due to the mass surveillance carried out by both governments and corporations, coupled with people's limited awareness of and ability to comprehend such data collection, anonymising technology should be built into the fabric of cyberspace to provide a minimal set of protections over people's information, and in doing so force the intelligence community to develop more targeted forms of data collection.
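To make the abstract's example of "256-bit encryption on communications sent" concrete, the short Python sketch below encrypts and decrypts a message with AES-256-GCM using the cryptography library. It is an illustrative sketch only; the key handling and message are hypothetical and are not drawn from Bellaby's paper.

    # Illustrative sketch: AES-256-GCM encryption with the Python "cryptography" library.
    # The key, nonce and message here are hypothetical placeholders.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # 256-bit symmetric key
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)                      # fresh 96-bit nonce for each message
    plaintext = b"meet at noon"
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)

    # Only a holder of the key can recover the message; an eavesdropper sees only ciphertext.
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == plaintext

Without the key, the intercepted ciphertext is computationally infeasible to read, which is the "warrant-proof" quality the abstract describes.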
