Unusual Gathering | Episode XXVII | Kristin Judge and Chris Cox

Conversations At The Intersection Of IT Security And Society

Guests
Kristin Judge | Chris Cox

Hosts
Sean Martin | Marco Ciappelli

ITSPmagazine’s Unusual Gatherings XXVII:
The very thin line between monitoring and spying



Technology is good for society. Technology is bad for society. On which side are you going to stand? 

Wait, before you pick one, let me tell you a secret: these statements are both wrong and right. Yes, on a practical level we can make a case for both, and neither one is going to prevail over the other; but from a philosophical and anthropological perspective, I feel quite confident in saying that technology is intrinsically good. Human nature is defined and driven by intellectual exploration and growth, and ultimately technology is simply part of being human: the most contemporary, though probably not the final, expression of our evolution as a species.

It seems that, more and more often, when we address cybersecurity we end up talking about privacy and personal information. We talk about cyber criminals trying to perpetrate fraud and steal personal data, about companies violating our trust by selling that data to third parties, and about IoT devices that collect far more data than they should and gain unauthorized access to our private lives by invading our homes and even our bodies. And, of course, there are the old cases of government institutions spying on citizens, and of companies and states spying on each other.

And those are just a few scenarios that have made the news. But what about when spying happens within the family walls: between partners, or between parents and kids? In all of these cases, the line between beneficial monitoring (activity that can reinforce the safety of the person being monitored) and something far darker can easily be crossed, and the whole scenario turns from good to bad: it becomes spying, stalking and abuse. This is when good technology turns “bad”, and the consequences become severe and extremely dangerous.

Technology isn’t “bad,” of course. It is either built or used that way, and that is when the line is crossed.

Among the many interesting observations and comments around this incredibly complex topic, these are some of the relevant issues we discussed:

• Abusive behaviors often start with knowledge of another person’s private life, information that can be turned into an instrument of control (e.g. blackmail).

• More and more frequently, technology is part of these cases: cyber monitoring, stalking, spying, and harassment are, in most cases, present in conjunction with physical harm.

• Abusive relationships are about power and control, both physical and emotional. The same technology that allows us to stay connected with the world allows the psychological harm to be present 24/7. As with cyberbullying, the mechanism is the same whether the technology is used for good or for bad, but its harmful uses are amplified.

• It is not just about installing software and hardware in a stealthy way. Most of the time it is a matter of what is defined as "dual-use technology," where an app or device designed for a specific, legitimate purpose is instead harnessed by the abuser for their own ends. A typical example is an app installed, with good intentions, on someone’s phone (a kid’s or a partner’s, say) whose use can quickly cross the line from monitoring to spying, or that can even be infiltrated by a third party with malicious intentions.

• Knowledge and education can be powerful tools that make the difference between good and bad technological choices. It is people’s responsibility to get educated about the IoT-connected devices they plug into their houses and carry around on their bodies. Unfortunately, almost every technology and app is “opt-in” by default, and we need to know what we are opting into and what we should immediately opt out of.

• Maybe soon companies will be driven by socially responsible ethics, or we will have regulations that force a default opt-out standard. Either way, we are looking at product regulation, individual education, and legislation.

• How can a line be drawn between monitoring and spying, and what can people do when they feel that line has been crossed? Some state laws can be applied in technology-abuse cases, but there is still much to be done to educate our elected officials so that they understand the consequences of good vs. bad technology.

• When things have gotten bad and someone becomes a victim, how can they safely get out of a dangerous situation without alerting the abuser, and find help, shelter and protection? Sadly, victim-advocate organizations, the very groups meant to help victims, are often not prepared to protect people from these threats. So there is much education and work to be done not only to prevent abuse, but also to protect the individuals who are escaping from such an unfortunate and harsh reality.

After playing devil's advocate and pointing out the flaws, we must think about solutions. We believe that things are changing and awareness is starting to grow. Many organizations are taking the initiative, bringing these conversations from the shadows into the light, fighting the good fight, increasing education, knowledge and understanding, and moving public opinion so that elected officials and legislators can address, and take action on, these critical social issues.

Knowing and understanding the problem is always the first step toward the solution. So, in the short time we had to cover this complex topic, we offered a few tips to help people become aware of potential spyware/stalkerware threats on their phones, computers and in their homes.

We talk about the behavioral and technical sides of things, and the fact that sometimes the problem is not even a deep technical hack, but rather just a matter of privacy settings, basic cyber hygiene, GPS location sharing, and social media accounts on our phones, to mention just a few.

There is so much more to talk about that we could have gone on much longer, but we felt it would have been overwhelming. Instead, we decided to expand the conversation in the future, dedicating more episodes to this topic and including discussions around monitoring in the workplace, schools, smart homes, and smart cities, with the vast range of possible repercussions on our privacy, safety, freedom, ethics, morals and human rights.

We will keep raising the issue until our collective voice is heard and something is done about it. 

Now listen up.