
Why cyberattacks could be war crimes

Cyberattacks are the new normal, but when they come from abroad they can stir panic about an invisible cyberwar. If international conflicts are unavoidable, isn’t a cyberwar better than a physical war with bombs and bullets?

By: EBR - Posted: Thursday, July 20, 2017

by Patrick Lin*

Sure, cyberwar is better than a kinetic or physical war in many ways, but it could also make war worse. Unless it’s very carefully designed, a cyberattack could be a war crime.

Imagine that you’re a political leader and you want to take out an enemy base. You suspect it’s a propaganda machine and is financing terrorist activities. How would you do it?

Well, you could go the old-fashioned way — call in some airstrikes or send troops to blow up the building — but this would be an open declaration of war, worsening tensions. It would also be a political disaster if your troops or even drones were captured.

Now, there is another way: you could launch a cyberattack against the facility. This is far less visible and therefore less risky. It would take too long to hack directly into the facility’s secure network, but you’ve already created an email virus that can knock out the town’s energy grid, which would take the base down with it.

Let’s say you plan to disguise the malware as an official United Nations email to help ensure it’ll be opened by the local leaders. Once opened, the malware will spread autonomously across the town’s networks until it finds the energy grid, disables its controls and overloads its transformers.

Without power, the enemy headquarters has effectively been taken out, without a single boot on the ground or bullet fired. So, in this scenario, should you launch that cyberattack?

Before you do, your legal advisor might tell you: “Not so fast.”

By taking out an energy grid, you’re not only blacking out the enemy base, but also all local civilians. You will also infect innocent computers with the malware you used to reach the energy grid, and this seems to break a bedrock rule in the Laws of Armed Conflict: the principle of distinction, which requires that we never target non-combatants and that we spare them from the effects of an attack as much as possible.

Collateral damage is allowed, of course, but within limits. If a few nearby civilians are accidentally killed while some important target is blown up, that’s tragic, but not illegal in war, if the military advantage gained outweighs the deadly side effect. This is the rule of proportionality, which means that collateral damage must not be disproportionate or unreasonable.

Bombing an entire town to kill a lone sniper, for instance, would likely be disproportionate. Causing a blackout for an entire town or city? That could be excessive, too. Remember, electricity doesn’t just turn on the lights; it also keeps medicine and food refrigerated and runs air conditioning and heating units, without which hundreds of people, or more, could die in the summer or winter. Blowing up transformers could also start wildfires that harm or kill local residents.

Let’s say no town is nearby and no innocent civilians are affected in this scenario. There’s still a prior question of whether that enemy building is a legal target in the first place. If it’s only a propaganda machine and a bank for terrorists, yes, it certainly plays a crucial role in enabling militants. But being crucial doesn’t make something a legal target. The Laws of Armed Conflict prohibit the targeting of media and financiers, allowing only people and objects directly participating in hostilities as targets.

Even if we can resolve all of these things — no collateral damage, no affected civilians and a confirmed legal target — there’s also a rule against perfidy or treacherous deceit. Dressing up as a humanitarian worker or in a UN uniform to gain access and attack an enemy is an example of illegal perfidy. In your cyberattack, pretending that your email is coming from UN offices might break that rule — you’re disguising it with what’s supposed to be a neutral or protected status in war.

And, even if we can somehow resolve this issue, unleashing an autonomous cyberweapon could be a problem. In ongoing debates about killer robots, a key argument is that autonomous robots are illegal if we can’t retain meaningful human control. Their autonomy may create a responsibility gap, where it’s hard to pin liability on a person if things go wrong. After all, we can’t punish artificial intelligence (AI) for its decisions and actions.

Responsibility aside, without meaningful human control, we could see “flash escalations”, as military AI interacts with other AI systems at digital speed and causes unpredictable, cascading effects too fast for us to stop. This is something like the “flash crashes” that still plague stock markets or “flash spikes” from competing price-bots that can drive the sale price of a textbook to $23 million.
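
To make that feedback loop concrete, here is a minimal Python sketch, not taken from the article, of two hypothetical price-bots that each reprice against the other with no human check. The starting prices and multipliers are illustrative, loosely modelled on the reported textbook incident, but the dynamic is general: because the product of the two multipliers is greater than 1, the price climbs exponentially until something external stops it.

# Minimal sketch (illustrative, not from the article): two hypothetical
# price-bots reprice against each other with no human sanity check.
# Since mult_a * mult_b > 1, each round inflates the price further.

def simulate_price_spike(start_a=17.99, start_b=18.99,
                         mult_a=0.9983,    # bot A slightly undercuts bot B
                         mult_b=1.270589,  # bot B prices well above bot A
                         ceiling=23_000_000):
    price_a, price_b = start_a, start_b
    rounds = 0
    while price_a < ceiling:
        price_a = mult_a * price_b   # A reprices against B
        price_b = mult_b * price_a   # B reprices against A
        rounds += 1
    return rounds, price_a, price_b

if __name__ == "__main__":
    rounds, a, b = simulate_price_spike()
    print(f"After {rounds} repricing rounds: A = ${a:,.2f}, B = ${b:,.2f}")

Run as written, the sketch crosses the eight-figure mark in roughly sixty repricing rounds. The worry about military AI is that a similar runaway interaction between automated systems could unfold faster than any human could intervene.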

There are many other legal and ethical issues too, and it may seem strange that war is governed by so many rules. But the Laws of Armed Conflict exist to protect us all, so that war doesn’t become a free-for-all in which terrible, inhumane weapons are used, like biological agents or poison gas; innocent civilians pay for the sins of their politicians; and fighting is so cruel that lasting peace becomes impossible.

Deliberately breaking those rules means risking the charge of a war crime. It also sets a dangerous precedent that our enemies may follow, putting us all at risk. It undermines the rule of law and erodes the values such laws are meant to safeguard.

Now, it could be that those laws and norms need to evolve with technological realities. This isn’t meant to argue that cyberweapons should never be used. Again, something seems right about firing digital bullets instead of real ones. But, while we wait for the law to align with changing realities, some victims may turn to self-help measures, such as “hacking back” or counter cyberattacks, that could exacerbate international tensions.

Many other questions are now emerging. Recently, a Facebook glitch accidentally revealed personal information about its content moderators, potentially exposing them to retaliation from the terrorist groups they thwart. Under the old rules of war, it would certainly feel wrong for these civilian office workers to be legitimate targets. But if cyberspace is just another battlefield domain, then those content moderators could arguably be “combatants directly participating in hostilities” and therefore liable to attack. If that argument works, and it is as yet untested in law, then anyone else who participates in cyber operations against an adversary should be aware of this risk before they sign up.

Given the risks and uncertainty, this is a conversation we need to have right now, not after the cyber genie is out of the bottle and has ripped through the laws of war. By that time, it may be too late.

*Director, Ethics and Emerging Sciences Group, California Polytechnic State University (Cal Poly)
**First published in www.weforum.org
