Paranoid defence controls could criminalise teaching encryption

(This article was also published at The Conversation.)

You wouldn’t think that academic computer science courses could be classified as an export of military technology.

But unfortunately, under recently passed laws, there is a real possibility that innocuous educational and research activities could fall foul of Australian defence export control laws.

Under these laws, despite recent amendments, such “supplies of technology” — and possibly a wide range of other benign activities — come under a censorship regime involving criminal penalties of up to 10 years imprisonment.

The Defence and Strategic Goods List

How could this be?

The story begins with the Australian government’s list of things it considers important to national defence and security. It’s called the Defence and Strategic Goods List (DSGL). Goods on this list are tightly controlled.

Regulation of military weapons is not a particularly controversial idea. But the DSGL covers much more than munitions. It includes many “dual use” goods – goods with both military and civilian uses – including for instance substantial sections on chemicals, electronics, and telecommunications.

Disturbingly, the DSGL veers wildly in the direction of over-classification, covering activities that are completely unrelated to military or intelligence applications. To illustrate, I will focus on the university sector, and one area of interest to mathematicians like myself — encryption — which raises these issues particularly acutely. But similar considerations apply to a wide range of subject material, and to commerce, industry and government.

Encryption: An essential tool for privacy

Encryption is the process of encoding a message, so that it can be sent privately; decryption is the process of decoding it, so that it can be read. Encryption and decryption are two aspects of cryptography, the study of secure communication.
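As a toy illustration of that round trip (my sketch, not part of the original argument), here is a one-time pad: the message is XOR-ed with a random key of the same length, and decryption is the same XOR applied again.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    """Encode a message by XOR-ing it with a same-length secret key (one-time pad)."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Decoding is the identical XOR operation, since (m ^ k) ^ k == m."""
    return encrypt(ciphertext, key)

message = b"meet me at noon"
key = secrets.token_bytes(len(message))      # random key, kept secret by both parties
ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message   # round trip recovers the plaintext
```

Real-world encryption (as used by browsers, banks and hospitals) uses far more sophisticated algorithms, but the envelope-and-letter structure is the same.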

As with many technologies subject to “dual use” regulation, the first question is whether encryption should be covered at all.

Once the preserve of spies and governments, encryption algorithms have now become an essential part of modern life. We use them almost every time we go online. Encryption is used routinely by consumers to guard against identity theft; by businesses to ensure the security of transactions; by hospitals to ensure the privacy of medical records; and much more. Given that email has about as much security as a postcard, encryption is the electronic equivalent of an envelope.

Encryption is perhaps “dual use” in the narrow sense that it is useful to military and intelligence agencies as well as to civilians; but so are other “dual use” technologies like cars.

Moreover, while States certainly spy on each other, essentially everyone with an internet connection is known to be spied on. Since the Snowden revelations — and much earlier for those who were paying attention — we know about mass surveillance by the NSA, along with its Five Eyes partners, which include Australia.

While States have no right to privacy — this is the whole point of Freedom of Information laws — an individual’s right to privacy is a fundamental human right. And in today’s world, encryption is essential for citizens to safeguard this human right. Strict control of encryption as dual-use technology, then, would not only be a misuse of State power, but the curtailment of a fundamental freedom.

How the DSGL covers encryption

Nonetheless, let’s assume for the purposes of argument that there is a justification for regarding at least some aspects of cryptography as “dual use”. (Let’s also put aside the efforts of government, stretching back over decades now, to weaken cryptographic standards and harass researchers.)

The DSGL contains detailed technical specifications covering encryption. Very roughly, it covers encryption above a certain “strength” level, as measured by technical parameters such as “key length” or “field size”.

The practical question is how high the bar is set: how powerful must encryption be, in order to be classified as “dual use”?

The bar is set low. For instance, software engineers debate whether they should use 2048 or 4096 bits for the RSA algorithm, but the DSGL classifies anything over 512 bits as “dual-use”. It’s probably more accurate to say that the only cryptography not covered by the DSGL is cryptography so weak that it would be foolish to use.
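To make the point concrete, the threshold comparison can be sketched as follows. The 512-bit figure for RSA-style keys is the one discussed above; the actual DSGL text is far more detailed, so treat this as an illustrative simplification only.

```python
# Illustrative threshold only; the real DSGL provisions are far more detailed.
DSGL_ASYMMETRIC_KEY_BITS = 512   # bar for RSA-style algorithms, as discussed above

def likely_dsgl_controlled(rsa_key_bits: int) -> bool:
    """Rough sketch: RSA keys above 512 bits fall under the dual-use listing."""
    return rsa_key_bits > DSGL_ASYMMETRIC_KEY_BITS

# Every key size anyone would sensibly use today clears the bar:
for bits in (1024, 2048, 4096):
    assert likely_dsgl_controlled(bits)

# Only keys too weak to rely on escape classification:
assert not likely_dsgl_controlled(512)
```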

Moreover, the DSGL doesn’t just cover encryption software: it also covers systems, electronics and equipment used to implement, develop, produce or test it.

In short, the DSGL casts an extremely wide net, potentially catching open source privacy software, information security research and education, and the entire computer security industry, in its snare. This is typical of its approach.

Most ridiculous, however, are some badly flawed technicalities. As I have argued elsewhere, the specifications are so poorly written that they potentially include a little algorithm you learned at primary school called division. If so, then division has become a weapon, and your calculator (or smartphone, or computer, or any electronic device) is a delivery system for it.
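For reference, the “little algorithm” in question is just schoolbook long division, which can be sketched in a few lines. The point, of course, is the algorithm itself, not this particular code.

```python
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Schoolbook long division: bring down one digit at a time and
    subtract multiples of the divisor, as taught in primary school."""
    quotient = 0
    remainder = 0
    for digit in str(dividend):                 # process digits left to right
        remainder = remainder * 10 + int(digit) # "bring down" the next digit
        q_digit = remainder // divisor          # how many times does it go?
        quotient = quotient * 10 + q_digit
        remainder -= q_digit * divisor
    return quotient, remainder

assert long_division(1234, 7) == (176, 2)       # 7 × 176 + 2 = 1234
```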

These issues are not unique to Australia: the DSGL encryption provisions are copied almost verbatim from the Wassenaar Arrangement, an international arms control agreement. What is unique to Australia is the harshness of the law relating to the list.

Criminal offences for research and teaching?

The Australian Defence Trade Controls Act (DTCA) regulates the list, and enacts a censorship regime with severe criminal penalties.

The DTCA prohibits the “supply” of DSGL technology to anyone outside Australia without a permit. The “supply” need not involve money, and can consist of merely providing access to technology. It also prohibits the “publication” of DSGL technology but, after recent amendments, this prohibition applies to only half the DSGL: munitions, not dual-use technologies.

What is a “supply”? The law does not define the word precisely, but the Department of Defence seems to think that merely explaining an algorithm would be an “intangible supply”. If so, then surely teaching DSGL material, or collaborating on research about it, would be covered.

University education is a thoroughly international and online affair — not to mention research — so any such supply, on any DSGL topic, is likely to end up overseas on a regular basis.

Outside of academia, what about programmers working on international projects like Tor, providing free software so citizens can enjoy their privacy rights online? Network security professionals working with overseas counterparts? Indeed, the entire computer security industry?

Examples of innocuous, or even admirable, activities potentially criminalised by this law are easily multiplied. Those engaged in such activities must seek government approval or face criminal charges — an outrageous attack on academic freedom that chills legitimate enquiry, to say the least.

To be sure, there are exceptions in the law, which have been expanded under recent amendments. But they are patchy, uncertain and dangerously limited.

For instance, public domain material and “basic scientific research” are not regarded as DSGL technology. However, researchers by definition create new material not in the public domain; and “basic scientific research” is a narrow term which excludes research with practical objectives. Lecturers, admirably, often include new research in teaching material. In such circumstances none of these provisions will be of assistance.

Another exemption covers supplies of dual-use technology made “preparatory to publication”, apparently to protect researchers. But this exemption will provide little comfort to researchers aiming for applications or commercialisation; and none at all to educators or industry. A further exemption is made for oral supplies of DSGL technology, so if computer science lecturers can teach without writing (giving a whole new meaning to “off the books”!) they might be safe.

Unlike in the US, there is no exception for education; none for public interest material; and indeed, the Explanatory Memorandum makes clear that the government envisions universities seeking permits to teach students DSGL material – and, by implication, criminal charges if they do not.

On a rather different note, the DTCA specifically enables the Australian and US militaries to freely share technology.

Thus, an Australian professor emailing an international collaborator or international postgraduate student about a new applied cryptography idea, or explaining a new variant on a cryptographic algorithm on a blackboard in a recorded lecture viewed overseas — despite having nothing to do with military or intelligence applications — may expose herself to criminal liability. At the same time, munitions flow freely across the Pacific. Such is Australia’s military export control regime.

Now, there is nothing wrong in principle with government regulation of military technology. But when the net is cast as broadly as the DSGL — especially as with encryption — and the regulatory approach is censorship with criminal penalties — as with the DTCA’s permit regime — then the result is a vast overreach. Even if the Department of Defence did not exercise its censorship powers, the mere possibility is enough for a chilling effect stifling the free flow of ideas and progress.

The DTCA was passed in 2012, with the criminal offences schedule to come into effect in May 2015. Thankfully, emergency amendments in April 2015 have provided some reprieve.

Despite those amendments, the laws remain paranoid. The DSGL vastly over-classifies technologies as dual-use, including essentially all sensible uses of encryption. The DTCA potentially criminalises an enormous range of legitimate research and development activity as a supply of dual-use technology, dangerously attacking academic freedom — and freedom in general — in the process.

This story illustrates just one of many ways in which basic freedoms are being eroded in the name of national security.

Unless further changes are made, criminal penalties of up to 10 years imprisonment will come into effect on 2 April 2016.

The day after April Fools’ Day. Jokes should be over by then.


Written by dan

May 9th, 2015 at 5:12 pm

The CIA 119

Years and years on, abuses continue.

The Bureau of Investigative Journalism, together with the Rendition Project, is still trying to piece together the CIA’s kidnapping (“rendition”) and torture programme.

Only in December 2014 did the US Senate Intelligence Committee release a summary of its report into the programme — a programme which, at least according to this report summary, effectively ended in 2006.

It took nearly ten years after the fact for an official report to arrive.

And this report, despite arriving so late on the scene, had only its summary published — the rest of the report is still classified to this day — and even the summary was the subject of bitter controversy among politicians. (Though what counts as controversial among US mainstream politicians is not a very good guide as to what matters are deserving of controversy: take global warming, for instance.)

Only with this report, well over a decade after most of the facts, did we learn the most basic facts about the programme, like the number of people captured under it. The answer, at least according to the report, is 119. They appear to have ranged from dangerous terrorists through to innocents sold to the CIA for profit.

The Bureau’s report begins to pull together the evidence to find out what happened to them. They were disappeared from their lives, disappeared into unaccountable captivity, disappeared into a legal black hole — and, in several cases, disappeared from history. The Bureau was unable to determine the fate of 39 of the abductees.

It is a story of brutality, incompetence and a total lack of accountability. To be sure, the programme apprehended some terrorists — though it appears that following a proper legal process would, in every case, have led to better results in terms of security and preventing terrorism, as well as, of course, respecting the law and human rights. But other cases are ridiculous.

There is Laid Saidi, who was tortured by submersion in a bathtub of icy water and interrogated about a conversation in which he talked about aeroplanes (as if that were a crime) — except it turns out, thanks to faulty translation, he was talking about tyres. Saidi was later released — except he was released to the wrong country, so had to be taken back into custody and released again months later.

There is Khaled el Masri, who was detained by Macedonian authorities and held in a hotel in Skopje, then handed over to the CIA and taken to Afghanistan. There he was tortured by beatings, solitary confinement, and sodomy. His crime? Having a name similar to that of an alleged terrorist. He eventually won damages from Macedonia in the European Court of Human Rights, but his case is unusual in having won some recompense.

Of course, this is only one of many programmes run by the CIA as part of the “War on Terror” — a “war” which, for the most part, appears to have consisted of terror. And the CIA is only one of numerous US government agencies to have engaged in abuses. And the United States is only one of many nations to have engaged in abuses — indeed, they all do, though the US still reigns supreme in its ability to project force around the globe. Australia has assisted many of these abuses.

Almost fourteen years after September 11 2001, more than ten years after most of the kidnappings, the struggle to find out what happened and why remains ongoing. These events offer a window not just into a particular time and place, but into the institutional circumstances in which unaccountable force is used and unpunished (or even “legal”) crimes are committed.

In Australia we have heard a lot recently about “lest we forget”. We should above all remember the abuses perpetrated by ourselves and our allies — lest we forget them, and in so doing enable them to happen again. The struggle of people against power has always been the struggle of memory against forgetting.

There is also the constructive question, in examining abusive organisations and programmes like this one, of identifying what factors caused, or at least allowed, such horrors to happen. What better set of institutions can we build to ensure that similar abuses never happen again — and maintain peace and security for all?


Written by dan

May 2nd, 2015 at 3:14 am