Update: the decision has been entirely reversed on appeal http://news.yahoo.com/google-executives-acquitted-milan-autism-video-case-124736237--sector.html
I have already said as much, by and large, and microblogged extensively about it. My opinion is that the decision is a shame for my country.
What has happened
A teenager with Down's syndrome was beaten by a gang of classmates, who filmed him and posted the video on Google Video. Why on Earth anybody could be so stupid as to do either of those things escapes me, and I hope that those people have understood how criminal, inhuman and uncool their behaviour was. This is beside the point, though.
The parents of the boy – with whom I am very sympathetic – reacted and sought counsel from Vividown, an association of relatives of people with Down's syndrome. Vividown did the least urgent thing to address the online insult to the boy: it went to the Police and helped the parents press charges against both the culprits and Google. A few months later, the Police informed Google and asked it to take the video offline; in the meantime the video had – incredibly – become quite popular (which speaks volumes about some internauts' intelligence).
As soon as it was informed of the problem, Google took the content down, and possibly thought that was the end of it.
Not quite: the Prosecutor in Milan, where Google is based in Italy and where the content was allegedly put online, decided to indict four Google executives. Recently the Court of Milan ruled that the executives had violated Italian Data Protection Law and convicted them.
This is what I have learned from public sources; I hope I have not reported it inaccurately. I have no direct knowledge of the facts.
Why the decision is wrong, wrong, wrong
In Italy, as in the rest of Europe, a provision of the e-Commerce Directive (Arts. 14-15) establishes that a service provider shall not be liable for the content it hosts unless, once notified of the problem, it has failed to react promptly. The aim of the provision is clear: the development of the information society is possible only if the service provider is not put under the excessive burden of monitoring each and every piece of content that goes online through its services. This is the first reason why the decision is – in my very humble opinion – wrong.
The press reports that the Prosecutor alleged, and the Court upheld, that Google is not a service provider but a content provider. Wrong. A digital broadcast TV channel is a content provider. An online news service is a content provider. You are a content provider because you produce or select the content that goes online. A service provider merely gives subscribers the tools to go online with their own content. In other words, you are a content provider because you control the content; the Court reverses the logic, holding that because you host content you are a content provider, and therefore must control it. This is the second reason why the decision is wrong.
But you could say that you are not really bothered by this: after all, Google has tons of money and you are neither a service nor a content provider, so to hell with that. Wrong!
This is the third reason why the decision is wrong. Its consequence is mandatory filtering of all content that is put online, and that is very akin to censorship.
Filtering == censorship
It is impossible to line up enough people to watch, inspect, and report on each and every video that is uploaded. Too much information is put online every second, period.
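To get a sense of the scale, here is a rough back-of-the-envelope calculation in Python. The upload rate is an illustrative assumption (figures of roughly this order were reported for YouTube around that time), and the reviewer throughput is hypothetical; the point is the order of magnitude, not the exact numbers.

```python
# Back-of-the-envelope: how many full-time reviewers would be needed
# to watch everything uploaded to a large video platform?
# All figures below are illustrative assumptions, not any platform's real numbers.

UPLOAD_HOURS_PER_MINUTE = 20   # assumed: ~20 hours of video uploaded per minute
WORK_HOURS_PER_DAY = 8         # one reviewer's daily watching capacity

# Hours of video uploaded in a single day
uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24

# Reviewers needed just to keep pace with one day's uploads
reviewers_needed = uploaded_per_day / WORK_HOURS_PER_DAY

print(f"{uploaded_per_day} hours of video uploaded per day")
print(f"{reviewers_needed:.0f} reviewers watching non-stop, every working day")
```

Even with these conservative assumptions, you need thousands of people doing nothing but watching videos, forever, just to keep up; and that is before anyone decides whether what they saw is unlawful.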
So the Prosecutor's solution seems to be: "you are already doing this in China, so you can do it here". What Google is doing there is censorship. So we want censorship here too.
Very clever! And precisely how should a technical measure decide whether a piece of content is obscene, libelous or privacy-infringing, when decisions of this sort are very hard to reach even in a full court? Do we want to give up on the law and rely on technical means? This is madness! This is sheer ignorance. It is exactly what Google has bowed to, and should not have, on (sometimes bogus) copyright infringements (of course, only for the "majors"; everyone else can get lost).
Stefano Rodotà – former president of the Data Protection Authority in Italy – said it very bluntly in a recent radio interview I stumbled upon. He said, more or less: "do we want to give Google an excuse to really become Big Brother?" To fix a nonexistent problem, these smart people would give immense power, and an unprecedented amount of control over people's data, to somebody who has the money, the technical ability and – well – the data to exercise it.
A nonexistent problem
And this is the last point. It seems that the Google issue is an afterthought. We don't know how long the content stayed online after the parents came to know of it. But when the video became known, they should simply have sent a cease-and-desist letter. Had Google not reacted promptly to that, I would have been in favour of the gravest liability. But this was not the case, and I am still under the impression that the problem is that some people simply fail to understand how things work in technology and telecommunications.
The culprits are those who committed the crime and those who extended its effect by putting the video online. Those boys were identified and prosecuted; they have been convicted, and that matter is now closed.
The decision has now been published in full. It seems that the account I have given is fairly accurate. The only correction is that the video stayed online for a shorter time than reported above: slightly more than one month, with about 5,500 downloads. The judge ignores some facts entirely; he even notes that the download figure is only indicative because of the "virality" of this kind of video, while it was technically almost impossible to download the video locally and reuse it in ways other than embedding (where the actual service, again, comes from Google). The key point is that by "indexing" the content the Judge finds that Google is a content provider, but he seems to ignore the difference between indexing manually and indexing automatically via agnostic heuristic algorithms. The decision reflects a very poor understanding of the technical implications.
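To illustrate the distinction the Judge seems to miss: automatic indexing applies the same content-agnostic rule to every document, and no human selects or endorses anything. A minimal sketch of an inverted index (hypothetical illustration only, not Google's actual pipeline):

```python
from collections import defaultdict

def build_index(documents):
    """Build an inverted index: each word maps to the IDs of the
    documents containing it. The procedure is content-agnostic --
    it never considers what a document means, only which tokens
    it contains, and no human chooses what gets indexed."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

# Hypothetical example: the titles of three uploaded videos
docs = {
    1: "cat plays piano",
    2: "piano lesson for beginners",
    3: "cat chases dog",
}
index = build_index(docs)
print(sorted(index["piano"]))  # -> [1, 2]
print(sorted(index["cat"]))    # -> [1, 3]
```

A human editor, by contrast, would decide which documents to include and promote in the first place; it is that editorial choice, absent here, which makes somebody a content provider.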
Also, the President of the Italian Data Protection Authority says more or less the same things about preliminary censorship and about the overly bold stretching of the law used to justify the decision (which is based on the Italian Data Protection Code). He also stresses one point that struck me as completely flawed: the decision finds the officers guilty of failing to provide proper information. In the first place, that is not a criminal offence at all (it carries just an administrative fine, Section 161 of the Italian DPC); but more importantly, the duty is to inform the user about the data they upload, not to advise the subscriber that a person depicted in a video must be informed – that latter duty simply does not exist.
This is a minor, almost harmless part of the decision, because it carries no criminal consequences, but it is quite surprising that the Judge has misunderstood such basic law.