‘Upskirt’ photography, as the term suggests, refers to the voyeuristic practice of covertly taking pictures of women under their clothing without their consent or knowledge. These pictures, labeled ‘creepshots’, are generally pictures of a woman’s private areas. They are then widely disseminated via the internet, infamously through sites such as Reddit and 4chan.
In 2014, the legality of upskirt photography was brought before US courts in three different jurisdictions – Washington DC, Massachusetts and Texas. The first part of this post addresses the American perspective on the issue. It seeks to illustrate how America would rather err on the side of caution and permit the morally reprehensible act of upskirt photography than curtail free speech. The second part of the post looks at the Japanese perspective on creepshots, which is the polar opposite: Japan prioritizes women’s safety over free speech concerns. This part also examines the curious phenomenon of cellphone manufacturers taking the law into their own hands to regulate upskirt photography – a classic example, I argue, of Lessig’s adage that ‘code is law’. I conclude by extrapolating where India lies on this legal spectrum with regard to regulating upskirt photography.
(Image Source: https://flic.kr/p/oHcd72)
Just yesterday, the internet was abuzz with the news that the European Parliament (‘EP’) is pressurising the European Union (‘EU’) to break Google Search away from the rest of Google’s services (such as Android). We’ve covered Google’s antitrust woes with the EU on the TLF earlier. According to this Techdirt article here, the EP hasn’t really given any reason for breaking up Google other than that ‘it’s very big and very European’. (Of course, its power even to take such action is itself quite suspect.)
For Facebook, it has never been about profit, but about users. The social network has spent more than $22 billion on acquisitions, including $19 billion on WhatsApp alone – roughly 2,000 times WhatsApp’s annual revenue! Other notable acquisitions include Instagram ($1 billion), Oculus ($2 billion) and Atlas ($100 million). With the recent psychological experiments Facebook conducted on its unsuspecting users coming to light, it becomes imperative to understand how our information is collected, stored and used. In this blog post, I have tried to analyze the privacy policies (before and after acquisition) of three of Facebook’s major acquisitions – Instagram, Moves and WhatsApp.
1. It Looks Like India’s Going to get a Web Filter, by Nikhil Pahwa, Medianama.
2. Up-vote all you want, but the Internet isn’t a democracy, by Caitlin Dewey, The Washington Post.
Have you ever wondered how the spam in your mailbox is detected automatically? And what about speech recognition or handwriting recognition? These are challenging problems, but luckily they have one thing in common: data, and a good deal of it.
Machine learning aims at creating systems that learn from data, using techniques from computer science and mathematics. To put it differently, machine learning is the study of computer algorithms that improve automatically through experience – that is, through data.
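To make the “learning from data” idea concrete, here is a minimal sketch in plain Python of one classic approach to the spam problem mentioned above: a tiny Naive Bayes classifier. The training messages and labels are made up for illustration; a real spam filter would train on thousands of messages and use far richer features.

```python
from collections import Counter
import math

def train(messages):
    """Learn per-class word counts from labeled (text, label) pairs."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()  # how many messages of each class we saw
    for text, label in messages:
        for word in text.lower().split():
            counts[label][word] += 1
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Return the label with the higher (log) Naive Bayes score."""
    vocab = len(set(counts["spam"]) | set(counts["ham"]))
    scores = {}
    for label in counts:
        # Start from the class prior: how common the class is overall.
        score = math.log(totals[label] / sum(totals.values()))
        n = sum(counts[label].values())
        for word in text.lower().split():
            # Laplace smoothing (+1) so unseen words don't zero the score.
            score += math.log((counts[label][word] + 1) / (n + vocab))
        scores[label] = score
    return max(scores, key=scores.get)

# Toy training data (hypothetical, for illustration only).
data = [
    ("win money now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting at noon", "ham"),
    ("lunch tomorrow with the team", "ham"),
]
counts, totals = train(data)
print(classify("claim your free money", counts, totals))  # → spam
```

The key point is that nothing spam-specific is hard-coded: the same `train`/`classify` pair would learn any two-way text distinction, given labeled examples – which is exactly what “improving through data” means.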
It is my great pleasure to announce our new Guest Editor, Sahebjot Singh. Sahebjot is currently a computer science major at the Manipal Institute of Technology, and is an avid programmer and web developer who also enjoys dabbling in physics. He has worked earlier at a few startups, including Fracktal Works, a 3D printing startup, and WLS Global, a web consultancy, and is currently working on a few projects of his own. He’s also a Counter-Strike dilettante, and a connoisseur, if you can call it that, of manga and anime.
Saheb will bring to the TechLawForum@NALSAR the perspective of an engineering student on issues at the intersection of technology and law, and will also write a series of 101s on technological topics. His first post, on machine learning, is available here.