Written by Carolina Christofoletti
When we talk about The Onion Router (TOR) Project, we tend to believe that the protocol is so strong that, once connected to it, one enters an unbreakable protection bubble until the end of the session. Things are not like that: in computer terms, everything done outside of TOR remains, still to a great extent, a separate environment.
TOR will not armor your computer
TOR will armor what you are doing inside it, and with restrictions. Because TOR is a privacy-oriented browser, some JavaScript will not run there. As such, the first “easy” recourse of TOR users is, wrongly, to open a parallel browser to look at whatever they were looking for. And it is not only a different browser that people open while on TOR: they also open peer-to-peer software, open-web galleries, video players, and so on.
For a forensic expert attentive to details, it takes no more than 3 seconds to put the TOR session’s duration and the Open Web activity together and see what was going on there. And even where the default search engine is said to collect no compromising data, search engines are an end in themselves – they have no control over what goes on in the places you click.
So, is TOR forensics and deanonymization still possible? I believe so. And, from my own point of view, search warrants on the dark web are at a point that needs immediate discussion. Especially when one comes to the finding that violent criminals (that is, members of Child Sexual Abuse Material -CSAM- forums) use such things as translators to communicate with each other, to “discourage” linguistic analysis, and to translate CSAM titles. Let us see, then, what my proposal is.
When we talk about deanonymization, browsers are the first, if not the most relevant, forensic artifacts. To put it simply, every single browser serving as an Internet entry point carries with it some very specific identifiers through which its end user could finally be identified. And no, we are not talking about IPs. We are talking about “useless” information such as screen resolution, time zone, cookies, and others. At the end of the day, when put together, that information can add up to a unique identifier that traces back to the user.
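To make the idea concrete, here is a minimal sketch (in Python, with made-up attribute values) of how a fingerprinting script can fold these individually “useless” attributes into one near-unique identifier:

```python
import hashlib

# Hypothetical attributes of the kind fingerprinting scripts collect;
# none of them identifies a user on its own.
attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen_resolution": "1920x1080",
    "timezone": "Europe/Berlin",
    "language": "en-US",
    "fonts": "Arial;Calibri;Georgia;Times New Roman",
}

# Serialize the attributes in a fixed order and hash them: combined,
# they become a stable, near-unique browser identifier.
canonical = "|".join(f"{key}={value}" for key, value in sorted(attributes.items()))
fingerprint = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
print(fingerprint)
```

The point is not any single value but the combination: the more attributes a machine reports, the smaller the set of browsers sharing that exact combination.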
As such, the new generation of privacy-protecting browsers, such as Mozilla Firefox and The Onion Router (TOR), cares a lot, as could not be otherwise, about those very same fingerprints. In TOR’s case, the solution was to give the very same fingerprint to all TOR users, independently of which Operating System is being used. Where everyone has the same distinctive features, identification becomes much harder. Identification and trace-backs rely, from then on, on a “wrong click” or a “wrong set of configurations” selected by the user.
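The defensive logic can be sketched the same way (Python again, with hypothetical standardized values, not TOR’s actual defaults): if the browser reports identical attributes for every user, the derived fingerprint collides and stops discriminating between them:

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash a browser's reported attributes into one identifier."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical standardized values, as a TOR-like browser would report
# for every user, regardless of the underlying machine.
reported_by_user_a = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; rv:102.0) Gecko/20100101 Firefox/102.0",
    "screen_resolution": "1000x600",  # rounded to a common window size
    "timezone": "UTC",
    "language": "en-US",
}
reported_by_user_b = dict(reported_by_user_a)  # a different machine, same report

user_a = fingerprint(reported_by_user_a)
user_b = fingerprint(reported_by_user_b)
assert user_a == user_b  # identical fingerprints: the hash identifies no one
```

This is why the “wrong set of configurations” matters so much: any attribute the user changes away from the shared defaults makes their fingerprint diverge from the crowd again.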
Cookies are the second way a user could be identified while browsing the Internet. Cookies, unlike browser fingerprints, involve a slightly different problem: a large part of the Internet’s business is built around the cookie feature. Cookies serve, to put it simply, to improve the user’s experience: they guarantee, for example, that if you accidentally close the browser window while doing a 36-hour training, you will go back to where you stopped. Cookies are, figuratively, the browser’s memory. Of course, where there is memory, there is also commercial interest in the era of Big Data. Knowing what kind of certifications one takes might be useless if we look at it through a single user’s lens. But if you start to compile this data and cross-reference it, you can build matching predictions.
Cookies are stored in the browser, and also in the TOR browser. The nuance with TOR is that the cookies are “cleaned” only when you close it – which means that there are still cookies running, to improve the user’s experience, for as long as one keeps it open. Cookies threaten your privacy if all you want is for your data not to be “sold” to other companies for marketing purposes. But even if companies knew, through shared cookies, who constantly accessed a Child Sexual Abuse Material website on the Surface Web, the search warrant stops miles away, because the person who holds that cookie is, still, non-transparent data.
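A toy model of that behavior (a Python simplification under my own assumptions, not TOR’s actual implementation) makes the window visible: cookies accumulate and keep linking visits for the whole session, and only disappear when the browser is closed:

```python
class ToyPrivateBrowser:
    """Minimal model of a browser that, like TOR, keeps cookies only in
    memory and drops them all when the window is closed."""

    def __init__(self):
        self.cookies = {}

    def visit(self, site: str, cookie: str):
        # While the session lasts, cookies are stored and sent back,
        # linking together all the visits made in that same session.
        self.cookies[site] = cookie

    def close(self):
        # Closing the browser wipes the in-memory cookie store.
        self.cookies.clear()

browser = ToyPrivateBrowser()
browser.visit("translate.example", "session-id=abc123")  # hypothetical sites
browser.visit("forum.example", "session-id=xyz789")
print(len(browser.cookies))  # 2: both cookies live for as long as the session does
browser.close()
print(len(browser.cookies))  # 0: nothing persists after closing
```

The forensic observation of the following sections rests on that middle window: within one open session, the cookie store is a live record that connects everything visited through it.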
Social Engineering Vulnerabilities
And here comes the sun: even on the dark web, everyone accessing the Surface Web through an anonymized gateway (like a .onion version of Google, for example) is generating cookies. And, as on the Surface, there are still some collection points – which could indeed be very useful for investigations once one figures out that titles and instructions of Child Sexual Abuse Material are being translated on the Open Web.
Of course, this Dark/Surface vulnerability follows the very same principle as a .onion user logging into a Yahoo or Facebook account through TOR. As such, the cookies generated through a .onion address suddenly become a very interesting point for investigating serious crimes (such as Child Sexual Abuse Material ones).
Maybe IPs do not matter anymore. Maybe a post made through a surface-registered website, in a session storing a CSAM forum’s cookies, is far more valuable than the IP that hides behind it.
Think about it.