Saturday 13 November 2010

Firesheep: HTTP session hijacking Firefox extension!

Presented at Toorcon by its developer Eric Butler, Firesheep is a Firefox extension that sniffs the unencrypted HTTP sessions transmitted on the local network, giving ill-intentioned users access to Google, Twitter, Facebook, Flickr, Amazon, bit.ly (and other) accounts. So be careful at your school, college, or company!

The extension takes the shape of a side panel in Firefox, displaying the avatar and name of each user that gets "caught". By double-clicking an avatar, you are automatically logged into that user's account and can access his/her personal info. This sniffing method is nothing new, but now anyone can do it.

Sniffing, easy as a click!


If you want to take a look and test it (for educational purposes only, of course ^^), Firesheep is available here; to use it under Windows you'll also need WinPcap.

Now let's take a sneak peek under the hood. The flaw being exploited? Many websites don't secure all of their communication with HTTPS: they encrypt only the login step, "forgetting" that the session cookie is then sent over unencrypted HTTP where anyone on the network can read it...
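To see why an unencrypted cookie is enough, here is a minimal sketch (not Firesheep's actual code): once an HTTP request crosses the network in plain text, pulling out the session cookie is trivial string parsing. The captured request below is entirely made up for illustration.

```python
# Illustration only: extracting the Cookie header from a plaintext HTTP
# request, as any sniffer on the same network segment could do.
# The "captured" request below is a made-up example.

def extract_cookies(raw_request: str) -> dict:
    """Pull the Cookie header out of a plaintext HTTP request."""
    for line in raw_request.split("\r\n"):
        if line.lower().startswith("cookie:"):
            pairs = line.split(":", 1)[1].split(";")
            return dict(p.strip().split("=", 1) for p in pairs)
    return {}

captured = (
    "GET /home HTTP/1.1\r\n"
    "Host: example-social-site.com\r\n"
    "Cookie: session_id=a1b2c3d4; user=alice\r\n"
    "\r\n"
)

print(extract_cookies(captured))  # {'session_id': 'a1b2c3d4', 'user': 'alice'}
```

With `session_id` in hand, an attacker only has to replay it in his own requests to be treated as the victim, which is exactly what Firesheep automates behind its double-click.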

But don't worry! Many websites still have work to do to close this hole, but in the meantime you can force HTTPS encryption with those sites by installing the ForceTLS Firefox extension. After a quick setup, you can choose which websites must always encrypt your info.
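The idea behind ForceTLS can be sketched in a few lines (this is not the extension's actual code): keep a user-chosen list of hosts that must always be reached over HTTPS, and rewrite any plain-HTTP URL for those hosts before the request leaves the browser.

```python
# Sketch of the ForceTLS idea: rewrite http:// URLs to https:// for a
# user-chosen set of hosts. The host list below is just an example.
from urllib.parse import urlsplit, urlunsplit

FORCED_HOSTS = {"facebook.com", "twitter.com"}  # chosen by the user

def force_https(url: str) -> str:
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in FORCED_HOSTS:
        return urlunsplit(("https",) + tuple(parts[1:]))
    return url

print(force_https("http://twitter.com/home"))  # https://twitter.com/home
print(force_https("http://example.com/page"))  # not on the list: unchanged
```

Of course this only helps if the site actually serves its pages over HTTPS; forcing the scheme cannot add encryption a server refuses to provide.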

Friday 29 October 2010

Digital copiers as a security threat

I'm always stunned by how easy it is to retrieve sensitive information. By sensitive info, I mean bank account statements, insurance documents...

We feign ignorance, but we know that every piece of hardware that can store information represents a possible point of entry for information theft. That goes for USB flash drives, hard drives, memory cards (...), cell phones, and now even an apparently harmless copier can pose a threat. In fact, since 2002, nearly every copier made has been equipped with a hard drive that stores an image file of every copy and fax ever made with the device. So when a company decides to get rid of an "old" copier, a lot of sensitive information becomes available to anyone willing to grab it (industrial espionage, anyone?). All you need to access it is a screwdriver and a good Linux distribution with forensics tools (like DEFT or BackTrack).
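The forensic step is conceptually simple. Here is a toy sketch of "file carving", the technique those tools use: scan the raw disk image for JPEG magic numbers (`FF D8` start, `FF D9` end) and cut out whatever lies between them. Real tools are far more robust; the "disk image" below is an invented byte string.

```python
# Minimal file-carving illustration: find JPEG start/end markers in a raw
# byte stream and extract the data between them. Toy example only.

JPEG_SOI, JPEG_EOI = b"\xff\xd8", b"\xff\xd9"  # start/end of image markers

def carve_jpegs(disk: bytes) -> list:
    found, pos = [], 0
    while (start := disk.find(JPEG_SOI, pos)) != -1:
        end = disk.find(JPEG_EOI, start + 2)
        if end == -1:
            break
        found.append(disk[start:end + 2])  # keep markers included
        pos = end + 2
    return found

# Invented "disk image": junk bytes surrounding one embedded image.
disk_image = b"junk" + JPEG_SOI + b"scanned-invoice" + JPEG_EOI + b"more junk"
print(len(carve_jpegs(disk_image)))  # 1
```

A copier's drive full of page scans is exactly this situation at scale, which is why a screwdriver and a forensics live CD are all it takes.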

Here follows a quick CBS News segment about this issue:

Wednesday 20 October 2010

How does a (cheap) GSM interceptor work?

At the last DEFCON hacking conference, security researcher Chris Paget brought the topic of GSM communications security into the spotlight. Until now, listening to unencrypted GSM communications meant owning a pricey IMSI catcher. Paget built an antenna that works like those catchers for only... $1,500!


The antenna mimics a standard GSM base station: by broadcasting a stronger signal than the official carrier's, it gets nearby cell phones to connect to it automatically. Since we're talking about intercepting 2G GSM communications, Chris Paget also ran a 3G jammer in parallel to force the phones to fall back to 2G mode.

By default, on a normal GSM network, all conversations are encrypted. But encryption is managed by the base station, and on Chris's antenna it is disabled. According to the GSM specifications, the phone must notify the user when encryption is unavailable, but carriers deactivate that warning by default via the SIM card. So it's impossible to tell whether you are on an antenna with encryption or not...
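Those two facts are all the attack needs. Here is a toy model of them (all names and numbers invented, nothing resembling real GSM signaling): a phone camps on the strongest base station it hears, and the SIM's "ciphering indicator" flag decides whether the user is warned when that station offers no encryption.

```python
# Toy model: phones pick the strongest base station; the SIM flag decides
# whether a missing-encryption warning ever reaches the user.
from dataclasses import dataclass

@dataclass
class BaseStation:
    name: str
    signal_dbm: int      # higher (less negative) = stronger signal
    encrypted: bool

def camp_on(stations, sim_shows_cipher_warning: bool):
    best = max(stations, key=lambda s: s.signal_dbm)
    # Warning only fires if the SIM hasn't suppressed the indicator.
    warn = (not best.encrypted) and sim_shows_cipher_warning
    return best, warn

towers = [
    BaseStation("carrier_tower", -85, encrypted=True),
    BaseStation("rogue_antenna", -60, encrypted=False),  # stronger!
]

best, warn = camp_on(towers, sim_shows_cipher_warning=False)
print(best.name, "warned:", warn)  # rogue_antenna warned: False
```

With the indicator suppressed, the phone silently camps on the rogue antenna and the user has no way to notice the downgrade.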

Once a phone has connected to the interceptor antenna, its calls are routed through a VoIP service so the user can still place them. For an incoming call, the caller is sent to the voicemail of the person connected to the interceptor. During the presentation, nearly 30 phones connected to this special antenna. When people used their phones during the field test, they first heard a pre-recorded message from Paget telling them the conversation would be monitored and recorded, then the antenna patched them through to the people they wanted to talk to. The recorded conversations were written to a USB flash drive, which was destroyed once the demonstration was finished.

Once again, with a little hardware and some technical knowledge, anybody could spy on you. This time the blame lies with the carriers, who decided not to warn their customers when they are connected to an antenna that provides no encryption.

I invite you to watch the demonstration, as it is very educational, and the guy can be funny when he wants to be!

Part 1/4 :


Part 2/4 :


Part 3/4 :


Part 4/4 :

Monday 11 October 2010

Cloud Computing: Pros and Cons of a Revolution


The use of cloud computing is a fact in modern computing, but do we really know the “cloud” that well? In this review, we’ll be discussing the good points of this architecture (and therefore understand why it is so popular) but also several disadvantages. Our study will be mainly based on two research papers.
The first one, The Case for Cloud Computing, was written by Robert L. Grossman, a professor at the University of Illinois at Chicago and a member of the Open Data Group, and published by the IEEE Computer Society in 2009. Its main purpose is to explain what cloud computing is, to point out that there are two types of clouds, and to bring forth the advantages of a cloud-based architecture.
The second one, Above the Clouds: A Berkeley View of Cloud Computing, was written by Michael Armbrust et al. and published at the University of California at Berkeley in 2009. Besides explaining what a cloud is and its inherent advantages, this paper emphasizes the possible limitations of the cloud architecture.
We have all heard the term "cloud computing" at one point or another, but most people have only a loose idea of what it's really about. You can find many different definitions of a cloud; to sum up, we could say that cloud computing refers to two specific things: the applications delivered as services over the Internet (Amazon S3, Google WebApp) and the hardware in the datacenters that provides those services. There are many advantages to using a cloud: the illusion of infinite resources available whenever the user wants them, the ability for companies to start small and add resources only when needed, and the ability to pay for use on a short-term basis (sometimes far cheaper than a standard server farm [3])... [1]
But cloud computing is not a miracle cure, and there are drawbacks we need to take into account. Data security and confidentiality are very important topics nowadays, and service stability is also a big issue: cloud computing seems vulnerable to several types of attacks, especially if the workload controller (MapReduce, for example) cannot withstand a DoS attack [4]. Another issue is bugs in large-scale distributed systems; removing software errors at very large scale is one of the biggest challenges in cloud computing [2]. Yet another obstacle is software licensing [2]: many licenses restrict the number of computers the software can run on, which does not fit the pay-as-you-go scheme commonly found in clouds [2], and software companies are reluctant to design software specifically for clouds because it interferes with the quarterly sales tracking used to measure effectiveness, which is based on one-time purchases [2]. One more obstacle is data lock-in, which comes from a lack of standardization in the world of cloud computing: the APIs are still essentially proprietary, so customers cannot easily extract their data, or worse, cannot even recover it [5].
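The pay-as-you-go trade-off mentioned above can be made concrete with a rough break-even sketch. All figures below are invented for illustration; the Armbrust paper [2] develops a more careful version of this comparison.

```python
# Rough cost comparison (all figures invented): renting cloud capacity by
# the hour vs. buying a server and running it yourself for three years.

def cloud_cost(hours_used: float, price_per_hour: float) -> float:
    return hours_used * price_per_hour

def owned_cost(purchase: float, yearly_power_admin: float, years: float) -> float:
    return purchase + yearly_power_admin * years

three_years_h = 3 * 365 * 24  # 26,280 hours

# A lightly loaded server (20% utilization) only pays for the hours it uses
# in the cloud, but an owned machine costs the same whether busy or idle.
cloud = cloud_cost(hours_used=0.2 * three_years_h, price_per_hour=0.10)
owned = owned_cost(purchase=3000, yearly_power_admin=800, years=3)

print(f"cloud: ${cloud:.0f}, owned: ${owned:.0f}")
```

At low utilization the pay-per-use model wins comfortably; at sustained high utilization the owned hardware catches up, which is why the papers present this as a trade-off rather than a universal saving.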
Although cloud computing has many advantages that explain why services such as Amazon S3 are so popular, a lot of issues and concerns arise when we look further. The main problems are confidentiality and accessibility of data, but also software licensing; we need to change the way we consider computing in order to overcome these problems (for example, stop seeing computers mainly as hardware and start considering them as a set of services).
References:
[1] Robert L. Grossman, “The Case for Cloud Computing”, IEEE Computer Society, 2009.
[2] Michael Armbrust et al., “Above the Clouds: A Berkeley View of Cloud Computing”, University of California at Berkeley, 2009.
[3] J. Hamilton, “Cost of Power in Large Scale Data Centers”, 2008, available from http://perspectives.mvdirona.com .
[4] D. Hyuk Woo and H.H. Lee, “Analyzing performance vulnerability due to resource denial of service attack on chip multiprocessors”, Workshop on Chip Multiprocessor Memory Systems and Interconnects, 2007.
[5] J. Brodkin, “Loss of customer data spurs closure of online storage service ‘The Linkup’”, Network World, 2008.

Wednesday 6 October 2010

An MSc... but why?

It's been five years since I left high school, so why add another year of study? First of all, I've chosen to study
abroad (I'm French) because, as we live in a wholly connected world, I thought it would be a nice point of entry...
I have decided to do an MSc in Computing in order to improve my knowledge by learning new paradigms,
studying parallel programming...

I really hope that this course will allow me to learn new skills and to be more versatile and effective in my future
work. I'm pretty sure that it will enable me to raise my own issues and, of course, help me find my own
solutions.

As a conclusion, I could say that I hope this MSc will teach me "how to learn", a very useful skill as I intend
to keep learning new things on a lifelong basis.

Monday 4 October 2010

Delete: The Virtue of Forgetting in the Digital Age

With the shift to digital modes of storage (nowadays everyone has a flash drive, knows how to burn a CD...), "have we forgotten how to forget?" This is the main concern of Viktor Mayer-Schönberger, who has written a book: Delete: The Virtue of Forgetting in the Digital Age. For most of human history, every little thing we did was at some point forgotten, and from a social point of view that was a good thing; everyone has done stupid things that needed to be forgotten... The digital world has eliminated that forgiveness. Google caches copies of our blog posts (even this one!), social networking sites (e.g. Facebook) archive all of our photos and messages... It has literally become harder to erase information than to retrieve it. Why is this a problem? According to the author, in addition to piling up unwanted and out-of-date information, the social implications are tremendous, and everything we post could turn into a trap for a future career... (the author gives some examples in this video)

So the point is made: there is a real problem, but what are the solutions? What can be done in a world that relies on digital data? According to Viktor Mayer-Schönberger, people need to stop designing tools that automatically store everything; instead, we need to design tools that let people put an "expiration date" on the data they enter. The author gives the example of drop.io, an online private sharing service: what makes it unique is that when you upload a file, it asks you to set an expiration date on it. Once that date passes, the file is automatically deleted and the data in it... forgotten.
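The expiration-date idea is easy to picture in code. Here is a minimal sketch (not drop.io's actual design, and the class and method names are invented): every stored item carries a deadline, and a sweep deletes whatever has passed it.

```python
# Sketch of "data that forgets itself": each item is stored with a
# deadline, and expired items are swept away on access.
import time

class ExpiringStore:
    def __init__(self):
        self._items = {}  # name -> (data, expires_at)

    def put(self, name, data, ttl_seconds):
        self._items[name] = (data, time.time() + ttl_seconds)

    def get(self, name):
        self.sweep()
        item = self._items.get(name)
        return item[0] if item else None

    def sweep(self):
        now = time.time()
        self._items = {k: v for k, v in self._items.items() if v[1] > now}

store = ExpiringStore()
store.put("holiday.jpg", b"...", ttl_seconds=0.05)  # expires almost at once
print(store.get("holiday.jpg") is not None)  # True
time.sleep(0.1)
print(store.get("holiday.jpg"))  # None: forgotten
```

The interesting design point is that deletion is the default path, not an extra action the user must remember to take, which is exactly Mayer-Schönberger's argument.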

In my own opinion, I am more concerned about what social networking sites and web crawlers want to do with all this information... In some cases it's rather clear: with Gmail, for example, Google displays ads according to the contents of your mail. But what about social networking sites, what are they going to do with that info (and by info I mean our private lives...)? This problem doesn't exist for everyone, though; in fact, whether the problem exists in one's mind depends on one's relationship with digital tools. During the last class we took a quick survey. The question was rather straightforward: "Is the standard default of remembering on the Internet a problem?" The results: 18 for yes and 12 for no; the class was nearly cut in half on this topic. It would be interesting to run the survey on a wider population and study the results by the age of the participants, as kids tend to use social networking heavily.