Friday, February 27, 2015

What’s your email? Let me guess…

Recently, a friend asked me for the email address of one of my colleagues. I told him I couldn’t give it to him, but that he could easily guess it, because almost every company in the world uses one (or more) of three common algorithms: [first letter of first name]+[last name], [first name]+[first letter of last name], and [first name].[last name]. My friend said this couldn’t be true, because his own email was somewhat different. So I showed him how a simple 7-line script can dump every reasonable permutation of his name and, with a simple CDO.Message call, even send a message to all of them (then, by comparing the list to the bounce emails, you can easily figure out the right address even if he doesn’t respond).
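For the curious, here’s roughly what such a permutation script might look like, translated to Python (the name and domain here are placeholders):

```python
# Placeholder name and domain; swap in the target's details.
first, last, domain = "john", "smith", "example.com"
f, l = first[0], last[0]

patterns = [
    f + last,            # jsmith   - the classic
    first + l,           # johns
    first + "." + last,  # john.smith
    first + "_" + last,  # john_smith
    first + last,        # johnsmith
    last + "." + first,  # smith.john
    last + f,            # smithj
    first,               # john
]

emails = [p + "@" + domain for p in patterns]
print("\n".join(emails))
```

Feed that list to whatever sending mechanism you like and watch the bounces come back.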

My buddy, who works for a security company, was sure this kind of approach wouldn’t work, and that their security monitoring team would detect and block it, but, as I was able to prove a few seconds later (when he got my email), this is virtually impossible. A well-staffed security team could detect a major enumeration attack, but there’s no practical way to detect someone sending a few dozen wrong emails. During this little experiment, we also learned that his company supports the [first name].[last name] algorithm as well, so guessing an employee’s email is even simpler.

Add to that other easy ways of finding an email, such as doing a simple web search for [first name] [last name] *@[], and it’s pretty much impossible to hide. For example:


By the way, when I ran my script, it produced 20 permutations of my friend’s name, and one of them was actually the email of someone else…who was only too happy to provide me with the correct address. That would classify as “social engineering”, which can be slower, but just as effective.

So, if you ever wondered how you end up with so much spam in your inbox even though you never posted your email anywhere, this is it. Spammers simply run enumeration scripts that send messages to various combinations of well-known first and last names, and then mark as “good” any address that didn’t bounce. Some mail servers can detect a massive influx of bad emails, but if the spammer sets the attempt rate low enough, it’s virtually impossible to detect.

The biggest question is, of course, what can we do about it? The answer is simple. When provisioning users, simply avoid the classic algorithms. For example, add a random high number at the end of the address (jsmith78), or some character like an underscore (jsmith_). If you want to preserve an easy-to-remember structure, you could use two underscores as the separator (john__smith) instead of a single dot or underscore, or add the middle initial (john.j.smith). Regardless, check if your email server has a feature to detect and alert on a large number of bounced emails. If not, you can write a script to go over the list of bounced emails and alert if something looks wrong (for example, if the number changes drastically from one period to another, or if many bounces are coming from a specific IP).
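If you end up scripting the bounce check yourself, a minimal sketch might look like this (the thresholds and the tuple-based log format are made-up assumptions; adapt them to whatever your mail server actually logs):

```python
from collections import Counter

def check_bounces(bounces, prev_count, spike_factor=3, ip_share=0.5):
    """bounces: list of (sender_ip, recipient) for the current period."""
    alerts = []
    # Alert if the bounce count jumped drastically versus the last period
    if prev_count and len(bounces) > spike_factor * prev_count:
        alerts.append("bounce count jumped from %d to %d"
                      % (prev_count, len(bounces)))
    # Alert if a single IP is responsible for most of the bounces
    if bounces:
        top_ip, n = Counter(ip for ip, _ in bounces).most_common(1)[0]
        if n / len(bounces) >= ip_share:
            alerts.append("%s caused %d of %d bounces"
                          % (top_ip, n, len(bounces)))
    return alerts

# Fabricated example: one IP hammering guessed addresses
sample = [("10.0.0.9", "guess%d@example.com" % i) for i in range(40)]
sample.append(("10.0.0.7", "typo@example.com"))
print(check_bounces(sample, prev_count=5))
```

Run it from a scheduled task against each period’s bounce log and mail yourself the alerts.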

Friday, January 23, 2015

WHAT on earth do you mean? (Cryptography concepts and protocols)

When working with SSL and cryptography in general, there are a LOT of concepts and protocols flying around. When reading Netmon traces, I often found myself not being sure what a certain acronym refers to or where it belongs. To make life a little easier, I created this infographic poster, which covers this, as well as some interesting tidbits about important topics in cryptography.

You can download this version as a wallpaper, or the PDF below to print and hang on your wall (It’s optimized for a 20x30” print, which Costco does for less than $10).

Hope you like it!

Cryptographic Protocols and Concepts poster 012015 (wallpaper JPG)

Download the high-res PDF:

Monday, November 10, 2014

I’m afraid I can’t do that, Dev (Patel)

Finding a topic for this week’s blog is sometimes challenging, but this week, HBO was kind enough to give me something to talk about.

Those who watched yesterday’s episode might have noticed how Neelamani Sampat (a.k.a. Neal), the Indian “computer genius” played by Dev Patel, needs an “air gapped” computer to receive some sensitive files from someone. Will McAvoy proceeds to hand over a credit card to Neal and tells him to go buy such a computer.


The concept of an “air gap” may sound familiar to those who dealt with UAG, Microsoft’s reverse proxy and application publishing product (which I used to be a part of until a few years back). Indeed, Whale Communications, which was later acquired by Microsoft, originated a product called “AirGap” and used it to create a super-secure gateway device for connecting two networks. In reality, the AirGap concept is no more secure than any good firewall, but it was developed specifically to meet some poorly worded security policies set forth by the Israeli government and some Israeli military and defense industry related organizations.

While I have to appreciate Aaron Sorkin and his team trying to educate the public about security, I’m also uncomfortable with the way they do it, which can end up misleading rather than teaching. Let me explain a bit more about this.

While Neal describes this as “literally a gap of air between the computer and the rest of the world”, this is not literal in any way. What’s really going on in this episode is that a person wants to provide Neal with sensitive documents, and they are concerned that if Neal opens them on a regular computer (one that has an internet connection), the government might be able to know of this because they might be bugging Neal’s regular computers in some way.

One fallacy in this plot line is that there’s no such thing as an “air gapped” computer any more than an “HBO-less” television. If you want a computer that isn’t connected to the internet, just disconnect it from the internet (pull out the network cable or disable the Wi-Fi card)! Even if we went to the extremely paranoid state of mind that the government is somehow capable of overriding this, then the laptop Neal buys at Best Buy would certainly not be immune to that sort of high-level espionage, and the only thing to do would be to work inside a Faraday cage, which blocks any radio signals coming in or out.

Another fallacy is the claim that Neal needs to set up a higher level of encryption on top of his regular AES-encrypted mail because his opponent is capable of 3 trillion guesses per second. A fast supercomputer is indeed capable of around a trillion guesses per second in a brute-force attack against an encrypted message, but even then, it would take billions of billions of years to crack even a lowly 128 bit key (on the order of 1,000,000,000,000,000,000 years).
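The arithmetic is easy to check yourself:

```python
# Rough check of the claim: 2^128 keys at 3 trillion guesses per second.
guesses_per_second = 3e12
keyspace = 2 ** 128                      # about 3.4e38 possible keys
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / guesses_per_second / seconds_per_year
print("%.1e years" % years)              # on the order of 10^18 years
```

That’s assuming the attacker has to try half the keyspace on average, which only shaves off a factor of two…not much comfort for Neal’s opponent.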

I probably can’t go into each and every error in that episode right here, so here’s a quick summary:

1. Any store-bought laptop cannot be, by definition, “air gapped”, because they all contain wireless networking hardware.

2. An “air gapped” computer would be a specialty item, one you wouldn’t find in a regular store like Best Buy

3. Any computer can be air-gapped if you rip out the networking equipment

4. Any computer that has networking hardware built in cannot be fully air-gapped, because it might be remotely-controlled

5. Even a computer that doesn’t have networking hardware might still be tracked via concealed networking hardware (just like a phone can be bugged without the user’s knowledge)

6. Even a computer that has NO networking hardware of any kind can still be eavesdropped upon by external cameras, which can see what’s showing on the screen or being typed on the keyboard

7. Even when there are no external cameras, a computer can be eavesdropped upon using the radio signals it emits as part of normal operation. The only way to block that is with a Faraday cage

8. An AES encrypted message is so secure that it’s infeasible to break within a person’s lifetime, even with the best supercomputers in the world (like those used by the NSA)

9. If Neal’s computers are monitored or controlled by the government (the premise that necessitates the air-gapped computer), then they could easily read the encrypted message as it’s being decrypted or shown on-screen, and any level of encryption would be useless. Same if they can access the building’s security cameras.

10. Stashing the USB drive in a public bathroom as opposed to a personal delivery would violate ALL the other security steps taken prior. Really??? In the Toilet tank?!?! (BTW…Ewwwww!)

11. Leaving the “air-gapped” computer in the room and walking out, like Neal did, without locking it down or securely-wiping it is also a terrible idea.

12. Discussing this with his peers in an open-air place like the balcony is a big no-no. In fact, I would say that after the issues they had with the government recently, not to mention their high profile as a news source, they should not be discussing ANYTHING secret anywhere near their building.

Now…if you, dear reader, are a journalist, or script-writer, or anyone else who would like to deliver or receive secret info, here are some rules to help you out:

1. Do not start the conversation on a computer of any kind…not even if the message is encrypted. Follow the person and grab him at a public and noisy place like Starbucks. Even if the person is followed, the noise reduces the chance of successful eavesdropping even with directional microphones.

2. Stay away from windows, so lip-reading can’t be used

3. If either of you needs to use a computer, build one from scratch using parts. Lock the computer case with a high-security lock and check it daily for modifications. Install the operating system yourself, from a secure source (an original DVD, not a downloaded copy). Enable storage encryption (like BitLocker).

4. On the computer, enable passwords in the BIOS, on the drive (BitLocker), in the operating system, and in any software that supports password protection. Make sure every password is unique, long, complex and not written down anywhere.

5. Turn off the computer when not in use, and pull out the power cord.

6. When not in use, lock the computer itself, as well as the keyboard and screen in a physical locker (these can be implanted with a bug too)

7. When using the computer, do so in a metal cage (a Faraday cage)

8. Make sure you’re not in an environment with security cameras, windows or other people.

9. If you can, install a security system in the room/house, or at least tamper-evident safeguards (for example, gluing a hair on the doorframe can tell you if it’s been opened)

10. If you need to transfer data from or to the computer, use floppy disks or CD-ROM discs (not USB drives). Destroy them with fire after use. If you have to use a permanent storage device like a USB drive, buy small and cheap ones, destroy them with heat after use, or at least encrypt their content.

Finally, keep in mind that a person or group with sufficient resources can tap into ANYTHING, and even professional spies get caught. If you’re dealing with stuff that would get the NSA or CIA interested, your chances of avoiding them completely are virtually zero…just don’t do it.

Friday, October 24, 2014

ECC, See? Si?

With cryptography being based on mathematical manipulation of data, mathematics and strong computers are the natural tools for breaking encryption. Thousands of scientists and hackers are working diligently to find holes and vulnerabilities in common encryption technology (such as the Heartbleed vulnerability in the Heartbeat extension to TLS), and the computers available to them keep getting faster and cheaper.

Keeping ahead of the enemy has been going in two directions. One is making the math “longer” by using longer keys. If you switch from a 512 bit key to a 1024 bit key, it will take a computer a lot longer to break it (breaking the 2048 bit keys used in many SSL certificates today is considered infeasible with current computing power). The other direction is making the math more complex by designing algorithms that are more effective and harder to break.

Over the years, we’ve seen the industry move from one algorithm to another, and the latest generation in cryptography is ECC – Elliptic Curve Cryptography. ECC is an approach to public-key cryptography based on the algebraic structure of elliptic curves over finite fields. Quote from Wikipedia:

Public-key cryptography is based on the intractability of certain mathematical problems. Early public-key systems are secure assuming that it is difficult to factor a large integer composed of two or more large prime factors. For elliptic-curve-based protocols, it is assumed that finding the discrete logarithm of a random elliptic curve element with respect to a publicly known base point is infeasible — this is the "elliptic curve discrete logarithm problem" or ECDLP. The entire security of ECC depends on the ability to compute a point multiplication and the inability to compute the multiplicand given the original and product points. The size of the elliptic curve determines the difficulty of the problem.

Finite fields? Intractability? Discrete logarithm? Publicly known base point? If that mumbo-jumbo gives you a headache, let’s simplify it a bit….

Encryption is based on scrambling your data using some kind of key, and then decrypting it on the other side. For the other side to do so, it has to have the key. In classic encryption (for example, a spy working behind enemy lines), the decryptor would have the key beforehand (a.k.a. a “pre-shared key”), but when two computers are talking over the internet, they can’t do that (assuming you don’t plan on flying out to San Francisco to pre-share a key with eBay or Gmail). Instead, the concept of a public/private key pair was developed. This is a special piece of math where the sender and recipient use TWO keys: you use one key to encrypt the data, and the other to decrypt it. The math is such that the key used for the encryption cannot be used for the decryption. The decryption key is kept secret and designated the “private” key, while the encryption key can be handed out freely as the “public” key.

When a cryptographic session starts, the server gives the client its “public” key. The client then generates a unique two-way key that will be used JUST for this session. The client encrypts that two-way key using the server’s public key, and sends it over. Because of the way this math works, only the server can decrypt the key, because only the server has the “private” key. Once it decrypts the encrypted two-way key, both sides have effectively shared a regular key and they can start using it to both encrypt and decrypt subsequent data.

Well…why switch to a two-way key? Why not just use the private/public system for the whole session? The reason is that the math required for it is complex and slow, and if we did that, it would take a long time to transmit the message. Also, if someone was recording the encrypted data and the private key was ever compromised, every session ever recorded would be exposed. With a unique two-way session key, every session is protected individually.

So, private/public encryption and decryption is complex and slow, and we keep lengthening the keys to keep up with increasing computing power…and in comes ECC. Elliptic Curve cryptography is based on special math using functions in the family of y² = x³ + ax + b. These functions, when plotted, produce the characteristic curve that gives the technique its name. For example:


The math makes it much harder to break, which means an ECC based encryption with a 256 bit key takes about as much effort to break as traditional encryption with a 2048 bit key. By using this technology you reduce the load on your server significantly, and that means a lot if you are running a website servicing millions of people a day (just think of how much money you can save by using 100 servers instead of 800!). Alternatively, you could keep the longer key length and get dramatically stronger security for the same computational cost.
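If you want to get a feel for the math, here’s a tiny Python sketch that finds the points satisfying y² = x³ + ax + b over a small prime field (the curve parameters here are made up for illustration; real ECC uses standardized curves over fields with roughly 256-bit primes):

```python
# Made-up toy curve over a tiny field; real ECC uses ~256-bit primes.
a, b, p = 2, 3, 97

# Brute-force every (x, y) pair and keep those on the curve
points = [(x, y) for x in range(p) for y in range(p)
          if (y * y) % p == (x ** 3 + a * x + b) % p]
print(len(points), "points, e.g.", points[:3])
```

The set of such points (plus a special “point at infinity”) is what the ECDLP problem quoted above is defined over: point multiplication is easy, but recovering the multiplicand is not.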

I should note that in the real world, things might not be as clearly defined as I stated above. Switching to ECC doesn’t really mean you can ditch 87.5% of your servers, because there are probably other bottlenecks. There are also compatibility considerations…if you switch to ECC on your server, your clients need to support it too, and not all clients do (for example, Windows XP doesn’t, and it still comprises a large chunk of the web client market).

Next time I’ll talk more about the practical aspects of using ECC.

Thursday, October 16, 2014

One dog you don’t want to pet…

Poodles are great dogs, but a few days ago we all learned of one you sure don’t want to pet. The POODLE exploit, also known as CVE-2014-3566, was reported and documented in September and once again reminds us that making compromises in the world of security can end up biting us.

The actual attack we are referring to here is known as a “padding oracle” attack, and it’s quite old…over a decade old, in fact. It was reported back in 2002, at a time when SSL 3.0 was already “old” (6 years old is a lot in computer time, right?). SSL 3 was replaced by TLS 1.0 back in 1999, and since then by TLS 1.1 and TLS 1.2, so these days SSL 3.0 is rarely used. Even Windows XP is set to prefer TLS.

So the padding oracle attack is old, and pretty much all servers and clients are designed to prefer TLS, which isn’t sensitive to this sort of attack…but things are not that simple. Pretty much all clients and servers negotiate the encryption technology they use when connecting, and fall back to older protocols when unable to use newer ones. This allows a web server to drop from TLS 1.2 to TLS 1.1 if it talks to a client that’s too old to support TLS 1.2. Fallback is of particular importance to public servers, because they usually cannot afford to rule out older clients.

With pretty much all servers and clients allowing protocol negotiation and fallback by default, and having SSL 3 as an available option, the world now has to deal with POODLE. All the attacker has to do is interfere with the connection during the protocol negotiation, and force the client and server to negotiate down to SSL 3.0…and then exploit the Padding Oracle vulnerability in it. This interference is not very simple, but can be done if the attacker has some control over the network.

Ultimately, having both control over the network to force SSL 3 and the ability to exploit the padding oracle isn’t trivial, which is why this issue is not considered as high a risk as Heartbleed and other recent vulnerabilities. It does need to be addressed, though, and more importantly, we need to keep it in mind when designing or configuring systems. Keeping old protocols in place is comfortable and reassuring, but can also turn out to be a gaping hole in our security.


What to do about POODLE?

POODLE can be used only if both client and server are willing to fall back to SSL 3.0, but clients cannot trust all servers to prevent this, or vice versa, so both server owners and users should take steps. Browser vendors are working on updating their products, but until they do, you can disable SSL 3.0 at the operating system level. For example, DigiCert has published this article describing how to do it on Windows, which is quite simple. You can read more about POODLE and how it works here.

Thursday, October 9, 2014

Improve your grades!

Well…school days might be over, but if you’re running a web server servicing HTTPS, you might still care about your grades! I’m referring, of course, to Qualys’ SSL Labs HTTPS testing and grading system. Qualys set up this free grading service several years ago, and it’s now considered by most to be the de-facto standard for qualifying web servers. For many companies and administrators running public sites, getting an A+ is an important goal, and today I’ll talk about getting that score for your IIS server.

Many people who run Qualys’ test against their IIS are disappointed to get only an A-, with the test quoting the following two reasons:

1. RC4 cipher is used with TLS 1.1 or newer protocols, even though stronger ciphers are available.

2. The server does not support Forward Secrecy with the reference browsers.

Even though Windows Server 2012 was released not long ago, it still gets this grade with a default configuration. Luckily, though, you can easily adjust your settings to get the desired A+. The key to this is the cipher suite priority list built into the SChannel component of the operating system. When a client (a web browser, for example) connects to a server, they conduct a negotiation during which the client tells the server which cipher suites it supports. The server then goes through its own prioritized list of suites, looking for the highest-priority one that the client supports. Once that is determined, the server notifies the client which suite to use, and the connection continues. Let’s see how this looks in a network trace (Netmon capture):

Client sending its list to the server:


As you can see, the most preferred cipher suite is TLS_RSA_WITH_AES_128_CBC_SHA256, which is TLS 1.2 using the RSA Key Exchange and Digital Signature (see THIS article about reading Cipher Suite names) and AES based encryption with 128 bit key length, CBC Mode and SHA256 hashing. The packet also shows that the client supports Elliptic Curves and Server Name Indication (SNI) extensions, as well as a few others (the bottom part of the list).

After the server picks the suite that’s best, it tells the client to use it:


As you can see, the server picked TLS_RSA_WITH_AES_128_CBC_SHA, which was actually the 2nd highest item on the client’s list. This suggests the server either doesn’t support the SHA256 hash, or has it lower in its own preference list. In reality, there are virtually NO servers out there that don’t support SHA256 these days, so the 2nd option is what really happened.
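By the way, if you don’t have a trace handy, you can peek at a client-side cipher list with Python’s ssl module (a quick sketch; the exact list and its order depend on your Python and OpenSSL builds):

```python
import ssl

# The default context's cipher list, in client preference order
ctx = ssl.create_default_context()
for suite in ctx.get_ciphers()[:5]:
    print(suite["name"], suite["protocol"])
```

It’s the same kind of ordered list the browser sends in the ClientHello above, just viewed from your own machine.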

When Qualys’ SSL Labs test examines a server, it establishes several connections, each sending a different preference order, to see what the server chooses. If the RC4 cipher is high on the server’s priority list and gets chosen, the grade is reduced. Similarly, if the server has forward-secrecy based key exchange suites too low in the priority list, the grade is also reduced to A-.

What can you do about it? Well, the ideal solution is to upgrade your server to Windows 2012 R2. This latest version of the server (for now, until the next version is released in 2015) ships with a different default cipher suite order, in which RC4 is de-prioritized and forward secrecy suites are prioritized. Let’s not forget that moving to a newer version brings other benefits as well!

If this is not a viable option, another thing you can do is manually re-order the cipher suites on your server to match the list built into Windows 2012 R2. The list looks like this:


As you can see, the top 10 suites in this list are perfect-forward-secrecy based (the ECDHE key exchange algorithm stands for Elliptic Curve Diffie-Hellman Ephemeral, and it’s the ephemeral part that provides the perfect forward secrecy). Also, the RC4 cipher is all the way down the list.

Changing the order is relatively simple. Officially, you’re supposed to edit Group Policy, as described here. However, Nartac Software has released a tool that makes it a lot easier. It’s called IIS Crypto, and it’s a free download from here. To use it, simply run it on the server and use the arrows to re-order the list to your liking:


As you can see, the tool can also be used to completely disable certain protocols, ciphers, hashes or key exchange algorithms, and you also have several templates built-in to make things a one-click operation.

One thing to keep in mind is that changes to the order don’t take effect immediately. After you apply them, whether with IIS Crypto or otherwise, you need to reboot the computer. Once you do, SSL Labs should grade you with the sought-after A+.

Thursday, October 2, 2014

Reading between the lines

Most people never need to mess-around with cipher suites, but if you do, it can be quite confusing to figure out their cryptic names. For example, what’s the difference between TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 and TLS_RSA_WITH_AES_256_GCM_SHA384 or what is the difference between the 1st, 2nd and 3rd “256” in TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA256_P256?
If you ever wondered what this all means, I got the answers!
The cipher suite name is comprised of the following pieces (some suites have fewer pieces):
1)      The protocol
2)      The Key Exchange
3)      The Digital Signature
4)      The Encryption (Cipher)
5)      The encryption key length
6)      The encryption mode
7)      The Hashing algorithm
8)      The Elliptic Curve size
Let’s review what they mean, and what possible values you could find.
The protocol is the most important value, and it can have two possible values – SSL and TLS. SSL is a family of older protocols starting from SSL 1 and going up to SSL 3. While most computers still support it, even the latest version (3) has significant vulnerabilities, so it’s rarely used anymore and many servers explicitly de-prioritize or disable it. The TLS family of protocols has 3 members – 1.0, 1.1 and 1.2 – and like anything else, most people (and the servers they configure) prefer the latest member. At this time, 1.2 is the latest, and a draft of version 1.3 was published a few months ago. We can expect to start seeing it in public use within a year or two. Cipher suites don’t actually list the version of the protocol and just state TLS or SSL, with the version number being negotiated between the client and server separately.
Key Exchange
The key exchange algorithm controls how the client gives the server the symmetric key that will be used for the session. The common key exchange families are RSA and DH (Diffie-Hellman). DH has several variations, like ECDH (Elliptic Curve DH), DHE (DH Ephemeral) and ECDHE (both elliptic curve and ephemeral). Windows servers usually prefer the Diffie-Hellman exchanges, and the typical cipher suite priority list will list them almost exclusively.
Digital Signature
The digital signature makes sure that the data exchanged between client and server is protected from forgery or alteration. The common algorithms are RSA, ECDSA and DSS. Several cipher suites use RSA for both the key exchange AND the digital signature, so their name lists RSA only once – for example, TLS_RSA_WITH_AES_128_CBC_SHA256.
Encryption (Cipher) and key length
The encryption algorithm is about how the data actually gets scrambled, and it’s always paired with a key length between 128 and 256 bits. AES is pretty much ubiquitous these days, though RC4 was in use for many years and still shows up now and then. You can sometimes see 3DES (a.k.a. “Triple DES”), and on non-Windows platforms, occasionally others. A key length of 256 bits might seem low compared to the 1024 bits (or more) used in digital certificates, but in reality, symmetric encryption is stronger, and a 1024 bit key for a certificate (asymmetric encryption) is roughly equivalent to an 80 bit symmetric key. That means a 128 bit key is pretty darn good, and a 256 bit key is terrific…we are many decades away from anyone (even the NSA or quantum computers) having the ability to brute-force their way through a 256 bit symmetric key. Since 256 is still much stronger than 128, the default cipher priority order on any computer favors 256 over lower lengths.
Encryption mode
The encryption mode is actually an extension, so it’s optional and not all suites specify one. For example, RC4 doesn’t offer advanced encryption modes, so none is listed. AES does, and the default list of suites includes CBC and GCM as the primary modes. GCM offers better security, as it implements several technologies to protect message integrity (so does CBC, but GCM is better).
Hashing algorithm
Hashing benefits security by detecting tampering with the encrypted data. Changing the data invalidates the data’s hash, thus alerting the recipient that the data has been tampered with. SHA (Secure Hash Algorithm) is a family of cryptographic hash functions published by the National Institute of Standards and Technology (NIST) as a U.S. Federal Information Processing Standard (FIPS). The number following SHA is the output size in bits. The bigger the hash, the harder it is to brute-force, hence the more secure and preferred it is.
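You can see the tamper-detection idea in a few lines of Python: change a single character and the SHA-256 hash comes out completely different.

```python
import hashlib

# Two messages differing by one character produce unrelated hashes
original = hashlib.sha256(b"wire $100 to account 1234").hexdigest()
tampered = hashlib.sha256(b"wire $900 to account 1234").hexdigest()
print(original == tampered)   # False - one changed byte, a whole new hash
```

That’s why an attacker can’t quietly flip a bit in transit: the recipient recomputes the hash and it no longer matches.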
Elliptic Curve size
Some key exchange algorithms use elliptic curves, which are easier to calculate. This provides higher security, since the same CPU power can generate better encryption in less time. When elliptic curves are used, the suite specifies the curve size and adds a “P” to differentiate it from the encryption algorithm’s key length. Suites that don’t use elliptic curves won’t list a curve size.
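To tie the pieces together, here’s a rough Python sketch that splits a Windows-style suite name into the parts described above (it covers the common naming pattern, not every variant out there):

```python
def parse_suite(name):
    """Split a Windows-style cipher suite name into its labeled pieces."""
    parts = {}
    head, _, curve = name.partition("_P")      # optional trailing curve size
    if curve:
        parts["curve"] = "P" + curve
    kx_sig, _, enc_part = head.partition("_WITH_")
    tokens = kx_sig.split("_")
    parts["protocol"] = tokens[0]              # TLS or SSL
    parts["key_exchange"] = tokens[1]          # e.g. ECDHE or RSA
    if len(tokens) > 2:
        parts["signature"] = tokens[2]         # e.g. RSA, ECDSA, DSS
    enc = enc_part.split("_")
    parts["hash"] = enc[-1]                    # e.g. SHA256
    parts["cipher"] = "_".join(enc[:-1])       # e.g. AES_256_CBC
    return parts

print(parse_suite("TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA256_P256"))
```

Try it on the suites from the list below and you’ll see how the pieces line up.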
Ready to see this in real life? Have a look at the following list of suites built-into Windows 2012 R2 and see if you identify the properties of the suites and how they differ: