Crypto'02 Conference

By Tolga Acar
The conference was held at the University of California, Santa Barbara campus on August 18-22, 2002. As before, various companies and universities interested in computer security and cryptography were represented, although .com representation seemed lower than in previous years, possibly due to the economic downturn.

Presentations
A new AES analysis described the AES algebraic structure within a larger cipher, called BES. Instead of mixing GF(2) and GF(2^8) operations, BES uses only GF(2^8) operations, which simplifies analysis. AES can then be analyzed as a very sparse MQ (multivariate quadratic) system over GF(2^8). This doesn't mean that AES is broken or that its strength is reduced; it means that our understanding of AES is improving. I think this was the best talk of the conference.
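
To give a flavor of what "very sparse MQ system" means here, the following is a minimal Python sketch of my own (not taken from the paper): the core of the AES S-box is inversion in GF(2^8), so every S-box input/output pair (x, y) satisfies simple quadratic relations such as x*y = 1 and x^2*y = x, and, roughly speaking, BES strings such relations together with GF(2^8)-linear maps.

```python
def gf_mul(a, b):
    """Multiply in GF(2^8) using the AES polynomial x^8 + x^4 + x^3 + x + 1."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def gf_inv(a):
    """Inverse via a^254 = a^(-1) in GF(2^8)."""
    r = 1
    for _ in range(254):
        r = gf_mul(r, a)
    return r

# Check the quadratic S-box relations for every nonzero input.
for x in range(1, 256):
    y = gf_inv(x)
    assert gf_mul(x, y) == 1                  # x * y = 1
    assert gf_mul(gf_mul(x, x), y) == x       # x^2 * y = x
print("all 255 nonzero inputs satisfy the quadratic S-box relations")
```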

The next presentation analyzed the piecewise processing of message blocks in various modes of operation, including the widely used CBC mode. The attack exploits the immediate availability of each ciphertext block as soon as the corresponding plaintext block has been processed. It allows the attacker to be adaptive from one block to the next within a single message, and assumes that the cipher is available to produce ciphertext blocks for chosen plaintext blocks on request.
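
Here is a toy sketch, under my own simplifying assumptions (a SHA-256-based stand-in plays the block cipher, and the encryptor hands back each CBC ciphertext block immediately), of why this piecewise behaviour matters: since C1 is known before the next plaintext block is committed, the attacker can force the cipher input to repeat and thereby confirm a guess for an earlier plaintext block.

```python
import hashlib, os

BLOCK = 16

def toy_cipher(key, block):
    """Toy stand-in for a block cipher (NOT a real PRP), keyed and deterministic."""
    return hashlib.sha256(key + block).digest()[:BLOCK]

class CBCEncryptor:
    """Encrypts one block at a time, returning each ciphertext block immediately --
    exactly the piecewise behaviour the attack exploits."""
    def __init__(self, key, iv):
        self.key, self.prev = key, iv
    def encrypt_block(self, p):
        c = toy_cipher(self.key, bytes(x ^ y for x, y in zip(p, self.prev)))
        self.prev = c
        return c

key, iv = os.urandom(16), os.urandom(16)
enc = CBCEncryptor(key, iv)

secret = b"TOP-SECRET-BLOCK"        # P1: unknown to the attacker in the real setting
c0 = iv
c1 = enc.encrypt_block(secret)      # attacker sees C1 before choosing the next block

guess = b"TOP-SECRET-BLOCK"         # attacker's guess for P1
# Choose P2 = guess XOR C0 XOR C1 so the cipher input repeats iff the guess is right.
p2 = bytes(g ^ a ^ b for g, a, b in zip(guess, c0, c1))
c2 = enc.encrypt_block(p2)
print("guess confirmed" if c2 == c1 else "guess wrong")
```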

The presentation on revocation of anonymous credentials introduced the notion of a dynamic accumulator, in which values can be dynamically added to and deleted from a short accumulator value that represents the accumulation of all previous inputs. This can be quite useful in anonymous certificate and identity escrow systems.
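
As a rough illustration (my own toy sketch with insecure parameters, in the spirit of an RSA-based construction rather than the paper's exact scheme), the accumulator is a single group element, adding a value is a public exponentiation, and deleting a value requires the manager's trapdoor:

```python
# Toy RSA-style dynamic accumulator; tiny, insecure parameters for readability.
p, q = 1019, 1031                 # trapdoor, kept by the accumulator manager
N = p * q
phi = (p - 1) * (q - 1)
g = 3                             # public base; the accumulator starts at g

def add(acc, x):
    """Accumulate value x (a public operation)."""
    return pow(acc, x, N)

def delete(acc, x):
    """Remove value x; needs the trapdoor to invert the exponent mod phi(N)."""
    return pow(acc, pow(x, -1, phi), N)

def witness(values, x):
    """Membership witness for x: accumulate everything except x."""
    w = g
    for v in values:
        if v != x:
            w = pow(w, v, N)
    return w

def verify(acc, x, w):
    return pow(w, x, N) == acc

values = [65537, 101, 257]        # accumulated values, chosen coprime to phi(N)
acc = g
for v in values:
    acc = add(acc, v)

w = witness(values, 101)
print(verify(acc, 101, w))        # True: 101 is currently accumulated
acc = delete(acc, 101)
print(verify(acc, 101, w))        # False: 101 has been revoked
```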

Jacques Stern's presentation on flaws in signature scheme proofs affirmed that security proofs should not be taken for granted until they have been sufficiently analyzed in public. V. Shoup found one such flaw in the OAEP security proof at the RSA 2002 conference. Stern's paper presented flaws in the security proofs of the ESIGN and ECDSA signature schemes. This doesn't mean that these signature schemes are broken, but their published security proofs do not hold as stated.

Another presentation gave the first known security proof of the pervasive TLS/SSL protocol. The paper showed that the security of TLS can be related, via a reasonably tight reduction, to the hardness of inverting RSA given a partial-RSA decision oracle. A partial-RSA decision oracle takes fewer input bits than the modulus and tells whether a truncated plaintext-ciphertext pair is a potential match for a given RSA key pair. The paper continues the proof by showing that, for RSA key sizes larger than 1024 bits and the SSL/TLS PRF, the security of the TLS/SSL protocol can be reduced to the hardness of inverting RSA.
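
To make the oracle's interface concrete, here is a toy illustration of my own (not the paper's formulation): the oracle answers whether a candidate for part of the plaintext is consistent with a given ciphertext. In this sketch it is simulated with the private key, which a real adversary would not hold, and the "truncation" is taken to be the low-order bits; tiny, insecure parameters are used.

```python
# Toy partial-RSA decision oracle (illustration only).
p, q, e = 61, 53, 17
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))
T = 4                                   # the oracle sees only the low T plaintext bits

def partial_rsa_decision(c, guess_bits):
    """True iff the low T bits of the RSA decryption of c equal guess_bits."""
    m = pow(c, d, N)
    return (m & ((1 << T) - 1)) == guess_bits

m = 1234 % N
c = pow(m, e, N)
print(partial_rsa_decision(c, m & 0b1111))          # True
print(partial_rsa_decision(c, (m + 1) & 0b1111))    # False
```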

H. Krawczyk presented a security analysis of the SIGMA (SIGn-and-MAc) protocols, the family of authenticated Diffie-Hellman key exchange protocols used by IKE, the Internet Key Exchange protocol. The difficulty lies in the conflict between the authentication and identity-concealment requirements.

A paper presented on Tuesday analyzed the security of three NTRU encryption-padding schemes and claimed a weakness against chosen-ciphertext attacks. At Wednesday's rump session, however, two NTRU representatives showed, in a rather mocking fashion, that the analysis was incorrect due to false assumptions about and a misunderstanding of NTRU's padding schemes.

David Wagner presented a paper on the birthday problem. The birthday problem deals with finding two people with a matching property, say their birthdays. With the classical birthday problem, for instance, it is possible to find a collision in an n-bit hash function with complexity O(2^(n/2)); this corresponds to a 2-list problem, k=2. This paper looked at finding one element from each of k given lists of n-bit values so that the resulting k values XOR to zero, and showed a cube-root-time algorithm, roughly O(2^(n/3)), for the case k=4: it finds elements such that x1 XOR x2 XOR x3 XOR x4 = 0. Note that the new algorithm does not find all possible solutions, but only one.
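
The following is a small Python sketch of my own (parameters chosen so the demo runs instantly) of the 4-list idea: draw four lists of roughly 2^(n/3) random n-bit values, join pairs of lists on their low n/3 bits, and then look for a full collision between the two joined lists.

```python
import os
from collections import defaultdict

n = 24                        # bit length of the values
l = n // 3
mask = (1 << l) - 1           # low n/3 bits matched in the first join
size = 4 * (1 << l)           # list size: a small constant times 2^(n/3)

def rand_list():
    return [int.from_bytes(os.urandom(4), "big") >> (32 - n) for _ in range(size)]

L1, L2, L3, L4 = (rand_list() for _ in range(4))

def join(A, B):
    """Keep (a XOR b, a, b) for every pair whose XOR is zero on the low l bits."""
    buckets = defaultdict(list)
    for b in B:
        buckets[b & mask].append(b)
    return [(a ^ b, a, b) for a in A for b in buckets[a & mask]]

J12, J34 = join(L1, L2), join(L3, L4)
index = {v: (a, b) for v, a, b in J34}
for v, x1, x2 in J12:
    if v in index:            # full collision: x1^x2 == x3^x4, so the 4-way XOR is 0
        x3, x4 = index[v]
        print(hex(x1), hex(x2), hex(x3), hex(x4))
        break
else:
    print("unlucky run: no solution found with these list sizes")
```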

One session on RC4 discussed the number of initial keystream bytes that should be discarded in order to be safe, and recommended discarding at least 512 bytes instead of the previously reported 256 bytes in order to render known analyses harmless.
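
A minimal sketch of what the recommendation looks like in practice (my own illustration, often referred to as RC4-drop): run RC4 as usual, but throw away the first 512 keystream bytes before encrypting anything.

```python
def rc4_keystream(key, discard=512):
    """Standard RC4, yielding keystream bytes after discarding the first `discard` bytes."""
    S = list(range(256))
    j = 0
    for i in range(256):                            # key scheduling (KSA)
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = produced = 0
    while True:                                     # output generation (PRGA)
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        produced += 1
        if produced > discard:                      # skip the early, biased output
            yield S[(S[i] + S[j]) % 256]

ks = rc4_keystream(b"example key")
ciphertext = bytes(p ^ next(ks) for p in b"attack at dawn")
```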

Other
Microsoft made available a Palladium FAQ page, see Palladium. A Microsoft representative talked about Palladium and tried to answer questions. The Q&A didn't get far once people started asking hard questions about privacy and about the history of similar attempts. For more, go to the page above and read it with a copious amount of salt. In short, Palladium has some form of TCPA hardware (the Fritz chip) underneath, with a set of trusted programs instantiating other programs and walls built between those programs. This is essentially what was done 20 or so years ago: a layered reference monitor with hardware memory-protection support. The claim is that Microsoft will make the bottom layer's source code available for public scrutiny and that any user will be able to turn the feature on and off. It is not clear whether some products will refuse to function properly unless Palladium is enabled.

David Chaum was one of the invited speakers and discussed anonymity and privacy going forward. He raised privacy concerns about initiatives like the Liberty Alliance, MS Passport, and web security built around identities: what sort of information would be stored at such service centers, and what would be disclosed, to whom, how, and under what circumstances. Identity-management services have inherent privacy issues because they store personal and private information that can potentially be disclosed without the individual's intervention. He also mentioned how Microsoft is trying to use the settlement of the recent FTC vs. Microsoft case over MS Passport as a "certified by FTC" label for its security.

Phil Zimmermann formed a new company, PGP Corporation, and got PGP back from Network Associates.

For those of you who have lost sleep since the "Primes in P" paper was posted on the web: the algorithm has already been improved twice, and it is indeed an important contribution. But it has nothing to do with cryptography and does not make cryptographic algorithms any less secure. The paper introduces a deterministic algorithm for primality testing, but the algorithm has a bottleneck in one of its steps that makes it slower than the known probabilistic primality-testing algorithms.