Provable Security for Public Key Schemes


David Pointcheval

Abstract. Since the appearance of public-key cryptography in the Diffie-Hellman seminal paper, many schemes have been proposed, but many have been broken. Indeed, for a long time, the simple fact that a cryptographic algorithm had withstood cryptanalytic attacks for several years was considered as a kind of validation. But some schemes took a long time before being widely studied, and maybe thereafter being broken.

A much more convincing line of research has tried to provide "provable" security for cryptographic protocols, in a complexity theory sense: if one can break the cryptographic protocol, one can efficiently solve the underlying problem. Unfortunately, this initially was a purely theoretical work: very few practical schemes could be proven in this so-called "standard model" because such a security level rarely meets with efficiency. Ten years ago, Bellare and Rogaway proposed a trade-off to achieve some kind of validation of efficient schemes, by identifying some concrete cryptographic objects with ideal random ones. The most famous identification appeared in the so-called "random-oracle model". More recently, another direction has been taken to prove the security of efficient schemes in the standard model (without any ideal assumption) by using stronger computational assumptions.

In these lectures, we focus on practical asymmetric protocols together with their "reductionist" security proofs, mainly in the random-oracle model. We cover the two main goals that public-key cryptography is devoted to solve: authentication with digital signatures, and confidentiality with public-key encryption schemes.

1. Introduction

Since the beginning of public-key cryptography, with the seminal Diffie-Hellman paper [25], many suitable algorithmic problems for cryptography have been proposed and many cryptographic schemes have been designed, together with more or less heuristic proofs of their security relative to the intractability of the above problems. However, most of those schemes have thereafter been broken.


The simple fact that a cryptographic algorithm withstood cryptanalytic attacks for several years has often been considered as a kind of validation procedure, but some schemes take a long time before being broken. An example is the Chor-Rivest cryptosystem [21, 48], based on the knapsack problem, which took more than 10 years to be totally broken [86], whereas before this attack it was believed to be strongly secure. As a consequence, the lack of attacks at some time should never be considered as a security validation of the proposal.

1.1. Provable Security

A completely different paradigm is provided by the concept of "provable" security. A significant line of research has tried to provide proofs in the framework of complexity theory (a.k.a. "reductionist" security proofs [4]): the proofs provide reductions from a well-studied problem (RSA or the discrete logarithm) to an attack against a cryptographic protocol.

At the beginning, people just tried to define the security notions required by actual cryptographic schemes, and then to design protocols which achieve these notions. The techniques were directly derived from complexity theory, providing polynomial reductions. However, their aim was essentially theoretical. They were indeed trying to minimize the required assumptions on the primitives (one-way functions or permutations, possibly trapdoor, etc.) [37, 35, 52, 71] without considering practicality. Therefore, they just needed to design a scheme with polynomial algorithms, and to exhibit polynomial reductions from the basic assumption on the primitive into an attack of the security notion, in an asymptotic way. However, such a result has no practical impact on actual security. Indeed, even with a polynomial reduction, one may be able to break the cryptographic protocol within a few hours, whereas the reduction just leads to an algorithm against the underlying problem which requires many years. Therefore, those reductions only prove the security when very huge (and thus maybe unpractical) parameters are in use, under the assumption that no polynomial time algorithm exists to solve the underlying problem.

1.2. Exact Security and Practical Security

For a few years, more efficient reductions have been expected, under the denominations of either "exact security" [12] or "concrete security" [58], which provide more practical security results. The perfect situation is reached when one manages to prove that, from an attack, one can describe an algorithm against the underlying problem, with almost the same success probability within almost the same amount of time. We have then achieved "practical security".

Unfortunately, in many cases, even just provable security is at the cost of an important loss in terms of efficiency for the cryptographic protocol. Thus some models have been proposed, trying to deal with the security of efficient schemes: some concrete objects are identified with ideal (or black-box) ones.

For example, it is by now usual to identify hash functions with ideal random functions, in the so-called "random-oracle model", informally introduced by Fiat and Shamir [28], and formalized by Bellare and Rogaway [10]. Similarly, block ciphers are identified with families of truly random permutations in the "ideal cipher model" [9]. A few years ago, another kind of idealization was introduced in cryptography, the black-box group [53, 80], where the group operation, in any algebraic group, is defined by a black-box: a new element necessarily comes from the addition (or the subtraction) of two already known elements. It is by now called the "generic model". Recent works [77, 18] even require several ideal models together to provide some new validations.

1.3. Outline of the Notes

In the next section, we explain and motivate more about exact security proofs, and we introduce the notion of the weaker security analyses, the security arguments (in an ideal model, and namely the random-oracle model). Then, we review the formalism of the most important asymmetric primitives: signatures and public-key encryption schemes. For both, we provide some examples, with some security analyses in the "reductionist" sense.

1.4. Related Work

These notes present a survey, based on several published papers from the author, often with several co-authors: about signatures [67, 69, 68, 17, 84], encryption [7, 3, 62, 59, 32, 33] and provably secure constructions [61, 63, 65, 64, 66]. Many other papers are also cited and rephrased, which present efficient provably secure constructions. Among the bibliography list presented at the end, we would like to insist on [10, 11, 12, 22, 82, 83]. We thus refer the reader to the original papers for more details.

2. Security Proofs and Security Arguments

2.1. Computational Assumptions

In both symmetric and asymmetric scenarios, many security notions cannot be unconditionally guaranteed (whatever the computational power of the adversary). Therefore, security generally relies on a computational assumption: the existence of one-way functions, or permutations, possibly trapdoor. A one-way function is a function f which anyone can easily compute, but given y = f(x) it is computationally intractable to recover x (or any pre-image of y). A one-way permutation is a bijective one-way function. For encryption, one would like the inversion to be possible for the recipient only: a trapdoor one-way permutation is a one-way permutation for which a secret information (the trapdoor) helps to invert the function on any point.

Given such objects, and thus computational assumptions about the intractability of the inversion without possible trapdoors, we would like that security could be achieved without extra assumptions. The only way to formally prove such a fact is by showing that an attacker against the cryptographic protocol can be used as a sub-part in an algorithm that can break the basic computational assumption.

A partial order therefore exists between computational assumptions (and intractable problems too): if a problem P is more difficult than the problem P′ (P′ reduces to P, see below), then the assumption of the intractability of the problem P is weaker than the assumption of the intractability of the problem P′. The weaker the required assumption is, the more secure the cryptographic scheme is.

2.2. "Reductionist" Security Proofs

In complexity theory, such an algorithm which uses the attacker as a sub-part in a global algorithm is called a reduction. If this reduction is polynomial, we can say that the attack of the cryptographic protocol is at least as hard as inverting the function: if one has a polynomial algorithm to solve the latter problem, one can polynomially solve the former one. In the complexity theory framework, a polynomial algorithm is the formalization of efficiency.

Therefore, in order to prove the security of a cryptographic protocol, one first needs to make precise the security notion one wants the protocol to achieve: which adversary's goal one wants to be intractable, under which kind of attack. At the beginning of the 1980's, such security notions were defined for encryption [35] and signature [37, 38], and provably secure schemes were suggested. However, those proofs had a theoretical impact only, because both the proposed schemes and the reductions were completely unpractical, yet polynomial. The reductions were indeed efficient (i.e. polynomial), and thus a polynomial attack against a cryptosystem would have led to a polynomial algorithm that broke the computational assumption. But the latter algorithm, even polynomial, may require hundreds of years to solve a small instance.

For example, let us consider a cryptographic protocol based on integer factoring. Let us assume that one provides a polynomial reduction from the factorization into an attack. But such a reduction may just lead to a factorization algorithm with a complexity in 2^25 k^10, where k is the bit-size of the integer to factor. This indeed contradicts the assumption that no polynomial algorithm exists for factoring. However, on a 1024-bit number (k = 2^10), it provides an algorithm that requires 2^125 basic operations, which is much more than the complexity of the best current algorithm, such as NFS [46], which needs less than 2^100 (see Section 4). Therefore, such a reduction would just be meaningful for numbers above 4096 bits (since with k = 2^12, 2^145 < 2^149, where 2^149 is the estimated effort for factoring a 4096-bit integer with the best algorithm). Concrete examples are given later.
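To make the orders of magnitude concrete, the arithmetic of this example can be checked in a few lines; the 2^25 k^10 reduction cost and the 2^100 / 2^149 NFS figures below are simply the illustrative values quoted above, not measured data.

```python
from math import log2

def reduction_cost_bits(k):
    # illustrative reduction cost 2^25 * k^10, expressed in bits of work
    return 25 + 10 * log2(k)

nfs_effort_bits = {1024: 100, 4096: 149}   # rough NFS estimates quoted in the text

for k in (1024, 4096):
    print(f"{k}-bit modulus: reduction gives 2^{reduction_cost_bits(k):.0f} ops, "
          f"NFS needs about 2^{nfs_effort_bits[k]}")
# 1024 bits: 2^125 >> 2^100, so the reduction says nothing there;
# 4096 bits: 2^145 <  2^149, so it starts to be meaningful.
```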

2.3. Practical Security

Moreover, most of the proposed schemes were unpractical as well. Indeed, the protocols were polynomial in time and memory, but not efficient enough for practical implementation.

For a few years, people have tried to provide both practical schemes and practical reductions with exact complexity, which prove the security for realistic parameters, under a well-defined assumption: exact reduction in the standard model (which means in the complexity-theoretic framework). For example, under the assumption that a 1024-bit integer cannot be factored with less than 2^70 basic operations, the cryptographic protocol cannot be broken with less than 2^60 basic operations. We will see such an example later.

Unfortunately, as already remarked, practical or even just efficient reductions in the standard model can rarely be conjugated with practical schemes. Therefore, one needs to make some hypotheses on the adversary: the attack is generic, independent of the actual implementation of some objects:

• hash functions, in the "random-oracle model";

• symmetric block ciphers, in the "ideal-cipher model";

• algebraic groups, in the "generic model".

The "random-oracle model" was the first to be introduced in the cryptographic community [28, 10], and has already been widely accepted. By the way, flaws have been shown in the "generic model" [84] on practical schemes, and the "random-oracle model" is not equivalent to the standard one either. Several gaps have already been exhibited [19, 54, 6]. However, all the counter-examples in the random-oracle model are pathological, counter-intuitive and not natural. Therefore, in the sequel, we focus on security analyses in this model, for real and natural constructions. A security proof in the random-oracle model will at least give a strong argument in favor of the security of the scheme. Furthermore, proofs in the random-oracle model under a weak computational assumption may be of more practical interest than proofs in the standard model under a strong computational assumption.

2.4. The Random-Oracle Model

As said above, efficiency rarely meets with provable security. More precisely, none of the most efficient schemes in their category have been proven secure in the standard model. However, some of them admit security validations under ideal assumptions: the random-oracle model is the most widely accepted one.

Many cryptographic schemes use a hash function H (such as MD5 [72] or the American standards SHA-1 [56], SHA-256, SHA-384 and SHA-512 [57]). This use of hash functions was originally motivated by the wish to sign long messages with a single short signature. In order to achieve non-repudiation, a minimal requirement on the hash function is the impossibility for the signer to find two different messages providing the same hash value. This property is called collision-resistance.

It was later realized that hash functions were an essential ingredient for the security of, first, signature schemes, and then of most cryptographic schemes. In order to obtain security arguments, while keeping the efficiency of the designs that use hash functions, a few authors suggested using the hypothesis that H behaves like a random function. First, Fiat and Shamir [28] applied it heuristically to provide a signature scheme "as secure as" factorization. Then, Bellare and Rogaway [10, 11, 12] formalized this concept for cryptography, namely for signature and public-key encryption.

In this model, the so-called "random-oracle model", the hash function can be formalized by an oracle which produces a truly random value for each new query. Of course, if the same query is asked twice, identical answers are obtained. This is precisely the context of relativized complexity theory with "oracles", hence the name.
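Such an oracle is conveniently simulated by lazy sampling: each new query receives an independent uniform answer, which is stored so that repeated queries are answered consistently. The sketch below is only an illustration; the 160-bit output length is an arbitrary choice of ours.

```python
import secrets

class RandomOracle:
    """Lazily sampled random oracle: fresh queries get uniform answers, repeats are consistent."""
    def __init__(self, out_bits=160):
        self.out_bits = out_bits
        self.table = {}                      # memory of all previous queries

    def query(self, message: bytes) -> int:
        if message not in self.table:        # a truly new query
            self.table[message] = secrets.randbits(self.out_bits)
        return self.table[message]           # identical answers for identical queries

H = RandomOracle()
assert H.query(b"hello") == H.query(b"hello")
```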

About this model, no one has ever been able to provide a convincing contradiction to its practical validity, but just theoretical counter-examples on either clearly wrong designs for practical purposes [19], or artificial security notions [54, 6]. Therefore, this model has been strongly accepted by the community, and is considered as a good one, in which security analyses give a good taste of the actual security level. Even if it does not provide a formal proof of security (as in the standard model, without any ideal assumption), it is argued that proofs in this model ensure security of the overall design of the scheme provided that the hash function has no weakness, hence the name "security arguments".

This model can also be seen as a restriction on the adversary's capabilities. Indeed, it simply means that the attack is generic, without considering any particular instantiation of the hash functions. Therefore, an actual attack would necessarily use a weakness or a specific feature of the hash function. The replacement of the hash function by another one would rule out this attack.

On the other hand, assuming the tamper-resistance of some devices, such as smart cards, the random-oracle model is equivalent to the standard model, which simply requires the existence of pseudo-random functions [34, 51].

As a consequence, almost all the standards bodies by now require provably secure designs, at least in that model, thanks to the security validation of very efficient protocols.

2.5. The General Framework

Before going into more details of this kind of proofs, we would like to insist on the fact that, in the current general framework, we give the adversary complete access to the cryptographic primitive, but as a black-box. It can ask any query of its choice, and the box always answers correctly, in constant time. Such a model does not consider timing attacks [44], where the adversary tries to extract the secrets from the computational time. Some other attacks analyze the electrical energy required by a computation to get the secrets [45], or make the primitive fail on some computation [13, 16]. They are not captured either by this model.

3. A First Formalism

In this section we describe more formally what a signature scheme and an encryption scheme are. Moreover, we make precise the security notions one wants the schemes to achieve. This is the first imperative step towards provable security.


3.1. Digital Signature Schemes

Digital signature schemes are the electronic version of handwritten signatures for digital documents: a user's signature on a message m is a string which depends on m, on public and secret data specific to the user and, possibly, on randomly chosen data, in such a way that anyone can check the validity of the signature by using public data only. The user's public data are called the public key, whereas his secret data are called the private key. The intuitive security notion would be the impossibility to forge user's signatures without the knowledge of his private key. In this section, we give a more precise definition of signature schemes and of the possible attacks against them (most of those definitions are based on [38]).

3.1.1. Definitions. A signature scheme S = (K, S, V) is defined by the three following algorithms:

• The key generation algorithm K. On input 1^k, which is a formal notation for a machine with running time polynomial in k (1^k is indeed k in basis 1), the algorithm K produces a pair (pk, sk) of matching public and private keys. Algorithm K is probabilistic. The input k is called the security parameter. The sizes of the keys, or of any problem involved in the cryptographic scheme, will depend on it, in order to achieve an appropriate security level (the expected minimal time complexity of any attack).

• The signing algorithm S. Given a message m and a pair of matching public and private keys (pk, sk), S produces a signature σ. The signing algorithm might be probabilistic.

• The verification algorithm V. Given a signature σ, a message m and a public key pk, V tests whether σ is a valid signature of m with respect to pk. In general, the verification algorithm need not be probabilistic.

3.1.2. Forgeries and Attacks. In this subsection, we formalize some security notions which capture the main practical situations. On the one hand, the goals of the adversary may be various:

• Disclosing the private key of the signer. It is the most serious attack. This attack is termed total break.

• Constructing an efficient algorithm which is able to sign messages with good probability of success. This is called universal forgery.

• Providing a new message-signature pair. This is called existential forgery. The corresponding security level is called existential unforgeability (EUF).

In many cases the latter forgery, the existential forgery, is not dangerous because the output message is likely to be meaningless. Nevertheless, a signature scheme which is existentially forgeable does not guarantee by itself the identity of the signer. For example, it cannot be used to certify randomly-looking elements, such as keys. Furthermore, it cannot formally guarantee the non-repudiation property, since anyone may be able to produce a message with a valid signature.


On the other hand, various means can be made available to the adversary, helping it into its forgery. We focus on two specific kinds of attacks against signature schemes: the no-message attacks and the known-message attacks (KMA). In the former scenario, the attacker only knows the public key of the signer. In the latter, the attacker has access to a list of valid message-signature pairs. According to the way this list was created, we usually distinguish many subclasses, but the strongest is definitely the adaptive chosen-message attack (CMA), where the attacker can ask the signer to sign any message of its choice, in an adaptive way: it can adapt its queries according to previous answers.

When the signature generation is not deterministic, there may be several signatures corresponding to a given message. Then, some notions defined above may become ambiguous [84]. First, in known-message attacks, an existential forgery becomes the ability to forge a fresh message/signature pair that has not been obtained during the attack. There is a subtle point here, related to the context where several signatures may correspond to a given message. We actually adopt the stronger rule that the attacker needs to forge the signature of a message whose signature was not queried. The more liberal rule, which makes the attacker successful when it outputs a second signature of a given message, different from a previously obtained signature of the same message, is called malleability, while the corresponding security level is called non-malleability (NM). Similarly, in adaptive chosen-message attacks, the adversary may ask several times for signatures of the same message, and each new answer gives it some information. A slightly weaker security model, by now called single-occurrence adaptive chosen-message attack (SO-CMA), allows the adversary at most one signature query for each message. In other words, the adversary cannot submit the same message twice for signature.

When one designs a signature scheme, one wants to computationally rule out at least existential forgeries, or even achieve non-malleability, under adaptive chosen-message attacks. More formally, one wants the success probability of any adversary A running within a reasonable time to be small, where

Succ^euf_S(A) = Pr[(pk, sk) ← K(1^k), (m, σ) ← A^{S_sk}(pk) : V(pk, m, σ) = 1].

We remark that since the adversary is allowed to play an adaptive chosen-message attack, the signing algorithm is made available, without any restriction, hence the oracle notation A^{S_sk}. Of course, in its answer, there is the natural restriction that, at least, the returned message-signature pair has not been obtained from the signing oracle S_sk itself (non-malleability), or even that the output message has not been queried (existential unforgeability).
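As an illustration only, the experiment behind Succ^euf can be phrased as a small driver; the scheme and adversary interfaces (keygen, sign, verify, forge) are naming conventions chosen here for readability, not part of the formal definition.

```python
def euf_cma_experiment(scheme, adversary, k):
    """One run of the existential-unforgeability game under adaptive chosen-message attack."""
    pk, sk = scheme.keygen(k)
    queried = set()

    def signing_oracle(m):                 # unrestricted access to the signer
        queried.add(m)
        return scheme.sign(sk, m)

    m, sigma = adversary.forge(pk, signing_oracle)
    # EUF rule: the output message must never have been submitted to the oracle
    return scheme.verify(pk, m, sigma) and m not in queried
```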

3.2. Public-Key Encryption

The aim of a public-key encryption scheme is to allow anybody who knows the public key of Alice to send her a message that she will be the only one able to recover, granted her private key.

3.2.1. Definitions. A public-key encryption scheme S = (K, E, D) is defined by the three following algorithms:


• The key generation algorithm K. On input 1^k, where k is the security parameter, the algorithm K produces a pair (pk, sk) of matching public and private keys. Algorithm K is probabilistic.

• The encryption algorithm E. Given a message m and a public key pk, E produces a ciphertext c of m. This algorithm may be probabilistic. In the latter case, we write E_pk(m; r) where r is the random input to E.

• The decryption algorithm D. Given a ciphertext c and the private key sk, D_sk(c) gives back the plaintext m. This algorithm is necessarily deterministic.

3.2.2. Security Notions. As for signature schemes, the goals of the adversary may be various. The first common security notion that one would like for an encryption scheme is one-wayness (OW): with just public data, an attacker cannot get back the whole plaintext of a given ciphertext. More formally, this means that for any adversary A, its success in inverting E without the private key should be negligible over the probability space M × Ω, where M is the message space and Ω is the space of the random coins r used for the encryption scheme, and the internal random coins of the adversary:

Succ^ow_S(A) = Pr_{m,r}[(pk, sk) ← K(1^k) : A(pk, E_pk(m; r)) = m].

However, many applications require more from an encryption scheme, namely the semantic security (IND) [35], a.k.a. polynomial security/indistinguishability of encryptions: if the attacker has some information about the plaintext, for example that it is either "yes" or "no" to a crucial query, any adversary should not learn more with the view of the ciphertext. This security notion requires computational impossibility to distinguish between two messages, chosen by the adversary, which one has been encrypted, with a probability significantly better than one half: its advantage Adv^ind_S(A), formally defined as

2 × Pr_{b,r}[(pk, sk) ← K(1^k), (m_0, m_1, s) ← A_1(pk), c = E_pk(m_b; r) : A_2(m_0, m_1, s, c) = b] − 1,

where the adversary A is seen as a 2-stage attacker (A_1, A_2), should be negligible.
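The IND game admits a similarly compact description; the two-stage interface (find, guess) is again a hypothetical naming convention used only for this sketch.

```python
import secrets

def ind_experiment(scheme, adversary, k):
    """One run of the indistinguishability game (chosen-plaintext version)."""
    pk, sk = scheme.keygen(k)
    m0, m1, state = adversary.find(pk)       # the adversary chooses two messages
    b = secrets.randbits(1)                   # hidden challenge bit
    c = scheme.encrypt(pk, (m0, m1)[b])       # challenge ciphertext
    return adversary.guess(m0, m1, state, c) == b
# The advantage 2*Pr[success] - 1 should be negligible for a semantically secure scheme.
```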

A later notion is non-malleability (NM) [26]. To break it, the adversary, given a ciphertext, tries to produce a new ciphertext such that the plaintexts are meaningfully related. This notion is stronger than the above semantic security, but it is equivalent to the latter in the most interesting scenario [7] (the CCA attacks, see below). Therefore, we will just focus on one-wayness and semantic security.

On the other hand, an attacker can play many kinds of attacks, according to the available information: since we are considering asymmetric encryption, the adversary can encrypt any plaintext of its choice, granted the public key, hence the chosen-plaintext attack (CPA). It may furthermore have access to additional information, modeled by partial or full access to some oracles:

• A validity-checking oracle which, on input a ciphertext c, answers whether it is a valid ciphertext or not. Such a weak oracle, involved in the so-called reaction attacks [39] or Validity-Checking Attack (VCA), had been enough to break some famous encryption schemes [15, 42].

• A plaintext-checking oracle which, on input a pair (m, c), answers whether c encrypts the message m. This attack has been termed the Plaintext-Checking Attack (PCA) [59].

• The decryption oracle itself, which on any ciphertext answers the corresponding plaintext. There is of course the natural restriction not to ask the challenge ciphertext to that oracle.

For all these oracles, access may be restricted as soon as the challenge ciphertext is known; the attack is then said to be non-adaptive, since oracle queries cannot depend on the challenge ciphertext, while they may depend on previous answers. On the opposite, access can be unlimited, and attacks are then called adaptive attacks (w.r.t. the challenge ciphertext). This distinction has been widely used for the chosen-ciphertext attacks, for historical reasons: the non-adaptive chosen-ciphertext attacks (CCA1) [52], a.k.a. lunchtime attacks, and the adaptive chosen-ciphertext attacks (CCA2) [71]. The latter scenario, which allows adaptively chosen ciphertexts as queries to the decryption oracle, is definitely the strongest attack, and will be named the chosen-ciphertext attack (CCA).

Furthermore, multi-user scenarios can be considered, where related messages are encrypted under different keys to be sent to many people (e.g. broadcast of encrypted data). This may provide many useful data for an adversary. For example, RSA is well-known to be weak in such a scenario [40, 79], namely with a small encryption exponent, because of the Chinese Remainder Theorem. But once again, semantic security has been shown to be the appropriate security level, since it automatically extends to the multi-user setting: if an encryption scheme is semantically secure in the classical sense, it is also semantically secure in multi-user scenarios, against both passive [3] and active [5] adversaries.

A general study of these security notions and attacks was conducted in [7]; we therefore refer the reader to this paper for more details. See also the summary diagram in Figure 1. However, we can just review the main scenarios we will consider in the following:

• one-wayness under chosen-plaintext attacks (OW-CPA) – where the adversary wants to recover the whole plaintext from just the ciphertext and the public key. This is the weakest scenario.

• semantic security under adaptive chosen-ciphertext attacks (IND-CCA) – where the adversary just wants to distinguish which plaintext, between two messages of its choice, has been encrypted, while it can ask any query it wants to a decryption oracle (except the challenge ciphertext). This is the strongest scenario one can define for encryption (still in our general framework). Thus, this is our goal when we design a cryptosystem.

[Figure 1. Relations between the Security Notions for Asymmetric Encryption. The diagram relates the notions OW-CPA, OW-VCA, OW-PCA, OW-CCA, IND-CPA, IND-CCA, NM-CPA and NM-CCA. Legend: OW – One-Wayness; IND – Indistinguishability (a.k.a. Semantic Security); NM – Non-Malleability; CPA – Chosen-Plaintext Attack; VCA – Validity-Checking Attack (a.k.a. Reaction Attack); PCA – Plaintext-Checking Attack; CCA – Chosen-Ciphertext Attack.]

4. The Computational Assumptions

There are two major families in number-theory-based public-key cryptography:

1. the schemes based on integer factoring, and on the RSA problem [73];

2. the schemes based on the discrete logarithm problem, and on the Diffie-Hellman problems [25], in any "suitable" group. The first groups in use were cyclic subgroups of Z_p^*, the multiplicative group of the modular quotient ring Z_p = Z/pZ. But many schemes are now converted to cyclic subgroups of elliptic curves, or of the Jacobian of hyper-elliptic curves, with namely the so-called ECDSA [1], the US Digital Signature Standard [55] on elliptic curves.

4.1. Integer Factoring and the RSA Problem

The most famous intractable problem is the factorization of integers: while it is easy to multiply two prime integers p and q to get the product n = p · q, it is not simple to decompose n into its prime factors p and q.

Currently, the most efficient algorithm is based on sieving on number fields. The Number Field Sieve (NFS) method [46] has a super-polynomial, but sub-exponential, complexity in O(exp((1.923 + o(1)) (ln n)^{1/3} (ln ln n)^{2/3})). It has been used to establish the main record, in August 1999, by factoring a 155-digit integer (512 bits), the product of two 78-digit primes [20]. The factored number, called RSA-155, was taken from the "RSA Challenge List", which is used as a yardstick for the security of the RSA cryptosystem (see later). The latter is used extensively in hardware and software to protect electronic data traffic, such as in the SSL (Secure Sockets Layer) Handshake Protocol.

This record is very important since 155 digits correspond to 512 bits. And this is the size which is in use in almost all the implementations of the RSA cryptosystem (namely for actual implementations of SSL on the Internet).

RSA-155 =
109417386415705274218097073220403576120\
037329454492059909138421314763499842889\
347847179972578912673324976257528997818\
33797076537244027146743531593354333897
= 102639592829741105772054196573991675900\
716567808038066803341933521790711307779
× 106603488380168454820927220360012878679\
207958575989291522270608237193062808643

Unfortunately, integer multiplication just provides a one-way function, without any possibility to invert the process. No information is known to make factoring easier. However, some algebraic structures are based on the factorization of an integer n, where some computations are difficult without the factorization of n, but easy with it: the finite quotient ring Z_n, which is isomorphic to the product ring Z_p × Z_q if n = p · q.

For example, the e-th power of any element x can be easily computed using the square-and-multiply method. It consists in using the binary representation of the exponent e = e_k e_{k−1} ... e_0, computing the successive square powers of x (x^{2^0}, x^{2^1}, ..., x^{2^k}) and eventually multiplying together the ones for which e_i = 1. However, to compute e-th roots, it seems that one needs to know an integer d such that ed = 1 mod φ(n), where φ(n) is the Euler totient function, which denotes the cardinality of the multiplicative subgroup Z_n^* of Z_n. In the particular case where n = pq, φ(n) = (p − 1)(q − 1). Therefore, ed − 1 is a multiple of φ(n), which is equivalent to the knowledge of the factorization of n [50]. In 1978, Rivest, Shamir and Adleman [73] defined the following problem:

The RSA Problem. Let n = pq be the product of two large primes of similar size and e an integer relatively prime to φ(n). For a given y ∈ Z_n^*, compute the modular e-th root x of y (i.e. x ∈ Z_n^* such that x^e = y mod n).

The Euler function can be easily computed from the factorization of n, since for any n = ∏ p_i^{v_i}, φ(n) = n × ∏ (1 − 1/p_i).

Therefore, with the factorization of n (the trapdoor), the RSA problem can be easily solved. But nobody knows whether the factorization is required, and how to do without it either:

The RSA Assumption. For any product of two primes, n = pq, large enough, the RSA problem is intractable (presumably as hard as the factorization of n).
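The asymmetry between the two directions can be seen on a toy instance: the e-th power is computed with square-and-multiply, while the e-th root is easy only once the trapdoor d = e^{-1} mod φ(n) is known. The parameters below are tiny illustrative values, not secure ones.

```python
def power_mod(x, e, n):
    """Square-and-multiply: scan the bits of e, squaring at each step."""
    result, x = 1, x % n
    while e > 0:
        if e & 1:                    # multiply in x^(2^i) whenever e_i = 1
            result = (result * x) % n
        x = (x * x) % n              # next square
        e >>= 1
    return result

p, q, e = 1009, 1013, 65537          # toy primes and public exponent
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                  # the trapdoor: e*d = 1 mod phi(n)

y = power_mod(123456, e, n)          # easy direction: e-th power
assert power_mod(y, d, n) == 123456  # with the trapdoor, the e-th root is easy too
```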

4.2. The Discrete Logarithm and the Diffie-Hellman Problems

The setting is quite general: one is given

• a cyclic group G of prime order q (such as the finite group (Z_q, +), a subgroup of (Z_p^*, ×) for q | p − 1, of an elliptic curve, etc.);

• a generator g (i.e. G = ⟨g⟩).

We write in bold (such as g) any element of the group G, to distinguish it from a scalar x ∈ Z_q. But such a g could be an element of Z_p^* or a point of an elliptic curve, according to the setting. Above, we talked about a "suitable" group G. In such a group, some of the following problems have to be hard to solve (using the additive notation):

• the Discrete Logarithm problem (DL): given y ∈ G, compute x ∈ Z_q such that y = x·g = g + ... + g (x times); one then writes x = log_g y.

• the Computational Diffie-Hellman problem (CDH): given two elements in the group G, a = a·g and b = b·g, compute c = ab·g. One then writes c = DH(a, b).

• the Decisional Diffie-Hellman problem (DDH): given three elements in the group G, a = a·g, b = b·g and c = c·g, decide whether c = DH(a, b) (or equivalently, whether c = ab mod q).

It is clear that they are sorted from the strongest problem to the weakest one. Furthermore, one may remark that they are all "random self-reducible", which means that any instance can be reduced to a uniformly distributed instance: for example, given a specific element y for which one wants to compute the discrete logarithm x in basis g, one can choose a random z ∈ Z_q^* and compute the group element z·y. This element is uniformly distributed in the group, and its discrete logarithm α = log_g(z·y) leads to x = α/z mod q. As a consequence, there are only average complexity cases. Thus, the ability to solve a problem for a non-negligible fraction of instances in polynomial time is equivalent to solving any instance in expected polynomial time.

A new variant of the Diffie-Hellman problem has more recently been defined by Tatsuaki Okamoto and the author [60], the so-called Gap Diffie-Hellman problem (GDH), where one wants to solve the CDH problem with access to a DDH oracle. One may easily remark the following properties about the above problems: DL ≥ CDH ≥ {DDH, GDH}, where A ≥ B means that the problem A is at least as hard as the problem B. However, in practice, no one knows how to solve any of them without breaking the DL problem itself.
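The random self-reducibility of the DL problem is easy to demonstrate on a toy subgroup of Z_p^*; the tiny parameters below (and the brute-force solver) are only meant to make the reduction visible, not to reflect real sizes.

```python
import secrets

p, q = 23, 11                        # toy setting: q = 11 divides p - 1 = 22
g = 4                                # 4 generates the subgroup of order 11 in Z_23^*

def dl_brute_force(y):
    """Solve the DL problem by exhaustive search (feasible only for toy q)."""
    return next(x for x in range(q) if pow(g, x, p) == y)

x = 7
y = pow(g, x, p)                     # the specific instance we want to break
z = secrets.randbelow(q - 1) + 1     # random scalar z in {1, ..., q-1}
randomized = pow(y, z, p)            # y^z is uniformly distributed in the subgroup
alpha = dl_brute_force(randomized)   # solve the *uniform* instance
assert (alpha * pow(z, -1, q)) % q == x   # recover x = alpha / z mod q
```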


Currently, the most efficient algorithms to solve the latter problem depend on the underlying group. For generic groups (for which no specific algebraic property can be used), algorithms have a complexity in the square root of q, the order of the generator g [78, 70]. For example, on well-chosen elliptic curves only these algorithms can be used. The last record was established in April 2001 on the curve defined by the equation y^2 + xy = x^3 + x^2 + 1 over the finite field with 2^109 elements.

However, for subgroups of Z_p^*, some better techniques can be applied. The best algorithm is based on sieving on number fields, as for the factorization. The General Number Field Sieve method [41] has a super-polynomial, but sub-exponential, complexity in O(exp((1.923 + o(1)) (ln p)^{1/3} (ln ln p)^{2/3})). It was used to establish the last record, in April 2001 as well, by computing discrete logarithms in Z_p^*, for a 120-digit prime p. Therefore, 512-bit primes are still safe enough, as far as the generic attacks cannot be used (the generator must be of large order q, at least a 160-bit prime).

For signature applications, one only requires groups where the DL problem is hard, whereas encryption needs trapdoor problems and therefore requires groups where some of the DH problems are also hard to solve.

5. Digital Signature Schemes

Until 1996, no practical DL-based cryptographic scheme had ever been formally studied, but heuristically only. Surprisingly, at the Eurocrypt '96 conference, two opposite studies were conducted on the El Gamal signature scheme [27], the first DL-based signature scheme, designed in 1985 and depicted in Figure 2.

Initialization → (p, g): g a generator of Z_p^*, where p is a large prime.
K: Key Generation → (y, x): private key x ∈ Z_{p−1}, public key y = g^x mod p.
S: Signature of m → (r, s): K randomly chosen in Z_{p−1}^*, r = g^K mod p, s = (m − xr)/K mod (p − 1); (r, s) is a signature of m.
V: Verification of (m, r, s): check whether g^m ?= y^r · r^s mod p → Yes/No.

Figure 2. The El Gamal Signature Scheme.
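A direct, unhashed implementation of Figure 2 on toy parameters may help fix the notation; the prime p = 467 and generator g = 2 are arbitrary small choices of ours, and the message is reduced modulo p − 1 since the scheme signs exponents.

```python
import secrets
from math import gcd

p, g = 467, 2                                  # toy prime and generator of Z_p^*

def keygen():
    x = secrets.randbelow(p - 2) + 1           # private key x in Z_{p-1}
    return pow(g, x, p), x                     # public key y = g^x mod p

def sign(x, m):
    while True:
        K = secrets.randbelow(p - 2) + 1
        if gcd(K, p - 1) == 1:                 # K must be invertible mod p-1
            break
    r = pow(g, K, p)
    s = ((m - x * r) * pow(K, -1, p - 1)) % (p - 1)
    return r, s

def verify(y, m, r, s):
    return pow(g, m, p) == (pow(y, r, p) * pow(r, s, p)) % p

y, x = keygen()
m = 1234 % (p - 1)
r, s = sign(x, m)
assert verify(y, m, r, s)
```

Since m enters the verification equation directly, anyone can build existential forgeries; the variant discussed next replaces m by H(m, r).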

Whereas existential forgeries were known for that scheme, it was believed to prevent universal forgeries. The first analysis, from Daniel Bleichenbacher [14], showed such a universal forgery when the generator g is not properly chosen. The second one, from Jacques Stern and the author [67], proved the security against existential forgeries under adaptive chosen-message attacks of a slight variant with a randomly chosen generator g. The latter variant simply replaces the message m by H(m, r) in the computation, while one uses a hash function H that is assumed to behave like a random oracle. It is amazing to remark that Bleichenbacher's attack also applies to our variant. Therefore, depending on the initialization, our variant could be a very strong signature scheme or become a very weak one!

As a consequence, a proof has to be performed in detail, with precise assumptions and achievements. Furthermore, the conclusions have to be strictly followed by developers; otherwise the concrete implementation of a secure scheme can be very weak.

5.1. Provable Security

The first secure signature scheme was proposed by Goldwasser et al. [37] in 1984. It used the notion of claw-free permutations. A pair of permutations (f, g) is said to be claw-free if it is computationally impossible to find a claw (x, y), which satisfies f(x) = g(y). Their proposal provided polynomial algorithms with a polynomial reduction between the search for a claw and an existential forgery under an adaptive chosen-message attack. However, the scheme was totally unpractical. What about practical schemes?

5.1.1. The RSA Signature Scheme. Two years after the Diffie-Hellman paper [25], Rivest, Shamir and Adleman [73] proposed the first signature scheme based on the "trapdoor one-way permutation paradigm", using the RSA function: the generation algorithm produces a large composite number N = pq, a public key e, and a private key d such that e · d = 1 mod φ(N). The signature of a message m, encoded as an element in Z_N^*, is its e-th root, σ = m^{1/e} = m^d mod N. The verification algorithm simply checks whether m = σ^e mod N.

However, the RSA scheme is not secure by itself since it is subject to existential forgery: it is easy to create a valid message-signature pair, without any help of the signer, by first randomly choosing a certificate σ and getting the signed message m from the public verification relation, m = σ^e mod N.
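The forgery takes one line on a toy textbook-RSA key (illustrative sizes only): pick a random σ and read the signed message off the verification relation.

```python
import secrets

p, q, e = 1009, 1013, 65537                   # toy RSA key
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))

verify = lambda m, s: pow(s, e, N) == m       # accept iff m = sigma^e mod N

sigma = secrets.randbelow(N - 2) + 2          # choose the "signature" first
m = pow(sigma, e, N)                          # then derive a message it signs
assert verify(m, sigma)                       # valid pair, but m is not chosen by the forger
```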

5.1.2. The Schnorr Signature Scheme. In 1986 a new paradigm for signature schemes was introduced. It is derived from fair zero-knowledge identification protocols involving a prover and a verifier [36], and uses hash functions in order to create a kind of virtual verifier. The first application was derived from the Fiat-Shamir [28] zero-knowledge identification protocol, based on the hardness of extracting square roots, with a brief outline of its security. Another famous identification scheme [75], together with the signature scheme [76], was proposed later by Schnorr, based on that paradigm: the generation algorithm produces two large primes p and q, such that q ≥ 2^k, where k is the security parameter, and q | p − 1, as well as an element g of Z_p^* of order q. It also creates a pair of keys, the private key x ∈ Z_q and the public key y = g^{−x} mod p. The signature of a message m is a triple (r, e, s), where r = g^K mod p, with a random K ∈ Z_q, the "challenge" e = H(m, r), and s = K + ex mod q. The latter satisfies r = g^s y^e mod p with e = H(m, r), which is checked by the verification algorithm.

The security results for that paradigm have been considered as folklore for a long time, but without any formal validation.

5.2. DL-Based Signatures

In our papers [67, 68], with Jacques Stern, we formally proved the above paradigm when H is assumed to behave like a random oracle. The proof is based on the by now classical oracle replay technique: by a polynomial replay of the attack with different random oracles (the Q_i's are the queries and the ρ_i's are the answers), we allow the attacker to forge signatures that are suitably related. This generic technique is depicted in Figure 3, where the signature of a message m is a triple (σ_1, h, σ_2), with h = H(m, σ_1) which depends on the message and the first part of the signature, both bound not to change for the computation of σ_2, which really relies on the knowledge of the private key. If the probability of fraud is high enough, then with good probability, the adversary is able to answer to many distinct outputs from the H function on the input (m, σ_1).

[Figure 3. The Oracle Replay Technique: the adversary A is run twice with the same random tape; the oracle answers ρ_1, ..., ρ_{i−1} to the queries Q_1, ..., Q_{i−1} are kept identical, while the answers to Q_i = (m, σ_1) and the subsequent queries differ (ρ_i, ..., ρ_j versus ρ′_i, ..., ρ′_j), yielding two forgeries (m, σ_1, h = ρ_i, σ_2) and (m, σ_1, h′ = ρ′_i, σ′_2).]

To be more concrete, let us consider the Schnorr signature scheme, which is presented in Figure 4, in any "suitable" cyclic group G of prime order q, where at least the Discrete Logarithm problem is hard. We expect to obtain two signatures (r = σ_1, h, s = σ_2) and (r′ = σ′_1, h′, s′ = σ′_2) of an identical message m such that σ_1 = σ′_1, but h ≠ h′. Thereafter, we can easily extract the discrete logarithm of the public key: r = s·g + h·y and r = s′·g + h′·y imply (s − s′)·g = (h′ − h)·y, which leads to log_g y = (s − s′)·(h′ − h)^{−1} mod q.

Initialization (security parameter k) → (G, g, H): g a generator of any cyclic group (G, +) of order q, with 2^{k−1} ≤ q < 2^k; H a hash function: {0, 1}* → Z_q.
K: Key Generation → (y, x): private key x ∈ Z_q, public key y = −x·g.
S: Signature of m → (r, h, s): K randomly chosen in Z_q, r = K·g, h = H(m, r), s = K + xh mod q; (r, h, s) is a signature of m.
V: Verification of (m, r, h, s): check whether h ?= H(m, r) and r ?= s·g + h·y → Yes/No.

Figure 4. The Schnorr Signature Scheme.
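A toy implementation of Figure 4 (written multiplicatively in a small subgroup of Z_p^*, with SHA-256 standing in for the random oracle, both assumptions of ours) also shows how two forgeries sharing the same first part reveal the private key.

```python
import hashlib, secrets

p, q = 2267, 103                       # toy primes with q | p - 1 (insecure sizes)
g = pow(2, (p - 1) // q, p)            # an element of order q
assert g != 1

def H(m: bytes, r: int) -> int:        # hash function standing in for the random oracle
    return int.from_bytes(hashlib.sha256(m + r.to_bytes(4, "big")).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1   # private key
    return pow(g, -x, p), x            # public key y = g^(-x) mod p

def sign(x, m):
    K = secrets.randbelow(q - 1) + 1
    r = pow(g, K, p)
    h = H(m, r)
    return r, h, (K + x * h) % q       # s = K + x*h mod q

def verify(y, m, r, h, s):
    return h == H(m, r) and r == (pow(g, s, p) * pow(y, h, p)) % p

y, x = keygen()
assert verify(y, b"msg", *sign(x, b"msg"))

def extract_private_key(h, s, h2, s2):
    """Two valid signatures with the same r but h != h2 reveal x = (s - s2)/(h - h2) mod q."""
    return ((s - s2) * pow(h - h2, -1, q)) % q
```

Given two forgeries (r, h, s) and (r, h′, s′) with h ≠ h′, extract_private_key returns x, and log_g y = −x as in the equation above.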

5.2.1. General Tools. First, let us recall the "Splitting Lemma", which will be the main probabilistic tool for the "Forking Lemma". It translates the fact that when a subset A is "large" in a product space X × Y, it has many "large" sections.

Lemma 1 (The Splitting Lemma). Let A ⊂ X × Y be such that Pr[(x, y) ∈ A] ≥ ε. For any α < ε, define

B = { (x, y) ∈ X × Y | Pr_{y′∈Y}[(x, y′) ∈ A] ≥ ε − α },

then the following statements hold:
(i) Pr[B] ≥ α;
(ii) ∀(x, y) ∈ B, Pr_{y′∈Y}[(x, y′) ∈ A] ≥ ε − α;
(iii) Pr[B | A] ≥ α/ε.

Proof. In order to prove statement (i), we argue by contradiction, using the notation B̄ for the complement of B in X × Y. Assume that Pr[B] < α. Then

ε ≤ Pr[B] · Pr[A | B] + Pr[B̄] · Pr[A | B̄] < α · 1 + 1 · (ε − α) = ε.

This implies a contradiction, hence the result.

Statement (ii) is a straightforward consequence of the definition.


We finally turn to the last assertion, using Bayes' law:

Pr[B | A] = 1 − Pr[B̄ | A] = 1 − Pr[A | B̄] · Pr[B̄] / Pr[A] ≥ 1 − (ε − α)/ε = α/ε.
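For intuition, the lemma can also be checked numerically on a randomly generated subset A of a finite product space; the sizes and densities below are arbitrary choices of ours.

```python
import random

random.seed(1)
X = Y = range(200)
A = {(x, y) for x in X for y in Y if random.random() < 0.3 and x % 7 != 0}

eps = len(A) / (len(X) * len(Y))
alpha = eps / 2
density = {x: sum((x, y) in A for y in Y) / len(Y) for x in X}     # section densities
B = {(x, y) for x in X for y in Y if density[x] >= eps - alpha}    # the "large" sections

assert len(B) / (len(X) * len(Y)) >= alpha          # statement (i)
assert len(A & B) / len(A) >= alpha / eps           # statement (iii)
```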

No-Message Attacks. The following Forking Lemma states that the above oracle replay technique will often succeed with any good adversary.

Theorem 1 (The Forking Lemma). Let (K, S, V) be a digital signature scheme with security parameter k, with a signature as above, of the form (m, σ_1, h, σ_2), where h = H(m, σ_1) and σ_2 depends on σ_1 and h only. Let A be a probabilistic polynomial time Turing machine whose input only consists of public data and which can ask q_h queries to the random oracle, with q_h > 0. We assume that, within the time bound T, A produces, with probability ε ≥ 7q_h/2^k, a valid signature (m, σ_1, h, σ_2). Then, within time T′ ≤ 16 q_h T/ε, and with probability ε′ ≥ 1/9, a replay of this machine outputs two valid signatures (m, σ_1, h, σ_2) and (m, σ_1, h′, σ′_2) such that h ≠ h′.

Proof. We are given an adversary A, which is a probabilistic polynomial time Turing machine with random tape ω. During the attack, this machine asks a polynomial number of questions to the random oracle H. We may assume that these questions are distinct: for instance, A can store questions and answers in a table. Let Q_1, ..., Q_{q_h} be the q_h distinct questions and let ρ = (ρ_1, ..., ρ_{q_h}) be the list of the q_h answers of H. It is clear that a random choice of H exactly corresponds to a random choice of ρ. Then, for a random choice of (ω, H), with probability ε, A outputs a valid signature (m, σ_1, h, σ_2). Since H is a random oracle, it is easy to see that the probability for h to be equal to H(m, σ_1) is less than 1/2^k, unless it has been asked during the attack. So, it is likely that the question (m, σ_1) is actually asked during a successful attack. Accordingly, we define Ind_H(ω) to be the index of this question: (m, σ_1) = Q_{Ind_H(ω)} (we let Ind_H(ω) = ∞ if the question is never asked). We then define the sets

S = { (ω, H) | A^H(ω) succeeds & Ind_H(ω) ≠ ∞ },
and S_i = { (ω, H) | A^H(ω) succeeds & Ind_H(ω) = i } for i ∈ {1, ..., q_h}.

We thus call S the set of the successful pairs (ω, H). One should note that the set {S_i | i ∈ {1, ..., q_h}} is a partition of S. With those definitions, we find a lower bound for the probability of success, ν = Pr[S] ≥ ε − 1/2^k. Since we made the assumption that ε ≥ 7q_h/2^k ≥ 7/2^k, then ν ≥ 6ε/7. Let I be the set consisting of the most likely indices i,

I = { i | Pr[S_i | S] ≥ 1/(2q_h) }.

The following lemma claims that, in case of success, the index lies in I with probability at least 1/2.


Lemma 2. Pr[Ind_H(ω) ∈ I | S] ≥ 1/2.

Proof. By definition of the sets S_i, Pr[Ind_H(ω) ∈ I | S] = Σ_{i∈I} Pr[S_i | S]. This probability is equal to 1 − Σ_{i∉I} Pr[S_i | S]. Since the complement of I contains fewer than q_h elements, this probability is at least 1 − q_h × 1/(2q_h) ≥ 1/2.

We now run the attacker 2/ε times with random ω and random H. Since ν = Pr[S] ≥ 6ε/7, with probability greater than 1 − (1 − 6ε/7)^{2/ε}, we get at least one pair (ω, H) in S. It is easily seen that this probability is lower bounded by 1 − e^{−12/7} ≥ 4/5.

We now apply the Splitting Lemma (Lemma 1, with ε = ν/(2q_h) and α = ε/2) for each integer i ∈ I: we denote by H|_i the restriction of H to queries of index strictly less than i. Since Pr[S_i] ≥ ν/(2q_h), there exists a subset Ω_i of executions such that, for any (ω, H) ∈ Ω_i,

Pr_{H′}[(ω, H′) ∈ S_i | H′|_i = H|_i] ≥ ν/(4q_h),
Pr[Ω_i | S_i] ≥ 1/2.

Since all the subsets S_i are disjoint,

Pr_{ω,H}[(∃i ∈ I) (ω, H) ∈ Ω_i ∩ S_i | S] = Pr[∪_{i∈I}(Ω_i ∩ S_i) | S] = Σ_{i∈I} Pr[Ω_i ∩ S_i | S]
= Σ_{i∈I} Pr[Ω_i | S_i] · Pr[S_i | S] ≥ (Σ_{i∈I} Pr[S_i | S]) / 2 ≥ 1/4.

We let β denote the index Ind_H(ω) corresponding to the successful pair. With probability at least 1/4, β ∈ I and (ω, H) ∈ S_β ∩ Ω_β. Consequently, with probability greater than 4/5 × 1/4 = 1/5, the 2/ε attacks have provided a successful pair (ω, H), with β = Ind_H(ω) ∈ I and (ω, H) ∈ S_β ∩ Ω_β. Furthermore, if we replay the attack, with fixed ω but randomly chosen oracle H′ such that H′|_β = H|_β, we know that Pr_{H′}[(ω, H′) ∈ S_β | H′|_β = H|_β] ≥ ν/(4q_h). Then

Pr_{H′}[(ω, H′) ∈ S_β and ρ_β ≠ ρ′_β | H′|_β = H|_β] ≥ Pr_{H′}[(ω, H′) ∈ S_β | H′|_β = H|_β] − Pr_{H′}[ρ′_β = ρ_β] ≥ ν/(4q_h) − 1/2^k,

where ρ_β = H(Q_β) and ρ′_β = H′(Q_β). Using again the assumption that ε ≥ 7q_h/2^k, the above probability is lower-bounded by ε/(14q_h). We thus replay the attack 14q_h/ε times with a new random oracle H′ such that H′|_β = H|_β, and get another success with probability greater than 1 − (1 − ε/(14q_h))^{14q_h/ε} ≥ 1 − e^{−1} ≥ 3/5.

Finally, after less than 2/ε + 14q_h/ε repetitions of the attack, with probability greater than 1/5 × 3/5 ≥ 1/9, we have obtained two signatures (m, σ_1, h, σ_2) and (m′, σ′_1, h′, σ′_2), both valid w.r.t. their specific random oracle H or H′, and with the particular relations

Q_β = (m, σ_1) = (m′, σ′_1) and h = H(Q_β) ≠ H′(Q_β) = h′.

One may have noticed that the mechanics of our reduction depend on some parameters related to the attacker A, namely its probability of success ε and the number q_h of queries to the random oracle. This induces a lack of uniformity. A uniform version, in expected polynomial time, is also possible.

Theorem 2 (The Forking Lemma – The Uniform Case). Let (K, S, V) be a digital signature scheme with security parameter k, with a signature as above, of the form (m, σ_1, h, σ_2), where h = H(m, σ_1) and σ_2 depends on σ_1 and h only. Let A be a probabilistic polynomial time Turing machine whose input only consists of public data and which can ask q_h queries to the random oracle, with q_h > 0. We assume that, within the time bound T, A produces, with probability ε ≥ 7q_h/2^k, a valid signature (m, σ_1, h, σ_2). Then there is another machine which has control over A and produces two valid signatures (m, σ_1, h, σ_2) and (m, σ_1, h′, σ′_2) such that h ≠ h′, in expected time T′ ≤ 84480 T q_h/ε.

Proof. Now, we try to design a machine M which succeeds in expected polynomial time:

1. M initializes j = 0;

2. M runs A until it outputs a successful pair (ω, H) ∈ S and denotes by N_j the number of calls to A to obtain this success, and by β the index Ind_H(ω);

3. M replays A, at most 140 N_j α^j times, with fixed ω and random H′ such that H′|_β = H|_β, where α = 8/7;

4. M increments j and returns to step 2, until it gets a successful forking.

For any execution of M, we denote by J the last value of j and by N the total number of calls to A. We want to compute the expectation of N. Since ν = Pr[S] and N_j ≥ 1, then Pr[N_j ≥ 1/(5ν)] ≥ 3/4. We define ℓ = ⌈log_α q_h⌉, so that 140 N_j α^j ≥ 28 q_h/ε for any j ≥ ℓ, whenever N_j ≥ 1/(5ν). Therefore, for any j ≥ ℓ, when we have a first success in S, with probability greater than 1/4, the index β = Ind_H(ω) is in the set I and (ω, H) ∈ S_β ∩ Ω_β. Furthermore, with probability greater than 3/4, N_j ≥ 1/(5ν). Therefore, with the same conditions as before, that is ε ≥ 7q_h/2^k, the probability of getting a successful fork after at most 28 q_h/ε iterations at step 3 is greater than 6/7.

For any t ≥ ℓ, the probability for J to be greater than or equal to t is less than (1 − 1/4 × 3/4 × 6/7)^{t−ℓ}, which is less than γ^{t−ℓ}, with γ = 6/7. Furthermore,

E[N | J = t] ≤ Σ_{j=0}^{t} (E[N_j] + 140 E[N_j] α^j) ≤ (141/ν) × Σ_{j=0}^{t} α^j ≤ (141/ν) × α^{t+1}/(α − 1).

So, the expectation of N is E[N] = Σ_t E[N | J = t] · Pr[J = t], and then it can be shown to be less than 84480 q_h/ε. Hence the theorem.

Chosen-Message Attacks. However, the above just covers the no-message attacks, without any oracle access. Since we can simulate any zero-knowledge protocol, even without having to restart the simulation because of the honest verifier (i.e. the challenge is randomly chosen by the random oracle H), one can easily simulate the signer without the private key:

• one first chooses random h, s ∈ Z_q;

• one computes r = s·g + h·y and defines H(m, r) to be equal to h, which is a uniformly distributed value;

• one can output (r, h, s) as a valid signature of the message m.

This furthermore simulates the oracle H, by defining H(m, r) to be equal to h. This simulation is almost perfect since H is supposed to output a random value for any new query, and h is indeed a random value. Nevertheless, if the query H(m, r) has already been asked, H(m, r) is already defined, and thus the definition H(m, r) ← h is impossible. But such a situation is very rare, which allows us to claim the following result, which holds for the Schnorr signature scheme but also for any signature derived from a three-round honest-verifier zero-knowledge interactive proof of knowledge:
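The simulation of the signing oracle can be written directly from the three steps above; the sketch below writes the group multiplicatively (a subgroup of Z_p^* of prime order q, as in the toy example earlier) and raises an error on the rare event where H(m, r) is already defined.

```python
import secrets

def simulated_sign(y, m, oracle_table, p, q, g):
    """Produce a valid-looking Schnorr signature without the private key,
    by programming the random oracle (oracle_table maps (m, r) to h)."""
    h = secrets.randbelow(q)
    s = secrets.randbelow(q)
    r = (pow(g, s, p) * pow(y, h, p)) % p      # r = s.g + h.y in the additive notation
    if (m, r) in oracle_table:                 # H(m, r) already defined: rare failure case
        raise RuntimeError("simulation failure: H(m, r) was already asked")
    oracle_table[(m, r)] = h                   # define H(m, r) := h
    return r, h, s
```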

Theorem 3. Let A be a probabilistic polynomial time Turing machine whose input only consists of public data. We denote respectively by q_h and q_s the number of queries that A can ask to the random oracle and the number of queries that A can ask to the signer. Assume that, within a time bound T, A produces, with probability ε ≥ 10(q_s + 1)(q_s + q_h)/2^k, a valid signature (m, σ_1, h, σ_2). If the triples (σ_1, h, σ_2) can be simulated without knowing the secret key, with an indistinguishable probability distribution, then a replay of the attacker A, where interactions with the signer are simulated, outputs two valid signatures (m, σ_1, h, σ_2) and (m, σ_1, h′, σ′_2) such that h ≠ h′, within time T′ ≤ 23 q_h T/ε and with probability ε′ ≥ 1/9.

A uniform version of this theorem can also be found in [68]. From a more practical point of view, these results state that if an adversary manages to perform an existential forgery under an adaptive chosen-message attack within an expected time T, after q_h queries to the random oracle and q_s queries to the signing oracle, then the discrete logarithm problem can be solved within an expected time less than C q_h T, for some constant C. This result has been more recently extended to the transformation of any identification scheme secure against passive adversaries into a signature scheme [8].

Brickell, Vaudenay, Yung and the author also extended the forking lemma technique [69, 17] to many variants of El Gamal [27] and DSA [55], such as the Korean standard KCDSA [43]. However, the original El Gamal and DSA schemes were not covered by this study, and are certainly not provably secure, even if no attack has ever been found against DSA.
