Optimizing the Menezes-Okamoto-Vanstone (MOV) Algorithm for Non-supersingular Elliptic Curves.

Conference Paper · January 1999
Source: DBLP
Conference: Advances in Cryptology - ASIACRYPT '99, International Conference on the Theory and Applications of Cryptology and Information Security, Singapore, November 14-18, 1999, Proceedings
Abstract
We address the Menezes-Okamoto-Vanstone (MOV) algorithm for attacking elliptic curve cryptosystems, which runs in subexponential time for supersingular elliptic curves. From an algorithmic point of view, there are two hurdles to clear in applying the MOV reduction to general elliptic curves: the problem of explicitly determining the minimum extension degree k such that E[n] ⊂ E(F_{q^k}), and that of efficiently finding an n-torsion point needed to evaluate the Weil pairing, where n is the order of a cyclic group of the elliptic curve discrete logarithm problem. An answer to the first problem can be found in a recent paper by R. Balasubramanian and N. Koblitz, J. Cryptology 11, 141-145 (1998; Zbl 0978.94038). The second problem is important as well, since the reduction might otherwise require exponential time even for small k. In this paper, we construct a novel method of efficiently finding an n-torsion point, which leads to a solution of the second problem. In addition, our contribution allows us to conclude that the MOV reduction is indeed as powerful as the Frey-Rück reduction under n ∤ q − 1, not only from the viewpoint of the minimum extension degree but also from that of the effectiveness of the algorithms.
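By the Balasubramanian-Koblitz result cited above, when n ∤ q − 1 the minimum extension degree is the smallest k with n | q^k − 1, i.e. the multiplicative order of q modulo n. That quantity is easy to compute directly; the sketch below is our own illustration (the function name and brute-force search bound are not from the paper), assuming gcd(n, q) = 1:

```python
def embedding_degree(q, n, max_k=10_000):
    """Smallest k with n | q^k - 1, i.e. the multiplicative order of q mod n.

    Assumes gcd(n, q) == 1; returns None if no k <= max_k works
    (the order always divides phi(n), so max_k is only a guard)."""
    t = q % n
    acc = 1
    for k in range(1, max_k + 1):
        acc = (acc * t) % n  # acc == q^k mod n
        if acc == 1:
            return k
    return None

# Supersingular-style example: n | q + 1 gives degree 2.
print(embedding_degree(11, 3))  # → 2, since 11 ≡ 2 (mod 3) and 2^2 ≡ 1
```

For supersingular curves this degree is at most 6, which is why the finite-field discrete logarithm obtained from the reduction is genuinely easier there.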
    • Here the time required for computing the pairing is not considered. However, it is well known from [5, 8, 11, 9, 10] that pairings can be computed in polynomial time for supersingular curves, for which m above is at most 6. Hence such curves are suitable for implementing pairing-based schemes.
    ABSTRACT: This paper develops a cryptanalysis of the pairing-based Diffie-Hellman (DH) key exchange schemes which have found important applications, as in the tripartite exchange scheme proposed in (1). The analysis of weak keys of the standard DH scheme proposed in (2) is applied to show the existence of weak sessions for tripartite schemes over supersingular curves. It is shown that for such sessions the associated Bilinear Diffie-Hellman Problem (BDHP) is solvable in polynomial time, without computing the private keys, i.e. without solving the discrete logarithms. Similar applications of the analysis to the Decisional Diffie-Hellman Problem (DDHP) and the Identity-Based DH scheme (IBS) are also developed. The tripartite key exchange scheme is analyzed in detail, and it is shown that the number of weak keys increases in this scheme compared to the standard two-party DH scheme. It is shown that the random choice of private keys by the users, independent of each other's knowledge, is insecure in these schemes. Algorithms are suggested for checking the weakness of private keys based on an order of selection. A modified tripartite key exchange scheme is presented in which detection of weak keys is incorporated.
    Full-text · Article · Jan 2005 · IEICE Transactions on Fundamentals of Electronics Communications and Computer Sciences
    • There is, however, a third method which, under the assumptions that n ∤ q and n ∤ q − 1, is completed in probabilistic polynomial time in l log q. The method is too detailed to be presented here, but can be found in its entirety in [50].
    ABSTRACT: The use of elliptic curves in cryptography was suggested independently by Neal Koblitz and Victor Miller in 1985. Being a relatively new field, there is still a lot of ongoing research on the subject, but elliptic curve cryptography, or ECC for short, has already been implemented in real-life applications. Its strength was proved in 2003 when the U.S. National Security Agency adopted ECC for protecting information classified as mission-critical by the U.S. government. The security of public-key cryptographic systems that can be considered secure, efficient, and commercially viable is directly tied to the relative hardness of their underlying mathematical problems. In the case of ECC, this mathematical problem is to solve the discrete logarithm problem over elliptic curves, or ECDLP for short. Because the best-known way to solve ECDLP is fully exponential, we can use substantially smaller key sizes to obtain equivalent strengths compared to other systems. Hence, ECC provides the most security per bit of any public-key scheme known. In this thesis we have focused on presenting the known attacks on the ECDLP. We started by introducing some basic facts from the theory of elliptic curves. In the rest of the thesis we have described, analyzed and presented running time estimates of attacks on the ECDLP. This included a presentation of attacks which are specially designed to exploit weaknesses in the structure of some classes of elliptic curves. We have also presented attacks which can be used to solve the ECDLP over general elliptic curves. This included Pollard's rho and lambda algorithms, where the former was used for solving the ECDLP challenges set by the Certicom company.
    Article · IEICE Transactions on Fundamentals of Electronics Communications and Computer Sciences
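The thesis summarized above names Pollard's rho as the workhorse attack on general ECDLP instances. Its walk-and-collide idea fits in a few lines; the toy sketch below (function names and parameters are our own) runs in a prime-order multiplicative subgroup of F_p* rather than an elliptic curve group, but the structure of the algorithm is identical:

```python
import random

def pollard_rho_dlog(g, h, p, n):
    """Solve g^x = h (mod p), where g generates a subgroup of prime order n.

    Toy Pollard's rho with Floyd cycle-finding; for elliptic curves the
    modular multiplications become point additions, nothing else changes."""
    def step(x, a, b):
        # Partition the group into three classes to define a pseudo-random walk,
        # tracking exponents so that x == g^a * h^b at all times.
        s = x % 3
        if s == 0:
            return (x * x) % p, (2 * a) % n, (2 * b) % n
        if s == 1:
            return (x * g) % p, (a + 1) % n, b
        return (x * h) % p, a, (b + 1) % n

    while True:
        # Random start x = g^a * h^b; restart on a degenerate collision.
        a, b = random.randrange(n), random.randrange(n)
        x = (pow(g, a, p) * pow(h, b, p)) % p
        X, A, B = x, a, b
        for _ in range(4 * n):
            x, a, b = step(x, a, b)          # tortoise: one step
            X, A, B = step(*step(X, A, B))   # hare: two steps
            if x == X:
                if (B - b) % n == 0:
                    break  # useless collision; try a new start
                # g^a h^b = g^A h^B  =>  dlog = (a - A) * (B - b)^(-1) mod n
                return (a - A) * pow(B - b, -1, n) % n
```

With p = 107 and g = 4 (which has order 53), the routine recovers exponents in a subgroup small enough to check by hand; the expected running time is O(√n) group operations, which is why it is fully exponential in the key length.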
  • ABSTRACT: The problem we consider in this paper is whether the Menezes-Okamoto-Vanstone (MOV) reduction for attacking elliptic curve cryptosystems can be realized for general elliptic curves. In realizing the MOV reduction, the base field F_q is extended so that the reduction to the discrete logarithm problem in a finite field is possible. Recent results by Balasubramanian and Koblitz suggest that, if l ∤ q − 1, such a minimum extension degree is the minimum k such that l | q^k − 1, which is equivalent to the condition under which the Frey-Rück (FR) reduction can be applied, where l is the order of the group in the elliptic curve discrete logarithm problem. Our point is that the problem of finding an l-torsion point required in evaluating the Weil pairing should also be considered from an algorithmic point of view. In this paper, we propose a method which leads to a solution of the problem. In addition, our contribution allows us to conclude that the MOV reduction is indeed as powerful as the FR reduction under l ∤ q − 1, not only from the viewpoint of the minimum extension degrees but also from that of the effectiveness of algorithms.
    Full-text · Article · Sep 2000
Article
December 2010 · Journal of Applied Probability · Impact Factor: 0.59
    In this paper we prove that the stationary distribution of populations in genetic algorithms focuses on the uniform population with the highest fitness value as the selective pressure goes to ∞ and the mutation probability goes to 0. The obtained sufficient condition is based on the work of Albuquerque and Mazza (2000), who, following Cerf (1998), applied the large deviation principle approach...
    Conference Paper
    September 2014
      In any database record, some fields are discrete and others continuous. We consider learning Bayesian network structures when both discrete and continuous variables are present. Thus far, most previous results assumed that all the variables are either discrete or continuous. We propose to compute a new Bayesian score for each subset of discrete and continuous variables, and to obtain...
      Chapter
      January 2015
        This paper addresses the problem of efficiently finding an optimal Bayesian network structure w.r.t. maximizing the posterior probability and minimizing the description length. In particular, we focus on the branch and bound strategy to save computational effort. To obtain an efficient search, a larger lower bound of the score is required (when we seek its minimum). We generalize an existing...
        Conference Paper
        January 2010
          We extend the Chow-Liu algorithm to general random variables, whereas previous versions only considered finite cases. In particular, this paper applies the generalization to Suzuki's learning algorithm, which learns forests rather than trees from data based on the minimum description length, balancing the fitness of the data to the forest against the simplicity of the forest. As a result, we...
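For context, the classical Chow-Liu construction that the snippet above generalizes picks a maximum-weight spanning tree under empirical mutual information. A toy sketch for finite-valued variables (function names and the Kruskal-style union-find are our own; the MDL-based forest variant described above is not implemented here):

```python
from collections import Counter
from itertools import combinations
from math import log

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete samples."""
    m = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / m) * log((c / m) / ((px[x] / m) * (py[y] / m)))
               for (x, y), c in pxy.items())

def chow_liu_tree(columns):
    """columns: one list of samples per variable.

    Returns the edge list of a maximum-weight spanning tree under
    empirical mutual information (Kruskal's algorithm with union-find)."""
    d = len(columns)
    edges = sorted(((mutual_information(columns[i], columns[j]), i, j)
                    for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    tree = []
    for _, i, j in edges:  # greedily add the strongest non-cycle edges
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

On data where two variables are copies of each other and a third is only weakly related, the tree keeps the edge between the copies first, as expected.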