Hash functions are a basic cryptographic primitive. Certain hash functions aim to achieve provable security against collision and preimage attacks via reductions to known hard problems; these hash functions usually have additional structural properties that enable the reduction. Hash functions that are additive or multiplicative are vulnerable to a quantum attack based on the hidden subgroup problem algorithm for quantum computers. Using a quantum oracle for the hash, we can reconstruct the kernel of the hash function, which is enough to find collisions and second preimages. When the hash function is additive with respect to the group operation of an Abelian group, there is always an efficient implementation of this attack. We present concrete attacks against provable hash functions, including a preimage attack on $\oplus$-linear hash functions and on certain multiplicative homomorphic hash schemes.
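To make the role of the kernel concrete, here is a minimal classical sketch for a toy $\oplus$-linear hash $H(x) = Mx$ over GF(2); the matrix $M$, the dimensions, and the helper `gf2_nullspace` are illustrative assumptions, not the paper's construction. The quantum attack described above would recover the kernel from oracle access alone (a Simon-style hidden subgroup computation); here we compute it directly from $M$ only to show that any nonzero kernel element immediately yields collisions and second preimages.

```python
# Toy illustration (not the paper's quantum procedure): for an XOR-linear hash
# H(x) = M x over GF(2), any nonzero kernel element k satisfies H(x ^ k) = H(x),
# so a single kernel element already produces second preimages.
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 5                      # toy sizes: compress 8 input bits to 5 output bits
M = rng.integers(0, 2, size=(m, n), dtype=np.uint8)

def H(x):
    """XOR-linear hash: matrix-vector product over GF(2)."""
    return (M @ x) % 2

def gf2_nullspace(A):
    """Basis of the kernel of A over GF(2), via Gaussian elimination."""
    A = A.copy() % 2
    rows, cols = A.shape
    pivot_cols, r = [], 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if A[i, c]), None)
        if pivot is None:
            continue
        A[[r, pivot]] = A[[pivot, r]]          # move pivot row into place
        for i in range(rows):
            if i != r and A[i, c]:
                A[i] ^= A[r]                   # eliminate column c elsewhere
        pivot_cols.append(c)
        r += 1
    free_cols = [c for c in range(cols) if c not in pivot_cols]
    basis = []
    for fc in free_cols:
        v = np.zeros(cols, dtype=np.uint8)
        v[fc] = 1
        for row, pc in enumerate(pivot_cols):
            v[pc] = A[row, fc]                 # back-substitute pivot variables
        basis.append(v)
    return basis

kernel = gf2_nullspace(M)                      # the quantum attack gets this via HSP
k = kernel[0]                                  # any nonzero kernel element
x = rng.integers(0, 2, size=n, dtype=np.uint8)
assert np.array_equal(H(x), H(x ^ k))          # x and x ^ k collide: a second preimage
print("kernel element", k, "gives H(x) == H(x ^ k)")
```

Since $M$ maps $n$ bits to $m < n$ bits, the kernel is guaranteed to be nontrivial, and every input $x$ has the second preimage $x \oplus k$ for each nonzero kernel element $k$.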
A $k$-collision for a compressing hash function $H$ is a set of $k$ distinct inputs that all map to the same output. In this work, we show that for any constant $k$, $\Theta\!\left(N^{\frac{1}{2}\left(1-\frac{1}{2^k-1}\right)}\right)$ quantum queries are both necessary and sufficient to find a $k$-collision.
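As a quick check on this bound (not stated in the excerpt, but following directly from the formula), evaluating the exponent for small $k$ gives

$$\frac{1}{2}\left(1-\frac{1}{2^2-1}\right)=\frac{1}{3}, \qquad \frac{1}{2}\left(1-\frac{1}{2^3-1}\right)=\frac{3}{7},$$

so for $k=2$ the formula recovers the well-known $\Theta(N^{1/3})$ quantum collision-finding complexity, and as $k$ grows the exponent approaches $\frac{1}{2}$.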
In this paper, we specify a class of mathematical problems, which we refer to as Function Density Problems (FDPs for short), and point out novel connections of FDPs to the following two cryptographic topics: theoretical security evaluations of keyless
Wireless Body Sensor Network (WBSN) is a developing technology with constraints on energy consumption, coverage radius, and communication reliability. Moreover, communications between nodes carry very sensitive personal information, which sometimes, due to
Many commonly used public-key cryptosystems will become insecure once a scalable quantum computer is built. New cryptographic schemes that can guarantee protection against attacks by quantum computers, so-called post-quantum algorithms, have emerged.
As the application of deep learning continues to grow, so does the amount of data used to make predictions. While big-data deep learning was traditionally constrained by computing performance and off-chip memory bandwidth, a new constraint has emerged.