We put forward a new family of computational assumptions, the Kernel Matrix Diffie-Hellman Assumption. Given a matrix A sampled from a distribution D_{l,k}, the kernel assumption says that it is hard to find "in the exponent" a nonzero vector in the kernel of A^t. This family is the natural computational analogue of the Matrix Decisional Diffie-Hellman Assumption (MDDH) proposed by Escala et al., and as such it allows one to extend the advantages of their algebraic framework to computational assumptions.
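For concreteness, a sketch of the problem statement (the implicit bracket notation $[\cdot]$ for group encodings follows Escala et al. and is our choice of presentation, not fixed by this abstract): given $[\mathbf{A}]$ for $\mathbf{A} \leftarrow \mathcal{D}_{\ell,k}$ with $\ell > k$, output a group-encoded vector $[\mathbf{x}]$ satisfying

\[
\mathbf{A}^{\top}\mathbf{x} = \mathbf{0}
\quad \text{and} \quad
\mathbf{x} \neq \mathbf{0},
\]

i.e., a nonzero element of $\ker(\mathbf{A}^{\top})$, given only "in the exponent."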
The k-Decisional Linear Assumption is an example of a family of decisional assumptions of strictly increasing hardness as k grows. We show that for any such family the corresponding Kernel assumptions also form a family in which each assumption is strictly weaker than the preceding one. This requires ruling out the existence of certain black-box reductions between flexible problems, i.e., computational problems with a non-unique solution.
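As an illustration, recalled from Escala et al.'s framework (whether the $a_i$ are sampled from $\mathbb{Z}_q$ or $\mathbb{Z}_q^{*}$ is an assumption of this sketch), the k-Linear assumption corresponds to the matrix distribution $\mathcal{L}_k$ over $(k+1) \times k$ matrices

\[
\mathbf{A} =
\begin{pmatrix}
a_1 & 0 & \cdots & 0 \\
0 & a_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & a_k \\
1 & 1 & \cdots & 1
\end{pmatrix},
\qquad a_1, \dots, a_k \leftarrow \mathbb{Z}_q^{*}.
\]

Here $\ker(\mathbf{A}^{\top})$ is spanned by $(a_1^{-1}, \dots, a_k^{-1}, -1)^{\top}$, so any nonzero scalar multiple of this vector solves the corresponding kernel problem; this non-uniqueness of valid solutions is exactly what makes the problem flexible.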