We consider the binary-LWE problem, which is the learning with errors (LWE) problem in which the entries of the secret vector are chosen from $\{0,1\}$ or $\{-1,0,1\}$ (while the error vector is still sampled from a discrete Gaussian distribution). Our main result is an improved lattice decoding algorithm for binary-LWE: we first translate the problem to an instance of the inhomogeneous short integer solution (ISIS) problem, and then solve the resulting closest vector problem in a suitably re-scaled lattice. We also discuss modulus switching as an approach to the problem. Our conclusion is that binary-LWE is easier than general LWE. We give experimental results and theoretical estimates that can be used to choose parameters for binary-LWE achieving a desired security level.
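(For concreteness, a binary-LWE instance can be stated as follows; the symbols $m$, $n$, $q$ and $\sigma$ below are generic parameters and are not fixed by this abstract. One is given a pair $(\mathbf{A}, \mathbf{b})$ with
\[
\mathbf{b} \equiv \mathbf{A}\mathbf{s} + \mathbf{e} \pmod{q},
\qquad \mathbf{A} \in \mathbb{Z}_q^{m \times n},\quad \mathbf{s} \in \{0,1\}^n,\quad \mathbf{e} \leftarrow D_{\mathbb{Z}^m,\sigma},
\]
where $D_{\mathbb{Z}^m,\sigma}$ denotes a discrete Gaussian of parameter $\sigma$, and the goal is to recover the secret $\mathbf{s}$.)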