Given a decimal number k, find the smallest integer n such that the square root of n is within k of an integer. However, the distance must be nonzero: n cannot be a perfect square.
Given k, a decimal number or a fraction (whichever is easier for you), such that 0 < k < 1, output the smallest positive integer n such that the difference between the square root of n and the closest integer to the square root of n is less than or equal to k but nonzero.
If i is the closest integer to the square root of n, you are looking for the first n where 0 < |i - sqrt(n)| <= k.
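A brute-force reference search for this condition might look like the sketch below (not a golfed answer). It uses Python's `fractions` for exact arithmetic, so the comparison with k never hits floating-point rounding; the function name `smallest_n` is my own. The key observation is that, since i and n are integers, 0 < |i - sqrt(n)| <= k is equivalent to n != i^2 and (i - k)^2 <= n <= (i + k)^2, which can be checked exactly with rationals.

```python
from fractions import Fraction
from math import isqrt

def smallest_n(k: Fraction) -> int:
    """Smallest positive integer n with 0 < |i - sqrt(n)| <= k,
    where i is the integer closest to sqrt(n)."""
    n = 2  # n = 1 is a perfect square, so start at 2
    while True:
        r = isqrt(n)  # floor of sqrt(n)
        # sqrt(n) is closer to r than to r + 1 iff n <= r^2 + r
        # (no exact ties: n is an integer, the midpoint squared is not)
        i = r if n - r * r <= r else r + 1
        # 0 < |i - sqrt(n)| <= k  <=>  n != i^2 and (i-k)^2 <= n <= (i+k)^2
        if n != i * i and (i - k) ** 2 <= n <= (i + k) ** 2:
            return n
        n += 1

print(smallest_n(Fraction(1, 4)))        # matches the .25 test case
print(smallest_n(Fraction("0.0314159")))
```

Passing k as a `Fraction` (e.g. `Fraction("0.01")`) keeps every comparison exact, which is why this can serve as a ground truth against the test cases.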
Rules
- You cannot exploit a language's insufficient (e.g. low-precision) implementation of non-integer numbers to trivialize the problem.
- Otherwise, you can assume that k will not cause problems with, for example, floating-point rounding.
Test Cases
.9 > 2
.5 > 2
.4 > 3
.3 > 3
.25 > 5
.2 > 8
.1 > 26
.05 > 101
.03 > 288
.01 > 2501
.005 > 10001
.003 > 27888
.001 > 250001
.0005 > 1000001
.0003 > 2778888
.0001 > 25000001
.0314159 > 255
.00314159 > 25599
.000314159 > 2534463
Comma separated test case inputs:
0.9, 0.5, 0.4, 0.3, 0.25, 0.2, 0.1, 0.05, 0.03, 0.01, 0.005, 0.003, 0.001, 0.0005, 0.0003, 0.0001, 0.0314159, 0.00314159, 0.000314159
This is code-golf, so shortest answer in bytes wins.