Time complexity of 2^sqrt(n)

I am solving an algorithm question and my analysis is that it would run in O(2^sqrt(n)). How big is that? Does it equate to O(2^n)? Is it still non-polynomial time?


This is an interesting question. Fortunately, once you know how to solve it, it is not particularly hard.

For functions f: ℕ → ℝ⁺ and g: ℕ → ℝ⁺, we have f ∈ O(g) if and only if lim sup_{n→∞} f(n) / g(n) ∈ ℝ.

A function f: ℕ → ℝ⁺ has at most polynomial growth if and only if there exists a constant k ∈ ℕ such that f ∈ O(n ↦ n^k). Let’s work this out for an arbitrary but fixed k ∈ ℕ.

lim sup_{n→∞} 2^(n^(1/2)) / n^k
= lim_{n→∞} 2^(n^(1/2)) / n^k
= lim_{n→∞} e^(log(2) n^(1/2)) / e^(log(n) k)
= lim_{n→∞} e^(log(2) n^(1/2) − log(n) k) = ∞ ∉ ℝ

The first equality holds because both the numerator and the denominator are monotonically increasing functions, so the limit superior coincides with the limit. The second equality uses the identity x^y = e^(log(x) y). The limit is not finite because the exponent in the final expression is not bounded above: without giving a formal proof, we take it as known that n^(1/2) dominates log(n) asymptotically. Therefore, the function in question exceeds polynomial growth.
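The same conclusion can be sanity-checked numerically by comparing logarithms, so the huge values never overflow. A minimal Python sketch (the helper name `log_ratio` is mine, chosen for illustration):

```python
import math

def log_ratio(n: float, k: int) -> float:
    """Natural log of 2**sqrt(n) / n**k, computed in log space to avoid overflow."""
    return math.sqrt(n) * math.log(2) - k * math.log(n)

# For any fixed k the log of the ratio eventually turns positive and keeps
# growing, the numeric counterpart of the limit above being infinite:
for k in (1, 5, 50):
    print(k, log_ratio(1e6, k), log_ratio(1e12, k))
```

Even for k = 50 the ratio is already growing by n = 10^6, because sqrt(n) eventually dwarfs k·log(n) no matter how large k is.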

However, its growth is strictly less than exponential, where exponential is defined (by me, for this purpose) as O(n ↦ 2^(cn)) for some c > 0. Showing this is even more straightforward.

lim sup_{n→∞} 2^(cn) / 2^(n^(1/2)) = lim_{n→∞} 2^(cn − n^(1/2)) = ∞ ∉ ℝ

for any fixed c > 0. Therefore, the complexity of the function lies genuinely in between polynomial and exponential.
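This, too, can be illustrated numerically: the exponent gap cn − sqrt(n) eventually becomes positive for any fixed c, however tiny. A small sketch (the helper name `exponent_gap` is mine):

```python
import math

def exponent_gap(n: float, c: float) -> float:
    """Exponent of 2**(c*n) / 2**sqrt(n), i.e. c*n - sqrt(n)."""
    return c * n - math.sqrt(n)

# Even for a tiny c = 0.001 the gap crosses zero (here at n = 10**6)
# and then grows without bound:
for n in (1e4, 1e6, 1e8):
    print(n, exponent_gap(n, 0.001))
```

The crossover point moves out as c shrinks, but it always exists, which is exactly why 2^sqrt(n) ∈ O(2^(cn)) for every c > 0 while the converse fails.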

How big is that? Well, O(2^sqrt(n)) is exactly how big it is 🙁

To get an idea of what it means, imagine your algorithm weren’t just O(2^sqrt(n)), but actually took precisely 2^sqrt(n) nanoseconds on your computer:

n = 100: 2^10 = 1024 nanoseconds. No time at all.
n = 1,000: 2^31.6 ≈ 3 billion nanoseconds. A few seconds, that’s noticeable.
n = 10,000: 2^100 ≈ 10^30 nanoseconds ≈ 10^21 seconds ≈ 30 trillion years.
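The table above is easy to reproduce; a one-liner per row, under the stated assumption that the runtime is exactly 2^sqrt(n) nanoseconds:

```python
import math

def runtime_ns(n: int) -> float:
    """Hypothetical runtime of exactly 2**sqrt(n) nanoseconds."""
    return 2.0 ** math.sqrt(n)

for n in (100, 1_000, 10_000):
    print(n, runtime_ns(n))
```

Note that 2^100 ≈ 1.27 × 10^30 still fits comfortably in a double, so no big-integer arithmetic is needed for this range.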

This is a lot better than 2^n nanoseconds, where n = 100 alone would take 30 trillion years, but the size of the problems you can solve is still quite limited. If you consider a problem “solvable” when your computer can finish it in one week, that is about 6 × 10^14 nanoseconds, which corresponds to about n = 2,400. On the other hand, anything up to about n = 400 can be solved in a millisecond.
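Inverting the runtime formula gives the largest solvable n for a given time budget: 2^sqrt(n) ≤ budget is equivalent to n ≤ log2(budget)^2. A sketch under the same 2^sqrt(n)-nanoseconds assumption (the name `max_solvable_n` is mine):

```python
import math

NS_PER_WEEK = 7 * 24 * 3600 * 10**9  # one week, roughly 6e14 nanoseconds

def max_solvable_n(budget_ns: float) -> int:
    """Largest n with 2**sqrt(n) <= budget_ns, from n <= log2(budget_ns)**2."""
    return math.floor(math.log2(budget_ns) ** 2)

print(max_solvable_n(NS_PER_WEEK))  # roughly 2,400
print(max_solvable_n(10**6))        # one-millisecond budget: roughly 400
```

Because the budget enters only through log2(budget)^2, throwing 1000× more time at the problem raises the solvable n only modestly, which is the practical sting of this growth rate.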

(In practice, for n = 10,000 both O(2^sqrt(n)) and O(2^n) take exactly the same time: too long to wait for it.)

It does exceed any polynomial, though. Take another algorithm that needs n^1000 seconds, which is already practically unsolvable for n = 2. That algorithm takes longer than ours until n reaches about 885 million. But really, who cares? At that point, the number of years either algorithm takes is a 9,000-digit number.
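That crossover point can be located by comparing logarithms, since neither n^1000 seconds nor 2^sqrt(n) nanoseconds fits in ordinary floats near n ≈ 10^9. A sketch under those two runtime assumptions (the helper name `poly_takes_longer` is mine):

```python
import math

def poly_takes_longer(n: float) -> bool:
    """True while n**1000 seconds exceeds 2**sqrt(n) nanoseconds.

    Compared in log space: ln(n**1000 * 1e9) vs ln(2**sqrt(n)).
    """
    return 1000 * math.log(n) + 9 * math.log(10) > math.sqrt(n) * math.log(2)

print(poly_takes_longer(8.8e8))  # polynomial still slower just below the crossover
print(poly_takes_longer(8.9e8))  # 2**sqrt(n) slower just above it
```

The sign flips between 8.8 × 10^8 and 8.9 × 10^8, consistent with the “about 885 million” figure above.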
