My brain is a little stuck as I try to put together an example for my blog. I am presenting an algorithm that draws a number from an array, adds it to a running sum, and calls itself recursively.

```swift
func solve(coins: [Int], target: Int) -> Int {
    func _solve(coins: [Int], current: Int) -> Int {
        var candidates = coins
        var crt = current
        while !candidates.isEmpty {
            // removeFirst() removes the element at the head and returns it
            let rdm = candidates.removeFirst()
            if current + rdm > target {
                continue
            }
            crt = _solve(coins: candidates, current: current + rdm)
            if crt == target {
                return crt
            }
        }
        return crt
    }
    return _solve(coins: coins, current: 0)
}
```

Given a set of coins, the algorithm tries to reach a sum as close as possible to a given target without exceeding it.
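For example (assuming the `pop` in the code above is a `removeFirst`-style head removal, so the function compiles), two sample calls might behave like this; the inputs are hypothetical, not from the original post:

```swift
// With [1, 5, 10] the branch 1 + 10 hits the target exactly.
print(solve(coins: [1, 5, 10], target: 11))  // 11

// With [5, 10] the target 7 is unreachable; 5 is the closest sum found.
print(solve(coins: [5, 10], target: 7))      // 5
```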

What is the complexity of this algorithm?


It doesn’t matter that you pick the next coin randomly rather than systematically. You’re still potentially trying out all subsets of a multiset of size N, so it may take as many as 2^N tries, and that’s your upper-bound complexity: O(2^N).
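You can see the 2^N growth concretely by instrumenting the recursion with a call counter. The sketch below is a hypothetical stripped-down copy of the function above (early returns removed, target made unreachable), so every subset of the remaining coins gets explored:

```swift
var calls = 0

// Same head-removal recursion as the question's code, but counting
// invocations. With an unreachable target, no branch is pruned and
// no early return fires, so the full subset tree is visited.
func countCalls(coins: [Int], current: Int, target: Int) -> Int {
    calls += 1
    var candidates = coins
    var crt = current
    while !candidates.isEmpty {
        let rdm = candidates.removeFirst()
        if current + rdm > target { continue }
        crt = countCalls(coins: candidates, current: current + rdm, target: target)
    }
    return crt
}

for n in 1...10 {
    calls = 0
    _ = countCalls(coins: Array(repeating: 1, count: n), current: 0, target: Int.max)
    print("N = \(n): \(calls) calls")  // doubles with each extra coin
}
```

Each call either recurses on the tail after taking the head coin or skips it, which is exactly the include/exclude choice behind the 2^N subset count.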
