Your fib function would benefit greatly from memoization: at the moment you don't reuse the values you've already calculated, so the whole computation takes O(n^2) time, since each fib(k) takes O(n) on average and you make O(n) calls.
By storing the results of previous calls you can bring the total complexity down to O(n): each subsequent call takes O(1), and you still make O(n) calls.
An easy way to do this in Python 3 is the functools.lru_cache decorator, like so:
from functools import lru_cache

@lru_cache(maxsize=None)
def function_to_memoize(n):
    ...  # the existing function body stays unchanged
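Applied to fib, the memoized version might look like the sketch below (assuming the usual recursive definition with fib(0) = 0 and fib(1) = 1; adapt it to your actual base cases):

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(k):
    # base cases: fib(0) == 0, fib(1) == 1
    if k < 2:
        return k
    # each distinct k is computed exactly once; repeated calls are O(1) cache hits
    return fib(k - 1) + fib(k - 2)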
As a side note, there is a fun fact about Fibonacci numbers: every third one is even and the rest are odd, so you can use that to avoid checking parity. See my answer for more details.
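For example, if you were summing the even Fibonacci numbers up to some limit, you could step the sequence three terms at a time instead of testing each value (a rough sketch of the idea, not your exact code; the helper name and the limit parameter are made up):

def sum_even_fib(limit):
    # a, b hold consecutive Fibonacci numbers F(i), F(i+1), starting at an even term
    total = 0
    a, b = 0, 1
    while a < limit:
        total += a  # a is always one of the even Fibonacci numbers here
        # jump three indices ahead: F(i+3) = F(i) + 2*F(i+1), F(i+4) = 2*F(i) + 3*F(i+1)
        a, b = a + 2 * b, 2 * a + 3 * b
    return total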
As for your other answer, while your code looks correct, I would just like to present an alternative solution for the sake of completeness, one that first extracts the first letter of every word and then filters by capitalization. I also used lambdas, since the functions involved are so simple.
def acronym(name):
    return tuple(filter(lambda x: x.isupper(), map(lambda x: x[0], name.split())))
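For example (made-up inputs, just to illustrate the behaviour):

print(acronym("Laser Interferometer Gravitational-Wave Observatory"))  # ('L', 'I', 'G', 'O')
print(acronym("the United Nations"))  # ('U', 'N') -- lowercase 'the' is dropped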
I would wager that your code runs faster, though, since it maps over a filtered result rather than filtering a mapped result: filtering first usually shrinks the input that map has to process, while mapping first does not.
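For comparison, the filter-first ordering looks something like this (a sketch of the idea, not necessarily your exact code):

def acronym_filter_first(name):
    # keep only the words that start with a capital, then take their first letters
    return tuple(map(lambda w: w[0], filter(lambda w: w[0].isupper(), name.split())))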