Hello, I am currently working on homework for my algorithms class and I am stuck on how to approach the following problem. Any advice would be appreciated. Right now I am still on part (a).
1. Assume that each word of your machine has 64 bits. Assume that you can multiply two n-word numbers in time 3n^2 with a standard algorithm. Assume that you can multiply two n-word numbers in time 8n^(lg 3) with a “fancy” algorithm. For each part briefly justify and show your work.
(a) Approximately, how large does n have to be for the fancy algorithm to be better?
(b) How many bits is that?
(c) How many decimal digits is that?
note: in this class we treat lg as log base 2
I think I am supposed to set up something along the lines of 8n^(lg 3) <= 3n^2 (the fancy algorithm costing no more than the standard one) and solve for n, but I do not know how to do that...
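In case it helps show what I mean, here is a small Python sketch of how I might check the crossover numerically by just comparing the two cost functions from the problem (the names standard and fancy are my own, and I am assuming lg means log base 2 as the note says). Does something like this seem like the right idea, or is there a cleaner way to do it algebraically?

```python
import math

# cost models from the problem, for multiplying two n-word numbers
def standard(n):
    return 3 * n ** 2              # standard algorithm: 3 * n^2

def fancy(n):
    return 8 * n ** math.log2(3)   # "fancy" algorithm: 8 * n^(lg 3)

# find the smallest n where the fancy algorithm is strictly cheaper
n = 1
while fancy(n) >= standard(n):
    n += 1

print(f"fancy wins at n = {n} words: standard = {standard(n)}, fancy = {fancy(n):.1f}")
```

I went with a brute-force loop since n should be small here, but I am guessing the intended approach is to divide both sides by n^(lg 3) and solve the resulting inequality by hand.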
Thank you