


Question

Design and implement an algorithm to convert numbers from binary to decimal. Represent the numbers as integer vectors of their digits (each entry in the vector representing a single digit). Thus, the input will be a vector of 0s and 1s and the output will be a vector of integers from {0,1,...,9} (decimal digits). For instance, the input might be [1,0,0,0,1,1,1,0,1] and the corresponding output should be the vector [2,8,5].

(P.S. We can't use programming-language support or libraries for arbitrary-length arithmetic.)

What I did was use a simple approach like summing (2^0 + 2^1 + ...), but I think I am wrong.

Please HELP!!!

Explanation / Answer

I will give you some general pseudocode.

Say the input vector is stored in an array vect[] whose length is n, and the LSB (least significant bit) is stored at index 0 of the array.


sum -> 0
for i -> 0; i < n; i++
    sum -> sum + (vect[i] * 2^i)
end


This will give you the result as a single integer.
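
If it helps to see that loop in real code, here is a minimal C++ sketch. The name bitsToInt, the std::vector<int> parameter, and the long long accumulator are my own choices, not something your assignment requires; it assumes the value fits in a long long.

#include <vector>

// Convert a vector of binary digits (LSB at index 0) to a single integer.
// A built-in long long is used here, on the assumption that the value fits,
// since the problem rules out arbitrary-length arithmetic libraries.
long long bitsToInt(const std::vector<int>& vect) {
    long long sum = 0;
    long long power = 1;           // holds 2^i for the current position
    for (int bit : vect) {         // LSB first, as assumed above
        sum += bit * power;        // sum = sum + vect[i] * 2^i
        power *= 2;
    }
    return sum;
}

One thing to watch: your example [1,0,0,0,1,1,1,0,1] reads as 285 when the most significant bit comes first, so under the LSB-at-index-0 convention assumed here you would pass it in reversed.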


Now, to break this integer into its decimal digits, say you are saving the output digits to an array res[]:


tmp -> sum
i -> 0
while (tmp > 0)
    digit -> tmp % 10
    res[i++] -> digit
    tmp -> tmp / 10
end


That should help you. Notice that I used tmp because I didn't want to operate directly on sum; however, if you find sum is not needed anymore, feel free to replace tmp with sum. One more thing: the loop above stores the least significant digit first, so reverse res[] at the end if you want the [2,8,5] ordering from your example.
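
Here is a minimal C++ sketch of that second step, plus a tiny main that runs it on 285 (the integer the first loop produces for your example). The name intToDigits, the special case for zero, and the final std::reverse call are my own additions, not part of the pseudocode above.

#include <algorithm>
#include <iostream>
#include <vector>

// Break a non-negative integer into its decimal digits, most significant
// digit first (e.g. 285 -> {2, 8, 5}).
std::vector<int> intToDigits(long long sum) {
    std::vector<int> res;
    if (sum == 0) {                 // the loop below would leave res empty for 0
        res.push_back(0);
        return res;
    }
    long long tmp = sum;            // work on a copy, as in the pseudocode
    while (tmp > 0) {
        res.push_back(static_cast<int>(tmp % 10));  // least significant digit first
        tmp /= 10;
    }
    // The loop fills res[] from the least significant digit, so reverse it
    // to get the [2,8,5] ordering used in the question's example.
    std::reverse(res.begin(), res.end());
    return res;
}

int main() {
    std::vector<int> digits = intToDigits(285);
    for (int d : digits) {
        std::cout << d << ' ';      // prints: 2 8 5
    }
    std::cout << '\n';
    return 0;
}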
