I am taking an image file and converting it into binary format, then converting that binary into decimal format. According to my algorithm I want to take 50,000 bits at a time. Here is the algorithm:
- Read an image file (in any programming language).
- Convert it into binary format (pure 0s and 1s).
- Take 50,000 bits at a time and convert them into decimal format (for now I am only taking 1,000 bits), as in the sketch after this list.
- Convert that decimal back into binary format.
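Since no language is specified, here is a minimal sketch in Python (my assumption; Python's built-in `int` handles arbitrarily large numbers, which a 50,000-bit value requires). The file name `example.png` is hypothetical.

```python
CHUNK_BITS = 50_000  # chunk size from the algorithm above

def image_to_bits(path):
    """Read an image file as raw bytes and return its bits as a string of 0s and 1s."""
    with open(path, "rb") as f:
        data = f.read()
    # format each byte as 8 binary digits, keeping leading zeros
    return "".join(format(byte, "08b") for byte in data)

def chunks(bits, size=CHUNK_BITS):
    """Yield successive slices of `size` bits (the last one may be shorter)."""
    for i in range(0, len(bits), size):
        yield bits[i:i + size]

bits = image_to_bits("example.png")  # hypothetical file name
for chunk in chunks(bits):
    decimal = int(chunk, 2)  # bits -> decimal (arbitrary-precision integer)
```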
Now the problems are:
- How can I take 50,000 bits at a time and convert them into decimal format?
- How do I convert that decimal number back to binary? (See the sketch below.)
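One sketch of the round trip for a single chunk, again assuming Python: note that `bin()` drops leading zeros, so you must remember the original chunk length and pad the result back to that width.

```python
chunk = "0001011"  # example bit string (stand-in for a 50,000-bit chunk)
n = int(chunk, 2)  # bits -> decimal: 11

# decimal -> bits, zero-padded to the original chunk width
restored = f"{n:0{len(chunk)}b}"
assert restored == chunk
```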
Here are two demos:
- Converting Binary to decimal https://repl.it/IHMY/1
- Converting decimal to binary https://repl.it/IHMY
Thanks