Why Do Computers Use Binary?

Discuss Computer Science and Programming related problems

Moderators: bristy1588, Labib

User avatar
Masum
Posts: 592
Joined: Tue Dec 07, 2010 1:12 pm
Location: Dhaka,Bangladesh

Why Do Computers Use Binary?

Unread post by Masum » Fri Apr 12, 2013 8:23 pm

I saw this post somewhere else. The question is: "Why does a computer only understand something that uses binary? Why not decimal?"
Only one thing is neutral in the universe, that is $0$.

User avatar
*Mahi*
Posts: 1175
Joined: Wed Dec 29, 2010 12:46 pm
Location: 23.786228,90.354974
Contact:

Re: Why Do Computers Use Binary?

Unread post by *Mahi* » Fri Apr 12, 2013 9:34 pm

I think this link explains it in much detail:
http://stackoverflow.com/questions/5165 ... -in-binary
Please read Forum Guide and Rules before you post.

Use $L^AT_EX$, It makes our work a lot easier!

Nur Muhammad Shafiullah | Mahi

shayanjameel08
Posts: 10
Joined: Mon Nov 04, 2013 6:17 pm

Re: Why Do Computers Use Binary?

Unread post by shayanjameel08 » Sat Nov 09, 2013 10:58 am

A computer works from 1's and 0's because the hardware understands them as on and off voltage levels. Decimal would have to be 10 different voltages, in which case there'd be more room for error with resistors etc.
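As an aside, the mapping between a decimal number and the on/off digits the hardware stores can be sketched by repeated division by 2. This is a minimal illustrative Python sketch, not tied to any particular hardware:

```python
def to_binary(n):
    """Convert a non-negative integer to its binary digit string
    by repeated division by 2."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next (least significant) bit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(13))  # 1101, i.e. 8 + 4 + 0 + 1
```

Each binary digit then only needs two voltage levels to represent, which is what makes the scheme robust in circuits.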

Post Reply