Why Do Computers Use Binary?
Moderators: Labib, bristy1588
I saw this post somewhere else. The question is: "Why does a computer only understand something that uses binary? Why not decimal?"
Only one thing is neutral in the universe, and that is $0$.
Re: Why Do Computers Use Binary?
I think this link explains it in much details:
http://stackoverflow.com/questions/5165 ... -in-binary
Please read Forum Guide and Rules before you post.
Use $\LaTeX$; it makes our work a lot easier!
Nur Muhammad Shafiullah | Mahi
- Posts: 10
- Joined: Mon Nov 04, 2013 6:17 pm
Re: Why Do Computers Use Binary?
A computer works from 1's and 0's because the hardware understands them as simple on and off voltages. Decimal would require 10 different voltage levels on each wire, in which case there'd be much more room for error from resistor tolerances, voltage drift, etc.
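The "more room for error" point can be sketched numerically. This is a toy model, not real circuit behaviour: it assumes a 0–5 V signal range with evenly spaced levels, and `nearest_level` is a hypothetical helper that mimics how a reader circuit snaps a noisy voltage back to the nearest defined level.

```python
def nearest_level(voltage: float, levels: int, vmax: float = 5.0) -> int:
    """Snap a voltage in [0, vmax] to the nearest of `levels` evenly spaced levels."""
    step = vmax / (levels - 1)  # spacing between adjacent levels
    return round(voltage / step)

noise = 0.4  # volts of drift picked up on the wire

# Binary: the two levels are 5 V apart, so 0.4 V of noise is absorbed
# and the stored "1" (5.0 V) still reads back as 1.
assert nearest_level(5.0 - noise, levels=2) == 1

# Decimal: ten levels are only about 0.56 V apart, so the same noise
# flips the stored digit 9 (5.0 V) into a misread 8.
assert nearest_level(5.0 - noise, levels=10) == 8
```

With only two widely separated levels, the noise margin is huge; with ten levels crammed into the same range, small analog errors become digit errors.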
- emeryhen121
- Posts: 21
- Joined: Fri Jul 16, 2021 6:04 pm
Re: Why Do Computers Use Binary?
Binary is the simplest and most fundamental computer language.

Binary is a base-2 number system made up of only two digits, 0 and 1. This numbering system is the basis for all the binary code a computer uses to store and process digital data. Computers use it because each bit maps to a simple on/off state, which makes the signal far less susceptible to electrical interference. It is also the most effective way to build logic circuits. Computers do not use decimal because representing 10 different voltage levels would be very error-prone: supply voltages drift and components vary, so there is no reliable way to pin a distinct voltage to each of the ten decimal digits. Binary therefore proves to be the simpler and more robust option.
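Since base 2 uses only the digits 0 and 1, any decimal number can be rewritten in binary by repeated division by 2, collecting the remainders. Here is a minimal Python sketch of that conversion (`to_binary` is an illustrative helper name, not a standard function):

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to a binary string by repeated division by 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next least-significant bit
        n //= 2                  # integer-divide to move to the next bit
    return "".join(reversed(bits))

print(to_binary(13))  # → "1101", since 13 = 8 + 4 + 1
```

Python's built-in `bin(13)` gives the same digits (as `"0b1101"`); the hand-rolled loop just makes the repeated-division idea explicit.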