Why Do Computers Use Binary?

Discuss Computer Science and Programming-related problems

Moderators: bristy1588, Labib


Post #1 by Masum » Fri Apr 12, 2013 8:23 pm

I saw this post somewhere else. The question is: "Why does a computer only understand something that uses binary? Why not decimal?"
Only one thing is neutral in the universe, that is $0$.
Posts: 592
Joined: Tue Dec 07, 2010 1:12 pm
Location: Dhaka, Bangladesh

Re: Why Do Computers Use Binary?

Post #2 by *Mahi* » Fri Apr 12, 2013 9:34 pm

I think this link explains it in much detail:
http://stackoverflow.com/questions/5165 ... -in-binary
Please read Forum Guide and Rules before you post.

Use $\LaTeX$, it makes our work a lot easier!

Nur Muhammad Shafiullah | Mahi
Posts: 1175
Joined: Wed Dec 29, 2010 12:46 pm
Location: 23.786228, 90.354974

Re: Why Do Computers Use Binary?

Post #3 by shayanjameel08 » Sat Nov 09, 2013 10:58 am

A computer works from 1's and 0's because the hardware understands them as "on" and "off" voltage states. A decimal machine would have to distinguish 10 different voltage levels, in which case there'd be more room for error with resistors, noise, etc.
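To make the noise-margin argument above concrete, here is a small illustrative sketch (my own, not from the thread). It encodes a digit as one of N evenly spaced voltages on a 0-5 V wire, adds noise, and decodes by nearest level. With 2 levels the gap between states is huge; with 10 levels the states are packed close together and the same noise causes frequent misreads.

```python
# Sketch: why 2 voltage levels tolerate more noise than 10.
# All names and numbers here are illustrative assumptions.
import random

def transmit(digit, levels, noise, v_max=5.0):
    """Encode `digit` as one of `levels` evenly spaced voltages on a
    0..v_max wire, add uniform noise, and decode by nearest level."""
    step = v_max / (levels - 1)
    v = digit * step + random.uniform(-noise, noise)
    decoded = round(v / step)
    return max(0, min(levels - 1, decoded))  # clamp to a valid digit

random.seed(0)
noise = 0.4      # +/- volts of noise on the wire
trials = 10_000
for levels in (2, 10):
    errors = 0
    for _ in range(trials):
        d = random.randrange(levels)
        if transmit(d, levels, noise) != d:
            errors += 1
    print(f"{levels:>2} levels: {errors / trials:.1%} read errors")
```

With these numbers, binary's two states sit 5 V apart, so 0.4 V of noise should never flip a read, while ten states sit only about 0.56 V apart and the same noise often pushes a voltage past the nearest-level boundary.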
Posts: 10
Joined: Mon Nov 04, 2013 6:17 pm
