Computers do math with electricity.

If you're familiar with the concept of an electrical circuit, you know that it involves electricity traveling in a loop along a wire between a power source and some output (like between a battery and a lightbulb). That is essentially all a computer is: millions and millions of circuits, all of them extremely tiny. Each microchip in your computer is a collection of "integrated circuits," and those are the business ends of your computer, where all the math happens.

Remember way back in kindergarten or first grade, when you were learning numbers? There were flash cards that showed a picture of two apples, then a plus sign, then two more apples, then an equals sign and a question mark. That was math in its most basic form: 2 + 2 = 4. Your computer is doing the same thing inside its microchips (especially the Central Processing Unit, a.k.a. CPU), except it's doing math with electricity; different circuits and different amounts of electricity represent numbers and operations. A computer in kindergarten might see two lit-up lightbulbs plus two lit-up lightbulbs equals four lit-up lightbulbs! At any given moment, the electricity on any wire inside your computer's circuits is either "on" or "off," and that's why computers use binary representations of numbers for their math (1 for on and 0 for off). In short: when the electrical currents are combined, the logic built into the circuits rearranges them into new configurations, and new numbers are the result.
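If you want to see that logic in action, here's a minimal sketch in Python (the function names are mine, purely for illustration). Each "wire" is just a 1 or a 0, and a few AND, OR, and XOR gates, the same kind of logic built into the circuits, are enough to add 2 and 2:

# One "full adder" circuit: adds two one-bit inputs plus a carry bit.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # XOR gates: the sum wire
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR gates: the carry wire
    return s, carry_out

# Chain one full adder per bit (a "ripple-carry" adder), as real circuits do.
def add(x, y, width=4):
    carry, out = 0, []
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out.append(bit)
    return out[::-1]  # most significant bit first

print(add(2, 2))  # [0, 1, 0, 0] -- binary 0100, which is decimal 4

Every addition your computer does, no matter how large the numbers, is built out of this same one-bit trick repeated over and over.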

People use decimal numbers, built from the ten digits zero through nine; but because a computer's circuits only know on and off, it's easier to represent all numbers with just two digits - this is called binary notation. A binary digit is also called a "bit," and when you hear about a 64-bit processor, that means it's a collection of circuits that can read 64 binary digits at the same time - a string of digits that could look something like this:

0010101010111001010100010011101101111110101010101101010101101001
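You can check for yourself that this is an ordinary number in disguise. Here's a small Python sketch that reads those 64 characters as one base-2 value (it makes no assumption about what instruction, if any, this particular pattern would decode to on a real CPU):

# The 64-bit pattern from above, read as a plain binary number.
bits = "0010101010111001010100010011101101111110101010101101010101101001"

print(len(bits))      # 64 -- one "bit" per character
print(int(bits, 2))   # the same pattern written as an everyday decimal number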

It may not look like it means anything, but those digits represent one or more instructions the computer has to execute - something along the lines of "Add 2 and 2," or some other step in a series of instructions that tells the computer what to do next. When someone says that a CPU is "two gigahertz," they're saying its circuits tick two billion times every second (hertz just means "times per second"), and each tick can carry out a step like "Add 2 and 2." Your computer is doing something like "Add 2 and 2" around TWO BILLION times every second! You can do a lot of really cool things when you can do that much math that quickly.
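To make the idea of "executing instructions" concrete, here's a toy sketch in Python of that fetch-and-execute loop. The instruction names are made up for illustration; real instruction sets like x86 or ARM encode their instructions as bit patterns like the one above:

# A toy "CPU": fetch each instruction in order and carry it out.
program = [
    ("LOAD", 2),     # put 2 in the accumulator
    ("ADD",  2),     # add 2 to it
    ("PRINT", None), # show the result
]

accumulator = 0
for op, arg in program:
    if op == "LOAD":
        accumulator = arg
    elif op == "ADD":
        accumulator += arg
    elif op == "PRINT":
        print(accumulator)  # prints 4

A real CPU does exactly this, except the "program" is a long string of bits in memory and the loop runs billions of times per second.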
