GG nerds, help?

I created a thread like this a while back but never read the replies before it got wiped

I know enough to know I know NOTHING about computer programming.

From what I understand it's all about lines of code being processed in various ways to perform functions within a program/OS etc

My question is, all these codes, algorithms and equations boil down to a computer knowing 1+1=2.

So how was an inanimate object taught that 1+1=2? Obviously it was waaaaaaay back in the day but how was it done? How does an artificial memory work?

Or am I missing the point completely? Phone Post

Awaits an answer from Vision

You are missing the point. The computer does not KNOW anything the way you are thinking.

In memory the representation for one, in a byte, is:

00000001

That, at the hardware level, is 8 clock pulses: 7 of them with no voltage, one of them with 5 volts. This is the binary number system, where all you have is 1 and 0.

When you add 00000001 and 00000001 you get 00000010 which is binary for 2.
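If you want to try that sum yourself, here is a quick Python sketch (just an illustration — the chip obviously isn't running Python, but the arithmetic is the same):

```python
# 00000001 + 00000001 = 00000010, the same sum the hardware does
a = 0b00000001
b = 0b00000001
total = a + b
print(format(total, '08b'))  # prints 00000010
```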

The computer is, in essence, a very complex series of light switches. It does not know anything about addition. The combination of voltage changes sets the switches into positions based on the principles of binary math.

Think of it this way: I have two light bulbs connected to two switches. Flip either switch to the on position by itself and bulb 1 lights up. Switch both on and bulb 2 lights up while bulb 1 goes dark. The switches and bulbs are just reacting to the path the electricity is flowing down. That is, in essence, how computers work.
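That two-bulb setup is what chip designers call a half adder. A Python sketch of it, using logic operators in place of the switches (just an illustration):

```python
# bulb 1 = XOR gate: lights when exactly one switch is on (the sum bit)
# bulb 2 = AND gate: lights when both switches are on (the carry bit)
def half_adder(switch_a, switch_b):
    bulb_1 = switch_a ^ switch_b  # XOR
    bulb_2 = switch_a & switch_b  # AND
    return bulb_1, bulb_2

for a in (0, 1):
    for b in (0, 1):
        print(f"switches {a},{b} -> bulbs {half_adder(a, b)}")
```

Chain a row of these together (with the carry bit wired onward) and you get the circuit that adds whole bytes like 00000001 + 00000001.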

lol again, surprised no GG said "Wrong Forum Liefeld"

BigWilliam - You are missing the point. The computer does not KNOW anything the way you are thinking.

In memory the representation for one, in a byte, is:

00000001

That, at the hardware level, is 8 clock pulses: 7 of them with no voltage, one of them with 5 volts. This is the binary number system, where all you have is 1 and 0.

When you add 00000001 and 00000001 you get 00000010 which is binary for 2.

The computer is, in essence, a very complex series of light switches. It does not know anything about addition. The combination of voltage changes sets the switches into positions based on the principles of binary math.

Think of it this way: I have two light bulbs connected to two switches. Flip either switch to the on position by itself and bulb 1 lights up. Switch both on and bulb 2 lights up while bulb 1 goes dark. The switches and bulbs are just reacting to the path the electricity is flowing down. That is, in essence, how computers work.


great explanation.  voted up.

COOL EXPLANATION. I DIDN'T KNOW THAT'S HOW THEY WORKED.

It is kind of funny, people think of computers as these magic thinking boxes, but when you boil it down it really is just a bunch of light switches. You can think of binary as Morse code, sort of: 1 and 0 being used to spell out numbers, words, instructions, and eventually music and pictures when you get enough of them.
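The Morse code analogy is easy to see in Python: every character has an agreed-on number (ASCII), and that number is just stored as 1s and 0s.

```python
# spell "HI" out in binary, one byte per letter
text = "HI"
bits = ' '.join(format(ord(c), '08b') for c in text)
print(bits)  # prints 01001000 01001001
```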

YEAH I KNEW ABOUT THE BINARY CODE BUT I DIDN'T KNOW HOW THE COMPUTER HARDWARE ACTUALLY TRANSLATED IT. THAT'S FUCKING COOL AS HELL.

I've been doing a bit of reading so I'm pretty sure I understand how binary works now, and how you can get any number you want with just 1s and 0s and 8-bit systems and stuff, but what I don't get is the leap from a load of electrical 'switches' being switched to it performing tasks for a user Phone Post

pretty sure there's a wizard in there doing magic shit

Billy Joe Rottoncrotch - The word "geek" or "nerd" is pretty offensive in this day and age.

The politically correct term is "sexually-challenged virgin shut-in."

Try to be a little more respectful from now on.

It's ok I am one, I can use either term freely, that's our word Phone Post

Nakedwelshman - 

I've been doing a bit of reading so I'm pretty sure I understand how binary works now, and how you can get any number you want with just 1s and 0s and 8-bit systems and stuff, but what I don't get is the leap from a load of electrical 'switches' being switched to it performing tasks for a user Phone Post


You have to redefine your idea of what 'performing tasks' is. On top of the 'switches' you have instruction sets; these allow commands to be interpreted between the hardware layer (the switches) and the software layer. These exist in multiple layers, and getting into them all would take forever and a day; there are literally entire books devoted to this kind of thing in depth.

The short version is that the strings of 1s and 0s are passed through the instructions back to software, which interprets them and gives results. For instance, let's look at a video game.

The game fires up and passes strings of information to the hardware: screen coordinates and colors, telling it what to paint and where. The information passes through several layers of software (DirectX, the operating system, the device driver, etc.) to the hardware. The final layer of software tells the machine which switches to set in the video card, which passes signals to your monitor. The monitor paints the pixels as directed to make the picture.

Every time you press a key or move the mouse a signal goes from that device to the main hardware with a number string indicating what device, what was done, etc. That information passes back up through the layers to the game. The game then responds to the input and the whole thing starts over again.

So there is not a single simple answer that really covers it. The best summary I can give is: the switches generate binary numbers, and those numbers are translated by the layers of software until they become something you and I can see and interact with. Don't think of a computer as 'understanding' anything, though. It doesn't. It is all translations of binary math according to very detailed instructions.
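To make the 'very detailed instructions' idea concrete, here is a toy instruction set interpreted in Python. The opcodes are made up for illustration; a real CPU does the same fetch-and-execute dance in silicon, not in Python:

```python
# toy instruction set: each instruction is (opcode, operand)
LOAD, ADD, SHOW = 0b01, 0b10, 0b11  # made-up opcodes for this sketch

program = [
    (LOAD, 1),  # put 1 in the register
    (ADD, 1),   # add 1 to it
    (SHOW, 0),  # display the result
]

register = 0
for opcode, operand in program:
    if opcode == LOAD:
        register = operand
    elif opcode == ADD:
        register += operand
    elif opcode == SHOW:
        print(register)  # prints 2
```

The machine never 'understands' that it added 1 and 1; it just follows the numbered instructions, one after another, and out comes 2.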