Scientific Forums




> How Computers Work
Confused2
Posted: Jan 20 2010, 10:50 PM


Retreating member
*****

Group: Validating
Posts: 4893
Joined: 8-November 05

Positive Feedback: 63.06%
Feedback Score: -43


Here is the datasheet for a binary adder:-
http://www.datasheetcatalog.org/datasheets/166/375566_DS.pdf

Here is what the symbols mean:-
http://en.wikipedia.org/wiki/Logic_gate

Edit .. more on logic at component level:-
http://en.wikipedia.org/wiki/Transistor%E2...ransistor_logic

This post has been edited by Confused2 on Jan 20 2010, 10:52 PM


--------------------
Anything completed in less than twenty years is likely to be hurried and unsatisfying.
Top
SirLee
Posted: Jan 21 2010, 08:20 AM


Newbie
*

Group: Members
Posts: 5
Joined: 17-January 10

Positive Feedback: 100%
Feedback Score: 1


I must admit to my shortcomings when it comes to learning. I am a slow learner, and perhaps even slower to remember. I believe I caught onto most of what was said, with the exception of a few things. Firstly, you imply that power is constant across the transistors throughout the system, which in itself is logical and brings no great shock; it is when you move on to state that they carry signals, whose power may vary, that I get befuddled. Now, this is a first for me, as I've never heard a thing about "signals" in regard to transistors. Perhaps I'm just being dull-witted here as per usual, or have missed a major piece of something. I'd like to know what kind of signal this is and how it is affected by input power, if you wouldn't mind. Perhaps the explanation of one will explain both.

Anyways, moving on, my next question regards the functions. So I've got a tenuous grasp on what XOR, AND, OR, and ADD are and in what ways they operate, if not an understanding of why or how at the micro level. I seem to be getting thrown off track during the bit about Carry, Cin and Cout, though. As I see it, and also as was explained by you, Carrying bits comes out the same way as ANDing them. I even thought I understood for a moment how Cin and Cout fit into things, that is until I progressed further down your list of examples. Here is one such part:
A + B = S1 + Cin = Sfinal + Cout
0 + 0 = 0 + 0 = 0 + Cout = 0
Now, this first line here seemingly made sense. I could justify 0 + 0 = 0, and I could see that beneath the Cin column the 0 seemed reasonable as well, at first... I then studied the second row and noticed an inconsistency, and adopted two different theories to see if I could make things fit, which I could not. The first theory was that your "CarryIn" is the last carry value from a prior line, the "CarryOut". It seemed to fit with what you said as well, but that didn't work, seeing as the second row of "0 + 0 = 0 + 1 = 1 + Cout = 0" contained a "1" for the Cin, whereas the prior row had clearly contained a "0" for the Cout.

I then tried reasoning that the Cin could only be a 1 if there was a 1 + 1, which clearly there wasn't anywhere, at least working from the top rows down. I tried a similar approach working from the bottom rows to the top, but got stuck again, though not in quite the same way. Seeing as you noted why the first three bottom rows of Couts came out as 1's, I didn't run into a problem as early, but it was in essence the same: I couldn't figure out where the Cin in the third row from the bottom came from. I appreciate your help thus far, and your detailed explanations are excellent. My apologies if my explanation of why I don't understand seems fragmented.
Top
adoucette
Posted: Jan 21 2010, 02:35 PM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (SirLee @ Jan 21 2010, 03:20 AM)
I must admit to my shortcomings when it comes to learning. I am a slow learner, and perhaps even slower to remember. I believe I caught onto most of what was said, with the exception of a few things. Firstly, you imply that power is constant across the transistors throughout the system, which in itself is logical and brings no great shock; it is when you move on to state that they carry signals, whose power may vary, that I get befuddled. Now, this is a first for me, as I've never heard a thing about "signals" in regard to transistors. Perhaps I'm just being dull-witted here as per usual, or have missed a major piece of something. I'd like to know what kind of signal this is and how it is affected by input power, if you wouldn't mind. Perhaps the explanation of one will explain both.

I've never tried to explain this in this much detail, so bear with me; it's sometimes not easy to give the best examples.

Ok,

Transistors have 3 leads.

One lead has power coming in.
One lead can carry a signal current.
One lead is the output.

A signal current can be anything over the minimum needed for the transistor to function reliably.

In essence, with NO voltage on the Signal line, the power cannot flow through the transistor, and thus there is no current on the output lead.

This is the equiv of 0 to the computer.

When sufficient voltage is applied to the signal line, then the transistor allows current to flow from the input to the output side.

This is the equiv of 1 to the computer.

Just as an aside, in the world of computers, transistors are primarily used as a binary switch with the output being interpreted as a 0 or a 1.

Transistors though can also be designed and used as amplifiers. In this case, the output is proportional to the signal. Thus you will find them used in audio amplifiers, for instance, to turn a small signal (output from the receiver) to a large signal needed by the speakers.
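If it helps to see the switch behaviour as code, here's a toy sketch in Python. It's purely a model, nothing like the real physics, and the 0.7 threshold is just an illustrative number:

```python
# Toy model of a transistor used as a binary switch:
# power is always present on the input lead; the signal lead
# decides whether it reaches the output lead.

def transistor(signal_voltage, threshold=0.7):
    """Return 1 (current flows input -> output) if the signal voltage
    is at or above the minimum needed to switch, else 0."""
    return 1 if signal_voltage >= threshold else 0

print(transistor(0.0))  # no signal -> 0 (the computer's "0")
print(transistor(5.0))  # sufficient signal -> 1 (the computer's "1")
```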

QUOTE
Anyways, moving on, my next question regards the functions. So I've got a tenuous grasp on what XOR, AND, OR, and ADD are and in what ways they operate, if not an understanding of why or how at the micro level. I seem to be getting thrown off track during the bit about Carry, Cin and Cout, though. As I see it, and also as was explained by you, Carrying bits comes out the same way as ANDing them. I even thought I understood for a moment how Cin and Cout fit into things, that is until I progressed further down your list of examples. Here is one such part:
A + B = S1 + Cin = Sfinal + Cout
0 + 0 = 0 + 0 = 0 + Cout = 0
Now, this first line here seemingly made sense. I could justify 0 + 0 = 0, and I could see that beneath the Cin column the 0 seemed reasonable as well, at first... I then studied the second row and noticed an inconsistency, and adopted two different theories to see if I could make things fit, which I could not. The first theory was that your "CarryIn" is the last carry value from a prior line, the "CarryOut". It seemed to fit with what you said as well, but that didn't work, seeing as the second row of "0 + 0 = 0 + 1 = 1 + Cout = 0" contained a "1" for the Cin, whereas the prior row had clearly contained a "0" for the Cout.

I then tried reasoning that the Cin could only be a 1 if there was a 1 + 1, which clearly there wasn't anywhere, at least working from the top rows down. I tried a similar approach working from the bottom rows to the top, but got stuck again, though not in quite the same way. Seeing as you noted why the first three bottom rows of Couts came out as 1's, I didn't run into a problem as early, but it was in essence the same: I couldn't figure out where the Cin in the third row from the bottom came from. I appreciate your help thus far, and your detailed explanations are excellent. My apologies if my explanation of why I don't understand seems fragmented.


My Bad.

I should have included a better explanation.

These blocks (for AND, OR and XOR):

CODE
If we AND two bits together we get
A AND B = S
1 AND 0 = 0
1 AND 1 = 1
0 AND 1 = 0
0 AND 0 = 0


Just to show the result of doing the operation on all the possible bit combinations of A and B.

Each line is independent.

When we Add:

CODE
But if we ADD two bits together we get
A + B = S and Carry
1 + 0 = 1 Carry = 0
1 + 1 = 0 Carry = 1
0 + 1 = 1 Carry = 0
0 + 0 = 0 Carry = 0


You are absolutely correct in that only when one is adding two 1 bits together do we get a Carry bit.

When you are adding a SERIES of bits together, one has to include a Carry In bit from the previous pair of bits, and you get a Carry Out bit for the next pair (moving right to left, just like decimal math)

The Carry In bit for the right most set of bits is always 0.
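As a sketch in Python, here is one column of that addition; this just models the table above, not the circuitry:

```python
def add_bits(a, b, carry_in=0):
    """Add two bits plus a carry-in; return (sum_bit, carry_out)."""
    total = a + b + carry_in
    return total % 2, total // 2

# The four rows of the ADD table, with Carry In = 0:
for a, b in [(1, 0), (1, 1), (0, 1), (0, 0)]:
    s, cout = add_bits(a, b)
    print(a, "+", b, "=", s, "Carry =", cout)
```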

With this explanation, go back and see if you can get a better grasp on AND, OR and XOR.

No sense going on until we have these nailed down.

Arthur

This post has been edited by adoucette on Jan 21 2010, 03:33 PM


--------------------
"We cannot prove that those are in error who tell us that society has reached a turning point; that we have seen our best days. But so said all before us, and with just as much apparent reason. On what principle is it that, when we see nothing but improvement behind us, we are to expect nothing but deterioration before us?"

Thomas B. Macaulay
Top
SirLee
Posted: Jan 27 2010, 03:37 AM


Newbie
*

Group: Members
Posts: 5
Joined: 17-January 10

Positive Feedback: 100%
Feedback Score: 1


Forgive my long hiatus. I read your explanation of the transistor, and still felt I was lacking a few details. I youtubed an animated explanation of the transistor and this is what I found. I've gained a bit more insight from combining your explanation and this video, but I feel I'm left with more questions than I began with... Firstly, if the whole concept is based on filling the base with electrons from the input until there are no openings left, and then applying a positive charge to the base via the middle lead in order to create more openings for new electrons, allowing electrons or 'current' to flow from input to base to output, where does this positive charge, or "signal", come from?

These next two questions more or less tie into the above; if the input of a transistor is connected to the preceding transistor in the circuit, and the output is connected to the succeeding transistor, what is the middle signal lead connected to? You mentioned that the power throughout the system is constant, which seems to be true considering the nature of a power supply and its role. If this is so, where does the opportunity arise for a transistor to change its state if it's being supplied with a constant flow of power? As I said, these questions are integrated into one another, but I'd like to ask them separately for the sake of covering all angles.

QUOTE
When you are adding a SERIES of bits together, one has to include a Carry In bit from the previous pair of bits, and you get a Carry Out bit for the next pair (moving right to left, just like decimal math)

The Carry In bit for the right most set of bits is always 0.

As for the logic arguments, I'm afraid I've become even more lost in a distortion of facts here... I understand the concept of carrying in bits over from a prior set of bits to be used as the Carry In, and that the result of each line yields a Carry Out bit for the next set of addition. Why is it that the right most Carry In bit is always 0, though? Wouldn't it be the left most, seeing as when you first begin there are no bits to be carried in?

So for example, you have two variables: A+B. A just so happens to equal 1, and B so happens to equal 0. Now, when you add them, you'll obviously get 1, your S1 value. Your Cin value would be 0, seeing as there have been no Cout values before now to carry in. The following is how it would look, cutting off the end Cout value:
A+B=S1+Cin=Sfinal
1 + 0 = 1 + 0 = 1
I suppose my next question is how exactly are you acquiring the Cout value? Is it the carry value for the first addition/second addition, and if neither yields a 1, it defaults to 0? Out of all of this, I haven't seen the reason for carry values as well. Perhaps that is only after I learn how to use them, though. I'll continue to look around and see what I can find.

This post has been edited by SirLee on Jan 27 2010, 03:38 AM
Top
adoucette
Posted: Jan 27 2010, 02:34 PM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (SirLee @ Jan 26 2010, 10:37 PM)
Forgive my long hiatus. I read your explanation of the transistor, and still felt I was lacking a few details. I youtubed an animated explanation of the transistor and this is what I found. I've gained a bit more insight from combining your explanation and this video, but I feel I'm left with more questions than I began with... Firstly, if the whole concept is based on filling the base with electrons from the input until there are no openings left, and then applying a positive charge to the base via the middle lead in order to create more openings for new electrons, allowing electrons or 'current' to flow from input to base to output, where does this positive charge, or "signal", come from?


Very simply, the signal represents one BIT.
If the BIT in question is on, a signal current is applied to the Control segment of the transistor, and thus power flows from the Input side to the Output side.

Where does the signal come from? Well, typically memory, or it's passed on as part of a logic circuit, as we were discussing in the ADDER example.

Memory uses both a Transistor as the switch and a Capacitor (which acts like a small battery).

In dynamic random access memory (DRAM), a transistor and a capacitor are paired to create a memory cell (a single bit).

The capacitor actually holds the bit of information and the transistor acts as a switch that lets the control circuitry on the memory chip read the capacitor or change its state.

To store a 1 in the memory cell, the Capacitor is charged with electrons.
To store a 0 the Capacitor is discharged.

Since the capacitors don't hold their charge very long (which is why when you turn off your computer the memory resets to zeros), each memory cell has to be read and rewritten, or refreshed, on a regular cycle. In between refreshes, the computer can read or write any of the memory locations for its own needs. This is why faster memory speeds (quoted in millions of cycles per second) are important for overall computer processing performance.
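If it helps, here's a very rough model of one DRAM cell in Python. The leak rate and the read threshold are made-up numbers, just to show the refresh idea:

```python
class DRAMCell:
    """Toy DRAM cell: a capacitor holds the bit, and it must be
    refreshed (re-written) before the charge leaks away."""
    def __init__(self):
        self.charge = 0.0          # capacitor charge, 1.0 = full

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def read(self):
        return 1 if self.charge > 0.5 else 0

    def leak(self):
        self.charge *= 0.8         # made-up leak rate per tick

    def refresh(self):
        self.write(self.read())    # read the bit and re-write it

cell = DRAMCell()
cell.write(1)
cell.leak(); cell.refresh()        # refreshed in time: the bit survives
print(cell.read())                 # -> 1
for _ in range(10):                # never refreshed: the charge decays
    cell.leak()
print(cell.read())                 # -> 0
```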

QUOTE
These next two questions more or less tie into the above; if the input of a transistor is connected to the preceding transistor in the circuit, and the output is connected to the succeeding transistor, what is the middle signal lead connected to? You mentioned that the power throughout the system is constant, which seems to be true considering the nature of a power supply and its role. If this is so, where does the opportunity arise for a transistor to change its state if it's being supplied with a constant flow of power?


If you have multiple transistors connected in a logic circuit, the first one's control lead will equate to a bit fetched from memory. The other's control inputs can be from other memory locations or from the output of the logic circuit itself.

Since the memory locations can be either 0 or 1 and the output of any logic circuit can be 0 or 1, there is no constant flow of power to the CONTROL layer of the transistors.

Arthur


Top
adoucette
Posted: Jan 27 2010, 03:27 PM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (SirLee @ Jan 26 2010, 10:37 PM)
I understand the concept of carrying in bits over from a prior set of bits to be used as the Carry In, and that the result of each line yields a Carry Out bit for the next set of addition. Why is it that the right most Carry In bit is always 0, though? Wouldn't it be the left most, seeing as when you first begin there are no bits to be carried in?

You add bits, just like you do decimal numbers from RIGHT to LEFT.

Thus the rightmost Carry In bit starts at 0.

CODE

So to add these two binary numbers together:

0 1 1 0 0
0 0 1 0 1

Starting at the rightmost bit we get:

0 + 1 = 1 + Carry In 0 = 1 (Sum) with a Carry Out = 0 <== rightmost bits; Carry In always starts at 0

0 + 0 = 0 + Carry In 0 = 0 (Sum) with a Carry Out = 0 <== Carry In from 1st addition

1 + 1 = 0 + Carry In 0 = 0 (Sum) with a Carry Out = 1 <== Carry In from 2nd addition

1 + 0 = 1 + Carry In 1 = 0 (Sum) with a Carry Out = 1 <== Carry In from 3rd addition

0 + 0 = 0 + Carry In 1 = 1 (Sum) with a Carry Out = 0 <== Carry In from 4th addition

Thus the answer is (the Sum bits, the first one computed placed rightmost):

1 0 0 0 1



Note: always end by adding two zero bits together, as in the example above, so a final carry has somewhere to overflow into.
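The same right-to-left procedure sketched in Python, operating on strings of bits rather than real gates:

```python
def ripple_add(a_bits, b_bits):
    """Add two equal-length binary strings, rightmost column first,
    threading the Carry Out of each column into the next Carry In."""
    carry = 0                      # the rightmost Carry In is always 0
    sum_bits = []
    for a, b in zip(reversed(a_bits), reversed(b_bits)):
        total = int(a) + int(b) + carry
        sum_bits.append(str(total % 2))   # the Sum bit
        carry = total // 2                # the Carry Out
    return "".join(reversed(sum_bits)), carry

print(ripple_add("01100", "00101"))  # -> ('10001', 0)
```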

QUOTE (SirLee)
So for example, you have two variables: A+B. A just so happens to equal 1, and B so happens to equal 0. Now, when you add them, you'll obviously get 1, your S1 value. Your Cin value would be 0, seeing as there have been no Cout values before now to carry in. The following is how it would look, cutting off the end Cout value:
A+B=S1+Cin=Sfinal
1 + 0 = 1 + 0 = 1


Correct.


QUOTE (SirLee)
how exactly are you acquiring the Cout value? Is it the carry value for the first addition/second addition, and if neither yields a 1, it defaults to 0?


Yes, Cout is 1 if either the first or the second addition creates a carry. If neither does, then Cout is 0.

QUOTE (SirLee)
I haven't seen the reason for carry values as well


You can't add two numbers together without them, and the carry is also necessary for logic. For instance, it is used in determining the AND condition, i.e. if you add two bits together, the Carry bit is the logical answer to the AND test.

CODE


0 + 0 = 0, Carry = 0
0 AND 0 = 0

0 + 1 = 1, Carry = 0
0 AND 1 = 0

1 + 1 = 0, Carry = 1
1 AND 1 = 1
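You can verify this mechanically; in Python, ^ is XOR and & is AND:

```python
# Run the one-bit ADD over all four bit pairs and check that the
# Sum column matches XOR and the Carry column matches AND.
for a in (0, 1):
    for b in (0, 1):
        total = a + b
        assert total % 2 == a ^ b    # Sum bit equals a XOR b
        assert total // 2 == a & b   # Carry bit equals a AND b
print("Sum = XOR and Carry = AND for all four combinations")
```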


Arthur




Top
SirLee
Posted: Jan 30 2010, 02:09 AM


Newbie
*

Group: Members
Posts: 5
Joined: 17-January 10

Positive Feedback: 100%
Feedback Score: 1


I believe I just had quite a few "aha" moments. ;D It took me a couple tries, but I finally see a bit of the bigger picture now. In response to your earlier problem-
QUOTE
You want to add two values, one bit at a time.
Add A = 0011 to B = 0110 to come up with a new value S (or Sum).
You can use an intermediate field called Carry.

The way to do so using your carry system would be as such:
0+1= 1+0= 1 Cout=0
1+1= 0+0= 0 Cout=1
1+0= 1+1= 0 Cout=1
0+0= 0+1= 1 Cout=0
0+0= 0+0= 0 Cout=0
Answer:
01001

I had the correct answer before as you said, but I had obviously failed to do it the computer's way. Now, in response to the other problem you posed to me of doing the same addition purely with XORs and ANDs...:

(Note: C1 is the result of A AND B, and C2 is the result of S1 AND Cin as can be seen below. Also, ignore the hyphens, I only used them because of spacing issues.)
A ---- B -- S1 --- Cin - SF -- A --- B - C1 -- S1 ---- Cin - C2 - C1 -- C2 -- Cout
0 XOR 1 = 1 XOR 0 = 1 --- 0 AND 1 = 0 --- 1 AND 0 = 0 --- 0 XOR 0 = 0
1 XOR 1 = 0 XOR 0 = 0 --- 1 AND 1 = 1 --- 0 AND 0 = 0 --- 1 XOR 0 = 1
1 XOR 0 = 1 XOR 1 = 0 --- 1 AND 0 = 0 --- 1 AND 1 = 1 --- 0 XOR 1 = 1
0 XOR 0 = 0 XOR 1 = 1 --- 0 AND 0 = 0 --- 0 AND 1 = 0 --- 0 XOR 0 = 0
0 XOR 0 = 0 XOR 0 = 0 --- 0 AND 0 = 0 --- 0 AND 0 = 0 --- 0 XOR 0 = 0

It seems to work, but I can't help but feel it's wrong. As you can see, I just basically replaced the ADDs with XORs and then added a sort of "checker" at the end for the Couts. Perhaps I need to include an extra step for the derivation of my Cins, but I couldn't quite figure out how to do so using just XOR and AND. Thoughts?

QUOTE
Very simply, the signal represents one BIT.
If the BIT in question is on, there is a signal current is applied to the Control segment of the transistor and thus power flows from the Input side to the Output side.

Where does the signal come from, well typically memory, or it's passed on as part of a logic circuit, as we were discussing in the ADDER example.

I understood that the signal, and the third lead it's passed through, is in essence a 'bit', seeing as it's integral to determining whether the transistor lets current flow or not. That's my fault for not explaining that.

QUOTE
Memory uses both a Transistor as the switch and a Capacitor (acts like a small battery)

In dynamic random access memory (DRAM), a transistor and a capacitor are paired to create a memory cell (a single bit).

The capacitor actually holds the bit of information and the the transistor acts as a switch that lets the control circuitry on the memory chip read the capacitor or change its state.
So are you saying that each Transistor has its own Capacitor? O.o Also, I was kinda hoping to learn how the third lead on a transistor, the "switch" lead in essence, is physically connected. That's for another time, though; I'll do my own research to try and avoid constantly asking you every little thing.

QUOTE

If you have multiple transistors connected in a logic circuit, the first one's control lead will equate to a bit fetched from memory. The other's control inputs can be from other memory locations or from the output of the logic circuit itself.

Since the memory locations can be either 0 or 1 and the output of any logic circuit can be 0 or 1, there is no constant flow of power to the CONTROL layer of the transistors.

This makes A LOT more sense to me than it did before, though the question I had asked was very similar to the prior one: it regarded the physical connection of the third leads themselves.

This post has been edited by SirLee on Jan 30 2010, 02:17 AM
Top
adoucette
Posted: Jan 30 2010, 06:41 AM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (SirLee @ Jan 29 2010, 09:09 PM)
I believe I just had quite a few "aha" moments. ;D It took me a couple tries, but I finally see a bit of the bigger picture now.

Great.
I'm trying to give you just enough so you can take it to the next step and get that "aha!" moment.

QUOTE
in response to the other problem you posed to me of doing the same addition purely with XORs and ANDs...:

(Note: C1 is the result of A AND B, and C2 is the result of S1 AND Cin as can be seen below. Also, ignore the hyphens, I only used them because of spacing issues.)
A ---- B -- S1 --- Cin - SF -- A --- B - C1 --  S1 ---- Cin - C2 - C1 -- C2 -- Cout
0 XOR 1 = 1  XOR 0 = 1 --- 0 AND 1 = 0  --- 1 AND 0 = 0  --- 0 XOR 0 =  0
1 XOR 1 = 0  XOR 0 = 0 --- 1 AND 1 = 1  --- 0 AND 0 = 0  --- 1 XOR 0 =  1
1 XOR 0 = 1 XOR 1 = 0 --- 1 AND 0 = 0 --- 1 AND 1 = 1 --- 0 XOR 1 = 1
0 XOR 0 = 0 XOR 1 = 1 --- 0 AND 0 = 0 --- 0 AND 1 = 0 --- 0 XOR 0 = 0
0 XOR 0 = 0  XOR 0 = 0 --- 0 AND 0 = 0  --- 0 AND 0 = 0  --- 0 XOR 0 =  0

It seems to work, but I can't help but feel it's wrong. As you can see, I just basically replaced the ADDs with XORs and then added a sort of "checker" at the end for the Couts. Perhaps I need to include an extra step for the derivation of my Cins, but I couldn't quite figure out how to do so using just XOR and AND. Thoughts?


Good job, that's correct!

Note that though I limited you to XOR and AND, the last step could also be done with an OR.

So this is really a key step in learning how computers work: using basic electrical circuits with transistors acting as switches, we can create logic gates that test two bits for the OR, AND and XOR conditions, and, arranged properly, those gates let us do math using logic.
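That column-by-column arrangement is usually called a full adder; here's a sketch of it in Python using the bit operators (^ is XOR, & is AND, | is OR):

```python
def full_adder(a, b, cin):
    """One-bit full adder built only from gate operations on bits 0/1."""
    s1 = a ^ b            # first half: sum of the two input bits
    c1 = a & b            # first half: carry of the two input bits
    sf = s1 ^ cin         # final sum, folding in the Carry In
    c2 = s1 & cin         # carry produced by folding in the Carry In
    cout = c1 | c2        # c1 XOR c2 also works: both can't be 1 at once
    return sf, cout

print(full_adder(1, 0, 1))  # -> (0, 1)
```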

It also brings up the next point: in doing that addition, each step was one logical operation, and each had to be done in sequence.

Which brings us to the heart of a computer, its CLOCK.

A computer has within it a clock that generates a signal every nth of a second.

Each pulse is equal to one computer cycle, and during that cycle, the next logical operation would be performed. Thus the Cout bit from one step of the ADD can be used as the Cin for the next step.

You see this clock speed referenced when a computer is listed as running at, say, 3.2 GHz, which means its internal clock beats 3.2 billion times per second.



There are some other basic operations that can be done using logic.

For instance, one can clear a set of bits to all zeros, regardless of their original value, by:

First OR the bits with a set of all one bits.
Then XOR the bits with a set of all one bits.

Field to be set to zero
0101
1111 <= OR with all 1 bits
1111 <= result
1111 <= XOR with all 1 bits
0000 <= Final Result.
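In Python, with the same 4-bit value (the 0b prefix is binary notation):

```python
# Clear a field to zero using only OR and XOR with an all-ones mask:
field = 0b0101
step1 = field | 0b1111       # OR with all 1s -> 1111, whatever field held
step2 = step1 ^ 0b1111       # XOR with all 1s flips every bit -> 0000
print(format(step2, "04b"))  # -> 0000
```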

Or one can Swap a set of bits with another set of bits using just XOR
Field A = 1100
Field B = 0101

1100 <= Field A
0101 <= Field B
1001 <= B XOR A, Result replaces what was in Field B.

1001 <= new field B
1100 <= Field A
0101 <= A XOR B, Result replaces what was in Field A

0101 <= new field A value
1001 <= new field B value (result from first XOR)
1100 <= B XOR A, result replaces what was in field B

End result
Field A = 0101
Field B = 1100
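The same three-step swap in Python:

```python
# Swap two fields using only XOR, no temporary storage:
a = 0b1100   # Field A
b = 0b0101   # Field B
b = b ^ a    # b is now 1001
a = a ^ b    # a is now 0101 (the old B)
b = b ^ a    # b is now 1100 (the old A)
print(format(a, "04b"), format(b, "04b"))  # -> 0101 1100
```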

Think about these types of logical operations for a minute and you will see the computer can use the same OR, AND and XOR logic to move data from one memory location to another.

Similarly one can use these to compare two locations to see if they are the same or not (basic IF A EQUAL B logic statement)
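For the IF A EQUAL B test, a sketch: XOR of two fields leaves no 1 bits exactly when every bit matches.

```python
def equal(a, b):
    """Two fields are equal iff XORing them leaves no 1 bits."""
    return (a ^ b) == 0

print(equal(0b1100, 0b1100))  # -> True
print(equal(0b1100, 0b0101))  # -> False
```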

The next aha! moment should be to realize that if one can do logic tests and math and move data around, then there really is no limit to what you can do programmatically with these basic constructs.

The fact is, the computer is never doing anything more complicated than the few logical operations we just went over. It's really just doing billions of these simple operations every second.

But these operations are SO basic, that to program a computer to do things at this level would take forever.

Just look at the steps involved in Adding a few binary bits together.

So the next level of complexity is that the Central Processing Unit or CPU of the computer has a bunch of these low level logic gates grouped together to perform higher level functions.

Thus a computer has sets of logic gates arranged so it can do all the basic math operations.

Thus allowing a higher level instruction to the CPU to be: ADD 32-bit Field A to 32-bit Field B and put the result in B.

Or MOVE 48 bytes starting at location A to location B.

Or COMPARE the two fields A and B and determine if they are the same.

In this way, the logic gates are arranged in groups which equate to functions that we are more comfortable dealing with, and which allow us to make the computer do something by programming it.

Even these commands can be packaged into groups of instructions that operate at an even higher level; thus we can have very high-level languages that operate at an object level, so we can program the computer using even higher levels of abstraction.

(This is way beyond the basics, but take a look at this for an idea of what it equates to: http://en.wikipedia.org/wiki/Java_(programming_language) )

The net of it, though, is that we can use high-level computer languages whose logical constructs are relatively simple for a person to understand; the program is then broken down (compiled, using another program) into the much smaller, more basic instructions the computer understands.


Arthur



Top
adoucette
Posted: Jan 30 2010, 02:18 PM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (SirLee @ Jan 29 2010, 09:09 PM)
So are you saying that each Transistor has its own Capacitor? O.o Also, I was kinda hoping to learn how the third lead on a transistor, the "switch" lead in essence, is physically connected. That's for another time, though; I'll do my own research to try and avoid constantly asking you every little thing.


Not every transistor.
I mentioned it because it's used in DRAM memory, and that's the most common memory in use today.

Not trying to go too deep, but one can build a circuit, using just logic, to store the on/off state. The problem is that doing so requires more circuitry and thus you get less density, meaning fewer bits per chip, and higher power requirements, thus more heat/less battery life etc.
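For the curious, that logic-only storage circuit is the classic SR latch: two cross-coupled NOR gates. A toy Python model of the feedback (ignoring real timing, just iterating until it settles) looks like this:

```python
def sr_latch(s, r, q=0):
    """Cross-coupled NOR gates: S=1 sets Q to 1, R=1 resets Q to 0,
    and S=R=0 holds whatever bit was stored before."""
    nq = 1 - q
    for _ in range(4):                 # let the feedback loop settle
        q = 0 if (r or nq) else 1      # NOR(R, not-Q)
        nq = 0 if (s or q) else 1      # NOR(S, Q)
    return q

q = sr_latch(1, 0)        # set: q becomes 1
q = sr_latch(0, 0, q)     # hold: still 1 with no inputs active
q = sr_latch(0, 1, q)     # reset: q becomes 0
print(q)                  # -> 0
```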

Arthur


Top
ElaineAbigayle
Posted: Apr 20 2010, 08:29 AM


Newbie
*

Group: Members
Posts: 3
Joined: 20-April 10

Positive Feedback: 100%
Feedback Score: 2


Computers, the ones we know and love, have not been around all that long. The first home personal computer was not sold until 1977. We have come a long way since then. Did you know that in 1983 there were approximately 2 million personal computers in use in the United States? Just 10 years later, in 1993, the number had jumped to more than 90 million, and in 2009 it was estimated at over 200 million.

Computers today are small, fast, reliable, and extremely useful. Back in 1977 that really was not the case. However, both operate in basically the same way: they receive data, store data, process data, and then output data, similar to the way our own brain functions. This article deals with those 4 functions: Memory, Processing, Input, and Output.
Memory
Let's look at computer memory first. Storage in a computer comes in many different sizes, types and shapes. However, there are two basic categories: short-term and long-term. A typical computer contains numerous types of memory including RAM, ROM, virtual, cache, and various long-term storage devices. Each type of computer memory serves a specific function and purpose.

Computer memory is measured in bytes. A single byte is made up of eight 1's and 0's, called bits. These eight 0's and 1's are the way the computer communicates and stores information. With each keystroke or character, a byte of memory is used. In another article you will learn more about bits and how the computer thinks.

Measuring Memory

Term   Abbreviation   Value
Kilo   K, KB          1,024 bytes (Thousand)
Mega   M, MB, Meg     1,048,576 bytes (Million)
Giga   G, GB          1,073,741,824 bytes (Billion)
Tera   T, TB          1,099,511,627,776 bytes (Trillion)

Here is another way of looking at the measurement of memory:

Measuring Bytes

8 bits         = 1 byte
1000 bytes     = 1 kilobyte
1000 kilobytes = 1 megabyte
1000 megabytes = 1 gigabyte
1000 gigabytes = 1 terabyte
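Those binary "Measuring Memory" values are just powers of 1024, which is easy to check in Python:

```python
# Each step up the scale multiplies by 1024 (2 to the 10th power):
KB = 1024
print(KB ** 2)   # -> 1048576       (Mega)
print(KB ** 3)   # -> 1073741824    (Giga)
print(KB ** 4)   # -> 1099511627776 (Tera)
```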


ROM
ROM, or read-only memory, is permanent, long-term, nonvolatile memory. Nonvolatile means it doesn't disappear when the computer is shut off. It also cannot be erased or changed in any way. However, there is a type of ROM called PROM that can be altered; the P stands for programmable. ROM's purpose is to store the basic input/output system (BIOS) that controls the start-up, or boot, process.

RAM
RAM, or random-access memory, unlike ROM, works only while the computer is turned on. This memory is vital to the computer because it holds the moment-by-moment workload of the machine. The first thing loaded into RAM is the OS (operating system), which in most cases is Windows 95. Next into RAM might go a game, an Internet browser, or some other type of software that you want to use.

Early personal computers needed only about 64K of RAM. Today that number is drastically higher. With photos, sounds, and even movies going into RAM, the amount needed is now in the millions of bytes. The computer I am currently using has 80 MB, or roughly 80,000K, of RAM.

Multitasking has put more demand on RAM in the past few years. Multitasking is the ability to run more than one program at the same time. For instance, many people like to run Netscape Communicator along with their word processing software. This means you need lots of RAM to hold both programs.

Other types of temporary memory are cache (pronounced "cash") and virtual memory. Both of these types of memory supplement the computer's primary RAM and perform the same function as RAM.

Storage Devices:

RAM and ROM may be very important parts of the computer; however, without storage devices like hard drives and disk drives your computer would not be nearly as useful.

Here are the most common forms of storage devices found on your home computer:
Top
Granouille
Posted: Apr 20 2010, 02:17 PM


And the horse you rode in on!
*****

Group: Power Member
Posts: 1395
Joined: 12-February 09

Positive Feedback: 66.67%
Feedback Score: 77


Is there some point to your blast from the past?

QUOTE
Early personal computers needed only about 64K of RAM. Today that number is drastically higher. With photos, sounds, and even movies going into RAM, the amount needed is now in the millions of bytes. The computer I am currently using has 80 MB, or roughly 80,000K, of RAM.

Multitasking has put more demand on RAM in the past few years. Multitasking is the ability to run more than one program at the same time. For instance, many people like to run Netscape Communicator along with their word processing software. This means you need lots of RAM to hold both programs.


80 MB and Netscape Communicator? Where did you copy this crud from, and from what year? Most of my computers have at least 1 GB of RAM, and Communicator has been defunct for years.

http://browser.netscape.com/

You'll notice the end of support notice as of 2008?


--------------------
-=Forum Mafia=-

Member of the Original FM, 2007
Top
adoucette
Posted: Apr 20 2010, 02:25 PM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (Granouille @ Apr 20 2010, 09:17 AM)
Is there some point to your blast from the past?


Maybe, but I doubt it's good.

We've had several posters show up recently, with compound common names, who post wiki definitions as replies, keying off a word in the previous post.

Sorta makes it look like "ElaineAbigayle", or "LaurenJadeTaylor", or "MaraShylaStewart" is a real person.

As in:

http://www.physforum.com/index.php?showtop...ndpost&p=455139

and

http://www.physforum.com/index.php?showtop...ndpost&p=455170

and

http://www.physforum.com/index.php?showtop...ndpost&p=453443


But I'm pretty sure these are bots, and whatever their reason for trying to establish these fake identities, it's not likely to be a good one.

Arthur

This post has been edited by adoucette on Apr 20 2010, 03:01 PM


--------------------
"We cannot prove that those are in error who tell us that society has reached a turning point; that we have seen our best days. But so said all before us, and with just as much apparent reason. On what principle is it that, when we see nothing but improvement behind us, we are to expect nothing but deterioration before us?"

Thomas B. Macaulay
Top
rpenner
Posted: Apr 20 2010, 03:38 PM


Fully Wired
*****

Group: Moderators
Posts: 5841
Joined: 27-December 04

Positive Feedback: 84.5%
Feedback Score: 397


Dude, it's a bot copying from Wikipedia.


--------------------
愛平兎仏主
"And the peace of God, which passeth all understanding, shall keep your hearts and minds through Christ Jesus." Philippians 4:7
It's just good Netiquette. Failing that, Chlorpromazine.
Top
adoucette
Posted: Apr 20 2010, 03:51 PM


Illegitimi non carborundum
*****

Group: Power Member
Posts: 12894
Joined: 14-April 05

Positive Feedback: 77.59%
Feedback Score: 205


QUOTE (rpenner @ Apr 20 2010, 10:38 AM)
Dude, it's a bot copying from Wikipedia.

That's exactly what I said:

QUOTE
post wiki definitions as replies, keying off a word in the previous post


The question though is: for what purpose?

Just seems like they are trying to create a fake identity that might fool someone who didn't notice the pattern, as it apparently did with Granouille.

For what reason someone would do this though escapes me.

Arthur



--------------------
"We cannot prove that those are in error who tell us that society has reached a turning point; that we have seen our best days. But so said all before us, and with just as much apparent reason. On what principle is it that, when we see nothing but improvement behind us, we are to expect nothing but deterioration before us?"

Thomas B. Macaulay
Top
light in the tunnel
Posted: Apr 20 2010, 07:29 PM


Advanced Member
*****

Group: Power Member
Posts: 2055
Joined: 9-August 09

Positive Feedback: 35.29%
Feedback Score: -92


QUOTE (adoucette @ Jan 18 2010, 05:47 AM)
As long as power remains, the data loaded in memory remains.

Is this why the desktop image freezes but remains visible when the computer (processor?) crashes?

It's all those RAM 1s and 0s staying the same despite the processes that generated them having collapsed?

This post has been edited by light in the tunnel on Apr 20 2010, 07:29 PM
Top
