Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Kosh on September 04, 2009, 04:44:22 am
-
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
-
One would assume it's because computers are inherently digital machines, since they can only understand whether the current is on or off: 1 or 0.
-
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
Where have you heard that? I'm not dismissing it out of hand, I just want to know what you are talking about.
-
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
wtf are you talking about?
http://en.wikipedia.org/wiki/Analog_computer
-
Because analog is inherently imprecise.
Though I suppose in low-precision, high-performance situations it could be usable.
-
Kosh, the main difference between analog and digital is the importance given to the variation of values.
In analog systems every variation is reflected in the result, meaning they use a continuous scale for their values; in digital systems a variation is only relevant if it moves the value from one discrete interval to another, meaning they use a discrete scale for their values.
This gives digital systems a much higher degree of precision than analog systems, at the unfortunate cost of complexity. However, in most real-world applications this precision is critical.
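To put that in concrete terms, here's a rough sketch (my own illustration, not from any post in this thread, with made-up numbers): an analog quantity carries every bit of noise with it, while a digital value only changes if the disturbance is big enough to push it across a threshold.
[code]
# Rough illustration of continuous vs. discrete values (hypothetical numbers).

def analog_store(value, noise):
    # Analog: the stored quantity simply drifts with whatever noise it picks up.
    return value + noise

def digital_store(value, noise, threshold=0.5):
    # Digital: quantize back to 0 or 1, so small disturbances are thrown away.
    return 1 if (value + noise) >= threshold else 0

v = 1.0
print(analog_store(v, -0.08))   # 0.92 -> the error is now part of the value
print(digital_store(v, -0.08))  # 1    -> still reads back as a clean '1'
[/code]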
-
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
wtf are you talking about?
http://en.wikipedia.org/wiki/Analog_computer
This.
I'm not sure you understand what 'analog' and 'digital' mean.
-
It's not about what's faster, it's about what's better suited to a certain situation. For a long time analog was the only way to send TV signals. Until technology surfaced that could encode, compress, transmit, receive, decompress, and decode a video/audio stream, analog was about the only practical way to send audio and/or video. Digital is now taking over a lot of the analog bandwidth (the US analog-to-digital TV conversion, for example) because you can use compression to get more out of digital than you ever could with analog.
Computers have always been about precision, which is why analog and decimal computers went the way of the dinosaur. Digital was just so much faster and easier to deal with from an engineering standpoint, and digital components kept getting faster while analog development was mostly abandoned. Going digital also made signaling easier to deal with.
-
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
wtf are you talking about?
http://en.wikipedia.org/wiki/Analog_computer
I never said analog computers didn't exist; what I asked was why they aren't ubiquitous. Look around: most of what you see today is digital, digital, digital. I've never actually seen an analog computer in my whole life, and I don't know anyone who has either; that's how rare they are.
-
Because they suck, pretty much.
-
Analog: not good at memory.
-
I never said analog computers didn't exist; what I asked was why they aren't ubiquitous. Look around: most of what you see today is digital, digital, digital. I've never actually seen an analog computer in my whole life, and I don't know anyone who has either; that's how rare they are.
Try writing a program for an analog computer that fits in your room.
-
My understanding is that the earliest fly-by-wire systems were actually analog computers, though they were replaced by digital computers quite quickly. Also, my understanding is that some bombers even had mechanical computers doing the computing!
I'm not sure whether digital computers are inherently faster by design, or whether it's just a hell of a lot easier to make digital parts tick at GHz rates than analog ones. I would guess the signal attenuation and waveform degradation that happens within wiring could pose serious problems for analog computers.
-
Digital systems are slower than analog due to their complexity.
A simple analog adder can be built with 3 or 4 resistors. The digital equivalent (let's say 4-bit) needs at least 36 logic gates, which in CMOS logic would be something approximating 120 transistors; in NMOS logic (a fairer comparison with the analog adder) it would take at least 36 resistors and about 60 or so transistors.
The signal can only propagate so fast.
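Just to put some meat on that gate count, here's a quick sketch (my own, purely illustrative) of a 4-bit ripple-carry adder built from nothing but 2-input NAND gates; it counts gates as it goes and lands on exactly 36 for the NAND-only construction (9 per full adder).
[code]
# Illustrative 4-bit ripple-carry adder from 2-input NAND gates only.
gate_count = 0

def nand(a, b):
    # Every call stands in for one physical NAND gate (~4 transistors in CMOS).
    global gate_count
    gate_count += 1
    return 0 if (a and b) else 1

def full_adder(a, b, cin):
    # Classic 9-NAND full adder.
    n1 = nand(a, b)
    n2 = nand(a, n1)
    n3 = nand(b, n1)
    n4 = nand(n2, n3)      # a XOR b
    n5 = nand(n4, cin)
    n6 = nand(n4, n5)
    n7 = nand(cin, n5)
    s  = nand(n6, n7)      # sum = a XOR b XOR cin
    cout = nand(n1, n5)    # carry = a*b + cin*(a XOR b)
    return s, cout

def add4(a_bits, b_bits):
    # 4-bit ripple-carry adder; bit lists are least-significant-bit first.
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

print(add4([1, 0, 1, 0], [1, 1, 0, 0]))  # 5 + 3 -> ([0, 0, 0, 1], 0), i.e. 8
print("NAND gates used:", gate_count)    # 36
[/code]
Compare that with the analog version, which is just a handful of resistors feeding a single summing node.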
-
A better question would be why we don't see more asynchronous digital systems vs. the synchronous systems everything uses. Computers would run a **** ton faster if they didn't have a clock telling all the parts to sit and wait and not do anything.
-
A better question would be why we don't see more asynchronous digital systems vs. the synchronous systems everything uses. Computers would run a **** ton faster if they didn't have a clock telling all the parts to sit and wait and not do anything.
An interesting point, I think. Why do you think it would become faster?
As a side note, there were no national standard times in the 1700s. The need for standard time arose at least partially because of the railways...
-
It would be faster because the components would be working as fast as they physically can. In current synchronous systems the parts all work for a short amount of time and then wait; the clock ticks, they work for a little bit, then stop. In an asynchronous system the parts never stop working: as soon as the start of the pipeline is done with one instruction it immediately starts working on the next, so rather than the push-stop, push-stop, push-stop sort of cycle the CPU has now, it would have more of a continuous flow to it.
If you know anything about pipelining in CPU design, this is sort of like taking it to the extreme and having the clock be replaced by the propagation time of the component circuits.
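Here's a back-of-the-envelope sketch of that argument (the delays are made up, and it ignores handshake overhead entirely): a clocked pipeline has to round every stage up to a worst-case clock period, while a self-timed stage only takes as long as each particular operation actually needs.
[code]
# Idealised comparison of a clocked vs. a self-timed pipeline (made-up delays).
import random
random.seed(0)

instructions  = 10000
worst_case_ns = 1.0   # the slowest stage, on its worst-case input

# Synchronous: one result per clock tick, and the tick must cover the worst case.
sync_time = instructions * worst_case_ns

# Asynchronous (ideal): each operation takes only as long as it actually needs,
# here modelled as a random delay between the best and worst case.
async_time = sum(random.uniform(0.4, worst_case_ns) for _ in range(instructions))

print(f"synchronous : {sync_time:.0f} ns")
print(f"asynchronous: {async_time:.0f} ns (~{sync_time / async_time:.2f}x faster)")
[/code]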
-
it would be faster because the components would be working as fast as they physically can, in current synchronous systems the parts all work for a short amount of time and then wait, then the cock ticks and they work for a little bit then stop. in an asynchronous system the parts never stop working. as soon as the start of the pipeline is done with one instruction it immediately starts working on the next, so rather than the push stop push stop push stop sort of cycle they cpu has now it would have more of a continuous flow to it.
if you know anything about pipelineing in CPU design this is sort of like taking it to the extreme and having the clock be replaced by the propagation time of the component circuits.
http://www.youtube.com/watch?v=josAIJZnw-Y
-
So I missed an 'l'; it's called a wireless keyboard.
-
In short, designing asynchronous circuits becomes a nightmare even at moderate levels of complexity (compared to modern CPUs). You run into all sorts of headaches with signal stability, handshakes between the asynchronous blocks, etc.
It is possible though and has been done every once in a while, but it's much easier to design a CPU with a global clock.
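For a feel of what those handshakes look like, here's a toy sketch (threads standing in for hardware blocks, nothing circuit-accurate): even the simplest self-timed link needs a request/acknowledge exchange before the sender is allowed to move on.
[code]
# Toy request/acknowledge handshake between two "blocks" (threads).
import threading, queue

channel = queue.Queue(maxsize=1)   # request: data is valid on the channel
ack = threading.Event()            # acknowledge: data has been consumed

def sender():
    for value in range(3):
        channel.put(value)         # assert "request" along with the data
        ack.wait()                 # stall until the receiver acknowledges
        ack.clear()

def receiver():
    for _ in range(3):
        value = channel.get()      # see the request and latch the data
        print("received", value)
        ack.set()                  # acknowledge, letting the sender continue

t1, t2 = threading.Thread(target=sender), threading.Thread(target=receiver)
t1.start(); t2.start(); t1.join(); t2.join()
[/code]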