Hard Light Productions Forums

Off-Topic Discussion => General Discussion => Topic started by: Kosh on September 04, 2009, 04:44:22 am

Title: Analog
Post by: Kosh on September 04, 2009, 04:44:22 am
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
Title: Re: Analog
Post by: Liberator on September 04, 2009, 05:13:46 am
One would assume it's because computers are inherently digital machines, since they can only understand whether the current is on or off: 1 or 0.
Title: Re: Analog
Post by: The E on September 04, 2009, 06:07:46 am
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?

Where have you heard that? I'm not dismissing it out of hand, I just want to know what you are talking about.
Title: Re: Analog
Post by: Fury on September 04, 2009, 07:08:34 am
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
wtf are you talking about?
http://en.wikipedia.org/wiki/Analog_computer
Title: Re: Analog
Post by: Bobboau on September 04, 2009, 07:27:22 am
because analog is inherently imprecise.
though I suppose in low-precision, high-performance situations it could be usable
Title: Re: Analog
Post by: Ghostavo on September 04, 2009, 12:44:35 pm
Kosh, the main difference between analog and digital is the importance given to the variation of values.

While in analog systems every variation is reflected in the result (a continuous scale of values), in digital systems a variation only matters if it moves the value from one discrete interval to another (a discrete scale of values).

This gives digital systems a much higher degree of precision than analog systems, at the unfortunate cost of complexity. However, in most real-world applications that precision is critical.
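A quick sketch of the distinction above, using made-up signal values and a hypothetical step size of 0.25: a small perturbation always shows up in a continuous value, but only changes the digital value if it crosses into another discrete interval.

```python
def quantize(value, step=0.25):
    """Snap a continuous value onto a discrete scale."""
    return round(value / step) * step

clean = 1.03
noisy = clean + 0.05   # the same signal after a little noise

# Discrete view: both land in the same interval, so the noise vanishes.
print(quantize(clean), quantize(noisy))   # 1.0 1.0

# A larger variation does cross a threshold, so the digital value moves.
print(quantize(clean + 0.2))              # 1.25
```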
Title: Re: Analog
Post by: General Battuta on September 04, 2009, 12:47:21 pm
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
wtf are you talking about?
http://en.wikipedia.org/wiki/Analog_computer

This.

I'm not sure you understand what 'analog' and 'digital' mean.
Title: Re: Analog
Post by: Nuke on September 04, 2009, 01:14:03 pm
it's not about what's faster, it's about what's better suited to a given situation. for a long time analog was the only way to send tv signals: until the technology surfaced that could encode, compress, transmit, receive, decompress, and decode the video/audio stream, analog was about the only way to send audio and/or video. digital is now stealing a lot of the analog bandwidth (the us analog-to-digital tv conversion, for example) because compression lets you get more out of digital than you could with analog.

computers have always been about precision, which is why analog and decimal computers went the way of the dinosaur. digital was just so much faster and easier to deal with from an engineering standpoint, and digital components kept getting faster while analog was left mostly abandoned in terms of development. it also made signaling easier to deal with.
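Nuke's point about compression can be made concrete: once a signal is a stream of discrete symbols, you can describe it instead of transmitting it sample by sample. Run-length encoding is about the simplest example; the "scanline" here is a made-up toy input.

```python
def rle_encode(data):
    """Collapse runs of identical symbols into (symbol, count) pairs."""
    out = []
    for symbol in data:
        if out and out[-1][0] == symbol:
            out[-1] = (symbol, out[-1][1] + 1)
        else:
            out.append((symbol, 1))
    return out

scanline = "0000000011110000"   # 16 digital samples
print(rle_encode(scanline))     # [('0', 8), ('1', 4), ('0', 4)]
```

Nothing comparable is possible for a raw analog waveform, where every infinitesimal variation is part of the signal.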
Title: Re: Analog
Post by: Kosh on September 04, 2009, 07:57:31 pm
Ok, so if analog is faster than digital how come it isn't in widespread use today in the computing world?
wtf are you talking about?
http://en.wikipedia.org/wiki/Analog_computer

I never said analog computers didn't exist; what I asked was why they aren't ubiquitous. Look around: most of what you see today is digital, digital, digital. I've never actually seen an analog computer in my whole life, and I don't know anyone who has either; that's how rare they are.
Title: Re: Analog
Post by: General Battuta on September 04, 2009, 08:39:07 pm
Because they suck, pretty much.
Title: Re: Analog
Post by: NGTM-1R on September 04, 2009, 08:48:04 pm
Analog: not good at memory.
Title: Re: Analog
Post by: blackhole on September 05, 2009, 01:24:07 am
I never said analog computers didn't exist, what I asked was why they weren't ubiquitous. Look around, most of what you see today is digital digital digital. I've never actually seen an analog computer in my whole life, and I don't know anyone who has either, that's how rare they are.

Try writing a program for an analog computer that fits in your room.
Title: Re: Analog
Post by: Mika on September 05, 2009, 03:02:08 am
My understanding is that the earliest Fly-By-Wire systems were actually analog computers. Though they were replaced by digital computers quite quickly. Also, my understanding is that there have also been some mechanical computers doing the computing in bombers!

I'm not sure whether a digital computer is inherently faster by design, or whether it's just a helluva lot easier to make digital parts tick at GHz rates than analog ones. I would guess the signal attenuation and waveform degradation that happens in wiring could pose serious problems for analog computers.
Title: Re: Analog
Post by: Ghostavo on September 05, 2009, 03:44:14 am
Digital systems are slower than analog due to their complexity.

A simple analog adder can be built with 3 or 4 resistors. The digital equivalent (say, 4-bit) has at least 36 logic gates, which in CMOS logic comes to roughly 120 transistors; in NMOS logic (a fairer comparison with the analog adder) it would be at least 36 resistors and about 60 transistors.

The signal can only propagate so fast.
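To make the complexity gap concrete, here is a gate-level sketch of the kind of digital adder Ghostavo describes. Exact gate counts depend on the implementation; this plain ripple-carry version uses 5 gates per bit, and each stage has to wait on the previous stage's carry, which is exactly the propagation limit mentioned above.

```python
def full_adder(a, b, cin):
    """One bit of addition from primitive gates: 2 XOR, 2 AND, 1 OR."""
    s = (a ^ b) ^ cin
    cout = (a & b) | ((a ^ b) & cin)
    return s, cout

def ripple_add_4bit(x, y):
    """Add two 4-bit numbers bit by bit, rippling the carry along."""
    carry, result = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

print(ripple_add_4bit(9, 5))   # (14, 0)
print(ripple_add_4bit(12, 7))  # (3, 1) -- overflow sets the carry out
```

The analog version of the same operation is just currents summing at a node through a few resistors, with no gates at all.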
Title: Re: Analog
Post by: Bobboau on September 05, 2009, 09:43:49 am
a better question would be why we don't see more asynchronous digital systems vs the synchronous systems everything uses. computers would run a **** ton faster if they didn't have the clock telling all the parts to sit and wait and not do anything.
Title: Re: Analog
Post by: Mika on September 05, 2009, 04:37:37 pm
Quote
a better question would be why don't we see more asynchronous digital systems vs the synchronous systems everything uses. computers would run a **** ton faster if it didn't have the clock telling all the parts to sit and wait and not do anything.

An interesting point, I think. Why do you think it would become faster?

As a side note, there were no national standard times in the 1700s. The need for standard time arose at least partially because of railways...
Title: Re: Analog
Post by: Bobboau on September 06, 2009, 12:22:26 am
it would be faster because the components would be working as fast as they physically can. in current synchronous systems the parts all work for a short amount of time and then wait; the clock ticks, they work for a little bit, then stop. in an asynchronous system the parts never stop working: as soon as the start of the pipeline is done with one instruction it immediately starts on the next, so rather than the push-stop push-stop cycle the CPU has now, it would have more of a continuous flow to it.

if you know anything about pipelining in CPU design, this is sort of like taking it to the extreme and replacing the clock with the propagation time of the component circuits.
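A toy timing model of Bobboau's point, with made-up stage delays: in a clocked pipeline every stage waits out a full clock period sized for the slowest stage, while an idealized asynchronous pipeline fills at the sum of the actual delays, so it finishes a batch of work sooner even though steady-state throughput is still limited by the slowest stage.

```python
stage_delays = [3, 7, 2, 5]   # hypothetical propagation time of each stage
n_items = 100                 # instructions pushed through the pipeline

# Synchronous: the clock period must cover the slowest stage, and every
# hand-off costs a whole period, even through the fast stages.
clock = max(stage_delays)
sync_time = clock * (len(stage_delays) + n_items - 1)

# Asynchronous (idealized): the first item traverses the real delays;
# after that, items drain at the rate of the slowest stage.
async_time = sum(stage_delays) + max(stage_delays) * (n_items - 1)

print(sync_time, async_time)   # the asynchronous pipeline finishes sooner
```

The model ignores handshake overhead between stages, which is exactly where real asynchronous designs give some of that advantage back.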
Title: Re: Analog
Post by: BloodEagle on September 06, 2009, 12:43:13 am
it would be faster because the components would be working as fast as they physically can, in current synchronous systems the parts all work for a short amount of time and then wait, then the cock ticks and they work for a little bit then stop. in an asynchronous system the parts never stop working. as soon as the start of the pipeline is done with one instruction it immediately starts working on the next, so rather than the push stop push stop push stop sort of cycle they cpu has now it would have more of a continuous flow to it.

if you know anything about pipelineing in CPU design this is sort of like taking it to the extreme and having the clock be replaced by the propagation time of the component circuits.

http://www.youtube.com/watch?v=josAIJZnw-Y
Title: Re: Analog
Post by: Bobboau on September 06, 2009, 02:49:43 am
so I missed an 'l'; it's called a wireless keyboard.
Title: Re: Analog
Post by: Col. Fishguts on September 08, 2009, 03:32:46 pm
In short, designing asynchronous circuits becomes a nightmare even at moderate levels of complexity (compared to modern CPUs). You run into all sorts of headaches with signal stability, handshakes between the asynchronous blocks, etc.

It is possible, though, and has been done every once in a while, but it's much easier to design a CPU with a global clock.