Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: FlamingCobra on March 01, 2012, 04:25:12 pm
-
Would it be possible to make a computer that operates in base four on the very lowest level? Skipping binary altogether.
I mean, the human brain is basically a computer, and biology is essentially carried out in base 4 on the lowest level.
But I suppose I'm using medieval logic, and we all know what results from that! :lol:
A witch burns. What else burns? Wood. What happens when you throw wood in a pond? It floats. What else floats in a pond? A duck. So if a woman weighs the same as a duck, she must be a witch. :p
NOT
-
wait what? I'm not entirely sure how feasible this would be with current transistor technology. I'm no electrical engineer, but I'm still fairly certain you'd need much more precise dielectrics and semiconductors to get it to work, and honestly I don't think it would be worth it at all, at least at this point. personally I'd rather see more research put into low-power transistors before we try any harebrained schemes like this.
also I think you lack the understanding of both the human brain and computer organization to fully appreciate why your second line is nonsense.
-
There's a lot of scientific mumbo-jumbo about "DNA computers" but it is precisely that - mumbo-jumbo. The molecular biology of life on Earth and the physics of computing are not related concepts. The human brain does not operate in base 4.
-
The short answer is, yes.
Base 3 computers were actually implemented. (http://en.wikipedia.org/wiki/Ternary_computer)
It's just a question of assigning different voltage ranges for different values.
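To make that concrete, here's a minimal sketch in C (the nominal levels and the 0.5 V decision thresholds are invented for illustration, not taken from any real hardware): reading a multi-level signal just means thresholding it into one of several discrete digits.
#include <stdio.h>
/* Hypothetical sketch: map a sampled line voltage (0..3 V nominal) to a
   base-4 digit by thresholding. The threshold values are made up for
   illustration, not taken from any part's datasheet. */
static int voltage_to_digit(double volts)
{
    if (volts < 0.5) return 0;  /* nominal 0 V */
    if (volts < 1.5) return 1;  /* nominal 1 V */
    if (volts < 2.5) return 2;  /* nominal 2 V */
    return 3;                   /* nominal 3 V */
}
int main(void)
{
    double samples[] = { 0.1, 0.9, 2.2, 3.1 };
    for (int i = 0; i < 4; i++)
        printf("%.1f V -> digit %d\n", samples[i], voltage_to_digit(samples[i]));
    return 0;
}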
-
There's a lot of scientific mumbo-jumbo about "DNA computers" but it is precisely that - mumbo-jumbo. The molecular biology of life on Earth and the physics of computing are not related concepts. The human brain does not operate in base 4.
The notion that the human brain is a computer with DNA as a program or something like that is just plain wrong, but I was actually still under the impression that it was possible to build DNA based computers. Not that they're particularly useful as general-purpose computers or anything, though.
-
you can run a computer on any base greater than 1. there have been base 10 computers (using binary coded decimal), but those were just a way of using binary technology to represent decimal numbers. with binary (high and low) signaling, you can just use transistors in saturation mode, where they act as switches. you can use binary signaling on power-of-2 base computers too: just use an n-bit allocation unit, where n is the exponent in 2^n. now when you do a 4-level signal, you are essentially doing analog computing, and you could just as easily do 10-level signaling, but this would cause a speed penalty and require more complex signal processing circuits. we used to use analog computers eons ago, before eniac, and these were for military use, like gun stabilizers on battleships and the like.
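here's a rough sketch in c of what bcd looks like in software, purely illustrative (the real machines did this in hardware): each decimal digit gets its own 4-bit nibble, so the electronics stay binary but the unit of data is a decimal place.
#include <stdio.h>
/* sketch: pack an unsigned integer into bcd, one decimal digit per
   4-bit nibble (e.g. 1234 -> 0x1234). illustrative only. */
static unsigned long to_bcd(unsigned n)
{
    unsigned long bcd = 0;
    int shift = 0;
    do {
        bcd |= (unsigned long)(n % 10) << shift;  /* one decimal digit */
        n /= 10;
        shift += 4;                               /* next nibble */
    } while (n);
    return bcd;
}
int main(void)
{
    printf("1234 in bcd: 0x%lx\n", to_bcd(1234));  /* prints 0x1234 */
    return 0;
}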
also i'd like to point out that the human brain more closely resembles a neural network than a cpu, and that any comparison between the 2 is a subject for laymen. if you're drawing a parallel between base 4 and the ctga rungs in a dna strand, shoot yourself. it is actually base 5, because there are places in the strand where a rung may be incomplete. also, dna has nothing to do with information processing. i kind of think of it as a seed value in a fractal: it's not really all that variable, and only drastically changes when a new strand is formed. it can even be stated that the information in the strand is random noise. it is not source code, nor is it a computer.
wait what? I'm not entirely sure how feasible this would be with current transistor technology. I'm no electrical engineer, but I'm still fairly certain you'd need much more precise dielectrics and semiconductors to get it to work, and honestly I don't think it would be worth it at all, at least at this point. personally I'd rather see more research put into low-power transistors before we try any harebrained schemes like this.
also I think you lack the understanding of both the human brain and computer organization to fully appreciate why your second line is nonsense.
transistors essentially work in 2 modes. when saturated they act as a switch; otherwise it's an amplifier (analog computers used them as multipliers).
-
Nuke, hypothetical base 4 computers wouldn't be analog; otherwise they wouldn't be... you know, base 4.
And the problem with analog computers wasn't speed, quite the opposite, it was precision.
-
DNA is a way to store data. It doesn't specifically "operate" in base 4 because the DNA replication doesn't use mathematics - it just replicates, the only logic involved is the pair forming of the four bases. DNA is a continuous data block that just happens to use four different bases (in CHEMICAL context) as its method of storing the protein encoding instructions. Making a logical operator that uses DNA would be quite hard (and would likely require use of RNA to modify the DNA sequences), but as memory, DNA would be better suited. I am unfamiliar with the proposed DNA computers' principles, so I won't say any more on that subject.
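As a toy illustration of the storage idea (my own sketch, nothing to do with any actual DNA computing proposal), you could treat the four bases purely as a base-4 alphabet for ordinary data. All the "logic" here lives in the encoder; the strand itself is just passive storage.
#include <stdio.h>
/* Toy illustration: encode a byte as four "nucleotides", two bits per
   base. This only demonstrates DNA-as-storage in base 4; no logical
   operations are involved. */
static const char BASES[4] = { 'A', 'C', 'G', 'T' };
static void encode_byte(unsigned char b, char out[5])
{
    for (int i = 0; i < 4; i++)
        out[i] = BASES[(b >> (6 - 2 * i)) & 3];  /* high pair first */
    out[4] = '\0';
}
int main(void)
{
    char strand[5];
    encode_byte('H', strand);       /* 'H' = 0x48 = 01 00 10 00 */
    printf("'H' -> %s\n", strand);  /* prints CAGA */
    return 0;
}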
The human brain, on the other hand, is a neural net with binary signals, but much more complicated than binary logic. The signals are either on or off: electric potential opens calcium ion gates between nerve cells, and electric potential pulses travel through synapses, not as electrons like in metallic conductors, but as ions. It's more similar to what you get if you put two types of electrolytes in two glasses and connect them with a wetted paper strip, then put an anode in one glass and a cathode in the other; you can run current through the wetted paper strip in the form of ions traveling through it.
The neural network formed by nerve cells and their connections is, then, hooked to a LOT of input/output nodes, and is largely dependent on those inputs and outputs to function properly and meaningfully (sensory deprivation is a very nerve-wracking situation, literally). And while individual nerve signals are of the on/off variety, the brain still interprets sensory input as largely analog signals, depending on how many nerve endings are sending the same signal, in which case the signal amplitude increases. Each brain has similar parts, such as the main input/output lines up to the spinal cord and brain stem, and their direct handling areas (for example, the visual cortex is roughly similar in structure for each person, and in roughly the same location, too), but each brain is also individually structured based on genetics and the experiences forming new neural pathways.
The resulting jumble of nerve signals results in personality and consciousness, but it cannot really be looked at as analogous to a "computer" as we see it. There are a lot of parts in the brain that do tasks similar to "computing" - most of it subconscious routines such as breathing or hormone control, as well as balance handling, which affects things like motion control, image stabilization, eye tracking, and a lot more. But even then these subroutines can't really be thought of as binary computers that get input values from senses and send output values accordingly; it's more of an analog system that has assembled itself to respond to stimuli in a way that produces certain results.
It would be expected that a neural network assembled with evolutionary algorithms would become a largely similar system, with no specifically designed features, but instead stuff that just works as needed.
-
I take it that a base 4 computer would not make it any easier to emulate the human body/mind or build an android, that the very idea is laughable, and that person clearly knows little about how the body/mind or computers work.
On the subject of Androids, however, I have always wondered that if you wanted to create a 'perfect' replication/emulation of a human being except artificial and computerized, it would have to eat. If you never saw somebody eat, you'd know something was up.
I have heard of these things called "reformers," and basically what they do is take any organic substance and strip it of its hydrogen, and then the hydrogen can be used in a fuel cell. So I was wondering if the android could be powered by fuel cells instead of batteries and get its energy in this way like a normal person (by 'eating'). Or I'm just stupid.
-
why would you want to create a perfect replication/emulation of a human being except artificial and computerized
-
that person
are you Hanar? :wtf:
-
Nuke, hypothetical base 4 computers wouldn't be analog; otherwise they wouldn't be... you know, base 4.
And the problem with analog computers wasn't speed, quite the opposite, it was precision.
it need not be analog, you can simulate base 4 with 2 bits, because the base is divisible by 2. just the smallest unit you could work with would be a bit pair, instead of a single bit. this is like bcd or bi-quinary encoding, where the smallest applicable unit of data is 4 and 6 bits respectively, which are used in base 10 computers. thing is, we realized we can just use binary and greatly simplify the electronics, and we could just format the input/output for the user's sake. if you want to represent the data with 4 signal levels (0, 1, 2, 3 volts, for example), then you enter the analog realm.
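a quick sketch in c of the bit pair idea: pull a word apart two bits at a time and you get its base-4 digits for free, no 4-level signaling required.
#include <stdio.h>
/* sketch: print the base-4 digits of a word by masking off 2-bit
   pairs. the pair is the smallest unit of data, like a bcd nibble. */
static void print_base4(unsigned n)
{
    char digits[17];  /* a 32-bit word holds at most 16 pairs */
    int i = 0;
    do {
        digits[i++] = '0' + (n & 3);  /* low 2 bits = one base-4 digit */
        n >>= 2;
    } while (n);
    while (i--) putchar(digits[i]);   /* most significant digit first */
    putchar('\n');
}
int main(void)
{
    print_base4(27);  /* 27 = 123 in base 4 */
    return 0;
}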
also, digital circuits are faster due to their reduced complexity. every time you hit a p-n junction you get a small propagation delay. a diode has 1 junction, transistors have 2 junctions each (npn or pnp, depending on type). if you need to compare 2 bits of data, it takes an xor gate, which (from a diagram i found in a google search) has 2 diodes and 2 transistors. this will return a logical zero if the operands match (you can throw a not gate, aka inverter, on there to give a true return on match, and this is just another transistor). if you need more than one bit you can cascade the xor gates and have them operate in parallel, which takes the same amount of time for 1 bit as it does 2, 8, whatever. now, comparing 2 analog signals (voltages) takes a couple of analog comparators. a common tutorial on the net is to build an analog comparator from an opamp. this can only tell you if one voltage is higher than another, so you need 2 of them to determine if you are within a certain tolerance of the value you're looking for (the precision issue). of course an opamp uses many transistors (one schematic shows as many as 20), resulting in much longer delays (the sum of all delays at junctions in series). analog signals can move more data, but analog processors are slower due to the greater number of semiconductor devices used in their construction.
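in software the xor compare is one line. a sketch of the word-wide parallel compare i described:
#include <stdio.h>
/* sketch: word-wide equality, the software analogue of cascaded xor
   gates. the xor compares every bit position in parallel, and the !
   plays the role of the inverter that gives a true return on match. */
static int words_equal(unsigned a, unsigned b)
{
    return !(a ^ b);  /* a ^ b is 0 only if every bit pair matches */
}
int main(void)
{
    printf("%d\n", words_equal(0xBEEF, 0xBEEF));  /* prints 1 */
    printf("%d\n", words_equal(0xBEEF, 0xBEEE));  /* prints 0 */
    return 0;
}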
-
There's a lot of scientific mumbo-jumbo about "DNA computers" but it is precisely that - mumbo-jumbo. The molecular biology of life on Earth and the physics of computing are not related concepts. The human brain does not operate in base 4.
The notion that the human brain is a computer with DNA as a program or something like that is just plain wrong, but I was actually still under the impression that it was possible to build DNA based computers. Not that they're particularly useful as general-purpose computers or anything, though.
The concept I've heard of is using nucleotide switches (giving you four options per site instead of two) in a solution to produce a computing environment, but I've never heard of or seen a practical demonstration of the concept. That's not really a DNA computer so much as a molecular chemical computer, though.
-
I have heard of these things called "reformers," and basically what they do is take any organic substance and strip it of its hydrogen, and then the hydrogen can be used in a fuel cell. So I was wondering if the android could be powered by fuel cells instead of batteries and get its energy in this way like a normal person (by 'eating'). Or I'm just stupid.
http://en.wikipedia.org/wiki/Hydrogen_reformer
These things aren't exactly small. And why you would want to build an android, let alone power it by fuel cell, is a little beyond me.
-
Nuke, hypothetical base 4 computers wouldn't be analog; otherwise they wouldn't be... you know, base 4.
And the problem with analog computers wasn't speed, quite the opposite, it was precision.
it need not be analog, you can simulate base 4 with 2 bits, because the base is divisible by 2. just the smallest unit you could work with would be a bit pair, instead of a single bit. this is like bcd or bi-quinary encoding, where the smallest applicable unit of data is 4 and 6 bits respectively, which are used in base 10 computers. thing is, we realized we can just use binary and greatly simplify the electronics, and we could just format the input/output for the user's sake. if you want to represent the data with 4 signal levels (0, 1, 2, 3 volts, for example), then you enter the analog realm.
You are missing the point. You can represent any integer with a binary representation, that's not the point. The point was having a computer use base 4 logic, without emulating it in base 2.
Base 4 is NOT analog, since you are using discrete values! Are you even aware of what analog is?
also, digital circuits are faster due to their reduced complexity. every time you hit a p-n junction you get a small propagation delay. a diode has 1 junction, transistors have 2 junctions each (npn or pnp, depending on type). if you need to compare 2 bits of data, it takes an xor gate, which (from a diagram i found in a google search) has 2 diodes and 2 transistors. this will return a logical zero if the operands match (you can throw a not gate, aka inverter, on there to give a true return on match, and this is just another transistor). if you need more than one bit you can cascade the xor gates and have them operate in parallel, which takes the same amount of time for 1 bit as it does 2, 8, whatever. now, comparing 2 analog signals (voltages) takes a couple of analog comparators. a common tutorial on the net is to build an analog comparator from an opamp. this can only tell you if one voltage is higher than another, so you need 2 of them to determine if you are within a certain tolerance of the value you're looking for (the precision issue). of course an opamp uses many transistors (one schematic shows as many as 20), resulting in much longer delays (the sum of all delays at junctions in series). analog signals can move more data, but analog processors are slower due to the greater number of semiconductor devices used in their construction.
Digital circuits are slower than analog, period. Try to design a simple calculator using analog and digital components and you'll see that. The problem with analog is not speed but precision. Small errors accumulate, while the same doesn't happen (or at least isn't supposed to happen) in the digital realm.
-
The advantage with base 2 is that there is only 'on' or 'off'. Whilst it could be argued that computers could run on other bases using 'fractions' of voltages, if they had, the odds are that the expansion in telecommunications we have seen would have been a lot harder to achieve, because, whilst the environment inside a computer might be controllable, once you get outside the chassis it's a lot harder to impose and maintain a more complex multi-signal system :)
-
Nuke, hypothetical base 4 computers wouldn't be analog; otherwise they wouldn't be... you know, base 4.
And the problem with analog computers wasn't speed, quite the opposite, it was precision.
it need not be analog, you can simulate base 4 with 2 bits, because the base is divisible by 2. just the smallest unit you could work with would be a bit pair, instead of a single bit. this is like bcd or bi-quinary encoding, where the smallest applicable unit of data is 4 and 6 bits respectively, which are used in base 10 computers. thing is, we realized we can just use binary and greatly simplify the electronics, and we could just format the input/output for the user's sake. if you want to represent the data with 4 signal levels (0, 1, 2, 3 volts, for example), then you enter the analog realm.
You are missing the point. You can represent any integer with a binary representation, that's not the point. The point was having a computer use base 4 logic, without emulating it in base 2.
Base 4 is NOT analog, since you are using discrete values! Are you even aware of what analog is?
you still need analog circuitry to process that kind of 4-level signal. it's not something a transistor in saturation can do. it can't tell the difference between 2 volts and 4. put any voltage at the base and whatever voltage you have on the collector comes through the emitter (at least that's the case with an npn bjt). you need to use the transistor in amplifier mode (analog!) to interpret the signal as a 4-level one. you need to do a lot of extra processing on those signals to be able to build an instruction set around them. it is far easier to use a 2-bit encoding scheme.
you need only look at traditional methods of doing something like base 10 computing. we didn't use 10 discrete levels. instead we used an encoding scheme to operate in binary, with traditional gates; however, the smallest piece of data for operation was a decimal place. these are cascaded to provide numbers of any length. all input and output was in native decimal. numeric instructions were designed around decimal encoding, so it was in fact a decimal computer, despite the internal use of binary signaling.
also, digital circuits are faster due to their reduced complexity. every time you hit a p-n junction you get a small propagation delay. a diode has 1 junction, transistors have 2 junctions each (npn or pnp, depending on type). if you need to compare 2 bits of data, it takes an xor gate, which (from a diagram i found in a google search) has 2 diodes and 2 transistors. this will return a logical zero if the operands match (you can throw a not gate, aka inverter, on there to give a true return on match, and this is just another transistor). if you need more than one bit you can cascade the xor gates and have them operate in parallel, which takes the same amount of time for 1 bit as it does 2, 8, whatever. now, comparing 2 analog signals (voltages) takes a couple of analog comparators. a common tutorial on the net is to build an analog comparator from an opamp. this can only tell you if one voltage is higher than another, so you need 2 of them to determine if you are within a certain tolerance of the value you're looking for (the precision issue). of course an opamp uses many transistors (one schematic shows as many as 20), resulting in much longer delays (the sum of all delays at junctions in series). analog signals can move more data, but analog processors are slower due to the greater number of semiconductor devices used in their construction.
Digital circuits are slower than analog, period. Try to design a simple calculator using analog and digital components and you'll see that. The problem with analog is not speed but precision. Small errors accumulate, while the same doesn't happen (or at least isn't supposed to happen) in the digital realm.
depends on the instruction. the comparator is one example. but say you get into multipliers, then for analog it comes down to just 2 junctions. your operands become your input signal and whatever gain you set for the transistor. division is a simple voltage divider. there are some things analog does well, like operating on numeric values, but when you start doing logic the circuit design becomes a convoluted mess. it all comes down to analog comparators, which are slow in terms of propagation delay. i'm not even gonna touch on analog memory systems. there is a reason why our digital computers are fast as **** and analog computers are completely unheard of. it has nothing to do with the precision of numeric data.
in fact, digital computers have a huge issue with numeric precision. without an fpu, 5/2 is 2. you need to use fixed point computation to get the proper answer of 2.5 out of an integer unit. you get into floating point, and you're only guaranteed precision to a few places; you can't have a large number with a lot of useful information right of the decimal point. analog actually has more precision in this regard, even though it fails horrifically at logic. digital is very good at logic, with only 4 types of gates you can do every operation necessary for a computer. that includes floating point operations. only by virtue of bus width can digital really handle numeric data.
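for example, here's a sketch of 16.16 fixed point in c (the format is arbitrary, picked just for illustration) that gets 2.5 out of an integer unit:
#include <stdio.h>
/* sketch: 16.16 fixed point, the low 16 bits hold the fraction.
   integer hardware alone produces 5/2 = 2.5 this way. */
#define FRAC_BITS 16
#define TO_FIXED(x)   ((x) << FRAC_BITS)
#define FROM_FIXED(x) ((double)(x) / (1 << FRAC_BITS))
int main(void)
{
    long long a = TO_FIXED(5LL);
    long long b = TO_FIXED(2LL);
    long long q = (a << FRAC_BITS) / b;  /* pre-shift the dividend to keep the fraction */
    printf("5/2 = %g\n", FROM_FIXED(q)); /* prints 2.5 */
    return 0;
}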
-
You seem to be confusing how you (as a human) do calculations with how the computer represents information.
If you use a base, no matter what that base is, you are using a digital reference frame! It doesn't matter if it's two, four, a billion or twenty six quadrillion, as long as you have a fixed finite base what you have is digital. Sure, you might have difficulty making it work due to overshoots and whatnot, but that's just a detail and doesn't make such a computer impossible.
Analog by definition doesn't have any sort of base, since you are not using discrete values in the first place! (although I'd argue analog is just a badly done digital system with an enormous base)
Regarding circuitry, analog comparators' latency is measured in picoseconds, about the same as logic gates... I'm not sure where you are getting the "analog circuits are slow" from. Think of it this way: with analog computers you can use the values as they are, but with digital you still need to interpret them.
The whole reason analog computers are not used, as I've repeatedly mentioned, is precision. You have a device that can give you an unpredictable result (or a predictable one with a range of error, if you view it another way), which makes errors accumulate.
For instance, you have an extremely long arithmetic operation. Unless your analog computer was ridiculously precise, you'd have a range of error much larger than whatever result you might get.
To put it more simply, no one wants to consult their bank account statement repeatedly and get different observations of what should be the same value.
-
You seem to be confusing how you (as a human) do calculations with how the computer represents information.
If you use a base, no matter what that base is, you are using a digital reference frame! It doesn't matter if it's two, four, a billion or twenty six quadrillion, as long as you have a fixed finite base what you have is digital. Sure, you might have difficulty making it work due to overshoots and whatnot, but that's just a detail and doesn't make such a computer impossible.
Analog by definition doesn't have any sort of base, since you are not using discrete values in the first place! (although I'd argue analog is just a badly done digital system with an enormous base)
Regarding circuitry, analog comparators' latency is measured in picoseconds, about the same as logic gates... I'm not sure where you are getting the "analog circuits are slow" from. Think of it this way: with analog computers you can use the values as they are, but with digital you still need to interpret them.
The whole reason analog computers are not used, as I've repeatedly mentioned, is precision. You have a device that can give you an unpredictable result (or a predictable one with a range of error, if you view it another way), which makes errors accumulate.
For instance, you have an extremely long arithmetic operation. Unless your analog computer was ridiculously precise, you'd have a range of error much larger than whatever result you might get.
To put it more simply, no one wants to consult their bank account statement repeatedly and get different observations of what should be the same value.
to revisit my previous analogy, a common opamp (opa705) has a response rate of about 20 microseconds. this is the time between receiving a signal and gain, and the output settling out as a response. an opamp is generally used as a comparator; it can do a more-than or less-than operation, depending on operands, and 2 can be used to prove equality. this is similar to how, when dealing with floating point numbers in c, it's not a good idea to use the == operator. it is often better to see if the number is somewhere within a tolerance range: 'if(a == b)' should instead be 'if(a > b-0.001f && a < b+0.001f)', to compensate for any floating point discrepancy. the digital part that does the same job is an xor gate + an inverter. a 74136 quad xor gate has a switching time between 12 and 55 nanoseconds, and a 7404 hex inverter has a delay of 3-33 nanoseconds. not to say you can't find high speed analog parts, but these are values from the datasheets of existing analog and digital ics, which i use in my electronics projects. when i work with microcontrollers i find i must spend a lot of cpu time waiting for the analog parts of the chip to do their jobs. now, this is on the order of a few hundred microseconds, so it's not an eternity, but it does affect performance when reading analog sensors and the like.
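spelled out as a runnable sketch (doubles this time, and the 0.001 tolerance is just as arbitrary as in my example above):
#include <math.h>
#include <stdio.h>
/* sketch: compare within a tolerance instead of using ==, the software
   analogue of the two-comparator window. pick an epsilon that fits
   your data; 0.001 here is arbitrary. */
static int nearly_equal(double a, double b, double eps)
{
    return fabs(a - b) < eps;
}
int main(void)
{
    double x = 0.1 + 0.2;
    printf("== : %d\n", x == 0.3);                     /* 0 on ieee-754 hardware */
    printf("eps: %d\n", nearly_equal(x, 0.3, 0.001));  /* 1 */
    return 0;
}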
-
Unlike base 4, base 3 or ternary computers can be made with digital circuitry if you use balanced ternary, with signals of +1, 0, -1. Balanced ternary can express negative values as easily as positive ones, without the need for a leading negative sign as with decimal numbers. These advantages make some calculations more efficient in ternary than binary.
The Soviet Setun computers in the '60s and '70s used this system, and they outperformed their binary equivalents by some margin. This came at the cost of greatly increased programming complexity, though, so eventually ternary computers were abandoned.
Balanced ternary may be resurrected in the future, as optronic computing using polarization could easily distinguish between +1 and -1 signals.
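Here's a small sketch in C of the conversion, just to make the digit set concrete (writing the digits +1, 0, -1 as '+', '0' and '-' is a common convention, not anything Setun-specific):
#include <stdio.h>
/* Sketch: print an integer in balanced ternary, digit set {+1, 0, -1}.
   Note that negative numbers need no leading sign character. */
static void print_bt(int n)
{
    char buf[32];
    int i = 0;
    if (n == 0) { puts("0"); return; }
    while (n != 0) {
        int d = ((n % 3) + 3) % 3;  /* remainder 0, 1 or 2 */
        if (d == 2) d = -1;         /* 2 is congruent to -1 (mod 3) */
        buf[i++] = (d == 1) ? '+' : (d == -1) ? '-' : '0';
        n = (n - d) / 3;            /* peel off the digit just emitted */
    }
    while (i--) putchar(buf[i]);    /* most significant digit first */
    putchar('\n');
}
int main(void)
{
    print_bt(8);    /* prints +0- (9 - 1)  */
    print_bt(-8);   /* prints -0+ (-9 + 1) */
    return 0;
}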
-
ternary is easy to isolate with diodes, so it's not hard to distinguish high, low and negative high. you can increase the throughput of a single wire by 50% over binary, with only a few extra diodes. thing is, computers are so damn fast that many high speed busses use differential signaling, where you have a positive signal and a complementary (negative) signal on the other line. a positive voltage is complemented by a negative voltage on the other line. ternary might not work well, as the middle level (0) has no complement. also, a signal does not transition instantly. it takes time to go from 1 to zero, and a ternary signal takes twice as long to go from 1 to -1 as it does to go from 1 to 0. you just have to allocate time for a full transition to occur before the signal is valid, and of course this is costly.
-
DNA is a way to store data. It doesn't specifically "operate" in base 4 because the DNA replication doesn't use mathematics - it just replicates, the only logic involved is the pair forming of the four bases. DNA is a continuous data block that just happens to use four different bases (in CHEMICAL context) as its method of storing the protein encoding instructions. Making a logical operator that uses DNA would be quite hard (and would likely require use of RNA to modify the DNA sequences), but as memory, DNA would be better suited. I am unfamiliar with the proposed DNA computers' principles, so I won't say any more on that subject.
The human brain, on the other hand, is a neural net with binary signals, but much more complicated than binary logic. The signals are either on or off: electric potential opens calcium ion gates between nerve cells, and electric potential pulses travel through synapses, not as electrons like in metallic conductors, but as ions. It's more similar to what you get if you put two types of electrolytes in two glasses and connect them with a wetted paper strip, then put an anode in one glass and a cathode in the other; you can run current through the wetted paper strip in the form of ions traveling through it.
The neural network formed by nerve cells and their connections is, then, hooked to a LOT of input/output nodes, and is largely dependent on those inputs and outputs to function properly and meaningfully (sensory deprivation is a very nerve-wracking situation, literally). And while individual nerve signals are of the on/off variety, the brain still interprets sensory input as largely analog signals, depending on how many nerve endings are sending the same signal, in which case the signal amplitude increases. Each brain has similar parts, such as the main input/output lines up to the spinal cord and brain stem, and their direct handling areas (for example, the visual cortex is roughly similar in structure for each person, and in roughly the same location, too), but each brain is also individually structured based on genetics and the experiences forming new neural pathways.
The resulting jumble of nerve signals results in personality and consciousness, but it cannot really be looked at as analogous to a "computer" as we see it. There are a lot of parts in the brain that do tasks similar to "computing" - most of it subconscious routines such as breathing or hormone control, as well as balance handling, which affects things like motion control, image stabilization, eye tracking, and a lot more. But even then these subroutines can't really be thought of as binary computers that get input values from senses and send output values accordingly; it's more of an analog system that has assembled itself to respond to stimuli in a way that produces certain results.
It would be expected that a neural network assembled with evolutionary algorithms would become a largely similar system, with no specifically designed features, but instead stuff that just works as needed.
not exactly wrong but not completely right
-
not exactly wrong but not completely right
(http://alltheragefaces.com/img/faces/large/****-yeah-close-enough-l.png)
-
Well, I still say the real strength of computing comes from networking; it's that which has encouraged the leap forward of digital technology far more than internal processing versatility, whether that be in the form of mobile devices or cloud computing. Even things like SETI online show the usefulness of it.
The problem with non-binary systems, for me, is the fact that even getting a simple binary signal to remain stable and readable across a variety of telecommunications networks is quite a difficult job. Voltages are not universal, and so a computer actually has to look for the difference between the two signals it gets rather than distinct values and, at least until recently, those systems were very unreliable for data transfer in the first place.
I see multi-state computing to be something similar to the IDE interface. The whole reason it was adopted in the first place was because it was believed that sending multiple signals at once would be faster than serial systems. This was true for a short while, but the reason that SATA was developed was because making sure all the signals arrived at the same time on an IDE port was actually turning out to take longer than a modern Serial connection.
Whilst more modern communication systems might be a bit more receptive to 3+ state signals, I still don't think we are quite there.
-
close enough
wellllll
nerve signals aren't always on/off digital. last time i was really involved in neurobio, the field had just begun to agree that neurons transmit both a binary action potential and the analog signal present in the cell body.
it's also important to note that this
electric potential opens calcium ion gates between nerve cells, and electric potential pulses travel through synapses
is a little misleading; if a cell fires an action potential, that action potential doesn't flip adjacent cells ON or OFF - rather, it contributes an excitatory or inhibitory potential to connected neurons (EPSP or IPSP), altering the probability that that cell will fire in a graded fashion summed across all contributing nerves. so individual nerve signals aren't exactly 'on/off' variety; even the action potential itself, as it turns out, isn't on/off
this is a really important mechanistic difference between computers and brains and it's one that makes computational neuro a tricky field
the brain still interprets sensory input as largely analog signals, depending on how many nerve endings are sending the same signal, in which case the signal amplitude increases
this is also a bit problematic. there's not a direct correlation, exactly, between number of nerves firing and signal amplitude, nor between stimulus intensity and number of nerves firing. some sensory nerves inhibit others; some are frequency-dependent; some have specialized modality-linked functions. psychophysics is also a really interesting field.
it's important to note that the brain does a VAST amount of postprocessing on most sensory data, much of it additive and interpolative rather than reductive
The resulting jumble of nerve signals results in personality and consciousness
(http://www.ideacenter.org/stuff/contentmgr/files/e27b080d92450837e43d44bf73780847/misc/image5.gif)
It would be expected that a neural network assembled with evolutionary algorithms would become a largely similar system, with no specifically designed features, but instead stuff that just works as needed
this is...a very interesting question, and one that i'm not totally prepared to hypothesize on (regarding the convergent development of a simulated neural system)
i spent the weekend in the neuro lab at NYU and they are really ****ing with what i thought were accepted dogmas in the neurobio and neuropsych field
-
Very interesting. Especially the part about the action potential not being strictly 0/1 on an individual signal basis; I have no idea how that would work. Are there multiple channels in an individual synapse connecting two neurons? Or does it work with the general intensity of the action potential, or a timed sequence with pulse data?
Just one thing I disagree on - the miracle thing... what I meant is that we don't exactly know how it happens (due to the immense complexity of the whole process), but that the brain definitely does produce the individual personality and consciousness (however you want to define it), no miracles involved.
I just suspect we aren't really well equipped to categorize and analyze the process of how consciousness emerges from the net activity of the brain, as it's an evolved process and we're sort of geared to think on a design basis. All that the brain does is based on a long-ass evolutionary algorithm that has produced what we have now; there are probably a LOT of leftovers from solutions that didn't quite work but left unused structures (one example would be the epithalamus and its connection to the parietal eye) that might end up having a similar role as what we thought was "junk" DNA.
Strictly speaking, you could look at any current gravity theory and paste that "then a miracle occurs" image there at some point, too. :p The fact that a theory has gaps in it doesn't discredit the theory, only limits its field of application.
Similar to how we can empirically observe the existence of gravity, we can clearly say that things such as personality and consciousness do exist. And while we can't exactly say HOW both phenomena work, we have a pretty good idea that gravity is connected to property called mass... and that personality and consciousness are connected to neural activity in the brain.
Of course, all this depends on the definition of term "miracle" as well. ;7
-
Just one thing I disagree on - the miracle thing... what I meant is that we don't exactly know how it happens (due to the immense complexity of the whole process), but that the brain definitely does produce the individual personality and consciousness (however you want to define it), no miracles involved.
yes, any good scientist (or level-headed thinker) is a strong monist materialist by this point, myself included
the comic calls attention to an understandable oversimplification of an extremely thorny scientific problem we have yet to completely resolve
you probably do not need to expend lots of words convincing me that personality and consciousness are encoded in brain matter given my participation in events like the teleport threadnought
-
It was mostly for other people's benefit. :p
-
Very interesting. Especially the part about the action potential not being strictly 0/1 on an individual signal basis; I have no idea how that would work. Are there multiple channels in an individual synapse connecting two neurons? Or does it work with the general intensity of the action potential, or a timed sequence with pulse data?
Action potentials involve a series of ions. The simple explanation talks about calcium, but calcium, sodium, potassium, and a few other ions all play roles in the action potential traveling down an axon. All of these are involved in both the net result - binary - and the magnitude - analog. If I'm thinking of the same thing batts is, action potentials can "partially" fire, resulting in a lesser response at the synapse. As action potentials reset incredibly quickly, I believe the synapse also responds to pulses. If you want more detail, I'd have to go back and look it up.
-
Democritus was bound to regard the soul as material (composed of round, smooth, specially mobile atoms, identified with the fire-atoms floating in the air)
We're still at the same spot we were 2000 years ago on this front. Nobody has a clue, because it's not something we have any moral way to study in the slightest.
-
Democritus was bound to regard the soul as material (composed of round, smooth, specially mobile atoms, identified with the fire-atoms floating in the air)
We're still at the same spot we were 2000 years ago on this front. Nobody has a clue, because it's not something we have any moral way to study in the slightest.
well we have as much compelling evidence for the immortal soul as we do for the phlogiston, giving it a pretty solid scientific position for the time being
-
We're still at the same spot we were 2000 years ago on this front. Nobody has a clue, because it's not something we have any moral way to study in the slightest.
What color is the unicorn that you see? Mine is green! :)
EDIT:
(There is just as much evidence for an invisible unicorn only I can see as there is for anyone's immortal soul. This post is not here to taunt; it is to point out the inherent error in the OP's logic)
-
Ah, sorry about that, I used the wrong word, "soul". It was a quote from some website. Obviously I don't think there's some part of me that's going to Hades like that person did. It was a Greek philosopher, by the way, not Christian. The 2k was just rounding, one significant digit, see. I don't even know where "immortal" came from.
I was trying to point to some guy theorizing on the same thing 2300-something years ago. Also to point out that whatever consciousness is, we haven't made any progress on identifying it. His idea was silly, but also on par with anything else I've seen.
I've long since given up on trying to convince people that my unicorn is real. Btw, yeah, it is green.