Discussion about this post

Bonny Becker

The conversation starts out by distinguishing between consciousness and thought (reasoning/logic) but then conflates them again. Consciousness, to me, includes emotions. An acid stomach. A tightening throat. A feeling of awe. A feeling of fear. Perhaps most fundamentally, a sense of existence and non-existence. What would an AI "feel" with? There's nothing physical there. No hormones. No adrenaline. No biological tissues to be soothed or harmed. I suppose AI could learn that it doesn't want to cease existing, but is that the same as the deep, deep drive to live and proliferate that is built into biology? I guess for now I'm in the biology camp when it comes to what I consider consciousness.

Bob Gobron

My experience with ChatGPT, UAI, and others as a code-writing assistant leads me to think very strenuously that AI does NOT pass the Turing Test, defined as something that cannot be distinguished from a human counterpart. An example: say you're chugging along in C#, iterating your program, banging out the dents and so forth, and you say, "Hey AI, I'm getting a naming error on thing-a-ma-jig. Can you rewrite the program exactly the same except change thing-a-ma-jig to thing-a-ma-bob and reindex it everywhere it occurs?" And the AI will say, "Sure," and then come back with the whole program except instead of C# the whole thing is in Python. That is something that no human code developer would ever do.
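For illustration, a minimal C# sketch of the kind of mechanical rename being asked for here (the thing-a-ma-jig / thing-a-ma-bob names are the commenter's own placeholders, and the surrounding class is hypothetical): the request amounts to changing one identifier everywhere it occurs and nothing else, which is what makes a whole-language rewrite such a strange response.

```csharp
// Before (the version that triggers the naming error):
//   private int thingAmaJig = 42;
//   public int Doubled() => thingAmaJig * 2;
//
// After the requested change: the same program, in the same language,
// with only the identifier renamed everywhere it occurs.
class Gizmo
{
    private int thingAmaBob = 42;

    public int Doubled() => thingAmaBob * 2;
}
```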

Then you say, "Hey AI, what the actual...?" And it responds, in a very cheerful and apologetic tone, "Terribly sorry! I'll fix that right away!" And then it does, which, again, is very, very different from working with a human programmer. The total lack of snark or passive aggression is disconcerting. I've been a systems engineer for quite some time, but my expertise is much more on the hardware side (optics), so I've needed the help of a high-level programmer at least to get me set up and going on stuff many, many times. As far as programmers go, I've worked with almost all of the different phenotypes of the breed (there are only, like, four or five), and none of them respond with anything remotely resembling "cheer" when you tell them their code isn't working. Not once in 25 years.

As far as the "consciousness" question goes, wouldn't that be easier to answer after neuroscientists define how consciousness works in humans? Where it lives and how it interacts with limbic functions? Because if we stick to the latest research as I understand it, where consciousness begins and ends is not as clear as it may seem. But for the sake of this discussion, I'm going to stand on "no cerebral cortex, no consciousness." Those NVIDIA machines are powerful, but they're nothing next to a few hundred million years of brain evolution.
