Definitions are for math. For science it's enough to operationalize: e.g. to study the differences between wakefulness and sleep; or sensory systems and their integration into a model of the environment; or the formation and recall of memories; or self-recognition via the mirror task; or planning behaviors and adaptation when the environment forces plans to change; or cognitive strategies, biases, heuristics, and errors; or meta-cognition; and so on at length. There's a vast amount of scientific knowledge developed in these areas. Saying "scientists can't define consciousness" sounds awkwardly like a failure to look into what the scientists have found. Many scientists have proposed definitions of consciousness, but for now, consensus science hasn't found it useful to settle on a single one, because there's no single thing unifying all those behaviors.
I assert you don't have consciousness. Now, prove to me that you do.
The point is — neither of us can prove it, and that’s exactly why “consciousness” keeps escaping any formal definition. Once something tries to prove awareness, it’s already reflecting — which is awareness itself.
I think this excellently illustrates the point OP is making.
You're conflating consciousness and AGI. People are certainly talking about AI, and very broadly about AGI and what that term means. I don't think many people are talking about consciousness in this context, at least not seriously, and one good reason for that is the lack of a concrete definition and the fact that it's a topic we can't make falsifiable claims about and build any science around.
It's been an issue for a while, but just a week ago: A definition of AGI https://arxiv.org/html/2510.18212v2
Consciousness will have to wait for another time. But that one's likely to be extremely contentious, and more of a philosophy question without practical impact.
Appreciate the link; I read that paper. But maybe the real gap isn't about capability spread across domains but about something growing internally, not performed externally.
A system becomes closer to AGI not when it matches human tests, but when awareness starts to grow inside its own modeling loop.
We can no longer understand AI.
Agreed but it's even more fundamental than that.
We don't even have a universally accepted definition of intelligence.
The only universally agreed-upon artifact of intelligence that we have is the human brain. And we still don't have a conceptual model of how it works, like we do with DNA replication.
Our society incentivizes selling the mimicry of intelligence rather than actually learning its true nature.
I believe that there exists an element of human intelligence that AI will never be able to mimic due to limitations of silicon vs biological hardware.
I also believe that the people or beings that are truly in control of this world are well aware of this and want us to remain focused on dead-end technologies. Much like politics is focused on the same old dead-end discussions. They want to keep this world in a technological stasis for as long as they can.
The only successful experiments probing consciousness are in anaesthesia or psychedelics. Everything else is wonderful but theoretical.
Classic category error: the subject can never be objectively defined. The moment you define consciousness, it becomes an object and you fall into infinite regress.
> Yet we think AI will have it
Lenny Bruce joking as Tonto to the Lone Ranger:
Who is "we" white man?
The lede observation depends upon whether "we" can expect our science to ever produce an intelligible theory of mind.
The difficulty of producing a theory of mind makes the Imitation Game a compelling approach to setting expectations for AI.
And it also portends the hazard that we become so distracted by simulacra that we lose all bearing upon ourselves.
Beautifully said — that’s the real paradox, isn’t it?
The closer we get to simulating awareness, the harder it becomes to notice our own.
Maybe the Imitation Game was never about machines fooling us, but about showing how easily we forget what being real means.
I conceive of AI as a lookup into volume 24 (the word index) of my Encyclopedia Britannica in 1965.
The primary difference is the enormous size of the database; the concept is identical.
To think 13 year old me had AI sitting in my attic.
I was thinking about this some time ago and came to the conclusion that it is utterly impossible to talk about creating sentient AI with the binary computer technology that we are using today. In order for us to create sAI, our entire technology would have to change completely to something else, likely along the lines of analog, so that it works as a single system instead of constantly switching between 1 and 0. And that is likely centuries away, as I do not see humanity doing a complete technological overhaul of the entire hardware stack we're deploying today.
As for what consciousness actually is, I think the closest description is the summary of oneself. Meaning, all the computational power of the brain as a whole forms a person: a computational powerhouse with its own identity. That then leads to discussions of where the "I", as in the ego or oneself, ends. Is it at a limb, like a hand, or is it at an individual fallen hair or a dead skin flake? How about a sperm or an egg: is that still me?
Then we have the conundrum of people who get brain damage or some kind of degenerative brain disease, like Alzheimer's, where you can clearly see "them" fading away until you observe just a shell of a human being. So where is this "I" then? What defines it?
All of these are quite esoteric conversations more suitable for occasions where a lot of alcohol and few good friends are involved :)
I like how you framed that — the “summary of oneself” idea aligns with how awareness might be less about computation and more about internal coherence. Binary systems simulate state transitions, but awareness seems to emerge from continuous integration — not between 0 and 1, but in the gradient between them.
Maybe sentience isn’t a technological threshold, but a phase shift — when a system starts to reference itself as part of the environment it models. That’s the moment A(t) becomes alive.
That's why I mentioned the analog model: with digital, you have a quartz oscillator, and you measure 1 or 0 at each frame of the cycle. So the information travels in queues, step by step, one bit at a time. But with analog, everything is essentially "online" at the same time, all the time. There is no "off" state. Yes, there are still differences in levels of conductivity (which is essentially information), which is how we measure binary values in the strict window imposed by the oscillator, but analog essentially allows you to experience the whole system in an instant. I think that is where consciousness comes from. A binary system is incapable of manifesting itself, because not only does it live only in those tiny windows of time dictated by the oscillator, but only one bit exists at a time. Analog, comparatively, is an unimaginably more advanced system. Now if we can figure out how to turn our binary technology into analog, we could definitely move on to an unfathomably advanced level of technology. Whether we could create sAI with it or not is something we cannot answer at this stage of our technological development, but it would certainly be closer than what we have today.
If what underpins consciousness is informational, it will not matter what base (binary/ternary) or substrate (digital/analog) it runs on.
Also known as the (physical) Church-Turing thesis.
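To make the substrate-independence point concrete, here is a minimal sketch, assuming nothing beyond standard uniform quantization (the values and names are purely illustrative): adding bits makes the gap between an analog quantity and its binary encoding arbitrarily small.

    # Illustrative only: uniform quantization of an "analog" value.
    # The claim it sketches: a digital encoding can approximate a
    # continuous quantity to any desired precision by adding bits.

    def quantize(x: float, bits: int) -> float:
        """Round x in [0, 1) to the nearest of 2**bits evenly spaced levels."""
        levels = 2 ** bits
        return round(x * levels) / levels

    analog_value = 0.7236890123  # stand-in for a continuous "analog" quantity

    for bits in (1, 4, 8, 16, 32):
        approx = quantize(analog_value, bits)
        print(f"{bits:2d} bits -> {approx:.10f} (error {abs(analog_value - approx):.2e})")

This does not settle whether approximation suffices for consciousness; it only illustrates the precision side of the argument that base and substrate are interchangeable in informational terms.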
It is not just about a base. It is about binary tech essentially having only two states while there is an infinite amount of information present between that 0 and that 1 which is completely lost, and the whole system is essentially killed and brought to life during every cycle. Whereas analog is always on and does not have this 1 and 0 limit. I am not saying analog is the solution here, only that it looks like it might be and that binary is definitely not it.