I find Thomas Nagel's "what it is like to be" [1] concept fascinating. I have spent quite some time trying to imagine what it is like to be a rock. Mind you, not from the perspective of a human ("A rock probably experiences time very quickly, because it erodes, etc., etc."), but from the perspective of the rock itself. That is, it has no senses, no memory, no capacity for abstraction, no consciousness.
This ruminating has led me to believe that time and logic are human concepts, and are not as universal as is commonly believed. With recent insights into neural networks, I wouldn't even be surprised if the laws of physics follow from the way our brains are wired, instead of the other way around. Perhaps this is simply a modern take on idealism or realism, but I can't find a strand of philosophy with which I feel at home.
Obviously, there is a bootstrapping problem with trying to reason from something that cannot reason. And I am well aware that my brain must exist in some form of reality. To conclusively prove some apparatus for that is way out of the scope of science. Scientifically there is probably very little to learn from this anyway, apart from opening one's mind to some alternative possibilities. It's a fun exercise, though.
However, the entire discussion about what consciousness is strikes me as less interesting. Is this really more than being able to conjure up memories of past experiences?
[1] https://en.wikipedia.org/wiki/What_Is_It_Like_to_Be_a_Bat%3F
> However, the entire discussion about what consciousness is strikes me as less interesting. Is this really more than being able to conjure up memories of past experiences?
I don't think memory and consciousness are intrinsically linked. Memory is something consciousness can be aware of, but it's not consciousness itself. Someone can have their ability to process and remember memories permanently or temporarily damaged, and yet still have a conscious experience. An AI can have memory, but not have a conscious experience. (Or at least it seems that way - if something like Integrated Information Theory is true, then maybe AI does have some sort of first person conscious experience. I tend to doubt that, but I could be wrong.)
EDIT: although I might be conflating short term and long term memory. I wonder if consciousness requires at LEAST some form of memory processing, even if it's just the past couple of seconds. Perhaps the "Strange Loop" needs at least that to arise. I'm not sure.
> And I am well aware that my brain must exist in some form of reality.
To mess with your head a bit more:
We know of no way to perceive the flow of time other than indirectly, through memory of the flow of events and sensory inputs.
And so while it seems probable that our brains must exist, consider e.g. a simulated mind that is paused, with the same step executed over and over and no ability to change state: it would have no way of telling that apart from the same step being executed only once before moving on to the next.
In that case it's not clear that there'd need to be any full brain, just whatever consciousness is, in a state that has a notion that it has a memory of a past instant.
Put another way: Your consciousness could be just a single frame of a movie, hinting at a past and future that might not exist.
Forever repeating the same infinitely short instant, and nothing else. Maybe the universe is just a large tableau of conscious instants statically laid out and never changing or interacting with each other. We wouldn't know any different.
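The paused-simulation point can be made concrete with a toy sketch in Python (the `MindState` class and `step` function are of course invented for illustration): if the transition function is deterministic and its result is never committed, ten thousand repetitions of the same step over a frozen state are, from the inside, indistinguishable from one.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MindState:
    # A hypothetical, maximally simplified "mental state": its only
    # content is the memory of one past instant.
    memory_of_past_instant: str

def step(state: MindState) -> MindState:
    # A deterministic transition: same input, same output, always.
    return MindState(memory_of_past_instant="the instant just before now")

frozen = MindState(memory_of_past_instant="the instant just before now")

# "Pause" the simulation: execute the same step over and over,
# never committing the result. Every execution yields the same
# state, so the set of all outcomes collapses to a single element.
results = {step(frozen) for _ in range(10_000)}
assert len(results) == 1
assert step(frozen) in results
```

Nothing inside `step` could count how many times it has been run, which is the sense in which the repeated instant and the single instant are the same experience.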
Of course that is entirely untestable, and so just a fun philosophical question to ponder, mostly as a means to point out just how little we can know, and so how willing we need to be to accept that what matters isn't what we can know, but what is likely to be able to affect our observed world.
E.g. I see myself as a materialist (philosophically speaking) not because I believe it is or can be proven, but because it is our observable reality. If that materialist reality is shaped by our minds and only exists in some abstract form, or we're all in a simulation etc., then that is irrelevant unless/until we find a way to pierce the veil and/or those aspects "leak" into our observable reality somehow.
Scientifically, there is a lot to learn. If we understand alternate forms of consciousness, we can potentially alter our own and open up new avenues of experience.
Your last comment strikes me as strange for someone who seems to be well read on the topic. Saying that consciousness is an ability to recall memories doesn’t really describe what it is in the natural sense. The memories themselves are composed of conscious experiences, so that definition is circular. An explanation of what consciousness is would include an explanation about why, for example, chocolate tastes the way it does, rather than like vanilla, or some other completely unknown taste. Until we can explain its character (rather than just describe it), we can’t explain what it is. It’s sort of like dark energy: we can describe the phenomenon but we haven’t fully explained what it is.
I can recommend Being No One, by Thomas Metzinger, for essays.
For sci-fi, have a look at Blindsight, by Peter Watts (for free on his website: https://www.rifters.com/real/Blindsight.html)
This one resonates very well with me: https://www.organism.earth/library/document/simulation-consc...
You have to give it a chance. He first builds up an argument for why consciousness cannot depend on the physical substrate itself, but rather on the "interpretation" of it. It is very important to understand this part of the argument. What follows is the part that resonates with you, namely how our consciousness is now 'tuned' to the current physical laws.
what of a stone's gestation and its long existence as a geological stratum, then some event!, and a fracturing of its collective existence as stone, and acquiring an identity as rock, not mere sand or gravel, short of the exalted state of being a boulder, a rock
in a very real sense, we share this
> trying to imagine what it is like to be a rock
That is how I perceive meditation to be. At least, the end goal that I have yet to achieve, anyway.
It's a dumb click-bait title (riffing on Nagel's "What is it like to be a bat?"), but the actual question presented a bit further down is:
"Moving down the scale through lizards and fish to slugs, similar considerations apply. There does not seem to be much reason to suppose that phenomenology should wink out while a reasonably complex perceptual psychology persists… As we move along the scale from fish and slugs through simple neural networks all the way to thermostats, where should consciousness wink out?"
The author seems to have succeeded in answering her own question (at least in hand-wavy fashion) at the same time as posing it, as well as implicitly defining consciousness. So, yeah, it's not like anything to be a thermostat.
Surely you agree that not only are thermostats not conscious, but simple neural networks are not either, e.g. a single-layer perceptron. And intuitively it's also not just a matter of the number of layers or neurons.
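For concreteness, a single-layer perceptron is nothing but a weighted sum and a hard threshold; a minimal sketch (the weights here are hand-picked just to make it compute logical AND):

```python
def perceptron(inputs, weights, bias):
    # A single-layer perceptron: weighted sum of inputs, then a
    # hard threshold. The entire "network" is one inequality.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

# With weights (1, 1) and bias -1.5 the threshold only trips when
# both inputs are on, i.e. it computes logical AND -- mechanically
# not much richer than a thermostat's single comparison.
for a in (0, 1):
    for b in (0, 1):
        assert perceptron((a, b), weights=(1.0, 1.0), bias=-1.5) == (a and b)
```

Whatever consciousness is, it seems hard to credit it to one inequality; the interesting question is why adding more of them would ever change that.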
By the way: The headline is, I assume, by Annaka Harris, while the essay is by David Chalmers.
Perfect!
Title reminds me of Tim Hunkin's BBC series "The Secret Life of Machines", which he's put on YouTube. There is, funnily enough, an episode on central heating systems:
https://www.youtube.com/watch?v=PnQ9zkBzbYc
EDIT: there is of course a bit about thermostats: https://www.youtube.com/watch?v=PnQ9zkBzbYc&t=1137
Being a thermostat is fucking exhausting. My wife and I are the equivalent of a thermostat for our type 1 diabetic son's blood sugar. It's in our face 24/7.
Closed loop. Seriously, my t1d-related mental exhaustion is 90% reduced now that I'm using a closed loop. 95% if you don't count the "why did androidaps disconnect from my pump and which bit do I need to restart to get it working again" headaches.
In a way it's eliminating too much mental effort; while it's useful as a backup, the fact I sometimes completely forget to take insulin with meals is not ideal, even if the closed loop notices and takes care of it for me (since there's inherently more lag when relying on the loop than if I dialed in the insulin at meal time).
I'm type 1 myself, so I feel for you. People don't realize that the hardest part is NOT needles or anything like that, it's the constant mental overhead of thinking about and managing your blood sugar every moment of every day. Every decision you make is informed by it. You make literally hundreds of micro and macro decisions related to your diabetes every day.
A CGM and insulin pump have made my life easier.
How old is your son?
Does anyone know of a smart thermostat that actually has this function? Every thermostat I've looked at has a "heat mode", where it decides whether it should be blasting heat or not, and a "cool mode", where it decides whether it should be blasting the AC or not. I have not found the mythical smart thermostat that does the job of "keep the temperature around here", +/- a few degrees.
I live in an area where it's cold at night and hot during the day, and I am bad at remembering to switch the thermostat from mode to mode. I haven't found a programmable IoT thermostat I can write a script for; recommendations welcome!
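For what it's worth, the control logic being asked for is just dual-setpoint hysteresis, and it's small enough to script yourself if you ever find a thermostat that exposes an API. A sketch in Python (the function name, setpoints, and deadband are all made up for illustration):

```python
def hvac_action(temp_f: float, heat_below: float = 66.0,
                cool_above: float = 74.0, deadband: float = 1.0,
                current: str = "off") -> str:
    """Dual-setpoint 'auto' mode: heat when too cold, cool when too
    hot, do nothing inside the comfort band. The deadband makes the
    system overshoot its setpoint a little before switching off,
    which prevents rapid toggling right at the thresholds."""
    if temp_f < heat_below:
        return "heat"
    if temp_f > cool_above:
        return "cool"
    # Inside the band: keep the currently running mode going until
    # we've pushed past the setpoint by the deadband (hysteresis).
    if current == "heat" and temp_f < heat_below + deadband:
        return "heat"
    if current == "cool" and temp_f > cool_above - deadband:
        return "cool"
    return "off"
```

Cold nights trigger heat, hot afternoons trigger the AC, and nothing needs to be switched between modes in the middle.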
Another useful metaphor that doesn't cross the threshold from human experience into the "experiencing" that non-living things do, would be chemotaxis.
A bacterium finds food with a simple set of states. At its most basic:
- you are experiencing an increasing concentration of food. you keep swimming straight ahead.
- you are experiencing a decreasing concentration of food. you move in a continuously randomized direction.
this eventually gets them onto a track where they are moving towards food.
Extremely simple like a thermostat, yet effective.
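The run-and-tumble strategy above can be sketched in a few lines of Python (the 1D world, step size, and concentration function are invented for illustration): keep heading the same way while the concentration is improving, pick a random new direction when it isn't.

```python
import random

def run_and_tumble(concentration, start=0.0, steps=200):
    """Toy 1D chemotaxis: 'run' (keep direction) while the food
    concentration increases, 'tumble' (pick a random direction)
    when it decreases."""
    pos = start
    direction = random.choice([-1.0, 1.0])
    last = concentration(pos)
    for _ in range(steps):
        pos += 0.1 * direction
        now = concentration(pos)
        if now < last:  # things got worse: tumble
            direction = random.choice([-1.0, 1.0])
        last = now      # otherwise: keep swimming straight ahead
    return pos

# Hypothetical food gradient peaking at x = 5.
food = lambda x: -abs(x - 5.0)
random.seed(0)
final = run_and_tumble(food, start=0.0)
```

Runs toward the peak are never interrupted, while every step away triggers a re-roll, so the bacterium ends up hovering around the food source despite having no map, no memory beyond the last reading, and no plan.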
I feel that there is an alternative way of approaching the question: to propose that it is only meaningful to ask what it is like to be an X if the X has certain mental abilities, such as some sort of self-awareness of itself as a participant in a wider world. How would we go about evaluating and choosing between these two views, and is there room for there being degrees of 'what it is like' and self-awareness? It is almost as if we are trying to write the dictionary definition before we know enough to complete the job (which is not necessarily a bad thing, unless we assume that by making a choice, we have, ipso facto, filled in the previously-incomplete knowledge.)
I definitely take issue with Chalmers' opening sentence of his final paragraph: "… A final consideration in favor of simple systems having experience: if experience is truly a fundamental property, it seems natural for it to be widespread." I feel he is putting the cart before the horse here - something that seems quite common in the philosophy of mind - by first deciding that experience is a fundamental property, and then using it to justify the assumption that it is widespread. This strikes me as almost circular, as it seems one could at least as reasonably justify it being fundamental on account of the arguments for it being ubiquitous.
If you believe a thermostat has consciousness, you would also logically believe that the cascading subset of (your body - n*atoms) would also be conscious. E.g. your arm.
This is one reason the topic is so slippery.
I think that we find the idea of thermostat having experiences strange because, subconsciously, we think of experiences only being accessible to someone / something that has a "will to live" (in the words of Schopenhauer).
I.e. I don't think thermostats want anything. They don't have the capacity to care whether they fall apart or not, or whether anyone is satisfied with their function. But life, even in its very simplest form, wants something. Experience is what makes living organisms more effective at doing what they want.
What makes living creatures want something: I have no idea. I remember hearing a hypothesis tying this to better preservation of energy... but I don't know about that. If I had to guess, it must be something that's available to micro-organisms, so it has to have some physical/chemical explanation...
Their use of "phenomenal" and "phenomenology" confuses me as a layman, but I'll lay out their (likely relevant) definitions and hope to use that to better understand what is being proposed.
> phenomenal: Known or derived through the senses rather than through the mind.
> phenomenology: A philosophy or method of inquiry based on the premise that reality consists of objects and events as they are perceived or understood in human consciousness and not of anything independent of human consciousness.
So the claim (highlighted especially in paragraph 3) is that, outside of humans, things that are perceived may also exist in conscious thought (of non-humans).
I remember inspecting the thermostat in my parent’s house as a child. It was a coil of something metalic which I assume expands and contracts with temperature and physically pushes electrical contacts together to turn on the heat when needed. Knowing how it works, it’s hard for me to imagine that this feels like anything. The whole contraption is just an arrangement of molecules doing what molecules do. But then again, so am I.
Chalmers is a dualist living in a Cartesian past. If you like a lively treatment of dead scholasticism of the mind-vs-brain problem then you can do no better. Ditto Nagel.
In contrast if you want modern post-cartesian scientific thought on consciousness then hit Dennett hard for philosophy or Ray W. Guillery if you want hard neuroscience (The Brain as a Tool).
"What is it like to be a bat (or a thermostat)?" is too abstract for anyone to thoughtfully grasp.
Instead try asking yourself "what is it like to be asleep?" or "what is it like to be waking up?" or "what is it like to be heavily sedated?"
We all experience various gradients of consciousness every day as we do things like drift off to sleep or slowly gain consciousness in the morning. You don't have to try to imagine the experience of another primitive life form when you can just recall what there is or isn't to your own conscious experience as you drift between states.
https://consc.net/notes/lloyd-comments.html has some more info
This kind of panpsychist talk ends up feeling, to me, closer to a reductive materialism than to what I would first associate Chalmers with ("hey, did you forget you can experience stuff?"), which is probably just my ignorance of his work.
Because yes, you acknowledge "experience", but you make it a function of a physical state described in such and such a way. In the same way that a set of particles at points A, B, C, .. correspond to such and such a (e.g. electric) field strength at point Z, we now imagine it could correspond also to such and such an experience.
It's just barely "experience" on its own terms, and it elicits a kind of epiphenomenalism and powerlessness. The thermostat*, after all, doesn't choose anything, nor does it profess to have any agency. So agency would end up as some kind of epiphenomenal "observable" of a system.
But besides being deflationary in this distasteful way, what bothers me about pictures like this is that they make use of entirely subject-made divisions between objects and their environments, and presume that these might correspond to experiences because - why not? Why not think of the bottom and upper halves of the thermostat as corresponding to two fields of experience? Or the quarters, sixteenths, and so on, until we get to individual atoms.
The thermostat doesn't "care" if I think of it as the wax and glass separately, or as a single object containing both. But we do have a unified field of experience, and it doesn't matter how another person "cuts us up" in their mind, whether it is as atoms interacting, organs behaving in unison, or just as a "body".
It seems silly to say that between me and Bob having our separate experiences, there is an experience corresponding to "me and Bob", supposedly free-floating somewhere just by virtue of the two of us being cognizable as a physical system.
It turns "experience" and that infamous "qualia" from the most direct and obvious thing there is into a weird phantom: the output of a presumed equation that maps some description of a physical state to an "experience".
No wonder you'll find people who'll retort that they don't experience things or that their consciousness is illusory - they have these weird detached notions of experiences to fight against.
* I imagined a thermometer throughout reading this piece, hence the mention of wax and such. It doesn't really change the point so I'm leaving it.
For it to be 'like' anything to be something, the something must have some sense of self. Without a sense of self, there is just information processing.
For that reason, it isn't 'like' anything to be, say, most insects, let alone a thermostat.
Wouldn't the thermostat be more closely aligned with a nerve in the overall system, and the control board with the brain? The brain can get signals from multiple thermostats in a system to control the temperature.
Anyone know what font that is? (On mobile.) It reminds me of an old '70s print.
1996
This is why no one actually likes philosophers