
I am of the mind that consciousness is an emergent property of the various interconnected information processing systems of the brain. There's just too much evidence: for instance, changing the physiology of the brain through drugs or trauma produces predictable changes in consciousness. From that perspective, I think it's pretty clear that consciousness is some kind of computation rooted in the physical world.

However, as to the question of whether consciousness is a classical computation which could be expressed as a classical computer program, that part is up for grabs. The brain processes information in a fundamentally different way than classical computers: it's essentially a massive dynamical system of many, many variables all interacting with each other in real time. It's entirely possible that, given the dimensionality and parallelism of the brain, there's not enough material or energy in the universe to construct a computer capable of simulating the entire system in real time. Maybe quantum computers will be able to manage it, or maybe we'll need to engineer biological neural networks to get there artificially.
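
(To make the scale problem concrete, here's a toy sketch of what I mean by "simulating the system" - purely my own illustration, with made-up sizes, connectivity, and dynamics, not anything from actual neuroscience. Every unit's state has to be updated at every time step from the states of everything it's coupled to; the real brain has on the order of 10^11 neurons and 10^14 synapses, so the per-step cost is enormous.)

    import numpy as np

    # Toy coupled dynamical system: N "neurons", each updated every
    # millisecond from the weighted activity of the units it connects to.
    # N, the coupling density, and the dynamics are arbitrary stand-ins.
    N = 1000                                  # toy size; a brain is ~10^11
    dt = 0.001                                # 1 ms time step
    rng = np.random.default_rng(0)

    mask = rng.random((N, N)) < 0.01          # sparse random connectivity
    W = rng.normal(0.0, 0.1, (N, N)) * mask   # coupling weights
    x = rng.random(N)                         # initial state

    def step(x):
        # Leaky integration: each unit decays toward zero and is driven
        # by the combined activity of every unit coupled to it.
        return x + dt * (-x + np.tanh(W @ x))

    for _ in range(1000):                     # simulate one second
        x = step(x)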

Or maybe consciousness is a thin illusion on top of a couple of clever tricks we have yet to figure out.

Anyway I hope we get a little closer to figuring it out in my lifetime.



I think there are precisely two options.

Consciousness is a pure information process, and therefore computable.

Or consciousness is not an information process.

We only know of one thing that is not an information process, and that is entropy.

Given that entropy is precisely the creation of new information, and that this is a big part of most people's conception of free will, it seems reasonable to conclude that consciousness is something like a combination of self-reference and entropy generation rather than some special other thing.
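
(For reference, the information-theoretic sense of entropy I have in mind here is Shannon's, where the entropy of a source is the expected information per outcome:

    H(X) = -\sum_x p(x) \log_2 p(x)

so a process that keeps generating entropy is, in that sense, continually producing new information.)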


It's an interesting thought. I can imagine that consciousness is something like the experience of the possibility space expanding through entropy, and free will is collapsing that possibility space into a single interpretation of the reality of the moment and a single course of action.


> There's just too much evidence: for instance, changing the physiology of the brain through drugs or trauma has predictable changes on consciousness

You can demonstrate that humans react (and quite predictably) to weather - when it rains, they take their umbrellas out, when there's a hurricane, they seek shelter, and so on. It's not evidence that human behaviour is an emergent property of weather and nothing else.

In the case of consciousness, then, this observation only proves that brain function is susceptible to physical and chemical factors (which is rather obvious).

But it does not prove that this function can be FULLY reduced to its computational aspect.

It doesn't even invalidate the belief in a supernatural soul, or some sort of metaphysical "spark" required to ignite consciousness (which, just to be clear, I personally don't subscribe to, but that's beside the point) - just because the brain is affected by physical factors doesn't prove that ONLY physical factors are at play. It doesn't demonstrate that they have a monopoly.

The big question persists: how does it happen that a computational system, no matter how complex, FEELS something?


You're right that there's no proof that consciousness can be explained by biological function alone. But my belief is that the preponderance of the evidence makes it the most likely explanation by far.

> You can demonstrate that humans react (and quite predictably) to weather - when it rains, they take their umbrellas out, when there's a hurricane, they seek shelter, and so on. It's not evidence that human behaviour is an emergent property of weather and nothing else.

In the literature, there's a term, "necessary and sufficient", which is used quite often. For instance, destroying a certain percentage of a certain type of dopamine receptor is necessary and sufficient to produce Parkinson's-like symptoms in rats.

Rain can be demonstrated to make people take umbrellas out, but it's not sufficient: sometimes it rains and people don't take their umbrellas out. It's also not necessary: sometimes people take out umbrellas when it's too sunny. So it's hard to establish a strong causal relationship between rain and umbrellas.

With consciousness, we can't point to a single example of human consciousness which is not in the same place at the same time as a reasonably well-functioning human brain. We can't prove 100% that the brain is sufficient, but it certainly seems to be necessary, and there's zero evidence so far that it's not sufficient.

> The big question persists: how does it happen that a computational system, no matter how complex, FEELS something?

This is actually fairly well understood. So if you are talking about "feeling" in the sense of emotion, we actually have a very detailed understanding of how that system works.

So some of our lower brain regions are responsible for preparing our body for action. For instance, if you've been bitten by a dog before, your brain might learn to get your body ready to run when there is a large dog around. Your sympathetic nervous system will kick in when your visual cortex detects the right patterns, and elevate your heart rate, quicken your breathing, and cause your hairs to stand on end.

So your emotions are essentially a sense which observes that type of sympathetic arousal in your body. Your emotional systems will notice that your body has entered fight-or-flight mode, and will interpret that based on your context and memories to signal to your higher brain function that you are experiencing fear or anxiety.

That's just one example, but it's extremely plausible to me that consciousness/subjective experience is either just the sum total of all these functions the brain is performing, or some emergent property on top of them, or even some highly specialized function of a subsystem of the brain we don't fully understand yet.

For instance there are credible arguments that the thalamus is the seat of consciousness in the brain.


> So if you are talking about "feeling" in the sense of emotion, we actually have a very detailed understanding of how that system works.

I'm talking about consciousness, not emotions. About the sense of self. Understanding how emotions manifest as a chemical process doesn't have much to do with it. It doesn't solve the consciousness issue, and it doesn't explain how this "observer process" that is self-aware (and capable of registering: "oh, I'm experiencing such and such emotion now") comes to be.


I did a search of comments for "thalamus" and found yours. Could you elaborate?

This is my own contention as well. See down the page here for the section on consciousness and the thalamus, what I've found anatomically, and speculation on how it works:

https://sites.google.com/site/pablomayrgundter/mind


> for instance, changing the physiology of the brain through drugs or trauma produces predictable changes in consciousness

That assumes a definition of consciousness, which is not in evidence. What drugs or trauma alter is experience, which one might define as the content of consciousness, i.e. thoughts, beliefs, etc.


How would you define consciousness?


I wouldn't! I could have a go, but I'm not inclined to.

OK, let's try this: consciousness is the canvas on which subjective experience is painted.

But I'm afraid that's not much help, because it begs a definition of subjective experience, and because it side-steps the nature of the blank canvas.


Idk, it seems to me like you're reaching for something which may or may not exist - how do we know there even is such a canvas?

If you can’t even define what this elusive thing might be, how can you raise this as a serious argument against the idea that consciousness is a product of brain function?


Oh, I know it's there; it's the only thing I know. (Not "we" - it's private knowledge; but it's not hard to find - it's always there, for everyone).

To extend the analogy: the painting isn't reliable - it's not even certain it's not a hallucination. But there has to be a substrate to paint on, even if the painting itself consists of layer upon layer of deception.

I don't know if this is true; but it seems more plausible to me than that consciousness is an accidental consequence of the bizarre tumblings of matter.

And anyway: the idea that consciousness is emergent, and matter is fundamental, doesn't offer any explanation of the subjectivity of experience and consciousness. You can't explain consciousness as "emergent" without first saying what it is that emerges. Of course, the appearance of consciousness arises from programs like Eliza or LaMDA-whotsit. They were made to produce that impression.


Well, I suppose we can believe whatever we want, but as our understanding of neuroscience increases, it seems as if there is less and less unknown space left for that substrate to exist in.



