
Consciousness: An Explanatory Gap Does Not Require Extraordinary Claims
Epiphileon
Posted: Sunday, October 21, 2018 4:06:51 AM

I came across this article at Scientific American and felt that it was one of the best, and most concise, arguments for the monist view of consciousness that I have read. The authors refer to the so-called "hard problem" by a much better term, "the explanatory gap," and show that there is no need for supernatural or exotic-physics effects in the production of consciousness.

Unlocking the "Mystery" of Consciousness
FounDit
Posted: Monday, October 22, 2018 3:55:23 PM

Epiphileon wrote:
I came across this article at Scientific American and felt that it was one of the best, and most concise, arguments for the monist view of consciousness that I have read. The authors refer to the so-called "hard problem" by a much better term, "the explanatory gap," and show that there is no need for supernatural or exotic-physics effects in the production of consciousness.

Unlocking the "Mystery" of Consciousness


I read it, but something seems missing to me. I'll have to ponder it a bit before formalizing any questions.
Hope123
Posted: Monday, October 22, 2018 4:52:34 PM

Epiphileon wrote:
I came across this article at Scientific American and felt that it was one of the best, and most concise, arguments for the monist view of consciousness that I have read. The authors refer to the so-called "hard problem" by a much better term, "the explanatory gap," and show that there is no need for supernatural or exotic-physics effects in the production of consciousness.

Unlocking the "Mystery" of Consciousness


Epi, I saw your thread Sunday morning but was unable to get onto the forum till later, and have not had time yet since then - but will definitely have a look-see when it is all quiet later at night. 😀
FounDit
Posted: Monday, October 22, 2018 6:21:14 PM

Epiphileon,

I think I found what feels missing: it’s a definition for the word “feelings”. The authors say,
“The first factor is that consciousness and the creation of feelings are fundamentally grounded in general life functions. Just look at all the basic commonalities between life and feelings.” I don’t think they mean information conducted by the senses, but emotions instead, yet their “-otopic” maps seem to indicate sensations. It’s somewhat confusing. But “emotions” opens up a whole new bag of ideas and necessary contemplation and explanation relative to consciousness, IMO.

Then we get the second part:
” The second factor in our explanation of the ontological aspect of the explanatory gap is that to these general features are added numerous and neurobiologically unique special neurobiological features of complex nervous systems, especially of complex brains, that all together create consciousness.”

These special neurobiological features would have to be items such as directed imagination and creative thinking as a result of that directed imagination.

Okay, so far so good. If this is what is meant by "feelings" and special neurobiological features, it seems to me they have simply stated the obvious: that humans are different kinds of creatures, but they haven't done anything towards advancing the understanding of consciousness. They title it "Unlocking the 'Mystery' of Consciousness," but I don't see what they have "unlocked". I don't see the "gap" they are trying to explain, or any explanation for consciousness we don't already know. Did I miss something?



Epiphileon
Posted: Tuesday, October 23, 2018 4:48:40 AM

FounDit wrote:
Epiphileon,

I think I found what feels missing: it’s a definition for the word “feelings”. The authors say,
“The first factor is that consciousness and the creation of feelings are fundamentally grounded in general life functions. Just look at all the basic commonalities between life and feelings.” I don’t think they mean information conducted by the senses, but emotions instead, yet their “-otopic” maps seem to indicate sensations. It’s somewhat confusing. But “emotions” opens up a whole new bag of ideas and necessary contemplation and explanation relative to consciousness, IMO.

Yeah, I completely understand why this is misleading. I don't remember which issue within the study of consciousness I had completely wrong for years because of the way it was talked about, but there was one. The problem is that very few of the words we use to describe consciousness are specific to the study of consciousness. Even when a word is made specific to the issue, researchers are forever quibbling over the minutiae of the definition, and so, I think, shy away from using it. Substitute "qualia" for "feelings" and that should straighten it out.

Then we get the second part: ” The second factor in our explanation of the ontological aspect of the explanatory gap is that to these general features are added numerous and neurobiologically unique special neurobiological features of complex nervous systems, especially of complex brains, that all together create consciousness.”

These special neurobiological features would have to be items such as directed imagination and creative thinking as a result of that directed imagination.

No, I am nearly certain that they are referring here to specific biological characteristics, not behavioral ones; when speaking of the brain/mind, things like creative thinking are behaviors. I think they are talking about things like the unique architecture of more complex (base-consciousness-capable) nervous systems, such as the existence of nested distributed systems and phasic reentrant signaling.
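To give a rough sense of what "reentrant signaling" means, here is a toy sketch (my own invention, with made-up sizes and weights, not the authors' model): two small populations of units that keep projecting their activity back into each other, so a brief stimulus can go on echoing through the loop after the input stops.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two small "populations" that project to themselves and to each other.
N = 8
W_self_1 = rng.normal(0, 0.3, (N, N))   # recurrent weights within population 1
W_self_2 = rng.normal(0, 0.3, (N, N))   # recurrent weights within population 2
W_1_to_2 = rng.normal(0, 0.3, (N, N))   # reentrant projection 1 -> 2
W_2_to_1 = rng.normal(0, 0.3, (N, N))   # reentrant projection 2 -> 1

x1 = np.zeros(N)   # activity of population 1
x2 = np.zeros(N)   # activity of population 2

def step(x1, x2, sensory_input):
    # One timestep: each population is driven by itself, by the other
    # population (the reentrant loop), and population 1 also gets input.
    new_x1 = np.tanh(W_self_1 @ x1 + W_2_to_1 @ x2 + sensory_input)
    new_x2 = np.tanh(W_self_2 @ x2 + W_1_to_2 @ x1)
    return new_x1, new_x2

# Drive the loop with a brief "stimulus" and watch how activity evolves
# after the input is gone; with recurrent loops it need not die out at once.
for t in range(20):
    stimulus = np.ones(N) if t < 3 else np.zeros(N)
    x1, x2 = step(x1, x2, stimulus)
    print(t, round(float(np.abs(x1).mean()), 3), round(float(np.abs(x2).mean()), 3))
```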

Okay, so far so good. If this is what is meant by "feelings" and special neurobiological features, it seems to me they have simply stated the obvious: that humans are different kinds of creatures, but they haven't done anything towards advancing the understanding of consciousness. They title it "Unlocking the 'Mystery' of Consciousness," but I don't see what they have "unlocked". I don't see the "gap" they are trying to explain, or any explanation for consciousness we don't already know. Did I miss something?

Yes, and again it is entirely understandable. I couldn't come up with a word to describe the impression I had of this article while I was reading it, and I still don't have one; incredibly concise(?), perhaps to a fault. I do not think the intent was to actually further our understanding of consciousness in any way, only to further our understanding of the study of consciousness.

I think a more accurate way to state their intent would be to say, "invalidating the need for mysterious causations in the production of consciousness."

Oh and one more thing to keep in mind, they are only talking about base consciousness, an organism's awareness of the environment, not the consciousness I believe you describe as an organism's awareness of self within its awareness of the environment, or what I call the I of mind.

FounDit
Posted: Tuesday, October 23, 2018 11:12:44 AM

Epiphileon wrote:
FounDit wrote:
Epiphileon,

I think I found what feels missing: it’s a definition for the word “feelings”. The authors say,
“The first factor is that consciousness and the creation of feelings are fundamentally grounded in general life functions. Just look at all the basic commonalities between life and feelings.” I don’t think they mean information conducted by the senses, but emotions instead, yet their “-otopic” maps seem to indicate sensations. It’s somewhat confusing. But “emotions” opens up a whole new bag of ideas and necessary contemplation and explanation relative to consciousness, IMO.

Yeah, I completely understand why this is misleading. I don't remember which issue within the study of consciousness I had completely wrong for years because of the way it was talked about, but there was one. The problem is that very few of the words we use to describe consciousness are specific to the study of consciousness. Even when a word is made specific to the issue, researchers are forever quibbling over the minutiae of the definition, and so, I think, shy away from using it. Substitute "qualia" for "feelings" and that should straighten it out.

Then we get the second part: ” The second factor in our explanation of the ontological aspect of the explanatory gap is that to these general features are added numerous and neurobiologically unique special neurobiological features of complex nervous systems, especially of complex brains, that all together create consciousness.”

These special neurobiological features would have to be items such as directed imagination and creative thinking as a result of that directed imagination.

No, I am nearly certain that they are referring here to specific biological characteristics, not behavioral ones; when speaking of the brain/mind, things like creative thinking are behaviors. I think they are talking about things like the unique architecture of more complex (base-consciousness-capable) nervous systems, such as the existence of nested distributed systems and phasic reentrant signaling.
I agree, and this was what I had in mind also since they refer to it as "special neurobiological" features. I interpreted that to mean the biological mechanisms (whatever they are) that permit directed imagination and its concomitant creative thinking.

Okay, so far so good. If this is what is meant by "feelings" and special neurobiological features, it seems to me they have simply stated the obvious: that humans are different kinds of creatures, but they haven't done anything towards advancing the understanding of consciousness. They title it "Unlocking the 'Mystery' of Consciousness," but I don't see what they have "unlocked". I don't see the "gap" they are trying to explain, or any explanation for consciousness we don't already know. Did I miss something?

Yes, and again it is entirely understandable. I couldn't come up with a word to describe the impression I had of this article while I was reading it, and I still don't have one; incredibly concise(?), perhaps to a fault. I do not think the intent was to actually further our understanding of consciousness in any way, only to further our understanding of the study of consciousness.

I think a more accurate way to state their intent would be to say, "invalidating the need for mysterious causations in the production of consciousness."

Oh and one more thing to keep in mind, they are only talking about base consciousness, an organism's awareness of the environment, not the consciousness I believe you describe as an organism's awareness of self within its awareness of the environment, or what I call the I of mind.
But then, wouldn't that simply result in memory, and not true consciousness? Simple consciousness would tend to indicate little more than being awake and actively receiving input from the senses, no?

I tend to think of consciousness as that awareness of a discrete self within its environment, although for clarity's sake, it should really be called self-consciousness, or self-awareness, I suppose. We need clarity on our terms.

Epiphileon
Posted: Saturday, October 27, 2018 4:11:13 AM

Previous Posts wrote:
These special neurobiological features would have to be items such as directed imagination and creative thinking as a result of that directed imagination.

No, I am nearly certain that they are referring here to specific biological characteristics, not behavioral ones; when speaking of the brain/mind, things like creative thinking are behaviors. I think they are talking about things like the unique architecture of more complex (base-consciousness-capable) nervous systems, such as the existence of nested distributed systems and phasic reentrant signaling.

I agree, and this was what I had in mind also since they refer to it as "special neurobiological" features. I interpreted that to mean the biological mechanisms (whatever they are) that permit directed imagination and its concomitant creative thinking.

No, they are only referring to an organism's ability to have an experience; early in the article they point this out.
Quote:
We study primary consciousness, the most basic type of sensory experience. This is the ability to have any experience or feeling at all...


This was one of the other misconceptions I had when reading the literature for a number of years, that is, that the authors were always talking about human-style consciousness, i.e. the I of mind; however, this is actually rarely the case. The so-called hard problem is how the brain generates any experience at all: experience in the sense of an internal representation of the world and the sensations of the body, literally a virtual reality. If we had a detailed explanation for how this happens, that would be the solution to the hard problem, and the further step of including the I of mind would be no problem.
Quote:
We need clarity on our terms.

Here you go...
Wikipedia wrote:
Primary consciousness is a term the American biologist Gerald Edelman coined to describe the ability, found in humans and some animals, to integrate observed events with memory to create an awareness of the present and immediate past of the world around them. This form of consciousness is also sometimes called "sensory consciousness". Put another way, primary consciousness is the presence of various subjective sensory contents of consciousness such as sensations, perceptions, and mental images. For example, primary consciousness includes a person's experience of the blueness of the ocean, a bird's song, and the feeling of pain. Thus, primary consciousness refers to being mentally aware of things in the world in the present without any sense of past and future; it is composed of mental images bound to a time around the measurable present.

Conversely, higher order consciousness can be described as being "conscious of being conscious"; it includes reflective thought, a concept of the past, and speculation about the future.


The demystifying that, in my opinion, they succeed at is in showing that there is no need to call upon mysterious forces such as supernatural or exotic physics, that consciousness can be explained fully with no such requirement.

Personally, I am not entirely certain how to classify this article, it seems like a science based, philosophical refutation of the notion that "woo" will be a necessary condition of any explanation of consciousness.
will
Posted: Saturday, October 27, 2018 8:20:13 AM
Epiphileon wrote:
Personally, I am not entirely certain how to classify this article…


A broad examination of the null hypothesis?

.
FounDit
Posted: Saturday, October 27, 2018 11:29:25 AM

Epiphileon wrote:
Previous Posts wrote:
These special neurobiological features would have to be items such as directed imagination and creative thinking as a result of that directed imagination.

No, I am nearly certain that they are referring here to specific biological characteristics, not behavioral ones; when speaking of the brain/mind, things like creative thinking are behaviors. I think they are talking about things like the unique architecture of more complex (base-consciousness-capable) nervous systems, such as the existence of nested distributed systems and phasic reentrant signaling.

I agree, and this was what I had in mind also since they refer to it as "special neurobiological" features. I interpreted that to mean the biological mechanisms (whatever they are) that permit directed imagination and its concomitant creative thinking.

No, they are only referring to an organism's ability to have an experience; early in the article they point this out.
Quote:
We study primary consciousness, the most basic type of sensory experience. This is the ability to have any experience or feeling at all...


This was one of the other misconceptions I had when reading the literature for a number of years, that is, that the authors were always talking about human-style consciousness, i.e. the I of mind; however, this is actually rarely the case. The so-called hard problem is how the brain generates any experience at all: experience in the sense of an internal representation of the world and the sensations of the body, literally a virtual reality. If we had a detailed explanation for how this happens, that would be the solution to the hard problem, and the further step of including the I of mind would be no problem.
Quote:
We need clarity on our terms.

Here you go...
Wikipedia wrote:
Primary consciousness is a term the American biologist Gerald Edelman coined to describe the ability, found in humans and some animals, to integrate observed events with memory to create an awareness of the present and immediate past of the world around them. This form of consciousness is also sometimes called "sensory consciousness". Put another way, primary consciousness is the presence of various subjective sensory contents of consciousness such as sensations, perceptions, and mental images. For example, primary consciousness includes a person's experience of the blueness of the ocean, a bird's song, and the feeling of pain. Thus, primary consciousness refers to being mentally aware of things in the world in the present without any sense of past and future; it is composed of mental images bound to a time around the measurable present.
This seems contradictory to me. The first part of the definition appears to me to result in nothing more than mere sensation and memory, not true "consciousness". It is the next part that seems to define more accurately the idea of "consciousness" when it mentions the "blueness of the ocean"; and not just a bird's song, or pain, but the ability to evaluate those on a personal level.

That cannot be accomplished by animals equipped with mere memory. It seems to me that requires the sense of a self and an ability to evaluate degrees of "blueness" and "pain". So this definition doesn't seem to fit basic "sensory consciousness".

Conversely, higher order consciousness can be described as being "conscious of being conscious"; it includes reflective thought, a concept of the past, and speculation about the future.

This appears to be the definition of consciousness as opposed to sensory consciousness, or memory.

The demystifying that, in my opinion, they succeed at is in showing that there is no need to call upon mysterious forces such as supernatural or exotic physics, that consciousness can be explained fully with no such requirement.

Personally, I am not entirely certain how to classify this article, it seems like a science based, philosophical refutation of the notion that "woo" will be a necessary condition of any explanation of consciousness.

I think the mystery can be solved once it is understood how a biological cell, or collection of cells, can create, or hold, an image, or what we understand as a mental image. What does a blue sky or a forest look like to brain cells in the occipital region of the brain, for example? I wonder if we can ever truly know.

So many questions. Can a single cell hold more than one image, or part of an image? How many cells does it take to hold an image? How many images and memories can a brain hold over a lifetime?...Think
Hope123
Posted: Monday, October 29, 2018 9:52:54 PM

Epi, just getting back as promised. I've been on a self-imposed break from the internet for a few days. The nastiness online was getting to me and spilling over.

Epi, as you've stated, the reason for the study is to show that all the theist ontological arguments for proof of a god that creates consciousness, and all the quantum physics arguments that are supposed to be getting close to the solution of finding exactly what consciousness is, are not necessary, because all the physical ingredients are already there. And they list those ingredients, not really telling us anything we don't already know, but putting it all together. Of course there are still many questions to be answered.

Since their focus is on Primary Consciousness, they equate mental images with neurobiologically unique subjective feelings associated with certain brain states. Therefore there must be a nervous and brain system that is complex enough to have a reward system, as in operant conditioning.
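As a hedged illustration of what "a reward system, as in operant conditioning" could amount to computationally, here is a minimal sketch (the lever scenario, numbers, and update rule are an invented toy, not anything from the article): an agent that simply nudges an action's value toward the reward it just received ends up preferring the rewarded action.

```python
import random

# Toy operant conditioning: an agent chooses between two actions and
# gradually prefers the one that has been rewarded more often.
actions = ["press_lever", "ignore_lever"]
value = {a: 0.0 for a in actions}      # learned "expected reward" per action
alpha = 0.1                            # learning rate
epsilon = 0.1                          # occasional exploration

def reward(action):
    # Hypothetical environment: pressing the lever usually pays off.
    return 1.0 if (action == "press_lever" and random.random() < 0.8) else 0.0

random.seed(1)
for trial in range(200):
    if random.random() < epsilon:
        a = random.choice(actions)
    else:
        a = max(actions, key=lambda act: value[act])
    r = reward(a)
    value[a] += alpha * (r - value[a])   # nudge the value toward the outcome

print(value)   # "press_lever" ends up with the higher learned value
```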

They say it has to be in an animal. But what if a machine were to be so efficient as to be like a human brain - would that state then be considered consciousness? Do machines even have a reward system? The big question would be - does the machine feel anything? Are feelings the only basis of consciousness?

As for the epistemic aspect - I for one am absolutely grateful, first, that my brain does not have a feedback system telling me how it is working, and secondly that it is not connected to the brain of any other living creature!

Sounds like a good way to classify the argument, Will.
Epiphileon
Posted: Tuesday, October 30, 2018 4:48:20 AM

FounDit wrote:

Quote:
We need clarity on our terms.

Epiphileon wrote:
Here you go...

Wikipedia wrote:
Primary consciousness is a term the American biologist Gerald Edelman coined to describe the ability, found in humans and some animals, to integrate observed events with memory to create an awareness of the present and immediate past of the world around them. This form of consciousness is also sometimes called "sensory consciousness". Put another way, primary consciousness is the presence of various subjective sensory contents of consciousness such as sensations, perceptions, and mental images. For example, primary consciousness includes a person's experience of the blueness of the ocean, a bird's song, and the feeling of pain. Thus, primary consciousness refers to being mentally aware of things in the world in the present without any sense of past and future; it is composed of mental images bound to a time around the measurable present.

This seems contradictory to me. The first part of the definition appears to me to result in nothing more than mere sensation and memory, not true "consciousness".
No, this is not accurate, and I understand that it seems intuitive to you; however, rigorous science often forces us to adopt counter-intuitive positions in order to be able to parse a problem into addressable components. If what we were talking about here were only sensation and memory, then we could be talking about a paramecium, which, as a result of repeated shocks when turning left in a T-maze, will only turn right.
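To make the paramecium point concrete, here is a minimal sketch (a made-up toy, not a model of real paramecium biology): a single stored number, strengthened by each shock, is the whole "mind" of the thing, and it is enough to produce "only turns right" with no representation of the world and no self.

```python
# Toy avoidance learning: one scalar "memory" is all the state there is.
# No representation of the world, no self; just sensation plus memory.
import random

random.seed(0)
left_aversion = 0.0          # grows each time turning left is punished

def choose_turn():
    # The more aversion has accumulated, the less likely a left turn is.
    return "right" if random.random() < left_aversion else random.choice(["left", "right"])

for trial in range(30):
    turn = choose_turn()
    if turn == "left":
        # Shock: strengthen the aversion (simple, bounded update).
        left_aversion = min(1.0, left_aversion + 0.2)

print(choose_turn(), left_aversion)   # after training, it essentially always turns right
```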

When you have a sufficiently complex nervous system that is building a representation of the external world on the basis of multiple sensory inputs, that is consciousness of the environment, without consciousness of self. Mice do not contemplate the nature of being, yet they are clearly conscious of the environment. The so-called hard problem is how this representation of the world is built from wetware. If we can ever answer that, then adding the consciousness of self into that explanation will be no great leap.


That (primary consciousness includes a person's experience of the blueness of the ocean, a bird's song, and the feeling of pain) cannot be accomplished by animals equipped with mere memory. It seems to me that requires the sense of a self and an ability to evaluate degrees of "blueness" and "pain". So this definition doesn't seem to fit basic "sensory consciousness".
This is called sensory discrimination and does not require consciousness. Of course, calling it blueness requires language and probably consciousness, but it does not require awareness of self to make that discrimination; this has been demonstrated in countless experiments on animals that have no sense of self.
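Here is a toy sketch of that kind of discrimination experiment (the wavelengths, feedback rule, and numbers are an invented example, not a real study): a single threshold, nudged by reward and punishment, ends up separating "blue" from "green" stimuli, and there is nothing in the learner that could count as a sense of self.

```python
# Toy sensory discrimination: learn to respond to "blue" (short wavelength)
# and withhold response to "green", using only a threshold nudged by feedback.
import random

random.seed(2)
threshold = 550.0            # nm; respond if wavelength is below this

def trial(wavelength, is_blue):
    global threshold
    responded = wavelength < threshold
    correct = responded == is_blue
    if not correct:
        # Nudge the threshold toward making this trial correct next time.
        threshold += -5.0 if responded else 5.0
    return correct

stimuli = [(470, True), (530, False)] * 100    # blue ~470 nm, green ~530 nm
random.shuffle(stimuli)
accuracy = sum(trial(w, b) for w, b in stimuli) / len(stimuli)
print(round(threshold, 1), round(accuracy, 2))   # the learned threshold now separates the two stimuli
```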

It is the next part that seems to define more accurately the idea of "consciousness" when it mentions the "blueness of the ocean"; and not just a bird's song, or pain, but the ability to evaluate those on a personal level.
When you say "evaluate on a personal level", you are talking about the consciousness of consciousness reflecting on an aspect of consciousness, the discrimination task has already occurred.

Wikipedia wrote:
Conversely, higher order consciousness can be described as being "conscious of being conscious"; it includes reflective thought, a concept of the past, and speculation about the future.
This appears to be the definition of consciousness as opposed to sensory consciousness, or memory.
This is the definition of the consciousness of consciousness.

The demystifying that, in my opinion, they succeed at is in showing that there is no need to call upon mysterious forces such as supernatural or exotic physics, that consciousness can be explained fully with no such requirement.

Personally, I am not entirely certain how to classify this article, it seems like a science based, philosophical refutation of the notion that "woo" will be a necessary condition of any explanation of consciousness.
I think the mystery can be solved once it is understood how a biological cell, or collection of cells, can create, or hold, an image, or what we understand as a mental image. What does a blue sky or a forest look like to brain cells in the occipital region of the brain, for example? I wonder if we can ever truly know.
Contained in this statement is one of the biggest hurdles to coming to grips with any person's attempt to objectively contemplate consciousness and address the issue on a neuroscientific basis. It turns out that the use of the word "image" is extremely unfortunate, as it implies something that does not literally exist; we do construct a virtual reality in the mind, however, there are no actual images involved. This is a fascinating issue and one that we could take up at another time, but first, if we are to be able to continue talking about the issue from a shared frame of reference, we must settle the issue addressed above.
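A toy sketch of what "no actual images" can mean (my own illustration with made-up numbers, not the authors' account): the tiny 4x4 "scene" below exists in the simulated "brain" only as a vector of activation levels spread across many units; no unit holds a picture, yet the pattern still carries enough information to be decoded back into the scene.

```python
import numpy as np

rng = np.random.default_rng(3)

# A tiny 4x4 "scene" (just numbers; think brightness values).
scene = rng.integers(0, 2, size=(4, 4)).astype(float)

# A random linear "encoding": each of 32 units responds with a weighted sum
# over the whole scene. The scene is now spread across all units at once;
# there is no unit you could point to and say "this one holds the image".
encoder = rng.normal(size=(32, 16))
activity = encoder @ scene.flatten()          # the "neural" representation

# A downstream readout can still recover the scene from that activity
# (least-squares decode), even though no literal picture was ever stored.
decoded, *_ = np.linalg.lstsq(encoder, activity, rcond=None)
print(np.allclose(decoded.reshape(4, 4), scene))   # True
```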
FounDit
Posted: Tuesday, October 30, 2018 1:38:42 PM

Epiphileon wrote:
FounDit wrote:

Quote:
We need clarity on our terms.

Epiphileon wrote:
Here you go...

Wikipedia wrote:
Primary consciousness is a term the American biologist Gerald Edelman coined to describe the ability, found in humans and some animals, to integrate observed events with memory to create an awareness of the present and immediate past of the world around them. This form of consciousness is also sometimes called "sensory consciousness". Put another way, primary consciousness is the presence of various subjective sensory contents of consciousness such as sensations, perceptions, and mental images. For example, primary consciousness includes a person's experience of the blueness of the ocean, a bird's song, and the feeling of pain. Thus, primary consciousness refers to being mentally aware of things in the world in the present without any sense of past and future; it is composed of mental images bound to a time around the measurable present.

This seems contradictory to me. The first part of the definition appears to me to result in nothing more than mere sensation and memory, not true "consciousness".
No, this is not accurate, and I understand that it seems intuitive to you; however, rigorous science often forces us to adopt counter-intuitive positions in order to be able to parse a problem into addressable components. If what we were talking about here were only sensation and memory, then we could be talking about a paramecium, which, as a result of repeated shocks when turning left in a T-maze, will only turn right.
But is that consciousness? It seems all that has been accomplished is reflexive training based on pain response.

When you have a sufficiently complex nervous system that is building a representation of the external world on the basis of multiple sensory inputs, that is consciousness of the environment, without consciousness of self. Mice do not contemplate the nature of being, yet they are clearly conscious of the environment. The so-called hard problem is how this representation of the world is built from wetware. If we can ever answer that, then adding the consciousness of self into that explanation will be no great leap.
Well, you are right in that it seems counter-intuitive, because a mouse aware of its environment appears to me to simply be demonstrating memory. From its birth, it begins to build a memory map of its environment through experience and exploration, but I wouldn't call this consciousness. So what are we calling "consciousness"?

That (primary consciousness includes a person's experience of the blueness of the ocean, a bird's song, and the feeling of pain) cannot be accomplished by animals equipped with mere memory. It seems to me that requires the sense of a self and an ability to evaluate degrees of "blueness" and "pain". So this definition doesn't seem to fit basic "sensory consciousness".
This is called sensory discrimination and does not require consciousness. Of course, calling it blueness requires language and probably consciousness, but it does not require awareness of self to make that discrimination; this has been demonstrated in countless experiments on animals that have no sense of self.
But discrimination of blueness still doesn't seem to be consciousness, but merely memory, again. Surely an experiment of reward and pain would demonstrate that, but I don't think it would demonstrate consciousness.

It is the next part that seems to define more accurately the idea of "consciousness" when it mentions the "blueness of the ocean"; and not just a bird's song, or pain, but the ability to evaluate those on a personal level.
When you say "evaluate on a personal level", you are talking about the consciousness of consciousness reflecting on an aspect of consciousness, the discrimination task has already occurred.

Wikipedia wrote:
Conversely, higher order consciousness can be described as being "conscious of being conscious"; it includes reflective thought, a concept of the past, and speculation about the future.

This appears to be the definition of consciousness as opposed to sensory consciousness, or memory.
This is the definition of the consciousness of consciousness.
Okay, now that we have a definition of consciousness, sensory memory doesn't seem to me to be accurately described by the term "consciousness".

The demystifying that, in my opinion, they succeed at is in showing that there is no need to call upon mysterious forces such as supernatural or exotic physics, that consciousness can be explained fully with no such requirement.

Personally, I am not entirely certain how to classify this article, it seems like a science based, philosophical refutation of the notion that "woo" will be a necessary condition of any explanation of consciousness.
I think the mystery can be solved once it is understood how a biological cell, or collection of cells, can create, or hold, an image, or what we understand as a mental image. What does a blue sky or a forest look like to brain cells in the occipital region of the brain, for example? I wonder if we can ever truly know.
Contained in this statement is one of the biggest hurdles to coming to grips with any person's attempt to objectively contemplate consciousness and address the issue on a neuroscientific basis. It turns out that the use of the word "image" is extremely unfortunate, as it implies something that does not literally exist; we do construct a virtual reality in the mind, however, there are no actual images involved. This is a fascinating issue and one that we could take up at another time, but first, if we are to be able to continue talking about the issue from a shared frame of reference, we must settle the issue addressed above.
"Image" is an unfortunate term, but I don't know what else to call it, since we tend to think in terms of "images". With no understanding of how brain cells can form "images" from chemical and electrical signals, it seems we are going to remain trapped in the position of looking for some mystery "thing".
Epiphileon
Posted: Wednesday, October 31, 2018 3:23:52 AM

will wrote:
Epiphileon wrote:
Personally, I am not entirely certain how to classify this article…

A broad examination of the null hypothesis?


Good morning, Will. I am curious about your thinking on this; could you expand on it, please?
Epiphileon
Posted: Wednesday, October 31, 2018 3:58:25 AM

Hope wrote:
Since their focus is on Primary Consciousness, they equate mental images with neurobiologically unique subjective feelings associated with certain brain states. Therefore there must be a nervous and brain system that is complex enough to have a reward system, as in operant conditioning.

Hi Hope, this is an interesting notion; I have never thought specifically about the role of conditioning in the development of consciousness. However, offhand I do not see the argument for it in the development of primary consciousness. I do see it easily being a necessary condition for the development of secondary consciousness though, that is, the I of mind.

Quote:
They say it has to be in an animal. But what if a machine were to be so efficient as to be like a human brain - would that state then be considered consciousness?

Ah, the 64-million-dollar question! And the answer, in my opinion, is definitely yes. Consciousness is a physical phenomenon; if you replicate all the necessary conditions for its existence, then it will exist.

Quote:
The big question would be - does the machine feel anything? Are feelings the only basis of consciousness?

I'm going to have to give this more consideration, but I think their use of "feeling" was unfortunate. "What it feels like to be" has a very specific connotation; on the other hand, "feelings" is ambiguous and is too easily confused with emotional states. I would say that anything that has qualia is conscious.


Quote:
As for the epistemic aspect - I for one am absolutely grateful that my brain does not have a feedback system telling me how it is working, and secondly that it is not connected to the brain of any other living creature!

I am in emphatic agreement with the first point. I would think it would be an evolutionary impossibility in fact, as there would be no possibility of interacting with the environment.

On your second point, I'm not so sure; mind-to-mind communication would be incredible, although I would want to have control over what was communicated. This could be a cultural bias though: if human culture had evolved with a form of telepathy that allowed for no secrets, complete transparency that is, the resultant society would certainly be different. I think I have seen this theme a few times in speculative fiction, but I do not recall the specific stories.
Hope123
Posted: Wednesday, October 31, 2018 12:21:34 PM

Epi,

According to the article, my understanding is that being able to achieve operant conditioning is just the prerequisite complex hardware necessary for primary consciousness. I'd have to go read it again.

Mind to mind - Having control over what is communicated is what we have now, no? Whistle

But I get your point that it would be nice to send a message brain to brain without having to text, etc. I was thinking not so much of people knowing everything I think (I say most of it now lol), but I surely do not want other people's thoughts intruding in my brain. I have enough problems keeping my own thoughts organized. Whistle You are correct that it surely would be a different society if everyone's thoughts were transparent. Maybe a better society.

Just think: as you hear someone following you on a deserted street, you would know whether their intentions were harmful or not.

But then the problem of distance comes in - can you send a message across the oceans? How do you zero in on one person? Etc. Fantasy fun and games.

I forget the details, but I have seen experiments where, when brains were connected physically with electrodes or whatever, a thought from one brain caused a limb to move in the other person. You may have already read about brain-computer interfaces, but here's a link I found interesting, although it is probably a digression from a "consciousness" definition.

https://www.economist.com/technology-quarterly/2018/01/06/how-brains-and-machines-can-be-made-to-work-together

I tend to think of consciousness as being human, but if they ever get a machine with a brain that can do everything a human can, I'd have to rethink it.

When I play duplicate bridge against the computer, usually there is a given number of tricks possible in a hand. But if there is a chance for an extra trick, a human can beat the computer program, because after playing a bit they know exactly what the computer will do in certain circumstances every time. And in a couple of cases, the computer cannot make a distinction between situations where it is not always a good idea to follow the general rule. It follows the rule regardless, where a human may not. The computer right now needs humans to program it, but it is interesting what they are trying to do to get a "thinking" machine. We do live in interesting times.