interests / talk.origins / Re: Evolution of consciousness

Subject                                   Author
* Evolution of consciousness              Mark Isaak
+- Re: Evolution of consciousness         Chris Thompson
+- Re: Evolution of consciousness         JTEM
+- Re: Evolution of consciousness         Martin Harran
+- Re: Evolution of consciousness         Richmond
+- Re: Evolution of consciousness         erik simpson
+- Re: Evolution of consciousness         Kalkidas
+- Re: Evolution of consciousness         LDagget
`* Re: Evolution of consciousness         Arkalen
 `* Re: Evolution of consciousness        Mark Isaak
  `* Re: Evolution of consciousness       Arkalen
   `* Re: Evolution of consciousness      Mark Isaak
    `* Re: Evolution of consciousness     Arkalen
     `- Re: Evolution of consciousness    Arkalen

Evolution of consciousness

<v0paug$21fia$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10019&group=talk.origins#10019

 by: Mark Isaak - Mon, 29 Apr 2024 23:36 UTC

My views on the evolution of consciousness are starting to gel.

1. Rudimentary nervous systems evolve.
2. Brains evolve, capable of memory and of decisions other than reflex.
3. Those decisions probably work better if the brain has a model of the
world to work with. So such a model evolves.
4. Some creatures live socially. Their brains need a model of that
important aspect of the world: the fellow beings one lives with,
including how they think.
5. So we've now got a model of minds. How about if we apply it to *our
own mind*? That might make our thinking about interactions with others'
minds more efficient.
6. Viola! Consciousness!

Does that make sense to people? Is it time for me to write a book on
the subject? (Do you think publishers will want the book to be more than
106 words long?)

There's also the problem of testing it. I'm open to suggestions there,
too. Step 4 implies that the model of how we think need not agree with
how we think, much as the mental model of our world is flat, not
spherical. This has at least some confirmation (e.g., blindness to many
biases). More would be better.

--
Mark Isaak
"Wisdom begins when you discover the difference between 'That
doesn't make sense' and 'I don't understand.'" - Mary Doria Russell

Re: Evolution of consciousness

<v0pbd6$21hqq$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10020&group=talk.origins#10020

 by: Chris Thompson - Mon, 29 Apr 2024 23:44 UTC

Mark Isaak wrote:
> My views on the evolution of consciousness are starting to gel.
>
> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of the
> world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with others'
> minds more efficient.
> 6. Viola! Consciousness!
>
> Does that make sense to people?  Is it time for me to write a book on
> the subject? (Do you think publishers will want the book to be more than
> 106 words long?)
>
> There's also the problem of testing it. I'm open to suggestions there,
> too. Step 4 implies that the model of how we think need not agree with
> how we think, much as the mental model of our world is flat, not
> spherical. This has at least some confirmation (e.g., blindness to many
> biases). More would be better.
>

I especially like how you bring in string theory in number 6.

Chris

Re: Evolution of consciousness

<v0psuo$28vq3$2@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10024&group=talk.origins#10024

 by: JTEM - Tue, 30 Apr 2024 04:44 UTC

Mark Isaak wrote:

> My views on the evolution of consciousness are starting to gel.
>
> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of the
> world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with others'
> minds more efficient.
> 6. Viola! Consciousness!
>
> Does that make sense to people?

At the present, I favor the "Quantum Mind" hypothesis:

https://en.wikipedia.org/wiki/Quantum_mind

I realize this shit is as much or more philosophy than it is
science, but that works both ways. There is no "Explanation"
for consciousness that is actual science.

--
https://jtem.tumblr.com/tagged/The%20Book%20of%20JTEM/page/5

Re: Evolution of consciousness

<q5h13jpa18432sm26smdflq3e0ok0hbhms@4ax.com>


https://news.novabbs.com/interests/article-flat.php?id=10043&group=talk.origins#10043

 by: Martin Harran - Tue, 30 Apr 2024 10:18 UTC

On Mon, 29 Apr 2024 16:36:45 -0700, Mark Isaak
<specimenNOSPAM@curioustaxon.omy.net> wrote:

>My views on the evolution of consciousness are starting to gel.
>
>1. Rudimentary nervous systems evolve.
>2. Brains evolve, capable of memory and of decisions other than reflex.
>3. Those decisions probably work better if the brain has a model of the
>world to work with. So such a model evolves.
>4. Some creatures live socially. Their brains need a model of that
>important aspect of the world: the fellow beings one lives with,
>including how they think.
>5. So we've now got a model of minds. How about if we apply it to *our
>own mind*? That might make our thinking about interactions with others'
>minds more efficient.
>6. Viola! Consciousness!
>
>Does that make sense to people? Is it time for me to write a book on
>the subject? (Do you think publishers will want the book to be more than
>106 words long?)

Michael Tomasello has already done a lot of the work for you and it's
a bit more than 106 words.

https://www.amazon.co.uk/Evolution-Agency-Behavioral-Organization-Lizards-ebook/dp/B09N6M6HDY/ref=sr_1_1

>
>There's also the problem of testing it. I'm open to suggestions there,
>too. Step 4 implies that the model of how we think need not agree with
>how we think, much as the mental model of our world is flat, not
>spherical. This has at least some confirmation (e.g., blindness to many
>biases). More would be better.

Re: Evolution of consciousness

<8634r30y76.fsf@example.com>


https://news.novabbs.com/interests/article-flat.php?id=10047&group=talk.origins#10047

 by: Richmond - Tue, 30 Apr 2024 13:36 UTC

Mark Isaak <specimenNOSPAM@curioustaxon.omy.net> writes:

> My views on the evolution of consciousness are starting to gel.
>
> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of
> the world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with
> others' minds more efficient.
> 6. Viola! Consciousness!
>
> Does that make sense to people? Is it time for me to write a book on
> the subject? (Do you think publishers will want the book to be more
> than 106 words long?)
>
> There's also the problem of testing it. I'm open to suggestions there,
> too. Step 4 implies that the model of how we think need not agree with
> how we think, much as the mental model of our world is flat, not
> spherical. This has at least some confirmation (e.g., blindness to
> many biases). More would be better.

You might want to read Nicholas Humphrey:

https://en.wikipedia.org/wiki/Nicholas_Humphrey

I see he did a lecture "How did consciousness evolve?"

https://www.youtube.com/watch?v=9QWaZp_2I1k

Back in the days when television wasn't aimed at vegetable-based life, I
think he had a TV series about consciousness, but I have forgotten what
it was called.

Re: Evolution of consciousness

<f2055835-29c4-4c75-b4ee-eb50c84652dc@gmail.com>


https://news.novabbs.com/interests/article-flat.php?id=10050&group=talk.origins#10050

 by: erik simpson - Tue, 30 Apr 2024 15:34 UTC

On 4/29/24 4:36 PM, Mark Isaak wrote:
> My views on the evolution of consciousness are starting to gel.
>
> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of the
> world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with others'
> minds more efficient.
> 6. Viola! Consciousness!
>
> Does that make sense to people?  Is it time for me to write a book on
> the subject? (Do you think publishers will want the book to be more than
> 106 words long?)
>
> There's also the problem of testing it. I'm open to suggestions there,
> too. Step 4 implies that the model of how we think need not agree with
> how we think, much as the mental model of our world is flat, not
> spherical. This has at least some confirmation (e.g., blindness to many
> biases). More would be better.
>
String theory aside, #5 sounds suspiciously like the evolution of navel
gazing.

Re: Evolution of consciousness

<v0tnip$383hs$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10066&group=talk.origins#10066

 by: Kalkidas - Wed, 1 May 2024 15:36 UTC

On 4/29/2024 4:36 PM, Mark Isaak wrote:
> My views on the evolution of consciousness are starting to gel.
>
> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of the
> world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with others'
> minds more efficient.
> 6. Viola! Consciousness!
>
> Does that make sense to people?  Is it time for me to write a book on
> the subject? (Do you think publishers will want the book to be more than
> 106 words long?)
>
> There's also the problem of testing it. I'm open to suggestions there,
> too. Step 4 implies that the model of how we think need not agree with
> how we think, much as the mental model of our world is flat, not
> spherical. This has at least some confirmation (e.g., blindness to many
> biases). More would be better.
>

As the famous evolutionist Professor Bullwinkle said: "Watch me pull a
rabbit out of my hat!"

Re: Evolution of consciousness

<673c254cce66897b8a108fcb25cf842a@www.novabbs.com>


https://news.novabbs.com/interests/article-flat.php?id=10070&group=talk.origins#10070

 by: LDagget - Thu, 2 May 2024 09:33 UTC

Mark Isaak wrote:

> My views on the evolution of consciousness are starting to gel.

> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of the
> world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with others'
> minds more efficient.
> 6. Viola! Consciousness!

> Does that make sense to people? Is it time for me to write a book on
> the subject? (Do you think publishers will want the book to be more than
> 106 words long?)

> There's also the problem of testing it. I'm open to suggestions there,
> too. Step 4 implies that the model of how we think need not agree with
> how we think, much as the mental model of our world is flat, not
> spherical. This has at least some confirmation (e.g., blindness to many
> biases). More would be better.

I believe I've already expanded on a similar theme.
Regarding rudimentary nervous systems, there are questions of what one
means by rudimentary. I would suggest exploring some of the details of
chemotaxis. When doing so, it may be of some utility to examine the
language people use to describe what happens in publications, making
notes about the use of anthropomorphized terms like "want" and "desire".
It's a good lead-in to later steps, where it is also useful to explore
a root understanding of what such terms actually mean.
I'm also a big fan of the egg-laying dance of the California Aplysia,
because there's some fairly good research detail on chemical effectors
of its somewhat complex behavior. It can be tied into chemotaxis as a
documented pathway of behavior.
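To make the chemotaxis point concrete, here is a minimal run-and-tumble
sketch (the Gaussian attractant and all parameters are illustrative
choices, not taken from any particular study): the cell climbs the
gradient even though the rule contains nothing resembling "want".

```python
import math
import random

def concentration(x, y):
    # Toy attractant field: a Gaussian centered at the origin.
    return math.exp(-(x * x + y * y) / 200.0)

def run_and_tumble(steps=2000, seed=1):
    """Biased random walk: tumble less often when the signal improved."""
    rng = random.Random(seed)
    x, y = 30.0, 30.0                    # start well away from the source
    angle = rng.uniform(0, 2 * math.pi)
    last = concentration(x, y)
    for _ in range(steps):
        x += math.cos(angle)             # "run" one unit in current heading
        y += math.sin(angle)
        now = concentration(x, y)
        # The whole "decision": tumble probability depends only on whether
        # the concentration went up or down since the last step.
        p_tumble = 0.1 if now > last else 0.5
        if rng.random() < p_tumble:
            angle = rng.uniform(0, 2 * math.pi)   # "tumble": new heading
        last = now
    return math.hypot(x, y)              # final distance from the source

print(run_and_tumble())   # typically ends much nearer the source than it started
```

Describing this loop in a paper would almost irresistibly invite words
like "seeks" or "prefers", which is exactly the language habit worth
noting.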
Jumping up to step 3, there are some thought-provoking aspects of mental
models involving vision. Some things to consider: the extent of "hard
wired" neuronal pathways versus more plastic aspects. Brains can adjust
to re-right their internal model if exposed to inverted images. And
there is a wealth of consideration to be had over the nature of optical
illusions and what they reveal about the translation of the direct sense
of vision into our perception of those discrete inputs.

I would suggest that a foundation built upon the above, which can be a
solid evidentiary one, provides a good framework from which to
extrapolate interpretations of other, somewhat fuzzier phenomena. (I'm
also neglecting lots of other empirical observations that are revealing,
including modern brain-injury studies backed up by very detailed
neuroanatomy, specific neuropharmacology, and dynamic imaging studies.)

Or you could go for a graphic novel.

Re: Evolution of consciousness

<v1040r$3s8fe$2@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10076&group=talk.origins#10076

  copy link   Newsgroups: talk.origins
Path: i2pn2.org!i2pn.org!usenet.network!eternal-september.org!feeder3.eternal-september.org!2.eu.feeder.erje.net!feeder.erje.net!feeds.news.ox.ac.uk!news.ox.ac.uk!earthli!nntp-feed.chiark.greenend.org.uk!ewrotcd!news.eyrie.org!beagle.ediacara.org!.POSTED.beagle.ediacara.org!not-for-mail
From: arka...@proton.me (Arkalen)
Newsgroups: talk.origins
Subject: Re: Evolution of consciousness
Date: Thu, 2 May 2024 15:21:31 +0200
Message-ID: <v1040r$3s8fe$2@dont-email.me>
References: <v0paug$21fia$1@dont-email.me>
 by: Arkalen - Thu, 2 May 2024 13:21 UTC

On 30/04/2024 01:36, Mark Isaak wrote:
> My views on the evolution of consciousness are starting to gel.
>
> 1. Rudimentary nervous systems evolve.
> 2. Brains evolve, capable of memory and of decisions other than reflex.
> 3. Those decisions probably work better if the brain has a model of the
> world to work with. So such a model evolves.
> 4. Some creatures live socially. Their brains need a model of that
> important aspect of the world: the fellow beings one lives with,
> including how they think.
> 5. So we've now got a model of minds. How about if we apply it to *our
> own mind*? That might make our thinking about interactions with others'
> minds more efficient.
> 6. Viola! Consciousness!
>
> Does that make sense to people?  Is it time for me to write a book on
> the subject? (Do you think publishers will want the book to be more than
> 106 words long?)
>
> There's also the problem of testing it. I'm open to suggestions there,
> too. Step 4 implies that the model of how we think need not agree with
> how we think, much as the mental model of our world is flat, not
> spherical. This has at least some confirmation (e.g., blindness to many
> biases). More would be better.
>

Have you seen my thread on Michael Tomasello's "The Evolution of
Agency"? I think the book would interest you. If you want more detail I
have a post somewhere in that thread summarizing its arguments, I'd be
happy to hear your take.

Re: Evolution of consciousness

<v10h12$3vcs7$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10084&group=talk.origins#10084

From: specimen...@curioustaxon.omy.net (Mark Isaak)
Newsgroups: talk.origins
Subject: Re: Evolution of consciousness
Date: Thu, 2 May 2024 10:03:27 -0700
Message-ID: <v10h12$3vcs7$1@dont-email.me>
References: <v0paug$21fia$1@dont-email.me> <v1040r$3s8fe$2@dont-email.me>
 by: Mark Isaak - Thu, 2 May 2024 17:03 UTC

On 5/2/24 6:21 AM, Arkalen wrote:
> On 30/04/2024 01:36, Mark Isaak wrote:
>> My views on the evolution of consciousness are starting to gel.
>>
>> 1. Rudimentary nervous systems evolve.
>> 2. Brains evolve, capable of memory and of decisions other than reflex.
>> 3. Those decisions probably work better if the brain has a model of
>> the world to work with. So such a model evolves.
>> 4. Some creatures live socially. Their brains need a model of that
>> important aspect of the world: the fellow beings one lives with,
>> including how they think.
>> 5. So we've now got a model of minds. How about if we apply it to *our
>> own mind*? That might make our thinking about interactions with
>> others' minds more efficient.
>> 6. Viola! Consciousness!
>>
>> Does that make sense to people?  Is it time for me to write a book on
>> the subject? (Do you think publishers will want the book to be more
>> than 106 words long?)
>>
>> There's also the problem of testing it. I'm open to suggestions there,
>> too. Step 4 implies that the model of how we think need not agree with
>> how we think, much as the mental model of our world is flat, not
>> spherical. This has at least some confirmation (e.g., blindness to
>> many biases). More would be better.
>
> Have you seen my thread on Michael Tomasello's "The Evolution of
> Agency"? I think the book would interest you. If you want more detail I
> have a post somewhere in that thread summarizing its arguments, I'd be
> happy to hear your take.

I have seen it, but I don't remember particular points.

I just came across a reference to another book, Michael S. A.
Graziano's _Consciousness and the Social Brain_, which appears to make
an argument similar to mine above (particularly steps 4 and 5).

--
Mark Isaak
"Wisdom begins when you discover the difference between 'That
doesn't make sense' and 'I don't understand.'" - Mary Doria Russell

Re: Evolution of consciousness

<v139v0$m8nm$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10105&group=talk.origins#10105

From: arka...@proton.me (Arkalen)
Newsgroups: talk.origins
Subject: Re: Evolution of consciousness
Date: Fri, 3 May 2024 20:21:17 +0200
Message-ID: <v139v0$m8nm$1@dont-email.me>
References: <v0paug$21fia$1@dont-email.me> <v1040r$3s8fe$2@dont-email.me>
<v10h12$3vcs7$1@dont-email.me>
 by: Arkalen - Fri, 3 May 2024 18:21 UTC

On 02/05/2024 19:03, Mark Isaak wrote:
> On 5/2/24 6:21 AM, Arkalen wrote:
>> On 30/04/2024 01:36, Mark Isaak wrote:
>>> My views on the evolution of consciousness are starting to gel.
>>>
>>> 1. Rudimentary nervous systems evolve.
>>> 2. Brains evolve, capable of memory and of decisions other than reflex.
>>> 3. Those decisions probably work better if the brain has a model of
>>> the world to work with. So such a model evolves.
>>> 4. Some creatures live socially. Their brains need a model of that
>>> important aspect of the world: the fellow beings one lives with,
>>> including how they think.
>>> 5. So we've now got a model of minds. How about if we apply it to
>>> *our own mind*? That might make our thinking about interactions with
>>> others' minds more efficient.
>>> 6. Viola! Consciousness!
>>>
>>> Does that make sense to people?  Is it time for me to write a book on
>>> the subject? (Do you think publishers will want the book to be more
>>> than 106 words long?)
>>>
>>> There's also the problem of testing it. I'm open to suggestions
>>> there, too. Step 4 implies that the model of how we think need not
>>> agree with how we think, much as the mental model of our world is
>>> flat, not spherical. This has at least some confirmation (e.g.,
>>> blindness to many biases). More would be better.
>>
>> Have you seen my thread on Michael Tomasello's "The Evolution of
>> Agency"? I think the book would interest you. If you want more detail
>> I have a post somewhere in that thread summarizing its arguments, I'd
>> be happy to hear your take.
>
> I have seen it, but I don't remember particular points.
>
> I just came across reference to another book by Michael S.S. Graziano,
> _Consciousness and the Social Brain_, which appears to make an argument
> similar to mine above (particularly steps 4 and 5).
>

Basically (if you don't mind me going on about it again) he proposes a
scheme similar to yours but more specific, fleshed-out and (IMO) more
convincing. It revolves around the notion of "agents" or "agency",
which Tomasello defines as a system that achieves goals via a
feedback-control mechanism: the system perceives aspects of the
environment, compares them to the desired goal, engages in behaviors
meant to bring it closer to the goal, checks the environment again, and
loops this way until the goal is achieved.
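That loop is easy to sketch in code. Here's a minimal toy version (my own illustration, not a formalism from the book): perceive, compare to the goal, behave to close the gap, perceive again, repeat until done.

```python
def feedback_control_agent(perceive, act, goal, max_steps=100):
    """Perceive -> compare to goal -> behave -> re-check, until achieved."""
    for step in range(max_steps):
        state = perceive()            # perceive the environment
        error = goal - state          # compare to the desired goal
        if error == 0:                # goal achieved: stop looping
            return step
        act(error)                    # behave to bring state closer
    return max_steps

class World:
    """Toy 1-D environment the agent nudges toward a target."""
    def __init__(self, position):
        self.position = position
    def perceive(self):
        return self.position
    def act(self, error):
        self.position += 1 if error > 0 else -1

world = World(position=3)
steps_taken = feedback_control_agent(world.perceive, world.act, goal=10)
```

The stimulus-response systems of step 1 lack exactly this compare-and-re-check loop: they map perception to behavior directly, with no representation of a goal state to check against.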

His parallels to your steps might be:

1) rudimentary nervous systems evolve that coordinate perception with
behavior on a stimulus-response basis but not the feedback-control
system involved in true agency.

2) brains evolve that do implement such a feedback-control system [I'm
not sure he explicitly associates it with brains in the book, but he
does associate it with vertebrates, which have distinct brains as a
feature, so I'll say it's close enough for a paraphrase]

He doesn't have a parallel to your step "3" because models of the world
are implicit in all of the cognitive models he presents; in fact the
differences in what he calls "experiential niches" (which could be
thought of as "world models") are pretty important. So for example he
points out that with agency comes the mechanism of *attention* (i.e.
you orient your perceptions in specific ways depending on what goals
you're working towards and where you currently are in working towards
them), which implies experiences of an outside world and internal
states that are or aren't in sync, full of things that are
relevant/irrelevant, good/bad etc.

4) He does bring in social living as a possible cause of his next step
in the evolution of agency, which he sets at early mammals: the
appearance of a feedback-control system applied on top of the previous
one to monitor and control the goal-seeking process itself (he sees
social living as a driver for this because the competition between
peers would induce a benefit in more flexible, efficient
decision-making). These early mammals would be able to not only
perceive the world, pick a behavior to fulfill a goal and shut
everything down in case of danger (as he describes lizards doing), but
mentally play out possible behaviors and flexibly inhibit some in favor
of others depending on which they anticipate working out best. This
would introduce into the "world model" or "experiential niche" notions
of goals, behaviors and cause-and-effect relationships between the two.
I don't think he introduces models of other *minds* at this step per
se, although it's a bit like world models - they're implicit in several
steps; it's more a question of what aspect of minds is being modelled.
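One way to picture that second layer (my own toy sketch, not Tomasello's): instead of mapping the situation straight to a fixed response, the agent plays candidate behaviors out against a simple world model and inhibits all but the one it anticipates working best.

```python
def predicted_distance(position, move, goal):
    """World model: anticipated distance to the goal after `move`."""
    return abs(goal - (position + move))

def deliberate(position, goal, behaviors):
    """Second-order control: simulate each behavior, inhibit the rest."""
    return min(behaviors,
               key=lambda move: predicted_distance(position, move, goal))

# A lizard-style agent would emit one wired-in response; this agent
# compares its options offline before committing to any of them.
chosen = deliberate(position=3, goal=10, behaviors=[-1, 0, +2])
```

The lower feedback loop still executes the behavior; what's new is the layer that evaluates and selects among behaviors before any of them runs.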

5) I do think there is still some similarity between your 5 and the
next level of agency Tomasello suggests, although he sets it at great
apes and you seem to set it at humans (then again many would argue
great apes are conscious, and I don't think Tomasello would disagree).
He proposes an extra metacognitive feedback-control system monitoring
the lower ones, allowing control not only over the behaviors taken in
service of a goal but also over the goals themselves, and an
understanding of cause-and-effect in general, not only as it concerns
one's own actions. It also induces an understanding of others as being
agents with goals they behave in service of.

6) While he does think of 5 as the ability to reason and, I'm pretty
sure, would call it "consciousness", he does have two other steps
separating humans from that, which involve collective agency. He
proposes that the critical difference between humans and other great
apes is the ability to coordinate as part of a group that itself fits
the criteria for being an agent - with collective goals, the ability to
monitor their completion, and the ability to act and self-regulate in
service of them. He sees this as coming in two parts: first the ability
to coordinate pairwise to achieve specific tasks (somewhere in hominid
evolution - he gives several examples illustrating how strikingly worse
chimpanzees are at basic cooperation than even human children), and
then the ability to function as part of a larger community with shared
norms that allow coordination with strangers (which he sets early in
the evolution of our own species). He talks about this inducing a kind
of triple mental model of agency: the "self" agent (the individual's
goals, parallel to the sense of agency of other great apes), the "role"
agent (the goals implied by one's role in some collective enterprise)
and the "collective" agent (the goals of the collective enterprise
itself). He then talks about how various aspects of our experience,
like culture, morality etc., follow from that.

I think it's interesting how this suggests a difference between having
a model of one's own mind, having a model of others' minds, and having
a model of *mind in general* that's then applied to oneself and others.
"Models of the world" and "models of the mind" really collapse a lot of
functionality and variability, and I think Tomasello's model does a
better job of separating out different potential strands and homing in
on those that actually account for how we resemble and differ from
other species.

I also like how this model justifies that the last step, and only the
last step, is truly self-reflective. All the other steps involve taking
a system at a certain level of agency and adding a monitoring/control
level, resulting in a system that's aware of itself *as a system of the
lower level*. That last step is the only one that involves the system
monitoring/controlling a level *above* itself, and indeed being able to
monitor/control any arbitrary system of agency at all (given that any
combination of humans can display collective agency, and a human can be
part of multiple collective agencies at any given time). Meaning the
recursion ends there: it's the only agent model that can model itself
as being at the level it is.

Re: Evolution of consciousness

<v1aqer$2k360$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10146&group=talk.origins#10146

From: specimen...@curioustaxon.omy.net (Mark Isaak)
Newsgroups: talk.origins
Subject: Re: Evolution of consciousness
Date: Mon, 6 May 2024 07:45:46 -0700
Message-ID: <v1aqer$2k360$1@dont-email.me>
References: <v0paug$21fia$1@dont-email.me> <v1040r$3s8fe$2@dont-email.me>
<v10h12$3vcs7$1@dont-email.me> <v139v0$m8nm$1@dont-email.me>
 by: Mark Isaak - Mon, 6 May 2024 14:45 UTC

On 5/3/24 11:21 AM, Arkalen wrote:
> On 02/05/2024 19:03, Mark Isaak wrote:
>> On 5/2/24 6:21 AM, Arkalen wrote:
>>> On 30/04/2024 01:36, Mark Isaak wrote:
>>>> My views on the evolution of consciousness are starting to gel.
>>>>
>>>> 1. Rudimentary nervous systems evolve.
>>>> 2. Brains evolve, capable of memory and of decisions other than reflex.
>>>> 3. Those decisions probably work better if the brain has a model of
>>>> the world to work with. So such a model evolves.
>>>> 4. Some creatures live socially. Their brains need a model of that
>>>> important aspect of the world: the fellow beings one lives with,
>>>> including how they think.
>>>> 5. So we've now got a model of minds. How about if we apply it to
>>>> *our own mind*? That might make our thinking about interactions with
>>>> others' minds more efficient.
>>>> 6. Viola! Consciousness!
>>>>
>>>> Does that make sense to people?  Is it time for me to write a book
>>>> on the subject? (Do you think publishers will want the book to be
>>>> more than 106 words long?)
>>>>
>>>> There's also the problem of testing it. I'm open to suggestions
>>>> there, too. Step 4 implies that the model of how we think need not
>>>> agree with how we think, much as the mental model of our world is
>>>> flat, not spherical. This has at least some confirmation (e.g.,
>>>> blindness to many biases). More would be better.
>>>
>>> Have you seen my thread on Michael Tomasello's "The Evolution of
>>> Agency"? I think the book would interest you. If you want more detail
>>> I have a post somewhere in that thread summarizing its arguments, I'd
>>> be happy to hear your take.
>>
>> I have seen it, but I don't remember particular points.
>>
>> I just came across reference to another book by Michael S.S. Graziano,
>> _Consciousness and the Social Brain_, which appears to make an
>> argument similar to mine above (particularly steps 4 and 5).
>>
>
> Basically (if you don't mind me going on about it again) he proposes a
> scheme similar to what you did but more specific, fleshed-out and (IMO)
> convincing. It revolves around the notion of "agents" or "agency" which
> Tomasello defines as a system that achieves goals via a feedback-control
> mechanisms where the system perceives aspects of the environment,
> compares them to the desired goal, engages in behaviors meant to bring
> it closer to the goal, checks the environment again, and loops this way
> until the goal is achieved.
>
> His parallels to your steps might be:
>
> 1) rudimentary nervous systems evolve that coordinate perception with
> behavior on a stimulus-response basis but not the feedback-control
> system involved in true agency.
>
> 2) brains evolve that do implement such a feedback-control system [I'm
> not sure in the book he explicitly associates it with brains, but he
> does associate it with vertebrates which do have distinct brains as a
> feature so I'll say it's close enough for a paraphrase]
>
> He doesn't have a parallel to your step "3" because models of the world
> are implicit in all of the cognitive models he presents, in fact the
> differences in he calls "experiential niches" (which could be thought of
> as "world models") are pretty important. So for example he points out
> that with agency comes the mechanism of *attention* (i.e. you orient
> your perceptions in specific ways depending on what goals you're working
> towards and where you're currently at in working towards them) which
> implies experiences of an outside world and internal states that are or
> aren't in sync, full of things that are relevant/irrelevant, good/bad etc.
>
> 4) He does bring in social living as a possible cause of his next step
> in the evolution of agency that he sets at early mammals: the appearance
> of a feedback-control system applied on top of the previous one to
> monitor and control the goal-seeking process itself (he sees social
> living as a driver for this because of the competition between peers
> would induce a benefit in more flexible, efficient decision-making).
> These early mammals would be able to not only perceive the world, pick a
> behavior to fulfill a goal and shut everything down in case of danger
> (as he describes lizards doing), but mentally play out possible
> behaviors and flexibly inhibit some in favor of others depending on
> which they anticipate working out best. This would introduce into the
> "world model" or "experiential niche" notions of goals, behaviors and
> cause-and-effect relationships between the two. I don't think he
> introduces models of other *minds* at this step per se although it's a
> bit like world models - they're implicit in several steps it's more of a
> question of what aspect of minds is being modelled.
>
> 5) I do think there is still some similarity between your 5 and the next
> level of agency Tomasello suggests, although he sets it at great apes
> and you seem to set it at humans (then again many would argue great apes
> are conscious and I don't think Tomasello would disagree). He proposes
> an extra metacognitive feedback-control system monitoring the lower ones
> allowing control not only over the behaviors taken in service of a goal
> but of the goals themselves, and an understanding of cause-and-effect in
> general and not only as concerns one's own actions. It also induces an
> understanding of others as being agents with goals they behave in
> service of.
>
> 6) While he does think of 5 as the ability to reason and I'm pretty sure
> would call it "consciousness" he does have 2 other steps separating
> humans from that, which involve collective agency. He proposes the
> critical difference between humans and other great apes is the ability
> to coordinate as part of a group that itself fits the criteria for being
> an agent - with collective goals, the ability to monitor their
> completion and act and self-regulate in service of them. He sees this as
> coming in two parts, first the ability to coordinate pairwise to achieve
> specific tasks (somewhere in hominid evolution - he gives several
> examples illustrating how strikingly worse chimpanzees are at basic
> cooperation than even human children) and then the ability to function
> as part of a larger community with shared norms that allow coordination
> with strangers (which he sets early in the evolution of our own
> species). He talks about this inducing a kind of triple mental model of
> agency, the "self" agent (the individual's goals, parallel to the sense
> of agency of other great apes), the "role" agent (the goals implied by
> one's role in some collective enterprise) and the "collective" agent
> (the goals of the collective enterprise itself). He then talks about how
> various aspects of our experience like culture, morality etc follow from
> that.
>
>
> I think it's interesting how this suggests a difference between having a
> model of one's own mind, having a model of others' minds, and having a
> model of *mind in general* that's then applied to oneself and others.
> "Models of the world" and "models of the mind" really collapses a lot of
> functionality and variability and I think Tomasello's model does a
> better job of separating out different potential strands and honing in
> on those that actually account for how we resemble and differ from other
> species.
>
>
> I also like how this model justifies that the last step, and only the
> last step, is truly self-reflective. All the other steps involve taking
> a system at a certain level of agency and adding a monitoring/control
> level, resulting in a system that's aware of itself *as a system of the
> lower level*. That last step is the only one that involves the system
> monitoring/controlling a level *above* itself, and indeed being able to
> monitor/control any arbitrary system of agency at all (given any
> combination of humans can display collective agency and a human can be
> part of multiple collective agencies at any given time). Meaning the
> recursion ends there, it's the only agent model that can model itself as
> being the level it is.

Does he give a definition of consciousness? It sounds like he sees the
"collective" agent as an essential part of it. I don't doubt that it is
essential for humanity's achievements, but I'm not convinced it is
necessary for consciousness. I still like my definition of consciousness
as having a mental model of one's own mind.


Re: Evolution of consciousness

<v1chm2$33raf$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10158&group=talk.origins#10158

From: arka...@proton.me (Arkalen)
Newsgroups: talk.origins
Subject: Re: Evolution of consciousness
Date: Tue, 7 May 2024 08:28:17 +0200
Message-ID: <v1chm2$33raf$1@dont-email.me>
References: <v0paug$21fia$1@dont-email.me> <v1040r$3s8fe$2@dont-email.me>
<v10h12$3vcs7$1@dont-email.me> <v139v0$m8nm$1@dont-email.me>
<v1aqer$2k360$1@dont-email.me>
 by: Arkalen - Tue, 7 May 2024 06:28 UTC

On 06/05/2024 16:45, Mark Isaak wrote:
> On 5/3/24 11:21 AM, Arkalen wrote:
>> On 02/05/2024 19:03, Mark Isaak wrote:
>>> On 5/2/24 6:21 AM, Arkalen wrote:
>>>> On 30/04/2024 01:36, Mark Isaak wrote:
>>>>> My views on the evolution of consciousness are starting to gel.
>>>>>
>>>>> 1. Rudimentary nervous systems evolve.
>>>>> 2. Brains evolve, capable of memory and of decisions other than
>>>>> reflex.
>>>>> 3. Those decisions probably work better if the brain has a model of
>>>>> the world to work with. So such a model evolves.
>>>>> 4. Some creatures live socially. Their brains need a model of that
>>>>> important aspect of the world: the fellow beings one lives with,
>>>>> including how they think.
>>>>> 5. So we've now got a model of minds. How about if we apply it to
>>>>> *our own mind*? That might make our thinking about interactions
>>>>> with others' minds more efficient.
>>>>> 6. Viola! Consciousness!
>>>>>
>>>>> Does that make sense to people?  Is it time for me to write a book
>>>>> on the subject? (Do you think publishers will want the book to be
>>>>> more than 106 words long?)
>>>>>
>>>>> There's also the problem of testing it. I'm open to suggestions
>>>>> there, too. Step 4 implies that the model of how we think need not
>>>>> agree with how we think, much as the mental model of our world is
>>>>> flat, not spherical. This has at least some confirmation (e.g.,
>>>>> blindness to many biases). More would be better.
>>>>
>>>> Have you seen my thread on Michael Tomasello's "The Evolution of
>>>> Agency"? I think the book would interest you. If you want more
>>>> detail I have a post somewhere in that thread summarizing its
>>>> arguments, I'd be happy to hear your take.
>>>
>>> I have seen it, but I don't remember particular points.
>>>
>>> I just came across reference to another book by Michael S.S.
>>> Graziano, _Consciousness and the Social Brain_, which appears to make
>>> an argument similar to mine above (particularly steps 4 and 5).
>>>
>>
>> Basically (if you don't mind me going on about it again) he proposes a
>> scheme similar to what you did but more specific, fleshed-out and
>> (IMO) convincing. It revolves around the notion of "agents" or
>> "agency" which Tomasello defines as a system that achieves goals via a
>> feedback-control mechanisms where the system perceives aspects of the
>> environment, compares them to the desired goal, engages in behaviors
>> meant to bring it closer to the goal, checks the environment again,
>> and loops this way until the goal is achieved.
>>
>> His parallels to your steps might be:
>>
>> 1) rudimentary nervous systems evolve that coordinate perception with
>> behavior on a stimulus-response basis but not the feedback-control
>> system involved in true agency.
>>
>> 2) brains evolve that do implement such a feedback-control system [I'm
>> not sure in the book he explicitly associates it with brains, but he
>> does associate it with vertebrates which do have distinct brains as a
>> feature so I'll say it's close enough for a paraphrase]
>>
>> He doesn't have a parallel to your step "3" because models of the
>> world are implicit in all of the cognitive models he presents; in fact
>> the differences in what he calls "experiential niches" (which could be
>> thought of as "world models") are pretty important. So for example he
>> points out that with agency comes the mechanism of *attention* (i.e.
>> you orient your perceptions in specific ways depending on what goals
>> you're working towards and where you're currently at in working
>> towards them) which implies experiences of an outside world and
>> internal states that are or aren't in sync, full of things that are
>> relevant/irrelevant, good/bad etc.
>>
>> 4) He does bring in social living as a possible cause of his next step
>> in the evolution of agency that he sets at early mammals: the
>> appearance of a feedback-control system applied on top of the previous
>> one to monitor and control the goal-seeking process itself (he sees
>> social living as a driver for this because the competition between
>> peers would induce a benefit in more flexible, efficient
>> decision-making). These early mammals would be able to not only
>> perceive the world, pick a behavior to fulfill a goal and shut
>> everything down in case of danger (as he describes lizards doing), but
>> mentally play out possible behaviors and flexibly inhibit some in
>> favor of others depending on which they anticipate working out best.
>> This would introduce into the "world model" or "experiential niche"
>> notions of goals, behaviors and cause-and-effect relationships between
>> the two. I don't think he introduces models of other *minds* at this
>> step per se, although it's a bit like world models - they're implicit
>> in several steps; it's more a question of what aspect of minds is
>> being modelled.
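The second-order layer described above, mentally playing out candidate behaviors and inhibiting all but the most promising, can be given the same toy treatment (again with invented names, nothing from the book):

```python
# Toy sketch of the second, "executive" feedback-control layer:
# instead of reacting directly, the agent simulates candidate behaviors
# in an internal model and inhibits all but the best-scoring one.

def executive_agent(state, behaviors, simulate, evaluate):
    """Pick the behavior whose simulated outcome scores best."""
    best_behavior, best_score = None, float("-inf")
    for behavior in behaviors:
        predicted = simulate(state, behavior)  # mental play-out
        score = evaluate(predicted)            # anticipated goal progress
        if score > best_score:                 # inhibit the others
            best_behavior, best_score = behavior, score
    return best_behavior

# Toy usage: choose the step that gets closest to position 10.
choice = executive_agent(
    state=4,
    behaviors=[-1, +1, +3],
    simulate=lambda s, b: s + b,
    evaluate=lambda s: -abs(10 - s),
)
# choice is 3 (the +3 step lands nearest the goal)
```

Note how this layer operates on the goal-seeking process itself rather than on the environment, which is the distinction being drawn from the lizard-style first layer.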
>>
>> 5) I do think there is still some similarity between your 5 and the
>> next level of agency Tomasello suggests, although he sets it at great
>> apes and you seem to set it at humans (then again many would argue
>> great apes are conscious and I don't think Tomasello would disagree).
>> He proposes an extra metacognitive feedback-control system monitoring
>> the lower ones allowing control not only over the behaviors taken in
>> service of a goal but of the goals themselves, and an understanding of
>> cause-and-effect in general and not only as concerns one's own
>> actions. It also induces an understanding of others as being agents
>> with goals they behave in service of.
>>
>> 6) While he does think of 5 as the ability to reason, and I'm pretty
>> sure would call it "consciousness", he does have 2 other steps
>> separating humans from that, which involve collective agency. He
>> proposes the critical difference between humans and other great apes
>> is the ability to coordinate as part of a group that itself fits the
>> criteria for being an agent - with collective goals, the ability to
>> monitor their completion and act and self-regulate in service of them.
>> He sees this as coming in two parts, first the ability to coordinate
>> pairwise to achieve specific tasks (somewhere in hominid evolution -
>> he gives several examples illustrating how strikingly worse
>> chimpanzees are at basic cooperation than even human children) and
>> then the ability to function as part of a larger community with shared
>> norms that allow coordination with strangers (which he sets early in
>> the evolution of our own species). He talks about this inducing a kind
>> of triple mental model of agency, the "self" agent (the individual's
>> goals, parallel to the sense of agency of other great apes), the
>> "role" agent (the goals implied by one's role in some collective
>> enterprise) and the "collective" agent (the goals of the collective
>> enterprise itself). He then talks about how various aspects of our
>> experience like culture, morality etc follow from that.
>>
>>
>> I think it's interesting how this suggests a difference between having
>> a model of one's own mind, having a model of others' minds, and having
>> a model of *mind in general* that's then applied to oneself and
>> others. "Models of the world" and "models of the mind" really
>> collapse a lot of functionality and variability, and I think
>> Tomasello's model does a better job of separating out different
>> potential strands and homing in on those that actually account for how
>> we resemble and differ from other species.
>>
>>
>> I also like how this model justifies that the last step, and only the
>> last step, is truly self-reflective. All the other steps involve
>> taking a system at a certain level of agency and adding a
>> monitoring/control level, resulting in a system that's aware of itself
>> *as a system of the lower level*. That last step is the only one that
>> involves the system monitoring/controlling a level *above* itself, and
>> indeed being able to monitor/control any arbitrary system of agency at
>> all (given that any combination of humans can display collective agency
>> and a human can be part of multiple collective agencies at any given
>> time). Meaning the recursion ends there: it's the only agent model
>> that can model itself as being at the level it is.
>
> Does he give a definition of consciousness? It sounds like he sees the
> "collective" agent as an essential part of it. I don't doubt that it is
> essential for humanity's achievements, but I'm not convinced it is
> necessary for consciousness. I still like my definition of consciousness
> as having a mental model of one's own mind.
>


Re: Evolution of consciousness

<v1cv01$36krn$1@dont-email.me>


https://news.novabbs.com/interests/article-flat.php?id=10159&group=talk.origins#10159

Newsgroups: talk.origins
Path: i2pn2.org!i2pn.org!usenet.goja.nl.eu.org!2.eu.feeder.erje.net!feeder.erje.net!feeds.news.ox.ac.uk!news.ox.ac.uk!nntp-feed.chiark.greenend.org.uk!ewrotcd!news.eyrie.org!beagle.ediacara.org!.POSTED.beagle.ediacara.org!not-for-mail
From: arka...@proton.me (Arkalen)
Newsgroups: talk.origins
Subject: Re: Evolution of consciousness
Date: Tue, 7 May 2024 12:15:27 +0200
Organization: A noiseless patient Spider
Lines: 342
Sender: to%beagle.ediacara.org
Approved: moderator@beagle.ediacara.org
Message-ID: <v1cv01$36krn$1@dont-email.me>
References: <v0paug$21fia$1@dont-email.me> <v1040r$3s8fe$2@dont-email.me>
<v10h12$3vcs7$1@dont-email.me> <v139v0$m8nm$1@dont-email.me>
<v1aqer$2k360$1@dont-email.me> <v1chm2$33raf$1@dont-email.me>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: beagle.ediacara.org; posting-host="beagle.ediacara.org:3.132.105.89";
logging-data="27106"; mail-complaints-to="usenet@beagle.ediacara.org"
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101
Thunderbird/78.14.0
To: talk-origins@moderators.isc.org
Cancel-Lock: sha1:WkKP/o7m452lmwKorNUaAOyzrBI=
Return-Path: <news@eternal-september.org>
Content-Language: en-US
X-Auth-Sender: U2FsdGVkX1/6q8E6xaTCZB7/jri5GDmLsfZhwyZFrRs=
In-Reply-To: <v1chm2$33raf$1@dont-email.me>
 by: Arkalen - Tue, 7 May 2024 10:15 UTC

On 07/05/2024 08:28, Arkalen wrote:
> On 06/05/2024 16:45, Mark Isaak wrote:
>> [snip: quoted upthread discussion, unchanged from the first article above]
>>
>> Does he give a definition of consciousness? It sounds like he sees the
>> "collective" agent as an essential part of it. I don't doubt that it
>> is essential for humanity's achievements, but I'm not convinced it is
>> necessary for consciousness. I still like my definition of
>> consciousness as having a mental model of one's own mind.
>>
>
> The book isn't about consciousness; it's about the evolution of agency
> and it doesn't conflate agency with consciousness: agency as it defines
> it is completely different. I think it has very obvious and clarifying
> implications for the evolution of consciousness, but the notion that the
> "collective agent" is essential to consciousness as we experience it is
> my extrapolation, not his. I'd guess he's in the "there are many kinds
> of consciousness" camp and I'd further guess that he thinks of great
> apes at least as "fully" conscious like us.
>
>
> For me it's a bit like I said in another thread recently - when I look
> at my own consciousness I feel that the self-reflective "this is what
> happened and here's how I think of it" is a very important part of it,
> and that an existence that just had the experiences of the moment
> without the integrating, looking-back part wouldn't be fully conscious
> even if *in us* it's obviously an integral part of our conscious state.
> And I wouldn't be surprised if that self-reflective part really does
> occur only at step 6.
>
>
> "Having a mental model of one's own mind" is all well and good, but
> "model" is never 1:1, and plenty of simplified representations we could
> call "models" obviously don't suffice, so it just pushes the question
> back to "what kind of model?".
>
>
> I might have another post giving more detail on what Tomasello says
> about experiential niches later.
>

