
Cheeks wrote

That's some deep thought shit! There is the ethical dilemma, of sorts, of others not knowing it's a simulation. I would imagine that early philosophy has touched on this. Personally, I'm with u/dumai on this, overclock it and get the suffering over with asap! If it is a simulation, then is it actual suffering?

5

yaaqov wrote

If it is a simulation, then is it actual suffering?

Absolutely, yes.

9

[deleted] wrote (edited )

1

Zzzxxxyyy wrote

This is very arbitrary and human-centric. What about animals? What if a machine exhibited all the outward signs of consciousness?

Why not reserve consciousness to only yourself and everyone else just has a syntactic engine in their skull? You’re the only one living your own version of consciousness after all.

I think it’s clear, and has been demonstrated time and again, that the whole can be greater than the sum. Human consciousness emerges from a “syntactic” network, even if its signaling is somewhat noisy and wet. Any other conclusion stems from deeply held religious beliefs, and those arguments should begin with the soul, not logic.

3

[deleted] wrote

0

Zzzxxxyyy wrote

“computers don't do that so i'm not sure what the point is here”

We’re literally in the middle of a conversation about whether human consciousness can emerge from a simulation and whether we live in such a simulation...

Likewise, can you prove that you are conscious? If you won’t accept experimental evidence then how do you know anything is true?

2

[deleted] wrote (edited )

0

Zzzxxxyyy wrote

“we're also in a conversation about the physical processes of a computer”

We’re also in a conversation about the physical processes of biological nervous systems.

I just don’t see how you make such a strong argument about “syntactic engines” being incapable of semantic evaluation. Do you actually understand how human consciousness emerges from physical processes? I don’t, but I’m confident there’s no magic between the syntactic localized behavior of neurons and the aggregate semantic behavior. So human consciousness must emerge from syntactic complexity.

One could argue that AI won’t have “consciousness”, but it will likely be more aware and more self-directed than humans.

Current best-effort AI is able to defeat humans at go using heuristics for board configurations, without simulating all possible configurations. It’s not brute force or rules-based. No one programmed in a set of rules for valuation.

2

surreal wrote

ah the old Soul that travels to the Aeons out of Space and Time.

what about quantum computer technology? machine-learn that machine to know its existence and deduce stuff from this 'realization'. aren't humans biological computers that just realize that?

whether quantum phenomena affect our biological PCs is debatable. but the piece of software that is consciousness can run on any hardware, whether it is human, mechanical, or an alien made out of gas.

3

[deleted] wrote

0

surreal wrote (edited )

i mean... no, for reasons i just explained

where?

edit: if you mean the article, i will get on it

dualism doesn't mean that the one can exist without the other

do animals have consciousness?

1

[deleted] wrote

0

ziq OP wrote

What evidence do you have that we're not operating entirely based on rigid predetermined code?

4

surreal wrote

even so, why would this prevent the existence of consciousness? in eastern philosophy even rocks have it. do you believe that animals have consciousness?

2

[deleted] wrote

0

ziq OP wrote (edited )

it just has some pre-set rules for organising symbols

That's basically what humans do. Our behaviours are completely predictable, as if our responses to stimuli are hard-coded into us. Experience X + Stimulus A = Reaction XA.

2

[deleted] wrote

0

ziq OP wrote

not a very respectable position

Really dumai? I'm not some philosophy grad, do you have to be so elitist? Idk wtf hard determinism is but it sounds like insular thinking along the lines of "a bunch of distinguished gentleman philosophers I've studied would scoff at your thoughts so they're not worth my time". You often use very condescending language to communicate and it almost feels classist.

2

yaaqov wrote (edited )

Can you expand on this? Specifically, I don't think I understand what you mean by "conscious" meaning; the expressions operated over by computers have a semantics. The operations themselves have a semantics.

2

[deleted] wrote

0

yaaqov wrote (edited )

Ooh well that makes sense that we'd differ here; I've never been convinced by the Chinese room argument. It's my position that the room (taken as a whole, including the operator inside the room, but not the operator individually) does (or can, at least) know language.

2

[deleted] wrote (edited )

0

yaaqov wrote

What evidence is there that we perceive and comprehend language?

Taking another tack, syntactic rules (not exactly "pre-set", but acquired) are precisely what make up any speaker's syntactic competence. Of course a room can know a language, even if its parts don't! No individual subpart of my brain knows English, but I do.

2

[deleted] wrote

0

yaaqov wrote (edited )

Well, wouldn't the type of evidence that leads us to believe that humans have linguistic capacity be of the same type that leads us to believe that a non-human has linguistic capacity? It seems that Searle holds his language room to a different standard than he would a human speaker.

In fact, doesn't his view require a type of mind/body dualism in and of itself? Doesn't Searle believe that philosophical zombies (which I understand to be something that extensionally acts exactly like a human but does not have consciousness) could logically exist? Isn't that itself dualistic?

I don't intend these questions to be rhetorical. I'm a total beginner in this territory.

2