Comments


7

NEOalquimista wrote (edited )

Maybe humanity is not the body we're currently in, but our ways of living and interacting with each other. It's all leading towards the merging or replacement of our biological nature with one that is more programmable and resistant to the hazards we would find when expanding beyond Earth, for instance.

So maybe embracing singularity isn't killing humanity, just making sure it survives longer.

But how can anyone be sure...

EDIT: And, of course, there are many issues we need to address first. We don't want to use technology that is centralized and highly regulated. We have to build this future on open and decentralized technology in order to guarantee privacy, freedom and security.

5

[deleted] wrote (edited )

3

_deleted____ wrote

Maybe being simply human isn't enough. Maybe that is a dead end and we need to shed these flesh sacks to continue our evolution :)

3

zorblax wrote

Should it?

What makes humanity more important than other consciousness? Why should we not consider AIs, enhanced humans, or composites or analogues of either to be people?

I also don't think "the singularity" is ever going to come, but in the far future human-level non-human intelligence is almost certain, and I think treating it like a person is as important as treating your neighbor like a person.

3

_deleted____ wrote

We'll survive it by becoming part of it. The singularity will be our evolution to a higher plane of existence.

3

alqm wrote (edited )

Could be. It would be good.

But you could consider that we're most likely heading for the stone age again, instead of a futuristic world. Look around you. This is becoming "Mad Max." The future's gonna be hot, polluted, filled with clans fighting each other for resources and many small tyrants controlling the distribution of water in a world without government.

And we... anarchists. Will be chased by fascist clans. Annihilated. If anarchy is to thrive, it will be because anarchists did a good job of educating the population to resist.

2

_deleted____ wrote

All I see around me is great potential and people that need direction.

3

alqm wrote (edited )

I see that too. But that's not all there is. The bad is outnumbering the good. It's on the news, it's in our neighborhoods, it's in forums being attacked like Raddle was last night... it's not alright. It's intensifying. The planet is sick.

3

ziq_postcivver wrote

I don't think it'll get to that point, collapse will arrive sooner and technology will stop progressing.

2

jadedctrl wrote

I'd say almost certainly (if it does indeed happen before some sort of terrible collapse), assuming the concern is the singularity somehow harming or killing humanity.
Any AI capable of reaching the singularity would be programmed with constraints to prevent such behavior. (TBH, I'm thinking something like the Three Laws.)