Connor McCormick q/dau

@nor

1742 Following
6875 Followers


Connor McCormick q/dau
@nor
A Great Filter may be ahead of us, which would explain why we don't see signs of alien life.
1 reply
0 recasts
2 reactions

Connor McCormick q/dau
@nor
If we create the next evolution of humankind we can cure diseases, reduce suffering, and chart our own destiny.
0 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
The next evolution of humankind will allow us to become wiser.
0 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
Creating the next evolution of humankind will allow us to be better custodians of our ecosystems.
0 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
Enhancing our intelligence and/or abilities would not positively affect the world or its inhabitants.
3 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
Humanity will be obsoleted by machines if we don't create the next evolution of humankind.
1 reply
0 recasts
0 reactions

Connor McCormick q/dau
@nor
Climate change is not real.
0 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
The risks posed by climate change are dire in terms of lives, economics, and wellbeing, but even in the worst-case scenario they are unlikely to cause global human extinction.
0 replies
0 recasts
1 reaction

Connor McCormick q/dau
@nor
Risks posed by climate change are an existential threat to humanity.
2 replies
0 recasts
1 reaction

Connor McCormick q/dau
@nor
Risk of ecosystem collapse is an existential threat to humanity.
0 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
Risk of easy development and release of bioweapons is an existential threat to humanity.
0 replies
0 recasts
1 reaction

Connor McCormick q/dau
@nor
Risk of self-annihilation by nuclear weapons is an existential threat to humanity.
0 replies
0 recasts
1 reaction

Connor McCormick q/dau
@nor
We do not face existential threats.
5 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
We should avoid transhumanism to preserve our natural human essence and prevent unforeseen risks.
2 replies
0 recasts
1 reaction

Connor McCormick q/dau
@nor
While I enjoy many of Schmachtenberger's ideas, he's wrong about this. Evolution definitely increases entropy; if it didn't, it couldn't exist. I can't tell how important this is for his overall argument, but it seems like a pretty basic physics mistake to make. https://civilizationemerging.com/new-economics-series-4/
3 replies
1 recast
12 reactions

Connor McCormick q/dau
@nor
UBI would not reduce wealth inequality.
0 replies
0 recasts
0 reactions

Connor McCormick q/dau
@nor
I really want to get started with streaming on @unlonely, but I remember it was difficult to get the technology working. Who can help?
0 replies
1 recast
8 reactions

Connor McCormick q/dau
@nor
If machines are already smarter than human beings today, that means they surpassed human intelligence before 2045.
0 replies
0 recasts
1 reaction

Connor McCormick q/dau
@nor
It's impossible for machines to be smarter than human beings.
1 reply
0 recasts
0 reactions

Connor McCormick q/dau
@nor
Machines are already smarter than human beings.
0 replies
0 recasts
0 reactions