Map and Territory
Making sense of reality

Tagged in: Artificial Intelligence

Let Values Drift

I occasionally run across lines of reasoning that depend on or favor the position that value drift should be avoided.


Computational Complexity of P-Zombies

I’ve claimed that p-zombies require exponentially more resources than phenomenally conscious agents, and I’ve used this to argue that AGI will necessarily be phenomenally conscious. I’ve previously supported this claim by pointing to a related result in integrated…


AI Alignment and Phenomenal Consciousness

In the initial feedback I’ve received on my attempt to formally state the AI alignment problem, the primary objection I’ve heard is that I assume any AI worth aligning will experience qualia. In particular, some think we have to worry about aligning AI that…


Formally Stating the AI Alignment Problem

The development of smarter-than-human artificial intelligence poses an existential and suffering risk to humanity. Given that we likely cannot, and may not want to, prevent the development of smarter-than-human AI, we are faced with the challenge…


Introduction to Noematology

Last time we performed a reduction of the phenomenon of conscious self-experience and through it discovered several key ideas. To refresh ourselves on them:

  • Things exist ontologically as patterns within our experience of them.


Form and Feedback in Phenomenology

Now that we’ve covered phenomenology’s foundations and its methods, we’re almost ready to begin addressing AI alignment from a phenomenological perspective. But before we can start talking about AI or how to align it, we need to build a clear understanding of what AI is…


Methods of Phenomenology

In the previous post we looked at phenomenology and the thinking that motivates it. We saw that it is based on taking a naive, skeptical beginner’s approach to asking “why?” and choosing to address the question using only knowledge we can obtain from experience. We also saw that…