If I value something, it means I also value being able to act in the world so that I get what I want.

I think any entity capable of having coherent preferences will also implicitly value controlling itself and its environment so it can move toward its preferred states.

Have I missed something? It seems that any value whatsoever also smuggles in a preference for maximising your ability to control yourself and the world to get what you want.

Looking at AGI risk, I find it plausible that whatever values an AGI has, it will always be incentivised to increase its control over itself and its environment.

The only case I can see where the maximisation of control is not inevitable is if the entity has an overwhelming preference to limit its interaction with the world, which would presumably tend towards a preference for immediate self-termination.

Comments

For my own reference: this concern is largely captured by the term 'instrumental convergence': https://en.wikipedia.org/wiki/Instrumental_convergence
