Regarding the Wytham Abbey purchase, there has been discussion over whether or not optics should be considered when making decisions.

Some objections include that optics can be hard to correctly predict or understand, and that thinking around optics can be prone to motivated reasoning, so optics should be set aside in decision making.

But the same is true for almost every aspect of EA, aside from the highly empirical randomista development wing!

Especially over the longer term, optics affect community building, including how many people get into EA and, maybe more importantly, who gets into EA, i.e., what kind of pre-existing beliefs and opinions they bring with them. As EAs aim to improve government policy in EA priority areas, EA's optics affect their ability to do this. Optics also affect how EA ideas diffuse outside of EA, and where they diffuse to.

As with every other hard-to-predict, highly uncertain factor that goes into lots of EA decision making, we should make uncertain estimates around optics anyway, work on constantly refining our predictions around optics, and include optics as a factor when working out the EV of decisions.
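As a toy illustration of what that could look like, here is a minimal Monte Carlo sketch that folds a wide, uncertain optics term into an EV estimate. Every number and distribution below is an invented placeholder, not an estimate anyone has actually made:

```python
import random

# Toy Monte Carlo sketch: folding an uncertain "optics" term into an EV estimate.
# All numbers and distributions here are made-up placeholders, purely illustrative.

N = 100_000
net_evs = []
for _ in range(N):
    direct_benefit = random.uniform(0.5e6, 3e6)   # uncertain direct value of the decision
    optics_cost = random.lognormvariate(12, 1.5)  # heavy-tailed guess at reputational cost
    net_evs.append(direct_benefit - optics_cost)

mean_ev = sum(net_evs) / N
p_net_negative = sum(ev < 0 for ev in net_evs) / N
print(f"Mean net EV: ${mean_ev:,.0f}")
print(f"P(net EV < 0): {p_net_negative:.1%}")
```

The point is not the particular numbers but that an uncertain factor can still be represented explicitly and refined over time, rather than dropped from the calculation entirely.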

 

(Of course, one might still decide it's worth making big purchases for community building, but optics should be taken into account!)


The problem with considering optics is that it’s chaotic. I think Wytham is a reasonable example. You might want a fancy space so you can have good optics - imagining that you need to convince fancy people of things, otherwise they won’t take you seriously. Or you might imagine that it looks too fancy, and then people won’t take you seriously because it looks like you’re spending too much money.

Pretty much everything in “PR” has weird nonlinear dynamics like this. I’m not going to say that it is completely unpredictable but I do think that it’s quite hard to predict, and subtleties really matter, and most people seem overconfident; I think “bad optics” only looks predictable in hindsight. It also changes quickly, like fashion: what seems like bad optics now could be good countersignaling in a year, and standard practice in three.

It’s a better heuristic to focus on things which are actually good for the world, consistent with your values. I think in most cases, if you can justify your actions as consistent with a set of values, you can survive most short-term optics disasters and even come out of them stronger.

> The problem with considering optics is that it’s chaotic.

The world is chaotic, and everything EAs try to do has largely unpredictable long-term effects because of complex dynamic interactions. We should try to think through the contingencies and make the best guess we can, but completely ignoring chaotic considerations just seems impossible.

> It’s a better heuristic to focus on things which are actually good for the world, consistent with your values.

This sounds good in principle, but there are a ton of things that might conceivably be good but for PR reasons, where the PR reasons are decisive. E.g. should EAs engage in personal harassment campaigns against productive ML researchers in order to slow AI capabilities research? Maybe that would be good if it weren't terrible PR, but I think we very obviously should not do it because it would be terrible PR.

Holding conferences is not "actually good for the world" in any direct sense. It is good only to the extent that it results in net good outcomes -- and you're quite right that those outcomes can be hard to predict. What I think we have to be careful to avoid is crediting the hoped-for positive aspects while dismissing the negative aspects as "optics" that cannot be adequately predicted.

Also, you could always commission a survey to generate at least some data on how the public would perceive an action. That doesn't give much confidence in what the actual perception would be... but these sorts of things are hard to measure/predict on both the positive and negative ends. If people are just too unpredictable to make EV estimates based on their reactions to anything, then we should just hold all conferences at the local Motel 6 or wherever the cheapest venue is. "Dollars spent" is at least measurable.

I agree with this. It's also not clear where to draw the boundary. If even well-informed people who share your worldview and values think a given purchase is bad, then there's no need to call it "optics" – it's just a bad purchase.

So "optics" is about what people think who either don't have all the info or who have different views and values. There's a whole range of potential differences here that can affect what people think.

Some people are more averse to spending large amounts of money without some careful process that's there to prevent corruption. Some people might be fine with the decision but would've liked to see things being addressed and explained more proactively. Some people may have uncharitable priors towards EA or towards everyone (including themselves?) so they'd never accept multi-step arguments about why some investment is actually altruistic if it superficially looks like what a selfish rich person would also buy. And maybe some people don't understand how investments work (the fact that you can sell something again and get money back). 

At the extreme, it seems unreasonable to give weight to all the ways a decision could cause backlash – some of the viewpoints I described above are clearly stupid.

At the same time, factoring in that there are parts of EA that would welcome more transparency or some kind of process designed to prevent risk of corruption – that seems fine/good. 

Relevant: "PR" is corrosive, reputation is not

Copying my response to you from that thread in the Wytham post:

This is fair, and I don't want to argue that optics don't matter at all or that we shouldn't try to think about them.

My argument is more that actually properly accounting for optics in your EV calculations is really hard, and that most naive attempts to do so can easily do more harm than good. And that I think people can easily underestimate the costs of caring less about truth or effectiveness or integrity, and overestimate the benefits of being legibly popular or safe from criticism. Generally, people have a strong desire to be popular and to fit in, and I think this can significantly bias thinking around optics! I particularly think this is the case with naive expected value calculations of the form "if there's even a 0.1% chance of bad outcome X we should not do this, because X would be super bad". Because it's easy to anchor on some particularly salient example of X, and miss out on a bunch of other tail risk considerations.

The "annoying people by showing that we care more about style than substance" point was an example of a countervailing consideration that argues in the opposite direction and could also be super bad.

This argument is motivated by the same reasoning as the "don't kill people to steal their organs, even if it seems like a really good idea at the time, and you're confident no one will ever find out" argument.

Thanks for copying your comment!

> most naive attempts to do so can easily do more harm than good.

I agree that factoring in optics can accidentally do harm, but I think if we’re trying to approximately maximise EV, we should be happy to risk doing harm.

I’m sure factoring in optics will sometimes lead to optics being overweighted, but I’m still unclear on why you think optics would be overweighted more often than not, and why ignoring optics is a better solution to overweighting than factoring it in. If we’re worried about overweighting it, can’t we just weight it less?

If I’m interpreting your comment correctly, you’re arguing that systematic biases in how we would estimate the value of optics mean that we’re better off not factoring optics into the EV calculations.

There are other systematic biases besides the “wanting to fit in” bias that affect EV calculations - self-serving biases might cause EAs to overestimate the value of owning nice conference venues, or the value of time saved through meal delivery services or Ubers. I think consistency would require you to argue that we should not factor the value of these things into EV calculations - I’d be interested to get your thoughts on this.

(My view is that we should continue to factor everything in and just consciously reduce the weighting of things that we think we might be prone to overweighting or overvaluing.)

> I’m sure factoring in optics will sometimes lead to optics being overweighted, but I’m still unclear on why you think optics would be overweighted more often than not, and why ignoring optics is a better solution to overweighting than factoring it in.

My main argument is that "naive" weighting of optics is common and can easily do more harm than good. And that sophisticated weighting of optics is just really hard, even if you're aware of this problem! If "not weighting optics at all" is a better strategy than naively weighting them, then I recommend not weighting them at all.

And I think that the kind of people who talk a lot about weighting optics have systematic biases towards conservatism and overweighting optics (obviously there are people who have the reverse biases and should care way more!!).

I think there are clearly sophisticated strategies that are better, but I'm concerned that they're hard to follow in practice, while the strategy of "don't overthink optics" is fairly easy to follow.

> If we’re worried about overweighting it, can’t we just weight it less?

I think this is true in theory but very hard to do in practice - it's just really hard to correctly account for your biases, even if you're aware of them and trying to correct for them!

> There are other systematic biases besides the “wanting to fit in” bias that affect EV calculations - self-serving biases might cause EAs to overestimate the value of owning nice conference venues, or the value of time saved through meal delivery services or Ubers. I think consistency would require you to argue that we should not factor the value of these things into EV calculations - I’d be interested to get your thoughts on this.

This is a fair point! I haven't thought much about this, but I do think that a similar argument goes through there. Time saved feels easier to me - it's much easier to quantify, and putting a price on someone's time is a common enough calculation that you can at least try to be consistent.

Nice conference venues is harder to quantify and easy to be self-serving with, and I do agree there. (Though if the person making the decision spends the conference worked off their feet doing operations/management, and dealing with dumb BS to do with having a nice venue like historical building protection, I'm less concerned about bias!)

EDIT: And to clarify, my position is not "optics never matter and it's a mistake to think about them". I just think that it's difficult to do right, and important to be careful about this, and often reasonable to decide it's not a significant factor. E.g., I can see the case for not caring about optics with Wytham, but I think that if you're, say, killing people for their organs, optics are a pretty relevant consideration! (Along with all of the other reasons that's a terrible and immoral idea.)

I think optics concerns are corrosive in the same way that PR concerns are. I quite like Rob Bensinger's perspective on this, as well as Anna's "PR" is corrosive, reputation is not.

I'd like to know what you think of these strategies. Notably, I think they defend against SBF, but not against Wytham Abbey type stuff, and conditional on Wytham Abbey being an object-level smart purchase, I think that's a good thing.

I like both perspectives you linked, but I think Rob is presenting a false binary between being virtuous / following deontological rules and optimising for optics. I think optics should be factored into EV calculations, but we should definitely not be optimising for optics.

I think my ideal approach to EA's reputation would be - encourage EAs to follow the law, to reject physical violence, to be virtuous and rule-following, and then to also factor in optics as one factor in EV calculations for decision making.

Hm. I think I mostly don’t think people are good at doing that kind of reasoning. Generally when I see it in the wild, it seems very naive.

I’d like to know if you, factoring optics into your EV calcs, see any optics mistakes EA is currently making which haven’t already blown up, and that (say) Rob Bensinger probably can’t see, given he’s not directly factoring optics into his EV calcs.

Would you be up for making a few concrete proposals for how to factor in the optics of a contemplated action with some example cases?

Some illegal stuff (e.g., financial fraud for earning-to-give, bribing politicians to prioritise EA cause areas) seems positive EV before considering optics and negative EV after considering optics.

(I’m purely focusing on the effects of optics on EV here. Obviously, EV shouldn’t be the only consideration when making decisions, and we should avoid doing illegal stuff even when it maximises EV because we should follow certain deontological constraints.)

You could just break down optics into a set of smaller factors like with any Fermi estimate - number of people who would hear about *thing*, proportion who would think badly of EA because of it, proportion who would have counterfactually become engaged with EA otherwise, etc.
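For example, here is a minimal sketch of that kind of Fermi breakdown; the factor names and numbers are all hypothetical placeholders, just to show the shape of the calculation:

```python
# Fermi-style breakdown of an "optics cost" term. All inputs are hypothetical
# placeholders; in practice you'd use ranges and sanity-check each factor.

people_who_hear_about_it = 100_000       # how many people encounter the story
frac_who_think_worse_of_ea = 0.10        # proportion whose view of EA worsens
frac_counterfactually_engaged = 0.01     # of those, who would otherwise have engaged with EA
value_per_engaged_person = 10_000        # placeholder value of one engaged person, in dollars

optics_cost = (
    people_who_hear_about_it
    * frac_who_think_worse_of_ea
    * frac_counterfactually_engaged
    * value_per_engaged_person
)

print(f"Estimated optics cost: ${optics_cost:,.0f}")  # 100,000 * 0.10 * 0.01 * 10,000 = $1,000,000
```

The resulting optics term can then be set against the direct benefits of the decision within the same EV estimate.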
