Ian Turner

Comments

Oh sure, I'll readily agree that most startups don't have a safety culture. The part I was disagreeing with was this:

I think it’s hard to have a safety-focused culture just by “wanting it” hard enough in the abstract

Regarding finance, I don't think this is about 2008, because there are plenty of trading firms that were careful from the outset that were also founded well before the financial crisis. I do think there is a strong selection effect happening, where we don't really observe the firms that weren't careful (because they blew up eventually, even if they were lucky in the beginning).

How do careful startups happen? Basically I think it just takes safety-minded founders. That's why the quote above didn't seem quite right to me. Why are most startups not safety-minded? Because most founders are not safety-minded, which is probably due to a combination of incentives and selection effects.

Not disagreeing with your thesis necessarily, but I disagree that a startup can't have a safety-focused culture. Most mainstream (i.e., not crypto) financial trading firms started out as very risk-conscious startups. This can be hard to evaluate from the outside, though, and definitely depends on committed executives.

Regarding the actual companies we have, though, my sense is that OpenAI is not careful and I'm not feeling great about Anthropic either.

(I didn’t read the whole post)

Is deep honesty different from candor? I was surprised not to see that word anywhere in this post.

I am not that knowledgeable myself. But regarding the vaccines, my understanding is that they are not very effective and that distributing them is very expensive. They require a cold chain and multiple doses spread well apart, and they are delivered by injection. These are all major obstacles to cost-effective distribution in a developing-country setting, so while some might say that "progress is slower than it should be", personally I have pretty low expectations.

My impression is that most CO2 offsets are bogus, basically the climate change version of “just 25 cents will help save a child’s life”. If you subject them to a GiveWell style analysis, I would guess most of these offset programs fall apart, or at least deliver way less than the promised counterfactual impact.

Also, logically, I think it would make sense to lump offsets in with other charitable giving and subject them to the same scrutiny, and when you do that it just doesn't make sense to buy offsets. Even within the climate cause area, I really doubt that buying offsets would be cost-effective, and I also doubt that climate is the most cost-effective cause area right now.

The posts do have the “April Fool’s Day” tag right at the beginning?

I guess the question I have is: if the fraud wasn't noticed by SBF's investors, who had much better access to information and stronger incentives to find fraud, why would anyone expect the recipients of his charitable donations to notice it? If it was a failure of the EA movement not to know that FTX was fraudulent, isn't it many times more of a failure that the fraud went unnoticed by the major sophisticated investment firms that were large FTX shareholders?

Thanks for posting this. I think this is the kind of practical, actionable analysis that we need.

Regarding this:

Given that there is still no way for model developers to deterministically guarantee a model’s expected behavior to downstream actors, and given the benefits that advanced AI could have in society, we think it is unfair for an actor to be forced to pay damages regardless of any steps they’ve taken to ensure the advanced AI in question is safe.

It seems to me that this is begging the question. If we don't know how to make AIs safe, that is a reason not to make AIs at all, not a reason to make unsafe AIs. This is not really any different from how the nuclear power industry has been regulated out of existence in some countries[1].


  1. I think this analogy holds regardless of your opinions about the actual dangerousness of nuclear power. ↩︎
