Ulrik Horn

1068 karma · Joined · Working (6-15 years)

Bio


I have received funding from the LTFF and the SFF and am also doing work for an EA-adjacent organization.

My EA journey started in 2007 as I considered switching from a Wall Street career to instead help tackle climate change by making wind energy cheaper – unfortunately, the University of Pennsylvania did not have an EA chapter back then! A few years later, I started having doubts about my decision that climate change was the best use of my time. After reading a few books on philosophy and psychology, I decided that moral circle expansion was neglected but important and donated a few thousand pounds sterling of my modest income to a somewhat evidence-based organisation. Serendipitously, my boss stumbled upon EA in a thread on Stack Exchange around 2014 and sent me a link. After reading up on EA, I pursued E2G with my modest income, donating ~USD 35k to AMF. I have done some limited volunteering to build the EA community here in Stockholm, Sweden. Additionally, I set up and was an admin of the ~1k-member EA system change Facebook group (apologies for not having time to make more of it!). Lastly (and I am leaving out a lot of smaller stuff like giving career guidance, etc.), I have coordinated with other people interested in doing EA community building in UWC high schools and have even run a couple of EA events at these schools.

How others can help me

Lately, and in consultation with 80,000 Hours and some “EA veterans”, I have concluded that I should consider instead working directly on EA priority causes. Thus, I am determined to keep seeking opportunities for entrepreneurship within EA, especially considering whether I could contribute to launching new projects. So if you have a project where you think I could contribute, please do not hesitate to reach out (even if I am engaged in a current project – my time might be better used getting another project up and running and handing over the reins of my current project to a successor)!

How I can help others

I can share my experience working at the intersection of people and technology while deploying a new infrastructure technology (wind energy) globally. I can also share my experience coming from "industry" into EA entrepreneurship/direct work. Or anything else you think I can help with.

I am also concerned about the "Diversity and Inclusion" aspects of EA and would be keen to help make EA a place where even more people from all walks of life feel safe and at home. Please DM me if you think there is any way I can help. Currently, I expect to have ~5 hrs/month to contribute to this (a number that will grow as my kids become older and more independent).

Comments

Would it go some way towards answering the question if an ex-lab person had said something pretty bad about their past employer? In my simplistic world view, this would mean either that they do not care about legal consequences or that they do not have such an agreement. And I think, perhaps naively, that either of these would make me trust the person to some degree.

One observation here, which might not be useful: I think the people most effective at combating climate change do consume fossil fuels both directly (e.g. international flights) and indirectly (e.g. plastics in stuff they purchase). There is value too in those people who try to abstain completely – they all play different roles. But the effective people might still try to limit their consumption. So perhaps I should not use GPT for e.g. entertainment, but only strictly for work.

Keep in mind that even if you stop your subscription, thousands of people in the supply chains of your goods will have subscriptions, so you will still be indirectly funding OpenAI unless you go to the extreme length of only buying from supply chains free of OpenAI.

I might well be biased here: I used to work on climate and tried to wean myself completely off fossil fuels, and I have since greatly moderated/rationalized my views.

Yeah or maybe you could do like with toothpaste: one for white teeth, another for good breath. I think it's the same toothpaste in both tubes.

Thanks and sorry I missed that. 

Super nice work, and kudos for breaking it down by demographics. On that topic, could any changes over time in cause prioritization be explained by changing demographics? You mentioned, for example, engagement level predicting cause prioritization. I am thinking that if the ratio of low- to high-engagement EAs has changed over time, perhaps that partially drives the trends in your 4th chart?

Just listened to this post right before this recent post on the giving pledge.

  1. I think a lot (probably a majority) of EAs think one should try to live as frugally as possible and then donate any surplus.
  2. I think a lot (a majority?) of EAs think it is good if more people do this.

Is that not socialism? 

I think EA is more about leading by example, changing norms and not antagonizing the "others". Like a nice kind of socialism? And EA is quite inclusive in trying to accommodate other causes and approaches too, such as AI safety researchers who live perhaps a bit more of a lavish lifestyle without donating – there is not only one way of doing good, and people are constrained differently (e.g. having a social circle that spends a lot). And I guess we are also sceptical about whether what we are doing is good or not, so it seems better to be humble towards other people's approaches to doing good – it's a complex world, and many previous attempts at doing great good have failed miserably. In socialist circles I feel there is less of such inclusivity and humility.

I really like that you have both highlighted the importance of branding and reflected on how it affects different groups. I also really like the suggestion to think about one's audience. On the audience part, I think one should also think carefully about what one actually needs – jumping straight to the interests of your audience without considering what you actually need could introduce bias and ineffectiveness into your outreach. For example, perhaps what you need is someone really good at research management, or at interfacing between technical and policy people (probably not the best examples). If you go straight to interests, you might think "oh, it's someone hackery who plays D&D on Friday night and should be able to make a 10am meeting on Saturday" (again, probably a bad example). But in reality, maybe the people doing best in such positions are more likely to socialize on Fridays and not wake up before 3pm.

On the same topic, I also feel a bit unsure about the "grouping" above of people with a strong interest in CS or ML and people who relate to hacker culture. I loved my CS courses but never related much to hacker culture, and I know many like me. That said, I do know there was some CS hiring algorithm that used engagement with certain manga websites as a strong predictor of coding skills, but I would be quite careful in using such correlations, and would again really nail down the "complete" set of characteristics you need, which might include not only coding skills but also communication, management, etc. I think others have written about and fleshed out in quite a lot of detail the specific needs of AI safety, and I would suggest really engaging with the stated needs of the field to make sure your branding aligns with what is predicted to be specific talent gaps going forward.

I had not heard of double-cruxing before reading about it above and I think I am an EA - haha! In my mind it suffices to be kind of open-minded and curious and have a strong will to be of service/help to others. Moreover, on team cohesion and culture I am not sure I fit neatly into EA either - I often receive a lot of downvotes here on the forum!

I kind of think EA or non-EA is a pretty long sliding scale, and my completely unfounded observation is that a lot of the talent currently employed or funded tends more towards the non-EA end. I feel like the "very EA" people might to a large degree be "fans" of EA who engage a lot here on the forum but might not actually make up a high percentage of employment in EA orgs. But I could be wrong here – I have no data to back this up – and I think one could test this to some degree with surveys, if there are surveys that go out not only to EAs but to people employed and funded by "EA orgs".

That makes sense. I guess it's then not really EA that is elitist, but the part of EA that focuses on students.

I have never done community building and am probably ignorant of many ongoing initiatives, so maybe I am stating the obvious below.

I am just wondering about mid-career professionals: could one not easily abandon the focus on elite universities for this group? I think I have seen calls for getting more mid-career professionals into EA (@Letian Wang mentions this in another comment on this post), and I think at a mid-career point people have a sufficient track record in their discipline/industry that one can almost completely disregard their education. In my experience, some of the most talented people I have worked with either never considered moving to the UK/US to attend elite universities, or just did not take university too seriously but later found ways to make significant contributions in their field. Maybe this is more true outside of research roles, as researchers still seem to have a harder time "decoupling" from their undergrad.
