Even the names of classes are sometimes derived from tech terms. One class, dubbed "propagating urges," comes from the machine learning term backpropagation.

The founding of CFAR
"We had all noticed in different ways in different contexts that being smart, and being well educated and even being really well intentioned was far from a guarantee from making what turned out to be really stupid decisions."
Many of CFAR's participants are followers of Yudkowsky's work, but it has also managed to appeal to a broader audience. In addition to helping individuals reach their goals, CFAR's proponents fear that Earth's most powerful and important leaders aren't aware of their own shortcomings, and that this ignorance is at the root of many of the world's problems. If we can train our leaders to think more effectively about complex problems, CFAR says, we could save humanity from future crises.

Thinking like an AI

The task of saving the entire human race seemed lofty, but when I spoke to Galef about some of CFAR's specific techniques and their connections to AI, they appeared almost intuitive. One connection she made was between a statistical model and the way that humans actually make decisions.
If something, whether a human or an AI, were to make a perfectly rational choice, what would that look like?