Aligning those who align AI, one satirical website at a time

Construction barricades circling a brain on a green background

The practice of creating artificial intelligence that holds to the guardrails of human values, known in the industry as alignment, has developed into its own (somewhat ambiguous) field of study, rife with policy papers and benchmarks for ranking models against each other.

But who aligns the alignment researchers? 

Enter the Center for the Alignment of AI Alignment Centers, an organization purporting to coordinate thousands of AI alignment researchers into "one final AI center singularity."

At first glance, CAAAC seems legitimate. The aesthetics of the website are cool and calming, with a logo of converging arrows evoking the idea of togetherness and sets of parallel lines swirling behind white text.

But stay on the page for 30 seconds and the swirls spell out "bullshit," giving away that CAAAC is all one big joke. A second longer and you'll notice the hidden gems tucked away in every sentence and page of the fantasy center's website.

CAAAC launched Tuesday from the same team that brought us The Box, a literal, physical box that women can wear on dates to avoid the threat of their likeness being turned into AI-generated deepfake slop.

"This website is the most important thing that anyone will read about AI in this millennium or the next," said CAAAC cofounder Louis Barclay, staying in character while talking to The Verge. (The second founder of CAAAC wished to remain anonymous, according to Barclay.)

CAAAC's vibe is so similar to that of AI alignment research labs — which are featured on the website's homepage with working links to their own sites — that even those in the know initially thought it was real, including Kendra Albert, a machine learning researcher and technology attorney, who spoke with The Verge.

CAAAC makes fun of the trend, according to Albert, of those who want to make AI safe drifting away from the "real problems happening in the real world" — such as bias in models, exacerbating the energy crisis, or replacing workers — toward the "very, very theoretical" risks of AI taking over the world, Albert said in an interview with The Verge.

To fix the "AI alignment alignment crisis," CAAAC will be recruiting its global workforce exclusively from the Bay Area. All are invited to apply, "as long as you believe AGI will annihilate all humans in the next six months," according to the jobs page.

Those willing to take the dive and work with CAAAC — the website urges all readers to bring their own wet gear — need only comment on the LinkedIn post announcing the center to automatically become a fellow. CAAAC also offers a generative AI tool to create your own AI center, complete with an executive director, in "less than a minute, zero AI knowledge required."

The more ambitious job seeker applying to the "AI Alignment Alignment Alignment Researcher" position will, after clicking through the website, eventually find themselves serenaded by Rick Astley's "Never Gonna Give You Up."
