My main objection to this kind of prediction is that predictions (at least the widely known ones) become part of the past that shapes the future. Even with a good extrapolation of current trends, the prediction itself can make things diverge, converge, or do something else entirely, because the main decision makers will take it into account, and that reaction is not part of the trend. Especially with disruptive predictions that paint an undesirable future for all or most decision makers.
Unless it hits hard in some of the areas where we have cognitive biases and are not fully rational about the consequences.
Sometimes I feel like I'm losing my mind with this shit.
Am I to understand that a bunch of "experts" created a model, surrounded its findings with a fancy website replete with charts and diagrams, that the website suggests the possibility of some doomsday scenario, and that its headline says "We predict that the impact of superhuman AI over the next decade will be enormous, exceeding that of the Industrial Revolution"? WILL be enormous. Not MIGHT be. They went on some of the biggest podcasts in the world talking about it, then a physicist comes along and says yeah, this is shoddy work, and the clapback is "Well yeah, it's an informed guess, not physics or anything"?
What was the point of the website if this is just some guess? What was the point of the press tour? I mean are these people literally fucking insane?
Reading through the comments, I am so glad I’m not the only one beyond done with these stupid clapbacks between boosters and doomers over a work of fiction that conveniently ignores present harms and tangible reality in knowledge domains outside of AI - like physics, biology, economics, etc.
If I didn’t know better, it’s almost like there’s a vested interest in propping these things up rather than letting them stand freely and let the “invisible hand of the free market” decide if they’re of value.
It's like the invention of the washing machine. People didn't stop doing chores; they just do them more efficiently.
Coders won't stop existing; they'll just do more and compete at higher levels. The losers are the ones who won't or can't adapt.
bangs head against the table.
Look, fitting a single metric to a curve and projecting from that only gets you a "model" that conforms to your curve fitting.
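To make that concrete, here's a toy sketch (mine, not anything from the AI 2027 model; the data points are made up): fit two different curve families to the same handful of observations. Both track the observed range about equally well, yet their extrapolations a few steps out disagree by an order of magnitude, which is the sense in which the "model" is really just your choice of curve.

```python
# Toy illustration with invented data: the same five points fit
# almost equally well by a line and by an exponential, but the two
# fits diverge wildly once you project beyond the observed range.
import numpy as np

x = np.arange(5)                          # observed time steps
y = np.array([1.0, 1.9, 3.1, 3.9, 5.2])  # made-up "metric" values

# Linear fit: y ~ a*x + b
a, b = np.polyfit(x, y, 1)

# Exponential fit: log(y) ~ c*x + d, i.e. y ~ exp(d) * exp(c*x)
c, d = np.polyfit(x, np.log(y), 1)

x_future = 15
linear_pred = a * x_future + b
exp_pred = np.exp(d) * np.exp(c * x_future)

print(f"linear extrapolation at t={x_future}:      {linear_pred:.1f}")
print(f"exponential extrapolation at t={x_future}: {exp_pred:.1f}")
# Inside the data, both curves look fine; far outside it, the
# "prediction" is dominated by the curve family you picked.
```

Swap the five data points for any benchmark series you like and the same thing happens: the in-sample fit tells you almost nothing about which extrapolation to trust.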
"Proper" AI, the kind that starts to remove 10-15% of jobs, will cause an economic bloodbath.
The current rate of AI expansion requires almost exponential cash injections. That cash comes from petro-dollars and advertising sales (and the ability of investment banks to print money against those investments). Those sources of cash require a functioning world economy.
Given that the US economy is three Fox News headlines away from collapse[1], an exponential money supply looks a bit dicey.
If you, in the space of 2 years, remove 10-15% of all jobs, you will spark revolutions. This will cause loans to be called in, banks to fail, and the dollar, presently run by obvious dipshits, to evaporate.
This will stop investment in AI, which means no exponential growth.
Sure you can talk about universal credit, but unless something radical changes, the people who run our economies will not consent to giving away cash to the plebs.
AI 2027 is unmitigated bullshit, but with graphs, so people think there is a science to it.
[1] Trump needs a "good" economy. If the Fed, who are currently mostly independent, need to raise interest rates and Fox News doesn't like it, then Trump will remove their independence. That will really raise the chance of the dollar being dumped for something else (either the euro or the renminbi, but more likely the latter).
That'll also kill the UK, because for some reason we hold ~1.2 times our GDP in US short-term bonds.
TLDR: you need an exponential supply of cash for AI 2027 to even be close to working.
I think the author is right about AI only accelerating to the next frontier when AI takes over AI research. If the timelines are correct and that happens in the next few years, the widely desired job of AI researcher may not even exist by then -- it'll all be a machine-based research feedback loop where humans only hinder the process.
Every other intellectual job will presumably be gone by then too. Maybe AI will be the second great equalizer, after death.
AI proponents keep drawing perfectly straight lines from "no AI --> LLMs exist --> LLMs write some adequate code sometimes" up into the horizon of the Y axis where AIs run all governments, write all code, paint all paintings and so on.
There's a large overlap with the crypto true-believers who were convinced after seeing "no blockchain --> blockchain exists" that all laws would be enshrined in the blockchain, all business would be done with blockchains, etc.
We've had automation in the past; it didn't decimate the labour-force; it just changed how people work.
And we didn't go from handwashing clothes --> washing machines --> all flat surfaces are cleaned daily by washing robots...