"Spending more on AI than humans" tells you nothing about whether it works.
Cost per output is the metric that matters, and by that measure I've watched startups do worse than last year, just more expensively.
Feels like investor signal: "we're AI-forward, mark us up next round"
I’m so far removed from how that stuff works that it often sounds insane to me. Maybe someone who knows can explain it.
Do the people with money not actually care about making more money? Aren’t they first and foremost concerned with your chances of financial success?
These people can’t possibly be thinking, “Well they say they spent an absolute shit ton on inference… they’re definitely going to be big winners!” and cutting massive checks, right?
The argument the AI guys are making about the coming mass unemployment goes like this: companies that spend on AI rather than humans may gain a huge competitive advantage, letting them take market share from human-run companies, so demand for human labor keeps shrinking.
But how many businesses/sectors of the economy actually need to compete for market share? We assume it's nearly all of them. If that were the case, we'd see AI taking over much quicker.
I am very skeptical of the argument that companies are competing with each other on market share. There is arguably a lot more competition between AI companies than in most of the sectors of our economy.
Reminds me of the Railway CEO bragging that they're spending $300,000 / month on Claude [0], yet their service is getting worse and they're clearly vibe-coding to the point that their SOC2/HIPAA compliance is coming into question. For example they had an issue last month where a breaking change was pushed by a single engineer without any oversight [1].
How many humans could you pay for $300,000 a month and not have quality & reliability degrade like this?
I particularly like the idea that "a GTM team" is an organic component of running a business which can be impersonated by a grip of agents, as opposed to a convention that developed out of needing to pay a bunch of humans too much money to strategically choose to screw over customers or sellers in the course of handling each unpredictable product-adoption development, lest a poor, pitiful technostructure be ripped apart by making too little, or too much, money. Why don't all these tokenmaxxing people focus on making something BETTER?
As someone who was deeply immersed in the crypto / NFT twitter scene in 2021 (yes I was an idiot, moving on…) it bears an uncanny resemblance to the current behavior of AI CEOs and speculators.
You kind of had to be there to understand. When you’re immersed in that stuff, the rational part of your brain takes a backseat, and the primitive social / visual parts start to run the show. You start to develop incredibly warped perceptions of value entirely driven by the predominant narrative and most importantly, price action. When you see prices go parabolic, you start to interpret that as confirmation of the narrative. This generates a positive feedback loop that can lead to unbelievable and insane valuations. And by extension equally insane narratives.
What makes it even more uncanny is that a lot of the same actors (tech CEOs, VCs) are involved in this. Make no mistake - they understand how to leverage mania to their advantage. They go on long soliloquies about how game changing this or that asset is, and how anyone not buying in NOW is “NGMI” (not gonna make it).
This will not end well. I’ll never forget the incredibly insane financial decisions I made - it really felt like being under the influence of a drug.
> Amos Bar-Joseph, the CEO of Swan AI, a coding agent startup, wrote in a viral LinkedIn post recently
Is the reported behaviour an example of the OpEx-vs-CapEx tradeoff, but applied to humans?
0: https://xcancel.com/JustJake/status/2030063630709096483#m
1: https://news.ycombinator.com/item?id=47581721
https://blogs.uca.edu/sherring2/2024/08/02/the-most-expensiv...
They just really, really fucking hate the labor force they view as little more than cattle.
That reliance on third-party AI is a huge risk, just saying.
2126: AI brags that it's reached 100% efficiency in Earth utilization after it's eliminated all organic life.