Does it do stuff that sometimes impresses me? Yes.
Does it sometimes save me time? Yes.
Does it sometimes waste my time? Yes.
Will it take over the world? No.
Do the L's in "LLM" have anything to do with logic? No.
It seems we are at a plateau. Although things have gotten better over the last year, it is not by much. The latest models were delayed many times before their current release. There was much talk that the companies had discovered "AGI" and were delaying new model releases due to the unknown effect it would have on the world.
It is clear now that the models were delayed because they were lackluster.
This article is written from the perspective that AI data centers are burning too much coal, but it correctly points out that many data centers built specifically for AI compute are currently sitting unused.
Summary:
AI Not Living Up to Expectations
Minimal Business Returns Despite Massive Investment
Despite $252.3 billion in global AI investment in 2024, Stanford research shows most companies report disappointing results:
- Only 49% of organizations using AI in service operations reported cost savings
- Most cost savings are less than 10%
- Revenue gains are even more modest - typically less than 5%
- 71% reported revenue gains in marketing/sales, but at very low levels
Technical Scaling Has Hit a Wall
- 76% of AI researchers say current "scaling up" approaches are "unlikely" or "very unlikely" to achieve artificial general intelligence
- OpenAI's own researchers found their upcoming GPT model showed "significantly less improvement" and "in some cases, no improvements at all" compared to previous versions
- Newer reasoning models (o3, o4-mini) actually hallucinate more than previous versions - o3 hallucinates 33% of the time vs 16% for o1
- DeepSeek demonstrated that Western companies' multibillion-dollar models could be matched at "a fraction of the training cost"
Unsustainable Economics
- OpenAI loses money on every single prompt and output due to compute costs
- Unlike traditional software that gets cheaper to serve at scale, AI gets more expensive with more users
- ChatGPT's 400 million weekly users create massive infrastructure costs that don't exist for platforms like Instagram
- Ed Zitron argues OpenAI "requires more money and more compute than is reasonable to acquire" and may be systemically unsustainable
Market Reality Check
- In China, approximately 80% of newly built AI data centers remain unused
- CoreWeave's IPO was so poorly received that Nvidia had to provide $250 million in support just to keep it from failing
- Microsoft has cancelled data center leases worth over 3 gigawatts globally, representing 14% of their current capacity
- The proliferation of AI "slopaganda" newsletters and paid influencers suggests companies are struggling to find genuine demand
The article suggests the AI boom may be following the classic pattern of a bubble - massive investment and hype followed by the reality that the technology doesn't deliver the promised returns at the projected costs.
Then today over on Hacker News we finally got proof that Microsoft (and probably Meta) have been exaggerating. Both have been claiming in various interviews that "30%" of all new code in their codebases is coming from AI, yet they never elaborate or say anything more than that; many would love to know error rates, etc.
So today some Microsoft employees working on an open source .NET repo thought it would be good to let Copilot work on some issues.
And what it shows is the typical back and forth that one always ends up in when trying to use AI tools. It can get you 80% of the way there, but then you have to rewrite 50% of that and still add the missing 20%, which is always the hardest part. The back and forth between the employees and the AI bot is both funny and sad.