Hard to believe it’s been 24 years since Y2K (2000). It feels like we’ve come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking very bleak in several ways.
I’m a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title because people are either too poor or don’t want to play yet another live-service AAA disaster like nearly every one released lately: Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been (and are being) shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.
Hardware is stagnating. Nvidia is putting the brakes on developing its next line of GPUs. We’re not going to see huge gains in performance anymore, because AMD isn’t caught up yet and Nvidia has no reason to innovate. So they’re just going to sell the top cards in their next line for $1,500 a pop, with 10% increase in performance rather than 50 or 60% like we really need. We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away.
Virtual reality is on the verge of collapse because Meta is basically the only real player in that space; between Meta and the Valve Index it’s practically a monopoly. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has it, so use isn’t very widespread. We’re again a decade away from seeing anything really substantial in terms of performance.
Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It’s so clowny, ridiculous, and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted material, supposedly for the public good. Now they’re just going to snap their fingers and morph into a for-profit company. So they can basically steal anything they want that’s copyrighted, claim it’s for the public good, and then randomly swap to a for-profit model. It doesn’t make any sense, and it just looks like they’re going to be a vessel for widespread economic poverty…
It just seems like there are a lot of bubbles that are about to burst all at the same time. I don’t see how things are going to possibly get better for a while now.
COVID also inflated a lot of tech stocks massively, as everybody suddenly had to rely a lot more on tech to get anything done, and the only things you could do for entertainment were gaming, streaming movies, or industrial quantities of drugs.
Then that ended, and they all wanted to hold onto that “value”.
It is a bubble, but whether it pops massively like in 2000, or just evens off to the point where everything else catches up, remains to be seen.
“The markets can remain irrational longer than you can remain solvent” are wise words for anyone thinking of shorting this kind of thing.
Shows that you are in the UK. Just want to clarify I’m talking specifically about the USA, but I agree with everything you said. Tech stocks became so inflated! I don’t know if people are seeing it in Europe, but here in the USA there is this really toxic and very cringe push from tech companies to get people back to the office. They can force return-to-office across the country; basically you have to relocate and upend your entire life, which could cost you $50,000, and they’re not paying for that, and if you don’t do it you get fired. It’s an easy way to start laying people off without having to pay them anything, because you can call it insubordination when they refuse to return to the office. Now they supposedly have cause to get rid of people or deny them promotions for more money. IBM, for example, is doing this right now, and Cisco, one of the biggest networking companies on the market, was doing it as well. Scumbag behavior.
It’s interesting how interconnected those points are.
Generative A"I" drives GPU prices up. NVidia now cares more about it than about graphics. AMD feels no pressure to improve GPUs.
Stagnant hardware means that game studios, who used to rely on “our game currently runs like shit but future hardware will handle it” and similar assumptions, get wrecked. And gen A"I" hits them directly due to FOMO plus corporates buying trends without understanding how the underlying tech works, wasting talent by firing people in the hope that A"I" can replace them.
Large game companies are also suffering due to their investment in the mobile market. A good example is Ishihara; sure, Nintendo simply ignored his views on phones replacing consoles, but how many game company CEOs thought the same and rolled with it?
I’m predicting that everything will go down once it becomes common knowledge that LLMs and diffusion models are 20% actual usage, 80% bubble.
The backlash to this is going to be fun. Having lived through the .com boom/bust (which wasn’t a scam; the web was actually the future, and undersold if anything), no one with the stink of computer on them outside of a tiny elite could get decent full-time work for like 5 years. AI is a scam, full stop. It has virtually no non-fraud real world applications that don’t reflect the underlying uselessness of the activity it can do. People are going to go full Butlerian Jihad from Dune when this blows up the economy, and it’s going to suck so much more for everyone in tech, scammer or no…
The backlash to this is going to be fun.
In some cases it’s already happening, since the bubble forces AI-invested corporations to shove it in everywhere. Cue Microsoft Recall, and the outrage against it.
It has virtually no non-fraud real world applications that don’t reflect the underlying uselessness of the activity it can do.
It is not completely useless but it’s oversold as fuck. Like selling you a bicycle with the claim that you can go to the Moon with it, plus a “trust me = be gullible, eventually bikes will reach Mars!” A bike is still useful, even if they’re building a scam around it.
Here are three practical examples:
- I use ChatGPT as a translation aid, mostly to list potential translations for a specific word, or as a conjugation/declension table. Also as a second layer of spell-proofing. I can’t use it to translate full texts without it shitting its own virtual pants - it inserts extraneous info, repeats sentences, removes key details from the text, butchers the tone, etc.
- I was looking for papers concerning a very specific topic, and got a huge pile (~150) of them. Too much text to read on my own. So I used the titles to pre-select a few of them into a “must check” pile, then asked Gemini to provide me three-paragraph summaries for the rest. A few of them were useful; without Gemini I’d probably have missed them.
- [Note: reported use.] I’ve seen programmers claiming that they do something similar to #1, with code instead. Basically asking Copilot how a function works, or to write extremely simple code (if you ask it to generate complex code it starts lying/assuming/making up non-existent libraries).
None of those activities is inherently useless, but they have some common ground - they don’t require you to trust the output of the bot at all. It’s either things that you wouldn’t do otherwise (#2) or things where you can reliably say “yup, that’s bullshit” (#1, #3).
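That “don’t trust the output” rule can even be enforced mechanically. As a rough sketch (the `proofread` string here is a hard-coded stand-in for a chatbot reply, not a real API call), you can diff the bot’s spell-proofed text against your original so every change it made is surfaced for you to accept or reject:

```python
import difflib

original = "The quick brown fox jumpd over the lazy dog."
# Stand-in for whatever the chatbot returned; in practice you'd paste its reply here.
proofread = "The quick brown fox jumped over the lazy dog."

# Word-level diff: keeps only the words the bot removed ('-') or added ('+'),
# so nothing it changed slips through unreviewed.
changes = [
    token
    for token in difflib.ndiff(original.split(), proofread.split())
    if token.startswith(("-", "+"))
]
print(changes)  # ['- jumpd', '+ jumped']
```

Anything the bot silently inserted, repeated, or deleted shows up as a `-`/`+` pair, which is exactly the failure mode described above for full-text translation.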
I would argue all of these things were possible sans “AI” (although it would have been sold as AI in the 90’s) via existing heuristics, adequately developed, if people had known that was the desired application.
They probably could, indeed - but you’d need multiple different applications, each for one use case. In the meantime, an LLM offers you a tool that won’t hit all the nails or screw all the screws, but does both decently enough in the absence of a hammer and a screwdriver.
it’s time for you to play PACMAN, as i did when i was young 😂
no AI, no GPU, no shitcoin: you just have to eat ghosts, which is very strange in fact when you think about it 🤪 Correction: the ghosts are AI, and based on how many times they killed me, clearly a step above anything mainstream today (º ロ º๑).
. . . with 10% increase in performance rather than 50 or 60% like we really need
Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore’s Law has been more to the economic side than actually packing transistors in.
We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away
Sure you can, today, and this is why:
So many gaming companies are incapable of putting out a successful AAA title because . . .
Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with microtransaction nonsense. None of them have innovated anything in years; that’s all been done at the indie level. Which is where the real party is at.
Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games from 50 years of development. We’re not even close to innovating new types of games that can run on that. Planet X2 is a recent RTS game that runs on a Commodore 64. The genre didn’t really exist at the time, and the control scheme is a bit wonky, but it’s playable. If you can essentially backport a genre to the C64, what could we do with PS4 level hardware that we just haven’t thought of yet?
Yeah, there will be worse graphics because of this. Meh. You’ll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they’re trying to do.
I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.
The limit on Moore’s Law has been more to the economic side than actually packing transistors in.
The reason those economic limits exist is that we’re reaching the limit of what’s physically possible. Fabs are still squeezing more transistors into less space, for now, but the cost per transistor hasn’t fallen for some time; IIRC something around 10nm is still the most economical node. Things just get difficult and exponentially fickle the smaller you get, and at some point there’s going to be a wall. Of note, currently we’re talking more about things like backside power delivery than actually shrinking anything. Die-on-die packaging and stuff.
Long story short: node shrinks aren’t the low-hanging fruit any more. They haven’t been since the end of planar transistors (if it had been possible to just shrink back then, they wouldn’t have engineered FinFETs), but it’s really been picking up speed since the start of the EUV era. Finer and finer pitches don’t really matter if you have to have more and more lithography/etching/coating steps because the structures you’re building are getting more and more involved in the z axis; every additional step costs additional machine time. On the upside, newer production lines could spit out older nodes at pretty much printing-press speed.
I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.
I agree. Wholeheartedly. I think it’s just so obvious how quality dramatically takes off when the people creating it feel safe, sound, and economically stable. Financial Security (UBI) drives creativity probably more than anything else. It’s a huge win!
This post really nails my take on the issue. Give me original CS-level graphics or even AQ2 graphics, a decent story, more levels, and a few new little gimmicks (Rocket Arena grappling hook, anyone?!?!) and you don’t need 4K blah blah bullshit.
The #1 game for kids is literally Minecraft or Roblox… 8-bit-level gfx outselling your horse-armor hi-res bullshit.
The last game I bought was 2 days ago: Mohaa Airborne for PC, for $5 at a pawn shop. Give me 100 games of this quality instead of anything PS5 ever made.
Here are the number of hours I’ve spent on indie games VS AAA titles, according to my Steam library:
- Indie - Valheim - 435 hours
- Indie - Space Haven - 332 hours
- Indie - Satisfactory - 215 hours
- Indie - Dyson Sphere Program - 203 hours
- AAA - Skyrim - 98 hours
- AAA - Control - 47 hours
- AAA - Far Cry 6 - 29 hours
- AAA - Max Payne 3 - 43 minutes
If we’re talking about value - the amount of playtime I’ve gotten out of games with simpler graphics and unique ideas blows the billions spent by the industry out of the water.
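Tallying that list up (numbers copied straight from the playtimes above, with Max Payne 3’s 43 minutes converted to hours):

```python
# Hours from the Steam library list above.
indie = {"Valheim": 435, "Space Haven": 332, "Satisfactory": 215, "Dyson Sphere Program": 203}
aaa = {"Skyrim": 98, "Control": 47, "Far Cry 6": 29, "Max Payne 3": 43 / 60}

indie_total = sum(indie.values())   # 1185 hours
aaa_total = sum(aaa.values())       # ~174.7 hours

print(f"indie: {indie_total} h, AAA: {aaa_total:.1f} h, "
      f"ratio: {indie_total / aaa_total:.1f}x")
```

Roughly 6.8× more hours in four indie games than in four AAA titles, for what was almost certainly a fraction of the purchase price.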
Depending on where you draw the line, mine looks similar:
- EU4 - >800 hours
- Cities Skylines - ~180 hours
- Magic: Arena - >100 hours
- Crusader Kings 2 - ~100 hours
After that it depends on the length of the game. I normally just play through the campaign on most games once (except the above, which have lots of replay value), so looking at playtime isn’t particularly interesting IMO. The ratio of games with interesting playtime (i.e. I probably rolled credits) between indie and AAA is easily 2:1, if not something way higher like 5:1 or even 10:1, but again, that really depends on where you draw the line. If we look at 100% completion, I have 22 indie games and zero AAA games, because I rarely find AAA games to be worth going after achievements in. If I sort by achievement completion, the top two AAA games are Yakuza games (I love that series), and that’s after scrolling through dozens of indies, many of which have a fair amount of achievements (i.e. you need to do more than just roll credits).
So yeah, AAA games really don’t interest me. If you compare the amount I’ve spent on indie vs AAA games, it would be a huge difference since I pretty much only play older AAA games if I get them on sale, and that’s mostly so I can talk about them w/ friends…
None of them have innovated anything in years
Well, they’ve innovated new ways to take up disk space…
There’s a reason I don’t play new release AAA games, and it’s because they’re simply not worth the price. They’re buggy at launch, take up tons of disk space (with lots of updates the first few months), and honestly aren’t even that fun even when the bugs are fixed. Indie games, on the other hand, seem to release in a better state, tend to be fairly small, and usually add something innovative to the gameplay.
The only reason to go AAA IMO is for fancy graphics (I honestly don’t care) and big media franchises (i.e. if you want Spiderman, you have to go to the license holder), and for me, those lose their novelty pretty quickly. The only games I buy near release time anymore are Nintendo titles and indie games from devs I like. AAA just isn’t worth thinking about, except the one or two each year that are actually decent (i.e. Baldur’s Gate 3).
OP, when you say AI is really really fucking things up, what do you have in mind? Setting aside the ludicrous things people say about AI, do you see it directly fucking something up? I’m just curious what is on your mind when you say that.
To me, it’s seeing the Nvidia stock price in the same sort of range as Cisco stock prices were in the dot-com bubble. I don’t have any confidence they’re going to reach the promised land of profitability this time either.
Nvidia is already profitable and has been for over a decade.
Pretty sure Cisco was too; they were supplying hardware to the bubble. I think you misunderstood my point: I don’t think AI is going to be profitable in the long term, but it’s probably very profitable to sell the hardware for it to people with investor money. Their stock price reflects that.
Just like always, it depends on how you define or redefine AI. For example, what used to be called AI has been very successful in photo processing. The same thing is going to happen: some portion or incarnation of the current generative AI will be successful, but it will be dismissed with something like “it’s just machine learning, not AI”.
I have a lot of hope for Apple’s approach, where they are incorporating it as tools into specific capabilities and prioritizing privacy. While there’s no direct profit, it should help sell a lot more devices with ever-higher tech specs. I also like their “private cloud” model, which has a lot of potential beyond private AI.
That’s pretty much where I see the ending for a lot of this: there’s a wide variety of useful applications, but they’re hard to capitalize on, especially things that are self-contained and not phoning home to some server you need to maintain access to for billing purposes.
What’s happening is that support from VC money is drying up. Tech companies have for a long time survived on the promise that they will eventually be much more profitable in the future. It doesn’t matter if it’s not profitable today. They will be in the future.
Now we’re in a period where there’s more pressure on tech companies to be profitable today. That’s why they’re going for such anti consumer behaviors. They want to make more with less.
I’m not sure if there’s a bubble bursting. It could just be a plateau.
I agree. Smartphones, for example, have hardly changed at all over the last ten years, but you don’t see Apple and Samsung going out of business.
And it would be so easy to make a big splash in the market by having a phone where the camera doesn’t protrude out of the back.
I understand you don’t appreciate where we’ve come from and how fast, and can’t see the year-to-year changes, but the iPhone is just a little over ten years old. Do you really not see huge changes between an early iPhone and today’s?
On the contrary, I absolutely appreciate it. I was about 15 when mobile phones first became a thing that everyone owned, so I’ve lived through the entire progression from when they were something only a well to do businessman would have all the way through to today. The first iPhone was 2007, 17 years ago btw.
When mobile phones became popular, each new generation of phones saw HUGE improvements and innovation. However, the last ten years has pretty much just been slight improvements to screen/camera/memory/CPU. Form wise and functionally, they’re very similar to the phone of ten years ago.
I understand that some technophiles will always be able to justify why the new iPhone is worth £1600 and if that’s what they want to spend their money on then good for them, but I personally think that they are kidding themselves. Today you can get a brilliant phone for £300 or even less.
I’d never justify that urge to spend ridiculous money updating every year to the latest and greatest, but people tend to under-appreciate the massive improvements from accumulated incremental improvements.
OLED screen on my iPhone X was revolutionary (and I’m sure Android had it first), as just one example, and now most phones are. Personally I find ultrawideband and “find my” very innovative and well implemented. Or if that’s too small a change, how about the entire revolution of Apple designing their own SoC for every new model. There’s emergency satellite texting, fall/crash detection, even Apple mostly solving phone theft is innovative (even if you don’t like their approach)
When we see steady improvements, humans tend to under-appreciate how it adds up
you don’t see Apple and Samsung going out of business.
Samsung is damn near the point of going bankrupt. Samsung saw a 95% drop in profits for a second consecutive quarter in 2023.
In more recent news;
BBC News - Samsung profits jump by more than 900% on chips https://www.bbc.com/news/business-68738046
Damn, that’s wild. Any business with swings in profit and loss that drastic cannot possibly be sustainable. I can’t see how it could be. Look at the automobile giants in the USA: all it took was one major economic event to bankrupt them, and they got bailed out, which should’ve never happened. It’s bullshit.
I would love to have a VR headset that didn’t require a damn account with a 3rd party just to use it. I don’t need an account for my monitor or my mouse. Plus when I bought the thing, it was just Oculus, then meta bought it and promised nothing would change, before requiring a meta account to use the fucking thing.
That unfortunately is the consequence of letting a company have a monopoly. The US govt should’ve opposed that, and should’ve forced them to sell it. They own such a huge share of the entire VR market right now it’s unbelievable, and Pico by ByteDance isn’t legally able to be sold in the USA.
Well, that’s the doomer take.
The rumors are that the 80-series card is 10% faster than the 90-series card from last gen. That’s not a ‘10%’ improvement; assuming the prices are the same, that’s more like a 40% improvement. I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
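To put rough numbers on that argument (everything here is illustrative: the 10% figure is a rumor, and the 0.78 figure for where the 4080 sits relative to the 4090 is a ballpark assumption, not a benchmark):

```python
# Illustrative performance figures, normalized so the 4090 = 1.0.
perf_4090 = 1.00
perf_4080 = 0.78              # assumed: roughly where the 4080 lands vs the 4090
perf_5080 = 1.10 * perf_4090  # rumored: 10% faster than last gen's 90-class card

# Gen-on-gen uplift at the same (80-class) tier, which is what matters at equal price.
tier_uplift = perf_5080 / perf_4080 - 1
print(f"tier-on-tier uplift: {tier_uplift:.0%}")  # ~41%, not '10%'
```

So under those assumptions the 5080-vs-4080 comparison lands in the ~40% range, even though the headline rumor reads as “only 10%”.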
I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong. My gaming has moved almost entirely to my Rog Ally and you know what? Shit is just as fun and way more convenient than the 7700x/3080 12gb desktop even if it’s 1080p low and not 1440p120. If the only thing the game has going for it is ‘ooh it’s pretty’ then it’s unlikely to be one of those games people care about in six months.
And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher budget stuff like BG 3 is, well, probably the best RPG since FO:NV (fight me!).
And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, Rift S, Quest, and a Quest 2 and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power, and that’s frankly due to two things:
- It’s not a social experience at all.
- There’s no budget for the kind of games that would drive adoption, because there’s no adoption to justify spending money on a VR version.
If you could justify spending the kind of money that would lead to having a cool VR experience, then yeah, it might be more compelling but that’s been tried and nobody bought anything. Will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.
And AI is this year’s crypto which was last year’s whatever and it’s bubbles and VC scams all the way down and pretty much always has been. Tech hops from thing to thing that they go all in on because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks it sticks, and if it doesn’t it doesn’t.
The 5080 is rumored to be 10% faster, but also to use 90% of the 4090’s power. While performance has a normal generational leap, power consumption has gone up to match, leaving you with a much smaller actual improvement.
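Back-of-the-envelope on that, using rumored/assumed figures throughout (10% faster than a 450W 4090 at 90% of its power, versus a 320W 4080 at an assumed ~78% of 4090 performance):

```python
# Rumored/illustrative figures; none of these are confirmed specs.
perf_4080, watts_4080 = 0.78, 320          # assumed relative perf; 320W board power
perf_5080, watts_5080 = 1.10, 0.90 * 450   # 10% over a 4090, at 90% of its 450W

perf_gain = perf_5080 / perf_4080 - 1      # raw tier-on-tier performance gain
efficiency_gain = (perf_5080 / watts_5080) / (perf_4080 / watts_4080) - 1
print(f"perf: +{perf_gain:.0%}, perf-per-watt: +{efficiency_gain:.0%}")
```

A headline ~41% gen-on-gen performance jump shrinks to roughly an 11% perf-per-watt improvement, which is the “much smaller actual improvement” being described.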
Power consumption numbers like that are expected, though.
One thing to keep in mind is how big the die is and how many transistors are in a GPU.
As a direct-ish comparison, there’s about 25 billion transistors in a 14900k, and 76 billion in a 4090.
Big die + lots and lots of transistors = bigly power usage.
I wouldn’t imagine that the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to be in the “die shrink lowers power usage, but more transistors increase power usage” zone.
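The intuition behind that zone is the standard dynamic-power relation, P ≈ α·C·V²·f: a shrink lets you drop voltage a little, but piling on transistors raises the switched capacitance, and the second effect can eat the first. Every number below is invented purely to illustrate the trade-off, not taken from any real chip:

```python
# Toy model of dynamic power: P ≈ activity * capacitance * V^2 * f.
# All figures are invented for illustration, not real chip specs.
def dynamic_power(capacitance, voltage, freq_ghz, activity=0.2):
    return activity * capacitance * voltage**2 * freq_ghz

old = dynamic_power(capacitance=1.00, voltage=1.05, freq_ghz=2.5)
# Hypothetical die shrink: ~8% lower voltage per transistor, but ~30% more
# transistors (more switched capacitance) and a slightly higher clock.
new = dynamic_power(capacitance=1.30, voltage=0.97, freq_ghz=2.7)

print(f"power change: {new / old - 1:+.0%}")
```

In this toy case the ~8% voltage drop (about a 15% saving on the V² term) is swamped by 30% more capacitance and a higher clock, so net power still rises: exactly the “more transistors increase power usage” side of the zone.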
Conversely, the Apple silicon products ship huge, expensive dies fabbed on leading TSMC processes which sip power relative to contemporaries. You can have excellent power efficiency on a large die at a specific frequency range, moreso than a smaller die clocked more aggressively.
You’re not wrong (and those are freaking enormous dies that have to cost apple a goddamn fortune to make at scale), but like, it also isn’t an Apples-to-Apples comparison.
nVidia/Intel/AMD have gone down the maximum-performance, fuck-any-heat/noise/power-usage path. They haven’t given a shit about low-power optimizations or invested in designs that are more suited to low-power implementations (an M3 Max will pull ~80W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.
Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused in on making it as fast as possible with the least power as possible: the compute cores have come from the mobile side prior to being turned into desktop chips.
I’m rambling but: until nVidia and x86 vendors prioritize power usage over raw performance (which they did with zen5 and you saw how that shit spiraled into a fucking PR shit mess) then you’re going to get next year’s die shrink, but with more transistors using the same power with slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree so has nVidia) have painted themselves into a corner by relying on process node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.
I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.
This outlines several issues; a key one is outbidding Apple for wafer allocation on leading processes. Apple primarily sells such high-margin products that I suppose they can go full send on huge dies with no sweat. Similarly, the 4090’s asking price was likely directly related to its production cost. A chunky boy with a huge L2$.
I like the way Mike Clark frames challenges in semi eng as a balancing act between area, power, freq and performance (IPC); like a chip that’s twice as fast but twice the size of its predecessor is not considered progress.
I wish ultra-efficient giga-dies were more feasible, but it’s kind of rough when TSMC has been unmatched for so long. I gather Intel’s diverting focus to 18A, and I hope that turns out well for them.
I’m not sure that ARM as an ISA (or even RISC generally) is inherently more efficient than CISC today, particularly when we look at Qualcomm’s latest efforts at notebooks; it’s more that Apple has extremely proficient designers and benefits significantly from vertical integration.
I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong.
Some of the best games I’ve played have graphics that’ll run on a midrange GPU from a decade ago, if not just integrated graphics
Case in point, this is what I’m playing right now:
A little bit of pushback on the VR front: sure, there aren’t many massive publishers driving it forward, but I would wholeheartedly argue that it can very much be a social experience, and it offers experiences that are damn near impossible to get anywhere else. Three games immediately come to mind:
VRchat (obviously): Literally entirely a social game, and has a pretty large community of people making things for it, from character models to worlds because that’s what drives the game. There is a massive scene of online parties, raves, hangouts, etc. that bring people together across the whole world in a medium more real than any flat game because of the custom models, worlds, and the relative abundance of people using full body tracking to show off, dance, and interact with each other.
VTOL VR: This is still fairly social in that you can either play with friends or people online, but the main draw for me is the level of immersion in flying you can get. You have full interactable cockpits that you basically just use your real hands to interact with (depending on your controller/hand tracking) and it’s all pretty realistic. It’s just impossible to have the same level of experience without VR.
Walkabout mini golf: I was pretty skeptical of this game when my friends wanted to play it, it’s literally just a mini golf sim. The thing is, the ability to play mini golf with friends who live across the country/world is amazing, and the physics of just swinging your controller/hands in the same way as real mini golf is so special.
It is still quite expensive to get really good gear, and that is definitely the current biggest hurdle. It may forever be a smaller community due to the space/tech/cost requirements to make the experience truly incredible, but for me, even just on a Quest 2 in my room without a lot of fancy stuff, it is still interesting and something special. A lot of people really do care a lot about VR, and even if they are far fewer than conventional gamers, they should not be entirely discounted. And I personally think that while it probably won’t ever replace flat-screen gaming, it is an entirely different kind of experience and has at least a decent future ahead.
Fair points on VR games being fairly social. I was more thinking of the in-person social experience, which is still involving some portion of people sitting around stuffing their face into a headset and wandering off into their own world.
IMO, this is something that AR/MR stuff could do a great job of making more social by adding the game to the world, rather than taking the person out of the world to the game but, of course, this also restricts what kind of games you can do so is probably only a partial solution and/or improvement on the current state of affairs.
I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.
PCVR is pretty much dead despite its proponents running around declaring that it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company and only because the god-CEO thinks it’s a fun thing to dump money on which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.
Even then, it’s only on a second generation (the original Quest was… beta, at best) and is expensive enough that you have to really have a reason to be interested rather than it being something you could just add to your gaming options.
I’d like VR to take off and the experiences to more resemble some of the sci-fi worlds that have a or take place in a virtual reality world, but honestly, I’ve thought that would be cool for like 20 years now and we’re only very slightly closer than we were then, we just have smaller headsets and somewhat improved graphics.
I agree with you on the GPU hardware and AI bubbles, but I’m not sure I would consider VR/AR to be a bubble right now. The hype has mostly died down by now, and I think it’s stabilized to the point where it will remain until we have new advances in hardware.
VR is on the verge of collapse in the USA thanks to the US government banning ByteDance. We can’t even order the new Pico 4 Ultra, which is one of the most anticipated VR headsets in the world right now. Meta basically has a monopoly and just announced they’re cutting funding to VR.
Sorry, but a new Pico headset wouldn’t do much of anything. A new Meta headset or a new Valve headset would give a bump.
Really needs better content. The hardware is almost there (in terms of cost and accessibility of the experience).
It’s slowly getting there. But the current population of VR users is limited to those who will play the same handful of experiences consistently, with hardware that is often cumbersome and loading screens that aren’t super long but become your entire existence, and it’s annoying.
Meta sucks, but they have been a boon for VR development.
As others have said, gaming is fine - AAA and bloated incumbents are not doing well, but the indie sector is thriving.
VR is not on the verge of collapse, but it is growing slowly, as we still have not reached the right price point for a mobile high-powered headset. Apple made a big play for the future of VR with its Apple Vision Pro, but that was not a short-term play; that was laying the groundwork for trying to control or shape a market that is still probably at least 5 if not 10 years away from something that will provide high-quality VR, untethered from a PC.
AI, meanwhile, is a bubble. We are not in an age of AI, we are in an age of algorithms - they are and will be useful, but will not meet the hype or hyperbole being bandied about. Expect that market to pop, probably with spectacular damage to some companies.
Other computing hardware is not really stagnating - we are going through a generational transition period. AMD is pushing Zen 5 and Intel its 14th gen, and all the chip makers are desperately trying to get on the AI bandwagon. People are not upgrading because they don’t see the need - there aren’t compelling software reasons to upgrade yet (AI is certainly not compelling consumers to buy new systems). They will emerge eventually.
The lack of any landmark PC AAA games is likely holding back demand for consumer graphics cards, and we’re seeing similar issues with consoles. The games industry has certainly been here many times before. There is no Cyberpunk 2077 coming up - instead we’ve had flops like Star Wars Outlaws, or underperformers like Starfield. But look at the biggest game of last year - Baldur’s Gate 3 came from a comparatively small studio and was a megahit.
I don’t see doom and gloom, just the usual ups and downs of the tech industry. We happen to be in a transition period, and also being distracted by the AI bubble and people realising it is a crock of shit. But technology continues to progress.
VR
Yeah, I think it’s ripe for an explosion, provided it gets more accessible. Right now, your options are:
- pay out the nose for a great experience
- buy into Meta’s ecosystem for a mediocre experience
I’m unwilling to do either, so I’m sitting on the sidelines. If I can get a headset for <$500 that works well on my platform (Linux), I’ll get VR. In fact, I might buy 4 so I can play with my SO and kids. However, I’m not going to spend $2k just for myself. I’m guessing a lot of other people are the same way. If Microsoft or Sony makes VR accessible for console, we’ll probably see more interest on PC as well.
People are not upgrading because they don’t see the need
Exactly. I have a Ryzen 5600 and an RX 6650, and it basically plays anything I want to play. I also have a Steam Deck, and that’s still doing a great job. Yeah, I could upgrade things and get a little better everything, but I can play basically everything I care about (hint: not many recent AAA games in there) on reasonable settings on my 1440p display. My SO has basically the same setup, but with an RX 6700 XT.
I’ll upgrade when either the hardware fails or I want to play a game that needs better hardware. But I don’t see that happening until the next round of consoles comes out.
Yeah, Sony was my hope here, but despite a few great experiences, they have dropped the ball overall. I’m bored of the cartoony Quest stuff, so I’ll probably not buy another headset for a good 5-10 years, until there’s something with a good library and something equivalent to a high-end PC experience today.
Yup, but with good headsets costing way more than good monitors and generally needing even better GPUs, I’m just not interested. Yeah, the immersion is cool, but at current prices and with the current selection of games, the value proposition just isn’t there. Add in the bulk, and it’ll probably be on my wishlist for a while (then again, the Bigscreen VR headset looks cool; I’d just need a way to swap pads so my SO/kids can try it).
So yeah, maybe in 5-10 years it’ll make more sense. It could also happen sooner if consoles really got behind it, because they’re great at bringing down entry costs.
Unfortunately, Sony was our last hope for consoles, and they half-assed it. The very last hope is that Flat2VR ports dozens of AAA titles to PS5 in rapid succession.
I really, truly suggest diversifying to newsfeeds without comment sections, like Techmeme, for a bit.
Increasing complexity is overwhelming and there’s plenty of bad shit going on, but a lot of your post is overblown.
I agree. But also add in the movie industry that’s been complete trash for a while now. Not to mention books. I’m not sure if we’ll ever see another Harry Potter level book again, at least in our lifetimes.
My take is we’ve already left the golden ages of movies, music, and books and probably won’t get another for an extremely long time.
Video games are going through the same downfall that streaming services brought. Physical media left the movie scene as a standard a while ago, but video games took longer. Now it’s going to be all streaming and subscriptions, where you can never own anything.
Once that happens, enshittification will peak, companies won’t be incentivized to make the games good anymore, standards tank, and people will forget how good things once were.
movie industry that’s been complete trash for a while now.
This is not a callout of you in particular so don’t get offended, but that’s really only true if you look at the trash coming out of Hollywood.
There’s some spectacularly good shit coming out of like France and South Korea (depending on what genres you’re a fan of, anyways), as well as like, everywhere else.
Shitty movies that are just shitty sequels to something that wasn’t very good (or yet another fucking Marvel movie) are a self-inflicted wound, and not really a sign that you can’t possibly do better.
Not to mention an ungodly amount of animated content of all varieties: anime, cartoons, indie (Helluva Boss is hilarious and (un?)surprisingly dark). I recall seeing a screenshot of something French with an amazing art style that I want to look into watching.
One Piece is gearing up for a re-animation from the beginning using its new style from the Wano arc IIRC, and that is a hell of a long epic story.
Interesting! Anything you’d recommend?
Train to Busan, Parasite, Unlocked, Wonderland, Anatomy of a Fall and Close have been ones I’ve seen recently that I liked.
I think some of those are available on Netflix, but as I don’t use Netflix, I can’t say which ones for certain.
Edit: I just realized some of those are vague and will lead to a billion other movies lol. The first 4 are S. Korean, the last two are French and they’re all from 2020 or newer so anything not from there or older isn’t the right one.
As with video games, the real gems imo for movies and music are from the indie scenes.
Not to mention books. I’m not sure if we’ll ever see another Harry Potter level book again, at least in our lifetimes.
Are you talking quality or popularity? Because there are many, many books that are just as good or better than Harry Potter.
Check out the Mistborn and Wheel of Time series for books that are waaay better than Harry Potter. Anything by Brandon Sanderson and Neil Gaiman is a good time.
Also highly recommend any comics by Moebius and/or Alejandro Jodorowsky, and Neil Gaiman. Some incredible mind altering works to enjoy there like The Incal and Sandman.
Wait till the Y2K38 event occurs.
If only we had some way of working with a bigger integer…maybe we’d call it something like BigInteger…
Or just a u64. 64 bit computers are pretty standard nowadays.
I had heard that. Maybe I’ll get my hands on one someday. I hear Commodore makes one.
(I do wonder now if whatever variable is being used to denote time is signed or unsigned, because that would make a big difference, too.)
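To make the signed/unsigned question concrete, here’s a minimal Python sketch of the Y2K38 rollover. The `wrap_int32`/`wrap_uint32` helpers are illustrative, not from any library; they just simulate what happens when a Unix timestamp is stored in a 32-bit integer:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit time_t can hold

def wrap_int32(t: int) -> int:
    """Simulate storing t in a signed 32-bit integer (two's complement)."""
    return (t + 2**31) % 2**32 - 2**31

def wrap_uint32(t: int) -> int:
    """Simulate storing t in an unsigned 32-bit integer."""
    return t % 2**32

# The last second a signed 32-bit time_t can represent:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second past the limit, a signed counter wraps to 1901:
print(datetime.fromtimestamp(wrap_int32(INT32_MAX + 1), tz=timezone.utc))
# 1901-12-13 20:45:52+00:00

# An unsigned 32-bit counter would instead last until the year 2106:
print(datetime.fromtimestamp(wrap_uint32(2**32 - 1), tz=timezone.utc))
# 2106-02-07 06:28:15+00:00
```

So signed vs unsigned matters a lot: signed wraps to December 1901, while unsigned buys another ~68 years (but then can’t represent any date before 1970). A u64, as mentioned above, pushes the overflow out by billions of years.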
My biggest gripe with big tech is how governments of the world encourage their worst behaviours. Governments and businesses have failed to maintain their own level of expertise and understanding of technology.
Today everything relies on tech, but all the solutions are outsourced and rely on “guidance” and free handouts from vendors like Microsoft. This has caused situations where billions are poured into digital transformation efforts with fuck all to show for it but administrative headaches, ballooning costs and security breaches.
I’m so tired of Silicon Valley frat boys being the leaders of our industry. We need to go back to an engineer- and ideas-led industry, focused on solving problems and making lives better, not making bullshit unsustainable business monopolies with a huge pile of money. Right now big tech is the embodiment of all of capitalism’s worst qualities.
P.S. Apologies if my comment is a bit simplistic and vague; I didn’t want to write a 10-page rant but still wanted to say my 2c about the state of things.
It’s a societal bubble, soon we all go pop. c/collapse