Get the news on the NVIDIA RTX 40 launch here: vnclip.net/video/3tZ01ymHZEs/video.html
Learn about EVGA leaving the video card industry here: vnclip.net/video/cV9QES-FUAM/video.html
Grab the BRAND NEW GN 'Amp' Medium Anti-Static Modmat for PC building, in stock and shipping now: store.gamersnexus.net/products/medium-modmat-v2
F*CK NVIDIA. Monopolizing mofos scamming us consumers.
I am quite interested in the iGame GeForce RTX 4090 Neptune, it would be really cool to add it to my white build.
XLR8 = accelerate. Have you never read a personalized vanity plate on a car? Come on, no one is that dumb.
I don't even know what graphics card I put in my computer. It was $250 and I play Spelunky.
You don't say jolf, you say golf. You don't say jold, you say gold. You don't say jif, you say gif.
DLSS is amazing; it can even upscale prices.
Best comment ever.
@Michel van Briemen As listed! Or the actual competition between Voodoo, Matrox, Nvidia, and God only knows who else. And who was it that gimped their cards if specific DirectX hardware was on the computer? I just remember there was a massive lawsuit that broke that company.
and GPU size dimensions
I think I know what really happened to EVGA. They heard the rumors of marketing on these cards from their competition and they were like "I'm too old for maximum dark power obelisk bullshit" and just left the industry. Seriously tho, imagine how hard the market is on these manufacturers: you have to design something that will be functional in cooling down what is essentially a furnace on a chip, make it look "special", market it right, and sell it with a laughable margin because of the insane competition, all the while Jensen goes around asking how dare you want partner information on a product you're going to sell and provide support for, when he doesn't do anything at all and rakes in a lot of money for doing nothing. This whole market is insane and it's unlike anything else really.
I think it's pretty obvious EVGA was pissed that Nvidia was letting random Chinese companies cut their legs out from under them, given all the new cooler manufacturers appearing from thin air. It's all made in China anyway, so we will see if there's a real difference.
6:35 That dev literally went back ~10 years and found an ancient Stack Overflow technique to abuse mouse wheel events with jQuery and control video playback via parallax scrolling jank. Incredible. Counting my blessings as a front-end developer. We are a special people.
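For the curious, the scroll-scrubbing trick that comment describes boils down to mapping how far the page has scrolled onto the video's timeline. A minimal, language-agnostic sketch of that mapping (the function and parameter names are illustrative, not from the actual product page):

```python
def scrub_time(scroll_y: float, scrollable_height: float, duration: float) -> float:
    """Map a vertical scroll offset onto a video timestamp.

    scroll_y          -- current scroll position in pixels
    scrollable_height -- total scrollable distance (page height minus viewport)
    duration          -- video length in seconds
    """
    if scrollable_height <= 0:
        return 0.0
    # Clamp to [0, 1] so over-scrolling doesn't seek past either end.
    fraction = min(max(scroll_y / scrollable_height, 0.0), 1.0)
    return fraction * duration

# Halfway down the page lands halfway through a 60-second video.
print(scrub_time(500, 1000, 60))  # 30.0
```

On the real page, this value would be assigned to the video element's `currentTime` on every wheel or scroll event, which is exactly why it feels so janky.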
Yugi: "Dark Magician Suprim! Use *Absolute Dark Power* Attack!"
Kaiba: "I activate my trap card! *Dark Obelisk* holds your attack!"
Yugi: "Big mistake, Kaiba! My Night Baron uses his special effect *Midnight Kaleidoscope* to absorb your trap card for me to use later!"
Kaiba: "NO!!"
Maybe we should start considering having the GPU be the main board and CPU/etc as addon cards.
I just thought to myself that the Asus cards look like mini-towers themselves. So just have a tower for the graphics card and a tower for the rest :-D It's ridiculous.
Nvidia is actually very considerate of their buyers, providing them with a video card, space heater, dumbbell, and building brick all in one.
lol as in you'd need to be a dumb ass to buy a fucking 1,200-> 2,000 dollar fucking graphics card?
@Snickerdoodle lulz imagine actually having your head so far up Jensen's ass that you're defending nVidia's downfall. Like, "hey PND why do you keep shitting on nVidia all pandemic long, maybe just give them a chance again." Yeah, nah. EVGA themselves dropped that hot brick of shit even if it cost them much of their business. What's amusing to me is they treat the end user at least as much like a piece of shit as they do every single other company, but it's like the average nVidia buyer gets fucked in the ass and loves it.
@MISTER SIR Water cooling isn't magic, it's just better heat distribution. It actually takes way more space and introduces another breakable active moving part: the pump. It's honestly not the greatest solution.
@ArtisChronicles I can't see it coming in the next few years, for sure.
LOL, marketing for these brands feels like they're targeting 13 year olds on Xbox Live, circa 2005. It's absolutely hilarious.
@Shendue lol what are you even on about, nVidia's software fucking SUCKS and I didn't even realize GPU software could be good until having a 5700XT. I don't even use FreeSync because Enhanced Sync is just so much better, it just automatically matches my refresh in just about every game. The drivers are also fine, which is hilarious you're on about this when the 2080ti's drivers were legendarily bad and Ampere's drivers sucked, and then they had to actively nerf your Ampere's performance just to make it stop blackscreening and crashing, an interesting way to age like milk and pay more for less.

I mean, the amount of cope in your post is just insane. And yeah, speaking of bad drivers considering how godawful the entire Ampere launch was is indeed ironic, because every nVtard is still trying to flog a dead horse about the driver problems on release for the 5700XT, as though we can't just scroll through Newegg's reviews of the horrendous driver issues the 2080ti had, and then the bad drivers on RTX 3000 cards.

Basically, the problem with people like you is that you're still stuck in the past with GTX 900 or 1000 cards, expecting to be able to talk about it like we're in 2015 or some shit. It's been a long time now, and the roles are totally reversed. Where you are now is basically where Intel fanboys were back in like 2019, 2020. What's one way to know this? Noticing how you spent no time talking about performance, because we all know how badly nVidia's performance fucking sucks for the cost since Pascal, and it's only been getting worse. Overheating shutdowns btw, fucking lol, enjoy Lovelace. It isn't even Fermi tier at this point. Ampere was Fermi tier. This is just a bad joke at your expense.
@pandemicNEETbux Call me when AMD learns to code drivers, FreeSync doesn't suck compared to G-Sync, and they have comparable features, for example in terms of ray tracing. I used to be an AMD buyer years ago and gladly left after getting bored of dealing with microstuttering, driver problems with some games, overheating shutdowns, and other assorted BS. Never had such issues with an Nvidia card in many, many years. Speaking of bad drivers while praising AMD is fairly ironic.
@Watchful Fox Facts
@Boba Vhett parents. The toy industry learned that people buy for their kids and pets regardless of the economy. They're attempting to recession proof this line of cards because Biden killed our economy.
Of course they are aimed at kids who will tell their parents that it's what they want for a birthday, Christmas or other reward. Young adults recently moved out can't afford them and mature users don't care about the marketing, they buy based on specs so the marketing doesn't come into play at all, it's ignored.
This must have been so incredibly embarrassing to cover. I am convinced that these board manufacturers are convinced that all gamers are idiots.
In fairness, they are. Have you ever tried interacting with "people" on Steam forums? You can easily forget just how F'ing stupid average gamers are until you actually go and interact with the ones not playing your niche city builders or whatever, and realizing these are literally Qanon follower tiers. Plus it's nVidia anyway, so they tend to get a much bigger portion of the bottom end of the bell curve regardless.
I now want to see a Frigidaire GPU. It will be 6 slots long and plug into 3 PCI Express ports just for support. It will be a mix of white, grey, and beige color themes and look like an oversized box with some fans on it. Its website landing page will only have the specs and a user manual on it. It'll also have a remote that you can use to heat up your house in the winter; as such, the card will also have a built-in PSU and require a separate power cable plugged into the "card's" rear I/O. As for branding, it'll only have a part number and be priced just a bit more than the rest of the GPUs out there.
All jokes aside... the idea of using a PCIe slot as additional support doesn't sound all that stupid 🤔
lol I thought you said frigate at first. Reminded me of mandalore's BFGA2 video describing a 40k ship as "a flying gun-brick the size of a freeway" and I thought, well yeah might as well just introduce the frigate class of nVidia GPUs. Then they can also introduce the corvette, cruiser, battleship. At least it'd be consistent!
The new cards are so big and heavy that under normal operation, two smaller graphic cards are orbiting them in an elliptical orbit.
Thats why they need the anti-gravity plate to prevent that from happening, duhh
I get the feeling that the reason many manufacturers have omitted the RTX 4080 12GB pictures is because they originally rendered them as "RTX 4070" and Nvidia blindsided them.
To crawl back now xD
@Mavis dude took a hit rip ✖️
@Mavis came back to see how you're liking that 4080 12gb now that NVidia unlaunched it? 😂
@Drew First bodied
The troll in this comment section, Sodafs, has changed his name. Guess he couldn't take the heat from his fanboy comments. His new name is Armor... How fitting.
By the time we get to the 5000 series, we'll need AIO cards or custom water loops to keep them cooled. I just upgraded from my 1080 Ti, which was a beast of a card from Gigabyte, to the 3080 Strix, and the size difference is actually comical!
Same story here: had a 1080 Ti Strix for a while, sold it to a friend a year+ ago. Got my 3080 Strix recently; that friend came over to clean his computer and was overwhelmed by the size and design. The 1080 looks way cheaper now in materials and build quality.
Asetek can only dream that this will happen. The last time a GPU was factory-designed like this, it got sued and shut down by Asetek (Radeon Fury X).
It's kind of weird how GPUs are getting bigger. We live in a time where everything gets smaller and more powerful.
@Kacey I can't even imagine running a 450w-600w card in this room honestly. Even at 225w + my modest but very efficient 3700x this room gets so hot in the summer. Apparently that 600w TDP figure we got before is because some of the AIB cards are going to have some kind of unlocked power delivery spiking up to 600w, probably the 4090ti or something dumb like that. It makes Fermi look cool. Not that any of this matters because even just a 320w card is hot and at the absolute edge of what I can do without needing a new PSU if I get a 5900x or something too.
Because they're dumping efficiency for power, to be the best raw performance at any cost, particularly the cost of your increased power bill from both powering the monster, and cooling the heat it outputs into your home. At least in winter, you won't need to run your heater nearly as much.
@RogueStar777 I didn't think it would actually work though, because at least in the States you had kids and all these young poor people ages 18-30 who'd suddenly get a $1200 check mailed to them, so of course you can sell a 3070 for that much to somebody who's really bad with money and can't prioritize for shit. But then after they stopped being money printing machines, you'd think anyone would've gotten a clue and stopped buying these shitty ass GTX 3050s and alleged RTX 3060s for over $300. It just sucks because it impacts AMD, which is also a corpo, so they jack prices too.

At this point I'm realizing maybe I shouldn't even bother upgrading this year, because if AMD has prices anywhere close to nVidia, I'm checked out. My 5700XT still works fine, and while I'd love to have a widescreen or better 4k panel, I don't actually need it. Seeing how I'd be buying a new monitor, it's a big chunk of cash for upgrading already. I just assumed people would realize paying that kind of money for graphics cards is utterly insane unless you're literally making money off it, and would stop paying for it.
@pandemicNEETbux They make graphics cards a trendy object to justify ever more aberrant prices, but as there will always be suckers, Nvidia is rubbing its hands...
Pretty disappointed in the latest gen of PC hardware so far, feels like we're going backwards. Everything is getting bigger, hotter and more power hungry.
Card itself aint big, it's just a hunk of cooling
I tried warning all you guys a year ago what kind of a total shitshow nVidia had become but you didn't listen. You could've prevented this. I even called, hell not just this, but Ampere when I warned you all that Ampere was looking pretty sketchy right before release, and it turned out to be one of the most meme-worthy nVidia generations since Fermi. They even started using the Sapphire Nitro Fury-X passthrough cooler design, which was an actual meme on AMD's old R9 300/Fury series cards for how hot and unstable they were, and suddenly nobody bats an eye when nVidia has to adopt it and make it mandatory. Then when I saw the TDPs I just knew it was over, because Jensen will do everything in his power to make sure they have the ostensibly "best" performing halo card regardless how much the rest of the lineup is total shite. This signalled to me they already knew that AMD was gonna beat them with RDNA3 so all what this is is some truly desperate moves on nVidia's part to maintain their image, because optics is all that really matters to nVidia, not the performance. Meanwhile they even managed to drive EVGA away, after having successfully ensured that all Macs (shitty as they are) and all consoles now have AMD hardware on it, thus ensuring much better stability, support, and optimization for productivity software and games running on AMD. I mean what do they even have left, the Switch and Tesla or something? EVGA clearly bailed at the right time. Sadly, the average nVidia buyer is now looking like a beaten housewife trying to rationalize how the corpo really loves them deep down inside, and shelling out 1080ti levels of cash for a shitty 4070 that's going to draw more power and perform even worse than a 7700XT and still require you replacing your 850/750w PSU most likely, so now you get to add that into the cost of upgrading your monitor and GPU. 
It absolutely astounds me that even after EVGA themselves finally had enough with nVidia, that there's still honest to God thousands of people out there who insist on buying nVidia cards for no good reason at all but branding and memes.
I love the BFGPU concept. Finally we are getting real performance gains. You are stupid if you think a halo flagship discrete graphics card for a desktop computer that plugs into a wall should be small and "energy efficient". If you don't want a flagship, don't buy one.
Well done for getting through that video Steve. Do you think any of these manufacturers will realise that a lot of their customers are actually older than 14? Some of those cards are seriously hideous and they come in packaging that make me thankful for mail order so no-one has to actually see me carrying one of those! I think the Zotac cards are the only ones that I looked at and thought "yeah that looks ok!"
@Defining Slawek 🤣oh dear I didn't notice that
ASUS TUF and the MSI ones are okay as well, but tbh the founders edition just looks the best.
Agreed on the zotac, but it says 'live to game'
But but, we need to show off our waifu GPU !!! 🤣
I'm impressed at how little I care about the 4000 series. Well done Nvidia.
@TrajanRomeo Shut up rich kid.
@TrajanRomeo I judge by power consumption thanks. Shit's DOA to me.
@Animalyze71 1080Ti? more like 1080p high lole
@Animalyze71 I mean, in the specific case I was referencing in my reply, hardware would certainly affect frame rates in a game like RDR2. Not to say games have perfect optimization.
@Zaiquiri Is it the games' functions or the programming errors in those games that cause the issues? Small minds always blame the hardware when most of the loss and stutter comes from the code itself; lay folks call it optimization, but it's a good deal more than just that.
That was the best laugh I have had in a solid week! Honestly, these GPU companies are just ridiculous these days. 4 slots is crazy enough, but when they cover their card in sparkly dark matter and fall in love with three-fan designs supported by dark obelisks, I have to wonder if they even know what kind of products they make?
When these cards drop and you do your reviews, could you do a breakdown of which cards would be the best option if you're planning to watercool them in a custom loop?
Went for cases with a horizontal mobo layout a while ago... In the near future we seem to be sticking something small (mobo, CPU, and such) onto the GPU instead of putting the GPU onto something. So is the ATX standard still a thing? ;-)
With how they decided to tune these cards, do you think the longevity of the 40 series will be drastically shortened?
The more I hear of the 4000 series the more I'm excited for the AMD cards
@Big Mat but rtx games don't work with amd right?
@B__Tier well you have FSR on AMD and its very likely the new AMD cards will run Ray tracing better than this last gen
@Cacodemon345 My GPU paying for itself twice over (just through mining, not the other professional work I use it for) is a problem in your eyes? I'm confused. Oh, are you a "noooo you can't just heckin use electricity" guy? My PC's in the basement, it's about 10 degrees Celsius down there in spring and fall and the pipes freeze in the winter if I don't have a heater running. So yes, my 3080 Ti not only 200% paid for itself (not counting a $250K job contract it recently helped me win), it cut my heating bill, I forgot to mention that. Damn you Nvidia! So evil!
@A Fistful of 4K "mine 2x its value in crypto"There's your problem.
ewwwwwww AMD is trash
Your reviews are truly a jift to the gaming world.
Jreatest thread in the comments
@SevasTra388 Say the word gym and replace the m with an f; it's both. It doesn't matter which one you use because there are no fking rules for how to pronounce the g in GIF: giant, gym, giraffe, germ; guard, great, gas, gush. You can say it either way, who the hell cares?
The juy gust doesn't jet it.
Jamers Nexus is awesome!
@D Dood that's obvious to intelligent people, it escapes Steve's grasp however.
Remember, this happened almost 8 years ago, and AMD rocked the scene with the Radeon Fury Nano. Hopefully Intel or AMD do that again, because at this point Nvidia has basically gone off the rails, requiring a full-sized PC for a card that is basically a rebranded 60 or 70 series selling at Titan prices.
AMD still rocks my scene… never could afford a high-end GPU… and I'll never want one anyway.
Just like Apple, Nvidia could focus on making GPUs less bulky and more energy efficient. We're getting to a point where all GPUs will require an entire separate unit outside the case.
@Chief Judge that's a good point.
I love the BFGPU concept. You are stupid if you think a halo flagship discrete graphics card for a desktop computer that plugs into a wall should be small and "energy efficient". If you don't want a flagship, don't buy one.
@Fil Sapia I mean, firstly that's just patently not true and completely against the trend of the last 20 or 30 years in gaming, especially PC gaming. Secondly, probably the biggest reason hardware capabilities need to progress in a somewhat linear fashion is because VR tech is an exploding industry starting to reach maturity in gaming and other computing applications as well, and VR tech has very high spec requirements. The 30 and likely even the 40 series is just scratching the surface of what's possible in VR. At this point GPU tech advancing is more important to VR than it is regular gaming, as the 30 series and 40 series will blow through any game on the market for probably the next 6 years.
@Free Thought Left Not true, developers will just start to target lower end hardware.
@Fil Sapia and the short term future is definitely more power demand, until we figure out how to break the current barriers but right now we've hit bedrock
Had to laugh so much when hearing about the brilliance, dark powers, and anti-gravity technology. Yikes, Albert Einstein was nothing compared to these geniuses.
EVGA pulled out just in time
@Ace Too big to fail? lol. lmao even. You've clearly no idea what the east empire company was.
Some would say they didn't pull out fast enough, as these designs and names look like they could be their babies.
@Shendue for current XFX maybe. But 12-14 years ago they were very formidable AIB. In forums they got a lot of praise and they were extremely popular with their double lifetime warranty.
@ArenZ RIcodeXD XFX sucked balls. A lot of marketing hype, and their cards were faulty AF. I had overheating, shutdowns, and microstuttering like crazy with XFX cards, and their customer service sucked. EVGA? Never an issue, great Step-Up program, fairly good customer service. They weren't perfect, sure. They pissed me off during the pandemic because of their "waiting lists" policies towards European customers, but compared to other manufacturers, they were great.
Nearly made a dumpster baby
"We'll see how long they last as an NVIDIA partner with those AMD colours" Well said!!!!
I remember when my R9 FURY Strix was considered a huge graphics card with huge power consumption... Now some medium range lines are bigger than the Strix and high-end cards can take up to FOUR SLOTS.
I remember the time when a top-level GPU had a 1-slot design, and it was awesome.
I like how you started with ASUS as they seem to be the only one following their naming model seriously. It's like you didn't want people to confuse them with the other nonsense they are surrounded by.
These cards need to start being sold in the home appliance sections of big box stores. The power and heat management are gaining ground on small cooking appliances.
Plumb the water cooling loop into your HWS for free hot showers.
This had me laughing
@Javi Lopex "Yellow stickers denoting the energy usage per year." ^This. Although there is a lot of wiggle room, so it should be based on, say, yearly cost per 1 hour of gaming per day, or hours of gaming per week, and just multiplied by how much we game. Along with hours per week of web browsing and such. They would use unrealistically low gaming time to make the number look low if they could.
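That "yearly cost per hour of gaming per day" figure is simple arithmetic; here is a quick sketch (the wattage and electricity price are made-up example numbers, not figures from the video):

```python
def annual_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly electricity cost of a component drawing `watts` while in use."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# A hypothetical 450 W card, 2 hours of gaming per day, at $0.15/kWh:
cost = annual_cost(450, 2, 0.15)  # roughly $49 per year
```

Which is also why the commenter's worry matters: halve the assumed gaming hours on the sticker and the advertised yearly cost halves with them.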
@Zag Zagzag the US has enough natural gas for 98 years. & it is a renewable resource. There should not be any shortage.
True. Ninja Foodi and off-grid do not go together.
I would love to see an electricity drain test with a 4090 overclocked and AMD's new 95W+ CPUs. I hope it fries a PSU or 2.
To be fair, GALAX is partially a Japanese company - localization is tough, lol. The cards are pretty well binned in my experience, and they've got gamers at the helm in general (I've talked with a couple of them thanks to them advertising with us once)! I wish I could get their stuff in Canada.
Thanks for the walkthrough of all the 40xx cards. Was kinda hoping to see something smaller like what they did for the 3090 Turbo. Looks like nothing is going to fit in my SG13 lol! Or just use a ribbon and mount the card at the top of the case.
If they're going to make GPU coolers this thick, surely they might as well stick a fan at each end and make them closer to a CPU tower cooler. That way the heat from the GPU is actually exhausted from the case through the slot rather than just being thrown into the motherboard and glass side panel.
NVidia: "We need you to rebrand your RTX 4070s as RTX 4080 12GB"_EVGA has left the chat._
@Ernismeister rtx 5090 512MB
@Usama Rasheed more the other way around: Nvidia tried to "do" EVGA and they said "not with us, idiots"
@insuna they'll use stickers just like they did with the super release of 20 series.
@thesinaclwon There'll be no worries on that unless crypto mining becomes profitable again.
@Ernismeister Ultra underrated. Made me chuckle.
When I purchased an AMD 6950 from HIS, it also came with a support stick. That was a 2 slot card. With these being 3 to 4 slot cards, I'm not really surprised that they have need for them.
With this series of GPUs, watercooling seems to be the option to go for; the Inno3D iChill Black is the only one good-looking enough for me.
With my favorite GPU manufacturer leaving the business, the only other company I would even consider is Asus. Considering how good their 780 DCU II OC was, I'm willing to give them the benefit of the doubt on their latest design for the 4000 series. Or I might just skip this gen since I have a 3080, and this is looking to be the worst price/performance generation ever, and that's saying a lot considering the past two years. Also, it's not pronounced like peanut butter.
If something is taking up 1/2 slot, isn't it effectively taking a whole slot? What cards fit into 1/2 slot?
You know, back in the Pascal days, I thought that "Zotac Arctic Storm" and "Palit Super JetStream" were absurd names for graphics cards. Now we have serious "Night Baron" and "Midnight Kaleidoscope" cards with bionic shark fans supported by dark obelisks and anti-gravity that give me 905,637 square millimeters of absolute dark power, and I don't know if I want to laugh or faint... great vid though!
I predict a strong trend in watercooling this generation. It's starting to make sense now with oddball slot multipliers.
@Superbus Starodub Ikr, nobody saw that one coming. I'm quite the Nostradamus. Just like when I predicted I would eat spaghetti for dinner yesterday... I'm never wrong.
wow what a brave prediction
either that or a strong trend in people skipping 40-series
This 40xx series reminds me of the 900 series a few years back. They were getting BIG and way too power hungry. Competition came up and forced actual innovation for the 10xx series.
Whoa, those cards are huge. It seems like overkill for most of us. If it stays beyond the mainstream, that's probably better for the electrical grid.
Oh man... I remember how sick cards looked back then. The red MSI cards from the 9xx generation. Or the EVGA cards with full copper coolers. Or the Jetstream cards, the Aorus Xtreme cards from the 1xxx series. And so on... Or my very first card, the AMD Sapphire R9 390X Nitro. Beautiful.
EVGA looking smarter and smarter every day since the 40 series announcement.
@Cacodemon345 did you just eat a sticky grenade?😄 I meant that all iPhones look the same so you need a sleeve to make them look different. So if all AIBs leave and only Nvidia makes the cards, all cards will look the same just like all iPhones look the same
@Cacodemon345 I suppose you entirely missed the point. It's not about what Founders Edition Nvidia GPUs or iPhones do, can do, or can't do (obviously you can't call someone with a GPU...), it's about them all looking the same if there are no AIBs left for Nvidia's GPUs. Can't deny it: if you place 5 iPhones next to each other they look the same. No widgets, no custom grid size or spacing, etc. So what's left is a different wallpaper, and of course attachments and cases/sleeves. And yeah, calling it "gaming" if you play on a phone is still ridiculous in 2022. That's like calling yourself a gardener if you have one potted plant in your kitchen, or calling yourself a professional baseball player if you played catch as a kid. #shotsfired (ok, not really, but I hope you get my point)
@XuryFromCanada GPUs aren't general-purpose computers, and even if they are, they aren't designed to work like one. iPhones are useful for more than just gaming, with an entire ecosystem built around them. Not GPUs.
@XuryFromCanada I don't really care too much about what my next graphics card will LOOK like physically as long as it offers good specs at a good price and good drivers ... but I kind of agree with you ... the day nvidia manages to drive away all of their AIB partners will be a sad day ... for consumers and nvidia alike.

There once was a popular graphics chip manufacturer who at some point decided to exclusively produce graphics cards "in house", thereby pissing off the AIBs who previously worked with them ... the AIBs moved on to partner with nvidia instead, and that other graphics chip manufacturer went bankrupt and got bought by nvidia a while later ... you know who I'm talking about ... 3dfx of course ...

I think the AIB cards played an important role in nvidia's rise from being just one relatively new and unknown graphics chip manufacturer amongst many (ATI, Avance Logic, Cirrus Logic, Rendition, S3, Tseng Labs ... they had all been there long before nvidia) to becoming the biggest manufacturer of (PC) graphics chips.

It appears that nvidia has forgotten what happens when you piss off your AIB partners ...
@kyle hubner They do look great! But I love the comical variety of the AIB cards and all that crazy marketing talk that just hypes you up to rip & tear 😄
Thanks for the video, saves a lot of time. I think the days of air cooling are coming to an end. 4 slot cards blowing heat everywhere just seems cringe, and any high end cpu that's air cooled is probably being held back. Surely anti gravity shark fans will solve it.
I think I'm pretty happy with my EVGA FTW3 3080 10GB. I'll probably skip the 4000 series and may move to AMD this generation or next. Waiting to see RDNA 3.
Well the names are a bit ridiculous but they are also fun, in such a way that I much prefer them over unassuming names. Just like with PWOs.
What an absolute circus, thank you brands for a good laugh! I got my hands on a brand new 3060, see you in another 3 card gens or so! 😂 Probably gonna go with AMD next to be honest, Nvidia is really getting ridiculous 🤔
Inno3D isn't just "brutal", they're "201% committed". That's impressive.
Does that mean they have fired half the staff?
”And we will charge you extra for it”
Maybe they really should be committed…
Since they are in Hong Kong, maybe BRUTAL is their tongue-in-cheek comment paying homage to Hong Kong's brutal government events in 2019 thru now.
@Mike Watts Being an Aldi customer and an Inno3D card owner, I approve
I recently bought a 6800 from Asus and I was like, oh lord, the size of this card is absolutely absurd, and it barely fit into my case after moving a hard drive. Then suddenly: Nvidia 4000 has entered the challenge.
Man, we definitely need Noctua aftermarket GPU coolers to cope with this insanity....
Thanks for the laughs and making my day a bit brighter :) You should start doing more videos with this sense of humor!
Now we need case manufacturers to include radiators in BOTH side panels and we are good.
Can't wait for an RTX 7090 that will finally use all 7 of my PCIE slots
By then we having graphic cards formed as motherboards.
@DagobertX2 When the swat helicopter flies over your house. The pilot will call into HQ. Pilot: On the thermal sensor it seems we just found another major marijuana grow op! Or maybe another current gen PC Gamer....
Remember to attach anus lead.
@Misty Kathrine 😆
@Zamaric I mean, I'm actually expecting a future GPU to have a power cable coming out the back that you plug directly into your wall.
Heh, decades ago before it had the name "gpu support stick" I was using a cassette tape case (remember those?), wedged between the bottom of the PC case and the gpu. It was the perfect fit, non-conductive (in case it got dislodged and fell against any motherboard contact points), and depending on your PC case, mostly invisible.
I recommend a long ziptie instead, you should have something to hang the GPU off in the case. Support sticks have a tendency of finding themselves tipped over or obstructing something. In a pinch, twine also works.
DLSS now upscales GPU sizes, images, and prices. What amazing work.
I think cases will need 2x PSUs moving forward... 1 for gfx cards, 1 for the other components.
If I install an anti-gravity plate, would I have to put a 20kg rock on top of my computer to keep it from floating into space? Would a gravity-related event be covered by the extended warranty?
You can really feel Steve's will to live fading over the course of this video
It's also the first time I see Steve visibly cringing at the wannabe edgy/cool marketing terminology lol
He's being drained by the dark obelisk.
His reactions are priceless and somewhat concerning. Dont lose the will to live Steve!
it gets sucked up by the Dark Power
@Benjamin Oechsli No True Nerdsman. If he wanted it pronounced that way, he should have written it that way (blame USA copyright maximalism, since Jif the food brand could sue Jif the file format even though the risk of confusion is non-existent). Though I am sad that this decades-long dispute is irrelevant now that WebMs are better quality at lower file sizes and hardware accelerated on practically every mobile device.
These marketing summaries remind me of the good ol' days when board partners would put graphics on the outside plastic of the GPU shroud. Feels like gaming is taking a step backward.
We've somehow gone from great names like the Voodoo cards to companies naming their cards the XXXX Fan Dabby Dozie... There must be a bet between the CEOs, or at least the marketing directors, of these companies to see who can release under the lamest name possible. SURELY they can't suck at their jobs so much that they actually think the average person is going to see those names and think "sounds good"...
looking for their performance comparisons from you guys! 🤞
PS5 uses 300 watts, around 200 W for PS4 emulation. These cards are insane, especially these days with the energy crisis here in Europe. I get that they're more powerful and will now be able to handle ray tracing at a proper frame rate, but... actually, you know what? Maybe Nvidia thought it through and will let us use them as heaters during the winter. Might be slightly cheaper than using gas to heat radiators. Good job, guys.
@Johnny boogalo yeah casual and price are fair points, I keep forgetting I'm actually old x.x also I feel called out for having a 1060 myself LOL
@Goldy Whether you hate it or not, most people play on console because of the price, and most are just casual gamers. Only hardcore gamers get high-end PCs; in fact, most PC gamers are still using GPUs like the 1060. Consoles will always be around. The PS5 can't even stay in stock because of demand and scalpers, and the Xbox Series X is selling great too.
@Cacodemon345 gaming consoles shouldn't exist in 2022 anymore anyways, unless we're talking about the Switch.
PS5's GPU is RDNA1 with RT bolted on top of it. RDNA1 is known to be very efficient compared to GCN. This, however, doesn't spell good news for the PS6 and next-gen Xboxes, because if AMD decides to go all-in on this route, I can see gaming becoming unaffordable altogether.
I burst out laughing at "new level of brilliance and absolute dark power", that's actually incredible.
Imagine that in a glass case with Trident Royal memory. BLING overload.
That's exactly what got me too. The whole build up of everything being nonsense, that was the straw that broke the camel's back.
My routine of buying a reference card, taking off the cooler, adding an EK block, and plumbing it into my loop seems to make more sense with each generation of card that comes out.
I'm not feeling any regrets over my purchase of a 3080 10gb on prime day. These 40 series cards are not even a little bit compelling.
We are going to need a server room for this new hardware. The size, power, and heat?!? New AMD at 95 degrees and 4-slot beasts.
I swapped out two 1080s for a 3080. I was impressed that the 3080 was lighter and smaller than a single 1080 and thought the next series of cards would surely be more efficient and powerful. I thank Nvidia for releasing such cringe. My friend is a scalper who buys cards and sells them on; even he said no one would buy the 4000 series, so he's not going to bother. Honestly the pricing is just ridiculous. I'm fortunate enough that I could buy one of the new cards, but I didn't get that money by making poor financial decisions and I'm not going to waste it on this garbage. Also, there are no games worth playing anymore that would warrant such a card, and mining is pointless now. What are Nvidia doing?
Also, being so power hungry, you'd be stupid to try and mine with 40 series, lol
I am interested in upping my "dark power" game. Would I be best going for a Galax card to get the "Dark Obelisk" or would Palit's "absolute dark power" be more powerful and ...absolute? I feel like the "dark obelisk" would require more rituals and sacrificing and I'm already caught for time as it is!
I love that the dark obelisk and the dark power graphics cards are covered with RGB lights.
One step closer to turning to the dark side. "Yes, yes, use the anger, Anakin, feel the power of the dark side."
Yeah, might get dangerous ;) . Seriously though, what happened to these manufacturers ?! It's like they are advertising toys for small children. The marketing jargon is ridiculous. I think that EVGA left the GPU scene at the right time. This all feels like a bad comedy.
... and then there's the whole issue with anti-gravity. Will the dark powers grant me the power to defeat gravity or will I have to decide between the two?! Should I buy a range of cards to take advantage of all the magic or would I be meddling with powers I cannot possibly comprehend?
What about combining Dark Obelisk stick with Dark Power for the greatest effect? Although that would definitely require a sacrifice to afford it...
I love the BFGPU concept. Finally we are getting real performance gains. Anyone who thinks a halo flagship discrete graphics card for a desktop computer that plugs into a wall should be small and "energy efficient" is stupid.
At some point, it would make more sense to just make the super hardcore GPUs a separate module. One box with a GPU (or 2-3 if you're insane), power supply, and dedicated cooling, jumpered to a PCIe connector on a single-slot expansion plate, with a small cable to the GPU module. If you're going that hard, I doubt anyone is going to care about an extra box as long as you have ALL THE POWER!!!!
I can't wait to get myself one of those extra large rate cards from PNY!
It's getting to the point where these GPUs might as well just be integrated into a motherboard and optimized. Or at least the mobo manufacturers should start putting the GPU slot on the bottom (for support).
I really think the GTX 1000 series was the peak of GPU design.
@Dingickso The funny thing is that the man who invented that term (Yahtzee) meant it as a tongue-in-cheek way, and regrets saying that now.
The 3000 series was actually really good value if you got them at MSRP, especially the 3060 Ti Founders at $399, faster than a 2080 Super.
I guess they really meant it when they made the slogan "Gaming Perfected". If I wasn't playing at 4K I would probably keep running my Titan XP a little longer tbh.
@Susan doesn't understand fair use That's admirable but you should use all that money you saved from not upgrading your GPU to get a 120hz+ 1440P monitor and stop playing shit at 1080p
In hindsight, the 1080ti was the smartest buy ever. Unless you go 4k, that thing is still absolutely capable in 1080p/1440p and will be okay for another few years depending on what you use it for and want out of it. Stunning value!
My EVGA is far more stable than my Gigabyte graphics card for rendering; it's a shame to see them leave! I'm holding out and hoping Navi 3 brings a smackdown. Hopefully all the extra cash from their CPU business helps their graphics card development!
Wonder when the fan will have enough LEDs to make full patterns as it spins?
The GPUs need their own cases now.
Yup. I'm hopping onto the AMD train from now on. I've always gone with their CPUs, and now that Linux supports most/all of my games, Windows is shitting the bed, and Nvidia is getting too expensive, I'm just going to go AMD for GPUs as well. Support is awesome for them too, so I'm done stressing over this shit. Cheaper, better, faster. EVGA saw what was happening.
If you're gaming on Linux, AMD is the way to go.
When your graphics cards need walking sticks to stop them from falling over, it's time to consider whether things have gone too far.
@lunsmann Yep, something like the Thermaltake Level 20 VT would relieve a lot of that bending stress on the PCIe slot, but I'm not sure it's long enough. Or many of the open bench cases.
Maybe the good old fashioned desktop case design needs to be brought back. Forget tower designs.
@Nick Stanley Shout out to Pyro and his 18 subscribers :) I am one of them
@Pietr Muffei , Pyrocynical
Kane & Lynch
It's getting to the stage where we will have to have a stand alone box just to house the video card.
The thickness of these reminds me of when I owned a 4870x2 lol. Not quite as wide as these beasts but it was heavy af.
At this point even if the new AMD cards are 10% less in performance I’d go with AMD.
That moment when "for serious gamers only" style marketing becomes more embarrassing than straight up waifu cards.
We're officially at the point where the ATX standard now needs to apply to graphics cards: we should screw the graphics card into the chassis and then have the motherboard plug into the graphics card's PCIe slot.
I remember old cases with guide rails on the front to support long ISA cards. It's like fashion: everyone who kept their old bell-bottoms from the seventies was back in style a few years ago...
When you guys get your hands on 40-series cards i hope, in your testing, you make sure to address lag, framerate and frame times. Relevant analysis: vnclip.net/video/bEr9AmEkImg/video.html
Somehow I have a feeling that game requirements won't scale with the availability of this latest generation. There will be too many people who used to go for -80 or -90 series cards that will settle for a -70 instead, or even simply wait longer than normal. If and when that happens, I am sure Nvidia will blame demand, but it will more likely be supply that's to blame for once. What are they thinking? That everyone and their granny will upgrade to a 1200-watt power supply, a new case, a custom cooling solution, and double their energy costs, all without having games that actually require any of that? 4K@240Hz is nice, but it's not a quantum leap in experience, is it?
This is what happens when marketing departments and businessmen make decisions that should be left to engineers. Wrong decisions get made.
Just bought myself a brand new 3080Ti for $500. Thanks Nvidia for making my choice so easy. I'm glad to skip the 4000s altogether. Still got an eye on AMD though...
It's ironic how detached the manufacturers are from the customer base. This generation really deserves a hard flop, and we can only hope the next one is better
@Animalyze71 which won't ever happen cause that gets clicks. Hell even when mocking it you get as much if not more views.
Until we can make people understand they don't need these cards yet and stop wasting money on them, the cycle will only continue. Once content creators stop drooling over these new releases, making people want them more than they need them, that's when things will calm down.
@Pirojf Mifhghek Could you imagine if we were still pushing multi-GPU SLI/Crossfire cards at this point? Imagine needing an entire power circuit just for your GPUs.
@Covante PNY is also headquartered in the US.
Looking forward to the 5000 series; those GPUs are way too cheap and small.
"The company's Top brutalist Model is the I Chill Frostbite" gave me the hardest and most uncontrollable laugh in a long time. Thanks man!
These massive cards make me think that we should not have moved to vertical tower cases, and should have stuck with horizontal cases instead.
Between the astronomically increasing power draw and the equally gigantic air coolers, when are these companies going to realise that they can't push performance this way year over year? These cards draw more power than an average office computer and are getting to be the same size as small mid-towers. INSANITY!!!!
I've not been this underwhelmed by a GPU launch in many years.
@You Are Talking To Yourself is it really fps if the GPU makes it up?
@Cracksüchtiger Pockenaffe! Nvidia cannot open source their drivers because the drivers do not belong to them. Nvidia licenses technology from other companies. So Nvidia cannot give away what is not theirs.
Even though the performance improvement is decent, it's all so underwhelming and a parody of itself.
Just wait for the performance gains, boy.
@realbadTech I find the frame interpolation suspect. How usable is it really going to be? Are we talking about a frame of latency? 5 or 10? I want to see it demonstrated thoroughly before I get hyped up.
At what point do you think the graphics card is what we buy and then we put a cpu, ram, and storage into it instead of mobo?
I haven't been here in a min, love the new backdrop and your content. Pro tip, back the BKG lighting down, get rid of that glare on the blue tool board, and shoot with a little more depth of field. Keep up the good work!
NVidia: "We need you to rebrand your RTX 4070s as RTX 4080 12GB" EVGA has left the chat.