Avieshek@lemmy.world to Technology@lemmy.world, English · 1 month ago
Edward Snowden slams Nvidia's RTX 50-series 'F-tier value,' whistleblows on lackluster VRAM capacity
(external link: www.tomshardware.com · 101 comments)
TeamAssimilation@infosec.pub · 1 month ago:
Edward Snowden doing GPU reviews? This timeline is becoming weirder every day.

Winged_Hussar@lemmy.world · 1 month ago:
Legitimately thought this was a hard-drive.net post.

GamingChairModel@lemmy.world · 1 month ago:
"Whistleblows" as if he's some kind of NVIDIA insider.

0x0@programming.dev · 1 month ago:
Intel Insider, now that would've made for great whistleblowing headlines.

Eager Eagle@lemmy.world · 1 month ago:
I bet he just wants a card to self-host models and not give companies his data, but the amount of VRAM is indeed ridiculous.

Jeena@piefed.jeena.net · 1 month ago:
Exactly, I'm in the same situation now, and the 8 GB in those cheaper cards doesn't even let you run a 13B model. I'm trying to research whether I can run a 13B one on a 3060 with 12 GB.

The Hobbyist@lemmy.zip · 1 month ago:
You can. I'm running a 14B DeepSeek model on mine. It achieves 28 t/s.

Jeena@piefed.jeena.net · 1 month ago:
Oh nice, that's faster than I imagined.

levzzz@lemmy.world · 1 month ago:
You need a pretty large context window to fit all the reasoning; ollama forces 2048 by default, and a larger context uses more memory.

manicdave@feddit.uk · 1 month ago:
I'm running deepseek-r1:14b on a 12 GB RX 6700. It just about fits in memory and is pretty fast.
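A rough back-of-envelope sketch of why the numbers in this sub-thread work out: a 13-14B model fits in 12 GB but not 8 GB once quantized. The 4-bit weight default and the flat 1.5 GB allowance for KV cache and runtime buffers are illustrative assumptions, not figures from the thread.

```python
def estimate_vram_gb(n_params_billion: float,
                     bits_per_weight: int = 4,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate for running a local LLM:
    quantized weight size plus a flat allowance for the
    KV cache and runtime buffers (assumed values)."""
    # 1e9 params at bits_per_weight bits each, converted to gigabytes
    weights_gb = n_params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# 14B at 4-bit: roughly 8.5 GB, which fits a 12 GB card
# (3060 / RX 6700) but not an 8 GB one once you add context.
print(estimate_vram_gb(14))
# The same model unquantized at fp16 needs far more than any
# consumer card offers.
print(estimate_vram_gb(14, bits_per_weight=16))
```

Note that raising ollama's context window past its 2048-token default (the `num_ctx` model parameter) grows the KV cache, so the real overhead scales with context rather than staying flat as this sketch assumes.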
secret300@lemmy.sdf.org · 1 month ago:
Swear next he's gonna review hentai games.
Oh wait… https://www.youtube.com/watch?v=fAf1Syz17JE

newcockroach@lemmy.world · 1 month ago:
"Some hentai games are good" –Edward Snowden

ඞmir@lemmy.ml · 1 month ago:
I'll keep believing this is a theonion post.

Simulation6@sopuli.xyz · 1 month ago:
Does he work for Nvidia? Seems out of character for him.
(reply to the video linked above) Note that this is from 2003.