
What graphic card should I buy?

submitted by rapid_water to AskUpgoat 9 months ago (Sep 10, 2024 22:28:47) (+4/-0)

I want to run AI models from Hugging Face. Should I buy the 3060 12GB for $285?

Or the 4060 16GB for $450?

This will mainly be used to learn how to run models and play around with image/speech generation. I don't want to regret my purchase, but $450 is a lot...
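A rough way to sanity-check whether a given Hugging Face checkpoint fits in 12GB vs 16GB is to count its parameters. Below is a minimal Python sketch, assuming fp16 weights (2 bytes per parameter) plus roughly 20% headroom for activations; the "gpt2" name is only a placeholder model.

import torch
from transformers import AutoModelForCausalLM

# Placeholder checkpoint; swap in whatever model you actually want to run.
model = AutoModelForCausalLM.from_pretrained("gpt2", torch_dtype=torch.float16)
n_params = sum(p.numel() for p in model.parameters())
weights_gb = n_params * 2 / 1e9      # fp16 = 2 bytes per parameter
estimate_gb = weights_gb * 1.2       # crude ~20% headroom for activations/cache
print(f"{n_params / 1e6:.0f}M params ~= {estimate_gb:.1f} GB VRAM (rough)")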


16 comments


[ - ] UncleDoug 1 point 9 months ago (Sep 10, 2024 22:38:08) (+1/-0)

Something expensive and overpowered so you can continue using tlol and YouTube.

[ - ] Trope 2 points 9 months ago (Sep 10, 2024 22:45:33) (+2/-0)

Considering video games look no different than they did a decade ago, other than higher resolution and frame rate, I'd say you'd do well with any card, assuming AI doesn't need to run at 4096×2160 and 120 frames per second.

At that price point, you can expect a nice mid-level card that will serve you well for many years. More RAM and video RAM should be best for video rendering.

[ - ] BrokenVoat 1 point 9 months ago (Sep 11, 2024 07:01:17) (+1/-0)

Bullshit, even 22GB models stutter thanks to Win11. Not sure a bigger card solves the issue.

[ - ] Trope 0 points 9 months ago (Sep 11, 2024 23:40:29) (+0/-0)

I’m not disagreeing with you but why are you installing Windows 11?

[ - ] lolxd 0 points 9 months ago (Sep 14, 2024 01:58:36) (+0/-0)

Windows 11 is 80% bloatware.

[ - ] DitchPig 0 points 9 months ago (Sep 10, 2024 22:56:11) (+0/-0)

The more RAM the better.

So, whatever 3000+ series card has the most RAM is what you are after.

4080 16GB, or might as well just go 3080 Ti 12GB.

[ - ] GrayDragon 0 points 9 months ago (Sep 10, 2024 23:00:12) (+1/-1)

"... used to learn ..." "... play around ..."

Based upon the limited information, it sounds like you should go cheaper. If you can justify it with other things like gaming, then maybe go higher.

Anyway, I presume you mean the RTX 4060 Ti 16GB for ~$430. This chart might also help:

https://www.videocardbenchmark.net/high_end_gpus.html

Ranks:
33 - RTX 4060 Ti 16GB
56 - RTX 4060
81 - RTX 3060 12GB

[ - ] dosvydanya_freedomz 0 points 9 months ago (Sep 10, 2024 23:22:12) (+0/-0)

Yeah, the 3060 is not that great despite the RAM size.

[ - ] Cantaloupe 3 points 9 months ago (Sep 10, 2024 23:18:18) (+3/-0)

That's a smart problem to have. So you probably have some money.

So the 4060 with 16GB of VRAM is significant.

Good for deep learning, with larger models and bigger datasets.

Good for NLP & vision.

[ - ] puremadness 0 points 9 months ago (Sep 11, 2024 00:38:58) (+0/-0)

This^
I'm on a 3060 8GB and I'd drool over a laptop with the 4060 + 16GB for the AI stuff; it's pretty intense computing.

[ - ] dosvydanya_freedomz 3 points 9 months ago (Sep 10, 2024 23:20:49) (+3/-0)

Buy the 16GB model, it's better for what you want to do. It has more RAM than the 3060, plus you have access to frame-gen tech off the bat.

[ - ] TheOriginal1Icemonkey 3 points 9 months ago (Sep 10, 2024 23:28:52) (+5/-2)

You should spend your money on guns and ammo.

[ - ] Trope 2 points 9 months ago (Sep 11, 2024 00:29:19) (+2/-0)

Sound advice.

Especially considering the active invasion.

[ - ] beece 0 points 9 months ago (Sep 11, 2024 00:28:20) (+0/-0)

Git a gud one this time Cletus!

[ - ] puremadness 0 points 9 months ago (Sep 11, 2024 00:37:34) (+0/-0)

I can run on a 4-year-old laptop (high-end gamer laptop, but still) with a 3060 8GB.
If it's a desktop, I'd want the biggest one, so 4000 range for sure. $450 ain't bad; I paid $900 for a 2070 8GB years ago, just to play X4.

[ - ] CoronaHoax 0 points 9 months ago (Sep 13, 2024 14:32:03) (+0/-0)*

IMO, if you have no clue about your dataset size, 12GB should be fine. Also, it's a card; you'll be able to resell it and buy a new one.

With Nvidia, newer models have "more processing power" per CUDA core, etc. So if your datasets never use more than 4GB, you'd technically be better off with a newer model than an older one with more GB.
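A minimal PyTorch sketch (my own example, assuming a CUDA build of torch is installed) for checking what a card actually reports in terms of VRAM and compute capability, so you can compare it against your dataset and model sizes:

import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)   # first visible GPU
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM, "
          f"compute capability {props.major}.{props.minor}")
else:
    print("No CUDA device visible")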

This site will tell you the amount of RAM, CUDA cores, and CUDA version of each card:

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units

Keep in mind the card is never really your make or break. You'll be lucky to get 2x performance over even the GeForce GTX 950 (the oldest card that runs modern AI code).

Properly reworking your data, being selective about what you bring in, and things like loading your data as a memory map beforehand instead of into normal RAM on every run will help your speed far more than the maybe 2x or 4x you get from spending gold-level sums on the card.
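A minimal numpy sketch of that memory-map idea (the path, shape, and dtype are placeholders, not anything from this thread): write the dataset to disk once, then map it so later runs page rows in on demand instead of reloading everything into RAM.

import numpy as np

path, shape, dtype = "train_data.bin", (100_000, 512), np.float32   # placeholders

# One-time conversion: dump the array to a raw binary file on disk.
np.ones(shape, dtype=dtype).tofile(path)

# Later runs: map the file instead of loading it all into RAM.
data = np.memmap(path, dtype=dtype, mode="r", shape=shape)
batch = data[:4096]                  # only these rows are actually read from disk
print(batch.shape, float(batch.mean()))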