

Content Creator’s Guide To RAM and CPU Cores: More Is Better, Right?

By Anthony Thurston on April 25th 2015

As photographers and videographers, we ask a lot from our primary editing systems. We need them to be fast, efficient, and reliable. But just because you use and rely on these computers doesn’t mean you are all geeked out over tech specs.


So today, we are here to answer a question you have probably thought about at some point if you have upgraded to a new editing machine: How much RAM and how many CPU cores do I need? The conventional wisdom has always been that more is better, but is that really the case?

Well, the short and sweet answer is yes, but only to a point. Lucky for you, the guys over at Linus Tech Tips have made a great video with all the details you need to know. Unlike other resources you may find, they actually based their testing on the Adobe Creative Cloud 2014 programs (After Effects, Photoshop, and Premiere), so you can get a pretty clear look at what sort of power you need to get the most out of these programs.

As you can see, more is definitely better across the board, but once you get to a certain point (the sweet spot, as we will call it), your return on investment drops dramatically. So while it might be cool to brag about having a maxed-out system, its performance won’t actually be much better than a system built at the sweet spot.

Where does your current system stand in relation to the sweet spots talked about in this video? Maybe it’s worth upgrading your RAM or CPU a little to see your performance increase a ton, or maybe your system is right where it needs to be to get optimum performance without spending more to get small gains.
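Before comparing your machine to those sweet spots, it helps to know exactly what it has. Here is a minimal Python sketch for checking core count and installed RAM using only the standard library (the RAM query is POSIX-specific, so on Windows you would need a different call or a third-party library):

```python
import os

# Logical CPU cores visible to the OS (counts hyper-threaded cores too)
cores = os.cpu_count()
print(f"Logical cores: {cores}")

# Total physical RAM -- POSIX-only (Linux/macOS); on Windows you would
# need ctypes/GlobalMemoryStatusEx or a library such as psutil instead
total_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
print(f"Total RAM: {total_bytes / 1024**3:.1f} GB")
```

Note that the logical core count includes hyper-threaded cores, so an i7 with 4 physical cores will report 8.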

[via Linus Tech Tips on YouTube]

This site contains affiliate links to products. We may receive a commission for purchases made through these links, however, this does not impact accuracy or integrity of our content.

Anthony Thurston is a photographer based in the Salem, Oregon area specializing in Boudoir. He recently started a new project, Fiercely Boudoir to help support the growing boudoir community. Find him over on Instagram. You may also connect with him via Email.

Q&A Discussions

Please log in or register to post a comment.

  1. Rob Harris

    Keep in mind real life. How many people work only with their Adobe CC software? Do you also have some email software open? Maybe a web browser? Are you referencing a Word file? Sometimes we have to multi-task on our computers. Many different programs may be open at once, which means more RAM is better. Just something to think about.

  2. Greg Silver

    I knew this article would bring out some discussion on recommended hardware. Good suggestions.

  3. Barry Cunningham

    Didn’t get a lot of useful information on processing workflow in Photoshop or Lightroom from this video.
    I’m not rendering animations or video on a 12-core Xeon system, so that part is no help.
    The benchmark processing being used for Photoshop is not specified. The bit about more than 8GB of RAM not being useful makes me wonder. The conclusion for that section (on a 12-core Xeon system) was “It depends…”. Again, not really useful. I do know that trying to process 60 stacked images in Photoshop on my paltry 4-core machine with only 8GB of RAM brought the system to its knees, whereas it ran smoothly after I upgraded to 24GB.

    • Drew Valadez

      If you are testing components, you want to eliminate any other factors to come to a conclusion.

      Yeah, you won’t have a 12-core Xeon CPU running your video renders, but you can’t test the theory of 4/8/16/32/64GB of RAM performance differences if your CPU can’t keep up.

      GPU benches are done this way as well. You hear over and over again that you don’t need an i7 for video gaming or more than 4GB of RAM, yet EVERY bench is done with a top-shelf i7 and something like 16GB of 2133MHz RAM, just to make sure the tested component is the one pegged at 100% to show its full performance.

      What you described was an issue with RAM; that is your scenario, but it won’t be for everyone. Whether I am running a 2-core i3 or even the 12-core Xeon, going from 8GB of RAM to 24GB will show improvement in the example you just gave.
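      The isolate-one-variable approach described above is the same principle behind software micro-benchmarks: hold every other factor constant, repeat the run, and report the least-noisy timing. A minimal Python sketch (the workload here is a made-up stand-in, not anything from the video):

```python
import timeit

def workload():
    # Stand-in task for the component under test: fixed CPU-bound work
    return sum(i * i for i in range(100_000))

# Repeat the measurement and keep the minimum: the fastest run is the
# one least disturbed by everything else on the machine -- the same idea
# as making sure only the tested component is the one pegged at 100%
times = timeit.repeat(workload, number=10, repeat=5)
print(f"Best of 5 runs: {min(times):.4f} s")
```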

  4. Anthony Saleh

    Hold the phones — did he say ProRes render on Windows 8 …or did he mean Avid? Go to 4:17

  5. robert garfinkle

    Learned a lot from this video – good stuff. Bookmarked it.

    Right off the bat, when he mentions ECC RAM, I think of server / workstation class computers. What I take away from this is: when designing a system, either get the software / benchmarking tools and a calculator out, similar to what he did, and / or get someone like him to help you out…

    If I were designing a system –

    0. realize you just can’t walk into Best Buy etc. and buy the PC already built. Online stands a better chance, as there are firms who build ready-to-implement workstations with enough cojones / resources…

    1. but do your research first before throwing down cash for some honkin’ system – once built, it is very costly to switch out / return physical components if you find your performance lacking and you’ve tried throwing all the switches. Read forums, best settings, best practices… then proceed.

    2. The first component I consider is the case in which your computer will reside. If your choice is server / workstation based, gear up for a case capable of supporting redundancy (i.e. dual power supplies, and enough space for fans and HDDs / SSDs), then consider what motherboards it can support… if your choice is a workstation designed around a gaming platform (a poor man’s server), that translates to an expandable motherboard with lots of RAM / storage and resources for multiple video cards of substantial size / RAM…

    3. Opt to design with redundancy (yup, a reason I state that twice… pun intended) and efficiency…

    4. pick a motherboard that can warehouse / support your computing architecture, and cooling…

    5. If it were me, and I could afford it, I’d just fill the RAM slots with the maximum you can throw at them, with the best timings / latency – however, pay attention to the video, which speaks of CPU scheduling, which I translate to “tuning” your physical system; but this is contingent on the software – again, research, forums, call tech support at Adobe, maybe they’d be happy to help for nothing – a.k.a. Father Knows Best…

    6. While redundancy is first, I’d find a way to isolate your operating system from your data drives, and I like an independent scratch disk – not a partition, but a completely separate physical drive. Some advice that was passed to me a few years ago: stay on drives no larger than 2TB, and RAID them for both redundancy and efficiency. And if you do not have enough room internally, opt for external drives whose interface to the motherboard is as fast, if not faster, than onboard… the two external choices I can think of: one for backup and the other for the scratch disk (again – only if the interface is fast enough).

    7. Video section – I like Nvidia; I hear Quadro is great, but again, ask around, sometimes the unexpected works better… as far as the monitor, your call…

    8. best practice – wired Ethernet only / gigabit Ethernet only – stay off wireless…

    9. Tune your system, ensure your operating system / software is updated, and when it works – hold off on updates (unless one brings a needed feature or really optimizes your editing software) – rule of thumb: if it ain’t broke, don’t fix it – keep your system free of change…

    tip – here is what I learned over the last day or two… I upgraded my Lightroom to LR 6 and found it to be disastrously slow… So I looked, and saw that LR6 was using the GPU, my Intel HD 4000. Once I instructed it to use the regular CPU, it lit up like a Christmas tree – man, it’s fast… This may not apply to you, but it proves that you have to do some system tweaking to get maximum performance…

    have a nice day.

    • Graham Curran

      I thought the idea was that using the GPU was intended to speed up LR, not slow it down.

    • Drew Valadez

      You’re rambling, and a lot of it doesn’t make sense.

      0. Agreed 100%

      1. Agreed again but sometimes looking at a few benchmarks and coming up with a conclusion after 20-30 mins of research is all one needs.

      2. Why are you discussing server and workstation in the same sentence? Server-grade components would be unnecessary for the work of LR or PS. Buy a good case, read reviews. Consider that when you purchase drives you can end up with a bunch of leftover drives, or do it the right way and toss in quality drives from the get-go, whether they be HDD/SSHD/SSD. Cooling should also be purchased in quality; a Hyper 212 variant will go a long way.

      3. Agreed – a 3-drive system should be the minimum IMO for ANYONE taking money for photography: 1 for OS/apps, 1 for data, and another a duplicate of the data drive.

      4. whut? Since when does a mobo determine the cooling? Is this your way of saying get an Intel board if you have an Intel CPU? This made no sense at all.

      5. No, this is throwing your money away. Check benches like you said; throwing money at unnecessary amounts of RAM is a waste. Also, RAM speed relative to price does not bring a good return. Buy what you want that is affordable, but IMO in the larger-sized sticks to keep RAM slots open (i.e. get a 1×8GB stick for $60 vs. 2×4GB sticks for $50); that way when you upgrade you aren’t having to rebuy RAM.

      6. You’re referencing that Backblaze article, aren’t you? Look up TweakTown’s argument against their claims. They ran consumer drives in conditions that no one would ever replicate; their “test” should be taken with a grain of salt. Although the 3TB Seagates are a known problem, they are the ones to have tainted the 3TB HDD market. Otherwise, what are you talking about with external HDDs interfacing with the mobo? Are you trying to sound “l33t” with your geek speak? It’s called USB/FireWire/Thunderbolt/eSATA. Go internal and call it a day when you can; use externals when you need portability. Keep it simple. And don’t use an external for scratch – talk about slow. Check your facts on that one again. If anything, using an SSD for scratch will bring better results, but definitely not an external over, say, USB.

      7. If we’re speaking of Adobe products, yeah, Nvidia wins. Although that might change in the future with AMD/Radeon/FirePro cards.

      8. Rubbish. Unless you need full-throttle LAN speeds, wireless is fine. Have a NAS? OK, wired will be nicer, but otherwise, nonsense.

      9. Kinda agree, kinda don’t… That varies from person to person. I am in the same boat as you.

      Google “Will an SSD Improve Adobe Lightroom Performance?” and read the article from Computer Darkroom. SSDs are awesome but they aren’t the magic bullet in terms of LR.

    • John Cavan

      I think you’re probably overthinking it here, Robert. Video editing and rendering will benefit, to a point, from certain hardware configs, but the photography industry has quite readily been using standard Apple hardware, iMacs and MacBook Pro systems, without any performance issues.

      My two main personal machines are a Mid-2011 iMac with 32GB of RAM and an SSD, and a Late-2013 Retina MacBook Pro with 16GB of RAM, dual GPUs, and an SSD. Both machines quite readily handle the images out of my D800, with many layers, and neither of them is loaded up with high-end cooling systems and so on. A totally tricked-out PC might do some of that marginally faster, but probably not enough to justify the cost to me.

    • robert garfinkle

      @Graham –

      Hi Graham. I am sure LR is geared to use the GPU. If I had a dedicated video card, it’d most likely handle LR no problem…

      I have an onboard Intel HD 4000 etc… It shares RAM with the system, and when I was using LR5 I never had to make any changes; I just used it. Now, after installing LR6, it really sucked. The entire screen flickered when making changes to my images, it flickered just selecting images, and it’d be just dog slow…

      When I instructed LR6 to “not use” the GPU, it opened up; all the issues went away, and speed was similar if not better than LR5 – that’s backwards, I know, but for me it worked that way…

      I would imagine a stand-alone video card even of nominal girth would be just fine…

      I have a Dell XPS 18. Its function is more to display images / video than to edit. My mistake was thinking it’d be OK for even moderate processing, as I program on it too and do some heavy lifting with really large databases. It does somewhat OK with that…

      At the time I purchased it, there were really no 18.5-inch tablets out there using a quad-core i7; this one uses an ultra-portable dual-core i7 – pretty lame in the processing dept…

      There are other “tablet-like” options today, with plenty of processing power, more than enough…

      @ Everyone else –

      When the video mentioned ECC RAM, that usually denotes a server / workstation, fortified with a Xeon processor (or multiples thereof). I don’t know of ECC RAM being used in the “typical” PC world at home or in the office – not even in gaming machines… Not sure about Mac…

      The video also mentioned a separate scratch disk. OK, I took it one step further, as I have been setting up systems for years with independent OS drives, data drives, and backup drives in RAID configurations. There are additional benefits to be gained in keeping drives separate per their function. For example, if the OS drive with applications is separate, then if it for some reason goes down and I restore it, the data is completely unaffected – I use Ghost to do OS backups / restores – works perfectly…

      The external scratch disk was on the premise that you won’t have much room with all those separate RAIDed drives inside the box, right? And maybe I need to do more research, but why could you not use an external scratch disk if the throughput / interface (Thunderbolt, USB 3, eSATA) were there…

      Though I set up intense systems, I typically do not overclock or tune. Honestly, this is the first time I have really been exposed to performance tune-ups relating to media applications. I usually work with databases and application development, and am familiar with some, though not all, performance tune-ups for those environments…

      and so it goes

  6. Brandon Dewey

    good video
