


hvanderwegen

AMD Threadripper: Cinebench R15 results


littlepony    4
1 hour ago, Cutman said:

I find AdoredTV one of the best tech channels; his videos are always extremely well researched and free from bias. He plays fairly and calls a spade a spade.

 

No kidding - I watched the whole video without realizing that 23 minutes had passed by.


hvanderwegen    23
On 7/14/2017 at 11:30 PM, littlepony said:

@Zmotive come on man, both AMD processors outperformed Intel's flagship processors, and they are both cheaper. So why the disappointed look? Did you design a better processor than the Threadripper in your basement?

Name it Coreripper if you did :'D

Obama is not looking sad in that photo - it is an "I am pretty impressed" facial expression. Not everyone is as adept at reading facial expressions, and that one can be confused with "I am sad/disappointed".

It really is quite an interesting field of research. Women are generally better at reading expressions, I believe I read somewhere.

The thing is, nowadays more and more (younger) people are getting worse at reading facial expressions and body language - due to the internet and cell phones. Emoticons/emoji are very shallow replacements.

-----

Back on topic: Intel is getting trounced in other areas as well. It has been confirmed that the new Intel i9 has SEVERE overheating issues:

 
Quote

Again, I recommend reading the full story, but the bottom line is this: Even at 160W, Skylake-X can’t run a high-end air cooler like the Noctua DH-15 for more than a few minutes before it begins to throttle. At least a decent CLLC (closed loop liquid cooler) is required, and a top-end system is mandatory for full performance. These settings, however, do not push temperatures on the VRMs or other components all that high. Keep in mind, no overclocking is being done in either of these cases.

You read that right: even with a high-end air cooler, the CPU will throttle after a few minutes.

It basically means it is worthless for 3D rendering at anything beyond a couple of minutes, unless you invest heavily in a costly high-end water-cooled system.

Read up on the story here:
https://www.extremetech.com/computin...xs-design-well
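To get a feel for what sustained throttling does to render times, here's a toy back-of-the-envelope model. All the numbers in it are made up for illustration - they are not measured Skylake-X figures:

```python
# Toy model: a CPU renders at full speed until it hits its thermal limit,
# then throttles to a lower effective rate for the rest of the job.
def render_time(work_units, full_rate, throttled_rate, secs_before_throttle):
    """Seconds to finish `work_units`, with rates in units/second."""
    done_before_throttle = full_rate * secs_before_throttle
    if work_units <= done_before_throttle:
        return work_units / full_rate  # short job: finishes before throttling
    remaining = work_units - done_before_throttle
    return secs_before_throttle + remaining / throttled_rate

# A quick preview render finishes before throttling kicks in...
print(render_time(100, full_rate=1.0, throttled_rate=0.7, secs_before_throttle=180))   # 100.0 s
# ...but a long final render pays the throttled rate for most of the job:
print(render_time(1000, full_rate=1.0, throttled_rate=0.7, secs_before_throttle=180))  # ~1351 s vs. 1000 s unthrottled
```

The point: a benchmark run that finishes in a couple of minutes can look fine, while an hours-long render eats the full throttling penalty.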

And all because Intel decided to cut corners in favour of profits. Intel is looking pretty bad at this point, while AMD is looking better and better.

Greatszalam    68
On 7/15/2017 at 1:30 AM, littlepony said:

@Zmotive come on man, both AMD processors outperformed Intel's flagship processors, and they are both cheaper. So why the disappointed look?

I didn't see that as a disappointed look. It's a "hmmmm, not baaaaaaaad" kind of look. It might be regional, but it's usually accompanied by a head nod. It's actually more impressed than anything.

littlepony    4
19 hours ago, Greatszalam said:

I didn't see that as a disappointed look. It's a "hmmmm, not baaaaaaaad" kind of look. It might be regional, but it's usually accompanied by a head nod. It's actually more impressed than anything.

Hmm, I see. Here at Dream Valley we keep our expressions simple so that less intelligent people like me don't get the wrong idea XD


Cutman    123
On 23/07/2017 at 3:21 AM, hvanderwegen said:

Now, this is getting really interesting. AMD's new Epyc server CPUs are blowing Intel right out of the water.

New benchmarks for Cinebench:

http://www.cpu-monkey.com/en/cpu_benchmark-cinebench_r15_multi_core-8

Threadripper and Epyc really have me conflicted over which way to go with a new PC build for 3D. I was originally going to build a PC for GPU-based rendering, but one of the top Epycs plus VRay may be faster than the same amount of money spent on GPUs for Octane.

One thing I've always enjoyed with VRay is knowing that if I have a massive project I can always find a render farm; the same can't be said for Octane (the last time I checked)...

I also need to confirm which is best for Houdini performance. I'm wondering just how multithreaded Houdini is, and whether an Epyc is a better choice than the Threadripper.

Decisions, decisions... 
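On the "how multithreaded is it" question, Amdahl's law is a decent first filter for many-slow-cores versus fewer-fast-cores. A quick sketch - the core counts and clocks below are the advertised Epyc 7601 and Threadripper 1950X base specs as I understand them, so treat them as assumptions:

```python
# Amdahl's law: speedup on n cores = 1 / ((1 - p) + p / n),
# where p is the fraction of the workload that parallelizes.
def effective_throughput(cores, clock_ghz, p):
    """Relative throughput vs. a single 1 GHz core of the same design."""
    speedup = 1.0 / ((1.0 - p) + p / cores)
    return clock_ghz * speedup

# Fully parallel (p = 1.0), e.g. bucket rendering: more cores win.
epyc_render = effective_throughput(32, 2.2, 1.0)   # 70.4
tr_render   = effective_throughput(16, 3.4, 1.0)   # 54.4

# Only 80% parallel, e.g. a sim-heavy scene: the higher clock wins.
epyc_sim = effective_throughput(32, 2.2, 0.8)      # ~9.8
tr_sim   = effective_throughput(16, 3.4, 0.8)      # 13.6
```

So for pure rendering the 32 Epyc cores come out ahead, but if parts of a Houdini scene only parallelize partially, the 1950X's higher clock takes the lead back.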


ArtVenture    3

It's really hard for me not to recommend AMD chips for creative work now. Unfortunately, it's just as hard for me not to 110% recommend an Nvidia card at the same time... CUDA is too good. My machine at work had 4 GTX 1080 Ti cards, and using Octane render with that thing was unlike any experience I had in C4D :wow:

 

I just don't know if CPU rendering is going to be around much longer, so I don't know if Threadripper is something to get very excited about when we have a 'frame'ripper called GPU rendering. I honestly think it's going to replace CPU rendering.

grain    96
1 hour ago, ArtVenture said:

 

I just don't know if CPU rendering is going to be around much longer, so I don't know if Threadripper is something to get very excited about when we have a 'frame'ripper called GPU rendering. I honestly think it's going to replace CPU rendering.

 

@Cutman Even if you stick with GPU rendering, the huge number of PCIe lanes and motherboards designed to be loaded up with fast GPUs, M.2 drives, and fast RAM make the AMD platform very attractive (and cost-effective) for both CPU and GPU rendering systems. The only thing I'm a bit shocked by is the cost of the motherboards, but to be fair the one I was looking at ( https://www.overclockers.co.uk/asus-x399-rog-zenith-extreme-amd-x399-socket-tr4-e-atx-motherboard-mb-6a6-as.html ) does basically everything except cook you breakfast.

 

I'm going to hold off just for a little while, see how the systems go in the wild for a couple months, see how Epyc looks.. and then probably get drunk and buy something on the company card. STRATEGY!


Cutman    123
20 hours ago, grain said:

 

@Cutman Even if you stick with GPU rendering, the huge number of PCIe lanes and motherboards designed to be loaded up with fast GPUs, M.2 drives, and fast RAM make the AMD platform very attractive (and cost-effective) for both CPU and GPU rendering systems. The only thing I'm a bit shocked by is the cost of the motherboards, but to be fair the one I was looking at ( https://www.overclockers.co.uk/asus-x399-rog-zenith-extreme-amd-x399-socket-tr4-e-atx-motherboard-mb-6a6-as.html ) does basically everything except cook you breakfast.

 

I'm going to hold off just for a little while, see how the systems go in the wild for a couple months, see how Epyc looks.. and then probably get drunk and buy something on the company card. STRATEGY!

The Threadripper platform does look ideal for CPU or GPU rendering, depending on how deep your pockets are. The cost of these motherboards is more in line with Intel Xeon-level motherboards, and I guess that's a fair comparison given that Threadrippers are effectively dual CPUs in one package, plus a heck of a lot of PCIe lanes. If demand for these CPUs is high, then motherboard prices will hopefully drop like a stone.

 

I've done a bit more research: I think Epyc will be excellent for a render farm, but the clock speed looks a shade too low for workstation work - not enough to feed the GPU, as single-core clock speed is still important for OpenGL. Threadripper's clock speed is just about OK for a badass workstation/renderer though. The Ryzen 1800X looks bang on for workstation and dual-GPU rendering. Decisions, decisions....

 

Every fibre in my body is trying to stop me from making an impulse purchase.

Zmotive    48

If Apple disappoints with the next Mac Pro, my next computer will likely be X399, Threadripper, and Vega (or, more specifically, the next iterations of those things - not exactly sure what their schedule is to optimize them). I never thought I'd go all-AMD, but I just might given the direction C4D is headed. An equally important factor will be what the next version of AE looks like. It should be the big return to multithreading they've been working on... but who knows, it could be more GPU-based than CPU-based, in which case, if they go strong towards Nvidia, I'd get an Nvidia card with the above setup, not Vega. If Adobe's solutions get AMD-friendly this fall, then it's a no-brainer.

Cutman    123
On 13/08/2017 at 12:08 AM, Zmotive said:

If Apple disappoints with the next Mac Pro, my next computer will likely be X399, Threadripper, and Vega (or, more specifically, the next iterations of those things - not exactly sure what their schedule is to optimize them). I never thought I'd go all-AMD, but I just might given the direction C4D is headed. An equally important factor will be what the next version of AE looks like. It should be the big return to multithreading they've been working on... but who knows, it could be more GPU-based than CPU-based, in which case, if they go strong towards Nvidia, I'd get an Nvidia card with the above setup, not Vega. If Adobe's solutions get AMD-friendly this fall, then it's a no-brainer.

As a Mac-based studio we are not expecting much from the latest incarnation of the Mac Pro or the iMac Pro. I edit with FCPX, so we'll never not have Macs in the studio, but for the limited 3D work I do I cannot see beyond a fully loaded Threadripper system. Apple will never make a system as powerful or as affordable as I can custom build, and looking at the base iMac Pro pricing as a guide, I think an iMac plus a Threadripper PC would end up cheaper and better suited to my work than one maxed-out Mac Pro. I expect the latest Mac Pro to look stunning, but to have a stunning price tag too.

 

I never thought I'd be considering buying an AMD PC either; all credit to AMD, they have really changed the PC landscape in the space of a couple of months. I'm not entirely sold on Vega - the cards perform OK, but with a very heavy power draw. Let's see how they're priced in the next few days, though.

Greatszalam    68
On 8/12/2017 at 6:08 PM, Zmotive said:

An equally important factor will be what the next version of AE looks like. It should be the big return to multi-threading they've been working on...

What makes you think that? I haven't seen or heard anything that would indicate it. In fact, stuff the AE team has said publicly would make me think otherwise.

 

Now, I should point out that, since CC 2015, AE actually is multithreaded when it wasn't before (the UI and the renderer run on separate threads). The initial release of CC 2015 was rather buggy with the new architecture, and they had to pause development to squash bugs for a while. It's gotten pretty good in CC 2017, but I haven't heard any indication that multithreaded rendering is on its way in the next release.

 

(Side note: the old versions of AE didn't actually have multithreaded rendering either. It was a kind of hacky cheat. AE just spun up several instances of itself in the background to render things. It was a bit buggy, but for a lot of stuff it worked reasonably well.)

 

On 8/12/2017 at 6:08 PM, Zmotive said:

If Adobe's solutions get AMD-friendly this fall then it's a no-brainer.

The GPU stuff that the AE team has been working on for the past several releases does make use of AMD cards; you don't have to wait for the autumn! It's not like the old ray-traced renderer (which has been considered dead by the AE team since around CC 2014). The last few releases have brought GPU acceleration to several native effects, and there have been a few more each release. The current release of AE includes GPU acceleration of Fractal Noise, Gaussian Blur, Fast Box Blur, Lumetri Color, Sharpen, Brightness & Contrast, Find Edges, Glow, Hue/Saturation, Invert, and Tint.

 

On Windows, AE can use either CUDA or OpenCL for accelerating these. On macOS, AE can use either OpenCL or Metal.

 

(Note: occasionally there are some noticeable differences in how the GPU-accelerated effects render in 8-bit projects compared to CPU rendering. Switching the project to 16-bit or 32-bit solves it.)
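The 8-bit mismatch comes down to rounding: at 8 bits every intermediate result snaps to a 1/255 step, so two pipelines that are mathematically identical can land on different pixel values. A minimal illustration - this is generic quantization math, not Adobe's actual code:

```python
def to8(x):
    """Quantize a 0..1 value to the nearest 8-bit level."""
    return round(x * 255) / 255

# 8-bit path: quantize after every operation (a 70% value at 30%, then a 90% fade).
eight_bit = to8(to8(0.7 * 0.3) * 0.9)   # lands on level 49
# Float path: do the math in full precision, quantize once at the end.
float_path = to8(0.7 * 0.3 * 0.9)       # lands on level 48
print(eight_bit == float_path)          # False: off by one 8-bit level
```

At 16-bit or 32-bit the intermediate steps are fine enough (or not quantized at all) for the two paths to agree, which is why switching the project depth makes the difference disappear.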

 

Anyway, there's not much on the GPU yet, but what is there can sometimes make a huge difference. Fractal Noise acceleration is a big deal - it's waaay faster now.

