Creating a Render Farm (what hardware?)

Recommended Posts

Guest kittonian (Topic Author)

    Right now we've got a few Mac Pro machines (1,1 and 3,1) and a MacBook Pro (6,2), all running as full C4D R16 Studio workstations as well as Team Render clients. We've got a statically addressed gigabit Ethernet LAN over Cat 6 and are interested in vastly speeding up our rendering times.

     

    For example, right now it's been over 15 hours and we still aren't even 1/4 of the way through a single still image render that uses the physical rendering engine, global illumination, ambient occlusion, and Mitchell anti-aliasing. This simply doesn't work.

     

    We rarely if ever need to render animations. Most of the time we are doing interior design/architecture still images, but always design in the most photo-realistic way possible (lots of careful lighting, shadows, heavy geometry, etc.).

     

    What I would like to do, and please correct me if I'm thinking about this incorrectly, is spec out a 1U rack machine that renders very fast, to be used as a Team Render client. We would purchase 3-4 of them to start and add more as necessary. These would be nothing more than Team Render clients, and I assume they would be PCs running whatever Windows version is recommended for the best performance. They would sit headless in the server room with the network hardware. The design workstations would remain Mac based.

     

    My concern is that I don't want to invest a bunch of money only to find we aren't getting much of a bump in rendering times. From what I understand, when using the physical rendering engine (which we use most often), only two buckets are created. Since still-image Team Renders distribute buckets to each machine, would only two machines ever be used for still-image renders?
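     

    To make the worry concrete, here's a toy Python sketch (hypothetical numbers and names, not MAXON's actual scheduler): if the renderer really only ever produced two buckets, everything beyond two machines would sit idle.

```python
# Toy illustration only (hypothetical numbers, not MAXON's real scheduler):
# if a renderer only ever creates N buckets, at most N machines can work
# on a still image at once.

def busy_machines(num_buckets, machines):
    """Return how many machines would receive at least one bucket."""
    return min(num_buckets, len(machines))

farm = ["mac-pro-1,1", "mac-pro-3,1", "macbook-pro-6,2", "render-node-1"]
print(busy_machines(2, farm))   # -> 2: two buckets keep only two machines busy
print(busy_machines(48, farm))  # -> 4: enough buckets keep the whole farm busy
```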

     

    I've been spending quite a bit of time trying to learn all the specifics and want to ensure I'm on the right path before spending any money.

     

    Any help is greatly appreciated!

     

     


Guest kittonian (Topic Author)

    So I've been on the phone with MAXON for a while now and have been getting some interesting information. I'd love some real-world confirmation on what I've been told.

     

    1. Having multiple machines that together add up to 16 Hyper-Threaded cores, or a single machine with 16 Hyper-Threaded cores, will render at exactly the same speed with Team Render.

    2. You should have 2GB of ECC memory per thread (so 16 Hyper-Threaded cores = 32 threads, which works out to 64GB of RAM; see the sizing sketch after this list).

    3. A "bucket" equals one processor thread

    4. Using the physical render engine does not max out at two buckets, even with a still-image render. (I had read otherwise somewhere, but apparently that's not the case: a MAXON tech tried a Team Render on their end using 3 machines, and all machines turned orange and shared the processing.)

    5. Having faster processor speeds (e.g. 2.7GHz vs. 2GHz) or more processor cache isn't going to make that big a difference. It's really about having as many threads as possible, so you can save money by using i7 processors instead of Xeons.
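     

    As a quick sanity check on points 2 and 3, here's a small Python sizing sketch, assuming the 2GB-per-thread rule of thumb above (illustrative numbers only):

```python
# Sizing sketch based on the guidance above: one bucket per thread,
# 2GB of ECC RAM per thread. Numbers are illustrative.

GB_PER_THREAD = 2  # suggested rule of thumb

def farm_sizing(cores, threads_per_core=2):
    """Return (threads, recommended RAM in GB) for a given core count."""
    threads = cores * threads_per_core   # Hyper-Threading doubles the threads
    return threads, threads * GB_PER_THREAD

print(farm_sizing(16))  # (32, 64)  -> 16 HT cores want 64GB of RAM
print(farm_sizing(24))  # (48, 96)  -> a dual 12-core Xeon box wants 96GB
```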

     

    If all of this is indeed correct, it seems the best thing to do is build one beefy machine with lots of Hyper-Threaded cores and a lot of RAM, instead of a bunch of machines where you have to worry about multiple gigabit Ethernet connections, multiple hard drives, chassis, and so on.


    It's all about the number of processor cores: the more the better. If they're all in the same machine, well and good, but spreading them across a network works as well. One thing to be aware of is that while GI is being calculated you pretty much can't use your computer for anything else; once actual rendering starts you can do other things. There are tricks to specify the number of cores C4D uses to get around this problem. I use 2 computers with Team Render and my setup works really well.

     

    It could be that you should buy 3DFluff's R16 high-speed rendering tutorial, reviewed here by me, and learn how to optimize your scenes and render times. There are quite a few tricks that can give significant improvements in render times.

Guest kittonian (Topic Author)

    I've looked into that tutorial before and am considering it, though a lot of what it covers doesn't fit my workflow. Regardless, the person I spoke with at MAXON showed me how to bake the GI before doing a Team Render so that everything looks consistent across whatever machines are included (apparently different platforms render lighting, color, and shadows a bit differently).

     

    Here's what I've got spec'd for the render machine thus far:

     

    Processor: Intel Xeon E5-2670 v3 (x2)
    Motherboard: SUPERMICRO MBD-X10DRi
    Hard Drive: Samsung 250GB 850 Evo 2.5" SATA III Solid State Drive
    Memory: Crucial 64GB kit (16GBx4) DDR4 PC4-17000 Registered ECC 1.2V
     

    This will give me 48 threads across 24 cores over gigabit Ethernet, though 64GB across 48 threads works out to about 1.3GB of ECC memory per thread (roughly 2.7GB per core), a bit under the 2GB-per-thread guideline.

     

    I think the only thing I'm missing is the chassis/power supply, but I wanted to ask the community's thoughts on a rack-mount chassis (1U or 2U preferred) with superior cooling and plenty of power for this setup.

Guest Chriscorr

    I'm not very experienced with racks, but I think you would want to go with at least a 2U case. A 1U case might be a really tight fit for something that produces as much heat as a rendering CPU: you're limited to about 45mm of cooler height, and I'm not even sure there are coolers that small; a stock cooler certainly wouldn't be. I would never put a rendering system in such a tight space unless there is a lot of smart cooling plus ambient cooling.

     

    For the price of what you just wrote down, you could fill a server cabinet to the brim with smaller systems and end up with many more cores than 48. Of course, you would have a much harder time with installation, electricity bills, network logistics, and maintenance. But you would probably still get around 100 threads for the same price.

Guest kittonian (Topic Author)

    Yes Chris, I agree with everything you said. 2U is just fine, and I happen to have two 4U chassis on hand if need be instead of going with a new one; however, the power supply and cooling would need to be updated, and that's what I'm researching right now.

     

    Looks like the Arctic i30 CO is a perfect solution for cooling each CPU. Partner that with their MX-4 thermal compound and it should remain nice and chilly.

     

    With regards to price, I have been able to get it down to $4650 so far, not including the chassis itself. Not sure how I could get myself 100 threads with anywhere near this type of performance for the same price. Plus, as you mentioned, dealing with multiple systems, larger footprints, more fan noise, higher electricity bills, etc. seems to defeat the purpose.

     

    Unfortunately, I have to find a new chassis (boo) because the Antec 3426B that I've got is ATX only, not E-ATX compatible, and E-ATX is what this motherboard needs. I'm also looking at quiet but powerful power supplies.

Guest Chriscorr

    In this case it depends solely on the motherboard. The one you chose has the narrow-type socket 2011 loading mechanism (ILM), as opposed to the square type, which means that almost no consumer-grade cooler will fit the mounting configuration. That doesn't mean it's hard to find a cooler:

     

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100008000%20600337814

     

    All of these should be compatible.

Guest Chriscorr

    Oh, I didn't see your edit. Please note what I said about the socket ILM for that motherboard. The Arctic i30 CO is a nice consumer cooler, but it has the other ILM, the square kind, which will NOT fit your motherboard.

     

    A $500 system based on an AMD FX-8320E would get you 8 decent threads, probably even faster per thread since the E5-2670 v3 is only 2.3GHz. Nine of those would be 72 threads for $4,500.
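     

    A back-of-the-envelope version of that comparison in Python (prices and thread counts as assumed in this thread, not quotes):

```python
# Threads-per-dollar comparison using this thread's rough numbers.

dual_xeon = {"price": 4650, "threads": 48}  # dual E5-2670 v3 build
fx_node   = {"price": 500,  "threads": 8}   # one AMD FX-8320E box

nodes = dual_xeon["price"] // fx_node["price"]   # 9 cheap boxes for the money
print(nodes * fx_node["threads"], "FX threads")  # 72
print(dual_xeon["threads"], "Xeon threads")      # 48
# Raw thread count favors the cheap boxes; power draw, noise, space, and
# maintenance all cut the other way.
```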

Guest kittonian (Topic Author)

    Good looking out on the socket. I didn't notice that, nor did I see any specs that mentioned it. Which motherboard would you choose for this setup? I would prefer ATX over E-ATX, because then I could use my existing chassis.

     

    How about this one from Asus? Z10PA-D8

Guest kittonian (Topic Author)

    OK, I think I finally have it (been working on this all day). Please check my specs, and if you see anything that could be done better to increase rendering performance, let me know. I ended up dropping the rack-mount chassis and found room for a mid-tower instead; our noise concerns just weren't being addressed by rack-based components. This not only needs to be as powerful as possible but also as silent as we can get it.

     

    Processor: Intel Xeon E5-2670 v3 (x2)
    Motherboard: ASUS Z10PA-D8 ATX Server Motherboard Dual LGA 2011-3 DDR4
    Hard Drive: Samsung 250GB 850 Evo 2.5" SATA III Solid State Drive
    Memory: 64GB kit (16GBx4) DDR4 PC4-17000 Registered ECC 1.2V
    CPU Cooler: Arctic i30 CO (x2)
    CPU Thermal Compound: Arctic MX-4
    Power Supply: Corsair AX1500i
    Chassis: Corsair Carbide Series® 330R Titanium Edition Silent Mid-Tower Case

    Total Cost: $5142.90

Guest Chriscorr

    Unfortunately, those coolers will not work. This time the mounting mechanism would be compatible, but these are really big coolers designed to sit alone on a regular motherboard. Take a look at the CPU sockets: they are very close together, and I highly doubt two Arctic i30 coolers would fit side by side. These coolers have all sorts of clearance issues on a single-CPU motherboard, let alone on a dual-CPU board with RAM slots all around.

     

    To avoid clearance issues, I don't see a way around using one of these small specialized coolers:

     

     http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100008000%20600337815

     

    There are a few "big name" manufacturers that make Xeon specific coolers as well, such as this one:

     

    http://www.newegg.com/Product/Product.aspx?Item=9SIA0AJ2MW9893

     

    But even though you should be able to fit two of them on the motherboard without them intersecting, it looks like the fans would block the RAM modules. Perhaps not, since you will only have 4 modules installed; it depends on the particular quad-channel configuration of this board.

     

    I've recently built a system myself, and cooling can be surprisingly tricky because of space limitations; it gets frustrating. Doubly so on a dual-Xeon board.

     

    These small coolers will invariably be much louder under load. You can mitigate this by choosing a quieter case, such as the Fractal Design Define R5.

     

    One last note about the power supply: I would say it's complete overkill. It's top notch, but even dual Xeons don't need that much power, because you don't have anything else in the system: no HDDs, no graphics cards, no expansion cards. 1500W is simply unnecessary; I would say the AX760 is more than enough.
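     

    A rough power budget in Python makes the point (the CPU figure is Intel's published TDP; the rest are my own estimates, not measurements):

```python
# Rough peak power budget for this build; the CPU figure is Intel's 120W
# TDP per E5-2670 v3, the other entries are estimates.

loads_w = {
    "2x Xeon E5-2670 v3": 2 * 120,
    "motherboard + 4x DDR4 DIMMs (est.)": 80,
    "SSD + fans (est.)": 30,
}
peak = sum(loads_w.values())
print(f"~{peak}W peak draw")             # ~350W
print(f"AX760 headroom: {760 - peak}W")  # ample margin with no GPU in the box
```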


    I would go for Supermicro server barebones or complete machines. Supermicro delivers the base technology for many server makers that don't do their own development, and their gear is very reliable and usually very cost efficient. 1U is no problem; you will have more trouble keeping the room temperature in check when rendering with a couple of dual-Xeon machines, so make sure to plan ahead for that, as well as for the noise these things make under load.

