Cinema 4D Plugins


lopakam last won the day on December 20 2016


About lopakam

  • Birthday 06/14/1961

Profile Information

  • C4D Ver
    19.024 Studio
  • Location
    Palos Park (Chicago), USA
  • Interests
    Photography, Brewing Beer

  1. I have been with Macs a long time too and was about to jump ship because I have been frustrated with Apple's apparent lack of interest in pro users. Then they announced the future Mac Pro, so I decided to find a way to stay with Macs while taking advantage of modern software (such as Octane and Redshift). My first step was upgrading my 2009 Mac Pro: flashing the firmware to a 2010 (5,1), installing High Sierra, upgrading the CPUs to dual 3.46 GHz hex-cores for 12 cores (with washers, electrical tape, and other mods), and installing a 1080 Ti. Works great.

    But I needed a second computer, so I tried building a hackintosh. This has worked out very well for me. Upgrades are odd because I have to pull out my Nvidia card, then put it back in after the upgrade, but it works. Having two Macs with two Nvidia cards is perfect for my workflow.

    Having said that, this has all been done to keep me productive on Macs while I wait for the future Mac Pro. I just hope that I can run Nvidia on it without creating computer sprawl on my desk with a bunch of boxes and cables, like what happened with my trashcan Mac Pro.

    If you do choose to go the hackintosh path, be aware that your first build may be frustrating. It took me one evening to build the machine, but four evenings to get OS X properly installed. I also installed a second drive where I keep a working image of the boot disk, just in case an upgrade destroys the computer.

    The bottom line is that I am not recommending this path, but it works for me. Mark
  2. GPU advice

    While I like the AMD cards, they do not support CUDA. If you ever plan on using a renderer such as Redshift or Octane, you will need an Nvidia card. Mark
  3. GPU advice

    If you do not need a GPU right now, I would hold off. Prices are inflated to crazy levels at the moment, most likely from cryptomining. For example, I purchased a 1080 Ti in July for $799 from Amazon and it is now listed at $1,500. I purchased a 6 GB 1060 in early December for $249 and it is now $550. I have to imagine that prices will come down once the cryptomining bubble pops, and used cards may be very affordable. I know this is not answering your original post, but it is something to keep in mind. Mark
  4. This is probably the wisest thing I have read so far. I agree and would love one, but it is not in the budget. The part of your post that makes the most sense, though, is the danger of being an early adopter. Let's face it, many of us love technology and have always been early adopters. But when it comes to such an important (and expensive) tool for 3D, this is an issue. I purchased the 2009 Mac Pro when it came out. I would have been much better off waiting for the 2010, since some strange issues were corrected the next year. Since then, I have made some tweaks to my 2009 and it is fantastic, but I would not have had to tweak it if I had purchased the 2010.

    I see the same thing with the upcoming Mac Pro. I will wait a year. The iMac Pro is very tempting, but I will wait for the Mac Pro. I just did the same thing with my new phone: I wanted to replace my iPhone and almost purchased the iPhone 8, but like many, I waited for the X. This may happen with the iMac Pro; many of us want it, but will wait for the Mac Pro. I just hope that Apple does not misinterpret the (possible) slow sales of the iMac Pro as a lack of interest from pro users.

    The pricing of the new Mac Pro is confusing to me. Would it be more than the iMac Pro? If it is truly modular, it could be purchased with the minimum memory, the smallest SSD, and the base graphics card, then upgraded with third-party memory, storage, and graphics over time (at a lower cost). Finally, it most likely will not include a monitor (lowering the price). Anyway, my wife would explode if I added another monitor to my desk! Mark
  5. R19 ProRender

    violst brings up a very good point. The 2009 through 2012 Mac Pros are amazing machines and can run modern GPUs, with a few small things to be aware of. I have a 2009 Mac Pro and found a way to upgrade the firmware to a 2010 (5,1). This allowed me to install High Sierra, and with High Sierra I now have an Nvidia 1080 Ti. (I was also able to upgrade my processors to dual 3.46 hex-cores, giving me 12 cores.)

    The main issue with using these cards is OS upgrades. Every time Apple updates the OS, the card does not work until I download the new drivers from Nvidia. So I have an old Apple GT 120 card installed, driving my monitors. This leaves the 1080 Ti for rendering, and lets me still use the computer to download the Nvidia update. Even with one card it is possible to update if you remote into the machine, but leaving the old GT 120 in seems more practical.

    Having said that, I have not tried ProRender with this setup. I mainly use Redshift, Octane, and Cycles 4D, but I cannot see any reason why it would not work. Don't get rid of that old Mac Pro just yet. There is still a lot of life in those old computers! Mark
  6. The Orville ECV-197

    Wow, I thought I was the only one who liked this show! The sci-fi writing is very good, and the humor fits in nicely. It's old-school Star Trek with modern, edgy humor. Your modeling is great; I would have a hard time doing that without good reference. Amazing, really. I read that they film most scenes of the Orville using a physical model, not CG, which is even more old school. Your work is always very good and I'm excited to follow this! Mark
  7. The noise was OK, but NOT acceptable for animation and probably not good for a final render. Honestly, I just gave up. I thought about what you said: why ProRender, such a crappy renderer? The only thing I could think of is MAXON's drive (and I am VERY grateful for this) to make C4D work exactly the same on Windows and OS X. What other renderer works with AMD? Cycles 4D does, but it has some issues using AMD cards. For example, some shaders just come out black with the AMD GPUs selected, but render fine on the CPU and with Nvidia.

    In all fairness to MAXON, they provided ProRender as an alpha/beta type release. I have a lot of confidence in MAXON's ability to provide a quality product and believe this may one day be a viable option. But I question the business decision to devote resources to it. Like I said earlier, we have a lot of options for renderers today, even on Macs (using a little creativity to install an Nvidia GPU). Mark
  8. Nothing much. I used an interior scene that I created years ago and, as you said, it was painfully slow. I stopped it and created the typical "sphere on a plane" scene, with a few cubes and one light, to get it to work. That was it, nothing elaborate. It took 24 seconds with both cards, 38 seconds with one card, and 1:48 with the six cores in CPU mode. That was with 100 passes. My old 2009 Mac Pro with 3.46 GHz 12 cores took 1:05 in CPU mode (again, no GPU option with the Nvidia card). In my VERY humble opinion (with very limited experience with ProRender), I find it totally unusable. With all the great renderers out there, I see no reason for this, and I now use Octane, Redshift, Cycles 4D, and Physical (depending on what I am doing). I'm not sure I understand why MAXON would even bother with this. I would hope that they direct their resources to other things, such as character animation. Mark
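    For context, the timings above work out to these speedups (a quick back-of-the-envelope check; times converted to seconds):

```python
# Render times reported above, in seconds.
both_gpus = 24      # both cards
one_gpu = 38        # one card
cpu_six_core = 108  # 1:48 on six CPU cores

# Speedup of GPU rendering over the six-core CPU render.
speedup_one = cpu_six_core / one_gpu     # ~2.8x
speedup_both = cpu_six_core / both_gpus  # 4.5x
# The second card scales imperfectly: ~1.6x, not the ideal 2x.
second_card_gain = one_gpu / both_gpus

print(f"{speedup_one:.1f}x, {speedup_both:.1f}x, {second_card_gain:.2f}x")
```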
  9. I just tried it on a 2009 Mac Pro with a 1080 Ti (no AMD cards installed). ProRender does not list any GPUs. I rendered a test scene that took 48 seconds. I then launched R19 on my 2013 Mac Pro and it listed both AMD D700s. With one of the cards selected it rendered in 33 seconds; with both selected it rendered in 45 seconds; with CPU selected, it rendered in 1:27. Two cards being slower than one seems odd. But anyway, ProRender does not appear to support Nvidia on the Mac Pro. Update: I rendered a more complex scene with several shaders on the 2013 Mac Pro, and selecting both D700s was faster than selecting one. Mark
  10. Ah, I see the card now. Sure, I'll let you know when I get home. Mark
  11. Looking at your screenshots, I do not see an Nvidia card. All I see listed is an AMD; did I miss something? I have an Nvidia 1080 Ti in my old Mac Pro at home, but never use ProRender. I'm in the office downtown today, but I will try ProRender on the Nvidia when I get home. Mark
  12. You may want to check the fans on your GPU. It is possible that the card, or the computer, is overheating. Unless you are using a Quadro, you are using a consumer-grade device. I have seen the fans fail, and even the blades break off. If your card overheats, it may cause all kinds of issues, including overheating your CPU(s). The next thing to check is your power supply. GPUs draw a lot of power, and if your power supply is failing, this could show up when you start to render because the GPU draws more power than can be supplied. Finally, it may be software related. Have you upgraded the OS? Have you installed other software since the last time it worked? Can you run another GPU-based program, like Geekbench, to verify that it is not Octane or C4D? I feel for you; these kinds of issues are frustrating and always seem to come up at the wrong time. Good luck, Mark
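    One quick way to check the overheating theory on an Nvidia card is `nvidia-smi`, which can report GPU temperature. Here is a minimal sketch of parsing its CSV output; the sample string stands in for a live query so it runs without a card installed, and the 85 C threshold is my own assumption, not a vendor spec:

```python
import subprocess

def gpu_temperatures(raw=None):
    """Return GPU temperatures in Celsius, one entry per card.

    If `raw` is None, query nvidia-smi; otherwise parse the given
    CSV text (handy for testing on a machine without an Nvidia card).
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"], text=True)
    return [int(line.strip()) for line in raw.splitlines() if line.strip()]

# Sample output from a hypothetical two-GPU machine.
sample = "62\n91\n"
temps = gpu_temperatures(sample)
# Flag anything above an assumed 85 C comfort threshold.
hot = [i for i, t in enumerate(temps) if t > 85]
print(hot)  # the second card (index 1) is running hot in this sample
```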
  13. It is true, for now, that memory cannot be combined across multiple GPUs. The reason is the way GPU renderers work: the renderer needs to load the entire scene onto the card (including the materials) so the card can process it. If you have multiple GPUs, each GPU needs the complete scene loaded. Having said that, I believe Redshift uses memory a little differently, but I'm not sure.

    Your second question is one a lot of people ask. Basically, you will be limited by the 8 GB card in your example. Like I said, the entire scene needs to fit in memory for a GPU to process it. So in your example, if you load 10 GB, you will need to exclude the 8 GB card in your render settings for that scene. The amount of memory may not be an issue, though. 8 GB is a lot, and may be more than you need. I have 11 GB on my GPU and have not come close to using half of that; most scenes I have rendered stay around 3 to 4 GB. I'm not sure what kind of work you do, but 8 GB is a massive amount of data.

    Finally, I would check the renderer you choose and make sure it supports multiple GPUs. For example, I do not believe Maxwell supports more than one GPU at this time. Mark
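    The "exclude the smaller card" rule above can be sketched in a few lines. The numbers are hypothetical, and real renderers also reserve some VRAM for the framebuffer and drivers, so usable memory is a bit less than a card's total:

```python
def usable_gpus(scene_gb, gpu_vram_gb):
    """Return indices of GPUs whose VRAM can hold the whole scene.

    GPU renderers load the entire scene onto each card, so any card
    smaller than the scene must be excluded from the render.
    """
    return [i for i, vram in enumerate(gpu_vram_gb) if vram >= scene_gb]

# Hypothetical rig from the example above: an 8 GB and an 11 GB card.
cards = [8, 11]
print(usable_gpus(4, cards))   # a typical 3-4 GB scene fits on both cards
print(usable_gpus(10, cards))  # a 10 GB scene forces excluding the 8 GB card
```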
  14. I was able to get my old GTX 970 to work with the 1080 Ti by using an external power supply. I could feel the heat as soon as I started rendering a test scene. I had to take out the two hard drives just above the second card because there was only a very slight gap (about 2 mm) between the top of the card and the drives. Add the fact that I lost the convenience of upgrading the Nvidia drivers by removing my old GT 120 (plus losing two hard drives), and it was all too inconvenient for the speed I gained. I would hate to be Icarus (from the Greek myth), flying too close to the sun and melting the wax, or in this case, melting my computer.

    Having the one 1080 Ti is incredibly fast, and I will stay with the one card. As a matter of fact, Geekbench scores I ran on my computers showed the 1080 Ti 2.5 times faster than my D700 graphics in OpenCL (and I now have CUDA as well). Hopefully this will help others. Bottom line: I feel comfortable with what I have until Apple releases a real Mac Pro. Mark
  15. I may have time this weekend to put in my old Nvidia 970, since we have a three-day weekend. I'm off work tomorrow, yay! Going to mass in the morning to celebrate our veterans and then to a gun range with my dad (he's an old Marine). After that, the rest of the day is free. No side project, no work, and the house is clean. I am so excited about having nothing scheduled!

    I looked at the iMac Pro, and no Nvidia support is a showstopper for me. And I cannot imagine how much an 18-core version will cost. I also understand that Geekbench scores of the new iMac Pro are popping up. While fast, the numbers didn't make sense to a lot of people. It turns out that Apple may have dialed back the speed, presumably to limit heat. Sure, it looks nice, but that is not what we want. We need speed.

    I'm still holding out hope that the new Mac Pro will be a serious computer. But Apple can't go nuts pricing it. I have NO problem spending more for an Apple, but I will not spend $10,000 on a computer. Well, that is if I want to stay married! I'll be interested to hear what you finally end up doing. Mark