
Sammy

Turbulence FD won't run sim on GPU

Recommended Posts

Hello,

I've been wondering why TurbulenceFD won't run any of my simulations on my GPU.
When I run a simulation it calculates on my CPU. It offers the option to use my GPU, but the second I switch to the GPU during a sim it immediately switches back to the CPU.
I have no idea how this is possible; I hope someone can help me out with this one.

Thank you in advance.

27" 4K LG Monitor 
32GB RAM

GTX 1080
Intel i7



Maybe your graphics card doesn't have enough memory if you set the resolution of your sim too high... Try lowering your resolution settings.
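If it helps, one way to see whether VRAM really is the bottleneck is to watch the card's free memory while the sim starts. This is only a minimal sketch using NVIDIA's nvidia-smi tool (it assumes an NVIDIA card and that nvidia-smi is on the PATH):

```python
# Sketch: poll free VRAM with nvidia-smi while the simulation starts.
# Assumes an NVIDIA GPU and that nvidia-smi is on the PATH.
import subprocess
import time

def free_vram_mib():
    """Return free VRAM in MiB as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.free", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0])

# Poll a few times; if free memory collapses toward zero right before
# TFD falls back to the CPU, the card is most likely running out of VRAM.
for _ in range(10):
    print(f"free VRAM: {free_vram_mib()} MiB")
    time.sleep(1)
```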

 

 

Cheers

Regis

Topic Author

Thanks for the reply guys; unfortunately I have already considered these options.

Switching before the sim also doesn't work.

I tested it on different simulation types: some heavier, more complicated custom setups as well as a basic temperature and density channel emission from a sphere.
I tried running the simulation cached as well as interactive.
I tried increasing the voxel size as well as decreasing it.
I've also been hearing some things about the VRAM (video RAM) of my GPU.

I can't imagine a GPU of this caliber couldn't handle either of those simulations; the GTX 1080 is an above-average GPU and shouldn't have issues with a simulation at this level.

I just really can't put my finger on it, and apparently I'm the only one experiencing this issue, or at least no one knows how to fix it or seems to be troubled by it.

Thanks for the suggestions though, guys.
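For anyone trying to rule out memory, a rough back-of-the-envelope estimate of the grid's VRAM footprint can be made from the container size and voxel size. This is only a sketch; the channel list and the 4 bytes per voxel per channel are assumptions for illustration, not TFD's documented memory layout:

```python
# Rough sketch: estimate VRAM needed by a fluid grid.
# The bytes-per-voxel figure and channel list are assumptions for
# illustration, not TFD's documented memory layout.

def grid_vram_estimate_mib(container_cm, voxel_size_cm,
                           channels=("velocity_x", "velocity_y", "velocity_z",
                                     "temperature", "density")):
    """Estimate grid memory in MiB, assuming 4 bytes (float32) per voxel per channel."""
    nx, ny, nz = (int(c / voxel_size_cm) for c in container_cm)
    voxels = nx * ny * nz
    bytes_total = voxels * len(channels) * 4  # float32 per channel
    return voxels, bytes_total / (1024 ** 2)

# Example: a 400 x 400 x 400 cm container at 2 cm voxels.
voxels, mib = grid_vram_estimate_mib((400, 400, 400), 2.0)
print(f"{voxels:,} voxels ≈ {mib:.0f} MiB")  # 8,000,000 voxels ≈ 153 MiB
```

Halving the voxel size multiplies the voxel count (and the memory) by eight, which is why lowering the resolution is usually the first thing to try when a GPU sim refuses to start.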


I have a GTX Titan X (12 GB memory) and I can tell you that, depending on the resolution, I have already exceeded the card's capacity in some of my tests.

If you try one of the samples provided by the maker of the plug-in, do you encounter any issues?

     

     

Cheers

Regis

Topic Author

If that's the case, then my 4K screen is probably taking up that VRAM and not allowing me to run the sim on the GPU. But it still won't allow any type of simulation to run on the GPU, not even one that barely takes any calculation; it just switches to the CPU in every type of sim I tried.


Hi Sammy, did you fix the issue?

I'm having the same problem...

i7 7700K

GTX 1060

Full HD res.

Topic Author

Hi @Alez, unfortunately the same problem still occurs whenever I use TFD. I'm upgrading my rig with a few extra GPUs, so I wonder whether the problem will persist. If it does, it's a software or hardware (or who-knows-what) issue; if not, it was apparently my 4K monitor taking up so much VRAM that TFD could no longer simulate on the GPU.
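Before adding more cards, it might be worth confirming that the driver actually exposes the existing GPU at all, since the behaviour described here is TFD quietly falling back to the CPU. A minimal sketch using nvidia-smi (assuming an NVIDIA driver is installed and the tool is on the PATH):

```python
# Sketch: list the GPUs the NVIDIA driver exposes.
# Assumes an NVIDIA driver is installed and nvidia-smi is on the PATH.
import subprocess

# "nvidia-smi -L" prints one line per detected GPU, e.g.
# "GPU 0: GeForce GTX 1080 (UUID: GPU-...)"
gpus = subprocess.check_output(["nvidia-smi", "-L"]).decode().splitlines()

if not gpus:
    print("No NVIDIA GPUs detected by the driver.")
for line in gpus:
    print(line)
```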


It's doubtful that a 4K display alone would consume enough of a 1080's VRAM.
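For scale, a quick back-of-the-envelope calculation (assuming 32-bit colour and a few buffered copies of the desktop, which is an assumption rather than a measured figure) shows the display itself only accounts for a tiny slice of a GTX 1080's 8 GB:

```python
# Back-of-the-envelope: VRAM used by a 4K desktop framebuffer.
# Assumes 32-bit colour and a handful of buffered copies (compositing, etc.).
width, height, bytes_per_pixel = 3840, 2160, 4
buffers = 3  # front + back + one compositor copy (assumption)

framebuffer_mib = width * height * bytes_per_pixel * buffers / (1024 ** 2)
gtx1080_vram_mib = 8 * 1024

print(f"~{framebuffer_mib:.0f} MiB for the display "
      f"out of {gtx1080_vram_mib} MiB on a GTX 1080")
# ~95 MiB for the display out of 8192 MiB on a GTX 1080
```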

     

It sounds like there may be a problem with Pascal-family GPUs; perhaps raise the issue over on the TFD forums (forum.jawset.com)?


John W. -- MacPro (12C/24T/10.10.5), 32 GB, ATI 7970

Topic Author

@jwiede I didn't try that yet, but I will, thanks!
It's probably a problem that not everyone has, because I've searched all over the web and there are plenty of people with all types of rig setups: GTX 1080, GTX 1080 Ti, GTX 1070, GTX 1070 Ti, etc.
It appears only a handful of people are facing this issue.


This might be irrelevant, but I had this issue when running the wrong TFD version with the wrong C4D version.

I was running an R20 build of TFD on R19; it didn't see the GPUs, but it still worked, just on the CPU.

Check that the version number in the TFD plug-in folder is not higher than your C4D version.
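If it helps to double-check the mismatch from inside Cinema 4D, the Python console can print the host version and the loaded plug-ins. This is only a sketch using the C4D Python API; matching on the string "turbulence" is an assumption about how the plug-in registers itself and may differ on your install:

```python
# Sketch: run in Cinema 4D's Python console (Script > Console) to compare
# the host version with the loaded TurbulenceFD plug-in.
import c4d

# GetC4DVersion() returns e.g. 19xxx for R19, 20xxx for R20.
print("Cinema 4D version:", c4d.GetC4DVersion())

# List loaded plug-ins and pick out anything that looks like TurbulenceFD.
for plug in c4d.plugins.FilterPluginList(c4d.PLUGINTYPE_ANY, True):
    name = plug.GetName()
    if "turbulence" in name.lower():
        print("Loaded plug-in:", name)
```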


     


Hi Burgess! I have the same issue with my GTX 1070 Ti. How did you solve the problem? I can't find another version to download on the Jawset page. Thanks!

