Insydium Cycles

joolsd

Cafe Oldtimer
  • Content count

    208
  • Joined

  • Last visited

Community Reputation

0 Noble Beginner

About joolsd

Profile Information

  • First Name
    J
  • Last Name
    DIAMANDIS
  • C4D Ver
    15 Studio

Recent Profile Visitors

752 profile views
  1. I was hoping to bring the model back into Faceshift so we could use Faceshift's mapping system. This is just a still with no keyframes. I have managed to take the binary data from Faceshift and use the Faceshift XPresso to link motion to the rig, and that works, but the captures are very weak, with very poor definition of speech. I have yet to see anybody online use this technique well, apart from FS themselves, and they are no longer supporting the product. There are roughly three workflows: 1. Faceshift > C4D; 2. C4D with Faceshift networking between the two programs; 3. C4D > Faceshift (I assume you just need a rigged model with blend shapes working in a pose morph, with correct naming of objects?). The main concern is that the head is not working at all. Even if I were to use the FS-to-C4D workflow, I wouldn't be able to export the model to Unity, where it is needed. I have tried different FBX formats, and I have successfully exported other models with a similar, if not identical, setup.
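A quick way to check the "correct naming of objects" part of workflow 3 is to list the Pose Morph targets from a script. Below is a minimal sketch using the C4D Python API (run from the Script Manager), assuming a head object hypothetically named "Head"; it only prints the morph names so they can be compared with what Faceshift and Unity expect.

```python
# Minimal sketch (C4D Python): list the Pose Morph targets on a head mesh so
# the names can be checked against what Faceshift expects.
# "Head" is a placeholder object name for illustration.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    head = doc.SearchObject("Head")          # hypothetical object name
    if head is None:
        raise RuntimeError("No object called 'Head' in the scene")

    tag = head.GetTag(c4d.Tposemorph)        # the Pose Morph tag
    if tag is None:
        raise RuntimeError("'Head' has no Pose Morph tag")

    # Morph 0 is the base pose; the rest are the blend-shape targets.
    for i in range(tag.GetMorphCount()):
        print(i, tag.GetMorph(i).GetName())

if __name__ == '__main__':
    main()
```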
  2. Hi all. I am looking for ideas or suggestions as to why the pose morph is not rendering. I have 48 blend shapes, and I am starting to wonder if I am asking too much of the pose morph. After export, the head model completely disappears. Is this normal? I have tried many different FBX versions and different pose morph types, and I'm starting to think this isn't possible. See the 'failed' and 'before' exports. This is frustrating. :( Best regards Julian PS: I now use R17; I need to update my profile.
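For repeatable FBX tests it can help to export from a script, so the settings are identical between attempts. A minimal sketch, assuming the commonly documented FBX scene-saver plugin ID (1026370) and a placeholder output path; note that the exporter reuses whatever options were last set by hand in the FBX export dialog (FBX version, Tracks, Selection Only, and so on).

```python
# Minimal sketch (C4D Python): export the scene to FBX from a script so the
# same settings are used every time while testing which FBX version keeps the
# morphs. 1026370 is the FBX exporter plugin ID as commonly documented for
# this era of C4D; the output path is a placeholder.
import c4d

FBX_EXPORTER_ID = 1026370  # assumption: standard FBX scene-saver plugin ID

def main():
    doc = c4d.documents.GetActiveDocument()
    path = "C:/temp/head_test.fbx"           # hypothetical path
    ok = c4d.documents.SaveDocument(
        doc, path,
        c4d.SAVEDOCUMENTFLAGS_DONTADDTORECENTLIST,
        FBX_EXPORTER_ID)
    # The exporter uses the options last set in the FBX export dialog,
    # so configure those once by hand before running this.
    print("Export OK" if ok else "Export failed")

if __name__ == '__main__':
    main()
```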
  3. Hi. I have been doing some facial rigging, and this involves some eyeball binding in R17. The idea is that the eye will track motion data from an external mo-cap source. It is one of those projects where I am going to have to re-do things for development reasons, so I don't want any success to be a fluke. With the eyeball I have a head rig, and I can get it to bind about 50% of the time. I am either doing something very slightly different each time or there is a bug. I have tried to observe myself making it work, but I can't see what I am doing differently when it doesn't work. Any ideas on the inconsistency? Best regards
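One way to rule out "doing something slightly different each time" is to script the bind so it is identical on every run. A minimal sketch, assuming a single eye joint and full weights across the whole eyeball mesh; "Eye_L" and "Eye_L_joint" are hypothetical names.

```python
# Minimal sketch (C4D Python): bind an eyeball mesh to a single joint by
# script, so the bind is done the same way every time instead of relying on
# the interactive Bind command. Object names are placeholders.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    eye = doc.SearchObject("Eye_L")            # hypothetical mesh name
    joint = doc.SearchObject("Eye_L_joint")    # hypothetical joint name
    if eye is None or joint is None:
        raise RuntimeError("Eye mesh or eye joint not found")

    # Weight tag: one joint, full weight on every point of the eyeball.
    wtag = eye.MakeTag(c4d.Tweights)
    jidx = wtag.AddJoint(joint)
    for p in range(eye.GetPointCount()):
        wtag.SetWeight(jidx, p, 1.0)

    # A Skin deformer under the mesh actually applies the deformation.
    skin = c4d.BaseObject(c4d.Oskin)
    skin.InsertUnder(eye)

    c4d.EventAdd()

if __name__ == '__main__':
    main()
```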
  4. Thanks for the reply. I have looked at Daz as well, which is like Mixamo. Similar issues, with a slightly different set of blends.
  5. Hi all. I wondered if anybody had some advice on where to buy 3D character models of ordinary people. They must be rigged and have facial blend shapes as well. We have a copy of Faceshift, which has been discontinued but can still be used. The capture is great on the default Faceshift models, but it is difficult to find custom models that conform to, or are close to, the blends from Faceshift. I tried re-linking a Mixamo model to Faceshift data (using the Faceshift C4D plugin). It works, but it is a lot weaker because the right shapes are missing to produce a better-quality animation. I don't really have much time to do custom morphs/blends to make the speech look better without the process becoming a labour of love. Any advice on where to purchase models would be appreciated. Best regards Jools. For reference, the Faceshift blend shapes are: MouthSmile_L, MouthSmile_R, EyeBlink_L, EyeBlink_R, JawOpen, JawFwd, JawL, JawR, Mouth_L, Mouth_R, LipsPucker, LipsFunnel, BrowsDown_L, BrowsDown_R, BrowsUp_L, BrowsUp_R, Puff, LipsLowerClose, LipsUpperClose, EyeSquint_L, EyeSquint_R
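Since the required Faceshift shapes are listed above, a candidate model can be checked quickly before committing to a cleanup job. A minimal sketch (C4D Python) that reports which of those blend shapes are missing from a model's Pose Morph tag, assuming the head object is hypothetically named "Head".

```python
# Minimal sketch (C4D Python): report which of the Faceshift blend shapes
# listed above are missing from a model's Pose Morph tag.
# "Head" is a placeholder object name.
import c4d

FACESHIFT_SHAPES = [
    "MouthSmile_L", "MouthSmile_R", "EyeBlink_L", "EyeBlink_R", "JawOpen",
    "JawFwd", "JawL", "JawR", "Mouth_L", "Mouth_R", "LipsPucker", "LipsFunnel",
    "BrowsDown_L", "BrowsDown_R", "BrowsUp_L", "BrowsUp_R", "Puff",
    "LipsLowerClose", "LipsUpperClose", "EyeSquint_L", "EyeSquint_R",
]

def main():
    doc = c4d.documents.GetActiveDocument()
    head = doc.SearchObject("Head")            # hypothetical object name
    tag = head.GetTag(c4d.Tposemorph) if head else None
    if tag is None:
        raise RuntimeError("No Pose Morph tag found on 'Head'")

    have = {tag.GetMorph(i).GetName() for i in range(tag.GetMorphCount())}
    missing = [name for name in FACESHIFT_SHAPES if name not in have]
    print("Missing shapes:", missing or "none")

if __name__ == '__main__':
    main()
```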
  6. I didn't quite understand the question. You could use a compositing tag and exclude GI on the objects you want to preserve.
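A minimal sketch of that suggestion in C4D Python: add a Compositing tag and switch off "Seen by GI". "Floor" is a placeholder object name, and the parameter ID is the one I recall from the compositing tag's description, so verify it against the Attribute Manager.

```python
# Minimal sketch (C4D Python): give an object a Compositing tag and disable
# "Seen by GI" on it. Object name and parameter ID are assumptions to verify.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    obj = doc.SearchObject("Floor")            # hypothetical object name
    if obj is None:
        raise RuntimeError("Object not found")

    ctag = obj.MakeTag(c4d.Tcompositing)
    ctag[c4d.COMPOSITINGTAG_SEENBYGI] = False  # exclude this object from GI
    c4d.EventAdd()

if __name__ == '__main__':
    main()
```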
  7. In hindsight, it is probably better to do most of the rigging in ipi itself , using a mixamo model. Then bring an exported model back into C4D. This seems to work the best. Thanks for your help. Jools
  8. I think you'd have to bake the Vibrate tag; otherwise the camera will ignore it.
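A minimal sketch of baking a Vibrate tag with C4D Python: step through the frames, let the expressions evaluate, and write the resulting position into real keyframes. It assumes the object carrying the Vibrate tag is selected, and it uses BUILDFLAGS_0, which is the flag name from the R17-era API (newer versions call it BUILDFLAGS_NONE).

```python
# Minimal sketch (C4D Python): bake the motion a Vibrate tag produces into
# position keyframes, so cameras/exporters see real keys instead of the
# expression. Select the object carrying the Vibrate tag before running.
import c4d

def pos_track(obj, comp):
    """Find or create a position sub-track (X, Y or Z) on obj."""
    desc = c4d.DescID(
        c4d.DescLevel(c4d.ID_BASEOBJECT_REL_POSITION, c4d.DTYPE_VECTOR, 0),
        c4d.DescLevel(comp, c4d.DTYPE_REAL, 0))
    track = obj.FindCTrack(desc)
    if track is None:
        track = c4d.CTrack(obj, desc)
        obj.InsertTrackSorted(track)
    return track

def main():
    doc = c4d.documents.GetActiveDocument()
    obj = doc.GetActiveObject()
    if obj is None:
        raise RuntimeError("Select the object carrying the Vibrate tag")

    fps = doc.GetFps()
    start = doc.GetMinTime().GetFrame(fps)
    end = doc.GetMaxTime().GetFrame(fps)
    tracks = [pos_track(obj, comp) for comp in
              (c4d.VECTOR_X, c4d.VECTOR_Y, c4d.VECTOR_Z)]

    for f in range(start, end + 1):
        t = c4d.BaseTime(f, fps)
        doc.SetTime(t)
        # Evaluate expressions (including the Vibrate tag) for this frame.
        doc.ExecutePasses(None, True, True, True, c4d.BUILDFLAGS_0)
        pos = obj.GetRelPos()
        for track, value in zip(tracks, (pos.x, pos.y, pos.z)):
            curve = track.GetCurve()
            added = curve.AddKey(t)
            if added:
                added["key"].SetValue(curve, value)

    c4d.EventAdd()
    # Afterwards, disable or delete the Vibrate tag so it does not fight the keys.

if __name__ == '__main__':
    main()
```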
  9. Thank you for this. I will watch tonight and let you know what I think. EDIT: I did have a look just now. I think the trouble with this example is that it is useful for transferring motion data from one identical model to another, with the same hierarchy and the same geometry. My problem is getting motion data from an alien (less native) source onto a Mixamo model, the source being iPi mocap data. It doesn't look impossible to retarget the iPi data, but it looks quite tricky. You can import a Mixamo model into iPi, do the mocap in there, then bring that into C4D, but it is long-winded. Your technique still looks useful, so I am grateful for the demo. I think this is something worth knowing for sure. Best regards Jools
  10. Are you using Cineware? Are you using an external compositing tag? If I remember correctly, there is an option to centre the anchor point on objects, or text.
  11. Thanks for the reply. Fair enough. I suspected that was the case. Like I say, I have seen the 'retargeting tag' technique, but it looks a little more like a hack than a solution. I would be interested to see your point cache technique. My mo-cap project will end up in Unity, so I will have to use MAX or do it in Unity itself. And Unity does use PLA. I prefer working in C4D but may have to use MAX, or get a MAX user to help me.
  12. I was wondering if the method in this guy's tutorial is possible to do with a Mixamo rig, or any other rig for that matter? I tried to copy the tutorial by attaching BVH data to a Mixamo rig, but it did not work. I would expect a few problems, such as a differing hierarchy (not being exactly the same). What is this format he is using, 'c4dscr'? I was looking for a nice quick method, if there is one. I have seen the 'retarget tag' and even just 'binding', but they seem like a lot of work and a bit hit and miss. Maybe it is just a lot of work. I saw the Greyscalegorilla tutorial on binding the default mannequin model onto FBX mo-cap, but what if you need something more complex? What is the best way to use motion data on my desired model? Thank you.
  13. Ahhhhh, you know what, I think that is probably it. I think I must have zeroed it out by mistake when I was making the black border opacity darker. It looks exactly like how it was. OK, I will have to remember this, as I think it has happened before. Thanks.
  14. To be honest, I tried all of these modes and still have the same problem, or at least with the two options I have. I see you have an extra option which doesn't seem to be in R17. Where is this icon in R17? It may be somewhere else.