Everything posted by rasputin

  1. Thanks for this, Beefdoctor... great tutorial. Yes, I'll be curious to see how Fields react when placed in the Volume Builder... especially the Random Field.
  2. Hey guys, I'm trying to find the place in the R20 Helpfiles that discusses the features of the Hair Light tag... I'm not finding it anywhere. Thoughts? ras
  3. Wow... very cool, Bezo. Thank you for taking the time to set this up and upload it for me. Very helpful.
  4. I think I have solved my own problem: You have to drag the Vertex Map down into the deformer's Falloff region. Thus it falls under the new concept of Fields... which I am only beginning to understand... (A rough Python sketch of this setup appears after this post list.)
  5. Hi Gang, I am experimenting today with Vertex Weighting. Attached here is the C4D Helpfiles entry which discusses Vertex Weights. It says that Vertex Maps can be used to affect Deformers, i.e., you can use a Vertex Map to control the degree of influence of a deformer (like BEND, TAPER, TWIST, etc.). Yet how would you make that happen? Where would you place that Vertex Map to make a Deformer affect only part of your mesh? Thanks, ras
  6. Mais non. Hair looked like holy hell when Physical first came out.
  7. I just noticed that, in R20, the Physical Renderer can now render Hair effectively. When did this happen, I wonder... (i.e., in what release?) For the longest time, the Standard Renderer was the only one that could make Hair look good... I think because it applied a different (or better) antialiasing solution? Just curious. ras
  8. Thanks, Kiwi. I did what you said and have opened a New Panel View, as you see, on the right. The problem I run into is that both the main view and the new view show the model with its polys visible... I guess this is because they are showing the same Perspective view. Can I have a new Perspective view window that is not linked to the main view, one which isn't showing the polys? What I mean is: one view showing UV EDIT mode, the other window showing Texture mode. Thanks, ras
  9. Wow... I never thought of that. Thanks, Igor! How do I open a new panel within the UV EDIT space?
  10. Hey guys, A question: I'm in UV EDIT mode (see screencap). Usually in this view, when you're using the Magnet tool to slide UVs around your bitmap texture, you always see the mesh covering your model. And yet the mesh gets in the way sometimes. Is there any way I can use the Magnet tool... while temporarily hiding my mesh? I would like to see only the underlying bitmap texture on my model as I make extremely fine tweaks... mainly to prevent areas of stretching. Thoughts? Thanks, ras
  11. For you HAIR wizards: Here's what I want to do (see my screencap): I am creating eyelashes on a human figure. My Hair Guides are all in place. I want the (rendered) hairs to get gradually thicker (in Thickness) as you move towards the outer eye. Thus, thinner eyelashes towards the inner eye, increasingly thicker towards the outer eye. I'm not quite sure how to achieve that. I tried painting a Hair Vertex Map on the guides, with "cold" values on the inner guides and increasingly "hot" values on the outer guides. A Hair Vertex Map Tag was automatically added in the OM. In my Hair Material, under Thickness ---> Texture, I entered an ordinary Vertex Map using EFFECTS ---> Vertex Map. (This is what you'd do in any sort of mesh situation, right?) But I see my Hair's painted Vertex Map is not welcome in that slot. So THAT didn't work. (Which makes me wonder: if I can't do that, then what's the purpose of a Hair Vertex Map Tag?) So... is there any other way to get the graduated Thickness look I'm after? Thanks, ras
  12. Agreed. It's GET CONTEXT and SET CONTEXT that mystify me the most. Are they meant to work in tandem? In particular, I don't know how to connect them with wires to other Nodes, and I don't see where they belong in a "chain". The helpfiles are indeed cryptic here.
  13. Hey guys, I don't know if you've encountered this problem when sharing your C4D renders on Facebook: Facebook's image publishing algorithm uses a rather harsh compression standard. (I guess they have to, given the bazillions of images they are forced to store on their servers every day.) Most images you publish will come out with nasty artifacting, especially all sorts of ugliness around your sharp edges. It also tends to fill your fields of solid color with bad artifacting, and will make your color gradients come out with ugly banding. The more "toon-y" and less photorealistic your image is, the worse the artifacting will be. I've had a couple of my image posts almost completely ruined by Facebook's compression, which bummed me out plenty. Even if your render is already compressed and low in filesize--- as with a JPG--- Facebook will compress it further. Well, I think I've discovered a bit of a workaround for making your image posts to Facebook look better, a bit closer to how they originally looked when you first rendered them in C4D. Export your C4D render as a 16-bit PNG and open it in Photoshop, then choose the ADD NOISE filter. Make sure the filter is set to Gaussian and Monochrome. Then dial in a Noise setting that is extremely subtle, like 0.5% or 0.6%. The resulting noise should be barely noticeable and not change your C4D picture unduly. You can always FADE the effect under EDIT after application, until the noise is almost imperceptible. I don't know why--- someone clever here will perhaps know--- but the slight noise added to the image completely "short-circuits" Facebook's compression algorithm, and your image will be posted with very little... almost no... banding and artifacting (which were far worse, visually, than the slight noise you're adding). I also notice that Facebook tends to post your image slightly cooler and slightly less saturated than it appeared when first rendered. You can prep for this by making your image 2-3 degrees warmer in Photoshop under HUE/SATURATION, and you can add 2-3 points of Vibrance, before you post to Facebook. With these steps--- admittedly a workaround, not a cure--- your image on Facebook should look closer to your original artistic vision. (For anyone who'd rather script the noise step, a small Python sketch appears after this post list.) ras
  14. Very cool, Srek! Yes, I do see the difference. I shall have to use the Triplanar Node to avoid stretching in textures.
  15. Okay, I'll submit one! It's the first good "trick" I've learned on my own using Nodes. The Triplanar Node lets you assign 6 different colors to an object--- a simple Sphere--- based on the Sphere's 6 local axis orientations. Four pure white Lights are shining on it. If you can do this with colors, imagine what you can do with bitmaps or generated procedurals... Things can get real fancy, real quick! No big deal, really, but it's my first little "breakthrough" with Nodes. (The circular plane beneath is textured with an ordinary Material.) A tiny Python illustration of the dominant-axis idea appears after this post list. ras triplanar colors.c4d
  16. It's manipulating UVs that most interests me, I think, with the Nodes. I need to learn the features of GET CONTEXT, which are eluding me at present...
  17. Hey gang, I see that, with the introduction of Nodes in R20, a material's IOR is now written out as three floats. I think this means the IOR as it appears in the Red, Green and Blue channels, correct? Where did C4D get these RGB preset values, anyway? Is this a further, scientific way of refining our IOR? Just curious. And Absorption refers to which rays of pure white light are absorbed and which, hence, are reflected. Correct? (A small worked example of per-channel IOR appears after this post list.) Thanks, ras
  18. Bingo. That was it. Now it's working. Thanks, fastbee. I will be more attentive to the Mode next time...
  19. Hey gang, I just tried something that didn't work (see screencap). I first made a complex object out of Mesh Volumes, then I made it Editable (i.e., turned the whole Volumes stack into a mesh via Current State To Object), then I applied a Polygon Reduction to it, reducing it by 75%, then I tried to Sculpt it further by subdividing and using the Sculpt Tools. I see that Sculpt is not available to me in this context? I guess the Polygon Reduction process creates a special kind of polymesh that cannot be further sculpted? Sculpt only works on regular, organized polygons? I was (seemingly) able to do a Sculpt Subdivision... but none of the Sculpt Tools had any effect at all. Thanks, ras
  20. Thanks, fastbee! I was just wondering if I needed to subsume all nodal elements inside the Reflectance channel, in the PBR fashion.
  21. Hey gang, Just a n00B question: Is it assumed that Nodes are meant to be used in a PBR context? I.e., with PBR Lights, and in a Physical-type renderer? Or does it not matter what rendering method you use? Thanks, ras
  22. I see. Sometimes Light B will completely bleach out Light A... or seem to.
  23. Hey guys, A question regarding the use of the COLOR ---> BLEND node in a complex node setup, specifically the Blend input of the BLEND node. The Blend input presents a dropdown list of all the (layer) blending modes we are familiar with: multiply, dodge, hard light, screen, and so forth (see screencap). Here, I've got the blend mode arbitrarily and manually set to Color Dodge. Question: What kind of input is the Blend input seeking... in order to access that list of blend modes? In other words, what kind of node will allow you to "scroll" through that list of blending modes? Some kind of Math node? Are they numbered as integers? (A small Python sketch of the integer-enum idea appears after this post list.) Thanks, ras
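Referring to posts 4 and 5 above: here is a rough R20 Python sketch of wiring a Vertex Map tag into a deformer's Fields list, the scripted equivalent of the drag-and-drop described in post 4. It assumes a scene with objects named "Mesh" and "Bend" (hypothetical names), that the deformer exposes its field list under c4d.FIELDS, and that a generic FLfield layer accepts a Vertex Map tag link the way the UI drag does; treat it as a starting point, not a verified recipe.

```python
# Rough sketch (R20 Script Manager): restrict a Bend deformer with a Vertex Map
# via the Fields list. Object names "Mesh" and "Bend" are placeholders.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    mesh = doc.SearchObject("Mesh")      # polygon object carrying the Vertex Map tag
    bend = doc.SearchObject("Bend")      # deformer whose falloff we want to limit
    if mesh is None or bend is None:
        raise RuntimeError("Expected objects named 'Mesh' and 'Bend' in the scene")

    vmap = mesh.GetTag(c4d.Tvertexmap)   # first Vertex Map tag on the mesh
    if vmap is None:
        raise RuntimeError("No Vertex Map tag found on 'Mesh'")

    # One field layer linked to the Vertex Map tag (assumption: FLfield accepts
    # the tag link, mirroring what dragging the tag into the Falloff list does).
    fields = c4d.FieldList()
    layer = c4d.modules.mograph.FieldLayer(c4d.FLfield)
    layer.SetLinkedObject(vmap)
    fields.InsertLayer(layer)

    bend[c4d.FIELDS] = fields            # assign to the deformer's Fields parameter
    c4d.EventAdd()

if __name__ == "__main__":
    main()
```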
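Referring to post 13 above: for anyone who would rather script the ADD NOISE step than do it in Photoshop, here is a minimal sketch using Pillow and NumPy that adds roughly 0.5% monochrome Gaussian noise before upload. The file names are placeholders, and it works on an 8-bit RGB copy rather than the 16-bit PNG mentioned in the post.

```python
# Add very subtle monochrome Gaussian noise (~0.5%) to a render, as an
# alternative to Photoshop's ADD NOISE + FADE workflow described in post 13.
import numpy as np
from PIL import Image

def add_subtle_noise(src_path: str, dst_path: str, amount: float = 0.005) -> None:
    # Load as float RGB in the 0..1 range.
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.float32) / 255.0

    # Monochrome: one noise value per pixel, applied equally to R, G and B.
    noise = np.random.normal(loc=0.0, scale=amount, size=img.shape[:2])
    noisy = np.clip(img + noise[..., None], 0.0, 1.0)

    Image.fromarray((noisy * 255.0 + 0.5).astype(np.uint8)).save(dst_path)

add_subtle_noise("render.png", "render_fb.png")  # ~0.5% noise, barely visible
```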
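Referring to post 15 above: the six-color Triplanar trick boils down to picking one of six colors according to the dominant local axis a surface normal points along. This little Python illustration shows only that selection logic; the colors and the sample normal are arbitrary, and it is not C4D node code.

```python
# Conceptual illustration of the 6-color Triplanar trick: choose a color from
# the dominant component of a surface normal in local space.
AXIS_COLORS = {
    "+X": (1.0, 0.0, 0.0), "-X": (0.0, 1.0, 1.0),
    "+Y": (0.0, 1.0, 0.0), "-Y": (1.0, 0.0, 1.0),
    "+Z": (0.0, 0.0, 1.0), "-Z": (1.0, 1.0, 0.0),
}

def color_for_normal(nx: float, ny: float, nz: float):
    # The largest absolute component decides which of the six "faces" we see.
    components = {"X": nx, "Y": ny, "Z": nz}
    axis = max(components, key=lambda k: abs(components[k]))
    sign = "+" if components[axis] >= 0.0 else "-"
    return AXIS_COLORS[sign + axis]

print(color_for_normal(0.2, 0.9, -0.3))   # mostly +Y -> (0.0, 1.0, 0.0)
```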
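Referring to post 17 above: a per-channel IOR means the reflective behaviour can differ slightly between red, green and blue. As a hedged illustration, the sketch below uses the standard normal-incidence Fresnel reflectance F0 = ((n - 1) / (n + 1))^2 per channel; this is textbook Fresnel, not necessarily exactly what the C4D node does internally, and the IOR triplet is made up rather than taken from any preset.

```python
# Worked example: what three IOR floats (R, G, B) imply for reflectivity at
# normal incidence, F0 = ((n - 1) / (n + 1))^2 per channel.
def f0_from_ior(ior_rgb):
    return tuple(((n - 1.0) / (n + 1.0)) ** 2 for n in ior_rgb)

ior_rgb = (1.51, 1.52, 1.53)         # placeholder, slightly dispersive glass-like values
print(f0_from_ior(ior_rgb))          # ~ (0.0413, 0.0426, 0.0438): blue reflects a touch more
```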
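Referring to post 23 above: the guess in the post matches how most node systems treat such a dropdown, namely as an integer enumeration where each entry selects a per-channel formula. The Python sketch below illustrates that idea with a few common blend formulas; the integer values are arbitrary and are not C4D's internal IDs for the BLEND node.

```python
# A blend-mode dropdown modeled as an integer enum, with per-channel formulas
# for a few familiar modes (values in the 0..1 range).
from enum import IntEnum

class BlendMode(IntEnum):
    NORMAL = 0
    MULTIPLY = 1
    SCREEN = 2
    COLOR_DODGE = 3

def blend_channel(base: float, blend: float, mode: BlendMode) -> float:
    if mode is BlendMode.MULTIPLY:
        return base * blend
    if mode is BlendMode.SCREEN:
        return 1.0 - (1.0 - base) * (1.0 - blend)
    if mode is BlendMode.COLOR_DODGE:
        return min(1.0, base / (1.0 - blend)) if blend < 1.0 else 1.0
    return blend  # NORMAL: the blend value replaces the base

print(blend_channel(0.4, 0.5, BlendMode.COLOR_DODGE))  # 0.8
```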