It’s been a bit of a busy weekend at my day job, but the skinning and weighting seem to be going well. Once again I’ll explain these processes briefly for those who don’t do this kind of work.
“Skinning” is when you apply a mesh to a rigged skeleton. “Rigging” is making the skeleton and telling it how the joints work, and that is something we currently cannot do for Second Life; we have to use the skeletons they already provide. This makes skinning a bit tricky when you are working with non-human shapes, because you don’t have a tail bone, or wing bones, or enough neck joints for a long neck. That means you have to do your best to assign the bones to the areas where they will make the closest approximation.
After the mesh is skinned, you could technically call it done and move on, but best practices require ‘weighting’. This is a mostly manual process of adjusting how the vertices of the mesh move when their attached ‘bone’ is activated. For example, if you rotate the elbow, does the mesh get weird creases, or does it bend like a noodle? Then you have to adjust the weights, which gives you some (but not total) control over how the elbow deforms. Again, it’s not perfect, and you have to go for best approximations.
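For the curious, the math behind this is usually called linear blend skinning: each bone moves a vertex as if the vertex were rigidly glued to it, and the results get averaged by the painted weights. Here’s a minimal 2D sketch of the idea; the function names and the two-bone elbow setup are my own illustration, not anything from SL’s actual implementation:

```python
import math

def rotate(point, pivot, angle):
    """Rotate a 2D point around a pivot by angle (radians)."""
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + px * c - py * s, pivot[1] + px * s + py * c)

def skin_vertex(vertex, pivots, weights, angles):
    """Linear blend skinning: each bone transforms the vertex rigidly,
    and the results are averaged by the vertex's weights."""
    x = y = 0.0
    for pivot, w, a in zip(pivots, weights, angles):
        tx, ty = rotate(vertex, pivot, a)
        x += w * tx
        y += w * ty
    return (x, y)

# A vertex sitting just past the elbow, influenced by two bones:
# the upper arm (pivot at the shoulder) and the forearm (pivot at the elbow).
shoulder, elbow = (0.0, 0.0), (1.0, 0.0)
vertex = (1.0, 0.1)

# Bend the forearm 90 degrees; the upper arm stays put.
angles = [0.0, math.radians(90)]

# All weight on the forearm: the vertex swings rigidly with it (sharp crease).
rigid = skin_vertex(vertex, [shoulder, elbow], [0.0, 1.0], angles)    # ~(0.9, 0.0)

# 50/50 weights: the vertex lands halfway between the two, softening the bend.
blended = skin_vertex(vertex, [shoulder, elbow], [0.5, 0.5], angles)  # ~(0.95, 0.05)
```

Weight painting in a 3D package is essentially tuning those per-vertex numbers by hand until the crease looks right.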
So far, I BELIEVE everything is going well, but you never know until you get to the stage of importing it into SL. One of the choices I have made is to try using the eye ‘bones’ for the ears. If you’re familiar with horses, you know they have a limited field of vision, and while their eyes do move, mostly you will see them move their heads and ears more expressively. The ear movements are FAR more important than the eye movements to a horse, so I’m going to give that a try and see if I like the results. The neat thing is, the ears can pretty much work the way eyes do in SL, and that will be close to what I want (a horse’s ears tend to follow the direction of attention, even when its head isn’t turning).
It’s an experiment though and I may hate it. Won’t know until I try.