Friday, February 18, 2011

Mambo #5 (let mambo = "post")

So I bet you're wondering what we did this week.

Nicole took a closer look at the rig in Maya, with some help from Pengfei. The skel_driven skeleton is controlled (driven) by the other skeleton. Imagine that! As for why there are two skeletons: perhaps different programs require different joint conventions. This allows for more flexibility when exporting the data. Nicole decided to animate from the non-skel_driven skeleton since it seems to control both skeletons.
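
For anyone who wants to double-check which skeleton is driving which, a quick Script Editor snippet along these lines lists what feeds each joint (driven joints should show constraints or driver joints coming in). This is just a sketch; we haven't actually run it on the rig:

```python
import maya.cmds as cmds

# For each joint, list the nodes feeding it. Driven joints should show
# incoming constraints (or joints from the driver skeleton).
for joint in cmds.ls(type='joint'):
    incoming = cmds.listConnections(joint, source=True, destination=False) or []
    drivers = sorted(set(n for n in incoming
                         if 'Constraint' in cmds.nodeType(n)
                         or cmds.nodeType(n) == 'joint'))
    if drivers:
        print('%s <- %s' % (joint, ', '.join(drivers)))
```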

Nicole blocked in the animations that do not involve fingers, since the skeleton does not have individual fingers rigged. (And a few of the gestures that do involve fingers, minus the finger movements.) The gestures all start from the same rest pose, with hands in front of the agent, per Dr. Badler's suggestion. At the moment they do not all end at the same rest pose, since she's not sure how we plan to blend between successive animations, and a few (like "give up") probably wouldn't end in that rest pose... but she can add the rest pose at the end of each animation if she needs to.

The animations can be viewed here. Right now they are just playblasts. She can replace them with renders if needed.

Omar extended the 2D GUI to allow for interactive manipulation of alpha-values on each segment of the beta-spline. He also created a new GUI that lets him visualize beta-splines in 3D (mostly a test to make sure that the code did not need to change too much to allow for 3D visualization). Here's a screenshot:

This GUI currently takes in a .txt file that gives the number of control points and then a list of the x,y,z coordinates of each. He switched from the initial interactive control-point placement because, eventually, he will be receiving control-point data from Nicole and using it to create splines, so he wanted his code to accept control-point data in that same form. He has also started transferring his 2D interactive shape controls to the 3D version.
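
For the curious, here's a rough sketch of what reading that file and evaluating a point on the spline can look like. It uses Barsky's cubic beta-spline formulation, with bias b1 and tension b2 as the per-segment shape parameters (presumably what the alpha-values in Omar's GUI map onto); the exact file layout beyond "count, then one x y z per line" is an assumption:

```python
def read_control_points(path):
    """Parse the control-point file: first line is the count,
    then one 'x y z' line per control point (layout assumed)."""
    with open(path) as f:
        n = int(f.readline())
        return [tuple(float(v) for v in f.readline().split()) for _ in range(n)]

def beta_spline_point(pts, i, u, b1=1.0, b2=0.0):
    """Evaluate segment i of a cubic beta-spline at u in [0, 1], using
    control points pts[i..i+3]. b1 is bias, b2 is tension; b1=1, b2=0
    reduces to the uniform cubic B-spline."""
    d = b2 + 2*b1**3 + 4*b1**2 + 4*b1 + 2
    w = (
        2 * b1**3 * (1 - u)**3 / d,
        (2*b1**3 * u * (u*u - 3*u + 3) + 2*b1*b1 * (u**3 - 3*u*u + 2)
         + 2*b1 * (u**3 - 3*u + 2) + b2 * (2*u**3 - 3*u*u + 1)) / d,
        (2*b1*b1 * u*u * (3 - u) + 2*b1 * u * (3 - u*u)
         + b2 * u*u * (3 - 2*u) + 2 * (1 - u**3)) / d,
        2 * u**3 / d,
    )
    return tuple(sum(w[k] * pts[i + k][c] for k in range(4)) for c in range(3))

# e.g. a point halfway along the first segment:
# pts = read_control_points('controls.txt')
# beta_spline_point(pts, 0, 0.5)
```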

For next week...
-Nicole isn't quite sure what to do, other than refining some of the animations. Should she add finger joints to the skeleton to animate the other gestures? Should she start working on a way to connect her part with Omar's? (She's a bit stuck on where to start on that.)

-Omar is going to start figuring out how to use his splines to control a human model in Maya. Each joint should have a spline to itself, representing its motion. I'm not sure if that means that a control point must store rotation as well as translation data? Also, any tips on how to start attaching my splines to a Maya model? I'm not quite sure where to start on that either.

So, a little stuck is Nelskati. But with some help, I'm sure we'll be right back on track.

With lots and lots of Lotsa Love,

Nelskati

2 comments:

  1. I saw the character animations and the beta-splines. Everything is
    looking very nice.

    Omar, I think one thing to do would be to try to create a simple Maya
    plugin of what you have.
    I initially thought that was what you did, since the screenshot looked
    nice, but realized I was just getting way ahead in my mind...
    Nicole has the example hello world, and I can send it to you again
    Monday, but I'm away from work right now...
    So even before you get to control the model, just get the code to work
    in Maya's framework, the same way you have it now.
    This process will answer a lot of the questions you posed, like how
    curves are stored in Maya, etc.
    I'll send a link to the documentation.
    But great job so far!
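
    (For reference, the canonical Maya Python "hello world" command plugin
    looks roughly like the sketch below; the command name is just a
    placeholder:)

    ```python
    import maya.OpenMayaMPx as OpenMayaMPx

    class HelloWorldCmd(OpenMayaMPx.MPxCommand):
        def doIt(self, argList):
            # Runs whenever the command is invoked from MEL or Python.
            print("Hello World!")

    def cmdCreator():
        return OpenMayaMPx.asMPxPtr(HelloWorldCmd())

    def initializePlugin(mobject):
        # Called by Maya when the plugin is loaded.
        OpenMayaMPx.MFnPlugin(mobject).registerCommand("helloWorld", cmdCreator)

    def uninitializePlugin(mobject):
        # Called by Maya when the plugin is unloaded.
        OpenMayaMPx.MFnPlugin(mobject).deregisterCommand("helloWorld")
    ```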

    It will be super cool to have that exact same editor for the
    beta-splines that you have, inside Maya, and it will be needed for the
    next stage of the project. That is what you are getting at with your
    question, but I would not jump right into that quite yet.
    Then you can start to override curves for the animations Nicole is
    making, but we'll deal with that later.


    Nicole, I think you can do a couple of things on your end. The
    animation looked nice. (Feel free to embed the movie in the blog; I
    almost missed it at first.)
    As the animation is moving, perhaps visualize the curve path for, say,
    the hands (the wrist, for example)...
    So it's the same animation from the link you had, let's say, with the
    curve drawn on top.
    That will take some work, and mucking around in Maya MEL and maybe the
    API.
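
    Something along these lines (Python in the Script Editor) could build
    that curve; the joint name and frame range below are just guesses:

    ```python
    import maya.cmds as cmds

    def trace_joint_path(joint, start, end, degree=3):
        """Sample a joint's world-space position on each frame,
        then build a NURBS curve through those samples."""
        points = []
        for frame in range(start, end + 1):
            cmds.currentTime(frame, edit=True)
            points.append(cmds.xform(joint, query=True, worldSpace=True,
                                     translation=True))
        return cmds.curve(point=points, degree=degree)

    trace_joint_path('L_wrist', 1, 120)  # placeholder joint name and range
    ```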

    The other thing, of course: the gestures need fingers, so you might
    have to add a skeleton, or modify one of the skeletons, to use the
    hands for the animations, then animate with the hands moving. Fingers,
    I imagine, would be important for control and gestures.


    Hopefully you see how these two parts are starting to intersect....
    Omar is getting the beta-splines working in Maya -> <- Nicole is
    visualizing paths from her animations.
    Obviously the beta-spline would control the wrist as the next step....

    (Once that is done, the step after that would be to interface the
    EMOTE parameters with the biasing and weighting of the beta-splines...
    but that is 1 or 2 weeks away... and will require tweaking, like the
    spring constants in 563 when you guys took it.)

    As for the dyadic conversation paper, my interpretation would be:
    once you have the system above working, the FSM/Behavior Tree will
    fire EMOTE parameters, which will control the agents in the
    conversations. The way I look at it, this is a great application for
    your work once it works... Libo is mocking up the Behavior Trees...
    and they will fire the controlling EMOTE parameters as agents
    converse.
    Then imagine your work using these EMOTE parameters and canned
    animations, adjusting the motion based on what is going on in the
    scene.
    So in a month that should intersect very nicely....

  2. Nicole,
    Your animations look good. As for the finger gestures, I think you can
    solve it either by adding more rig controls on the mesh, or with an
    inverse kinematics solution that adjusts the fingers given target end
    points. But I think the rig approach is more direct and viable. You
    may have to modify the current rig, because making a new one would
    cost much more time. For now, just focus on the other main gestures
    and stay productive.

    Omar & Nicole,
    I think you can try to build some interfaces on the human model that
    help the spline model make use of the data in the human model. Leave
    some room for extension. For example, you might want to support both
    the model with finger postures and the one without.

    The 3D spline model looks good too.

    I think Omar raised a good question regarding the relationship between
    the two models: the human motion model and the spline model. As for
    the human model, as you may know, the position of the root joint plus
    the rotations on the root joint and the other joints determine a
    motion. So, joint positions can be calculated from the rotation data.

    For the spline model, however, the information is mainly positional. I
    don't think you should plug rotation info into it. Instead, we can
    calculate all the rotations from the relevant splines.

    For example, let's say A is a point at some time on the spline for the
    hand joint, B is the corresponding point on the elbow joint's spline,
    and C on the shoulder joint's. What we know at first are the
    positional values of A, B, and C, but we can calculate the rotations
    from their relative displacements. For example, the angle(A,B,C) could
    determine the rotation value at the elbow. I think the other joints'
    values can be calculated in the same way.
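
    A quick sketch of that calculation (plain Python; the point names
    follow the example above):

    ```python
    import math

    def angle_at(b, a, c):
        """Angle ABC in degrees: the angle at joint B between A and C."""
        ba = [a[i] - b[i] for i in range(3)]
        bc = [c[i] - b[i] for i in range(3)]
        dot = sum(ba[i] * bc[i] for i in range(3))
        norm = (math.sqrt(sum(x * x for x in ba))
                * math.sqrt(sum(x * x for x in bc)))
        # Clamp to [-1, 1] to guard against floating-point drift.
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

    # e.g. elbow bend from spline samples A (hand), B (elbow), C (shoulder):
    # elbow_angle = angle_at(B, A, C)
    ```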

    Thanks for your contributions. Looking forward to next week's
    progress :)
