Thursday, February 24, 2011

Post Posterson VI, Esq.

Hello, friends. Read. 'Tis a long post. 'Tis very long.

So, this week was all Alpha Review prep. We made a pretty good video, if we do say so ourselves. Which we do.

But before we made a great video, we also did some work. Thanks to some guidance from Pengfei and Joe, we made some good progress this week.

Omar used MEL for the first time this past week! After reading a few online tutorials and trying to figure out how to get C++ code running in Maya, he decided it would be easier to adapt some of his C++ beta-spline interpolation code into a MEL script. So that's what he did, and it worked pretty well. See the image below.



So, the one problem is that this particular curve has 60 edit points when it should really have only 4. That's because Omar hasn't figured out how to make Maya draw a curve with a custom interpolation scheme like the one he has coded. He can specify as many points as he wants, but Maya will only interpolate them at a fixed degree between 1 and 7. He may just have to get C++ and Maya working together. For now, in order to at least visualize the spline in Maya, the script samples 20 points on each segment and connects them with linear interpolation. If anyone knows how to define a custom interpolation scheme in MEL, let Omar know. Thanks.
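Here's roughly what the workaround looks like, as a simplified sketch. This isn't Omar's actual script, and it uses a plain uniform cubic B-spline blend where his beta-spline math would go; the sampling idea is the same either way:

    // Uniform cubic B-spline blend of four control points at u in [0,1]
    // (stand-in for the custom beta-spline basis).
    proc float[] blendPoint(float $p0[], float $p1[], float $p2[], float $p3[], float $u)
    {
        float $b0 = (1-$u)*(1-$u)*(1-$u) / 6.0;
        float $b1 = (3*$u*$u*$u - 6*$u*$u + 4) / 6.0;
        float $b2 = (-3*$u*$u*$u + 3*$u*$u + 3*$u + 1) / 6.0;
        float $b3 = $u*$u*$u / 6.0;
        float $q[] = { $b0*$p0[0] + $b1*$p1[0] + $b2*$p2[0] + $b3*$p3[0],
                       $b0*$p0[1] + $b1*$p1[1] + $b2*$p2[1] + $b3*$p3[1],
                       $b0*$p0[2] + $b1*$p1[2] + $b2*$p2[2] + $b3*$p3[2] };
        return $q;
    }

    // Sample one segment 20 times and connect the samples with a linear
    // curve, since the curve command only accepts its fixed built-in degrees.
    proc drawSampledSegment(float $p0[], float $p1[], float $p2[], float $p3[])
    {
        string $cmd = "curve -d 1";
        for ($i = 0; $i <= 20; $i++) {
            float $q[] = blendPoint($p0, $p1, $p2, $p3, $i / 20.0);
            $cmd += (" -p " + $q[0] + " " + $q[1] + " " + $q[2]);
        }
        eval $cmd;
    }

The real script does this for every segment, so Maya only ever sees one dense degree-1 curve, which is where the 60 edit points come from.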


Nicole added finger joints to the rig. (A few of the hand joints had their names changed, and joints rhand_end and lhand_end were deleted because they were no longer needed.) Yay, fingers!



The plan was:
  • add finger joints to the rig,
  • save the current skin weights,
  • unbind the skeleton from the model,
  • rebind the skeleton to the model (so the fingers can be animated),
  • load the old skin weights (so only the finger weights need painting), and
  • modify the skin weights so the finger joints move the fingers.
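For the save/restore steps, a rough MEL sketch (the mesh, cluster, and joint names are all made up, and a real version would loop over every influence, not just one):

    // Save one influence's weight on every vertex, then reapply it to the
    // new skinCluster after rebinding.
    string $mesh = "body";
    int $nVerts[] = `polyEvaluate -vertex $mesh`;

    float $saved[];
    for ($v = 0; $v < $nVerts[0]; $v++)
        $saved[$v] = `skinPercent -t "lhand" -q -v "skinCluster1" ($mesh + ".vtx[" + $v + "]")`;

    // ...unbind, add the finger joints, rebind (say the new cluster is skinCluster2)...

    for ($v = 0; $v < $nVerts[0]; $v++) {
        float $w = $saved[$v];
        skinPercent -tv "lhand" $w "skinCluster2" ($mesh + ".vtx[" + $v + "]");
    }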

However, Nicole ran into a problem. Nicole unbound the joints; the model's arms and legs, previously in the bind pose, changed position to this:

The rig does still control the model, but moving a joint causes awkward scaling, since the skeleton and the model no longer line up properly. Nicole has tried both smooth bind and rigid bind; neither solves the problem.

Pengfei: Nicole has uploaded the Maya file to the sig computers if you'd like to take a look at it.
The file is called A_001_v8. It’s in sig-share\Projects\2011-RCTA-MARKETPLACE, and it contains the model before the skeleton has been unbound. (If you detach the skin and then bind it again, you get the image above.)
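One thing Nicole hasn't tried yet, in case it's relevant: the scaling problem looks like the joints not being where the bind expects them, so snapping the skeleton back to its stored bind pose before detaching might help. Something like (untested):

    // Untested idea: restore the stored bind pose before detaching the skin.
    select -r `ls -type joint`;
    dagPose -restore -global -bindPose;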

Question: Nicole would also like to know what to do about the skel_driven skeleton (the one she is not animating with). Does she need to add finger joints to that one as well and link it to the main skeleton? She hasn't played around with it, since trying to re-rig two skeletons could create twice as many problems, especially when one is linked to the other. (And she's not even sure we'll need that skeleton.)

For next week:
  • Omar will hopefully be able to solve his interpolation problem in Maya. He'll try to get interactive manipulation of the curves working in Maya like he has in his 2D GUI, but he's not sure how far he'll get on that this week. There are a lot of things to do before spring break.
  • Nicole will hopefully be able to solve the problem with binding the rig, and will be able to work on the animations that involve fingers.
  • Nicole will also start reading up on MEL and Maya plugins. The next step is to draw out the curves in Maya for one joint (specifically, the translation of the joint over time), which sounds like a job for MEL; a first sketch is below. Once Nicole gets this done, she can give the control points of the curve to Omar, who can modify them with his curve editor. Drawing one curve is just a test; eventually, we want a curve for every joint.
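Here's that first sketch (untested; the joint name and frame range are placeholders):

    // Build a curve tracing one joint's translation over time:
    // one point per frame, connected linearly. A smarter version would
    // fit a sparse spline instead of sampling every frame.
    proc drawJointPath(string $joint, int $start, int $end)
    {
        string $cmd = "curve -d 1";
        for ($f = $start; $f <= $end; $f++) {
            float $tx = `getAttr -time $f ($joint + ".translateX")`;
            float $ty = `getAttr -time $f ($joint + ".translateY")`;
            float $tz = `getAttr -time $f ($joint + ".translateZ")`;
            $cmd += (" -p " + $tx + " " + $ty + " " + $tz);
        }
        eval $cmd;
    }

    // e.g. drawJointPath("rhand", 1, 120);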
That's it for this post. See you all tomorrow at the alpha review.

Tender-Lovingly yours,

Nelskati

Friday, February 18, 2011

Mambo #5 (let mambo = "post")

So I bet you're wondering what we did this week.

Nicole took a closer look at the rig in Maya, with some help from Pengfei. The skel_driven skeleton is controlled (driven) by the other skeleton. Imagine that! As for why there are two skeletons: perhaps different programs require different joint conventions. This allows for more flexibility when exporting the data. Nicole decided to animate from the non-skel_driven skeleton since it seems to control both skeletons.
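(For the curious, that driven relationship is easy to confirm in the Script Editor with something along these lines; the joint name is a guess based on the naming convention we saw:)

    // If a skel_driven joint really is driven, its channels should have
    // incoming connections from the corresponding joint on the other skeleton.
    listConnections -source true -destination false -plugs true skel_driven_rhand.rotate;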

Nicole blocked in the animations that do not involve fingers, since the skeleton does not have individual fingers rigged. (Plus a few of the gestures that do involve fingers, minus the finger movements.) The gestures all start from the same rest pose, with hands in front of the agent, per Dr. Badler's suggestion. At the moment they do not all end at the same rest pose, since she's not sure how we plan to blend between successive animations, and a few (like "give up") probably wouldn't end in that rest pose... but she can add the rest pose at the end of each animation if she needs to.

The animations can be viewed here. Right now they are just playblasts; she can replace them with renders if needed.

Omar extended the 2D GUI to allow for interactive manipulation of the alpha-values on each segment of the beta-spline. Then he created a new GUI to let him visualize beta-splines in 3D (mostly a test to make sure the code did not need to change too much for 3D visualization). Here's a screenshot:

This GUI currently takes in a .txt file giving the number of control points followed by the x,y,z coordinates of each. He changed this from the initial interactive control-point placement because, eventually, he will be receiving control-point data from Nicole and using it to create splines, so he wanted his code to accept input in that form from the start. He has also started transferring his 2D interactive shape controls to the 3D version.
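When this moves into Maya, reading the same file format from MEL could look roughly like this (untested; it assumes whitespace-separated values with one point per line):

    // Read a control-point file: first line is the point count,
    // each following line is "x y z". Returns a flat float array.
    proc float[] readControlPoints(string $path)
    {
        float $pts[];
        int $fid = `fopen $path "r"`;
        int $n = (int)strip(`fgetline $fid`);
        for ($i = 0; $i < $n; $i++) {
            string $tok[];
            tokenize(strip(`fgetline $fid`), " ", $tok);
            $pts[3*$i]     = (float)$tok[0];
            $pts[3*$i + 1] = (float)$tok[1];
            $pts[3*$i + 2] = (float)$tok[2];
        }
        fclose $fid;
        return $pts;
    }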

For next week...
-Nicole isn't quite sure what to do, other than refining some of the animations. Should she add finger joints to the skeleton to animate the other gestures? Should she start working on a way to connect her part with Omar's? (She's a bit stuck on where to start on that.)

-Omar is going to start trying to figure out how to use his splines to control a human model in Maya. Each joint should have its own spline representing its motion. He's not sure whether that means each control point must store rotation as well as translation data. Also, any tips on how to start attaching the splines to a Maya model? He's not quite sure where to start on that either.
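One half-formed idea for getting started, for whatever it's worth: once a motion curve exists in Maya, pointOnCurve can sample it and setKeyframe can bake the samples onto a joint. A totally unverified sketch (it assumes the curve is parameterized 0 to 1, e.g. after a rebuildCurve, and it sidesteps rotation entirely, which is probably the harder half):

    // Sample a curve at evenly spaced parameters and key a joint's
    // translation from the sampled positions. Names are placeholders.
    proc bakeCurveToJoint(string $curve, string $joint, int $start, int $end)
    {
        for ($f = $start; $f <= $end; $f++) {
            float $u = (float)($f - $start) / (float)($end - $start);
            float $p[] = `pointOnCurve -pr $u -p $curve`;
            setKeyframe -time $f -attribute "translateX" -value $p[0] $joint;
            setKeyframe -time $f -attribute "translateY" -value $p[1] $joint;
            setKeyframe -time $f -attribute "translateZ" -value $p[2] $joint;
        }
    }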

So, a little stuck is Nelskati. But with some help, I'm sure we'll be right back on track.

With lots and lots of Lotsa Love,

Nelskati

Thursday, February 17, 2011

Clarification about dyadic conversation paper (not the weekly post)


I read the dyadic conversations paper. Here’s how I think it ties into the work that Omar and I are doing.

The goal is to make a conversation engine that is efficient while still being realistic. These conversations focus on gestures rather than words. They are to be used with background characters.

The conversation is controlled by a finite state machine. At each tick, the FSM determines whether the current action is done. If it is, it determines the next action through a mathematical calculation that depends on the current stage of the conversation and the agent’s relationship with the person he’s talking to. The FSM then grabs the correct action from a set of “base” actions (Nicole’s animations). Depending on the agent’s emotional state, this base action may be modified (through control curves, for example; Omar’s work). A conversation ends whenever a variable, which can be thought of as an “end conversation” variable for simplicity, exceeds a certain threshold.
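To check my understanding, here's that loop as a small MEL sketch. Everything in it is invented (the action names, the threshold, and especially the scoring line, which is only a placeholder for the paper's actual formula):

    // One FSM tick: keep playing, end the conversation, or pick the next
    // base action from the stage and the relationship. All values invented.
    global proc string conversationTick(string $current, float $progress,
                                        float $stage, float $relationship,
                                        float $endVar)
    {
        if ($progress < 1.0)
            return $current;                 // current action not done yet
        if ($endVar > 0.8)                   // "end conversation" variable past a threshold
            return "END";
        string $baseActions[] = { "nod", "beat", "point", "shrug" };
        int $idx = ((int)($stage * 10 + $relationship * 10)) % size($baseActions);
        return $baseActions[$idx];           // emotion modifiers would then adjust this
    }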

Is that correct?

-Nicole

Thursday, February 10, 2011

post post post POST

So I bet you're wondering what we did this week. No? Well, you can read anyway. Or leave. The choice is yours. Make the right one.

So:
Nicole started playing around with the model. She had to install Maya 2010, since the files couldn't be opened in Maya 2008. Here's a short video of our test gesture: the hand wave. She also ran into some problems with the rig.

Problem One: The rig has two skeletons. Which one should we use? Here are screenshots of the two: the model with both skeletons showing and the skeletons viewed in the hypergraph.
 
They appear to have the same naming convention except one has skel_driven appended to the front. We asked Pengfei about it but he wasn’t sure which one to use.

Problem Two: A few of the joints in the rigged model are constrained in their rotation. For example, the elbow can only rotate about the x axis, not the y axis. I realize that the elbow should only have one rotational degree of freedom, but as it is, I think the axis might be wrong. When I unlock the y axis and rotate about it, things like this happen: screenshot here.

It appears that one skeleton is getting keyframes set and the other isn't. How can we fix this? Other than keying both skeletons, which is what we're doing now... it works, but it seems a bit pointless and means we're setting twice as many keyframes as we need to. (And if we forget to key one of the skeletons, weird things start happening.) The wrist has the same problem.
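(A tiny MEL helper would at least make the double-keying less error-prone; the joint naming here is guessed from what we saw in the hypergraph:)

    // Key the same channel on both skeletons in one go.
    proc keyBoth(string $joint)
    {
        setKeyframe ($joint + ".rotate");
        setKeyframe ("skel_driven_" + $joint + ".rotate");
    }
    // e.g. keyBoth("relbow");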

Problem Three: the fingers aren’t individually rigged. A few of our gestures involve individual fingers. Does this mean I need to use a different rig, or do you want us to ignore these gestures for now?

Omar got B-splines working for real now. Then he modified the GUI to allow for the interactive input of control points as opposed to interpolation points. And then he got beta-splines working. All in one week! Here's a visual comparison of the paper and my results:
(Beta-parameter tests)

And here's another picture of the local parameters at work (note the "kink" in the curve segment):

Currently, all the parameters are hard-coded. I've begun to expand the GUI to let the user modify them easily. I will also start to extend the splines to 3D. It seems the code already accounts for a Z-coordinate, but the values are all 0, so I'll see whether I actually have to change the code or just make the GUI display in 3 dimensions.
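For reference (and as a sanity check on the implementation), the cubic beta-spline segment has the form below. This is the standard Barsky formulation as best I remember it, so double-check the exact polynomials against the paper before trusting them:

    Q_i(u) = \sum_{r=-2}^{1} b_r(\beta_1, \beta_2; u) \, V_{i+r}, \quad u \in [0,1]

where, with \delta = 2\beta_1^3 + 4\beta_1^2 + 4\beta_1 + \beta_2 + 2,

    \delta \, b_{-2}(u) = 2\beta_1^3 (1-u)^3
    \delta \, b_{-1}(u) = 2\beta_1^3 u(u^2 - 3u + 3) + 2\beta_1^2 (u^3 - 3u^2 + 2) + 2\beta_1 (u^3 - 3u + 2) + \beta_2 (2u^3 - 3u^2 + 1)
    \delta \, b_{0}(u)  = 2\beta_1^2 u^2 (3 - u) + 2\beta_1 u (3 - u^2) + \beta_2 u^2 (3 - 2u) + 2(1 - u^3)
    \delta \, b_{1}(u)  = 2u^3

Setting \beta_1 = 1, \beta_2 = 0 gives \delta = 12 and everything collapses to the uniform cubic B-spline basis, which is a handy test case. Note that \beta_2 (tension) already appears in \delta and in each b_r, which might be why the separate tension equation looked redundant in practice (see the question below).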

A question: Have I implemented everything in the paper that I need to? I implemented Section 3 (not 3.1) and Sections 4.0, 4.1, and 4.2. There is an equation in Section 4.3 (tension) that seemed necessary on reading but unnecessary in practice (the tension already seemed to be accounted for by the equation in Section 4.0). Am I missing something there?

And another question that should have an obvious answer: is there a way to respond to specific blog comments? We couldn't figure it out.

Anyway, NEXT WEEK:
  • Nicole is going to animate the rest of the gestures (possibly minus the ones that are finger-dependent).
  • Omar will continue to make improvements to the GUI.
Once we complete these goals, we're both actually a bit unsure of what the next step is. We'll email Joe when we get there. Or you can email us whenever you want, Joe.

Another week down. Till the next one.

Love is a many-splendored thing,
Nelskati

Thursday, February 3, 2011

First post after the first post after the first post!

So, wanna know what we did this week? Yeah, you do.

Nicole met a stupid girl who gave her the flu. Boo to you, stupid flu girl.
Needless to say, Nicole was not very productive.

Omar was not sick. And he finally got his B-spline code working! (We think; the end control points look a little odd, but he'll keep looking at that.)

He also began to read the paper on beta-splines to see how he can extend his current code. Beta-splines are generalizations of B-splines that allow for local control over each segment of the spline without affecting the adjacent segments. (That sounds difficult to implement...)

Together, we looked through the 675 hand gestures in the gesture database. We chose gestures that were specific to the Middle East and Saudi Arabia, as well as some that are considered universal. We narrowed them down to 10 gestures, which you can see right here. Here's an example image, for those too lazy to click the link:


(We hope we didn't offend anyone with that gesture)

We looked for two kinds of gestures: 1) gestures that are foreign and novel for Americans, and 2) gestures that we use in the United States that have different (and often more negative) connotations in the Middle East. We figured this would be useful since part of the goal is to use the Marketplace as a training simulation for the U.S. military.

So, for next week:
  • Nicole will be healthy.
  • Omar will not have caught the flu from Nicole.
  • Nicole will animate the test gesture (hand wave) and document the steps she takes to animate it per Joe's suggestion. If she gets the OK from Joe, she'll start to animate the gestures we picked out from the database.
  • Omar will continue to read about beta-splines and hopefully understand the paper. He'll start to try to implement them in 2D.
  • Omar and Nicole will not go to Feb Club events.
'Tis all for this week.

Peace and Love and Double Rainbows and Double-Precision Floating Points,

Nelskati