Monday, May 9, 2011

Last post?? :(

Nelskati hasn't seen you in a few weeks. Nelskati has missed you. :(

So I bet you're wondering what Nelskati has done since you last talked to Nelskati.

Nicole was able to piece together the conversation from the input Libo provided (which contained the keyframe time, emotion, agent, and gesture name). Here's a video of the script running:


And here is the resulting conversation:


At the moment this conversation has no emotions.

We have started finding the beta-spline parameters for each emotion. We also added textures to the models.
This video shows the result of the angry_fail conversation, with emotions. The video is lagging a bit, but if you go to 0:45, for instance, you'll see that some of the gestures have been tweaked/sped up.

Omar created a GUI that streamlines the interaction between our two parts and makes it easy to infuse gestures with emotion. Clicking the 'Load GKP' button opens the scene file containing the textured agent model. The user is then prompted to select the .gkp file corresponding to the gesture he/she wants to modify, which creates the set of beta-splines for that gesture. The GUI has 3 panes:
1) "all-joint controls" that adjust the bias, tension, speed, and number of divisions of all the curves at once
2) individual joint controls (bias, tension, speed)
3) a pane containing 3 buttons--'Play New Gesture', 'Write to File', and 'Load New GKP'

(*Omar added the number of curve divisions as an interactive option because he noticed that, when the division count was too high, some gestures became too fluid and lost the key motions that made the original gesture recognizable. So a slider for divisions has been really helpful.)
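(For the curious, here's a rough Python (maya.cmds) sketch of the same three-pane layout. The real tool is MEL, and the slider ranges, control names, and callbacks below are placeholders rather than the actual implementation.)

    import maya.cmds as cmds

    def load_gkp(*_):
        print("TODO: prompt for a .gkp file and build its beta-splines")

    def play_new_gesture(*_):
        print("TODO: re-key the gesture from the current slider values and play it")

    def write_gkp(*_):
        print("TODO: ask which emotion this represents and write a new .gkp file")

    def build_gesture_gui():
        win = cmds.window(title="Gesture Emotion Tweaker")
        cmds.columnLayout(adjustableColumn=True)
        # Pane 1: all-joint controls, applied to every curve at once
        cmds.floatSliderGrp("allBias", label="Bias (all joints)", field=True, minValue=-5.0, maxValue=5.0, value=0.0)
        cmds.floatSliderGrp("allTension", label="Tension (all joints)", field=True, minValue=0.0, maxValue=10.0, value=1.0)
        cmds.floatSliderGrp("allSpeed", label="Speed (all joints)", field=True, minValue=0.25, maxValue=4.0, value=1.0)
        cmds.intSliderGrp("allDivs", label="Divisions", field=True, minValue=2, maxValue=40, value=20)
        # Pane 2 (per-joint bias/tension/speed sliders) omitted for brevity
        # Pane 3: the three action buttons
        cmds.button(label="Play New Gesture", command=play_new_gesture)
        cmds.button(label="Write to File", command=write_gkp)
        cmds.button(label="Load New GKP", command=load_gkp)
        cmds.showWindow(win)

    build_gesture_gui()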

The user can tweak the beta-splines and then hit the 'Play New Gesture' button to immediately key the new gesture and play it. This lets the user see right away how his/her curve manipulations affect the animation. Once the user finds a combination of parameters that fits the desired emotion, he/she can hit 'Write to File', select which emotion the new data represents, and the program writes it out to a new .gkp file. Here is a short video that illustrates how it works:


Omar was also able to determine the beta-spline and speed parameters that give certain gestures the look of being infused with an emotion (happy, angry, sad). However, it is clear that there is no single set of parameters that produces "angry" for every gesture, for example. In fact, we can barely use the bias parameter at all, because it introduces error that is hard to handle in our current framework: emotional gestures no longer start and end in the same position as their neutral counterparts, which creates problems when stringing them together into conversations.

Omar has found that speed is a very large factor in conveying emotion: faster looks angrier, slower looks sadder. Happy is a difficult emotion to achieve with the beta-spline parameters alone. An increase in speed seems to suggest happiness, but without facial animation, happy and angry become difficult to tell apart. Also, some gestures connote a certain emotion to begin with, so infusing them with an emotion of the opposite connotation is very difficult. Still, for many of the gestures currently in the database, halving the speed is enough to make the agent look sad, and doubling the speed is enough to make the agent look angry. Repeating a gesture several times in a row also increases the appearance of anger.
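As a rough illustration of that speed rule of thumb, here's a hedged Maya Python sketch that retimes a gesture's keys with scaleKey. The joint names and the emotion-to-scale mapping are placeholders, and this isn't necessarily how our tool applies its speed parameter.

    import maya.cmds as cmds

    # Observed rule of thumb: sad = half speed (keys stretched to 2x length),
    # angry = double speed (keys squeezed to half length). Purely illustrative.
    EMOTION_TIME_SCALE = {"sad": 2.0, "neutral": 1.0, "angry": 0.5}

    def retime_gesture(joints, emotion, start_frame=0):
        """Stretch or compress every key on the given joints around start_frame."""
        scale = EMOTION_TIME_SCALE.get(emotion, 1.0)
        for joint in joints:
            cmds.scaleKey(joint, timeScale=scale, timePivot=start_frame)

    # Example call (joint names are hypothetical):
    # retime_gesture(["rshoulder", "relbow", "rwrist"], "angry")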

Well, we've submitted our final code, and now we're done! (although it doesn't feel like it) We can always keep working to improve what we've built, but unfortunately we are out of time. Still, we have accomplished a great deal. And Nelskati has had a lot of fun. But, of course, we couldn't have done this without the help of Joe, Norm, Pengfei, Libo, and Amy.  They have all been incredibly helpful and supportive.

Also, congrats to all our fellow classmates on a job well done! You all did amazing work.

Till we meet again.

With our undying love,

Nelskati

Thursday, April 21, 2011

(Un)lucky Post #13

So I bet you’re wondering what we did this week.

Nicole changed the way she was dealing with the keyframed data. Instead of storing the path that each joint follows, she's now storing each joint's rotation values. This makes it much easier to reconstruct the animation curves, since she doesn't have to deal with converting between beta-splines and Maya's NURBS implementation. Nicole can output a file with all of the keyframed rotation values, as well as read a file of rotation values and create a keyframed animation from it. Omar is still working on modifying the curves, so Nicole tested her code with an unmodified file. As you can see, the model had no keyframes initially. When the code was run, it output the animated gesture:
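(Aside for the technically curious: the read-and-keyframe step boils down to something like the sketch below. The file layout and joint names are simplified placeholders rather than our exact format.)

    import maya.cmds as cmds

    def apply_rotation_file(path):
        """Read 'joint frame rx ry rz' lines and key each joint's rotation."""
        with open(path) as f:
            for line in f:
                tokens = line.split()
                if not tokens:
                    continue  # skip blank lines
                joint, frame = tokens[0], float(tokens[1])
                rx, ry, rz = [float(t) for t in tokens[2:5]]
                for attr, value in zip(("rotateX", "rotateY", "rotateZ"), (rx, ry, rz)):
                    cmds.setKeyframe(joint, attribute=attr, time=frame, value=value)

    # apply_rotation_file("wave_neutral.gkp")  # hypothetical file name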



She also animated two new gestures upon Libo’s request: an emphasis gesture (called beat) and a goodbye gesture.





QUESTION: Does anyone know what the goodbye gesture is in Middle Eastern countries? Nicole could not find any information about it. She looked online and in the gesture database. She animated a "Western" goodbye gesture as a placeholder, but she isn't sure if this gesture is universal.

Tomorrow, Nicole is going to work on connecting her code to Libo’s output.

Omar had some code that was working really nicely when Nicole was passing him joint translation points. Now that keyframe points are being passed to him instead, he has had to change his code quite a bit. He had been operating under the assumption that Nicole would always pass control points into his system; now, the keyframes are actual points on the curve that he has to generate. Consequently, he has had to add the extra step of solving for the control points from the given keyframe points, after which he can interpolate using those control points. This pushed him back to C++, because solving for control points means solving a matrix equation (i.e., solving for c in Ac = d, given A and d), and he did not want to adapt the math::matrix C++ code to MEL.
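Conceptually, the extra step looks like this sketch (NumPy instead of Omar's actual math::matrix C++ code, and with a made-up system matrix purely for illustration; in the real code, A comes from the beta-spline basis):

    import numpy as np

    def solve_control_points(A, d):
        """Solve A c = d for the control points c.

        d: (n, 3) array of known on-curve (keyframe) points.
        A: (n, n) matrix of basis weights at those keyframes' parameter values
           (the beta-spline basis in the real code; any invertible A works here).
        """
        return np.linalg.solve(A, d)

    # Toy 3x3 system, made up just to show the shape of the problem:
    A = np.array([[1.0, 0.0, 0.0],
                  [0.25, 0.5, 0.25],
                  [0.0, 0.0, 1.0]])
    d = np.array([[0.0, 0.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [3.0, 1.0, 0.0]])
    c = solve_control_points(A, d)  # control points that reproduce d under A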

Though this has been a tedious process, it is working relatively well. He has figured out how to get his existing MEL code to talk to the new C++ code he has been writing. Currently, only the curves are being drawn; he has not gotten to reimplementing the beta parameters yet because he has run into some issues with the curves themselves. When a joint is stationary (i.e., not rotating) for a period of time within the motion, the points on the curve start to drift. He doesn't know if this is a Maya issue or if it comes from passing duplicate points into his own code. He is continuing to investigate.

Once the investigation is complete, he can add the interactive beta-parameter controls back in and move forward with figuring out how the beta parameters affect the performance of the gestures.

That's what we got. Soon we'll make a poster and prepare for the presentations!

Love,

Nelskati

Tuesday, April 19, 2011

Nel has questions (not the weekly post)

Nicole had some questions. She was planning to send an email to Pengfei, but the email was getting kind of long, so she's posting here instead:

"I can now read Omar’s input files for one joint. (I haven't generalized it to all joints yet.) I'm trying to construct the new animation for this joint, but I’m a bit confused on how to do this.

Edit: The points that Omar passes to me are the control points, not the points on the curve.

Idea 1: keyframe each joint using the points that Omar passed in
Problem: how do we know what to keyframe (translation, rotation, etc) on each joint? The points passed in are the joint’s position in world space. Keying the translation won’t work, since only the root can translate. Keying the rotation might work, but how would we determine the rotation value for each joint? Also, I don’t think we know the timing of the animation after I get the points from Omar. (I passed Omar 200 control points; he passed me 187.)

Idea 2: create a new curve, then do the equivalent of the “Animate – motion path – attach to motion path” command in Maya
Problem: I don’t know if you can apply this to a joint with locked translation, or if it automatically applies to the root joint. When I tried it on a test skeleton in Maya, it defaulted to the root joint, and there were unwanted rotations/twists of the figure around the motion path.

Idea #2 is what I’m trying now, but I'm getting a strange discontinuity again when I draw the curve.

Screenshot of the drawn curve:



(Note: I moved the figure so you could see the curve.)

It appears to have the correct curve for the head joint, but there is some kind of discontinuity(?) that’s causing the vertical line to be drawn. I’ve double-checked the list I’m drawing from, and I believe that there are no duplicate points adjacent to each other. (That’s what caused the discontinuity in my previous code.)

I'm also beginning to think that the first idea would be much simpler, if I can figure out how to get each joint's x/y/z axis rotation values from their global position. Any suggestions?"

Thanks,
Nicole

Thursday, April 14, 2011

On the 12th day of Postmas

Wanna know what we did this week? Ok.

Nicole animated some additional gestures that Libo requested: yes/no head gestures, fidgeting, giving an item to someone, and receiving an item from someone.

They can be found here: http://www.youtube.com/watch?v=MQUcJTJwqu4
(She would embed the video, except YouTube changed their layout slightly and Nicole can't find the new embed link.)

She also tweaked the animations so they all begin and end in the same pose (which should simplify the transitions between them) and ran her script on each of them so she could give the control point files to Omar. This is part of the pre-computation, and only needs to be done once for each animation, unless the animation is changed.

Omar was able to get his scripts to draw the translation curve of every joint from the .gcp file passed to him by Nicole. He also modified the GUI to include just Global Bias and Tension for each joint, all in one scrollable window. Here is a screenshot of what it looks like with all the joint motions drawn out:


The reason this wasn't working before was the number of UIs the script was originally generating. Currently, it still "draws" joints that do not move. This has not caused any issues, though.

Omar is currently in the process of writing a script that will take the curves he's drawn and output a new file for Nicole to read. It will probably be another .gcp file, since he is passing back the same kind of information. However, it seems that there may be fewer control points passed back to Nicole than were passed to Omar in the first place. He's not sure if this will affect the length of the animation. He will find out soon.

For next week...
- Nicole plans to read up on Python I/O so she can connect her part to Omar's, and finish the last few gestures that Libo requested. It would be ideal if she could also find a way to re-generate the animation curves after Omar modifies the beta splines. She has a few ideas on how she could do this in Maya.
- Omar plans to finish the script he is currently working on so that his and Nicole's parts can continue to connect. He may need to find a way to ensure that the same number of control points (CVs) are being passed back to Nicole.
- If we can get these parts connected soon, then Nelskati will also start looking at the files that Libo has given us to generate conversations. We will figure out the best way to parse them and get all our necessary information, and then pass this into the system that we have created.

Nelskati is too tired to think of something witty tonight.
Beam me up, Nelskati.

Thursday, April 7, 2011

Post Number Onety-one

So I bet you’re wondering what we did this week.

Nicole fixed the discontinuity on the line. Then she extended her code so it draws the joint paths for all joints. Here’s a video of the results:




As you can see, it generates curves that follow each joint’s position over time. (It did generate curves for the head as well, though they’re currently overlapped by the model.) The script itself is a bit slow since it hasn’t been completely optimized, so the part of the video where the script was running was sped up.

This script also outputs a .gcp file with the joint names, number of control points for each joint, and a list of control points for each joint. (.gcp stands for gesture control points, if you were curious.) This file is passed to Omar, who reads it in, chooses a set number of control points, and modifies the curve. But I’ll let Omar give you more details on that.
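In case the format is hard to picture, here's a tiny sketch of what writing such a file could look like (the real layout may differ in the details):

    def write_gcp(path, joint_curves):
        """joint_curves maps joint name -> list of (x, y, z) control points."""
        with open(path, "w") as f:
            for joint, points in joint_curves.items():
                f.write("%s %d\n" % (joint, len(points)))
                for x, y, z in points:
                    f.write("%f %f %f\n" % (x, y, z))

    # write_gcp("wave.gcp", {"head": [(0.0, 17.2, 0.1), (0.0, 17.3, 0.2)]})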

Nicole is currently working on a script that reads the .gcp file that Omar's code generates, but she's having trouble finding good I/O documentation for Python. She can read lines or a specific number of characters, but what if she wants to read until she hits whitespace? How would she do that?

testline 1.0 -2.0 3
For example, she wants to grab the following strings from the line above: 'testline', '1.0', '-2.0', '3'.
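One standard answer, assuming the lines are whitespace-delimited like the one above: read the file line by line and let str.split() break each line on runs of whitespace:

    with open("input.gcp") as f:      # hypothetical file name
        for line in f:
            tokens = line.split()     # ['testline', '1.0', '-2.0', '3']
            if not tokens:
                continue              # skip blank lines
            name = tokens[0]
            values = [float(t) for t in tokens[1:]]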
-----
Omar wrote a MEL script that reads the .gcp file that Nicole generates, takes roughly 20 of the control points passed in, and outputs the curve as a beta-spline. It can do this for a .gcp file with data for multiple joints. Here is a screenshot of 3 joint translation curves as beta-splines from one .gcp file that Nicole generated (if they look familiar, it's because they look a lot like the curves that Nicole drew, which is a good thing):
One problem Omar encountered: when he tried to run this on the file with all of the joints, the script stopped running after about half of them. It may be because too many GUIs were being generated, which leads to his next problem/decision. Currently, a GUI appears for each joint (as seen above) with sliders for Global Bias and Global Tension, as well as sliders for Local Bias and Tension for each segment of the spline (with 20 control points --> 17 segments). So each GUI is huge and holds a lot of information. Although the local parameters allow much more detailed manipulation, Omar is not sure it is worth storing all that information. And with almost 20 segments, the change from a local manipulation is not as noticeable as it is on a curve with 6-12 control points. If anyone has an opinion on this matter, please let him know.

For now, Omar will probably comment out that code to have just one UI with a list of the global parameters for each input joint appear after the .gcp file has been parsed. That may be more pleasing to the eye and to Maya.

Another small issue is that not all of the joints move, yet they are all passed to Omar in the .gcp file. So there are a lot of extraneous, "invisible" beta-splines drawn, and their control points are stored even though they are all identical. This seems like a waste, but Omar has not found an efficient way for the script to know to skip those joints. The fix may have to come from Nicole's end.

For next week…
- Nelskati will decide whether non-moving joints should be skipped when writing the .gcp file or when reading it.
- Nicole will continue work on reading input files. Then she’ll start creating a new animation from the points on the modified curve that Omar passes to her.
- Omar will write a script to write a new .gcp file with the new data generated by manipulating the beta parameters. He will also change the GUI design (for now) to just include global parameters.
- If Nicole and Omar can both read and write each other's files by mid-week, then Omar will be able to tweak the beta-spline curves and pass them to Nicole, and she can hopefully play them back in the animation so we can see whether adjusting the curves creates the appearance of a certain emotion! (This may not be possible within the scope of this next week, but we are very close.)

So, we're excited. We're getting much closer to the emotion aspects of this project. We hope it works!

Until time next brings us together.

The grass is (sometimes) greener on the other blogs (so we might have to counter by making our blog background green),

Nelskati

Monday, April 4, 2011

Beta Post

So, we just had our beta-review. It was very informative.

To start, here are links to two videos which we showed to Norm and Joe at the review:

Beta-Spline Creation/Manipulation in Maya:
http://www.youtube.com/watch?v=cqRP3GKNFoA
 This video shows a UI that Omar created in Maya with MEL. It takes in control points and outputs a beta-spline. The spline can be interactively manipulated with the bias and tension sliders.

Drawing Joint Translation Curve in Maya:
http://www.youtube.com/watch?v=2aq11vV5tqc
This video shows the result of a Python script that draws a single joint's translation over time in Maya. There is a discontinuity at the end of the curve, and we think we figured out why. At the moment, the curve is drawing the position of the joint at each frame. After frame 130 or so, the joint quits moving. This causes the same point to be passed in many times, which creates a discontinuity in the curve. Nicole thinks an easy fix would be to check when the first and last keyframes are set for each joint, then "cut" that section from the position list. Which leads to a question: Does anyone know the Python command to query if there is a keyframe at a certain frame?
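One command that looks like it should do the trick (we haven't actually tried it yet, so take this sketch with a grain of salt) is cmds.keyframe in query mode:

    import maya.cmds as cmds

    def has_key_at(node, frame):
        """True if the node has a keyframe exactly at the given frame."""
        count = cmds.keyframe(node, query=True, time=(frame, frame), keyframeCount=True)
        return bool(count)

    def key_range(node):
        """(first, last) keyframe times for the node, or None if it has no keys."""
        times = cmds.keyframe(node, query=True, timeChange=True)
        return (min(times), max(times)) if times else None

    # Only sample the joint's position between its first and last keys:
    # first, last = key_range("head")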

What we learned:

One of the main issues we came across is that we are using two different types of curves. Maya uses NURBS curves, whereas Omar is using beta-splines. Since Maya provides no way for us to define the curve interpolation differently, we need to convert the NURBS curve data into beta-spline control point data.

Since our animation curves generate such a large number of control points, Joe and Norm suggested that we export the NURBS control points in Maya at a uniform timestep of our choosing, producing somewhere between 10 and 25 control points (this range may change if we find we need more of them). These control points will then be passed to Omar's beta-spline creator, which will generate a beta-spline to be manipulated. After manipulation, we will determine the new set of points on the curve and pass them back to Maya to then be used as the new animation curve.
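In code, that pre-processing step could be as simple as uniformly subsampling the per-frame point list (a quick sketch; the target of 15 points is just an arbitrary pick inside the suggested 10-25 range):

    def subsample(points, target_count=15):
        """Pick roughly target_count points from the per-frame list at a uniform step."""
        if len(points) <= target_count:
            return list(points)
        step = (len(points) - 1) / float(target_count - 1)
        return [points[int(round(i * step))] for i in range(target_count)]

    # sparse = subsample(per_frame_positions, 15)   # ~200 points in, 15 out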

The first part--taking the 200-something control points from the NURBS curve and outputting a list of control points for a beta-spline--will be a pre-processing step. We are treating the base animations that Nicole created as an animated gesture database. Since they should not change (theoretically), we can also store the lists of control points that correspond to each gesture in a similar database.

Now, we're feeling pretty confident that we can get our parts working together and working well. Hopefully, we'll have some good progress to show by Thursday's post!

Until then.

From Russia with Love,

Nelskati

Thursday, March 31, 2011

Top 10 reasons why Nel should not make post titles. #10...

So I bet you're wondering what we did this week.

Nicole continued to familiarize herself with Python in Maya. The Maya documentation on Python commands was very useful. She can now write basic python scripts that do things like drawing curves, creating objects, and getting/setting attributes of objects.

She did run into one problem. She was able to get a joint's position at a certain time with cmds.getAttr('joint_name.translate', time = t). However, this returns the joint position in local coordinates, which never changes. (This makes sense, because bones can't stretch.) Is there any way to get the position in global coordinates, aside from matrix multiplication, which would likely be too expensive to compute at every frame? Some commands have a 'global' parameter, but getAttr doesn't, so she's not sure how to apply it here.
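One possibility worth trying (untested, so just a sketch): cmds.xform can query world-space translation directly, and stepping the current time frame by frame avoids doing the matrix math by hand:

    import maya.cmds as cmds

    def world_position(joint, frame):
        """World-space position of the joint at the given frame."""
        cmds.currentTime(frame, edit=True)
        return cmds.xform(joint, query=True, worldSpace=True, translation=True)

    # path = [world_position("rwrist", f) for f in range(0, 200)]  # joint name is hypothetical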

Once she figures out the world/local coordinate problem, she should be able to draw the translation curve along one joint, then extend it to the other joints.

Moving forward, any tips on how to compute the control points once we have the exact points along the curve?

Omar took time to try to figure out the best way to manipulate beta-splines in Maya. After working with the Maya C++ API, he found that MEL scripting may still be the better option. He made progress on the MEL UI, thanks to the help of the tutorials that he posted last week. Here's a screenshot of part of the UI:

Currently, a first window pops up, asking the user to specify the number of control points. This then brings up the Control Points window above, with the number of float field groups that the user previously specified. The user can input the x,y,z values for each control point. Hitting the Draw Spline button does exactly that. The above screenshot shows what it looks like after hitting Draw Spline. Also, if the user were to input new values into the window, it would just update the curve, not create a new one.

Next, Omar needs to add a way to manipulate the beta and/or alpha values of the curve (probably float sliders). Then, he would be about at the stage where he could use Nicole's control points. Right now, the control point input is interactive. Omar is still not sure exactly how the control points will need to be input/retrieved from Nicole's part, since Nicole is not exactly sure how to get them yet. They will figure that out together.

Question for Joe: For the beta review, what type of presentation do you want? Will we be talking over a video like the alpha reviews?

That's it.

Much lava,

Nelskati

Thursday, March 24, 2011

The nine lives of Post Posterson

So I bet you're wondering what we did this week.

Omar has been really sick this past week (and still is). So, unfortunately, he was unable to get much done. He still needs to build a GUI in Maya that will allow him to manipulate the beta-spline curves. Assuming he starts to recover relatively soon, he should be able to get a GUI up and running by next week. He has done some reading on Maya GUI building and has found that MEL is the best for that. He's found some links online to try to help him out:

http://trishabutkowski.blogspot.com/2008/09/mel-scripting-intro-to-gui.html

http://www.polyextrude.com/tutorials/MelScripting/chapter9.html

As for his earlier problem of drawing curves in Maya without using 20*(number of control points - 3) edit points, he's going to set that aside for now. It might not be as big a problem as he thought it was.

Nicole has been reading up on Python scripts in Maya. The end goal is to grab the control points from the movement curves in Maya. The first step involves getting Python script to run in Maya and drawing the movement path of a single joint.

Here are some resources she's looking at:

http://cgkit.sourceforge.net/maya_tutorials/intro/
http://www.rtrowbridge.com/blog/2008/11/maya-python-import-scripts/
http://cgkit.sourceforge.net/mayadoc/install.html

And one that looks like it will be particularly helpful:
http://www.chadvernon.com/blog/resources/python-scripting-for-maya-artists/python-in-maya/

She'd be happy to hear any other resource suggestions.

This is the first time she's done any scripting, so she had some questions:

1. Right now Maya has a "Python" tab in the script editor. This site mentions a Python plug-in manager. Do we need this? I was talking to Jon McCaffrey, and he made it sound like we could copy the Python code straight into the script editor.

2. Similarly, the tutorials keep talking about the "Python plugin" and "Maya Python package". Where can we download these? Do we even need to download these?

3. If we want to load a Python file into Maya instead of copying and pasting it all into the script editor, where should we save the file? We have tried saving it under "C:\Program Files\Autodesk\Maya2010\scripts", but it doesn't find the file. (We're working on the SIG lab computers.)

4. One of the tutorials has a single line that's supposed to print "Hello World". (py "print 'Hello World'"). When typed into the MEL tab in the script editor, I get the error "cannot find procedure py". If I erase the py part and just type "print 'Hello World'" in the Python tab, it works. However, if I try to type the next example from that page into the Python tab, it doesn't work. Any idea on what's going wrong?

SELF EVALUATION

What we have completed
- picked out 10 gestures to animate
- animated those gestures, including finger movements
- implemented beta splines

Partially completed
- get beta splines working in Maya

By the beta review
- have Maya GUI ready
- draw the path of a single joint in Maya as it animates
- determine the control points of this path; not quite as trivial as it sounds

By the final review
- generalize the previous two steps to all joints
- pass these control points to Omar's beta spline editor
- modify the control points to achieve various emotional states; this could be done by permuting the joint positions across a certain range of y values, for instance
- poll a set of people to determine what types of modifications produce what types of emotions

By CG@Penn Day
- we might be able to generalize these findings. For example, if an angry gesture is faster and has wider motions, then increasing a gesture's speed and "stretching" the positions of its control points should make it look angry. This would save us from doing trial and error for every gesture/emotion combination.
- create a GUI that allows the user to change parameters (example: emotion), which will in turn change the control points/curves

We've got a lot left, but we think once we can get our two parts working together, things should move quickly.

Love and unicorns,

Nelskati

Thursday, March 17, 2011

Post #(Infinity-Symbol rotated 90°)

Hey all! We're back from Spring Break and slowly trying to get into the swing of things again.

What we week this did so to you know want? Okay.

After returning from break, Omar has been working on correctly displaying beta-splines in Maya. It took him a while to get the MayaPluginWizard working, but in the meantime, he began researching how to define/draw curves in Maya without being restricted to the given interpolation scheme. Unfortunately, it has been very difficult to find exactly what he's looking for. The Maya C++ API documentation is not so great. Omar wants to be able to draw the beta-splines without having every point drawn be considered an edit (or interpolation) point. Basically, he is looking for the equivalent of a glVertex call for Maya. That way, he can draw lines without defined, manipulable points. This week has been spent looking for answers and trying out different methods. He has been able to play around a bit with making simple Maya plugins, which has helped. But the progress he wanted to make was not made.

In order not to keep wasting time, Omar will at least start building a GUI in Maya (using MEL or C++; he's not sure which will prove more efficient). It will let the user take the beta-splines he is currently drawing (defined with roughly 20 times the number of edit points they should be defined with) and manipulate the bias and tension (alpha1 and alpha2) parameters for the different segments of the curve using simple sliders. He has started trying to learn how to do this.

Nicole has been working on the rig with fingers in Maya. She tried binding the different parts of the mesh separately to fix the 'mesh and skeleton not matching up' problem, but it didn't help. She had to reposition some of the joints before re-binding the skin, which led to a few problems...

Problem 1: Some joints no longer start in the same position. This means that Nicole could not just copy keyframes from the no-finger model to the finger model. She had to re-animate the already-animated gestures.

Problem 2: She had to paint skin weights, since joints were added. (She tried importing the previous weights, but it didn't help.) She couldn't get the weights 100% accurate on all joints; the wrist, elbow, and shoulder in particular look awkward when rotated at certain angles. This happened a bit with the previous rig and skin weights, but it is more noticeable with the new rig. If your wrist twists like this, you might want to see a doctor...


Problem 3: She belatedly realized that a few of the finger joints were not in the optimal places for certain gestures... specifically, any gestures that involved the thumb and fingers touching. She plans to re-visit this at a later time, if the opportunity arises, but figured that there were better uses of time than trying to tweak the rig again.

Nicole did manage to get the 10 animations (mostly) done. A few still need fine-tuning.
 



The rest of the rendered videos can be found here. Is there a way to organize YouTube uploads into "folders" or something similar, so the older and newer videos will be separate?

So, for next week:
  • Omar will try to have a simple beta-spline manipulation GUI made in Maya (probably in MEL script)
  • Nicole will start writing a Maya application (in Python?) to get the control points from an animation curve in Maya.
Until our Eulerian paths cross again.

Full of love we are,

Nelskati

Thursday, March 3, 2011

Post White and the Seven Posts Post.....post...postpostpost

O.M.GUI. It's almost spring break. Wow.

We don't feel like writing a blog post, but if we were to write one, it might go like this:

"O.M.GUI. It's almost spring break. Wow.

We don't feel like writing a blog post, but if we were to write one, it might go like this: 

'Nicole has been murdered by studying for midterms. And by mixed-media animation. But she's been brought back to life by the power of doughnut holes.

She did pick up a good idea during the weekly meeting on Thursday. How do we check whether an "angry" gesture actually looks angry? One way is to permute the control points from their "base" values, then upload these gestures to Mechanical Turk and have people vote on which emotions the gestures convey. By surveying a large group of people, we can see how strongly a gesture conveys a certain emotion. Hopefully we will be able to find a trend in, for example, the "angry" values, to save time when modifying future gestures.

Omar had an equally stressful week. There were also doughnut holes involved.

He has begun to read/consult the C++ Maya API to figure out how to correctly draw his beta-splines in Maya. He has also been giving some thought to the GUI through which Nicole's and Omar's parts of the project will interface. He's not sure if it should live directly in Maya (i.e., a GUI in MEL script) or if it should be a separate application that takes information from a Maya file. It seems like the first option may be the one to go with, at least at first.

A question: When will Unity be involved in the scope of our project, if ever? We haven't had to incorporate it thus far, and we're not sure how/when it will come into play.


For next week:
  • Nicole will be in Oklahoma (where the wind comes sweeping down the plain, and the waving wheat can sure smell sweet when the wind comes right behind the rain) enjoying her break.
  • Omar will be performing with Mask & Wig in New York City, Pittsburgh, Chicago, and Detroit. They may even make a stop in Canada. How aboot that? That'd be pretty cool, eh?
So, no blog post for Nelskati for a week. Hopefully you can manage on your own. We surely can.

Well, until we blog again.

Love is also like a box of chocolates,

Nelskati

'
Something like that.
"
Something like that like that. Good thing we didn't actually write one.

Bye.

Thursday, February 24, 2011

Post Posterson VI, Esq.

Hello, friends. Read. Ist a long post. Ist very long.

So, this week was all Alpha Review prep. We made a pretty good video, if we do say so ourselves. Which we do.

But before we made a great video, we also did some work. Thanks to some guidance from Pengfei and Joe, we made some good progress this week.

Omar used MEL for the first time this past week! After reading a few online tutorials and trying to figure out how to use C++ code in Maya, he found that it would just be easier to adapt some of his C++ beta-spline interpolation code into MEL script. So that's what he did. And it worked pretty well. See the image below.



So, the one problem is that this particular curve actually has 60 edit points, when it should really only have 4. That is because Omar has not figured out how to have Maya draw a curve with a custom interpolation scheme like the one he has coded. He can specify all the points he wants, but Maya will only interpolate them with a degree between 1 and 7. He may just have to get C++ and Maya working together. But for now, in order to at least visualize it in Maya, the script draws the curve with 20 points on each segment, where each of these points is interpolated linearly. If anyone knows how to define a custom interpolation scheme using MEL, let Omar know. Thanks.
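In rough Python terms (the real script is MEL), the workaround looks like the sketch below, where evaluate_segment is a stand-in for Omar's beta-spline interpolation code:

    import maya.cmds as cmds

    def draw_spline_linear(control_points, evaluate_segment, samples_per_segment=20):
        """Draw a degree-1 (linear) Maya curve through densely sampled spline points.

        evaluate_segment(control_points, i, u) is assumed to return the (x, y, z)
        point on segment i of the beta-spline at parameter u in [0, 1].
        """
        num_segments = len(control_points) - 3  # cubic: n control points -> n-3 segments
        pts = []
        for i in range(num_segments):
            for s in range(samples_per_segment):
                pts.append(evaluate_segment(control_points, i, s / float(samples_per_segment)))
        pts.append(evaluate_segment(control_points, num_segments - 1, 1.0))  # close out the last segment
        return cmds.curve(point=pts, degree=1)  # Maya only ever sees the sampled points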


Nicole added finger joints to the rig. (A few of the hand joints had their names changed, and joints rhand_end and lhand_end were deleted because they were no longer needed.) Yay, fingers!



The plan was: add finger joints to the rig, save current skin weights, unbind the skeleton and the model, rebind the skeleton to the model (so you can animate the fingers), load the old skin weights (so you only have to paint the finger weights), and finally modify the skin weights so the finger joints move the fingers.

However, Nicole ran into a problem. When she unbound the joints, the model's arms and legs, previously in the bind pose, changed position to this:

 The rig does control the model, but it causes awkward scaling when a joint is moved since the skeleton and the model don't line up properly. Nicole has tried both smooth bind and rigid bind. Neither solves the problem.

Pengfei: Nicole has uploaded the Maya file to the sig computers if you'd like to take a look at it.
The file is called A_001_v8. It’s in sig-share\Projects\2011-RCTA-MARKETPLACE, and it has the model before the skeleton has been un-bound. (If you detach the skin and then bind it again, you get the image above.)

Question: What should Nicole do about the skel_driven skeleton (the one she is not animating with)? Does she need to add finger joints to that one as well and link it to the main skeleton? She didn't play around with that, since trying to re-rig two skeletons could create twice as many problems, especially when one is linked to the other. (And she's not even sure if we'll need to use that skeleton.)

For next week:
  • Omar will hopefully be able to solve his interpolation problem in Maya. He'll try to make interactive manipulation of the curves work in Maya like he has in his 2D GUI, but he's not sure how far he'll get on that this week. There are a lot of things to do before spring break.
  • Nicole will hopefully be able to solve the problem with binding the rig, and will be able to work on the animations that involve fingers.
  • Nicole will also start reading up on MEL and Maya plugins. The next step is to draw out the curves in Maya for one joint (specifically, the translation of the joint over time), which sounds like a job for MEL. Once Nicole gets this done, she can give the control points of the curve to Omar, who can modify them with his curve editor. Drawing one curve is just a test. Eventually, we want to have a curve for every joint.
That's it for this post. See you all tomorrow at the alpha review.

Tender-Lovingly yours,

Nelskati

Friday, February 18, 2011

Mambo #5 (let mambo = "post")

So I bet you're wondering what we did this week.

Nicole took a closer look at the rig in Maya, with some help from Pengfei. The skel_driven skeleton is controlled (driven) by the other skeleton. Imagine that! As for why there are two skeletons: perhaps different programs require different joint conventions. This allows for more flexibility when exporting the data. Nicole decided to animate from the non-skel_driven skeleton since it seems to control both skeletons.

Nicole blocked in the animations that do not involve fingers, since the skeleton does not have individual fingers rigged. (And a few of the gestures that do involve fingers, minus the finger movements.) The gestures all start from the same rest pose, with hands in front of the agent, per Dr. Badler's suggestion. At the moment they do not all end at the same rest pose, since she's not sure how we plan to blend between successive animations, and a few (like give up) probably wouldn't end in that rest pose... but she can add the rest pose at the end of each animation if she needs to.

The animations can be viewed here. Right now they are just playblasts. She can edit this with renders if needed.

Omar extended the 2D GUI to allow for interactive manipulation of the alpha-values on each segment of the beta-spline. Then he also created a new GUI to allow him to visualize beta-splines in 3D (mostly a test to make sure that the code did not need to change much to allow for 3D visualization). Here's a screenshot:

This GUI currently takes in a .txt file that gives the number of control points and then a list of the x,y,z coordinates of each. He changed this from the initial interactive control point placement because, eventually, he will be receiving control point data from Nicole and using that to create splines. He wanted to make sure that he adjusted his code accordingly in order to allow for a similar input of control point data. He has also started to attempt to transfer his 2D interactive shape controls to the 3D version.

For next week...
-Nicole isn't quite sure what to do, other than refining some of the animations. Should she add finger joints to the skeleton to animate the other gestures? Should she start working on a way to connect her part with Omar's? (She's a bit stuck on where to start on that.)

-Omar is going to start to try to figure out how to use his splines to control a human model in Maya. Each joint should have a spline to itself, representing its motion. I'm not sure if that means that a control point must store rotation and translation data. Also, any tips on how to start attaching my splines to a Maya model? I'm not quite sure where to start on that either.

So, a little stuck is Nelskati. But with some help, I'm sure we'll be right back on track.

With lots and lots of Lotsa Love,

Nelskati

Thursday, February 17, 2011

Clarification about dyadic conversation paper (not the weekly post)


I read the dyadic conversations paper. Here’s how I think it ties into the work that Omar and I are doing.

The goal is to make a conversation engine that is efficient while still being realistic. These conversations focus on gestures rather than words. They are to be used with background characters.

The conversation is controlled by a finite state machine. At each tick, the FSM determines whether the current action is done. If it is, it determines the next action through a mathematical calculation that depends on the current stage of the conversation and the agent's relationship with the person he's talking to. The FSM then grabs the correct action from a set of "base" actions (Nicole's animations). Depending on the agent's emotional state, this base action may be modified (through control curves, for example; Omar's work). A conversation ends whenever a variable, which can be thought of as an "end conversation" variable for simplicity, exceeds a certain threshold.

Is that correct?

-Nicole

Thursday, February 10, 2011

post post post POST

So I bet you're wondering what we did this week. No? Well, you can read anyway. Or leave. The choice is yours. Make the right one.

So:
Nicole started playing around with the model. She had to install Maya 2010 since the files couldn't be opened on Maya 2008. Here's a short video of our test gesture: the hand wave. She ran into some problems with the rig.

Problem One: The rig has two skeletons. Which one should we use? Here are screenshots of the two: the model with both skeletons showing and the skeletons viewed in the hypergraph.
 
They appear to have the same naming convention except one has skel_driven appended to the front. We asked Pengfei about it but he wasn’t sure which one to use.

Problem Two: A few of the joints in the rigged model are constrained in their rotation. For example, the elbow can only rotate about the x axis, not the y axis. I realize that the elbow should only have one rotational degree of freedom, but as it is, I think the axis might be wrong. When I unlock the y axis and rotate about it, things like this happen: screenshot here.

It appears that one skeleton is getting keyframes set and the other isn't. How can we fix this, other than keying both skeletons (which is what we're doing now)? It seems a bit pointless and means we're setting twice as many keyframes as we need to. (And if we forget to key one of the skeletons, weird things start happening.) The wrist also has the same problem.

Problem Three: the fingers aren’t individually rigged. A few of our gestures involve individual fingers. Does this mean I need to use a different rig, or do you want us to ignore these gestures for now?
Omar got B-splines working for real now. Then, he modified the GUI to allow for the interactive input of control points as opposed to interpolation points. And then he got beta-splines working. All in one week! Here's a visual comparison of the paper and my results:
(Beta-parameter tests)

And here's another picture of the local parameters at work (note the "kink" in the curve segment):

Currently, all parameters are hard-coded. I've begun to expand the GUI to allow the user to modify the parameters easily. I will also start to extend the splines to 3D. It seems that the code already accounts for a Z-coordinate, but that they are all 0. So I'll see if I have to actually change the code at all or if I just have to change the GUI to make it display in 3 dimensions.
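For anyone who wants the math, here's a minimal NumPy sketch of evaluating one uniform cubic B-spline segment; the beta-spline generalization replaces the fixed 1/6 basis matrix below with one built from the bias and tension parameters (and reduces back to it when bias = 1 and tension = 0):

    import numpy as np

    # Uniform cubic B-spline basis matrix (beta-splines reduce to this
    # when bias = 1 and tension = 0).
    B = (1.0 / 6.0) * np.array([[-1.0,  3.0, -3.0, 1.0],
                                [ 3.0, -6.0,  3.0, 0.0],
                                [-3.0,  0.0,  3.0, 0.0],
                                [ 1.0,  4.0,  1.0, 0.0]])

    def eval_segment(p0, p1, p2, p3, u):
        """Point on the segment defined by four control points, u in [0, 1]."""
        U = np.array([u ** 3, u ** 2, u, 1.0])
        P = np.array([p0, p1, p2, p3])        # 4 x 3 stack of control points
        return U.dot(B).dot(P)

    # eval_segment((0, 0, 0), (1, 2, 0), (2, 2, 0), (3, 0, 0), 0.5)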

A question: Have I implemented everything in the paper that I need to? I implemented Section 3 (not 3.1), and Sections 4.0, 4.1, and 4.2. There is an equation in section 4.3 (tension) that in reading seemed necessary, but in practice seemed unnecessary (the tension already seemed to be accounted for by the equation in section 4.0). Am I missing something there?

And another question that should have an obvious answer: is there a way to respond to specific blog comments? We couldn't figure it out.

Anyway, NEXT WEEK:
  •  Nicole is going to animate the rest of the gestures (possibly minus the ones that are finger-dependent)
  • Omar will continue to make improvements to the GUI
Once we complete these goals, we're both actually a bit unsure of what the next step is. We'll email Joe when we get there. Or you can email us whenever you want, Joe.

Another week down. Till the next one.

Love is a many-splendored thing,
Nelskati

Thursday, February 3, 2011

First post after the first post after the first post!

So, wanna know what we did this week? Yeah, you do.

Nicole met a stupid girl who gave her the flu. Boo to you, stupid flu girl.
Needless to say, Nicole was not very productive.

Omar was not sick. And he finally got his B-Spline code working! (we think--end control points look a little odd, but I'll keep looking at that)

He also began to read the paper on beta-splines to see how he can extend his current code. Beta-splines are generalizations of B-splines that allow for local control over each segment of the spline without affecting the adjacent segments. (That sounds difficult to implement...)

Together, we looked through the 675 hand gestures in the gesture database. We chose gestures that were specific to the Middle East and Saudi Arabia, as well as some that are considered universal. We narrowed them down to 10 gestures, which you can see right here. Here's an example image, for those too lazy to click the link:


(We hope we didn't offend anyone with that gesture)

We looked for two kinds of gestures: 1) gestures that are foreign and novel for Americans, and 2) gestures that we use in the United States that have different (and often more negative) connotations in the Middle East. We figured this would be useful since part of the goal is to use the Marketplace as a training simulation for the U.S. military.

So, for next week:
  • Nicole will be healthy.
  • Omar will not have caught the flu from Nicole.
  • Nicole will animate the test gesture (hand wave) and document the steps she takes to animate it per Joe's suggestion. If she gets the OK from Joe, she'll start to animate the gestures we picked out from the database.
  • Omar will continue to read about beta-splines and hopefully understand the paper. He'll start to try to implement them in 2D.
  • Omar and Nicole will not go to Feb Club events.
'Tis all for tis week.

Peace and Love and Double Rainbows and Double-Precision Floating Points,

Nelskati

Thursday, January 27, 2011

First post after the first post!

So, you're probably wondering what we did this past week.

Nicole finally got SVN working on her laptop!!

Omar worked on his B-Spline code from the curve editor assignment of 562. He wasn't able to fix it yet, but Joe has provided some resources that should help. We should hopefully get it working and move on to beta-splines within the next week.

Nicole tried to get Maya working with Visual Studio in order to write MEL scripts. She ran into some external linking errors. (21 of them, to be exact; we were able to fix 4.) Do the SIG lab computers already have this working? If so, we might just work on those.

We weren't able to choose 8 to 12 gestures to animate, since the gesture database went down. (Thanks, Joe.) Joe did suggest picking one random "test" gesture to start from, so we've chosen a hand wave.

Let's talk about SmartBody--

We reread the SmartBody paper that Norm gave us and downloaded the code. So far, all we've been able to do is make them shoot lasers out of their eyes.


Here's a summary of what we understood from the paper. SmartBody is a system that takes multiple animations as inputs, and uses controllers to blend between them. It strives to create a realistic in-between motion. If this is all it does, it doesn't seem like it would be worth the hassle to include it in our project. It's possible that we overlooked a feature. The documentation is not great, so we are still trying to explore all of SmartBody's functionality.

If we decide to go ahead and incorporate this into our project, we found a webpage that claims SmartBody will work with Unity:
http://www.mail-archive.com/smartbody-developer@lists.sourceforge.net/msg00101.html

So for this coming week, we hope to:


-Get b-splines working
-Pick out our 8-12 gestures (once Joe stops trying to ruin our lives)
-Animate the hand-wave test gesture we selected (is there a rigged model we can use?)

Any questions, comments, just let us know!

With all our love,

Nelskati