Monday, May 9, 2011

Last post?? :(

Nelskati hasn't seen you in a few weeks. Nelskati has missed you. :(

So I bet you're wondering what Nelskati has done since you last talked to Nelskati.

Nicole was able to piece together the conversation from Libo's input (which contained the keyframe time, emotion, agent, and gesture name). Here's a video of the script running:


And here is the resulting conversation:


At the moment this conversation has no emotions.
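
For anyone curious what reading that kind of input might look like, here is a very rough Python sketch. Libo's actual file layout isn't shown in this post, so the whitespace-separated ordering below (time, agent, emotion, gesture) is purely an assumption.

```python
# Hypothetical parser for a conversation spec containing, per line:
# keyframe time, agent, emotion, and gesture name (format assumed, not Libo's real one).
def load_conversation(path):
    events = []
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            time, agent, emotion, gesture = tokens[:4]
            events.append({'time': float(time),
                           'agent': agent,
                           'emotion': emotion,
                           'gesture': gesture})
    # Sort by keyframe time so gestures can be keyed onto the agents in order.
    return sorted(events, key=lambda e: e['time'])
```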

We have started finding the beta-spline parameters for each emotion. We also added textures to the models.
This video shows the result of the angry_fail conversation, with emotions. The video lags a bit, but if you go to 0:45, for instance, you'll see that some of the gestures have been tweaked/sped up.

Omar created a GUI that streamlines the interaction between our two parts and makes it easy to infuse gestures with emotion. Clicking a 'Load GKP' button opens the scene file containing the textured agent model. The user is then prompted to select the .gkp file corresponding to the gesture he/she wants to modify, which creates the set of beta-splines for that gesture. The GUI has 3 panes:
1) "all-joint" controls that adjust the bias, tension, speed, and number of divisions for all the curves at once
2) individual joint controls (bias, tension, speed)
3) a pane containing 3 buttons--'Play New Gesture', 'Write to File', and 'Load New GKP'

(*Omar added curve divisions as an interactive option because he noticed that some gestures would lose the intent of the original motion when the number of divisions was too high. They became too fluid and lost the key parts that made them recognizable, so a slider for divisions has been really helpful.)
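
For a sense of how a pane like this goes together in Maya, here is a stripped-down sketch using Python's maya.cmds. Omar's real GUI is written in MEL and has many more controls; the slider ranges and the callback are placeholders, not his actual values.

```python
import maya.cmds as cmds

def build_all_joint_pane(on_change):
    """Minimal 'all-joint controls' pane: bias, tension, speed, divisions.
    on_change is any callable that re-keys/re-draws the gesture; it should
    accept *args because Maya passes the new value to changeCommand."""
    win = cmds.window(title='Gesture Emotion Controls')
    cmds.columnLayout(adjustableColumn=True)
    cmds.floatSliderGrp(label='Bias', field=True,
                        minValue=-5.0, maxValue=5.0, value=0.0,
                        changeCommand=on_change)
    cmds.floatSliderGrp(label='Tension', field=True,
                        minValue=-5.0, maxValue=5.0, value=0.0,
                        changeCommand=on_change)
    cmds.floatSliderGrp(label='Speed', field=True,
                        minValue=0.25, maxValue=4.0, value=1.0,
                        changeCommand=on_change)
    cmds.intSliderGrp(label='Divisions', field=True,
                      minValue=2, maxValue=50, value=10,
                      changeCommand=on_change)
    cmds.button(label='Play New Gesture', command=on_change)
    cmds.showWindow(win)
```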

The user can tweak the beta-splines and then hit the 'Play New Gesture' button to immediately key the new gesture and play it. This lets the user easily see how his/her curve manipulations affect the animation. Once the user finds a combination of parameters that fits the desired emotion, he/she can hit 'Write to File', select which emotion the new data represents, and the program writes it out to a new .gkp file. Here is a short video that illustrates how it works:


Omar was also able to determine the beta-spline and speed parameters that give certain gestures the look of being infused with emotion (happy, angry, sad). However, it is clear that there is no single set of parameters that produces, say, "angry" for every gesture. In fact, we can barely use the bias parameter at all, because it introduces error that is hard to handle within our current framework: emotional gestures no longer start and end in the same position as their neutral counterparts, which creates problems when stringing them together into conversations.

Omar has found that speed is a very large factor in conveying emotion: faster looks angrier, slower looks sadder. Happy is a difficult emotion to achieve using just the beta-spline parameters. An increase in speed seems to suggest happiness, but without facial animation, happy and angry become difficult to tell apart. Also, some gestures connote a certain emotion to begin with, so infusing them with an emotion of the opposite connotation is very difficult. Still, for many of the gestures currently in the database, halving the speed is enough to make the agent look sad, and doubling the speed is enough to make the agent look angry. Repeating a gesture several times in a row also increases the appearance of anger.
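
As a concrete illustration of that speed rule of thumb (half speed reads as sad, double speed reads as angry), here is roughly what the retiming looks like with maya.cmds. The joint list, frame range, and multipliers are just the values mentioned above used as placeholders, not a definitive recipe from our .gkp files.

```python
import maya.cmds as cmds

# Heuristic speed multipliers from the observations above.
EMOTION_SPEED = {'sad': 0.5, 'neutral': 1.0, 'angry': 2.0}

def retime_gesture(joints, start, end, emotion):
    """Stretch or compress all keys on the given joints in [start, end].
    timeScale < 1 compresses the keys (faster / angrier);
    timeScale > 1 stretches them out (slower / sadder)."""
    scale = 1.0 / EMOTION_SPEED[emotion]
    for joint in joints:
        cmds.scaleKey(joint, time=(start, end), timeScale=scale,
                      timePivot=start)

# e.g. retime_gesture(['head', 'r_shoulder', 'r_elbow'], 1, 120, 'angry')
```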

Well, we've submitted our final code, and now we're done! (although it doesn't feel like it) We can always keep working to improve what we've built, but unfortunately we are out of time. Still, we have accomplished a great deal. And Nelskati has had a lot of fun. But, of course, we couldn't have done this without the help of Joe, Norm, Pengfei, Libo, and Amy.  They have all been incredibly helpful and supportive.

Also, congrats to all our fellow classmates on a job well done! You all did amazing work.

Till we meet again.

With our undying love,

Nelskati

Thursday, April 21, 2011

(Un)lucky Post #13

So I bet you’re wondering what we did this week.

Nicole changed the way she was dealing with the keyframed data. Instead of storing the path that each joint follows, she's now storing each joint's rotation values. This makes it much easier to reconstruct the animation curves, since she doesn't have to deal with converting between beta-splines and Maya's NURBS implementation. Nicole can output a file with all of the keyframed rotation values, as well as read a file of rotation values and create a keyframed animation from it. Omar is still working on modifying the curves, so Nicole tested her code with an unmodified file. As you can see, the model had no keyframes initially; when the code is run, it outputs the animated gesture:
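
As a rough illustration of that export/import round trip, here is a minimal maya.cmds sketch. The file layout (one joint attribute per line, followed by time/value pairs) is an assumption and almost certainly differs from Nicole's actual format.

```python
import maya.cmds as cmds

ROTATIONS = ['rotateX', 'rotateY', 'rotateZ']

def export_rotations(joints, path):
    """Write every rotation key (time, value) for each joint to a text file."""
    with open(path, 'w') as f:
        for joint in joints:
            for attr in ROTATIONS:
                plug = '%s.%s' % (joint, attr)
                times = cmds.keyframe(plug, query=True, timeChange=True) or []
                values = cmds.keyframe(plug, query=True, valueChange=True) or []
                if not times:
                    continue
                pairs = ' '.join('%g %g' % tv for tv in zip(times, values))
                f.write('%s %s\n' % (plug, pairs))

def import_rotations(path):
    """Re-create the keyframed animation from a file written above."""
    with open(path) as f:
        for line in f:
            tokens = line.split()
            plug, numbers = tokens[0], [float(t) for t in tokens[1:]]
            joint, attr = plug.split('.')
            for t, v in zip(numbers[0::2], numbers[1::2]):
                cmds.setKeyframe(joint, attribute=attr, time=t, value=v)
```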



She also animated two new gestures upon Libo’s request: an emphasis gesture (called beat) and a goodbye gesture.





QUESTION: Does anyone know what the goodbye gesture is in Middle Eastern countries? Nicole could not find any information about it. She looked online and in the gesture database. She animated a "western" goodbye gesture as a placeholder, but she isn't sure if this gesture is universal.

Tomorrow, Nicole is going to work on connecting her code to Libo’s output.

Omar had some code that was working really nicely when Nicole was passing him joint translation points. Now that keyframe points are being passed to him, he has had to change his code significantly. He had been operating under the assumption that Nicole would always be passing control points into his system. Now, the keyframes are actual points on the curve that he has to generate. Consequently, he has had to add the extra step of solving for the control points from the given keyframe points, after which he can interpolate using those control points. This required him to use C++ again, because solving for the control points involves solving a matrix equation (i.e. solving for c in Ac = d given A and d), and he did not want to adapt the math::matrix C++ code to MEL.
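
For readers who want to see the shape of that matrix solve, here is a sketch in Python/NumPy rather than Omar's C++. It assumes the simplest case, where the beta-spline collapses to a uniform cubic B-spline (bias 1, tension 0), so a point on the curve at knot i is (c_{i-1} + 4c_i + c_{i+1})/6. The pinned end conditions are also an assumption, not necessarily what his math::matrix code does.

```python
import numpy as np

def control_points_from_curve_points(d):
    """d: (n, 3) array of points on the curve (the keyframe points).
    Returns an (n, 3) array of control points c satisfying A c = rhs, where A
    is the uniform cubic B-spline interpolation matrix."""
    d = np.asarray(d, dtype=float)
    n = len(d)
    A = np.zeros((n, n))
    A[0, 0] = A[-1, -1] = 1.0              # pin the end control points (assumption)
    for i in range(1, n - 1):
        A[i, i - 1:i + 2] = [1.0, 4.0, 1.0]
    rhs = 6.0 * d
    rhs[0], rhs[-1] = d[0], d[-1]          # match the pinned end rows
    return np.linalg.solve(A, rhs)         # solve for the control points
```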

Though this has been a tedious process, it is working relatively well. He has been able to figure out how to get his already existing MEL code to interact with the new C++ code that he has been writing. Currently, he only has the curves being drawn. He has not gotten to reimplementing the Beta Parameters yet because he has run into some issues in drawing the curves. When a joint is stationary (i.e. not rotating) for a period of time within the motion, the points on the curve start to drift. He doesn't know if this is a Maya issue or if it comes from passing duplicate points into his own code. He is continuing to investigate.

Once the investigation is complete, he will be able to add the interactive Beta-parameter control once again and be able to move forward with figuring out how the Beta-parameters affect performance.

That's what we got. Soon we'll make a poster and prepare for the presentations!

Love,

Nelskati

Tuesday, April 19, 2011

Nel has questions (not the weekly post)

Nicole had some questions. She was planning to send an email to Pengfei, but the email was getting kind of long, so she's posting here instead:

"I can now read Omar’s input files for one joint. (I haven't generalized it to all joints yet.) I'm trying to construct the new animation for this joint, but I’m a bit confused on how to do this.

Edit: The points that Omar passes to me are the control points, not the points on the curve.

Idea 1: keyframe each joint using the points that Omar passed in
Problem: how do we know what to keyframe (translation, rotation, etc) on each joint? The points passed in are the joint’s position in world space. Keying the translation won’t work, since only the root can translate. Keying the rotation might work, but how would we determine the rotation value for each joint? Also, I don’t think we know the timing of the animation after I get the points from Omar. (I passed Omar 200 control points; he passed me 187.)

Idea 2: create a new curve, then do the equivalent of the “Animate – motion path – attach to motion path” command in Maya
Problem: I don’t know if you can apply this to a joint with locked translation, or if it automatically applies to the root joint. When I tried it on a test skeleton in Maya, it defaulted to the root joint, and there were unwanted rotations/twists of the figure around the motion path.

Idea #2 is what I’m trying now, but I'm getting a strange discontinuity again when I draw the curve.

Screenshot of the drawn curve:



(Note: I moved the figure so you could see the curve.)

It appears to have the correct curve for the head joint, but there is some kind of discontinuity(?) that’s causing the vertical line to be drawn. I’ve double-checked the list I’m drawing from, and I believe that there are no duplicate points adjacent to each other. (That’s what caused the discontinuity in my previous code.)

I'm also beginning to think that the first idea would be much simpler, if I can figure out how to get each joint's x/y/z axis rotation values from their global position. Any suggestions?"

Thanks,
Nicole

Thursday, April 14, 2011

On the 12th day of Postmas

Wanna know what we did this week? Ok.

Nicole animated some additional gestures that Libo requested: yes/no head gestures, fidgeting, giving an item to someone, and receiving an item from someone.

They can be found here: http://www.youtube.com/watch?v=MQUcJTJwqu4
(She would embed the video, except YouTube changed their layout slightly and Nicole can't find the new embed link.)

She also tweaked the animations so they all begin and end in the same pose (which should simplify the transitions between them) and ran her script on each of them so she could give the control point files to Omar. This is part of the pre-computation, and only needs to be done once for each animation, unless the animation is changed.

Omar was able to get his scripts to draw out the joint translation curves of every joint given the .gcp file passed to him by Nicole. He also modified the GUI to just include Global Bias and Tension for each joint all in one scrollable window. Here is a screenshot of what it looks like with all the joint-motions drawn out:


The reason this wasn't working before was the number of UIs the script was originally generating. Currently, it still "draws" joints that do not move. This has not caused any issues, though.

Omar is currently in the process of writing a script that will take the curves he's drawn and output a new file for Nicole to read. It will probably be another .gcp file, because he is passing back the same kind of information. However, it seems that there may be fewer control points passed back to Nicole than were passed to Omar in the first place. He's not sure if this will affect the length of the animation. He will find out soon.

For next week...
- Nicole plans to read up on Python I/O so she can connect her part to Omar's, and to finish the last few gestures that Libo requested. Ideally, she will also find a way to re-generate the animation curves after Omar modifies the beta-splines. She has a few ideas on how she could do this in Maya.
- Omar plans to finish the script he is currently working on so that his and Nicole's parts can continue to connect. He may need to find a way to ensure that the same number of CVs are being passed back to Nicole.
- If we can get these parts connected soon, then Nelskati will also start looking at the files that Libo has given us to generate conversations. We will figure out the best way to parse them and get all our necessary information, and then pass this into the system that we have created.

Nelskati is too tired to think of something witty tonight.
Beam me up, Nelskati.

Thursday, April 7, 2011

Post Number Onety-one

So I bet you’re wondering what we did this week.

Nicole fixed the discontinuity on the line. Then she extended her code so it draws the joint paths for all joints. Here’s a video of the results:




As you can see, it generates curves that follow each joint’s position over time. (It did generate curves for the head as well, though they’re currently overlapped by the model.) The script itself is a bit slow since it hasn’t been completely optimized, so the part of the video where the script was running was sped up.

This script also outputs a .gcp file with the joint names, number of control points for each joint, and a list of control points for each joint. (.gcp stands for gesture control points, if you were curious.) This file is passed to Omar, who reads it in, chooses a set number of control points, and modifies the curve. But I’ll let Omar give you more details on that.
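
Since a .gcp file only needs the joint names, a point count, and the control points themselves, writing one is short. Here is a hedged Python sketch of that output step; the exact layout (a header line with the joint name and count, then one point per line) is an assumption, not necessarily the real format.

```python
def write_gcp(path, joints):
    """joints: dict mapping joint name -> list of (x, y, z) control points."""
    with open(path, 'w') as f:
        for name, points in joints.items():
            f.write('%s %d\n' % (name, len(points)))   # joint name + point count
            for x, y, z in points:
                f.write('%g %g %g\n' % (x, y, z))
```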

Nicole is currently working on a script that reads the .gcp file that Omar's code generates, but she's having trouble finding good I/O documentation for Python. She can read lines or a specific number of characters, but what if she wants to read until she hits whitespace? How would she do that?

testline 1.0 -2.0 3
For example, if she wants to grab the following strings from the line above: ‘testline’ ‘1.0’ ‘-2.0’ ‘3’
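
One way to do this, offered as a suggestion rather than the only answer: Python's str.split() with no arguments splits a line on any run of whitespace.

```python
line = 'testline 1.0 -2.0 3'
tokens = line.split()                     # -> ['testline', '1.0', '-2.0', '3']
name = tokens[0]
values = [float(t) for t in tokens[1:]]   # -> [1.0, -2.0, 3.0]
```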
-----
Omar wrote a MEL script that reads the .gcp file that Nicole generates, takes roughly 20 of the control points passed in, and outputs the curve as a beta-spline. It can do this for a .gcp file with data for multiple joints. Here is a screenshot of 3 joint translation curves as beta-splines from one .gcp file that Nicole generated (if they look familiar, it's because they look a lot like the curves that Nicole drew, which is a good thing):
One of the problems that Omar has encountered is that when he tried to run this on the file containing all the joints, the script stopped running after about half of them. It may have been because too many GUIs were being generated, which leads Omar to his next problem/decision. Currently, for each joint, a GUI appears (as seen above) with sliders for Global Bias and Global Tension, as well as sliders for Local Bias and Tension for each segment of the spline (with 20 control points --> 17 segments). So each GUI is huge and holds a lot of information. Although the local parameters allow for much more detailed manipulation, Omar is not sure if it is worth storing all that information. And with almost 20 segments, the change from local manipulation is not as easily noticeable as it is on a curve with 6-12 control points. If anyone has an opinion on this matter, please let him know.

For now, Omar will probably comment out that code to have just one UI with a list of the global parameters for each input joint appear after the .gcp file has been parsed. That may be more pleasing to the eye and to Maya.

Another small issue is that not all of the joints move, yet they are all passed to Omar in the .gcp file. So there are a lot of extraneous, "invisible" beta-splines drawn whose control points are being stored even though they are all the same. This seems like a waste, yet Omar has not found an efficient way for the script to know to skip those joints. The fix may need to come from Nicole's end.

For next week…
- Nelskati will decide whether non-moving joints should be skipped when the .gcp file is written or when it is read.
- Nicole will continue work on reading input files. Then she’ll start creating a new animation from the points on the modified curve that Omar passes to her.
- Omar will write a script to write a new .gcp file with the new data generated by manipulating the beta parameters. He will also change the GUI design (for now) to just include global parameters.
- If Nicole and Omar can both read and write files to each other by mid-week, then Omar will be able to tweak the beta-spline curves and pass them to Nicole, and she can hopefully play them back in the animation so we can see whether adjusting the curves creates the appearance of a certain emotion! (This may not be possible within the scope of the next week, but we are very close.)

So, we're excited. We're getting much closer to the emotion aspects of this project. We hope it works!

Until time next brings us together.

The grass is (sometimes) greener on the other blogs (so we might have to counter by making our blog background green),

Nelskati

Monday, April 4, 2011

Beta Post

So, we just had our beta-review. It was very informative.

To start, here are links to two videos which we showed to Norm and Joe at the review:

Beta-Spline Creation/Manipulation in Maya:
http://www.youtube.com/watch?v=cqRP3GKNFoA
 This video shows a UI that Omar created in Maya with MEL. It takes in control points and outputs a beta-spline. The spline can be interactively manipulated with the bias and tension sliders.

Drawing Joint Translation Curve in Maya:
http://www.youtube.com/watch?v=2aq11vV5tqc
This video shows the result of a Python script that draws a single joint's translation over time in Maya. There is a discontinuity at the end of the curve, and we think we figured out why. At the moment, the curve is drawing the position of the joint at each frame. After frame 130 or so, the joint quits moving. This causes the same point to be passed in many times, which creates a discontinuity in the curve. Nicole thinks an easy fix would be to check when the first and last keyframes are set for each joint, then "cut" that section from the position list. Which leads to a question: Does anyone know the Python command to query if there is a keyframe at a certain frame?
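
One way to do that check, offered as a suggestion rather than how Nicole will necessarily end up doing it: cmds.keyframe can report the key times and the key count in a given range, so the sampling loop can stay inside the keyed range.

```python
import maya.cmds as cmds

def keyed_range(joint):
    """Return (first, last) key times for a joint, or None if it has no keys."""
    times = cmds.keyframe(joint, query=True, timeChange=True)
    return (min(times), max(times)) if times else None

def has_key_at(joint, frame):
    """True if the joint has a keyframe exactly at the given frame."""
    count = cmds.keyframe(joint, query=True, time=(frame, frame),
                          keyframeCount=True)
    return bool(count)
```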

What we learned:

One of the main issues we came across is that we are using two different types of curves. Maya uses NURBS curves, whereas Omar is using beta-splines. Since Maya provides no way for us to change how its curves are interpolated, we need to convert the NURBS curve data into beta-spline control point data.

Since our animation curves generate such a large number of control points, Joe and Norm suggested that we export the NURBS control points in Maya at a uniform timestep of our choosing, producing somewhere between 10 and 25 control points (this range may change if we find we need more of them). These control points will then be passed to Omar's beta-spline creator, which will generate a beta-spline to be manipulated. After manipulation, we will determine the new set of points on the curve and pass them back to Maya to then be used as the new animation curve.
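
Here is a sketch of that downsampling step, assuming the dense per-frame samples are already in a plain Python list. The stride logic is just one simple way to land in the 10-25 point range Joe and Norm suggested.

```python
def resample_uniform(points, target=20):
    """Keep roughly `target` points from a dense per-frame list, always
    keeping the first and last point so the gesture's endpoints survive."""
    if len(points) <= target:
        return list(points)
    step = max(1, len(points) // (target - 1))
    kept = points[::step]
    if kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept
```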

The first part--taking the 200-something control points from the NURBS curve and outputting a list of control points for a beta-spline--will be a pre-processing step. We are treating the base animations that Nicole created as an animated gesture database. Since they should not change (theoretically), we can also store the lists of control points that correspond to each gesture in a similar database.

Now, we're feeling pretty confident that we can get our parts working together and working well. Hopefully, we'll have some good progress to show by Thursday's post!

Until then.

From Russia with Love,

Nelskati

Thursday, March 31, 2011

Top 10 reasons why Nel should not make post titles. #10...

So I bet you're wondering what we did this week.

Nicole continued to familiarize herself with Python in Maya. The Maya documentation on Python commands was very useful. She can now write basic Python scripts that do things like draw curves, create objects, and get/set object attributes.

She did run into one problem. She was able to get joint position at a certain time with cmds.getAttr('joint_name.translate', time = t). However, this returns the joint position in local coordinates, which never changes. (This makes sense, because bones can't stretch.) Is there any way to get the position in global coordinates, aside from matrix multiplication, which would likely be too expensive to compute at every frame? Some commands have a 'global' parameter, but getAttr does not, so she's not sure how to request world coordinates that way.
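
One workaround we know of (hedged, since there may well be a cleaner way): cmds.xform can report a node's world-space translation, and combining it with cmds.currentTime gives the position at a specific frame without doing the matrix math by hand.

```python
import maya.cmds as cmds

def world_position(joint, frame):
    """World-space position of a joint at a given frame."""
    cmds.currentTime(frame, edit=True)    # move the time slider to that frame
    return cmds.xform(joint, query=True, worldSpace=True, translation=True)

# e.g. path = [world_position('head', f) for f in range(1, 201)]
```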

Once she figures out the world/local coordinate problem, she should be able to draw the translation curve along one joint, then extend it to the other joints.

Moving forward, any tips on how to compute the control points once we have the exact points along the curve?

Omar took time to try to figure out the best way to manipulate beta-splines in Maya. After working with the Maya C++ API, he found that MEL scripting may still be the better option. He made progress on the MEL UI, thanks to the help of the tutorials that he posted last week. Here's a screenshot of part of the UI:

Currently, a first window pops up, asking the user to specify the number of control points. This then brings up the Control Points window above, with the number of float field groups that the user previously specified. The user can input the x,y,z values for each control point. Hitting the Draw Spline button does exactly that. The above screenshot shows what it looks like after hitting Draw Spline. Also, if the user were to input new values into the window, it would just update the curve, not create a new one.

Next, Omar needs to add a way to manipulate the beta and/or alpha values of the curve (probably float sliders). Then, he would be about at the stage where he could use Nicole's control points. Right now, the control point input is interactive. Omar is still not sure exactly how the control points will need to be input/retrieved from Nicole's part, since Nicole is not exactly sure how to get them yet. They will figure that out together.

Question for Joe: For the beta review, what type of presentation do you want? Will we be talking over a video like the alpha reviews?

That's it.

Much lava,

Nelskati