So I bet you’re wondering what we did this week.
Nicole changed the way she handles the keyframed data. Instead of storing the path that each joint follows, she now stores each joint's rotation values. This makes it much easier to reconstruct the animation curves, since she no longer has to convert between beta-splines and Maya's NURBS implementation. Nicole can output a file with all of the keyframed rotation values, as well as read a file with rotation values and create a keyframed animation from it. Omar is still working on modifying the curves, so Nicole tested her code with an unmodified file. As you can see, the model had no keyframes initially. When the code was run, it output the animated gesture:
She also animated two new gestures upon Libo’s request: an emphasis gesture (called beat) and a goodbye gesture.
QUESTION: Does anyone know what the goodbye gesture is in Middle Eastern countries? Nicole could not find any information about it. She looked online and in the gesture database. She animated a "western" goodbye gesture as a placeholder, but she isn't sure if this gesture is universal.
Tomorrow, Nicole is going to work on connecting her code to Libo’s output.
Omar had some code that was working nicely when Nicole was passing him joint translation points. Now that keyframe points are being passed to him instead, he has had to rework his code substantially. He had been operating under the assumption that Nicole would always pass control points into his system; now, the keyframes are actual points on the curve that he has to generate. Consequently, he has had to add an extra step: solve for the control points from the given keyframe points, then interpolate using those control points. This pushed him back to C++, because solving for the control points means solving a matrix equation (i.e., solving for c in Ac = d, given A and d), and he did not want to port the math::matrix C++ code to MEL.
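That solve step can be sketched in pure Python. This is only an illustration, not Omar's actual C++ (which uses the math::matrix code); the matrix A below is a stand-in for whatever basis matrix his spline setup builds, and the solver itself is plain Gaussian elimination with partial pivoting:

```python
def solve_linear(A, d):
    """Solve A c = d by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Augment A with d so row operations update both together.
    M = [row[:] + [d[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= factor * M[col][k]
    # Back-substitution.
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = (M[i][n] - sum(M[i][k] * c[k] for k in range(i + 1, n))) / M[i][i]
    return c
```

With the real basis matrix, each row of A would hold the spline basis functions evaluated at one keyframe's parameter value, and d would hold the keyframed points, one coordinate at a time.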
Though this has been a tedious process, it is working relatively well. He has figured out how to get his existing MEL code to interact with the new C++ code he has been writing. Currently, he only has the curves being drawn; he has not gotten to reimplementing the beta parameters yet because he has run into some issues drawing the curves. When a joint is stationary (i.e., not rotating) for a period of time within the motion, the points on the curve start to drift. He doesn't know if this is a Maya issue or if it comes from passing duplicate points into his own code. He is continuing to investigate.
Once the investigation is complete, he will add the interactive beta-parameter control once again and move forward with figuring out how the beta parameters affect performance.
That's what we got. Soon we'll make a poster and prepare for the presentations!
Love,
Nelskati
Thursday, April 21, 2011
Tuesday, April 19, 2011
Nel has questions (not the weekly post)
Nicole had some questions. She was planning to send an email to Pengfei, but the email was getting kind of long, so she's posting here instead:
"I can now read Omar’s input files for one joint. (I haven't generalized it to all joints yet.) I'm trying to construct the new animation for this joint, but I’m a bit confused on how to do this.
Edit: The points that Omar passes to me are the control points, not the points on the curve.
Idea 1: keyframe each joint using the points that Omar passed in
Problem: how do we know what to keyframe (translation, rotation, etc) on each joint? The points passed in are the joint’s position in world space. Keying the translation won’t work, since only the root can translate. Keying the rotation might work, but how would we determine the rotation value for each joint? Also, I don’t think we know the timing of the animation after I get the points from Omar. (I passed Omar 200 control points; he passed me 187.)
Idea 2: create a new curve, then do the equivalent of the “Animate – motion path – attach to motion path” command in Maya
Problem: I don’t know if you can apply this to a joint with locked translation, or if it automatically applies to the root joint. When I tried it on a test skeleton in Maya, it defaulted to the root joint, and there were unwanted rotations/twists of the figure around the motion path.
Idea 2 is what I’m trying now, but I'm getting a strange discontinuity again when I draw the curve.
Screenshot of the drawn curve:
(Note: I moved the figure so you could see the curve.)
It appears to have the correct curve for the head joint, but there is some kind of discontinuity(?) that’s causing the vertical line to be drawn. I’ve double-checked the list I’m drawing from, and I believe that there are no duplicate points adjacent to each other. (That’s what caused the discontinuity in my previous code.)
I'm also beginning to think that the first idea would be much simpler, if I can figure out how to get each joint's x/y/z axis rotation values from their global position. Any suggestions?"
Thanks,
Nicole
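One possible lead on Idea 1 from the question above (strictly a sketch, nothing confirmed) is to derive an aim rotation for each joint from the world-space direction to its child joint. The helper below is hypothetical Python: it ignores roll about the aim axis, which world positions alone can't determine, and the result would still have to be converted from world space into each joint's local rotate values using the parent's accumulated rotation.

```python
import math

def aim_rotation_degrees(joint_pos, child_pos):
    """Euler angles (x, y, z) in degrees that aim a joint's +z axis at its child.

    Both positions are world-space (x, y, z) tuples. Roll about the aim
    axis is left at zero, since positions alone cannot determine it.
    """
    dx = child_pos[0] - joint_pos[0]
    dy = child_pos[1] - joint_pos[1]
    dz = child_pos[2] - joint_pos[2]
    # Yaw about the y axis toward the child, then pitch about the x axis.
    yaw = math.degrees(math.atan2(dx, dz))
    horiz = math.hypot(dx, dz)
    pitch = math.degrees(math.atan2(-dy, horiz))
    return (pitch, yaw, 0.0)
```

A child straight down the +z axis gives all-zero angles, and one off to +x gives a 90-degree yaw, which is a quick sanity check on the convention.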
Thursday, April 14, 2011
On the 12th day of Postmas
Wanna know what we did this week? Ok.
Nicole animated some additional gestures that Libo requested: yes/no head gestures, fidgeting, giving an item to someone, and receiving an item from someone.
They can be found here: http://www.youtube.com/watch?v=MQUcJTJwqu4
(She would embed the video, except YouTube changed their layout slightly and Nicole can't find the new embed link.)
She also tweaked the animations so they all begin and end in the same pose (which should simplify the transitions between them) and ran her script on each of them so she could give the control point files to Omar. This is part of the pre-computation, and only needs to be done once for each animation, unless the animation is changed.
Omar was able to get his scripts to draw out the joint translation curves of every joint given the .gcp file passed to him by Nicole. He also modified the GUI to just include Global Bias and Tension for each joint all in one scrollable window. Here is a screenshot of what it looks like with all the joint-motions drawn out:
The reason this wasn't working before was the number of UIs the script was originally generating. Currently, it still "draws" joints that do not move. This has not caused any issues, though.
Omar is currently in the process of writing a script that will take the curves he's drawn and output a new file for Nicole to read. It will probably be another .gcp file, because he is passing back the same kind of information. However, it seems that fewer control points may be passed back to Nicole than were passed to Omar in the first place. He's not sure if this will affect the length of the animation. He will find out soon.
For next week...
- Nicole plans to read up on Python I/O so she can connect her part to Omar's, and finish the last few gestures that Libo requested. It would be ideal if she could also find a way to re-generate the animation curves after Omar modifies the beta splines. She has a few ideas on how she could do this in Maya.
- Omar plans to finish the script he is currently working on so that his and Nicole's parts can continue to connect. He may need to find a way to ensure that the same number of CVs are being passed back to Nicole.
- If we can get these parts connected soon, then Nelskati will also start looking at the files that Libo has given us to generate conversations. We will figure out the best way to parse through this and get all our necessary information, and then pass this into the system that we have created.
Nelskati is too tired to think of something witty tonight.
Beam me up, Nelskati.
Thursday, April 7, 2011
Post Number Onety-one
So I bet you’re wondering what we did this week.
Nicole fixed the discontinuity on the line. Then she extended her code so it draws the joint paths for all joints. Here’s a video of the results:
As you can see, it generates curves that follow each joint’s position over time. (It did generate curves for the head as well, though they’re currently overlapped by the model.) The script itself is a bit slow since it hasn’t been completely optimized, so the part of the video where the script was running was sped up.
This script also outputs a .gcp file with the joint names, number of control points for each joint, and a list of control points for each joint. (.gcp stands for gesture control points, if you were curious.) This file is passed to Omar, who reads it in, chooses a set number of control points, and modifies the curve. But I’ll let Omar give you more details on that.
Nicole is currently working on a script that reads the .gcp file that Omar’s code generates, but she’s having trouble finding good I/O documentation for Python. She can read lines or a specific number of characters, but what if she wants to read until she hits whitespace? How would she do that?
testline 1.0 -2.0 3
For example, if she wants to grab the following strings from the line above: ‘testline’ ‘1.0’ ‘-2.0’ ‘3’
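To answer the whitespace question: Python's str.split() with no arguments splits on any run of whitespace and drops the leading/trailing whitespace, so one call per line does exactly this. No manual character-by-character scanning is needed:

```python
# str.split() with no arguments splits on any run of whitespace
# (spaces, tabs, the trailing newline) and ignores leading/trailing
# whitespace, so each line becomes a clean list of tokens.
line = "testline 1.0 -2.0 3\n"
tokens = line.split()                     # ['testline', '1.0', '-2.0', '3']
name = tokens[0]
values = [float(t) for t in tokens[1:]]   # [1.0, -2.0, 3.0]

# Reading a whole file is just a loop over its lines:
# with open("gesture.gcp") as f:
#     for line in f:
#         tokens = line.split()
```

(The filename above is hypothetical; the pattern works for any whitespace-delimited text file.)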
-----
Omar wrote a MEL script that reads the .gcp file that Nicole generates, takes roughly 20 of the control points passed in, and outputs the curve as a beta-spline. It can do this for a .gcp file with data for multiple joints. Here is a screenshot of 3 joint translation curves as beta-splines from one .gcp file that Nicole generated (if they look familiar, it's because they look a lot like the curves that Nicole drew, which is a good thing):
One of the problems Omar has encountered is that when he tried to run this on the file with all of the joints, the script stopped running after about half of them. It may have been due to the number of GUIs being generated, which leads Omar to his next problem/decision. Currently, for each joint, a GUI appears (as seen above) with sliders for Global Bias and Global Tension, as well as sliders for Local Bias and Tension for each segment of the spline (20 control points gives 17 segments). So each GUI is huge and holds a lot of information. Although the local parameters allow much more detailed manipulation, Omar is not sure it is worth storing all that information. And with almost 20 segments, the change from local manipulation is not as easily noticeable as it is on a curve with 6-12 control points. If anyone has an opinion on this matter, please let him know.
For now, Omar will probably comment out that code to have just one UI with a list of the global parameters for each input joint appear after the .gcp file has been parsed. That may be more pleasing to the eye and to Maya.
Another small issue is that not all of the joints move, yet they are all passed to Omar in the .gcp file. As a result, a lot of extraneous, "invisible" beta-splines are drawn, and their control points are stored even though they are all identical. This seems wasteful, yet Omar has not found an efficient way for the script to know to skip those joints. The fix may need to come from Nicole's end.
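One way to skip those joints, on either end of the pipeline (just an idea, not something we've wired in yet): treat a joint as stationary when all of its control points coincide within a tolerance, and drop it before drawing or before writing the .gcp. The dict shape below is a hypothetical stand-in for however the parsed .gcp data is held:

```python
def is_stationary(points, tol=1e-6):
    """True if every (x, y, z) control point is within tol of the first one."""
    if not points:
        return True
    x0, y0, z0 = points[0]
    return all(abs(x - x0) <= tol and abs(y - y0) <= tol and abs(z - z0) <= tol
               for (x, y, z) in points)

def moving_joints(joints):
    """Filter a {joint_name: [control points]} dict down to joints that move."""
    return dict((name, pts) for name, pts in joints.items()
                if not is_stationary(pts))
```

Running the parsed joint data through moving_joints before drawing would avoid both the invisible splines and the wasted storage.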
For next week…
- Nelskati will discuss whether non-moving joints should be skipped when writing the .gcp file or when reading it.
- Nicole will continue work on reading input files. Then she’ll start creating a new animation from the points on the modified curve that Omar passes to her.
- Omar will write a script to write a new .gcp file with the new data generated by manipulating the beta parameters. He will also change the GUI design (for now) to just include global parameters.
- If Nicole and Omar can both read in and write files to each other by mid-week, then Omar will be able to tweak the beta-spline curves and pass them to Nicole, and she can hopefully play them back in the animation so we can see if the adjustment of the curve creates the appearance of a certain emotion! (This may not be possible within the scope of the next week, but we are very close.)
So, we're excited. We're getting much closer to emotion aspects of this project. We hope it works!
Until time next brings us together.
The grass is (sometimes) greener on the other blogs (so we might have to counter by making our blog background green),
Nelskati
Monday, April 4, 2011
Beta Post
So, we just had our beta-review. It was very informative.
To start, here are links to two videos which we showed to Norm and Joe at the review:
Beta-Spline Creation/Manipulation in Maya:
http://www.youtube.com/watch?v=cqRP3GKNFoA
This video shows a UI that Omar created in Maya with MEL. It takes in control points and outputs a beta-spline. The spline can be interactively manipulated with the bias and tension sliders.
Drawing Joint Translation Curve in Maya:
http://www.youtube.com/watch?v=2aq11vV5tqc
This video shows the result of a Python script that draws a single joint's translation over time in Maya. There is a discontinuity at the end of the curve, and we think we figured out why. At the moment, the curve is drawing the position of the joint at each frame. After frame 130 or so, the joint quits moving. This causes the same point to be passed in many times, which creates a discontinuity in the curve. Nicole thinks an easy fix would be to check when the first and last keyframes are set for each joint, then "cut" that section from the position list. Which leads to a question: Does anyone know the Python command to query if there is a keyframe at a certain frame?
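On the keyframe question: maya.cmds.keyframe(joint, query=True, timeChange=True) should return the list of times at which keys are set (worth verifying against our Maya version), which gives the first and last keyframes directly. The "cut" itself is plain list work; here is a sketch, assuming one sampled point per frame:

```python
def trim_trailing_duplicates(points, tol=1e-6):
    """Drop samples from the end of the list while they repeat the previous point.

    points is a list of (x, y, z) tuples sampled one per frame; once the
    joint stops moving, the tail is the same point over and over.
    """
    end = len(points)
    while end > 1 and all(abs(a - b) <= tol
                          for a, b in zip(points[end - 1], points[end - 2])):
        end -= 1
    return points[:end]
```

Running the per-frame samples through this before building the curve would stop the repeated final point from being passed in, which is what we suspect causes the discontinuity.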
What we learned:
One of our main issues that we came across is that we are using two different types of curves. Maya uses NURBS curves, whereas Omar is using beta-splines. Since Maya provides no way for us to define curve interpolation any differently, we need to export NURBS curve data to beta-spline control point data.
Since our animation curves generate such a large number of control points, Joe and Norm suggested that we export the NURBS control points in Maya at a uniform timestep of our choosing, producing somewhere between 10 and 25 control points (this range may change if we find we need more of them). These control points will then be passed to Omar's beta-spline creator, which will generate a beta-spline to be manipulated. After manipulation, we will determine the new set of points on the curve and pass them back to Maya to then be used as the new animation curve.
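A minimal sketch of that downsampling step. Since the points are sampled once per frame, picking indices at a uniform step is the same as a uniform timestep; the function name and the index-uniform choice are our assumptions, not anything Joe and Norm specified:

```python
def resample_uniform(points, n):
    """Pick n samples at a uniform index step, always keeping both endpoints."""
    if n >= len(points):
        return list(points)
    if n < 2:
        return [points[0]]
    # Fractional step between kept indices; round to the nearest sample.
    step = (len(points) - 1) / float(n - 1)
    return [points[int(round(i * step))] for i in range(n)]
```

For example, resample_uniform(samples, 20) would cut a 200-point curve down to 20 control points while keeping the first and last poses.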
The first part (taking the 200-something control points from the NURBS curve and outputting a list of control points for a beta-spline) will be a pre-processing step. We are treating the base animations that Nicole created as an animated gesture database. Since they should not change (in theory), we can also store the lists of control points that correspond to each gesture in a similar database.
Now, we're feeling pretty confident that we can get our parts working together and working well. Hopefully, we'll have some good progress to show by Thursday's post!
Until then.
From Russia with Love,
Nelskati