Thursday, January 27, 2011

First post after the first post!

So, you're probably wondering what we did this past week.

Nicole finally got SVN working on her laptop!!

Omar worked on his B-spline code from the curve editor assignment in 562. He hasn't been able to fix it yet, but Joe provided some resources that should help. We hope to get it working and move on to beta-splines within the next week.
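Since we'll be staring at these formulas a lot, here's the cubic beta-spline basis for quick reference. This is Barsky's formulation as we understand it (double-check against the actual paper before trusting our transcription). With bias \beta_1 and tension \beta_2, let

\delta = \beta_2 + 2\beta_1^3 + 4\beta_1^2 + 4\beta_1 + 2.

Then for u \in [0, 1], the four basis functions applied to four consecutive control points are

b_{-2}(u) = \frac{2\beta_1^3}{\delta} (1 - u)^3,

b_{-1}(u) = \frac{1}{\delta} \left[ 2\beta_1^3 u (u^2 - 3u + 3) + 2\beta_1^2 (u^3 - 3u^2 + 2) + 2\beta_1 (u^3 - 3u + 2) + \beta_2 (2u^3 - 3u^2 + 1) \right],

b_{0}(u) = \frac{1}{\delta} \left[ 2\beta_1^2 u^2 (3 - u) + 2\beta_1 u (3 - u^2) + \beta_2 u^2 (3 - 2u) + 2 (1 - u^3) \right],

b_{1}(u) = \frac{2}{\delta} u^3.

Handy sanity check: at \beta_1 = 1, \beta_2 = 0 we get \delta = 12 and these collapse to the uniform cubic B-spline basis ((1-u)^3/6 and friends), so once the B-spline code works we can test the beta-spline code just by plugging in those values.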

Nicole tried to get Maya working with Visual Studio in order to write MEL scripts. She ran into some external linking errors (21 of them, to be exact; we were able to fix 4). Do the SIGlab computers already have this working? If so, we might just work on those.

We weren't able to choose 8 to 12 gestures to animate, since the gesture database went down. (Thanks, Joe.) Joe did suggest picking one random "test" gesture to start from, so we've chosen a hand wave.

Let's talk about SmartBody:

We reread the SmartBody paper that Norm gave us and downloaded the code. So far, all we've been able to do is make them shoot lasers out of their eyes.


Here's a summary of what we understood from the paper. SmartBody is a system that takes multiple animations as input and uses controllers to blend between them, aiming for realistic in-between motion. If that's all it does, it doesn't seem worth the hassle of including it in our project, but it's possible we overlooked a feature. The documentation is not great, so we're still trying to explore all of SmartBody's functionality.
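To make the "controllers blend animations" idea concrete (mostly for our own sanity), here's a toy C++ sketch of how we picture it working. To be clear, none of these types or names come from SmartBody itself; this is just our mental model of pose blending:

#include <cmath>
#include <map>
#include <string>
#include <vector>

// Hypothetical names -- NOT SmartBody's actual API.
struct Quat { float w, x, y, z; };

// Normalized lerp between two rotations (a cheap stand-in for slerp).
Quat nlerp(const Quat& a, const Quat& b, float t) {
    // Flip the sign of b if needed so we interpolate the shorter arc.
    float dot = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
    float s = (dot < 0.0f) ? -1.0f : 1.0f;
    Quat q = { a.w + t * (s*b.w - a.w), a.x + t * (s*b.x - a.x),
               a.y + t * (s*b.y - a.y), a.z + t * (s*b.z - a.z) };
    float len = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    if (len > 0.0f) { q.w /= len; q.x /= len; q.y /= len; q.z /= len; }
    return q;
}

// A pose: each joint's local rotation.
typedef std::map<std::string, Quat> Pose;

// Each controller samples some motion (a clip, a procedural behavior)
// and reports a time-varying blend weight.
struct Controller {
    virtual ~Controller() {}
    virtual Pose evaluate(double time) = 0;
    virtual float weight(double time) = 0;  // 0 = inactive, 1 = full
};

// Layer controllers in order: each active one blends its pose over the
// result so far by its weight. Smooth in-between motion falls out of
// the weights ramping up and down over time.
Pose blendControllers(const std::vector<Controller*>& stack, double time) {
    Pose result;
    for (size_t i = 0; i < stack.size(); ++i) {
        float w = stack[i]->weight(time);
        if (w <= 0.0f) continue;
        Pose p = stack[i]->evaluate(time);
        for (Pose::const_iterator it = p.begin(); it != p.end(); ++it) {
            Pose::iterator r = result.find(it->first);
            if (r == result.end()) result[it->first] = it->second;
            else r->second = nlerp(r->second, it->second, w);
        }
    }
    return result;
}

In this picture, our hand-wave test gesture would just be one controller whose weight ramps from 0 up to 1 and back down, layered over an idle controller.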

If we decide to go ahead and incorporate this into our project, we found a webpage that claims SmartBody will work with Unity:
http://www.mail-archive.com/smartbody-developer@lists.sourceforge.net/msg00101.html

So for this coming week, we hope to:


-Get B-splines working
-Pick out our 8-12 gestures (once Joe stops trying to ruin our lives)
-Animate the hand-wave test gesture we selected (is there a rigged model we can use?)

Any questions, comments, just let us know!

With all our love,

Nelskati

Friday, January 21, 2011

First Post!

Here's our abstract for this project:

The goal of this project is to create a real-time simulation of a virtual marketplace. The environment, events, and interactions will all be plausible, lifelike, and generated in real time. To a subject placed into the virtual world, the marketplace will seem to be a living entity, with autonomous people who each have their own individualized agendas and motivations depending on their culture and occupation. Subjects will be able to interact and communicate with the agents in the marketplace. Thus, it is extremely important that the agents respond and react appropriately based on the situations at hand and any information that they can explicitly or implicitly infer from the subject. These interactions must be as realistic as possible so that this project can be used as a training scenario.

To see the rest of our design document, click on this.

For this upcoming week, we plan to read up on beta-splines and start looking at the SmartBody framework. We'll also begin sifting through the gesture database for our 8-12 basic gestures. And of course, we'll start to get our B-spline code working so that we can extend it to beta-splines.

Until next week.

Love,

Nelskati