Wednesday 22 May 2013

Week 11

This was the week in which we combined the separate work our group members had been doing into one cohesive project. The team working on the modelling and visualisation side gave Matt and myself a copy of the level, to which we began adding our Kinect interactivity. The first step was choosing the right bathroom to use, as they had modelled three sizes: small, medium and large. I decided the largest would be the best choice, since it would be much easier to use as an example for the real-world demonstration in our final presentation, and the group agreed.

Matt and I spent most of the week's tutorial, as well as most of the next day, working to implement the interactive elements from the Kinect.

Temperature:
At the moment, the temperature control is the only one we have set up with an actual gesture. As in our earlier demonstration using a ball as a substitute, the user holds their right arm horizontal to activate the control, then raises or lowers the variable by holding their left hand above or below the waist. I set up a red rectangular prism in the corner of the shower to represent the variable, as I thought this was the clearest way to show the changes. Our other option was to scale up a steam particle effect, but I felt the prism gave a much clearer indication of the gesture control's effect. This was the easiest interactive element to connect up, because we already had it running earlier and only needed to change what the variable controlled.
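The actual logic lives in a CryEngine flowgraph, but the gesture test boils down to a couple of height comparisons. A minimal Python sketch of the idea (joint names and coordinates here are illustrative, not the real Kinect SDK):

```python
def temperature_delta(right_hand, right_shoulder, left_hand, waist,
                      arm_tolerance=0.15, step=1.0):
    """Return a temperature change per update based on the gesture:
    the right arm held horizontal activates the control, then the left
    hand above/below the waist raises/lowers the variable.
    Joints are (x, y, z) tuples in metres, with y pointing up."""
    # The right arm counts as "horizontal" when hand and shoulder
    # sit at roughly the same height.
    if abs(right_hand[1] - right_shoulder[1]) > arm_tolerance:
        return 0.0  # control not active
    if left_hand[1] > waist[1]:
        return +step
    if left_hand[1] < waist[1]:
        return -step
    return 0.0
```

The `arm_tolerance` dead zone is an assumption on my part; the flowgraph uses its own comparison nodes rather than an explicit threshold like this.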



Height control:
The height controls for the sink and toilet turned out to be among the hardest things we worked on in the entire project. Our earlier creation along these lines, the "bench" (a sideways door) that adjusted its height to match the user's knee level, was a simplified version of what we set out to create here. The early test had no constraints, but we wanted to put several on these objects. One major concern was a maximum height, which was easy enough to limit in the flowgraph. The other was only activating the controls when the user actually wants them. For this, we decided on a proximity test: if the user is within a certain distance of the sink or toilet, the flowgraph activates the part of the graph that controls them; otherwise the fixture stays at its last set height. We set this up after Russell looked at the work and suggested that the constant movement would wear out the parts much faster in a real-life version, so stopping it when it's not needed avoids that wear and tear. It also stops the fixtures moving when, for example, you bend down to grab something, or stops the sink moving while you sit on the toilet.
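Put together, the proximity gate and the height clamp amount to something like the following sketch (again Python standing in for the flowgraph; the distances and limits are made-up placeholders):

```python
import math

def update_fixture_height(current, knee_y, user_pos, fixture_pos,
                          min_h=0.3, max_h=1.0, radius=1.2):
    """Track the user's knee height with a sink/toilet fixture, but
    only while the user stands within `radius` metres of it; outside
    that range the fixture keeps its last set height (no needless
    wear and tear). The height is clamped to [min_h, max_h].
    Positions are (x, z) tuples on the floor plane, heights in metres."""
    dx = user_pos[0] - fixture_pos[0]
    dz = user_pos[1] - fixture_pos[1]
    if math.hypot(dx, dz) > radius:
        return current  # user too far away: hold the last set height
    return max(min_h, min(max_h, knee_y))
```

The same gate means that sitting on the toilet (out of the sink's radius) leaves the sink untouched, which matches the behaviour described above.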



Light controls:
At the moment we have the lights set up so that they activate when the Kinect recognises someone in the room. This seemed like it would be easy, but it turned out to be difficult because of the way the Kinect works. Because of its limited range, it struggles to see someone on the outskirts of the example room from where we have it situated. Also, when the Kinect loses sight of someone who has, for example, left the room, it keeps the last set of data it registered, so the orbs we use to represent the player stay where they last were. This means the lights will mostly stay on, even if you leave the room and the Kinect's range.
We also set up controls for the mirror lights. They work similarly to the toilet, turning on when the player moves near them.
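One way the frozen-data problem could be worked around, though we have not implemented this in the flowgraph, is to treat a track as stale when the reported joints stop changing for too long. A hypothetical sketch:

```python
class PresenceLights:
    """Keep the lights on only while the Kinect data looks 'fresh'.
    When the Kinect loses a skeleton it keeps repeating the last
    registered coordinates, so a report only counts as fresh if the
    position actually changed; after `timeout` seconds without fresh
    data the room is treated as empty and the lights switch off."""

    def __init__(self, timeout=5.0):
        self.timeout = timeout
        self.last_update = None  # time of the last *changed* report
        self.last_pos = None

    def report_skeleton(self, pos, now):
        # Frozen tracks repeat identical coordinates; ignore those.
        if pos != self.last_pos:
            self.last_pos = pos
            self.last_update = now

    def lights_on(self, now):
        return (self.last_update is not None
                and now - self.last_update <= self.timeout)
```

This is only a sketch of the idea; a perfectly still user would also go "stale" under this rule, so a real version would need a small movement threshold rather than exact equality.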

Emergency fall detection:
One of our earliest ideas was a test to determine whether the user had fallen over and, if so, to call an emergency or family line. We considered this a really important feature, given that the entire project is aimed at the elderly. One problem we kept running into was that when the user lies down horizontally, the Kinect loses its ability to track them properly. The coordinates sent to the orbs jump around randomly, which can sometimes turn off the trigger. We went through about three different tests before reaching one that works reliably. It checks whether the waist height stays within a certain limit of the head over a period of time. This counteracts things like bending down to grab something, which would not meet both the time and position limits.
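The combined position-and-time test can be sketched like this (Python standing in for the flowgraph; the 0.3 m limit and 2-second hold are placeholder values, not the ones we tuned):

```python
def check_fall(samples, height_limit=0.3, hold_time=2.0):
    """Detect a fall: the head must stay within `height_limit` metres
    of the waist continuously for at least `hold_time` seconds.
    Bending down briefly fails the time test; standing upright fails
    the height test. `samples` is a chronological list of
    (timestamp, head_y, waist_y) tuples in seconds and metres."""
    start = None  # when the head first dropped level with the waist
    for t, head_y, waist_y in samples:
        if abs(head_y - waist_y) <= height_limit:
            if start is None:
                start = t
            if t - start >= hold_time:
                return True  # held low long enough: treat as a fall
        else:
            start = None  # stood back up: reset the timer
    return False
```

A real version would also need to tolerate the jumpy coordinates mentioned above, for example by smoothing the joint positions before testing them.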



This week I also created a number of videos. I very much dislike being filmed, so I had Matt demonstrate our work while I filmed it. I then edited together the video I took of him demonstrating and a screen capture of the interactivity working in CryEngine, synced the two, and uploaded the result to YouTube.
