
    Seeking teaching advice (1 kinect + 10-15 high school students)

Kirsten
      last edited by

      Hi everyone

My question might be a weird one. I have used Isadora with high school students several times for projects (poetry films) in my creative writing classes. We've built interactive films using an Xbox Kinect, and it has worked OK. But up until now, the collaboration has been with me on my laptop, which is loaded with Isadora and the OpenNI tracker plugin. My students' role was to write their poems and create the media they wanted to use with the Adobe suite and various other software tools. Based on their vision, I have been the one creating the patches with their media and poetry. So when we test and record the films, we use my laptop, and they are the ones moving in front of the Kinect camera. It has been kind of clunky, but the kids have been happy with the collaboration for those particular classes (which are writing classes, not computer classes).

This spring, I'd like to do a similar unit with my students, but rather than building the patches for them, I'd like to teach them some basics of how to use Isadora themselves. So I thought we'd rent an Isadora license for a week for each kid. The school is willing to pay for the rentals, which is good, but I am having a hard time imagining how we will all share my personal Kinect camera in an efficient way, and it is definitely out of budget to purchase more Kinects. On the other hand, if this one unit goes well, I may be able to convince the school to buy a bunch of perpetual licenses and more equipment so that I can do a much bigger project next year.

Anyway, I have a vague idea that I will create a standard patch to get the students started on their own laptops and then teach them how to modify it with other actors. But how do I emulate the Kinect input so that they can all work on their individual projects simultaneously? I was thinking they could use a Mouse Watcher actor as input to give them a rough idea of the interactivity while they wait for their individual turns at the camera. Or perhaps I could record a person's movement and have the kids use that recording with their patches while they wait their turns. But it seems like neither of these solutions would emulate the depth that we get with the Kinect.

So I thought I would ask you all. It doesn't have to be perfect. I just want a way to emulate or fake the Kinect input so that all of the students can work simultaneously on their own laptops rather than all of us sharing one. How do I do this given that we only have one camera?

      Thanks!

DusX Tech Staff @Kirsten
        last edited by

        @kirsten

The idea of making a recording with OpenNI is a good one: you will get the exact same skeleton data from the recording, and you can share the recording with each student so they each have control of it. That provides a lot of flexibility, but not the live element, or the ability to try specific moves on the fly.
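To picture the playback side: Isadora can drive the recording itself, but a hand-rolled replay is also easy. This sketch assumes a hypothetical CSV export with columns `t, joint, x, y, z` (not a format OpenNI actually produces) and calls a user-supplied `send` callback at each frame's original time:

```python
import csv
import time

def replay(path, send):
    """Replay recorded skeleton frames at their original timing.

    Assumes a hypothetical CSV with columns t, joint, x, y, z,
    where t is seconds since the start of the recording.
    `send` is any callback, e.g. one that emits an OSC message.
    """
    start = time.monotonic()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Sleep until this frame's original timestamp comes up.
            delay = float(row["t"]) - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            send(row["joint"], float(row["x"]),
                 float(row["y"]), float(row["z"]))
```

Each student could loop this against their own copy of the recording, so everyone rehearses with identical movement data while waiting for camera time.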

Another option, which might suit you well, is to use your laptop with the Kinect and share all of the skeleton data with your students as OSC channels. The easiest way to do this would likely be to use an additional Wi-Fi router that you and your students connect to (not connected to the Internet or the school network). This wireless network would just carry the OSC skeleton data.
This would allow all of the students to use the same skeleton data at the same time. They may need to take turns interacting with the Kinect to work out specific cases, but it should provide a good experience.
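For anyone curious what that OSC traffic looks like on the wire, here is a minimal Python sketch that hand-encodes an OSC 1.0 message and broadcasts it over UDP. The address pattern `/skeleton/1/head` and port 9000 are placeholders; match them to whatever your Isadora OSC Listener actors expect:

```python
import socket
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC 1.0 message with float arguments."""
    def pad(b):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "f" * len(floats)).encode())  # type-tag string
    for value in floats:
        msg += struct.pack(">f", value)  # OSC floats are 32-bit big-endian
    return msg

def broadcast_joint(joint, x, y, z, port=9000):
    """Send one joint's position to every machine on the local subnet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(osc_message("/skeleton/1/" + joint, x, y, z),
                ("255.255.255.255", port))
    sock.close()

# e.g. broadcast_joint("head", 0.1, 1.5, 2.3)
```

In practice Isadora's own OSC Transmit actor does this for you; the point of the sketch is just that every laptop on the classroom router can listen to the same stream at once.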

TroikaTronix Technical Support

        • New Support Ticket Link: https://support.troikatronix.com/support/tickets/new
        • Isadora Add-ons: https://troikatronix.com/add-ons/
        • My Add-ons: https://troikatronix.com/add-ons/?u=dusx

Running: Win 11 64-bit, i7, M.2 PCIe SSDs, 32 GB DDR4, NVIDIA RTX 4070 | located in Ontario, Canada.

Kirsten @DusX
          last edited by

          @dusx

          Thank you! Those ideas are very helpful. I am going to test them out this week and I may be back with a few more questions, but I'm excited to realize that I can find a way to make individual projects work. I think these projects are going to be fun, and the students will learn more than they did in previous years.

          Kirsten

Woland Tech Staff @DusX
            last edited by Woland

            @dusx said:

            The idea of making a recording with OpenNI, is a good one, you will get the exact same skeleton data from the recording, and can share the recording to each student so they have control of it.

Additionally, if you set the default save location for these recordings to a folder on your computer that is always synced to Dropbox or Google Drive, you can make recordings and the students can access and download them almost immediately.

You could also broadcast the live Kinect depth video to them over NDI.

            TroikaTronix Technical Support
            New Support Ticket Link: https://support.troikatronix.com/support/tickets/new
            TroikaTronix Support Policy: https://support.troikatronix.com/support/solutions/articles/13000064762
            TroikaTronix Add-Ons Page: https://troikatronix.com/add-ons/

            | Isadora 3.2.6 | Mac Pro (Late 2013), macOS 10.14.6, 3.5GHz 6-core, 1TB SSD, 64GB RAM, Dual AMD FirePro D700s |

Woland Tech Staff @Kirsten
              last edited by

              @kirsten

              These Kinect files of mine might be helpful as a starting point: https://www.dropbox.com/scl/fo/5vx98o6dne2e5g2hz0pyn/h?dl=0&rlkey=dcauvbj2893b674e3i8iix7wo


Juriaan Tech Staff @Kirsten
                last edited by

                @kirsten

I would personally do a combination of the two: the recordings you already made, plus maybe video footage from the Kinect's RGB camera, so the students can see the image, the puppet, or whatever you wish to do with the Kinect data.

And then a few scenes with live data! Either through OSC, NDI, or whatever works in your situation 🙂

                Isadora 3.1.1, Dell XPS 17 9710, Windows 10
                Interactive Performance Designer, Freelance Artist, Scenographer, Lighting Designer, TroikaTronix Community moderator
                Always in for chatting about interaction in space / performance design. Drop me an email at hello@juriaan.me
