Hi
This is the building I want to map, and as you can see, the pillars and arches sit further forward than the rest of the building.
I will be using either 3 x 8,500-lumen or 4 x 7,500-lumen projectors (arranged either 2 x 2 or 3 x 1 portrait).
If I blend to the front surface, the "cross-over point" will be out at the back.
I'm not sure which direction to take.
I note that in Isadora a projector can't be assigned to multiple stages.
Is there a solution within Izzy?
Should I consider going out via Syphon to either QLab or MadMapper?
The event is in two weeks, so I am a little daunted by what I've taken on.
Regards
Jim.
Hi, has anyone managed to send audio from Isadora to multiple outputs on multiple devices using Dante Via?
Dear all,
I have an issue with one of the stages going to my beamers looking dull.
Is there a way I can brighten up that stage so all the images are a bit brighter?
Tr742
Arthur
Hello all,
I was curious if anyone has an approach they use for MIDI CC latching, i.e. preventing a MIDI CC value from changing until the incoming CC value matches where it was previously left off. Picture a MIDI controller where the same 8 knobs control different parameters in different Isadora scenes. If I leave scene A, use the MIDI controller in scene B, and then come back, the values of the knobs could be much different than when I left scene A. Using any of these controls then creates a large jump in value that is often undesirable for me. I would like to implement a behavior such that moving those controls when returning to scene A has no effect until the control reaches the value previously set upon leaving that scene, then picks right up from there and functions normally. I imagine this would be some combination of gate and comparator actors, but I'm stuck at how to store and poll the last held CC value when entering a scene. Has anyone else implemented a similar function in one of their patches? Any help or guidance would be much appreciated, as this is something that would make operating my patches live much smoother.
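To spell out the behavior I'm after (often called "pickup" or "soft takeover"), here is a minimal Python sketch of just the logic, not Isadora actors; all names are made up:

```python
# Minimal "pickup" / soft-takeover sketch (hypothetical names, not an
# Isadora API). Each scene keeps the value it was left at; incoming CC
# data is ignored until the knob reaches or crosses that stored value.

class PickupCC:
    def __init__(self):
        self.stored = 0         # value the scene was left at
        self.picked_up = False  # True once the knob has caught up
        self.last_in = None     # previous incoming CC value

    def enter_scene(self):
        # Call when the scene (re)activates: require pickup again.
        self.picked_up = False
        self.last_in = None

    def process(self, cc_in):
        # Returns the value to apply, or None while still latched.
        if not self.picked_up:
            # "Crossed" means the last and current inputs straddle the
            # stored value, so the knob swept past it between messages.
            crossed = (self.last_in is not None and
                       (self.last_in - self.stored) * (cc_in - self.stored) <= 0)
            if cc_in == self.stored or crossed:
                self.picked_up = True
            self.last_in = cc_in
            if not self.picked_up:
                return None     # ignore input until pickup
        self.stored = cc_in     # remember where we leave off
        return cc_in
```

In Isadora terms, I imagine the stored value could live in a Set/Get Global Values pair or a Javascript actor, with the comparison re-armed on scene entry.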
Best,
JP
Hi everyone,
I have been using EpocCam as a live webcam (over WiFi and with USB) for many years successfully with Isadora. It seems I can't get this to work with the Elgato Camera Hub software anymore... While Isadora recognizes Elgato Camera Hub, I can't get an image and am left with a green screen (see screenshot). I have tested on Intel and M1 MacBooks and get the same problem (of course I have given permission in the System Preferences :-). Has anyone experienced the same problem?
What would you use as an alternative? I am OK with Iriun Pro; any other ideas?
Thanks a lot!
Hi again,
So, I am trying to work with a virtual stage controlling some videos in the background (to make a 3D sphere that I can fade through scenes), and so far it seems to kind of work. Now, is there a way to control when the videos begin from scene to scene? I mean, I want to start a new scene with the video on the virtual stage, and I would like the video to start from the beginning. Should I make two virtual stages and then reactivate the one I need while it is mixing the videos, or how would I do that? I don't know if that makes sense. In other words: is there a way to control what happens in the virtual stage, e.g. when the video starts, if I want to use it for a smooth video fade?
Ideas are welcome.
All the best
Eva
Hello there,
I am getting terrible quality on the live video input in Isadora. As it draws, the degradation is really obvious and the image is unusable. It took me a while, but I discovered that it really is in Isadora that the input is this bad.
I have attached an image of the video input in Isadora, and the same (same material) in MadMapper. Look at the difference in quality.
I tested it on a PC (Windows 11, RTX 3080) and on a MacBook Pro M3 Max (late 2023), with two different capture cards (DeckLink Duo and DeckLink 8K Pro), and the result is the same.
Am I missing something hidden in the setup, or is there an issue here?
It's a big problem for me, as I can't use this image quality in the show.
Hello everyone! I am working with particle systems and I cannot generate a descending cascade effect, since I don't see a way to generate particles along the ENTIRE X axis simultaneously. Any suggestions?
My idea, once this is achieved, is that the audience can interact in some way with that waterfall (particle waterfall) through motion tracking.
Something like this
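To make the idea concrete, here is a rough Python sketch of the effect I'm after; it's just the logic, not Isadora code (in Isadora I imagine a Random or Wave Generator actor driving the emitter's horizontal position would do the same job):

```python
import random

SCREEN_W, SCREEN_H = 1280, 720
GRAVITY = 900.0     # px/s^2, pulls particles downward
particles = []      # each particle: [x, y, vx, vy]

def spawn(n_per_frame=20):
    # Randomize the spawn x for every particle, so the entire
    # width of the screen is covered within a few frames.
    for _ in range(n_per_frame):
        x = random.uniform(0, SCREEN_W)   # anywhere on the X axis
        particles.append([x, 0.0, 0.0, 0.0])

def step(dt=1 / 60):
    spawn()
    for p in particles:
        p[3] += GRAVITY * dt   # accelerate downward
        p[0] += p[2] * dt
        p[1] += p[3] * dt
    # Drop particles that have fallen off the bottom edge.
    particles[:] = [p for p in particles if p[1] < SCREEN_H]
```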
Thanks a lot!
Best,
Maxi-RIL
Hello everyone! I wonder if it is possible to achieve motion tracking the way TouchDesigner does with OPTICAL FLOW? Leaving aside the Kinect and other depth cameras, of course.
The optical flow operator in TouchDesigner seems very straightforward, but I always prefer Isadora!
Eyes actor + ...? (Possible scenario: an installation where the audience manipulates the projection of a particle system.)
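To be concrete about what I mean: one workaround I can imagine is running the optical flow outside Isadora and sending the result in over OSC. A rough Python sketch using OpenCV's Farnebäck dense optical flow (the OSC address and port are arbitrary placeholders; Isadora would listen on the same port with an OSC Listener actor):

```python
import cv2
import numpy as np
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

client = SimpleUDPClient("127.0.0.1", 1234)  # port must match Isadora's OSC input
cap = cv2.VideoCapture(0)                    # webcam at index 0

ok, frame = cap.read()
if not ok:
    raise SystemExit("no camera found")
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense Farnebäck optical flow: a per-pixel (dx, dy) motion field.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    prev_gray = gray
    # Reduce the field to something simple to map, e.g. mean motion.
    mean_dx = float(np.mean(flow[..., 0]))
    mean_dy = float(np.mean(flow[..., 1]))
    client.send_message("/flow", [mean_dx, mean_dy])  # hypothetical address
```

The two mean-motion values could then drive the particle system's velocity or wind inputs inside Isadora.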
Thanks friends !
RIL
Hi there,
I am working with a projection on a circular screen, and I have set up my projection so that it projects as if onto a 3D object (a sphere), so there is a video on a 3D sphere. This all works within each scene, but when I try to fade from one scene to the next, instead of making a fade, Isadora creates a pattern like a disco mirror ball or something. Any suggestions on how to solve this? Thanks,
Cheers, Eva