Hello!
It's been a few months since I needed to use Syphon Virtual Camera, so I dusted off the application and fired it up today, only to discover that it's no longer connecting with OBS Virtual Camera in applications like Zoom. I can see the program feed inside Syphon Virtual Camera from the origination point (QLab, for example), but that picture is not being captured by the OBS Virtual Camera.
I've tried uninstalling and reinstalling OBS, Syphon Virtual Camera, Zoom, etc., and now I can't even get the Syphon Virtual Camera application to do anything when it opens. (The app appears in the Dock, but nothing appears on screen.)
I am running Ventura 13.5.1. The same signal flow works on my MacBook Pro M1, which I just updated to Sonoma 14.0.
Please help! I have a virtual show next week and need to capture Syphon into Zoom!
Hi all,
I'm working on a data-intensive project involving about 200 OSC channels that each transmit a time-synced data packet of 18 values, which drive visual generation elements. The packets are generated on the local machine by a Python script that interfaces with an AWS database, and are then sent locally to Isadora.
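For reference, here's roughly what the sending side looks like (a minimal sketch, assuming python-osc; the address, port, and values below are placeholders):

```python
# Minimal sketch of the sending side, assuming python-osc.
# The address, port, and values are placeholders for illustration.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 1234)  # Isadora's default OSC input port

values = [0.0] * 18                # one packet: 18 values for a single address
client.send_message("/sensor20", values)
```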
In prototyping the project for the first 50 packets, I've run into three issues with the OSC multiple listener.
1) Data received in a listener is sometimes offset by one channel from the channel number specified in the Stream Setup. For example, a channel called /sensor20 is set in Stream Setup to start at OSC channel 20, but in the OSC multiple listener the first data value of the packet is only received if the starting channel is set to 19, even though the channel address shows in the listener as channel 20.
2) Special value types are not transmitted for unixtime and MAC addresses: the multiple listener will only display the first few digits (and irregularly) of unix second or millisecond values when they are included in an OSC packet, and when a MAC address is encoded it will only display the first two characters, regardless of how it is encoded (see the string-encoding sketch after this list).
3) The most significant issue: I've just discovered the 800-channel limit. Is there a way to get around that? I have approximately 200 "addresses" of data (which in other OSC implementations would each be a channel), but as Isadora splits these into a channel per value, the project will consume ~3,600 Isadora OSC channels.
If that is a hard limit, workaround suggestions are most welcome. While it is desirable to stay in OSC (the data is also being sent to Max/MSP), the data source is a DynamoDB JSON stream, so perhaps we could be sending JSON to Izzy instead.
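For issue 2, one workaround I could test on the sending side (a sketch building on the sender above, with placeholder values; not a confirmed fix) is to encode the unix timestamp and the MAC address as OSC strings rather than numeric arguments, since a millisecond timestamp overflows a 32-bit OSC integer:

```python
# Sketch: send unixtime and a MAC address as OSC strings instead of numbers.
# Placeholder address and MAC; assumes the python-osc client shown above.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 1234)

unix_ms = str(int(time.time() * 1000))  # too large for a 32-bit OSC int, so send as a string
mac = "a4:83:e7:12:34:56"               # send the whole MAC as one OSC string

client.send_message("/sensor20/meta", [unix_ms, mac])
```

If the listener still truncates these, that would at least confirm the problem is on the receiving side rather than in how the packet is built.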
Thanks for your thoughts,
Working with Izzy 3.2.6 on macOS 13.6.
Happy to share more details.
Best
Ian
Re: [SOLVED] Active video across scenes (fade in videos in other scenes on top?)
I know this thread says solved, but I'm running into the same problem (I think) and I can't seem to figure out the workaround. I've also found many different threads on the same topic, and yet none of those solutions seem to work either.
I'm activating Video In Watchers in scene one and want black shadow PNGs (various shapes) to track in front of the live feed. It works great when it's all in one scene, but now that I'm trying to separate things out, I can't get the black shadows to come to the top layer in a transparent blend mode. I have the projectors (all in a user actor) in the second scene set to layer 10 for safety's sake. I've tried ungrouping layers, but then I can't get the live video in the previous scene to activate at all. I've also tried moving this base-layer scene (with the Video In Watchers) to the end, but that isn't working either.
Here's the general idea. Any thoughts?
nosferatu-10-11-troubleshoot.izz
And a general thank you to everyone responding on this forum! I'm so grateful for the responsiveness here.
Hello
The contextual menu on actors via right-click is not appearing for me.
I thought my CadMouse was angry because of Sonoma and drivers, but I just tried another mouse and it's not working either.
Also, both mice give me the contextual menu in all other applications.
Any ideas?
Isadora 3.2.6 - Mac Studio M2 Ultra - macOS Sonoma
Hello, happy to report:
4x Blackmagic UltraStudio Monitor 3G (via Thunderbolt)
Mac Studio M2 Ultra
macOS Sonoma
Isadora 3.2.6
Runs like a charm (or like a little beast, actually; I haven't seen it over 10% yet).
________________
However, I am not sure how to set up my stage. I can talk to the Blackmagic outputs fine in Izzy, but I struggle with the theory.
The setup:
2x 1080p projections stacked on the left and 2x 1080p stacked on the right, to make a 3840-wide panorama.
I have two Blackmagic outputs for the left half of the image and two for the right. Doubled up, yes, but I need individual projectors so that I get discrete mapping on each output.
My composition is 3840 px wide, and I would love for the output to split it and send the left half to two of the Blackmagic outputs and the right half to the other two.
I did watch the Guru Session on this, and it makes me think this should be easy, yet I have not found a good solution.
Thanks for any advice
I'm trying to run a video capture from my webcam and a Kinect feed using the OpenNI actor at the same time. Unfortunately, it seems to be an either-or situation: as soon as I Start Capture for the webcam, the Kinect image freezes, and vice versa. Any idea how to solve this? (I kind of need this quite urgently for an installation.)
Thanks, all.
Hi folks, I'm looking for an odd function that doesn't seem to have been addressed previously in either software community.
My aim is to extract an audio file's metadata (artist & track title) from Native Instruments' Traktor and project it onto a screen. Traktor isn't able to do this directly, so one workaround I found is an application called Now Playing (NP), which extracts the metadata while Traktor's broadcast function is engaged.
From what I understand, NP serves the text as an image at a specific localhost address, and one of the examples on their website connects it to OBS's browser source. While I managed to get OBS to display that image, trying the same address with the "Get URL Text" actor did not work: it simply outputs an HTTP error message in Post mode, and a very long string of code in Get mode, which frankly I can't say I understand very well.
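To sanity-check what NP is actually serving, I can hit the address with a small script (a sketch; the URL below is just a placeholder for whatever address NP reports):

```python
# Check what the Now Playing endpoint actually returns.
# The URL is a placeholder; substitute the address NP reports.
import urllib.request

URL = "http://127.0.0.1:8080/nowplaying.png"  # hypothetical address

with urllib.request.urlopen(URL) as resp:
    print(resp.headers.get("Content-Type"))  # e.g. "image/png" vs. "text/plain"
    print(resp.read(200))                    # first 200 bytes of the payload
```

If the Content-Type comes back as an image, that would explain the long string in Get mode: "Get URL Text" would be printing raw image bytes, and I'd need a text endpoint (or a different route entirely) instead.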
Hope the above explanation made sense, happy to provide more details and/or screenshots if it helps.
Any pointers on this specific issue, or any other ways of getting track metadata out of Traktor would be much appreciated.
Hello folks! This is pretty basic Izzy stuff, but I've never needed to do it before:
I have a preset scene with an OpenNI Tracker running into a Broadcaster.
The next scene has a Listener that injects the vid-gpu from the Kinect through a complex animation, so I trigger this animation using an Activate Scene actor. It works nicely.
The problem is: if I click into the second scene, it deactivates the first one.
And if I use an Activate Scene actor in the second scene, pointing to the Kinect scene, it re-activates the first scene and therefore resets the OpenNI Tracker.
So what I would like to do is keep both scenes active whilst I open the second scene's editor, just in case I need to edit anything on the fly. I don't want to have to go into Blind Mode if I can help it.
Any ideas?
I was just trying to get some clarification on whether it's possible to reverse-engineer control of the control panel buttons/sliders/etc. for an actor in the scene. For instance, can an Envelope Generator actually affect a control panel slider beyond "showing the value of a linked property"? And if not, I remain unclear from the thread below on how to use MIDI or OSC to control the control panel buttons.
Thanks!
Josh
Re: [[ANSWERED] osc to trig button on "show control" page](/topic/8403/answered-osc-to-trig-button-on-show-control-page)
Dear all,
I'm programming my project with the v3 software, and something strange happened.
All of a sudden I lost my edit page actors.
When I select scenes there's no problem: I see the output on the projector and in the monitor section. But I cannot see the actors, so I can't edit them.
Help!
Tr742
Arthur