I should also mention that most of the hardware was found in a cupboard somewhere, covered in dust. Venues hold many secret things that you don't know about. A great example: attendees kept bringing Andy Parsons a chair, even though the entire premise of the sketch was him not having anything to sit on. We didn't store any chairs there. Four years later, I still don't know where the chair came from.
Initially I had tried to use VLC (over HTTP) to push live video feeds from behind the audience to the backstage wings (the big, expensive cable we had broke, so we had to improvise), but this introduced terrible latency, and on some clients it even grew progressively worse (from 10 to 15 seconds). I tried many optimisations, but to no avail. The lowest latency I could achieve with VLC was around 3 seconds, which isn't really great if actors need to be in time with musicians, as you can probably guess.
What's the solution?
Well, FFMPEG and RTP using the MPEG-2 codec was the answer. This introduced anywhere from 150-300ms of latency, which was almost perfect. However, using VLC media player as a client seemed impossible, and since I was using unicast RTP, multiple clients weren't an option. About 30 minutes before the show started, I had come up with a script that introduced the lowest latency I could find (trial and error over the course of around 2 hours).
Using FFPlay as a client ran without issue and solved all of the problems that VLC was causing. The only downside is that there is no user interface with FFPlay (or FFMPEG), so if you are a less technical user, I would advise looking for alternatives. That said, FFPlay did not require any maintenance on the nights that I was using it, other than the odd Ctrl-C to stop the script and start it again after server reconfigurations.
How do I do this?
Grab FFMPEG if you haven't already. I don't think what we're doing works as well as it should on later builds of FFMPEG, so you may want to grab a version from mid-to-late December 2015.
Please be aware that RTP in this context is a point-to-point protocol (NOT multicast), so you will need to stream to the IP address of the CLIENT machine running FFPlay. Two clients cannot simultaneously receive a unicast RTP stream. The packets are accepted by the client machine whether it is viewing the stream or not, so bear that in mind if network bandwidth is limited or you're using old wireless cards.
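If you really do need two viewing positions, one workaround I believe should work (I didn't need it on the night, so treat this as an untested sketch) is ffmpeg's tee muxer, which duplicates one encode to several unicast destinations. The IP addresses below are placeholders for your two client machines:

```shell
# Duplicate a single MPEG-2 encode to two unicast RTP clients via the tee
# muxer. The capture device name and both client IPs are placeholders.
# -map 0:v is required because the tee muxer does not map streams itself.
ffmpeg -f dshow -i video="[Video Capture Device]" -s 768x576 -r 50 \
  -vcodec mpeg2video -b:v 10000k -map 0:v -f tee \
  "[f=mpegts]rtp://192.168.0.10:1234|[f=mpegts]rtp://192.168.0.11:1234"
```

Note this encodes once but sends twice, so it doubles the network traffic from the server, not the CPU load.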
The commands are as follows:
ffmpeg -f dshow -i video="[Video Capture Device]" -s 768x576 -r 50 -vcodec mpeg2video -b:v 10000k -f mpegts rtp://[Client IP]:1234
ffplay rtp://[Client IP]:1234
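On the client side, a few FFPlay flags may shave off further latency; these weren't part of my original script, so consider them optional extras worth testing:

```shell
# nobuffer disables input buffering, low_delay hints the decoder to avoid
# frame reordering delay, and a tiny probesize/analyzeduration cuts the
# time FFPlay spends probing the stream before displaying it.
ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 \
  rtp://[Client IP]:1234
```

The trade-off is robustness: with buffering disabled, a brief network hiccup shows up on screen immediately instead of being smoothed over.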
Use this to find your video device:
ffmpeg -list_devices true -f dshow -i null
NOTE: You'll have to make some slight modifications if you're using anything other than Windows, primarily the "-f dshow" part; that'll be whatever your OS uses instead.
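For instance, on Linux the rough equivalent would use Video4Linux2 instead of DirectShow. I only ran this setup on Windows, so the following is an assumption based on ffmpeg's v4l2 input, with /dev/video0 as a placeholder device:

```shell
# Linux sketch of the same stream: v4l2 capture instead of dshow.
ffmpeg -f v4l2 -i /dev/video0 -s 768x576 -r 50 \
  -vcodec mpeg2video -b:v 10000k -f mpegts rtp://[Client IP]:1234

# Listing capture devices on Linux is done outside ffmpeg, e.g. with
# v4l2-ctl from the v4l-utils package:
v4l2-ctl --list-devices
```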