Description

This was built as the cheapest possible machine that could handle multicamera live streaming and switching with replay, animated graphics, and audio. This system travels a lot, so it will need to be rebuilt into a rackmount-based system soon.

Sidenote: I set up the Corsair RAM RGB to show me CPU temperature, because I was worried about overheating with such a small cooler on a pretty heavy CPU, wattage-wise. The lights kind of resemble an incandescent bulb or a tube from a tube amp: when the CPU is at a low temperature the lights are a dim orange-yellow, and the hotter the CPU gets, the closer they get to full bright white. This made it easier to monitor during initial testing while getting the fan curves to their sweet spot.
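Not part of the original notes, but to make that color mapping concrete, here is a minimal Python sketch of the ramp described above: temperature linearly interpolated from a dim orange-yellow up to bright white. The temperature range and endpoint colors are assumptions for illustration; the real behavior is configured in iCUE rather than scripted.

```python
# Sketch only (not the actual iCUE config): map CPU temperature to a color
# on a dim-orange -> bright-white ramp, like a bulb or tube warming up.
# The 40-85 C range and the endpoint colors are illustrative assumptions.

COOL_TEMP_C = 40.0            # at or below this: dim orange-yellow
HOT_TEMP_C = 85.0             # at or above this: full bright white
COOL_COLOR = (80, 40, 0)      # dim orange-yellow (R, G, B)
HOT_COLOR = (255, 255, 255)   # bright white

def temp_to_rgb(temp_c: float) -> tuple:
    """Linearly interpolate between the cool and hot colors."""
    t = (temp_c - COOL_TEMP_C) / (HOT_TEMP_C - COOL_TEMP_C)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return tuple(round(c + (h - c) * t) for c, h in zip(COOL_COLOR, HOT_COLOR))

if __name__ == "__main__":
    for temp in (35, 50, 65, 80, 90):
        print(temp, "C ->", temp_to_rgb(temp))
```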


Comments

  • 13 months ago
  • 1 point

+1 for the multiple Blackmagic cards.

What are you using? Wirecast? vMix? Or CasparCG?

  • 13 months ago
  • 1 point

vMix. Wirecast is not so great for multicam, in my opinion. It's fine for basic stuff as an encoder with a few sources. I talked to their engineer at NAB a few weeks ago and literally watched his hardware-based Wirecast rig crash and have to reboot. CasparCG is awesome, but vMix's new GT Title animator has been amazing, and being able to pair that with data sources and scorebot means I've really not had a reason to use Caspar.

  • 13 months ago
  • 1 point

It is a shame the only solution on Linux is OBS, now that it has added DeckLink output (with alpha channel for keying). On Windows 10 you're at the mercy of Windows Updates.

  • 13 months ago
  • 1 point

I've honestly been OK with, and gotten used to, the Windows mess. vMix has a setting that blocks Windows updates, so I've never had an issue with that while live, except for the very first time I ever tried vMix. We were literally 5 minutes out from going live and Windows just completely went into update mode! Luckily it was a test show, but it was still embarrassing because it was supposed to prove to my bosses the future of an NDI and software-based workflow, lol! Needless to say, I had a lot of work to do to come back from that. But it is awesome that Linux has DeckLink support now; building some NDI-over-public-IP servers on a Linux platform would be pretty great.

  • 13 months ago
  • 1 point

Once it's fully NDI, it comes down to the decode power of the machine, and Awesome Games Done Quick is a major contributor to OBS now. They do a weeklong charity livestream using strictly OBS and off-the-shelf capture cards.

  • 13 months ago
  • 2 points

I've been following AGDQ and SGDQ for a little while now and have always been impressed that they do everything through OBS. I'm a huge fan of what they do and have been trying to make it out to one sometime, both as a fan and to take a peek at their technical setup.

  • 13 months ago
  • 1 point

How is the 550W Corsair holding up in this rig? No issues or anything?

  • 13 months ago
  • 1 point

We'll see; that Corsair is definitely the weak link in this system. I've only had it since February, so it's hard to tell, but it has been good so far and has been running night and day for almost a month now that it is working part-time in the studio instead of just being mobile. So no issues yet, but I may swap it out eventually for safety; I was just a little tight on budget when I first got the parts for this build.

  • 13 months ago
  • 2 points

I thought about doing the Sage board for my non-traveling studio server build, but for this mobile live production build I wanted to go with mini-ITX. The plan is to eventually house this in a 2U server case with the Dynatron L3 server AIO cooler. I also thought about doing the 1660 card, but in my testing I could never get the horsepower I needed from a 1060; the 1070 and 2060 were about the sweet spot for handling camera shots, replay, and data sources, so I was afraid to really spend the time testing the 1660 (have you tried it? I would love to know your thoughts and results). Plus, your build doesn't account for physical SDI capture in the price unless your plan is to go exclusively with NDI, which I love, but unfortunately I still need to be somewhat backward compatible. Otherwise that build looks good, though you don't really have a reason to go with a 44-lane CPU since the only PCIe card you're adding is the x4 10Gb NIC. Might as well stick with the cheaper 28-lane CPU, right?

  • 13 months ago
  • 1 point

You are right. I have just realized my mistake.

[comment deleted by staff]
  • 13 months ago
  • 2 points

I probably shouldn't have called it a "streaming rig". It's more of a competitor to a $20,000 TriCaster: it can take up to 12 SDI camera sources and as many as 9 NDI sources over the single gigabit NIC. The idea is to also add a 10Gb NIC in the future build to allow as many as 100 NDI video sources, though I think you'd hit a GPU bottleneck before that happened. Anyway, yes, if you were doing a basic single-cam gaming stream, I could do that with about $400; this is more like a virtual 1 M/E live production switcher, replay machine, and animated graphics system.
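As a rough sanity check on those source counts (my own ballpark, not a figure from the build notes): full-bandwidth 1080p60 NDI is commonly quoted at around 100 Mbps per stream, so with some headroom on the link the gigabit and 10Gb numbers above fall out of simple division. A hedged sketch, assuming ~100 Mbps per stream and 80% usable link capacity:

```python
# Rough NDI source-count estimate. Assumptions: ~100 Mbps per full-bandwidth
# 1080p60 NDI stream (commonly quoted ballpark) and ~80% usable link capacity
# to leave headroom for audio, control, and other traffic.

MBPS_PER_NDI_STREAM = 100
LINK_UTILIZATION = 0.8

def max_ndi_sources(link_gbps: float) -> int:
    usable_mbps = link_gbps * 1000 * LINK_UTILIZATION
    return int(usable_mbps // MBPS_PER_NDI_STREAM)

print(max_ndi_sources(1))    # ~8 sources on a gigabit NIC
print(max_ndi_sources(10))   # ~80 sources on a 10 Gb NIC
```

Either way, as noted above, decode and GPU limits would likely bite before a 10Gb link does.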

If you feel like all of that could be done cheaper, I am by all means all ears, but the capture cards alone were $1,500, and those are about the cheapest I could find while still maintaining reliability. I'm also maxing out my 28 CPU lanes with those two cards plus the graphics card. I wish I could have done this cheaper, so please send suggestions.
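For anyone tallying the lane math, here is an illustrative budget for a 28-lane CPU. The link widths are assumptions (GPU at x16, one capture card at x8 and one at x4, which would fit a Quad-plus-Duo style pairing for 12 SDI inputs); the exact DeckLink models aren't spelled out above, so treat this as a sketch.

```python
# Illustrative 28-lane budget check. Card link widths are assumptions
# (GPU x16, one x8 capture card, one x4 capture card); actual models and
# negotiated link widths may differ.

CPU_LANES = 28
cards = {
    "GPU (assumed x16)": 16,
    "capture card A (assumed x8)": 8,
    "capture card B (assumed x4)": 4,
}

used = sum(cards.values())
for name, lanes in cards.items():
    print(f"{name}: x{lanes}")
print(f"lanes used: {used} of {CPU_LANES}")
if used > CPU_LANES:
    print("over budget: some device would drop to a narrower link")
```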

  • 13 months ago
  • 1 point

Do remember the 7800X still uses a thermal paste interface between the die and the IHS. The 9800X would be better since it's soldered, and it would give you that expandability toward 100 NDI streams because it also has 44 PCIe lanes.

Also consider that M.2-to-PCIe x4 risers exist, so you can run more cards off of the M.2 slots with a 44-lane CPU.

  • 13 months ago
  • 1 point

So I did the 9800X in my server build that lives in our studio, but for the road build I was fine with only having the 28 lanes to keep it cheaper and smaller. I haven't had any overheating issues, though it does run a little hotter than I'd like, and for the price-to-performance ratio I'm super happy with the 7800X in an ITX form factor. I have the 10Gb card in the studio build, and if I'm doing an NDI-heavy show on the road, I just swap out my DeckLink Duo for the 10Gb card. So I'm totally with you!

I've been nervous to try an M.2 solution, because don't the M.2 slots share the four PCIe lanes going to the chipset? I'd be nervous about choking up those lanes. I know there are some motherboards where the M.2 slot shares a PCIe slot's lanes, but at that point why not just use the PCIe slot?

The studio server build is more like what you're describing, but I haven't built these quite small enough for travel yet: https://pcpartpicker.com/b/zvGG3C

Thank you for the feedback! I'd be curious to see your live setups!

  • 13 months ago
  • 1 point

Most X299 motherboards do route those M.2 lanes through the chipset, but you would use them for lower-priority devices, like USB controllers for the non-critical UVC HDMI capture cards for additional cameras.

Thunderbolt controllers would also work on the x4 going to the chipset, as long as that's the only downstream device on the chipset. You could then hook up some HyperDecks or UltraStudios directly to the Thunderbolt controller.

I agree that the 10GbE controller should have direct lanes, but Thunderbolt and USB 3.0 controllers don't necessarily need direct lanes.
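To put rough numbers on that split (ballpark figures of mine, not from the thread): the chipset's uplink to the CPU on X299 is roughly equivalent to PCIe 3.0 x4, about 32 Gbps, shared by everything downstream of it. A small sketch comparing assumed peak demands against that shared link, which is why a continuously loaded 10GbE NIC is better on direct CPU lanes while burstier USB or Thunderbolt capture can live behind the chipset:

```python
# Ballpark comparison with assumed figures: share of the chipset uplink
# (~PCIe 3.0 x4, ~32 Gbps) that various downstream devices could demand.
# A 10GbE NIC fits on paper, but it streams continuously during an NDI-heavy
# show, so contention on the shared link is the real concern.

CHIPSET_UPLINK_GBPS = 32.0  # ~PCIe 3.0 x4, rough figure

devices_gbps = {
    "USB 3.0 controller port (peak)": 5.0,
    "Thunderbolt 3 controller (link rate; can saturate the uplink)": 40.0,
    "10GbE NIC (sustained)": 10.0,
}

for name, demand in devices_gbps.items():
    share = demand / CHIPSET_UPLINK_GBPS * 100
    print(f"{name}: {demand:g} Gbps (~{share:.0f}% of the shared uplink)")
```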

  • 13 months ago
  • 2 points

Please keep feedback polite and constructive.