• breezykoi 6 hours ago

    You might also want to mention AMWA NMOS, which is increasingly used alongside SMPTE 2110 in setups like this. NMOS (Networked Media Open Specifications) defines open, vendor-neutral APIs for device discovery, registration, connection management, and control of IP media systems. In practice, it's what lets 2110 devices automatically find each other, advertise their streams, and be connected or reconfigured via software.

    The specs are fully open source and developed in the open, with reference implementations available on GitHub (https://github.com/AMWA-TV)

    The specs define REST APIs, JSON schemas, certificate provisioning, and service discovery mechanisms (DNS-SD / mDNS), providing an open control framework for IP-based media systems.
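
    To make that concrete, here's a rough sketch of asking an IS-04 registry which senders it currently knows about. The registry URL and API version are placeholders; real deployments normally discover the Query API via DNS-SD rather than hard-coding an address:

        import json
        import urllib.request

        # Placeholder registry address; in practice the Query API is found
        # via DNS-SD (_nmos-query._tcp) rather than configured by hand.
        QUERY_API = "http://registry.example.local/x-nmos/query/v1.3"

        def list_resources(kind):
            # kind is one of: nodes, devices, sources, flows, senders, receivers
            with urllib.request.urlopen(f"{QUERY_API}/{kind}") as resp:
                return json.load(resp)

        # Print every registered sender plus the SDP manifest that describes
        # its stream (multicast address, payload format, and so on).
        for sender in list_resources("senders"):
            print(sender["id"], sender.get("label"), sender.get("manifest_href"))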

    • lukeh 4 hours ago

      There’s also AES70, or OCA (https://news.ycombinator.com/item?id=46934318). More popular in audio than video, something of a competitor to NMOS (although there are parts of NMOS that were very much inspired by OCA). There are open source C++, Python, JavaScript and Swift implementations as well as some commercial ones.

    • lifis an hour ago

      Seems like the classic legacy overengineered thing that costs 100x its production cost because it's a niche system, is 10x more complex than needed due to unnecessary perfectionism, and uses 10-100x more people than needed due to employment inertia.

      A more reasonable approach is to just use high-quality cameras, connect to the venue's fiber Internet connection, use a normal networked transport like H.265 in MPEG-TS over RTP (sports fans certainly don't care about recompression quality loss...), do time sync by having A/V sync and good clocks on each device and aligning based on audio loud enough to be recorded by all devices, then mix, reencode, and distribute on normal GPU-equipped datacenter servers.
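
      Roughly, the audio-alignment step could be a plain cross-correlation of the recordings. Illustrative sketch only, assuming both devices record at the same sample rate and the same sound is clearly audible on both:

          import numpy as np

          def estimate_offset(a, b, rate):
              # Cross-correlate the two mono tracks; the peak lag is the
              # estimated offset. Negative means track b starts later than a.
              corr = np.correlate(a, b, mode="full")
              lag = np.argmax(corr) - (len(b) - 1)
              return lag / rate  # seconds

          # Toy check: b is a copy of a delayed by 0.25 s. Kept short because
          # this naive O(n^2) correlation is only meant to show the idea.
          rate = 8000
          a = np.random.randn(2 * rate)
          b = np.concatenate([np.zeros(rate // 4), a])[: 2 * rate]
          print(estimate_offset(a, b, rate))  # roughly -0.25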

      • pjc50 28 minutes ago

        The sort of systems which demand 100% reliability tend to be like that. "Disruption" in the middle of live sports broadcast is unpopular with customers.

        • kmbfjr 3 minutes ago

          As a yute, a good portion of my early working years was spent in the transition from analog to digital TV, and that included a ton of work doing sports TV. Aside from one or two replay people and stats, they're not using 10-100x more people. It remains a technical director (switcher), audio A1/A2, video tech, a body on each camera and E1/E2 (truck chief engineers). If they could do it cheaper, they would.

          These trucks are utility devices, and while they spend much of their time doing sports TV, they are also used for any live TV event requiring high production capability. The same crews that do sports also do these shows.

          While stream splicing and time base correction have likely made incredible improvements since I left the business, I see a number of flaws in your alternative production plan, starting with ineffective timing that will make switching the video and producing a coherent show difficult.

          The TV industry would listen to your alternative ideas, but you probably need to understand their problem and how digital video works a little better. I still have contacts in the industry; how'd ya like to be an A2?

          • TD-Linux 41 minutes ago

            While I think you are oversimplifying the timing issue, you are not the first to think that about 2110.

            https://stop2110.org/

            • jacquesm an hour ago

              Sounds like you've got it made then: produce the equivalent that fits in a minivan and laugh all the way to the bank.

              • amluto 27 minutes ago

                > do time sync by having A/V sync and good clocks on each device and aligning based on audio loud enough to be recorded by all devices

                Why do you need good clocks? For audio, even with simultaneously playing speakers, you only need to synchronize within a couple of ms unless you need coherence or are a serious audiophile. If you want to maintain sync for an hour, I suppose you need a decently good clock.

                But as long as you have any sort of wire, basically any protocol can synchronize well enough. Although synchronizing based on visual and audible sources is certainly an interesting idea. (Audio alone is a complete nonstarter for a sporting event: the speed of sound is low and the venues are large. You could easily miss by hundreds of ms.)
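
                Back of the envelope on that last point (distances illustrative):

                    # How far off audio-based alignment can be across a venue.
                    speed_of_sound = 343.0  # m/s in air at roughly 20 C
                    for distance_m in (30, 100, 200):  # source-to-mic distances
                        delay_ms = 1000 * distance_m / speed_of_sound
                        print(f"{distance_m:4d} m -> {delay_ms:4.0f} ms of acoustic delay")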

                > then mix, reencode and distribute on normal GPU-equipped datacenter servers using GPU acceleration

                Really? Even ignoring latency, we’re talking quite a few Gbps sustained. A hiccup would suck, and if you’re not careful, you could easily spend multiple millions of dollars per day in egress and data handling fees if you use a big cloud. Just use a handful of on-site commodity machines.

              • acolumb 11 hours ago

                Nice to see an article from my industry! ST 2110 is such a complex standard, and a lot of the hardware mentioned has been molded to deal with it.

                • hdgvhicv 5 hours ago

                  Most 2110 kit relies on narrow timing. That means packets arriving and leaving in a window on the order of 10 microseconds. Doing that in software reliably for your typical 100gbit interface is challenging.
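
                  For a sense of scale, rough numbers for one uncompressed HD flow, assuming 4:2:2 10-bit video and roughly 1,200 bytes of payload per packet:

                      # Back-of-envelope pacing for an uncompressed ST 2110-20 HD flow.
                      width, height, fps = 1920, 1080, 59.94
                      bits_per_pixel = 20      # 4:2:2 sampling at 10 bits per component
                      payload_bytes = 1200     # assumed video payload per RTP packet

                      bits_per_frame = width * height * bits_per_pixel
                      packets_per_second = bits_per_frame / (payload_bytes * 8) * fps

                      print(f"{bits_per_frame * fps / 1e9:.2f} Gbit/s")            # ~2.49
                      print(f"{packets_per_second:,.0f} packets/s")                # ~258,900
                      print(f"{1e6 / packets_per_second:.1f} us between packets")  # ~3.9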

                • RupertSalt 3 hours ago

                  Unfortunate typo in the headline, reproduced here and once in the article. This is not about email or spam.

                  • master_crab 2 hours ago

                    It’s SMPTE. Not SMTPE

                    • jacquesm 3 hours ago

                      Neat post. I wonder what the drift is on those clocks.

                      • thanksgiving 6 hours ago

                        > like why they use bundles of analog copper wire for audio instead of digital fiber

                        Good article. Got me to read the article because I was curious why...

                        • jauntywundrkind 10 hours ago

                          Fun to see.

                          PipeWire had some decent AES67 support for network audio. Some really fun interesting hardware already tested. Afaik no SMPTE 2110 (which is video) but I don't really know.

                          I know it's not the use case, but I do wish compressed formats were more supported. Not really necessary for production, but these are sort of the only de facto broadly capable network protocols we have for AV, so it would expand the potential uses a lot IMO. There may be some very proprietary JPEG XS compression, but generally the target seems to be uncompressed.

                          https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/AES...

                          • rezonant 10 hours ago

                            Actually SMPTE 2110 can host video (2110-20), audio (2110-30) and ancillary data (2110-40) essences, and each essence can be delivered independently of the others.

                            ST 2110-22 standardizes compressed video using JPEG XS. While there is a patent pool for XS, otherwise the format is standardized and open.

                            It would be nice to see an essence type defined for AVC, but the quality tradeoffs of AVC/HEVC are really not appropriate for the domain that ST 2110 is aiming at: the contribution-side video network of a broadcast operation.

                            There are alternative "consumer grade" and "prosumer grade" IP video solutions out there.

                            There is Teleport, which grew up in the OBS space but is quite capable (we've used it in production for quite a while).

                            https://github.com/fzwoch/obs-teleport

                            And of course the underpinning of 2110 itself is RTP, a standard network protocol, which does have AVC defined as a payload format in RFC 6184:

                            https://www.rfc-editor.org/rfc/rfc6184

                            Really, it's a bit odd to wish that ST 2110 had compressed video when it's really just a specific profile of RTP with some broadcast-specific bits on top, while RTP itself supports lots of payloads.

                            ST 2110-10, which provides the timing, just standardizes PTP and the meaning of the RTP timestamps (notably a specific epoch), but there's nothing stopping you from using PTP-based timestamps for your RTP payloads otherwise.
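
                            As a rough illustration of what that pins down (simplified: a real sender takes TAI time from a PTP-disciplined clock rather than the system clock, and the leap-second offset below is an assumption that changes over time):

                                import time

                                VIDEO_CLOCK_HZ = 90_000  # RTP clock rate for 2110-20 video
                                TAI_UTC_OFFSET = 37      # leap-second offset (assumed, changes over time)

                                def rtp_timestamp(tai_seconds, clock_hz=VIDEO_CLOCK_HZ):
                                    # ST 2110-10 ties RTP timestamps to the PTP epoch: count media
                                    # clock ticks since 1970-01-01 TAI and keep the low 32 bits.
                                    return int(tai_seconds * clock_hz) & 0xFFFFFFFF

                                # Stand-in for a PTP-disciplined clock: approximate TAI from UTC.
                                print(rtp_timestamp(time.time() + TAI_UTC_OFFSET))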

                            ST 2110 is not a "plug and play" system by itself. There is a whole set of standards that adds such capabilities, the NMOS IS specs, but none of that is attempting to make "peer to peer" (so to speak) ST 2110 a thing, so actually using it for anything other than a broadcast system is far from trivial, and you'd be better off using something else. NMOS's goal is to make auto-configuration of ST 2110 flows a thing, which it has broadly succeeded in doing.

                            • _kb 9 hours ago

                              ST 2110-22 is codec agnostic. It just standardises CBR compression, for which JPEG-XS is a good fit today.

                              For plug-and-play, IPMX (https://ipmx.io/about/) is looking to be a pretty promising approach that combines ST 2110 with NMOS, auth, encryption, and other useful features. It's targeted at the ProAV market but IMO should be mostly suitable for consumer use.

                              • rezonant 9 hours ago

                                Ooh, you're right, and it just adopts the IETF RTP payload types for that. Cool.

                                Also forgot about IPMX.

                            • breezykoi 6 hours ago

                              It's worth mentioning NDI (Network Device Interface) as well, which is widely used in the pro-AV world for transporting compressed video and audio over IP.

                            • scoot 2 hours ago

                              I'm amused but not entirely surprised to see that live video production hasn't meaningfully progressed since I was involved 30+ years ago.

                              Yes, the technology has evolved – digital vs analog (partly – for example analog comms here because digital (optical) "isn't redundant" (lol, what?)); higher resolution; digital overlays and effects, etc. But the basic process of a bunch of humans winging it and yelling to each other hasn't changed at all.

                              This is an industry ripe for massive disruption, and the first to do it will win big.