On 5/12/20 7:51 PM, John Pierce wrote:
> just looked in my video library, largest file I see is for a multi-language
> 1080p MP4/x.264 version of Parasite, 2h 11m long, 10GB.
>
> thats 1.3 MB/sec, or about 10 Mbit/sec. *easily* done on 100baseT.

Sony agrees with you.

> is there any wireless between your server and the TV client ?

No. Everything is 1 Gbps wired, except for the interface in the TV itself.
Theoretically, the wireless in the TV is actually faster ... until it isn't.

> next largest file in 'recent' is The Irishman, also MP4 x.265 1080p, no
> subtitles, english-only 5.1 AAC audio, 4.1GB for 3:29 long, thats only
> 335kByte/sec, or about 3Mbit/sec, heck you could play that on 10baseT

As an example, I have a 59GiB 4K video, approximately 115 minutes long, which
includes multiple audio and subtitle streams. That works out to a *mean*
bitrate of about 73.5 Mbps. If I remove the audio and subtitle streams for
the languages that I don't care about, I can reduce the file to 54GiB, for a
mean bitrate of about 66.4 Mbps. Don't try playing those on your 10baseT
network!

So the question is whether the *peak* bitrate required to play that file,
including all the protocol and application overhead, exceeds the capabilities
of the TV's interface. Just watching iftop, I've seen the reported bitrate
from the "router" to the TV exceed 75 Mbps while playing the smaller file. Is
that the highest it gets? I don't know, because I don't have the patience to
watch iftop for almost two hours.

Thus my original question. I'm looking for something that can tell me the
*peak* bitrate to a particular host over the time that the video is playing.
Something that could graph it would also be nice; I would expect to see the
bitrate plateau at a high level if/when it saturates the TV's interface,
likely correlated with the pauses that I'm seeing.

--
========================================================================
In Soviet Russia, Google searches you!
========================================================================
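P.S. To sanity-check the arithmetic above, and as a rough stand-in for
staring at iftop, here is a small sketch. It is Linux-only (it reads the
receive-byte counter from /proc/net/dev) and watches a whole interface
rather than a single host, which is close enough when the TV is the only
significant talker on that link; per-host accounting would need something
like tcpdump or firewall counters instead. The interface name "eth0" and the
`watch` helper are illustrative, not from any particular tool.

```python
#!/usr/bin/env python3
"""Rough peak-bitrate logger sketch (Linux /proc/net/dev, one interface)."""
import sys
import time

def mean_bitrate_mbps(size_bytes: float, seconds: float) -> float:
    """Mean bitrate in Mbps (1 Mbps = 1e6 bits/s)."""
    return size_bytes * 8 / seconds / 1e6

def rx_bytes(iface: str) -> int:
    """Received-byte counter for iface, parsed from /proc/net/dev."""
    with open("/proc/net/dev") as f:
        for line in f:
            name, sep, rest = line.partition(":")
            if sep and name.strip() == iface:
                return int(rest.split()[0])  # first field is rx bytes
    raise ValueError(f"no such interface: {iface}")

def watch(iface: str, interval: float = 1.0) -> None:
    """Print the current and peak bitrate once per interval until Ctrl-C."""
    peak = 0.0
    prev = rx_bytes(iface)
    while True:
        time.sleep(interval)
        cur = rx_bytes(iface)
        mbps = mean_bitrate_mbps(cur - prev, interval)
        peak = max(peak, mbps)
        print(f"now {mbps:7.2f} Mbps   peak {peak:7.2f} Mbps")
        prev = cur

if __name__ == "__main__":
    # The 59 GiB / ~115 min example from above:
    print(f"{mean_bitrate_mbps(59 * 2**30, 115 * 60):.1f} Mbps mean")
    # To watch live (Linux only, Ctrl-C to stop):
    #   watch(sys.argv[1] if len(sys.argv) > 1 else "eth0")
```

The peak printed this way is averaged over the sampling interval, so a
shorter interval will show higher (burstier) peaks; true wire-level bursts
would need packet capture to see.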