Richard,
You're bringing assumptions about resolutions and frame rate that imply broadcast experience. WebRTC was not designed to adhere to broadcast standards.
vMix-to-vMix calls are kind of a special case. It's also worth considering the general case of browser-to-vMix, which is a more typical WebRTC application.
I believe that almost all WebRTC services try to deliver 720p30 and, perhaps, 1080p30. That's all that webcams could deliver until quite recently. The Logitech C922, launched last fall, was the first webcam to support 720p60. Razer's Stargazer does as well, and so does the newer Logitech Brio, which is seriously disappointing.
I don't know of any WebRTC services that try to deliver p60. They could try, but why bother if it's unlikely that the camera can deliver the stream?
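For what it's worth, a browser-based caller can only hint at the frame rate anyway. Here's a minimal sketch of the constraints object such an app might pass to getUserMedia; the specific ideal/min values are my own illustration, not anything vMix Call is known to use:

```javascript
// Illustrative getUserMedia constraints requesting 720p60.
// "ideal" is a preference, not a requirement: if the camera tops out
// at 30 fps, the browser falls back rather than failing the call.
const constraints = {
  audio: true,
  video: {
    width:     { ideal: 1280 },
    height:    { ideal: 720 },
    frameRate: { ideal: 60, min: 24 }  // hypothetical floor; omit "min" to accept anything
  }
};

// In a browser this would be used as:
// navigator.mediaDevices.getUserMedia(constraints).then(stream => { /* ... */ });
console.log(constraints.video.frameRate.ideal); // → 60
```

The point being that the service expresses a preference and the browser negotiates with whatever the camera can actually do.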
I gather that vMix Call is based upon the reference WebRTC application. Perhaps Martin could comment as to whether it was adapted to deliver a specific frame rate. That seems unlikely.
In reality, the bit rate and frame rate will vary as the link adapts to network conditions. That's supposed to be part of the magic of WebRTC.
It's interesting to try https://test.webrtc.org and then download the result. Doing this just now, I see that the current release of Chrome does indeed have support for 4K links via WebRTC.
{"ts":1501351182063,"name":"test-run","id":10,"args":{"name":"Check supported resolutions","status":"running"}},
{"ts":1501351182345,"name":"test-run","id":10,"args":{"success":"Supported: 160x120"}},
{"ts":1501351182726,"name":"test-run","id":10,"args":{"success":"Supported: 160x120"}},
{"ts":1501351183101,"name":"test-run","id":10,"args":{"success":"Supported: 320x180"}},
{"ts":1501351183477,"name":"test-run","id":10,"args":{"success":"Supported: 320x240"}},
{"ts":1501351183852,"name":"test-run","id":10,"args":{"success":"Supported: 640x360"}},
{"ts":1501351184226,"name":"test-run","id":10,"args":{"success":"Supported: 640x480"}},
{"ts":1501351184602,"name":"test-run","id":10,"args":{"success":"Supported: 768x576"}},
{"ts":1501351184977,"name":"test-run","id":10,"args":{"success":"Supported: 1024x576"}},
{"ts":1501351185352,"name":"test-run","id":10,"args":{"success":"Supported: 1280x720"}},
{"ts":1501351185639,"name":"test-run","id":10,"args":{"info":"1280x768 not supported"}},
{"ts":1501351185648,"name":"test-run","id":10,"args":{"info":"1280x800 not supported"}},
{"ts":1501351185656,"name":"test-run","id":10,"args":{"info":"1920x1080 not supported"}},
{"ts":1501351185664,"name":"test-run","id":10,"args":{"info":"1920x1200 not supported"}},
{"ts":1501351185672,"name":"test-run","id":10,"args":{"info":"3840x2160 not supported"}},
{"ts":1501351185681,"name":"test-run","id":10,"args":{"info":"4096x2160 not supported"}},
It tests for 4K, and finds it not supported by my laptop's built-in camera.
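Incidentally, the downloaded result is just one JSON object per line, so it's easy to sift programmatically. A quick sketch, using two sample lines copied from my log above:

```javascript
// Pull the resolutions that passed out of a test.webrtc.org exported log.
const lines = [
  '{"ts":1501351185352,"name":"test-run","id":10,"args":{"success":"Supported: 1280x720"}}',
  '{"ts":1501351185656,"name":"test-run","id":10,"args":{"info":"1920x1080 not supported"}}'
];

const supported = lines
  .map(line => JSON.parse(line).args)                              // parse each line's args object
  .filter(args => args.success && args.success.startsWith('Supported: '))
  .map(args => args.success.slice('Supported: '.length));         // keep just the "WxH" string

console.log(supported); // → [ '1280x720' ]
```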
I can think of no reason why the bit rate in each direction would need to match. In fact, given asymmetrical internet connections, it seems logical that they would not match. Presuming a typical ISP, you can send them a better-quality stream than they can send you, their inbound being more generous than their outbound.