vMix Forums » General » NDI » vMix and NDI: CPU vs GPU?
atmonauti (Newbie, joined 10/3/2017, 8 posts):
Hi all,
I'm looking into building a new system specifically for running vMix. We use 4 or more NDI inputs per shoot, and I was wondering which NDI relies on more: the CPU or the GPU? I plan on putting a 1080 Ti in there no matter what, but I'm contemplating an i7 vs an i9 at the moment.
Thanks.
zenvideo (Advanced Member, Manchester, UK):
atmonauti wrote:
> ... and was wondering which is used more for NDI; CPU or GPU? I plan on putting a 1080 Ti in there no matter what, but I'm contemplating an i7 vs an i9 at the moment.

The NDI SDK uses the CPU for its codec, so the i9 will probably be best.
livepad (Advanced Member, London):
zenvideo wrote:
> The NDI SDK uses the CPU for its codec, so i9 will probably be best.

Just for completeness, NDI *HX* decoding can use the GPU if that is enabled by the host application. So if you are using a lot of NDI HX sources and hardware decode is enabled, then the GPU will be useful. Everything else about NDI uses the CPU only, as Martin explained.
atmonauti (Newbie):
livepad wrote:
> NDI *HX* decoding can use GPU if that is enabled by the host application. So if you are using a lot of NDI HX sources and hardware decode is enabled, then GPU will be useful.

Thank you for the information. Does either method introduce latency over the other? That is, hardware decoding via GPU vs CPU? I do happen to be using NDI-HX sources for the most part. I ask because I plan on building the Obsidian reference system either way, but which of the above is the better practice?
Kane Peterson, NewTek (Advanced Member, Chicago, IL):
atmonauti wrote:
> Does either method introduce latency over the other? That is, hardware decoding via GPU vs CPU? I do happen to be using NDI-HX sources for the most part. I ask because I plan on building the Obsidian reference system either way, but which of the above is the better practice?

Latency has nothing to do with CPU vs GPU in this case. NDI|HX is a long-GOP style compression, and all long-GOP codecs will have some latency. That being said, NDI|HX is probably the lowest latency you will find for this kind of compression, but it will still be a few frames. Full NDI will be 1 frame of latency in software; in a hardware implementation it can be down to a few video lines of latency.

Kane Peterson
NewTek
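For a rough sense of scale, the frame-vs-lines difference above can be put into milliseconds. A minimal sketch, assuming a 1080p60 source and taking "a few video lines" as 8 lines (both are illustrative assumptions, not figures stated in this thread):

```python
# Rough latency comparison: 1 frame (software NDI) vs a few video lines
# (hardware NDI), assuming a 1080p60 source. Illustrative arithmetic only.
FPS = 60
LINES_PER_FRAME = 1125  # total lines in a 1080p raster, including blanking

frame_time_ms = 1000 / FPS                       # time to deliver one frame
line_time_ms = frame_time_ms / LINES_PER_FRAME   # time to deliver one line

software_latency_ms = 1 * frame_time_ms          # "1 frame of latency in software"
hardware_latency_ms = 8 * line_time_ms           # "a few video lines" (8 assumed)

print(f"software NDI: {software_latency_ms:.2f} ms")   # ~16.67 ms
print(f"hardware NDI: {hardware_latency_ms:.3f} ms")   # ~0.119 ms
```

The takeaway matches the post: even the "1 frame" software path is well under 20 ms at 60 fps, and the hardware path is two orders of magnitude lower still; the few-frame latency of NDI|HX comes from its long-GOP compression, not from whether the CPU or GPU does the decoding.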