Just FYI, as someone who works in Wi-Fi, I found your post difficult to parse at first. In the context of Ethernet, Wi-Fi, and other networking technologies (and this thread is about Wi-Fi, after all), MAC (all caps) refers to Layer 2 of the OSI model, the Medium Access Control layer. Mac (capital "M", lowercase "ac") is the well-accepted abbreviation for Macintosh. I don't believe Apple has ever referred to its Macintosh line of products as "MAC", and it took me a while to figure out that you meant the latter even though you wrote it like the former.
Well, if one user were to boot everyone else off the network, he/she could get the full 70 Mbps. That's my point: the maximum throughput available to any individual user is not the same as what each user gets when the link is shared.
5-8 simultaneous 4K streams will take down that connection, so my conclusion is that either you aren't on a shared 100 Mbps connection, or you never have more than ~4 people streaming at the same time. How did you determine that the connection to your building is 100 Mbps? You can't do it by running a simple speed test, because there could be traffic-shaping equipment behind the entry point that throttles any individual user or unit down to 100 Mbps, regardless of the capabilities of the physical connection to the backbone.
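To make the arithmetic behind "5-8 streams will take down that connection" concrete, here's a minimal back-of-envelope sketch. It assumes roughly 25 Mbps per 4K stream, a commonly cited streaming-service recommendation; actual bitrates vary by codec and service, so treat the numbers as illustrative:

```python
# Back-of-envelope check: how many 4K streams fit on a shared link?
# Assumption: ~25 Mbps per 4K stream (real bitrates vary by codec/service).
LINK_MBPS = 100
STREAM_MBPS = 25

def max_streams(link_mbps: float, stream_mbps: float) -> int:
    """Largest number of simultaneous streams the link can carry
    without exceeding its capacity."""
    return int(link_mbps // stream_mbps)

def is_saturated(link_mbps: float, stream_mbps: float, n_streams: int) -> bool:
    """True if n_streams of the given bitrate exceed the link capacity."""
    return n_streams * stream_mbps > link_mbps

print(max_streams(LINK_MBPS, STREAM_MBPS))        # 4 streams fit
print(is_saturated(LINK_MBPS, STREAM_MBPS, 5))    # a 5th saturates the link
```

With those assumptions, a 100 Mbps shared link tops out at about 4 concurrent 4K streams, which matches the "~4 people streaming" figure above.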
802.11n is so 8 years ago. Where have you been? We're now using 802.11ac and draft 802.11ax devices are already available.