The Balanced Link Fallacy

Having a set of walkie talkie radios as a kid was the ultimate in cool toys. Well, except for my TI-99/4A. Anyway, just like every new plaything, we found the limit of our new form of communication very quickly. Once we got three or four hundred yards away from each other, we weren’t able to understand each other any longer. Bummer.

Those walkie-talkies are a great example of a nicely balanced link. They were identical; both had the same transmit power, antenna and receive sensitivity.

Transmit Power

Transmit power is the amount of power that the radio chip actually produces. Power output from a Wi-Fi device is typically measured in milliwatts (mW) or dBm. A 200 mW (23dBm) transmitter is pretty decent in a Wi-Fi access point.
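
To keep those two units straight, here is a minimal Python sketch (my own illustration, not tied to any vendor tool) that converts between mW and dBm and confirms that 200 mW works out to roughly 23dBm:

import math

def mw_to_dbm(milliwatts: float) -> float:
    """Convert power in milliwatts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(milliwatts)

def dbm_to_mw(dbm: float) -> float:
    """Convert power in dBm back to milliwatts."""
    return 10 ** (dbm / 10)

print(round(mw_to_dbm(200), 1))  # ~23.0 dBm, the "pretty decent" AP above
print(round(dbm_to_mw(23)))      # ~200 mW (199.5 to be exact)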

Antenna

The antenna is responsible for taking the power from the chip and sending it out onto the air. Antennas typically provide gain in signal strength, although they don’t have to. In the world of Wi-Fi, antennas are typically measured in dBi. Those nice rubber duck antennas that you see give about 3dBi of signal gain.
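
Putting the transmitter and the antenna together gives you EIRP (the effective radiated power of the whole system), which comes up again in the comments below. A small sketch using the numbers above, with cable loss ignored for simplicity:

def eirp_dbm(tx_power_dbm: float, antenna_gain_dbi: float) -> float:
    """EIRP in dBm: conducted transmit power plus antenna gain (cable loss ignored)."""
    return tx_power_dbm + antenna_gain_dbi

# A 23dBm (200 mW) radio feeding a 3dBi rubber duck antenna
print(eirp_dbm(23, 3))  # 26 dBm, roughly 400 mW of effective radiated power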

Receive Sensitivity

This is the least known of the three factors in our link. Receive sensitivity is how well a device can hear. My wife repeatedly tells me that I’m a good listener. (“Sorry babe, what did you say?”) That means I must have good receive sensitivity. Receive sensitivity is typically shown in an RSSI (Received Signal Strength Indicator) chart. This chart states at what signal level a certain data rate can be achieved. Here is a made-up, truncated chart:

Device #1

-87dBm = 6Mbps

-84dBm = 18Mbps

-73dBm = 300Mbps

Device #2

-85dBm = 6Mbps

-83dBm = 18Mbps

-70dBm = 300Mbps

Device #1 has better receive sensitivity because it requires less signal to achieve the same data rates as device #2. (Keep in mind that these numbers are negative, so a value closer to zero means a stronger signal.)
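
To make that comparison concrete, here is a hypothetical sketch (using only the made-up numbers above) that looks up the best data rate each device can manage at a given signal level:

# Made-up sensitivity charts from above: minimum signal (dBm) needed for each data rate (Mbps)
DEVICE_1 = [(-87, 6), (-84, 18), (-73, 300)]
DEVICE_2 = [(-85, 6), (-83, 18), (-70, 300)]

def best_rate(chart, rssi_dbm):
    """Return the highest data rate whose signal requirement is met, or None if the signal is too weak."""
    achievable = [rate for min_signal, rate in chart if rssi_dbm >= min_signal]
    return max(achievable) if achievable else None

for rssi in (-86, -84, -72):
    print(rssi, best_rate(DEVICE_1, rssi), best_rate(DEVICE_2, rssi))
# At -86dBm, device #1 still holds a 6Mbps link while device #2 hears nothing at all.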

Defining Balance

An imbalanced link is one where one device can hear better or transmit at a higher power than the other. For example, let’s say that my set of walkie talkies had a “signal gain” dial on it. My brother sets his to 5 and I turn mine up to 11. Now I’m transmitting more power than he is and we have an imbalanced link.

Is this link imbalance a problem? Before we answer that, let’s look at the link itself. What defines a good link with our walkie talkies? That’s pretty simple. If each party can understand the other, we have a good link. Oddly enough, this analog link is quite binary in nature. Either you can understand each other or you can’t.

When do you consider the link to have failed? The link has failed when there is no longer bi-directional communication. The purpose of walkie talkies is to communicate with each other (two way), not like a radio station (one way). So, for the link to be considered broken, only ONE side of the link has to fail.

Back to the question. Is the link imbalance between our walkie talkies a problem? No, it actually isn’t. All it means is that at a certain distance, bi-directional communication will fail. The fact that one transmitter is more powerful than the other doesn’t actually matter. This is exactly what happens in a Wi-Fi network.

Have you ever stopped to think about what actually transpires in a Wi-Fi network when a client device can no longer communicate with an AP? I guarantee you, 99.999% of the time it’s ONE side of the link that fails first. If a device transmits and never receives an ACK, the link has failed.

Many a tech document has stated that if you want Wi-Fi to work right, you should lower the transmit power of the AP to match that of the client. That is a terrible idea. Stream of consciousness as to why:

– The AP has much better receive sensitivity than your client. If you set the Tx power on the AP to 30mW, matching that of the client, BUT the AP can hear 6dB better, you still have a seriously imbalanced link.

– Ah… now you are thinking this: If I set the Tx power of my AP to compensate for the client hearing better, now I have a balanced link. What that means is, you set the AP Tx power 6dB higher than the Tx power of the client. Now your AP would be transmitting at 120mW (6dB higher) and your client at 30mW. Now you have a balanced link! Perfect! Well, not really. (A quick sketch of that math follows these bullets.)
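
Here is that 6dB arithmetic as a tiny sketch (my own illustration): every 3dB roughly doubles the power, so a 6dB bump takes 30mW to about 120mW.

def add_db(power_mw: float, db: float) -> float:
    """Apply a dB offset to a power level given in milliwatts."""
    return power_mw * 10 ** (db / 10)

print(round(add_db(30, 3)))  # ~60 mW: +3dB is a doubling
print(round(add_db(30, 6)))  # ~119 mW, i.e. the ~120mW AP power in the bullet above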

Bring on the meat!

Some vendors like to use dynamic (non-static) antenna gain and/or transmit beamforming (TxBF) to get more signal to the client. But if the client can’t talk any louder, does that actually improve anything? Read on!

Walkie talkie communication was quite simple. Again, we could either understand each other or we couldn’t. But, Wi-Fi brings in another factor that doesn’t exist in analog communication. Data rates.

Wi-Fi automatically adjusts the data rate to accommodate the communication channel. If signals are high and everything is good, it transmits at a high data rate. If there is noise and signals are weak, it will transmit at a low data rate. (There are a bunch of rates in between too.) With Wi-Fi, signal equals speed and reliability. Unless you are already at your maximum MCS (data rate), extra signal will improve your link speed, and that equates to improved capacity and throughput.
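
Since signal equals speed, it also equals airtime. Here is a rough sketch (ignoring preambles, ACKs and all other protocol overhead, so real numbers will differ) of why a higher data rate means more capacity for everyone on the channel:

def airtime_ms(payload_bytes: int, rate_mbps: float) -> float:
    """Very rough airtime for one frame at a given PHY rate, ignoring protocol overhead."""
    return payload_bytes * 8 / (rate_mbps * 1000)

# The same 1500-byte frame at two of the made-up rates from the chart above
print(round(airtime_ms(1500, 18), 3))   # ~0.667 ms on the air
print(round(airtime_ms(1500, 300), 3))  # ~0.04 ms: the channel is free about 16x sooner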

So what happens if you have that evil imbalanced link and your AP sends higher signals than your client? Great! I’m digging that because it gives me increased data rates on ALL downlink data. Given that many networks have 80%+ of their data going to the clients from the AP, who wouldn’t want more signal, speed, capacity and throughput?

16 thoughts on “The Balanced Link Fallacy”

  1. gthill says:

    I’m going to leave a comment on my own blog. Nice.

    If you read my earlier blog (Yes Ben, I’m referring to my own published works) called “Talk Faster and Not So Loud” you’ll see that achieving a balanced link is further rendered impossible because of the factors mentioned in the blog but ALSO because transmit power shifts by data rate AND that shift is different from device to device. So even if max Tx power was identical, every rate shift would create an imbalance.

  2. A man in search of perfection, heh?

    I believe you’ve missed the purpose of matching the power to the weakest client Tx capability (read, matching the EIRP). All that you’ve put in the blog is true, that is, if we isolate the problem to ONE AP serving clients. It truly does not matter what power the AP is set at to define the working range of the communication.

    Saying that….

    We are discussing enterprise-grade WiFi networks where high client density or demanding applications like VoIP or location tracking are frequently used and even represent a mission-critical component of the network. So what are the biggest problems in those environments? Keeping the RF environment stable and maintaining overall capacity.

    So what happens when one designs the network for RTLS or Voice and keeps the power levels up? Diminished capacity and unstable RF environments (keeping our jitter and latency levels where we do not want them to be). Furthermore, we are creating an unpredictable environment for roaming behaviors (btw. hounds have a problem with that one if the network is not designed very carefully). And to top it off, if not careful we are quick to create problems like “near/far” or “hidden node”. To be honest, the last two examples are not often found these days due to default protection mechanisms in the mixed-mode setup, but still… it proves a point.

    It’s not all black or white though. For HD deployments I would say that the power level of the APs does not have to be lowered IF the collision domains are separated and if load balancing and band steering mechanisms work optimally (both are still more of an art than a sure thing in today’s vendor implementations). If this is not possible then lowering the APs’ Tx power will help (but not cure) some of the capacity issues.

    So it’s all shades of gray. Matching power is not a cure for all WiFi illness, but it helps.

    My 2 cents.

    • gthill says:

      Gregor,

      Thanks for the comments.

      You made this statement: “So what happens when one designs the network for RTLS or Voice and keep the power levels up? Diminished capacity and unstable RF environments (keeping our jitter and latency levels where we do not want them to be).”

      How does an AP with greater Tx power than the client cause this issue?

      GT

      • Let me reply with a question.

        Why is putting "1 AP per two classrooms" better than "1 AP per classroom"?

        As written above, by limiting the cell size, better channelization can be applied. If we have multiple strong signals on the same channels we are creating a mess. The more APs we have, the bigger the problem.

        Cheers,

        Gregor

  3. Excellent again GT, thanks. I was explaining something along these lines only yesterday when helping a reseller survey a port environment with the 7762 as the AP. We could demonstrate the different reliable distances achievable with devices of different capabilities: a Symbol handheld scanner, iPhone, Galaxy Tab and 3SS laptop (which was the best at nearly 200m).

    Hope that doesn’t count as marketing, wouldn’t want to upset the locals!

  4. Hi GT, interesting post.

    I get what you’re saying, and I even agree that perhaps link budget isn’t as important depending on where the client is within the coverage cell. If the client is close enough that it can be heard, then that’s swell and everybody is happy. However, wouldn’t link budget become very important as a client moves closer to the edge of a coverage cell? That’s when it eventually reaches a point of uni-directional comms.

    In a single AP scenario, I’m not sure the downlink benefit pans out as you get further away since a client still has to make requests and acknowledgements. Up close, I get it. Further away, it falls apart in my mind since the client would just fall off the network.

    In a multiple AP scenario, I guess the client will have to roam at that point and hope the entire WLAN wasn’t designed with no regard for link balance since the client would conceivably be jumping from one unbalanced link to another. This really goes to what Gregor was saying above. The potential for an unstable RF environment really shoots through the roof if you don’t at least try to aim for balanced links in your design.

    Is link balance a fallacy? Sure, I would agree with you that given all the variables presented by different clients/orientations, you’ll never be able to say that you’ve got a balanced link budget. But I think we should still be trying to get as close to balanced as we can, no?

    Very nice post. I like reading stuff that makes you think and challenge what is considered ‘best practice’. Best way to learn in my opinion!

    Thanks!

  5. These comments are all too long for me to read, so my apologies if this has been said.

    My iPad takes a shit when associated to an AP with too high a transmit power level. It gets associated, shows the little WiFi bars, and then at the edge of coverage it transmits like crap (high retries, low rates, etc.). I’d rather hope that the iPad sees the lowered signal and connects to a nearby AP instead.

  6. gthill says:

    Let me throw something else out there:

    Let’s say you design a network with balanced links. Now, imagine those coverage circles on your floor plan. Next, imagine that the AP is transmitting higher power than it did before. Did you change the coverage area of the AP? NO!!!!

    I know that doesn’t sound right but hear me out. If you design with a balanced link then the client and AP lose signal with each other at the same time.

    Now, you increase the Tx power of the AP. Did the coverage circle change? Nope. Because that circle is based on the client being able to talk to the AP as well.
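
    A minimal sketch of that claim with made-up numbers: the usable cell edge is set by whichever direction runs out of link budget first, so raising only the AP’s Tx power doesn’t move it.

    def link_budget_db(tx_dbm, peer_sensitivity_dbm):
        """Maximum path loss (dB) one direction of the link can tolerate."""
        return tx_dbm - peer_sensitivity_dbm

    # Hypothetical balanced design: both directions tolerate 105 dB of path loss
    downlink = link_budget_db(20, -85)  # AP at 20dBm to a client that needs -85dBm
    uplink = link_budget_db(15, -90)    # client at 15dBm to an AP that needs -90dBm
    print(min(downlink, uplink))        # 105 dB sets the cell edge

    # Crank the AP up 6dB: the downlink budget grows but the cell edge does not move
    print(min(link_budget_db(26, -85), uplink))  # still 105 dB, limited by the uplink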

    However, what you did accomplish is higher downlink data rates which equals higher capacity, less airtime utilization, reduced co-channel interference and overall awesomeness.

    GT

  7. Matthew Gast says:

    The comments do a great job discussing a beamforming fallacy: the idea that downstream beamforming is all that matters. Beamforming is awesome when you have a single AP and you care about downlink capacity — for example, in a deployment where you want to push video streams in a single-AP household from a set-top box to a TV. It gives you better range, better throughput, and, in a small enough dwelling, doesn’t create much in the way of problems.

    In a network with more than one AP, beamforming makes the sticky client problem worse, and introduces hidden nodes. To use a rough analogy, if we’re talking and I shout at you (beamforming) and you whisper back (hoarseness, to represent the terrible transmitter in the iPad), we’re not going to have much of a conversation. I can’t hear you, so there’s not going to be “flow control,” and other people around me are going to talk over you all the time.

    For downstream beamforming to help, you need two things: improved uplink (to the AP) reception, either as client-side beamforming or improved receiver sensitivity. Link symmetry doesn’t always mean the same Tx power; it refers to having equivalent bidirectional performance. Second, you have to have dramatically better coordination between APs so that you can prevent the long-range beamforming transmissions from stepping on neighbors. Hidden nodes are not your friend.

    • gthill says:

      Matthew, thanks for taking the time to stop by.

      You stated: “For downstream beamforming to help, you need two things: improved uplink (to the AP) reception, either as client-side beamforming or improved receiver sensitivity.” I don’t agree with this as a blanket statement.

      I agree that to have a complete solution you should also affect the uplink (receive sensitivity) but, for the sake of argument, IF you only positively affect the downstream with an increase in signal, that increases both downstream and upstream throughput. When you decrease airtime utilization, even in one direction, it positively affects the entire network, including upstream throughput and reduced co-channel interference.

      In most networks the majority of the traffic is downstream. In a pure VoIP network the traffic is presumably 50/50. And even in that situation increasing the data rate on the downstream has the benefits stated above.

      Even though I work for Ruckus this blog was meant for all Wi-Fi. I think in most situations an AP should have a higher link budget than the client because of the stated benefits.

      Because of the difference in connected devices there is NEVER a way to have a balanced link. That’s the point of this blog. The idea of a balanced link looks great in a white paper or book but it cannot be achieved in the real world.

      GT

  8. Devin Akin says:

    “In most networks the majority of the traffic is downstream.” — I somewhat disagree.
    Depends on the vertical market, the client type, the application, the time of day, and lots of other factors. Even if it were entirely true, I wouldn’t use it as a blanket excuse when my network was in the middle of choking on uplink mobile device backups. 😉

    Devinator

    • gthill says:

      When did I ever say that I wanted to reduce uplink data rate? Let’s see if I can say this in two bullets.
      – System A – uplink throughput = x and downlink throughput = y
      – System B – uplink throughput = x and downlink throughput = 2y

      This can be achieved with an increase in downstream signal strength. As long as it doesn’t increase co-channel interference, what is the problem?

      GT

    • gthill says:

      Ben and I have been back and forth on this. I don’t think that ONE test with 11g (only 8 OFDM rates) is adequate. I completely believe that retry rates increased in Ben’s test. However, a lot of that has to do with the vendor’s rate control algorithm. If the vendor can’t rate control properly then retry rates will go up.

      It’s still a fact that more signal and less noise equals higher data rates. As long as the proper rate is selected, it’s a surefire way to increase throughput and, more importantly, capacity.
