WiFi Signal Strength

  • 1
  • Question
  • Updated 2 years ago
  • Answered
Can you explain why the minimum recommended signal strength for best use is -65 dBm? What if it drops to -72 dBm?

Borey Chum

  • 4 Posts
  • 0 Reply Likes

Posted 2 years ago


Manoj Gowrie

  • 1 Post
  • 1 Reply Like
Hi. WiFi signal degrades with distance from the AP. -65 dBm has become the industry-standard recommended limit of signal strength/coverage to ensure that the connection from the device to the AP is still stable. Once the signal drops below -65 dBm, the connection starts becoming unstable, i.e. packet drops, intermittent loss of connection, etc. You will still be able to get a connection at -72 dBm, but you would see increased latency and potentially packet loss, depending on how much the signal strength fluctuates.
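To illustrate how signal falls off with distance, here is a rough free-space sketch (the free-space path loss formula, not anything the responder measured; the 20 dBm transmit power and 5180 MHz channel are assumptions, and real indoor environments with walls attenuate much faster):

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

def rssi_dbm(tx_power_dbm: float, distance_m: float, freq_mhz: float) -> float:
    """Received signal strength, ignoring antenna gains and obstructions."""
    return tx_power_dbm - fspl_db(distance_m, freq_mhz)

# Assumed 20 dBm EIRP on 5 GHz (channel 36, 5180 MHz):
for d in (5, 15, 30, 60):
    print(f"{d:>3} m: {rssi_dbm(20, d, 5180):6.1f} dBm")
```

Even in ideal free space the signal keeps sliding toward the -65 dBm line as distance grows; indoors you reach it far sooner, which is why coverage is designed around that threshold.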

John D, AlphaDog

  • 497 Posts
  • 136 Reply Likes
Right, I guess the more cynical answer is that -65 dBm has become the unspoken industry standard for the minimum RSSI at which a half-decent WiFi device should be able to hold a connection. If you fall below that, you're likely to see more and more clients that can't hold a signal.

WiFi client antenna / RF design is pretty inconsistent. Just like how you have good AP models and crap AP models, the same holds true on the client side. I've had some awesome laptops that can hold a connection down close to -80 dBm and pick up WiFi where none of my other devices can see it, but I've also had the opposite built into gaming consoles, where -65 dBm was already not enough for Netflix streaming.

Robert Lowe

  • 147 Posts
  • 22 Reply Likes
I also consider SNR in this scenario. At -65 dBm, in a normal operating environment, it is safe to assume that the signal strength is strong enough not to be affected by WiFi and non-WiFi RF noise. Obviously there are instances where the noise floor is much higher and you can still experience issues at -65 dBm.

Borey Chum

  • 4 Posts
  • 0 Reply Likes
We tested two different Android phones in the exact same location, but we got different signal readings. Can you advise?

Robert Lowe

  • 147 Posts
  • 22 Reply Likes
What was the difference in signal? Devices can have different antennas inside, so it is quite normal to get different signal readings from different devices in the same place. For example, a laptop will generally report a signal up to 10 dB stronger than a smartphone in the same location. Different antenna = different RSSI.

Todd

  • 53 Posts
  • 7 Reply Likes
Robert nailed it, but to explain it differently: think of your Android phone like a car radio. Some car radios receive stations better than others in the exact same location. That's because of the radio (receiver) quality, the antenna quality, and the antenna placement. As an example, one car will drop a radio station while driving through a tunnel while another will continue to play the station just fine through that same tunnel.

Sean

  • 342 Posts
  • 87 Reply Likes
When designing a network you always have to design to the client, but you should not be designing to the worst client, especially if you're trying to get that client to achieve -65 dBm.

Note: Some clients are a lot worse than others; in certain cases the signal delta can be in excess of 10 dB.

One thing you should note is that Ruckus has interference filters which will reduce your noise floor by about 12 dB. Think of a typical noisy environment in a shopping mall where the noise floor is around -85 dBm; with Ruckus this could be lowered to around -97 dBm, giving you more SNR room to play with.

One thing that has not been mentioned yet is how channel size increases noise floor and how that may affect your design.

Without interference filter and a 3 dB noise figure at the receiver:

20MHz = -97.99dBm
40MHz = -94.98dBm
80MHz = -91.97dBm
160MHz = -88.96dBm

With interference filter and a 3 dB noise figure at the receiver:

20MHz = -109.99dBm
40MHz = -106.98dBm
80MHz = -103.97dBm
160MHz = -100.96dBm

As you can see from the above, if you don't have an interference filter you may need a higher RX power value to obtain decent SNR / bitrate / throughput.

Note: The above only represents a clean environment, so let's look at a noisy one.

Without interference filter and a 15 dB noise figure at the receiver:

20MHz = -85.99dBm
40MHz = -82.98dBm
80MHz = -79.97dBm
160MHz = -76.96dBm

With interference filter and a 15 dB noise figure at the receiver:

20MHz = -97.99dBm
40MHz = -94.98dBm
80MHz = -91.97dBm
160MHz = -88.96dBm
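The tables above all follow from the thermal noise formula: noise floor ≈ -174 dBm/Hz + 10·log10(bandwidth in Hz) + noise figure, minus ~12 dB when the interference filter is in play. A quick sketch that reproduces the figures (the -174 dBm/Hz constant is standard thermal noise density at room temperature; the 12 dB filter reduction is the value quoted above):

```python
import math

def noise_floor_dbm(bw_mhz: float, noise_figure_db: float,
                    filter_gain_db: float = 0.0) -> float:
    """Thermal noise floor: -174 dBm/Hz + 10*log10(BW in Hz) + NF - filter gain."""
    return -174 + 10 * math.log10(bw_mhz * 1e6) + noise_figure_db - filter_gain_db

for nf in (3, 15):
    for filt in (0, 12):
        label = "with" if filt else "without"
        print(f"NF {nf} dB, {label} interference filter:")
        for bw in (20, 40, 80, 160):
            print(f"  {bw:>3} MHz = {noise_floor_dbm(bw, nf, filt):.2f} dBm")
```

Each doubling of channel width adds 10·log10(2) ≈ 3.01 dB to the floor, which is exactly the ~3 dB step you see between each row of the tables.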

Obviously the 40, 80 and 160 MHz channels are aimed at 5 GHz, but as you can see it may impact your design if you are aiming at ubiquitous coverage across both bands, and then you still have the path-loss delta between 2.4 GHz and 5 GHz on top.

Personally I think this is how -65 dBm came about: 20 dB SNR is considered a good value, and if you look at the first of the two tables above showing the 15 dB noise figure, the noise floor at 20 MHz is roughly -86 dBm... I'll let you do the remaining math :)
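Making that closing arithmetic explicit (same 20 MHz / 15 dB noise figure / no-filter assumptions as the table above, with a 20 dB SNR target):

```python
import math

# Noise floor for a 20 MHz channel with a 15 dB noise figure, no filter:
noise_floor = -174 + 10 * math.log10(20e6) + 15   # about -86 dBm

target_snr = 20  # dB, the commonly cited "good" SNR value
min_rssi = noise_floor + target_snr               # minimum usable signal
print(f"minimum RSSI for {target_snr} dB SNR: {min_rssi:.1f} dBm")
# prints "minimum RSSI for 20 dB SNR: -66.0 dBm"
```

Which lands right around the -65 dBm figure the thread started with.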
(Edited)