CCA Preamble Detect – Fact vs. Fiction – Full email thread initiated by C. Lukaszewski – 9/27/2015


  • April 4, 2016 at 4:13 pm #2985
    Rick Murphy
    Forum Admin

    On Oct 23, 2015, at 11:23 AM, Devin Akin [mailto:Devin.Akin@DivDyn.net] wrote:

    Rick,

    You need to practice being more thorough. 😀 hehe.

    Nice work on this! I agree that this test supports Chuck’s assertion.

    Thanks a ton for the effort!

    Devin

    April 4, 2016 at 4:14 pm #2986
    Rick Murphy
    Forum Admin

    On Oct 24, 2015, at 8:12 AM, Chuck Lukaszewski [mailto:clukaszewski@arubanetworks.com] wrote:

    Rick,

    Had a really long week here, will reply more tomorrow.

    Very nice test design and results. Appreciate you checking out my claim here.

    -cl

    April 4, 2016 at 4:15 pm #2987
    Rick Murphy
    Forum Admin

    On Nov 16, 2015, at 4:08 AM, Peter Mackenzie wrote:

    Hi All,

    Sorry for how long it has taken me to share my test results with you; I have been crazy busy here.

    Please find attached a very quick write-up of my first controlled test. Although I have written a short conclusion section, I'm trying not to draw too many conclusions at this stage and would like to do some more tests.

    Thanks
    Peter

    April 4, 2016 at 4:15 pm #2988
    Rick Murphy
    Forum Admin

    On Nov 16, 2015, at 1:29 PM, Devin Akin [mailto:Devin.Akin@DivDyn.net] wrote:

    Peter,

    I think the procedure was sound, but it looks to me like your APs were on different channels, per your screenshot. If so, that would explain the results. Please take a look and see if that was the case.

    Thanks!

    Devin

    April 4, 2016 at 4:16 pm #2989
    Rick Murphy
    Forum Admin

    On Nov 19, 2015, at 6:25 AM, Peter Mackenzie [mailto:pmackenzie@marquest.com] wrote:

    Hi All,

    So it would appear that one of the APs did change channel at some point during my testing. I apologize for the "schoolboy" error; I really should have noticed this. I'm normally obsessed with the detail, so I'm not sure how I missed it. As I have no proof of when the channel change occurred, I'm disregarding all my previous results and have re-run the tests using the same procedure, this time making sure that the AP stays on the same channel.

    So please find attached my latest results.

    Thanks
    Peter

    April 4, 2016 at 4:16 pm #2990
    Rick Murphy
    Forum Admin

    On Nov 21, 2015, at 12:06 PM, Rick Murphy [mailto:rmurphy@wirelesstrainingsolutions.com] wrote:

    Hi Peter,

    Thank you for sharing your research. Very interesting results. I wonder if the fact that you were using 40 MHz wide channels during the test would have had any effect on these results?

    Thanks,
    Rick

    April 4, 2016 at 4:18 pm #2991
    Rick Murphy
    Forum Admin

    On Nov 23, 2015, at 8:52 AM, Peter Mackenzie [mailto:pmackenzie@marquest.com] wrote:

    Hi Rick,

    I agree, it would be worth trying a 20MHz test as a comparison.

    The next set of testing should include:
    20MHz downlink test

    20MHz uplink test

    40MHz downlink test

    40MHz uplink test

    Protocol captures to be taken at both locations with client-matched adaptors for all tests.

    Please let me know if anybody thinks anything else should be tested.

    Thanks
    Peter
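
For anyone who would rather script the matrix above than run it by hand, here is a minimal sketch that drives the four iperf3 runs from the STA side. The server address and run length are placeholder assumptions, and the channel-width changes and the protocol captures Peter lists still have to be handled outside the script.

#!/usr/bin/env python3
"""Drive the 20/40 MHz uplink/downlink goodput matrix with iperf3.

Assumptions (placeholders, not from the thread): an iperf3 server is
reachable on the wired side of the AP at AP_SIDE_SERVER, iperf3 is
installed on the STA, and the channel width is reconfigured on the AP
between the two passes.
"""
import json
import subprocess

AP_SIDE_SERVER = "192.168.1.10"   # hypothetical wired host behind the AP
RUN_SECONDS = 30

def run_iperf3(reverse: bool) -> float:
    """Run one TCP test and return goodput in Mbps.

    reverse=False -> uplink (STA transmits), reverse=True -> downlink.
    """
    cmd = ["iperf3", "-c", AP_SIDE_SERVER, "-t", str(RUN_SECONDS), "-J"]
    if reverse:
        cmd.append("-R")
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    result = json.loads(out.stdout)
    bps = result["end"]["sum_received"]["bits_per_second"]
    return bps / 1e6

if __name__ == "__main__":
    for width in ("20MHz", "40MHz"):
        input(f"Set the AP to {width} and press Enter to start the {width} runs...")
        for direction, reverse in (("uplink", False), ("downlink", True)):
            mbps = run_iperf3(reverse)
            print(f"{width} {direction}: {mbps:.1f} Mbps goodput")

Running it once per channel width, with captures taken alongside each run, covers the whole matrix above.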

    April 4, 2016 at 4:19 pm #2992
    Rick Murphy
    Forum Admin

    On Nov 23, 2015, at 6:19 PM, Chuck Lukaszewski [mailto:clukaszewski@arubanetworks.com] wrote:

    Great to see all the work on this. Peter, love your attention to detail. Keep it coming!

    I have run many, many of these spatial-reuse-type tests. Both of Peter's test runs clearly suggest that his cells are too isolated from one another: if the cells were properly coupled, the combined goodput should be only marginally higher than a single cell, but instead he's getting almost full reuse in both the V1 and V2 test reports.

    Rick’s results are more consistent with what we see here when the cells are properly separated (e.g. just inside one another’s preamble ranges). BTW – a quick way to test this is to see whether STA2 can pass traffic to AP1, and vice versa. If the cells are truly overlapping then this will be possible at MCS0. If you can’t pass traffic then the cells are independent collision domains.

    Please note that there are some subtle but important differences between Peter and Rick’s test designs at the radio level.

    – Peter’s choice of channel 44 significantly limits EIRP; it might be better to test on Ch 108 or 108+. I suggest not using a channel on the band edge, as for most vendors these have slightly reduced EIRP compared with inside channels due to spurious emissions issues.

    – By comparison, Rick’s test uses 2.4GHz and HT20. Since his company is USA based, I assume this is with the full 36dBm allowed EIRP (or whatever the tested devices are capable of), as opposed to Peter using HT40 with the ETSI limit of 23dBm.

    – Now you could say this doesn’t matter, since it’s the RX power on the far side that matters for this test. I nominally agree; I think the key difference is HT20 vs. HT40 in these two tests. But it is possible that the reduced launch power is resulting in different channel fading in the two tests.

    Also, I note that there is some difference between Peter’s two test runs.

    – V2 shows -50 and -61 respectively inside each cell, whereas V1 shows -56 and -50. So V2 has a 10dB AP-STA SNR delta as compared with V1 which has a 6dB delta. This likely explains the reduced goodput in the AP2 cell.

    Some general observations to questions raised over the last few days.

    • 40 vs 20.
    o Yes this will make a difference for a test of this type due to the discussion we are having on the other thread about reduced SNRs.
    o Peter’s configuration is right at the limit of coverage for HT40 – here is the table from the 7131 data sheet (attached). It’s very possible that the total goodput being measured is additive because the cells are outside one another’s collision radius.

    [Attached table: AP 7131 receive sensitivity thresholds]

    o HT20 improves sensitivity to -93. If anyone is going to put more time into this, I suggest rerunning with 20MHz, or alternatively closing up the physical distance to improve SINRs by 3-4dB for good measure.
    o Alternatively, move to channel 100 and go to max EIRP. Might buy you another 6dB.
    • Up vs. down
    o Definitely a different test.
    o This one will be tricky because the 4965 probably has lower EIRP than the AP.
    o Best way to test this is fire up a soft AP on the two laptops and check RSSI to see if there are any big RX power deltas.
    o 4965 RX sensitivity may also be worse than the 7131. I’m not aware that the Intel specs are public, so not sure what these values are.
    • Separate iperf servers
    o I think this is a NOOP at the throughput levels we are talking about here (11n 2SS)
    o However, given that Peter is using distance and structural loss to achieve the signal levels, it’s probably simpler from a cabling perspective to have 2 servers.
    o We generally used to use a single IxChariot server for this type of test, but in my lab building I have common cabling, so it’s easy.
    o I will say that for 11ac 3SS VHT80 testing we have gone to separate IxChariot wired endpoints since a single cell can generate 850Mbps+ TCP/UDP goodput.

    It is expected that data frames will not be decodable on the far side, as their payloads require high SNR. Only the preambles will make it across.

    Something else I didn’t talk about in person but is relevant: with 11n HT, a lot of vendors do not do RTS/CTS. I’m not sure about these specific products. So depending on the exact timing of when BSS1 sends the preamble for a data frame, BSS2 could miss it if it is already TXing. It’s quite possible to see a slight increase in the 2-BSS test as compared with the 1-BSS test for this reason, especially when operating at low SINR levels, which then allow each BSS to get through.

    If you enable RTS/CTS with 11n, or if you test with 11ac, you will get a much clearer result. The RTS at 6Mbps rate will fully clear both BSS at both L1 (L-SIG) and L2 (NAV) levels.

    -cl
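
If anyone wants a quick scripted version of Chuck's cross-cell check, here is a minimal sketch: after manually associating STA2 to AP1 (and then STA1 to AP2), see whether any traffic gets through at all. The AP address is a placeholder and the ping flags assume a Linux/macOS client; any replies at all is the "cells overlap" result, while none at all means independent collision domains.

#!/usr/bin/env python3
"""Cross-cell reachability check in the spirit of Chuck's suggestion:
once the far STA is associated to the near AP, see whether any frames
get through at MCS0-level signal strengths.

The management address below is a hypothetical placeholder.
"""
import subprocess

FAR_AP = "192.168.1.1"     # replace with the far AP's IP once associated
PING_COUNT = "20"

def can_pass_traffic(host: str) -> bool:
    """Return True if at least one ICMP echo reply comes back (Linux/macOS ping)."""
    result = subprocess.run(
        ["ping", "-c", PING_COUNT, host],
        capture_output=True, text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    if can_pass_traffic(FAR_AP):
        print("Traffic passes: the cells overlap (shared collision domain).")
    else:
        print("No traffic: the cells are independent collision domains.")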

    April 4, 2016 at 4:48 pm #3000
    Rick Murphy
    Forum Admin

    On Nov 24, 2015, at 5:44 AM, Peter Mackenzie [mailto:pmackenzie@marquest.com] wrote:

    Hi Chuck,

    Thank you for your email and the level of detail you have put into it, really helpful.

    I just want to clarify one of your points below:

    BTW – a quick way to test this is to see whether STA2 can pass traffic to AP1, and vice versa. If the cells are truly overlapping then this will be possible at MCS0. If you can’t pass traffic then the cells are independent collision domains

    I agree that if you can pass data then the cells are truly overlapping and the combined goodput should be marginally higher than a single cell. This would be consistent with other testing/lab exercises I have performed. But is this what we are testing here?

    Maybe I misunderstood what you were saying at the Wi-Fi Trek conference, but if I understood you correctly, the theory goes that if a STA can hear a neighbouring transmission well enough to decode a valid PLCP header, it will set CCA busy, and it is irrelevant whether or not the STA can successfully transmit a frame back to the neighbouring device.

    Thanks
    Peter
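
For readers following along, here is a toy model of the behaviour Peter describes: a decodable PLCP header sets CCA busy regardless of level, while undecodable energy only defers at the much higher energy-detect threshold. The constants are illustrative assumptions (the ~4dB preamble SNR comes from the Perahia discussion later in the thread), not measurements from these tests.

#!/usr/bin/env python3
"""Toy CCA model: preamble (signal) detect vs. energy detect.

Assumptions for illustration only: -62 dBm energy-detect level, ~4 dB SNR
to decode an MCS0 preamble, -93 dBm noise floor. -82 dBm is just the level
at which the standard requires >90% preamble detection, not a hard floor.
"""

ENERGY_DETECT_DBM = -62.0        # undecodable energy defers only above this
MIN_PREAMBLE_SNR_DB = 4.0        # rough SNR needed to decode an MCS0 preamble
NOISE_FLOOR_DBM = -93.0

def cca_busy(rssi_dbm: float) -> bool:
    """Return True if the medium would be reported busy for this signal."""
    snr_db = rssi_dbm - NOISE_FLOOR_DBM
    preamble_decodable = snr_db >= MIN_PREAMBLE_SNR_DB
    if preamble_decodable:
        # Preamble (signal) detect: defer even well below -82 dBm.
        return True
    # Energy detect: undecodable energy must be much stronger to defer.
    return rssi_dbm >= ENERGY_DETECT_DBM

if __name__ == "__main__":
    for rssi in (-60.0, -85.0, -91.0):
        print(f"{rssi} dBm -> CCA busy: {cca_busy(rssi)}")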

    April 4, 2016 at 4:49 pm #3001
    Rick Murphy
    Forum Admin

    On Nov 24, 2015, at 12:59 PM, Chuck Lukaszewski wrote:

    Hi Peter,

    We are testing whether 2 BSS that are < -82dBm relative to each other block one another’s transmissions. E.g. with mutual RSSI in the -82 to -93 range and SNR > 4dB.

    So blockage for any RSSI value below -82 proves my assertion. You may be trying to push it too low. (So -89dBm RX sensitivity is not helpful with a -93dBm noise floor; in that case we’d need an extra 2-3dB of link margin to ensure the preambles and control frames are being decoded on the other side.)

    To verify the cells are within earshot of one another, check that STA2 at -85dBm relative to AP1 can associate and pass traffic at MCS0. Same with STA1 at -85dBm relative to AP2.

    If you cannot do this, then by definition the two BSS are completely independent and when you run both you will get 2X goodput.

    -cl
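
To make the window Chuck describes concrete, here is a small helper that checks a measured cross-cell RSSI: it has to sit below -82dBm, but still keep enough SNR above the noise floor for MCS0 preambles to decode. The 4dB and 7dB figures come from the Perahia discussion later in the thread; the default noise floor and the sample RSSI values are illustrative assumptions.

#!/usr/bin/env python3
"""Sanity-check a cross-cell RSSI against the -82 to roughly -89 dBm window."""

def check_window(rssi_dbm: float, noise_floor_dbm: float = -93.0) -> str:
    """Classify a cross-cell RSSI measurement for this test design."""
    snr_db = rssi_dbm - noise_floor_dbm
    if rssi_dbm >= -82.0:
        return f"RSSI {rssi_dbm} dBm is above -82: not testing sub-threshold CCA."
    if snr_db < 4.0:
        return f"SNR {snr_db:.0f} dB: too low, preambles mostly undecodable."
    if snr_db < 7.0:
        return (f"SNR {snr_db:.0f} dB: marginal (up to ~15% PER at MCS0); "
                "add 2-3 dB of link margin if you can.")
    return f"SNR {snr_db:.0f} dB: good, preambles should decode reliably."

if __name__ == "__main__":
    for rssi in (-80.0, -85.0, -89.0, -92.0):
        print(rssi, "->", check_window(rssi))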

    April 4, 2016 at 4:50 pm #3002
    Rick Murphy
    Forum Admin

    On Nov 25, 2015, at 5:31 AM, Devin Akin wrote:

    Thanks to all who are actively digging into this thread.

    Two questions:

    1. Is there a reference doc for the 4dB SNR?

    2. Was there a reasonable rationale behind 11n clients/APs not using RTS/CTS?…especially since 11ac clients/APs supposedly use it?

    The rest was very clear.

    Thanks!

    Devin

    April 4, 2016 at 4:51 pm #3003
    Rick Murphy
    Forum Admin

    On Nov 25, 2015, at 6:47 AM, Rick Murphy wrote:

    I was also wondering if there was anything documented about the 4 dB SNR level…

    Thanks,
    Rick

    April 4, 2016 at 4:58 pm #3004
    Rick Murphy
    Forum Admin

    On Nov 29, 2015, at 5:23 PM, Chuck Lukaszewski wrote:

    Been trying to locate something, finally found it staring me in the face.

    Figure 5.8 from the Perahia book that Devin sent around captures this nicely. Look at the line with the “*” for 20MHz MCS0.

    At 4 dB SNR, MCS0 will produce a packet error 15% of the time. To drop the PER to ~1%, which is the usual target for a given modulation, you have to be at 7dB.

    Reference: Next Generation Wireless LANs - E. Perahia & R. Stacey

    April 4, 2016 at 5:00 pm #3005
    Rick Murphy
    Forum Admin

    On Nov 30, 2015, at 10:20 PM, Devin Akin wrote:

    Chuck,

    Were those 1% and 15% numbers estimations based on that chart? I looked at the line you mentioned, but didn’t see that as 1% and 15%…. help?

    Devin

    April 4, 2016 at 5:00 pm #3006
    Rick Murphy
    Forum Admin

    On Dec 1, 2015, at 12:02 AM, Chuck Lukaszewski wrote:

    Those are the conversions of the PER. 10^-2 = .01 = 1%
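
A worked version of that conversion, plus what the two PER points cost in retransmissions, assuming independent retry outcomes:

#!/usr/bin/env python3
"""Convert log-scale PER axis readings and estimate the retry cost.

A reading of 10^-2 on the figure is 1% PER; the 15% point at 4 dB SNR is
roughly 10^-0.8. With independent retries, the average number of
transmissions per frame is 1 / (1 - PER).
"""

def per_from_exponent(exponent: float) -> float:
    """Convert a log-scale axis reading (10^exponent) to a PER fraction."""
    return 10.0 ** exponent

def avg_transmissions(per: float) -> float:
    """Expected attempts per frame, assuming independent retry outcomes."""
    return 1.0 / (1.0 - per)

if __name__ == "__main__":
    for label, per in (("7 dB SNR (~1% PER)", per_from_exponent(-2)),
                       ("4 dB SNR (~15% PER)", 0.15)):
        print(f"{label}: PER = {per:.0%}, "
              f"avg transmissions/frame = {avg_transmissions(per):.2f}")

At 1% PER each frame averages about 1.01 transmissions; at 15% it is about 1.18, which is part of why the 7dB point is the usual design target.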

