WiFi sensitivity question for Jeff Liebermann & anyone well versed in antennas

and for that subsidy, google gets to track and data mine you, even though you think you're avoiding it. you're not.

Reply to
nospam

You will want to test different device orientations as well. Reception might (from my experience: does) differ depending on device orientation (e.g.: with the back facing in the AP direction, with the top/bottom/left/right sides facing the AP).

Additionally, reception will differ as well depending on how/where you put your hands (for an extreme example see the old iPhone 4 Antennagate, where bridging the different antenna segments of the frame could lead to a dramatic decrease in signal strength), which hand you're using (a ring on one hand might influence the readings), whether there's a protective cover on the phone (and of which type, ...), ...

Obviously, you will need to test several devices for each device model (in order to rule out issues with a specific device), and different models altogether.

Lots of influencing factors that you want to take into consideration.

Best of luck,

Michael

Reply to
Michael Eyd

find a 5k all in one for comparison.

the imac 5k first came out 2 years ago and there *still* isn't anything to match, so components is all that's possible.

there are some pc all in ones but they're not 5k displays which means there's even more of a price advantage to the mac.

that's not normal use.

normal use is connecting to wifi networks (public or private) and doing normal everyday tasks, such as surfing the net, skype, checking email, downloading new apps, etc. it's not running benchmarks and geeking out over numbers.

i used to use my iphone 4 in a *wide* variety of places, from at home (fairly strong signal) to airports & hotels (often weak and overcrowded signals) and never had any problem with wifi or cellular.

discussion is useful, not trolling.

Reply to
nospam

You don't use any Google apps on that iOS device?

Reply to
Aardvarks

Without a Google ID, all it has is an advertising ID, which I switch randomly, and for which Google *says* they don't maintain the connection.

I don't log into *any* Google apps, as you know.

Since my system is well organized, I keep a duplicate folder of *just* Google apps, every one of which is logged out (and almost none are used anyway, except maybe Google Maps). I have history turned off if I'm forced to log into an app, but I can't think of any app you have to log into other than Gmail, which is a different beast altogether and which has the same issues on iOS anyway.

As you know, I also have my SSID tracking turned off, so that I'm not spying on myself and my neighbors.

Likewise, I have all app connection to my location turned off, and no app is allowed to use my location unless I expressly turn the app location ability back on with App Ops Starter. And you have that same issue too on iOS anyway, so, nothing is different there.

So, where, may I ask, is Google spying on me on Android that they're not also spying on you in iOS?

Reply to
Aardvarks

Jeff. That is the understatement of the year!

Reply to
Aardvarks

it has more than that.

Reply to
nospam

don't change the topic.

Reply to
nospam

The thing about Usenet is that one can follow a narrative and sometimes actually learn something. I've learned a bit so far, but a point was made that I need help with understanding. Other info snipped....

Is that because Apple engineering/manufacturing is so advanced that no one can match it, or is it because there's no market for similar products?

These comments got me thinking (always a good thing), so I googled to find out how many 5k all in ones had been sold since they were released. All I could find was everyone raving about how great they were and what a fantastic price.... but nothing on how many were sold. I tried to find out because I couldn't understand why anyone would want to pay $2500 for ANY all-in-one computer. So I'm wondering, is the 5k simply an expensive advertisement for brand Apple? Something that looks great but few people want?

Reply to
Charlie Hoffpauir

Like what?

Reply to
Aardvarks

Right. Instead of numbers that are useful for comparing performance, we might have "it feels fast" or perhaps "it does what I need, which is good enough". I used to do battle with such nonsense when dealing with wireless product design. My standard answer was to suggest that perhaps we need more realistic metrics, test conditions, procedures, and environments, not vague impressions or "mean opinion scores".

The nice thing about performance tests and benchmarks is that under real world conditions, you're NOT going to get any better performance than what is achieved by the performance testing. In other words, it puts a ceiling on what to expect with your real world tests. Testing at maximum speeds also tends to expose anomalies that would not necessarily appear using a real world test. Of course, in the real world, products that advertise big numbers tend to sell better than products advertising the lesser real world numbers, which makes companies prefer benchmarking. It's also rather difficult to compare products tested under different real world conditions. Many real world tests are difficult or impossible to reproduce and often produce different numbers.

Duly noted. Have you tried loading iperf or jperf and running a not-so-real-world test yet?
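For the curious, the mechanics of an iperf-style throughput test can be sketched in a few lines of Python using nothing but the standard library. This is a minimal sketch, not iperf itself: sender and receiver run over loopback purely to show the measurement idea, and the port number and payload size are arbitrary choices. A real WiFi test would put the receiver on a machine behind the access point.

```python
import socket
import threading
import time

def run_throughput_test(payload_mb=16, port=50007):
    """Push a known number of bytes through a TCP socket and time it.

    Runs sender and receiver on loopback just to demonstrate the
    mechanics; payload size and port are arbitrary assumptions.
    Returns (bytes_received, throughput_mbps).
    """
    total = payload_mb * 1024 * 1024
    chunk = b"\x00" * 65536

    srv = socket.socket()
    srv.bind(("127.0.0.1", port))
    srv.listen(1)

    received = []

    def receiver():
        conn, _ = srv.accept()
        got = 0
        while got < total:
            data = conn.recv(65536)
            if not data:
                break
            got += len(data)
        received.append(got)
        conn.close()

    t = threading.Thread(target=receiver)
    t.start()

    cli = socket.socket()
    cli.connect(("127.0.0.1", port))
    start = time.perf_counter()
    sent = 0
    while sent < total:
        cli.sendall(chunk)
        sent += len(chunk)
    cli.close()
    t.join()
    elapsed = time.perf_counter() - start
    srv.close()
    # bits received divided by elapsed seconds, in Mbit/s
    return received[0], (received[0] * 8) / elapsed / 1e6

got, mbps = run_throughput_test()
print(f"received {got} bytes at {mbps:.0f} Mbit/s")
```

Over loopback the number is meaningless for WiFi, of course; the point is only that a throughput test is just "known bytes over measured time."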

Reply to
Jeff Liebermann

you've been told several times before.

Reply to
nospam

ok

the former. there's a lot of custom parts in it.

apple doesn't release sales numbers for specific models, but the imac is by far their most popular desktop computer.

but the real question is why does that matter? apple sells enough for it to be a very profitable product.

the price starts around $1800, not $2500, and while *you* might not want to spend that much, many others do, mainly because the display is very, very good.

since the dell display alone costs about the same price, you're basically getting a free computer with the imac.

Reply to
nospam

That's quite true. The RF pattern produced by a cell phone is tailored primarily to meet SAR (specific absorption rate) specifications. There's very little RF emitted in the direction of the head, while much more out the back. Oddly, the peak for smartphones is often straight down, where there are fewer obstructions and the user's hand is not likely to be holding the phone. Try pointing the bottom of the phone at the nearest cell site and see if the signal improves. It does on my Moto G phone.

Measuring the antenna patterns is not easy, but possible. All you need is a $100 million anechoic RF chamber and a huge pile of RF test equipment. I do my best using junk, but it doesn't compare to having the real goodies.

Second best is to model the phone with an NEC4 modeling program. Those are the colorful 3D patterns. I do my best with 4NEC2 free software.

What you'll probably find is that the local RF environment (reflectors and absorbers) has a much bigger effect on RF performance than the cell phone antenna pattern. Both will cause variations in signal strength, often in odd ways. The best I can do is wave the phone around and record the highest reading or the average reading. Neither is perfect, but the effort necessary to obtain a good 3D picture of the phone is just too much work.

In this case, the issue is whether there is a difference in range and performance (speed) between Apple wi-fi devices, and Android wi-fi devices. This can be tested with both types of devices side-by-side and connecting to the same wireless router. I previously posted 2 good ways to perform the test, which so far nobody seems to have performed. Also, nobody has asked me to perform any tests in order to settle the issue, so I'm doing what I do best, which is nothing.

Reply to
Jeff Liebermann

Wow. I like that. Not only are you more cynical than me, you also have an excellent grasp of how to trash a good idea. The problem with your statements, all of which are correct, is that there is no closure. There are an infinite number of tests which "should" be performed in order to produce accurate results. Besides various orientations and locations, such tests should be performed at different temperatures, altitudes, positions of the moon, and using all left handed people. It doesn't matter how carefully one sets up an RF test, there are always missing factors that someone suggests might have an effect. Admittedly, sometimes it does. For example, I participated in one parking lot test where all the 2.4 GHz wi-fi stuff was placed about 3ft above ground level. When I pointed out that at 3ft, the path was well within the Fresnel zone and should be elevated in order to improve accuracy, the engineer in charge was initially reluctant to change, even though he knew I was right. To his credit, he raised the antennas as high as possible, and the result began to look more consistent and realistic.
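The Fresnel-zone point is easy to sanity-check numerically. A minimal sketch, using the standard first-Fresnel-zone formula and assumed round numbers for a 2.4 GHz link over a hypothetical 100 m parking-lot path:

```python
import math

def fresnel_radius_m(freq_ghz, d1_m, d2_m):
    """Radius of the first Fresnel zone at a point d1 metres from one
    antenna and d2 metres from the other:
        r = sqrt(lambda * d1 * d2 / (d1 + d2))
    """
    wavelength = 0.299792458 / freq_ghz  # metres (c / f)
    return math.sqrt(wavelength * d1_m * d2_m / (d1_m + d2_m))

# Midpoint of an assumed 100 m path at 2.4 GHz:
r = fresnel_radius_m(2.4, 50.0, 50.0)
print(f"first Fresnel zone midpoint radius: {r:.2f} m")
```

At roughly 1.8 m of midpoint radius, antennas at 3 ft (about 0.9 m) put half the first Fresnel zone into the pavement, which is why raising them helped.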

As you note, multiple studies tend to offer new insights into a test, which often disprove previous claims. The trick is to have a large number of people performing a test, not necessarily in exactly the same manner. It doesn't take long for a pattern to appear from which additional conclusions can be drawn. That's exactly what happened with cold fusion, where a few claimed to have duplicated the original results, but far more failed.

So, we now have a comparison between Apple and Android. Both devices and wireless routers are sufficiently common that anyone can perform a suitable performance and range test. Claims have been made and all that's needed are a few brave people to extract their faces from their retina monitor and convince first themselves, and possibly others of the relative merits of Apple versus Android.

Reply to
Jeff Liebermann

I've had four iPhones: the original, 3G, 4S and 6. Never had a dropped connection. But then I rarely use it while in my microwave.

Reply to
BobbyK

The primary directions for mobile network antennas and WiFi antennas may be different, so one would have to test them independently...

Side-by-side (taking this literally) might be yet another influencing factor, where one device might severely interfere with the other. Additionally (I forgot to mention that in my previous post), there shouldn't be anybody running around inside the test area (which is larger than just the direct line of sight between the device(s) and the AP), no cars should be passing in the vicinity, there should be no neighboring WiFi networks even at the horizon, ...

I won't do the tests, for several reasons:

- I don't have any Android device available, let alone several different ones.

- Where I live I can easily and at any time find several other WiFi networks.

- I wouldn't have enough open range (without reflections from other houses, passing cars; heck, there are even electrified railroad tracks at about 500m distance).

- ...

Way too bad conditions for performing such a test.

Best regards,

Michael

Reply to
Michael Eyd

Heh heh.

Without a google play account, there's nothing for Google to latch on to.

Reply to
Aardvarks

wrong

Reply to
nospam

True. However, unless you use an RF anechoic chamber, the room environment will have a bigger influence than the antenna patterns. Reflectors and absorbers will ruin any test, unless you're interested in performing a "real world" type of test, which is what this range test might be considered. For example, one review site does its router benchmarks indoors, with plenty of walls and furniture to get in the way. I think the test location is part of Tim Higgins' house, but I'm not sure. The RF environment is far from perfect, but it's identical for each router being tested, which is the point of the test.

There's more detail available on how they run their tests.

The overall results are rather interesting (to me). Different routers, which use the same chipset and roughly the same antennas, produce substantially different performance results. I don't have time to speculate on why, but let's just say that there is a large list of uncontrolled factors that have an effect on the measurements. One can eliminate a fair number with a $100 million RF anechoic chamber, but that's a bit beyond my present means.

True. However, if a wireless client is associated with an access point, but not passing any traffic other than the usual beacons and broadcasts, there is very little traffic that might constitute interference. Offhand, my guess(tm) is about a 1/100 duty cycle. Were any of these packets to collide with traffic from a nearby wireless device, the error would be about 1% from the collision.
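That 1/100 guess can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, where the beacon frame size and the 1 Mbit/s basic rate are assumed round numbers (the 102.4 ms beacon interval is the common default):

```python
def beacon_duty_cycle(frame_bytes=128, rate_mbps=1.0, interval_ms=102.4):
    """Rough airtime duty cycle of beacon traffic from an otherwise
    idle AP. Frame size and basic rate are assumed round numbers."""
    airtime_ms = frame_bytes * 8 / (rate_mbps * 1000)  # ms on the air
    return airtime_ms / interval_ms

dc = beacon_duty_cycle()
print(f"beacon duty cycle: {dc:.2%}")
```

With those assumptions the beacon occupies about 1 ms out of every 102.4 ms, i.e. right around the 1% figure.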

However, for the range test, this will have no effect because we're not trying to squeeze as many packets as possible through a pipe. We're trying to determine the range at which it is no longer possible to pass packets or where the connection becomes unstable. At worst, packet collisions will "blur" the results somewhat. I don't consider proximity to be a problem.

Part of the range test is to take the tablet or iPhone and walk away from the wireless router, noting the range at which traffic ceases. Presumably, one would need to hold the device to do that. At the frequencies involved, placing the device on top of a cardboard box when carrying it will minimize proximity effects and antenna detuning, while still allowing one to monitor the device. It's far from perfect, but methinks good enough.

Borrow one or invite your friends to the test. Or, are all your friends Apple users? What a horrible thought.

Not a problem. You're not trying to maximize throughput, just determine how far you can operate before it quits. You can do that with ping, which hogs very little bandwidth, and will not interfere with the neighbors streaming wireless connection.

I think you'll find that at 802.11g speeds, with the wireless fixed at 54 Mbits/sec, you'll get about 30 meters range. The transition between working and dead will be quite abrupt, usually within a meter or two. If you find a straight line path that's about 50 meters long, you should be ok.
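A back-of-envelope link budget supports that ballpark. A minimal sketch using the standard free-space path loss formula, with assumed round numbers for transmit power, antenna gains, and channel 6 (2437 MHz):

```python
import math

def fspl_db(dist_m, freq_mhz):
    """Free-space path loss in dB:
       20*log10(d_m) + 20*log10(f_MHz) - 27.55"""
    return 20 * math.log10(dist_m) + 20 * math.log10(freq_mhz) - 27.55

def rx_dbm(dist_m, tx_dbm=15, tx_gain_dbi=2, rx_gain_dbi=2, freq_mhz=2437):
    """Link budget with assumed round-number power and antenna gains."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(dist_m, freq_mhz)

loss = fspl_db(30, 2437)
level = rx_dbm(30)
print(f"FSPL at 30 m: {loss:.1f} dB, received level: {level:.1f} dBm")
```

Roughly -51 dBm in free space at 30 m leaves healthy margin over a typical 54 Mbit/s receive sensitivity; walls, ground reflections, and body losses are what eat that margin in practice, which is why the working/dead transition shows up near that distance.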

It doesn't matter. We're comparing two devices, not producing an absolute measurement. Absolute measurements would be nice, so we could compare your results with mine and others, but that's not going to happen without an extremely well controlled environment. However, when comparing two devices, the conditions are identical, and therefore the comparison is quite valid.

You probably spent more time finding excuses to not run the test than it would take to actually perform it. Thanks for at least thinking about the problems involved.

Reply to
Jeff Liebermann

Cabling-Design.com Forums website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.