I’ll apologize for starting it. Hadn’t heard of CHIP or this fella when I posted it, and hadn’t really seen it mentioned on here. I thought it was slightly interesting. Still don’t know who the hell he is but apparently he pissed y’all off at some point!
I expect CHoIP and Thread are basically going to work out the way the cartoon nickrout posted suggests.
They’re not going to be the “standards” everybody switches to so much as yet another “competing standard”.
However, this is the beauty of something like Home Assistant. Rather than worrying about which of all those competing standards I’ll embrace, I’ll just use HA and integrations to put them all together into one UI that controls them all.
My apologies, I thought the purpose of buying prosumer-level or better WiFi hardware was that it could actually coexist with other signal sources within half a mile.
Imagine my surprise when rolling out Unifi APs to high-density office blocks in the middle of London to discover that they’re rendered completely useless by the nearest neighboring office with a microwave!
It’s a good thing you live on a farm! Best keep those button cell powered zigbees away from your radios!
See you in 4 months =D
P.S. The Zigbee channels that don’t overlap the standard WiFi channels 1/6/11 are 15, 20, 25, and 26.
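If anyone wants to sanity-check that list, it falls straight out of the channel math. A quick sketch (assuming the nominal 2 MHz Zigbee and 20 MHz WiFi channel widths; 802.11b’s 22 MHz would shave the margins a bit):

```python
# Which Zigbee channels clear WiFi channels 1/6/11?
# Nominal widths assumed: Zigbee 2 MHz, 2.4 GHz WiFi 20 MHz.

def zigbee_center(ch):  # Zigbee 2.4 GHz channels are 11-26
    return 2405 + 5 * (ch - 11)  # MHz

def wifi_center(ch):  # 2.4 GHz WiFi channels 1-13
    return 2407 + 5 * ch  # MHz

wifi_bands = [(wifi_center(c) - 10, wifi_center(c) + 10) for c in (1, 6, 11)]

clear = [
    ch for ch in range(11, 27)
    if all(zigbee_center(ch) + 1 <= lo or zigbee_center(ch) - 1 >= hi
           for lo, hi in wifi_bands)
]
print(clear)  # -> [15, 20, 25, 26]
```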
Of course it will be, but imagine again the effort that has to be spent to create each integration. If companies pick up CHIP, then a single integration will cover hundreds of brands.
The better question is whether it will allow local control, or only work through a cloud service. Since it’s partially based on Zigbee, hopefully the latter isn’t the case.
Zigbee will not harm your WiFi; if anything, it’s the other way around. And with multiple WiFi IoT devices on your network, I would worry more about having to keep legacy standards active to make them work. But that’s my 2 cents.
And to be honest, I would worry far more about bad-quality USB 3 cables and ports than about low-power Zigbee; USB 3 is notorious for radiating noise right in the 2.4 GHz band…
Oops… I posted that before…
Otherwise, Aqara/Xiaomi has just recently come out with a set of Zigbee 3.0 certified products; they might not be sold worldwide yet.
Look at the 433 MHz band again; it is fairly full of all sorts of devices (remotes, weather stations).
It is simple: do not buy products from outside your region. The issue is with sellers who advertise products incorrectly and sell them outside their region.
Otherwise, Zigbee has region-specific bands as well (915 MHz for the Americas and Australia, 868 MHz for Europe), but they are not commonly used by manufacturers.
That’s not how radio works; it’s a two-way street. Sure, WiFi may drown out Zigbee completely, as it is the stronger signal, but the Zigbee devices are also raising the noise floor that WiFi has to deal with, leading to reduced range/performance.
So mixing them is not a good idea.
As I said originally, way back up the thread: the best outcome would be for CHIP to pick an alternate, unoccupied band.
Those are not problematic. Look into the definition of duty cycle: interference from devices transmitting short bursts of data can easily be mitigated by error-correcting protocols. The problem is devices that transmit data continuously, and you won’t find those on 433 MHz, because that band is duty-cycle regulated.
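To put rough numbers on that (a sketch; the 10% duty-cycle cap and the burst/interval figures below are illustrative assumptions, as the actual EU 433 MHz sub-band limits vary):

```python
# Rough airtime math for a duty-cycle-limited 433 MHz device.
DUTY_CYCLE_CAP = 0.10   # assumed 10% cap, for illustration

burst_s = 0.05          # hypothetical 50 ms transmission burst
interval_s = 30.0       # hypothetical report every 30 seconds

used = burst_s / interval_s  # fraction of airtime actually occupied
print(f"airtime used: {used:.2%} of the {DUTY_CYCLE_CAP:.0%} budget")
# -> airtime used: 0.17% of the 10% budget
```

Even a chatty weather station barely dents the budget, which is exactly why short-burst devices are easy to live with.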
Well no, it’s not that simple. Ask the Australians about the “huge” choice of Z-Wave devices they have… This is about the manufacturer’s point of view: supporting and manufacturing your device in n different regional permutations versus only one. Guess which one manufacturers are going to pick.
Actually, look at who manufactures Z-Wave chips vs. who manufactures Zigbee ones. You will be surprised. (Z-Wave wasn’t an open platform until recently…)
Australia is an odd example… Look at car manufacturing as well. Do you think that’s the same as IoT?
The local frequencies have been approved according to local regulations. In some regions, first responders use frequencies that match other regions’ Z-Wave frequencies.
Zigbee on the channels I listed above has been experimentally demonstrated to cause at worst 0-2% packet loss.
Picking an “alternate unoccupied band” is not a thing (especially if you actually care about range/performance), even if we ignore licensing.
Zigbee uses a 2 MHz wide channel, which fits into the gaps between the 20 MHz wide 2.4 GHz WiFi channels. It has been independently, experimentally demonstrated to cause at worst negligible interference when positioned between the primary WiFi channels. Zigbee devices are low-power and tend to broadcast only in small, intermittent bursts, so we’re looking at a small fraction of the time during which that 0-2% interference even has an impact.
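Back-of-envelope, for anyone who wants a feel for how small those bursts are (the frame size and report rate below are assumed, typical-ish numbers rather than measurements):

```python
# 2.4 GHz Zigbee signals at 250 kbit/s, so frames are on the air very briefly.
ZIGBEE_RATE_BPS = 250_000

frame_bytes = 60      # assumed sensor report, headers included
reports_per_min = 1   # assumed reporting interval

frame_s = frame_bytes * 8 / ZIGBEE_RATE_BPS   # ~1.9 ms on air per frame
airtime = frame_s * reports_per_min / 60      # fraction of each minute
print(f"{frame_s * 1000:.2f} ms per frame, {airtime:.4%} airtime")
# -> 1.92 ms per frame, 0.0032% airtime
```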
If you truly cared about 1% performance, you wouldn’t be running legacy 2.4GHz devices that are already limited by the WiFi spec.
If you truly cared about performance, you wouldn’t be running 3 separate channels, you’d be using wider channels for higher throughput.
If you truly cared about performance, you wouldn’t be loading more devices onto the same wifi network, which actually has a measurable impact on performance.
If you truly cared about range, I’d expect your APs to be far enough apart that you could run them across 2 alternating channels, or you’d set up repeaters, or you’d be running LR APs, or you’d be using one of the array of other range-extending options available to you.
If the local grammar school back where I was born can deploy over 130 Ubiquiti APs on a single site, I’m sure the tech can manage 3 APs and a few rogue radios.
Overall, this fear of zigbee reminds me a lot of anti-vaxxers who go on about the “mercury” in vaccines causing autism, because they heard vaccines have mercury, and they heard mercury is bad, but missed out on the 4 years of medical school in between.
Mate, if you want to mix the two go for it.
Sure, that’s the theory and the ideal. Look, however, at how Philips and others have implemented it and you’ll likely (and sadly) see the future of CHoIP.
I’ve always thought that standards should be standards, but apparently that doesn’t play well with certain profit motives.
WiFi and Zigbee do interfere, period. Zigbee uses a pretty clever error-correction scheme to reduce it, but the effect is not zero. I tested it on my own installation. Your post is dogmatic and presumptuous.
You don’t always get a choice about the WiFi channels you run, especially when you have neighbors, so you may not be able to run channels 1, 6, and 11. I know I can’t. The interference is a lot more than 1%. I had a case where my WiFi completely overwhelmed my Zigbee signals, rendering them unresponsive. There is no Zigbee channel that doesn’t interfere with 2.4 GHz WiFi. Yes, many other devices interfere as well, but not as much as these two (plus microwaves, which blanket the whole spectrum). I have walked around my house with a spectrum analyzer to verify all of this.
Running 3 separate non-overlapping channels is what brings the most performance: avoid channel overlap and maximize width. If you reduce the width, you get less performance but also less interference.
It’s not so much about “caring about performance” as it is about the efficiency of an installation: minimizing power consumption, reducing retries, packet collisions, and useless RF emissions, and improving scalability and reliability. You can certainly do what you want with 130 Ubiquiti units, and it surely makes them very happy to sell this underperforming and overpriced junk as “prosumer”, but it is inefficient and wasteful.
As for anti-vax… until you experience it yourself or it strikes close to you, your analogy is horrible, and it is pure dogma. Try feeding yourself some asbestos, cigarettes, and lead, all of which were taught in med schools to be good for your health not so long ago.
CHIP is yet another standard trying to be the standard, and I don’t really see what is new about it. If it runs on WiFi, I certainly won’t be touching it. My combination of Z-Wave and Zigbee works perfectly fine; I went on a rampage a couple of years ago getting rid of all my WiFi home-automation devices, which were wasting WiFi airtime and bandwidth, and kept WiFi for higher-throughput devices. Zigbee is much better than WiFi in this respect, but Z-Wave is better still…
Actually, Philips bulbs follow ZLL, and the Philips ecosystem gets the maximum out of the standard.
Look at the Hue Sync box and Ambilight or any dynamic scenes with multiple bulbs.
I would rather call out Xiaomi, Tuya, or Wiser (Schneider Electric), though the last has never claimed that their devices are Zigbee certified.
Anyhow, the standard gives manufacturers a lot of room to implement proprietary commands in the manufacturer-specific clusters, and that may be abused a bit too much. It wouldn’t be a problem if the manufacturers shared how these specific clusters are used, but they don’t.
Found the anti-vaxxer
As someone on the spectrum - I’d rather have Autism, than be dead.
Just because you configured your network poorly doesn’t mean it’s not possible to do it right.
The salient question for me is “Can I use this with [generic ZigBee hub], or do I need the [brand-name hub] to control it?” My understanding is that Hue bulbs fall into the latter category. That’s what I was referring to.
Agreed on Xiaomi, Tuya, Wiser, et al. Which actually brings us to my main point: when a standard isn’t really the standard, you end up needing multiple hubs and integrations to make the disparate products work together. HA does a great job of that, but it’s still a bit annoying that manufacturers do the only-partially-compatible thing. A lot of manufacturers do it. It’s still annoying.
My house is 100% Hue Bulbs and 0% Hue Hubs - I’m using a simple slae.sh stick, but I suspect other hubs would also work.
Actually, Hue bulbs work with other platforms.
Just an example:
The only issue is that they use ZLL, which is limited to certain Zigbee channels (the ZLL primary channels: 11, 15, 20, and 25); if your current Zigbee channel is not one of those, you need to change it. The other “issue” is resetting the bulbs, but that is another story and has nothing to do with Zigbee really; it was just a design choice of the original Hue system.
Otherwise, for the really fancy light control you need a Hue bridge, but that is only because the heavy lifting is done by the programming of the app or sync box together with the bridge. It could be ported to other controllers as well, but most of them don’t like the high-frequency changes of color and brightness.
What’s really annoying is when the standard is not defined in a way that everyone understands the same, so different manufacturers (programmers) interpret it differently.
It happened with Z-Wave and multi-endpoint devices: Fibaro and Qubino historically interpreted the spec differently, and compliance enforcement under the current Z-Wave standard rendered some of their devices non-working due to noncompliance. That is annoying…
SiLabs have nothing to do with local frequency management. All their ZWave chips work on the entire supported spectrum covering all zones. It’s up to the integrator / manufacturer of the actual IoT device to localize a chip to a certain geographical zone by using appropriate crystals and efuse settings. It’s these manufacturers that have to deal with all the overhead that comes with having localized products: design and manufacturing multiple versions of a single product, supply chain, support, etc. If they can avoid all that by using a globally available frequency band, they will. Even if that means adding yet more interference to an already crowded band.
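To illustrate the permutation burden (a sketch; the frequencies are the commonly cited ones for three regions, and the SKU naming is made up entirely for illustration):

```python
# Why one Z-Wave product becomes n regional SKUs: same chip, different
# crystal/efuse tuning per band, so separate manufacturing and support.
ZWAVE_REGION_MHZ = {"EU": 868.42, "US": 908.42, "ANZ": 921.42}

def regional_skus(product):
    # Hypothetical SKU naming, one per localized band
    return [f"{product}-{region} ({mhz} MHz)"
            for region, mhz in ZWAVE_REGION_MHZ.items()]

print(regional_skus("wall-plug"))
# -> ['wall-plug-EU (868.42 MHz)', 'wall-plug-US (908.42 MHz)',
#     'wall-plug-ANZ (921.42 MHz)']
```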
I actually like Paul Hibbert.
He gets a little bit nutty but I think he makes some good points and he has decent reviews.
But maybe I’m just a bit simple… (nah, that’s not it…)
I have a mix of a Unifi LR AP and a couple of ASUS routers running on divided non-overlapping channels so I have the whole spectrum taken up by 2.4GHz wifi. And I have a strong microwave that garbles my Bluetooth headset if I get too close.
I also have many zigbee devices running on the default channel (15 I think?). I have 2 zigbee radios that are only 6 inches apart. The radios are 3 ft from my ASUS router and 12 ft from my microwave.
I can’t say that I’ve had any issues with my zigbee stuff in well over a year. It literally just works.
I don’t think I’ve done anything special at all to minimize interference except split my AP’s to different channels.
Maybe I’m just lucky.
Strong? or Leaking?
yes…(10 char min…)