I'm guessing maybe the "European Commission" threw them off, because it's an EU entity (basically the executive branch), not a "Europe-wide" one, which the name kind of implies. But then "EU" also implies "Europe-wide" in its name, and people seem to get the difference most of the time.
The rules the EU establishes will also apply to the EEA, and in practice will almost certainly also be adopted by the UK, which has tended to take its lead from the EU on such matters since Brexit. So, while pedantically these are not rules for Europe, _for practical purposes_ they likely will be.
And even the EU itself is pretty fragmented with various overlapping areas with different rules.
As someone who's studied European relations, I can tell you that it's a real mess, and the fact that journalists don't accurately report the facts definitely isn't helpful.
It really should have been EUR for Europe, and EU as in European Union.
I do think the media should aim to do better so agree that the Register should have used the correct term.
Which is, of course, true; however, in English conversation, it's often nothing more than pedantry. In Spanish it makes more sense, since there is a separate demonym for a US person that doesn't co-opt the term "American."
Outside of Romance-language speakers born on the American continents, I agree that everyone seems fine calling US-born persons "Americans" without much confusion or gnashing of teeth.
Which is all kinds of weird because - what about Mexico and Canada? And what about the ‘United States’ part?
It’s just to disambiguate from ‘Americano’, which others in South America sometimes use to refer to Latin Americans, and as a little bit of an FU to the USA, hahah.
In French, people from the Americas are américains. This includes, say, Québécois and Brazilians. When context matters, people from the US are états-uniens.
The point is that nobody would object if you refer to someone from anywhere else in the Americas as américain. Like my lab mate from Buenos Aires or friends from Montréal. And we’re definitely not in Haiti.
One of the most famous soccer teams in Mexico is even called “Club América”, obviously this doesn’t refer to the US.
That ambiguity disappears if you call it "the Americas", but many places see America as one continent (including Latin America, parts of Europe and the Olympic flag)
And in French the inhabitants of "les États-Unis" are "États-uniens". I've taken the habit of referring to them as USAians, which often gets negative reactions and remains rare - but I find it is the most accurate demonym and I'll keep pushing it.
I look forward to the world inventing demonyms for the citizens of the European Union, because at least it will mean that our emerging national body is getting mindshare!
USA is a country and EU is not
And, being on an island, British people are probably never going to stop thinking of “the continent” as at least a little bit of a different thing from themselves.
Radio, by virtue of physically not caring about borders, is a really really hot mess, with lots of very powerful and very monied interests floating around.
It's kinda like Canada and the US.
They align with the EU a lot for commercial reasons but not completely. Same with Norway. And really, if they did align completely there'd be no reason to not join.
Everyone understood that it was the relevant, nearly pan-European political entity that was actually meant by the geographical designation.
I see that a lot. It just takes people in the comments who are from the Americas but not from the US, and suddenly there are people complaining about it.
The distinction between EU and Europe is very important. They're "word stealing" something as neutral as a geographical concept, to make it political.
But in this case, if the EU legislates on this, other non-EU European countries will probably follow.
To be even more pedantic you have to throw in Africa as well, as that is connected by land to Asia just like Europe is! Now we have the supercontinent Afroeurasia, which contains something like 85% of the world's population.
Relevant watching: Map Men: How many continents are there? (https://www.youtube.com/watch?v=hrsxRJdwfM0)
If we were to rearrange the continents, if anything we should split up Asia further, and split Africa into Northern Africa and Sub-Saharan Africa (maybe giving Sub-Saharan Africa to the same new continent as the Arabian peninsula, but that's debatable).
Are there? Most Latin American countries, which see all of America as a single continent, would disagree.
It really makes no sense to argue about this. As I already mentioned, there is no universally agreed model of which continents exist and where the boundaries lie. In the end people feel like they belong to one or the other and that's as far as it will probably ever get.
Also: VoLTE hasn't been a thing for that many years, and there are probably still a ton of smartphones out there that don't support it (and thus switch back to 2G/3G to place voice calls).
[0]: https://newsroom.vodafone.de/2g-abschaltung-macht-lte-und-5g...
[1]: https://www.telekom.de/hilfe/2g-abschaltung
[2]: https://www.rosenberger-telematics.com/en/news/switching-off...
And yes, many things broke: train ticket vending machines stopped working, smart meters stopped working, etc. But then they got replaced.
2G and 3G are a horrible waste of bandwidth compared to 5G. Keeping them on and wasting all that bandwidth is borderline negligence.
I think the reasoning was that the heavy data 3G users had already upgraded to 4G and beyond, and low data 3G users could fall back to 2G, so retiring 3G would have negligible impact - while opening up a lot of bandwidth for 4G and 5G.
On the other hand, there were plenty of 2G-only low data users around, so retiring that early would break stuff for a lot of people. Keeping it around longer gave people more time to upgrade.
For example, lower-frequency bands have longer reach, but lower bandwidth. Because everyone has 2G support, it makes sense to put 2G on the lower frequencies as fallback, with 3G/4G/5G on higher frequencies as optional bandwidth booster. But this also means 5G reliability is being limited by its frequency! You could have better 5G - if it could use the frequency currently occupied by 2G...
It also doesn't help that 2G and 3G aren't forward-compatible. They require their own dedicated frequency, so you need to sacrifice a lot of potential bandwidth for a small number of low-data devices. 4G and beyond can play nice with future tech: a single base station using a single frequency can handle both 4G and 5G connections at once - it'll dynamically allocate the air time as needed.
About the elderly: my 95-year-old grandma uses a tablet for video calls and a big-button 4G-capable feature phone. My 85-year-old other grandma has fully embraced her smartphone. Turns out they really like seeing pictures of their great-grandkids! Give them a reason to switch and they will adopt it - they both ditched their land lines.
Same with elevators and stuff: schedule a kill date 5 years into the future and they'll be replaced by 4G-capable units instead of ancient slightly-cheaper 2G-only ones when their warranty inevitably expires.
This is the same reasoning behind keeping the "old" analog telephone network active and not switching everyone to VoIP: there are situations where it's still used by stuff that is critical or too expensive to replace.
> with 3G/4G/5G on higher frequencies as optional bandwidth booster.
There are 5G bands around ~700MHz (spectrum recovered by switching DTV to more efficient encoding) that could be used, and those are even lower than 2G, which sits around 900MHz.
They could (and probably will) shut down 2G for consumer use, but keep some frequencies for operators that provide M2M SIMs.
> Give them a reason to switch and they will adopt it - they both ditched their land lines.
I've tried multiple times to teach my grandma how to send an SMS and failed. If she uses a mobile phone (a rare situation) she uses it like a landline phone, i.e. she types the number she wants to call, without even using the contacts in the phone. To be fair, I also had difficulties explaining how to use a cordless landline phone.
Speaking of the elderly, a lot of them have dedicated devices they can use to make emergency calls to registered numbers, which probably use the 2G network (some even use a landline). Since these devices are provided for free by the national healthcare system, I suspect there isn't much money to spend on upgrading them.
BTW, are we sure that all the smartphones out there support VoLTE? If not, they need to fall back to 3G/2G to make phone calls. It was a common problem not many years ago, and some providers (Iliad) only started supporting VoLTE less than 2 years ago...
And for 2G especially, GSM has a physical cell-size limit due to TDMA that LTE does not have, so in sparse areas the same transmitter location can reach further with LTE.
If in your country 2G still has better coverage that's not due to technical superiority of the standard but due to decisions made by the operator.
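(For the curious, that GSM cell-size cap comes straight from the timing-advance field. A quick back-of-the-envelope check, using the standard GSM bit period and 6-bit TA range:)

```python
# GSM timing advance is a 6-bit value (0..63), each step worth one bit period.
# The signal makes a round trip, so the one-way distance is halved.
C = 299_792_458                  # speed of light, m/s
GSM_BIT_PERIOD = 48 / 13 * 1e-6  # ~3.69 microseconds per bit
TA_MAX = 63

max_radius_km = TA_MAX * GSM_BIT_PERIOD * C / 2 / 1000
print(round(max_radius_km, 1))   # ~34.9 km, the familiar ~35 km GSM limit
```

LTE's timing advance range corresponds to something on the order of 100 km, which is why the same tower can serve a wider area in sparse regions.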
> mainly because a 2G connection is more tolerant to low signal and noise
Huh? Everything I've heard about 2G indicates that it is an incredibly noisy protocol with horrible congestion characteristics, and that it craps out even when there's only a few devices. Maybe it's only winning because it has nearly completely disappeared?
It's like robocalling and the fraud texts and calls you receive. Carriers can't sell services to filter them if they fix the issues that allow these to happen in the first place.
The way to fix it is to make the carrier liable. It will be fixed instantly.
https://www.fcc.gov/document/fcc-opens-entire-6-ghz-band-ver...
https://publicknowledge.org/ted-cruz-wants-to-sell-your-wi-f...
https://www.ofcom.org.uk/spectrum/innovative-use-of-spectrum...
Whereas for mobile operators it would be very useful in outdoor/indoor (airports etc) urban areas that are very busy.
Can’t they be just another user of a well established standard or do they want to abuse the crap out of it?
If other types of devices also use your channel, you'll have to shut up and wait for airtime even longer. Having WiFi and cellular co-exist means that they are both fighting each other over airtime, and both spending a lot of time silent.
It's preferable to avoid channel overlap when the services need to co-exist.
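A crude way to see that cost (a toy slotted-contention model, not the actual 802.11/NR-U coexistence machinery; the contention-window value is just an illustrative assumption):

```python
import random

def useful_airtime(n_stations: int, slots: int = 100_000, cw: int = 16) -> float:
    """Toy model: each round every station draws a random backoff in [0, cw);
    the lowest counter transmits, ties collide and the slot is wasted."""
    success = 0
    for _ in range(slots):
        backoffs = [random.randrange(cw) for _ in range(n_stations)]
        if backoffs.count(min(backoffs)) == 1:
            success += 1
    return success / slots

for n in (2, 5, 10, 20):
    share = useful_airtime(n)
    # total useful airtime shrinks, and each station only gets ~1/n of it
    print(f"{n:2d} contenders: {share:.0%} useful, ~{share / n:.0%} per station")
```

The exact numbers are an artifact of the toy model, but the shape is the point: more contenders on one channel means more wasted slots and a smaller slice for everyone.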
Tell me what stops them from using the exact same technology they use for WiFi calling? They just want to own the means by which people connect to the internet and be a tax on everyone.
It's important to note that "they" are not trying to fix calls, they are trying to improve cellular connectivity. Getting calls to work is easy, and traditional calls have become a niche use of smartphones. Many already have excellent internet connectivity on their devices and would like that to just remain seamlessly available in all situations rather than having to switch technologies and maintain multiple subscriptions.
While I'm quite happy with my fiber at home, the only thing I really use WiFi on my phone for at home is to access local devices. In other situations, especially in corporate or public settings, WiFi is not only inconvenient but often a far worse and slower experience than just staying on 5G - after all, my phone gets gigabit on 5G with a public IPv6 address, and the latency is pretty good too, which can't be said for overcrowded WiFi and crap enterprise network solutions. If it weren't for casting, 5G + tailscale would alleviate most needs for WiFi.
Heck, for the smartphone-native generation it might even seem weird that they need another internet subscription for their home when they already pay for one for their phone, "just because" 5G or whatever wasn't allowed to step on a spectrum - a quite literal tax.
(Don't get me wrong, I like my WiFi, but cellular is not the enemy. We just need to hand out more spectrum to both.)
We do need a lot more frequencies to be opened up both for personal and professional use. New technologies, dynamic long and short range connections etc.
We also need to enforce frequency usage as well so that a neighbor of ours doesn’t block the entire 2.4GHz for the entire block with his access points blasting at full power.
Here’s the problem: 6GHz already became a WiFi standard, and these cellular companies are lobbying to retroactively change the frequency allocation for themselves because they think they can use it better and, more importantly, all the research and development is already done and they don’t want to waste money developing new technologies there.
But why? Why would we do all the research and development with public funds and then allocate the frequency bands to cellular companies and let them charge people $100+ per month and have 40%+ profit margins while increasing their prices over 60% since 2020?
Hypothetically speaking, just a small fraction of that money could be used to put fiber internet all over the place with tons of 6GHz access points and let everyone have free internet.
The cellular companies are late to the game here so they can have a small section of the 6GHz or some of the 7GHz can be opened up but there’s no reason for 6GHz to be retroactively given to them because they lobbied for their own benefits.
> Hypothetically speaking, just a small fraction of that money could be used to put fiber internet all over the place with tons of 6GHz access points and let everyone have free internet.
That's what 5G is: fiber running to a bunch of APs using a technology suited to covering an entire area full of devices with high-speed internet.
WiFi is not that technology (it doesn't target that kind of device density or coexistence, and only really works well with very low device counts in RF-quiet buildings), nor would anything about that be cheaper - it sounds like the issue you voice is mainly greedy ISPs, and using WiFi deployments would not give them any reason to be any less greedy. A free internet service is a choice that can be made with both WiFi and 5G.
(Yes, it would be nice if they didn't both trample on upper 6GHz, but I'm not sure WiFi is the greater value prop for those channels compared to improving cellular.)
2.4GHz is completely unusable in urban environments, because you're getting interference from two dozen neighbours. And everyone has a poor connection, so their "handy" nephew will turn up the transmission power to the maximum - which of course makes it even worse.
6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.
On the other hand, cellular networks are well-regulated: if an airport's entire network is managed by a single party they can just install extra antennas and turn down the power.
And it's not like cellular operators will be able to use it often: outdoor use falls apart the moment there are a bunch of trees or buildings in the way, so it only makes sense in buildings like airports and stadiums. Why would the rest of society have to be banned from using 6GHz Wifi for that?
Besides, didn't 5G include support for 30GHz frequencies for exactly this application? What happened to that?
I agree with this and the fact that 6GHz should still be available for wifi, but this whole bandwidth frenzy over wifi has always seemed like a meme for anyone except power users. A 4K Netflix stream caps out around 15 Mbps, so >95% of typical home users will be just fine using 2.4/5GHz inside their own homes.
In practice it is all about degraded performance. If you're sitting in another room than the AP, close to your neighbour, do you want to be left with 50Mbps remaining out of the original 5000Mbps, or 2Mbps remaining out of the original 200Mbps?
Yeah, but that's just because Netflix streams are ridiculously overcompressed -- they use extremely low quality encodes. It's technically a "4K" stream, sure, but at a bitrate only realistically capable of 1080p.
An actual 4K stream (one that delivers the expected quality at 4K resolution) is around 30 to 40 Mbps.
I mean sure, it's usable, but it's not good. You can notice the differences in buffering / scrubbing speed well into the 100+ Mbps range.
Plus, being able to download and upload files quickly, particularly from something like a home NAS, is important. 15 Mbps is like using a shitty USB 2 stick for everything!
The point here is that only devices like a TV, mobile, tablet or laptop should be on WiFi and it's pretty hard to notice the difference between say 50Mbps and 500Mbps on any of those except maybe if you are moving files around on your laptop.
Your smartphone is not talking to your NAS over Ethernet.
Traffic is bursty. Higher bandwidth connections make the whole internet more efficient - if you can race to idle then servers have fewer concurrent connections to keep track of, routers can more quickly clean up their NAT tables etc etc
I'm no expert and only speak from personal experience. When the signal is weak, you don't have the whole bandwidth, you only get low throughput. Ideally you would want a strong, high-penetration signal (low frequency) and all users on separate channels. That's of course impossible in densely populated areas.
Whenever I have to deal with setting up WLAN in the office or at home, I hate the experience and I try to use wired connections wherever possible.
It gets really bad when signal is difficult to distinguish from noise because (for example!) everyone is talking at roughly the same power level. Think crowded bar with everyone yelling at each other.
When one is significantly louder than others, even if the others are not that quiet, it’s not a big deal unless at your ear/antenna they have the same loudness. Think concert with big speakers for the main act.
6GHz is better for many isolated networks right next to each other precisely because the others’ ‘voices’ lose power so quickly. You don’t have the competition for attention. Think ‘every couple in the bar gets their own booth’.
Wired connections are even better, because the amount of noise required to be unable to tell apart signal from noise is orders of magnitude higher - like ‘noisy welder right on top/EMP’ levels. Because the wires can actually be shielded. It’s like having your own hotel room.
Isn't this mostly arbitrary? E.g. what frequency range one defines channels over, and thus how many channels there are? E.g. in the Wikipedia link that "6GHz" goes up to ~7.1GHz. Because otherwise the channels seem to be spread more or less 20MHz apart (centre to centre) in each case.
The intrinsic benefit for the frequencies around 6ghz is the reduced penetration through walls which will also reduce the congestion.
The really big problem here is that 6GHz also comes with the ability to put 320MHz towards one channel, so it's got double the bandwidth of 5GHz as well as lower penetration. It's really good for things like VR headsets due to the lower interference and higher bandwidth.
The main benefit is going to be the additional frequency space. 5GHz effectively has 3ish channels, and 6GHz adds another 3-7 to that. Combine it with band steering and dynamic channel allocation, and you and all of your close neighbours can probably all get your own dedicated frequency.
It would be useful if vendors shipped with 40Mhz channels by default.
A 1x1 40Mhz using 802.11ax will give you a max PHY of 287Mbps:
* https://www.intel.com/content/www/us/en/support/articles/000...
* https://superuser.com/questions/1619079/what-is-the-maximum-...
Even if you halve that, that's (IMHO) probably sufficient for the vast majority of online activities. And if you have a 2x2 client you double it anyway.
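Those PHY figures fall out of the standard 802.11ax data-rate formula; a quick sanity check, assuming MCS 11 (1024-QAM at rate 5/6) with the 0.8 µs guard interval:

```python
# 802.11ax (HE) data subcarrier counts per channel width
DATA_SUBCARRIERS = {20: 234, 40: 468, 80: 980, 160: 1960}
BITS_PER_SUBCARRIER = 10   # 1024-QAM
CODING_RATE = 5 / 6        # MCS 11
SYMBOL_TIME = 13.6e-6      # 12.8 us OFDM symbol + 0.8 us guard interval

def he_phy_rate_mbps(width_mhz: int, spatial_streams: int = 1) -> float:
    bits_per_symbol = DATA_SUBCARRIERS[width_mhz] * BITS_PER_SUBCARRIER * CODING_RATE
    return spatial_streams * bits_per_symbol / SYMBOL_TIME / 1e6

print(round(he_phy_rate_mbps(40, 1)))   # ~287 Mbps (the figure above)
print(round(he_phy_rate_mbps(80, 1)))   # ~600 Mbps
print(round(he_phy_rate_mbps(40, 2)))   # ~574 Mbps for a 2x2 client
```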
It's not saying 6GHz shouldn't be used for WiFi. It's saying that 6-6.4GHz (approx) is reserved for WiFi and 6.4-7GHz should be used for cellular networks.
My point isn't that we should have no WiFi on 6GHz, but that an extra 1GHz for WiFi is of limited utility compared to giving it to cellular networks.
You can still fit an entire 320MHz channel width in the lower 6GHz, and if it doesn't overlap like you say, why bother with 3x that?
6GHz barely makes it through a piece of paper. I live in a dense downtown area of Los Angeles and I see zero 6GHz networks except mine, and sometimes three 5GHz networks (usually just two). No issues using a 160MHz-wide channel on 5GHz, at least for me.
My balcony is separated from the AP by a two-panel window; other than that it's in line of sight: 6GHz is not visible at all, 5GHz has a poor signal, but better than 2.4GHz. 2.4GHz is completely unusable in my area.
Just today, there’s a news report in India where the major telecom companies have lobbied for the entire 6 GHz band to be reserved for mobile services, and for not even part of it to be left for unlicensed WiFi. [1]
The problem in India is that the penetration of wired broadband is very low, and the telcos don’t seem to be interested in expanding it as much as they are in grabbing more of wireless spectrum.
I don’t believe it’s a good move to reserve these exclusively for mobile services. We (in general) need more unlicensed spectrum for innovation. Let the companies figure out another way out.
I also know that these bands are already allowed for unlicensed WiFi use in the US.
[1]: https://telecom.economictimes.indiatimes.com/news/industry/j...
I don't know anything really about India's telecoms market, but I know in other 'similar' countries you can buy a mobile phone data plan for like a couple/few dollars a month, but a fixed line is 10X that. You could argue it's not very progressive to reserve the spectrum for the 'rich' who can afford fixed lines.
Mobile data is cheap, but broadband is much cheaper.
My family lives outside of a tier 2 city border, in what used to be farmland in the 90s.
They have Asianet FTTH at 1Gbps, but most of the video/streaming traffic ends at the CDN hosts in the same city.
That CDN push to the edge is why Hotstar is faster to load there - the latency on seeks isn't going around the planet.
You could go active but then your SFP/SFP+ per port cost eats you up.
For less than $1 million, fixed wireless is going to cover 2,800 sq km. You are not going to get anywhere near that cost trying to do the same thing for 2,048 (or more) subs in that footprint with fiber. That wouldn't even cover your fiber material cost!
One could see India deploying the same density compatible infrastructure in the usual "leapfrog" model of skipping lesser technology implementations in this space.
LTE is what somebody would do without much telecom experience and more money than sense.
I've built fiber networks and fixed wireless networks. Almost ended up becoming an LTE network as well. It didn't make any sense in any sort of financial modeling, even with spectrum availability.
LTE helps solve "general connectivity". What it does not do is build scalable, reliable, high-speed, economically sensible broadband infrastructure.
Anyway, LTE should be the literal last option. It requires more than 2x as many towers as fixed wireless, with gear more than 20x more expensive. That's also more than 2x-3x the required amount of battery backup systems, networking equipment, and land / tower leases.
If you have extreme density, you NEED fiber and you need WiFi. You extend from the fiber network with extremely high quality ngFW. To fill gaps, use satellite.
Fiber requires a certain density of subscriber/mile(km), the same as any technology.
Even with 0 labor cost, you still need to get conduit in the ground (materials), fiber, terminations, switching, routing, OLT/ONT cost, handholes, any permitting or utilities location, horizontal boring equipment, jackhammers, splicers, etc. The upfront cost is many, many, many times higher for fiber, and if you're okay with your cost per passing being more than you would ever make on customer ARPU, then sure, do that. Even if labor cost was 0. And it will take YEARS longer to deploy and see a return on investment from, if ever.
It doesn't matter if there's broadband to the location if nobody at the location can afford it.
If you want broadband, LTE is the worst option.
Unless local conditions make you want to use aerial cable, you'll just cable plow a speed pipe and put in a small access riser every 2~3 miles. You blow the cable in segment-by-segment, either splicing at these locations or spooling the ongoing length up before moving the blower and doing the next segment.
If the cable is damaged you measure with OTDR where the break is, walk there with a shovel, some spare speedpipe, and two speedpipe connectors. You dig out the damage, cut it out, put good pipe in, join it to the open ends where you cut the damaged section out, and bury it while taking more care to make it last better this time. You pull/blow out the section of cable and blow in a fresh one, splice it to the existing cable and both ends of the segment, and the connection is fixed.
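(For anyone wondering how the OTDR knows where to send you with the shovel: it's just time-of-flight. A minimal sketch, assuming a typical group index of about 1.468 for single-mode fiber:)

```python
C = 299_792_458      # speed of light in vacuum, m/s
GROUP_INDEX = 1.468  # typical for single-mode fiber around 1550 nm (assumed)

def break_distance_m(round_trip_seconds: float) -> float:
    """Distance to a reflection/break from the pulse's round-trip time."""
    return (C / GROUP_INDEX) * round_trip_seconds / 2

# A reflection arriving 30 microseconds after the pulse is ~3 km down the cable.
print(round(break_distance_m(30e-6)))   # ~3063 m
```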
AFAIK cable plow for fiber in not-very-hard ground is cheaper than planting "telegraph" poles like they did in the old days.
The only expensive parts about fiber optic Internet are the splicing machine (about $1k, unlike the $5 LSA tool for attaching RJ-45 sockets to Cat.5/6/7 cable; this only blocks DIYers from easily doing it) and digging up developed areas with more finely controlled tools than a literal plow, if you forgot to put in speed pipe the last time the ground was dug up for any infrastructure at all (say, piped water).
Oh, and arguably the optics if you expect to be cheaper than copper on distances within a building at speeds under 10 Gbit/s.
It might have issues with the cost of digging, though.
It's the "homestead in Maine" type of situation where the fiber plow enables economic viability of burial.
Uh... wat? Something like 70%+ of all internet data anywhere goes over 2.4GHz WiFi for its final hop to the client, squashed into a paltry 100 MHz of spectrum.
There are surely engineering minutiae arguments to be made for why radios for dedicated bands can be better in some way, or public safety arguments for why unlicensed users need to be segregated from the system that provides emergency service.
But "more efficient use of the space" seems ridiculous on its face.
A) give the richest 15K of them absolutely no faster WiFi whatsoever because 5GHz will not be congested at all for them (so there is no problem to solve really)
OR
B) you can have the mobile carrier install a 6Ghz base station on every other telecoms/power pole in town and offer up terabits of mobile data capacity available to everyone throughout the town.
What's the most efficient use?
As for most efficient use of the resource, well, consulting my spectrum analyzer, ISM bands are winning by a mile and we should want more of them.
ISM is a tragedy of the commons; make it free, let anyone do anything, and it becomes junk. Carriers need something they have exclusive use of.
Carriers don't need 6GHz for backhaul. They have fiber and cable and (other) microwave. Not to mention the ability to shape their own links with antennas and beam forming and do a good job of it rather than a "default job." What they don't have -- and shouldn't be given under any circumstances -- is the excuse to build a moat in the bustling public park.
On top of that, mobile data is quite expensive in the US, so the only time I have data when out and about is... when I'm on free public wifi networks (which is most of the time). So I don't see much reason to give more of a monopoly to mobile providers. I honestly don't even see a use-case for cell service outside of super rural areas; the only reason I even have it is because it's necessary for MFA. Cell providers are legacy tech as far as cities are concerned IMO.
It'd make way more sense to me to let wifi have more bands with stricter limits on power levels, and any exclusivity should go to municipalities, who can contract with companies to build and manage their infrastructure.
It's not 2015 - that narrative is long dead. There are countless options for unlimited mobile data (5G, with hotspot) for $15-$20/mo.
It's true that there's no single service one can sign up for and you have to bounce around cafe and Xfinity and whatever "Free WiFi!" networks are being offered. Which is definitely annoying and it's nice to have a single company sell you service in a neatly packaged "phone" product.
But again, trying to phrase that as a technical point is ridiculous. Free bands are just plain better, technically. You get more data to more people for less money using open spread spectrum protocols than you do with dedicated bands. Period.
Fiber won't go everywhere, fixed wireless extends the reach much, much cheaper than LTE, satellite fills in the gaps.
If you don't know what you're talking about, why even bother to post? Maybe wait for a topic that you know something about before responding.
For broadband I pay $10 a month for 100 Mbps.
Mobile is terrible at times; broadband service is amazing, even though it is slow.
Broadband is not that common
I live in an EU country in an apartment, and 5GHz is completely crowded and pretty much unusable because of DFS (making your WiFi AP unexpectedly stop to do a complete scan and choose a new channel), so 6GHz is the only stable, high-bandwidth option here, and we need more channels so that most people can switch to it.
The cellular networks operators can have that shitty 5GHz part of the spectrum if they want it!
It will be as crowded as 5
Yes. Probably because they have some basic grasp of electromagnetic reality, which perhaps you might consider studying a bit before forming strong opinions?
>It will be as crowded as 5
Physically impossible. 6 GHz simply does not have the material penetration; that's the point. Having way more raw bandwidth on tap, all available all the time without DFS plopped in the middle, is of course also extremely helpful. But the signal just not traveling as far and not going through walls well is the core thing. You don't need special-effort EM shielding for it; bulk material will do it. And WAPs are cheap now. Having a higher number of smaller cells has been best practice for a while already, and 6 GHz takes that much further.
APs using 160MHz channel widths with 1 or 2 spatial streams because it's cheaper than 80MHz channels and 3 or 4 spatial streams. Absolutely crap 'auto' channel selection, too high a power (because it's cheaper than a second AP), poor AP placement, and inappropriate channel width (in an apartment block 40MHz per AP might be optimal).
Now each AP has to have three radios: 2.4GHz for compatibility, 5GHz for compatibility while still maintaining some performance, and 6GHz for performance.
What about when 6GHz is full of the same crap - do we add 7GHz?
To the extent "we" said this, we were absolutely, 100% correct. 5 GHz was and remains a massive improvement over 2.4 GHz, exactly as hoped. But in the decade and a half since, demands have gone up a lot. 6 GHz will be even better, as it propagates even worse and has even more bandwidth available, while human population density won't change.
>I'm sorry to say it's not true, there's more than enough spectrum in 5Ghz if properly managed and co-ordinated
I'm sorry to say you're wrong; there is not remotely enough usable spectrum, and that's regardless of "proper management", which is completely contrary to the practical reality of local networks in a setting with a high density of independent people/organizations.
>I would rather fix that first.
That's nice. Most fortunately you are not in charge.
>Why is it we can run WiFi for thousands of developers in one room/venue just fine
That's a low-demand situation under the control of a single entity where people are going to be understanding of compromise given the special circumstances, unlike in homes or businesses.
>but people living in apartment blocks are apparently struggling with a dozen devices per 60sqm apartment?
You're wondering why people might want their own independent LANs in their own homes? Well, I'm sure you can think of one or two reasons if you put your mind to it.
Really? Is there something special about 6 GHz absorption through common construction materials? Otherwise, why would a 20% higher frequency be that much worse?
5Ghz has 500Mhz worth of total bandwidth, while 6Ghz has (in US/CA, and hopefully in EU eventually) 1200Mhz. That's over double.
6Ghz has more 160Mhz channels (7) than 5Ghz has 80Mhz channels (6).
About 40% of the US population lives in a coastal county:
* https://coast.noaa.gov/states/fast-facts/economics-and-demog...
About two-thirds (66%) of the US population lives with-in 100 miles (150km) of the border:
* https://www.aclu.org/news/immigrants-rights/your-rights-bord...
The US population is more concentrated than many people realize.
In reality, a 1x1 80Mhz connection gives you a 600 Mbps PHY rate:
* https://www.intel.com/content/www/us/en/support/articles/000...
* https://superuser.com/questions/1619079/what-is-the-maximum-...
Even if you halve that, how many online activities are going to make use of that bandwidth? And if you have a 2x2 client, you double it anyways. A 1x1 40Mhz using 802.11ax will give you a max PHY of 287Mbps. How many activities use >100 Mbps, especially continuously?
Off the top of my head: certainly downloading a new game or software updates can eat up those bits, and for photo/video editing or creation (local NAS or uploading) it might be useful; are there any other activities that use that much?
As I commented elsewhere: it would be great if residential Wifi devices defaulted to 40 MHz.
Wi-Fi 6E and later standards that unlock 6 GHz are designed to mitigate contention through several dynamic power management and multiplexing capabilities: TWT, MLO, OFDMA, improved TPC, etc. While these things aren't somehow inherent to 6 GHz, the 6 GHz band isn't crowded with legacy devices mindlessly blasting the spectrum at max power, so it is plausible that 6 GHz Wi-Fi will perform better in dense urban environments. The higher frequency also contributes because attenuation is substantially greater, although in really dense, thin-walled warrens that attenuation won't solve every problem.
I know if I had noisy Wi-Fi neighbors interfering with me, the few important Wi-Fi-only devices I have would all be on at least 6E 6 GHz by now, not only because 6 GHz has fewer users, but also in the hope that ultimately, when the users do appear, their devices will be better neighbors by design. I don't actually have that problem, however. The nearest 5 GHz AP I can actually see (that isn't mine) in Kismet (using rather high gain antennas) is -96 dB, and my actual APs hardly ever see those at all. I've yet to actually detect a 6 GHz device that isn't mine. I know there are a few because the manufacturers and model numbers of many APs are visible, but between the inherent attenuation and the power level controls, I don't see them.
Yes.
5 Ghz has 12x 40MHz channels, 6 Ghz has (in the US/CA where it is basically 'fully unlocked' for Wifi) 29x 40Mhz channels. It's the difference between 500Mhz worth of total bandwidth and 1200Mhz: over double.
* https://www.everythingrf.com/community/what-frequency-band-d...
* https://spectrum.potatofi.com
* http://www.potatofi.com/posts/spectrum-viewer/
And given attenuation increases as frequencies goes up, your neighbours' signals won't travel as far as the lower frequency bands, which helps with localization.
We just have to hope that vendors don't ship 80 or 160Mhz channels by default for residential devices, which will potentially eat up bandwidth (though Wifi 7 makes Punctured Transmission / Preamble Puncturing mandatory, where previously it was optional). Though even if they do, 6Ghz has more 160Mhz channels than 5Ghz has 80Mhz ones (7 vs. 6).
A 1x1 40Mhz using 802.11ax will give you a max PHY of 287Mbps:
* https://www.intel.com/content/www/us/en/support/articles/000...
* https://superuser.com/questions/1619079/what-is-the-maximum-...
Even if you halve that, it's (IMHO) probably sufficient for the vast majority of online activities. And if you have a 2x2 client you double it anyway.
Define "many". The US average, as of 2023, seems to be ≤150Mbps:
* https://www.opensignal.com/2023/05/23/usa-fixed-broadband-ex...
Cloudflare has data of more sustained-use bandwidth that shows lower numbers:
* https://radar.cloudflare.com/quality
1x1 40Mhz = 287Mbps PHY ~ 143Mbps realistic ~ 100Mbps probable. Double that for 2x2 40Mhz: 200 Mbps.
Certainly some connections may need more, and that is the reason for >40 MHz options, but I'm not entirely convinced one of those should be the out-of-box default.
This is less of a concern in 6 GHz because there are many more channels, but this is what the story is all about: how is that frequency band going to be allocated? In US/CA basically all of it went to Wifi, and that gives folks more options, even in more densely populated areas.
The Wifi Alliance may wish to provide guidance on this, at least for the residential space. Some ideas:
* have the router/AP do a speed test (at boot; regularly), and if the connection is ≤X Mbps then the wider channels won't help anything;
* do a sweep of the band (at boot; at intervals), and if ≥Y% (50?) of it is already in use, default to a narrower channel.
Either/both of these would be done on a default "Auto-choose" setting, with allowable manual override.
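A rough sketch of what that auto-choose logic could look like (hypothetical thresholds and function name, not anything a vendor actually ships):

```python
def choose_channel_width_mhz(wan_mbps: float, band_busy_fraction: float) -> int:
    """Pick a channel width for a residential AP.

    wan_mbps           -- measured uplink speed (boot-time / periodic speed test)
    band_busy_fraction -- share of the band seen occupied during a scan (0..1)
    """
    if wan_mbps <= 300 or band_busy_fraction >= 0.5:
        return 40    # a wide channel can't help a slow WAN or a crowded band
    if band_busy_fraction >= 0.25:
        return 80
    return 160       # quiet band and a WAN fast enough to use it

# Example: 500 Mbps line, half the band already in use -> stick with 40 MHz
print(choose_channel_width_mhz(500, 0.5))
```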
(Student wifi hotspots in a large lecture theatre, that's another problem entirely!)
Hard, hard no.
Or, if you are not being sarcastic, the solution to wireless networking issues is to… rebuild cities in which billions of people live and spread those people over… where? Never mind the fact that cities are the best way of arranging lots of people, where would you build those "houses not so close to each other" that is not a desert, a cliff, or an ocean?
“They’ll have congested internet!” would go well alongside “It will block the sun on my daily walk on that part of the street!”
Are they legitimate worries? I don't know.
But certainly more legitimate than congested internet and shade.
I have lived in the centre of a city in a Victorian apartment block and looking back it was a dream. About a foot of brick wall between me and the next flat on either side. Never heard a peep, excellent WiFi.
Having said that I'm still mad they removed it from the MacBook pro and that was like 14 years ago now so I feel you.
It's enough to stream 4k video (though barely, and I'd be better off moving to a TV), has better wall penetration and is fast enough for browsing and updating software.
I don't have to deal with congestion though. I think I've only seen a neighbour's AP once and I doubt they started hiding their SSID. My guess is that congestion is an issue because transmission power isn't low enough and there's little you can do to fix someone else's AP other than be increasingly louder than them.
5Ghz is stuttery and laggy and makes it pretty much impossible to have a clean video call. I don't game, but gaming on it would be miserable. I've measured latency, and it regularly spikes above 1s.
6E is far, far better. Rock solid video calls. Latency testing sites show low, steady latency. The only real problem is signal attenuation seems to be far worse with 6E. Getting a signal 2 walls away from the router is nearly impossible. Though this is also a strength, as it limits the number of devices competing for spectrum.
What channel width (20/40/80/other) are you typically seeing?
Our ISP provided router does seem to default to 20MHz (I think; I cannot recall if I changed it). It offers the choice of 20 or 40Mhz.
Even a few sheets of drywall greatly attenuates 5 GHz. Your scenario simply seems impossible unless you have a weird router that can only utilize a tiny portion of the channels.
* The reach for 6GHz by mobile service providers is straight-up greed, as Wi-Fi is a threat to their business expansions towards monopolization
* Wi-Fi is incredibly overcrowded, and a shift to 6GHz will not solve the underlying issues causing the crowding in the first place (mobile device density, over-reliance on Wi-Fi instead of running ethernet to capable devices and drop points)
* ISPs would prefer mobile service providers get 6GHz so they can get higher speeds to fixed receivers without the requisite network buildout
My personal position? Give 6GHz to Wi-Fi, but also make it clear that this is the last spectrum the standard will get. Simultaneously, promote (through regulations, subsidies, or municipal buildouts) wired networking wherever practicable. The fact new construction in 2025 doesn’t mandate ethernet drops in every non-bathroom is what’s contributing to Wi-Fi crowding, and prevalent last-mile wired access ensures that mobile operators have to compete on cost rather than data caps - and thus hinders their monopolization efforts.
See also: cell phones having to boost power for more distant towers.
As things are, every device has to scream to be heard.
Can tablets and phones use ethernet? Yes! Should they? Perhaps for a fixed installation, but otherwise no, because that’s not their primary role? Same goes for laptops: if it’s stationary, plug it in; if not, WiFi is fine.
The goal is to shift traffic where reasonable and practical onto wired networks. Desktops, laptops, set top boxes, streaming rigs, control panels, SBCs, game consoles, the list goes on. The only “Wi-Fi required” devices are really just laptops, phones, tablets, watches, and similarly high-mobility devices.
Otherwise, ethernet should be the norm.
I remember a lot of people at the time getting really upset about how wasteful that was. Just saying.
This would raise the cost of construction, in the middle of widespread housing shortages.
You’re talking a $10k expense on a new home, tops. That’s chump change and easily shoved into regulations regarding new builds without significantly harming progress. The real regulations impacting housing are zoning anyway.
Just waiting for the Wi-Fi cops to show up