OK so someone has noticed that you "only" need 15MBs-1 for a 4K stream. Jolly good.
Now let's address latency - that entire article only mentioned the word once and it looks like it was by accident.
Latency isn't cool but it is why your Teams/Zoom/whatevs call sounds a bit odd. You are probably used to the weird over-talking episodes you get when a sound stream goes somewhat async. You put up with it but you really should not, with modern gear and connectivity.
A decent quality sound stream consumes roughly 256KBs-1 (yes: 1/4 megabyte per second - not much) but if latency strays away from around 30ms, you'll notice it and when it becomes about a second it will really get on your nerves. To be honest, half a second is quite annoying.
I can easily measure path latency to a random external system with ping to get a base measurement for my internet connection and here it is about 10ms to Quad9. I am on wifi and my connection goes through two switches and a router and a DSL FTTC modem. That leaves at least 20ms (which is luxury) for processing.
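If you want to script that base measurement rather than eyeball ping output, something like this rough sketch works (Python; it times TCP handshakes to Quad9 on port 443 as a stand-in for ICMP echo, which needs raw-socket privileges — host, port, and sample count are just my picks):

    import socket
    import time

    HOST, PORT = "9.9.9.9", 443  # Quad9; any nearby host that accepts TCP will do
    SAMPLES = 10

    rtts = []
    for _ in range(SAMPLES):
        start = time.monotonic()
        # Time just the TCP three-way handshake as a rough round-trip proxy
        with socket.create_connection((HOST, PORT), timeout=2):
            pass
        rtts.append((time.monotonic() - start) * 1000)  # milliseconds
        time.sleep(0.2)

    print(f"min {min(rtts):.1f} ms  avg {sum(rtts)/len(rtts):.1f} ms  max {max(rtts):.1f} ms")

It won't match ping exactly (it includes connection setup on both ends), but it's close enough to sanity-check that 10ms-ish baseline.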
reaperman 13 hours ago [-]
There is no way that "15MBs-1" is more clear than "15MB/s", especially in an environment which lacks the ability to give text superscript styling.
Dylan16807 12 hours ago [-]
Even worse when it's actually 15Mb/s.
alanfranz 8 hours ago [-]
Which could be written as 15Mbps with no doubt whatsoever.
notpushkin 8 hours ago [-]
15 Mb·s⁻¹
gerdesj 13 hours ago [-]
Sorry mate, an old habit.
Perhaps we should insist on more formatting support from above. That's probably a post-processing thing for ... gAI 8)
throwaway314155 10 hours ago [-]
Old habit from where? Is this a personal preference or a lesser known convention?
_Algernon_ 29 minutes ago [-]
Used that a bunch in physics class in school. Makes it much easier to calculate with units since you simply add up the exponents of the units instead of having to consider the notation around the units.
Doesn't work well in pure text mediums.
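For a made-up example of the exponent bookkeeping (rendered with Unicode superscripts, which is about as well as plain text can do):

    15 MB s⁻¹ × 120 s¹ = 1800 MB s⁰ = 1800 MB

The s⁻¹ and s¹ sum to s⁰ and drop out, with no need to think about which side of a fraction bar each unit sits on.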
reaperman 8 hours ago [-]
Someone said physics, it's also very common in engineering. But academically and professionally, I've only seen it with superscript.
batiudrami 9 hours ago [-]
Pretty common physics convention, uncommon when referring to data speeds but common for other things measured over time.
silisili 8 hours ago [-]
Oh you'd love Project Genesis.
It has all the latency associated with cell networks combined with all the latency of routing all traffic through AWS.
As an added bonus, about 10% of sites block you outright because they assume a request coming from an AWS origin IP is a bot.
rsynnott 4 hours ago [-]
5G is (at least on paper) _very_ low latency; couple of ms.
ksec 27 minutes ago [-]
It is not just on paper. A couple of ms (~4ms) is real, except it is not normal 5G but a specific superset/version of 5G.
Your network needs to be running end to end on 5G-A (3GPP Rel 18) with full SA (Stand Alone) using NR (New Radio) only. Right now AFAIK only mobile networks in China have managed to run that (along with turning on VoNR). Most other networks are still behind in deployment and switching.
lambdaone 1 hour ago [-]
It is until you congest it. Then, like all other network technologies, performance collapses. You can still force traffic through with EDC, buffering, and re-tries, but at massive cost to overall network performance.
Karrot_Kream 8 hours ago [-]
Latency is an issue but it's not the biggest one. Most real-time protocols can adjust for latency (often using some sort of OOB performance measurement port/protocol.) The issue is jitter. When your latency varies a lot, it's hard for an algorithm to compensate.
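To make the distinction concrete, here's a toy sketch (Python, with invented numbers and an invented helper, not any particular VoIP stack) of how a playout buffer gets sized from jitter rather than from latency:

    import statistics

    def playout_delay_ms(arrival_deltas_ms, nominal_frame_ms=20.0, safety_factor=4.0):
        # Deviation of each inter-arrival gap from the codec's nominal frame pacing.
        deviations = [d - nominal_frame_ms for d in arrival_deltas_ms]
        jitter = statistics.pstdev(deviations)
        # Buffer a few standard deviations of jitter on top of one frame so late
        # packets still arrive before their playout slot.
        return nominal_frame_ms + safety_factor * jitter

    # Steady network: frames land every ~20 ms, so the buffer stays shallow.
    print(playout_delay_ms([20, 21, 19, 20, 20]))   # ~23 ms
    # Jittery network: same average latency, but the buffer must grow.
    print(playout_delay_ms([5, 45, 10, 38, 2]))     # ~92 ms

Constant latency just shifts everything by a fixed offset; it's the spread that forces the buffer, and therefore the perceived delay, to grow.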
fulafel 7 hours ago [-]
You can't fix problems caused by network latency by adjustments in real time apps like video/audio calls or gaming.
Maybe you are thinking of working around bufferbloat?
timewizard 12 hours ago [-]
That's a 1/4 megabit per second. The codec should be around 32,000 bytes per second and thus 256,000 bits per second. The problem is the Internet is not multicast, so if I'm on with 8 participants that's 8 * 32,000 bytes per second, and our distributed clocks are nowhere near sample accurate.
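Spelled out (assuming everyone sends a full-rate stream and nothing server-side is mixing for you):

    32,000 bytes/s × 8 bits/byte = 256,000 b/s per participant
    8 participants × 256,000 b/s ≈ 2 Mb/s of audio arriving to be mixed locally

Still tiny next to video, but it's N-times-unicast where true multicast would have been one stream.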
If you really want to have some fun, come out to the countryside with me where 4G is the only option and 120ms is the best end-to-end latency you're going to get. Plus your geolocation puts you half the nation away from where you actually are, which only compounds the problem.
On the other hand I now have an acquired expertise in making applications that are extremely tolerant of high latency environments.
nomel 12 hours ago [-]
The first time I learned about UDP multicasting, I thought it was incredible, wondering how many streams were out there that I could tap into, if only I knew the multicast group, and all with such little overhead! Then I tried to send some multicast packets to my friend's house. I kept increasing the hop limit, thinking I was surely sending my little UDP packet around the world at that point. It never made it. :(
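For anyone who wants to relive the disappointment, the whole experiment is only a few lines (Python; the group and port are my own picks from the administratively scoped range, so even a huge hop limit goes nowhere off-LAN):

    import socket

    GROUP, PORT = "239.1.2.3", 5007   # administratively scoped multicast group, arbitrary port
    TTL = 32                          # hop limit; cranking this up was the futile part

    # Receiver: join the group and bind the shared port first, so the packet isn't lost.
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    recv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    recv.bind(("", PORT))
    mreq = socket.inet_aton(GROUP) + socket.inet_aton("0.0.0.0")
    recv.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    # Sender: one setsockopt for the hop limit, then fire and forget.
    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    send.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, TTL)
    send.sendto(b"hello, world", (GROUP, PORT))

    print(recv.recv(1500))  # arrives when both ends share a LAN; never across the open internet

On the same LAN the receiver prints the payload; across the public internet, the first router that doesn't do multicast routing simply drops it, hop limit be damned.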
Hikikomori 1 hour ago [-]
Mbone.
gscott 4 hours ago [-]
Secret is to try off alg.
gspr 3 hours ago [-]
And then comes all the latency _jitter_ inherent in a shared resource like 5G, and on top of that shenanigans like (transparently) MITM-ing TCP streams on a base-station-by-base-station basis.
JasserInicide 11 hours ago [-]
Barring some revolutionary new material where we can get EM waves to travel orders of magnitude faster through it over current stuff, I don't think we're ever getting around the latency problem.
ahartmetz 7 hours ago [-]
It's usually not a speed of light problem. It's a problem of everyone optimizing for bandwidth, not latency, because that is the number that products are advertised with. Speed of light is 200'000-300'000 km/s in the media used; that should not be very noticeable when talking to someone in the same country.
lambdaone 1 hour ago [-]
See my comments about L4S.
viraptor 17 hours ago [-]
> Could maximum data speeds—on mobile devices, at home, at work—be approaching “fast enough” for most people for most purposes?
That seems to be the theme across all consumer electronics as well. For the average person, mid-range phones are good enough, bargain-bin laptops are good enough, and almost any TV you can buy today is good enough. People may of course desire higher quality, and specific segments will have higher needs, but things being good enough may be a problem for tech and infra companies in the next decade.
aceazzameen 16 hours ago [-]
The answer is yes. But it's not just about speed. The higher speeds drain the battery faster.
I say this because we currently use an old 2014 phone as a house phone for the family. It's set to 2G to take calls, and switches to 4G for the actual voice call. We only have to charge it once every 2-3 weeks, if not longer. (Old Samsung phones also had Ultra Power Saving mode which helps with that)
2G is being shut down though. Once that happens and it's forced onto 4G all the time, we'll have to charge it more often. And that sucks. There isn't a single new phone on the market that lasts as long as this old phone with an old battery.
The same principle is why I have my modern personal phone set to 4G instead of 5G. The energy savings are very noticeable.
bityard 16 hours ago [-]
I actually miss the concept of house phones. Instead of exclusively person-to-person communication, families would call other families to catch up and sometimes even pass the phone around to talk to the grandparents, aunts and uncles, etc.
ssl-3 13 hours ago [-]
It may have been decades ago, but there was a time when it did not seem awkward or even unusual to just...call someone's house, or to simply stop by.
There was usually a purpose in this that was more profound than just being bored or lonely or something, but none was really required.
And maybe the person they were trying to find wasn't home right now, but that was OK. It was not weird to talk to whoever for a minute, or to hang out for awhile.
Nowadays, the ubiquity of personal pocket supercomputers means that the closest most people ever get to that kind of generally-welcome (if unexpected) interaction is a text message.
"Hey, I'm in front of your house. Are you home?"
And maybe that works more efficiently between two individuals than knocking on the door did, but otherwise this present way of doing things mostly just serves to further increase our social isolation.
Sometimes it seems that the closer it is that our technology allows us to be, the further it is that we actually become.
bobthepanda 11 hours ago [-]
unfortunately, these days the inundation of spam from hucksters trying to sell you something, or worse, has now made everyone extremely distrustful of unprovoked interaction.
graemep 4 hours ago [-]
There is a lot less of that in IRL interaction than online interaction. I find months can go by without anyone knocking on my door.
It does not help with the just-phoning bit, but it does with stopping by.
I do not get that many spam calls but I think that varies a lot, especially between countries with different laws about it.
pjmlp 4 hours ago [-]
I can vouch that passing the phone around is still a thing in southern Europe, as is being suddenly dropped into a group video call where everyone gathers around the phone to be in view.
little_bear 13 hours ago [-]
I set up an old rotary phone via twilio as a house phone for our kids. They just dial 1-9, each number is programmed to a different family member. Their cousins also have one, so they can call each other whenever they want. It’s great to also have a phone that just rings in the house, for when you don’t care about getting a specific person.
aceazzameen 12 hours ago [-]
That sounds awesome. Do you have any more info on your setup anywhere?
xattt 13 hours ago [-]
I’m looking to set up an Asterisk server that takes calls over Bluetooth on any paired cell phones inside the house, or falls back to a VoIP line if no phones are at home.
Similarly, I find that it’s hard to catch family on their cell and much easier when I call their home.
secstate 16 hours ago [-]
That's a fascinating observation! I hadn't considered the side effects of calling a common phone and interacting with other people rather than exclusively the one person you wanted to talk/text with. It probably distances you from (or never allows you to know) those adjacent to the person you already know.
anal_reactor 15 hours ago [-]
My siblings are much older than I am and when I was 10 my sister was starting her relationship with now-husband. One time he called our house phone, I picked up, and said "yeah sister is home but she's busy at the moment so you can spend five minutes talking to me instead"
8note 12 hours ago [-]
does that not still happen for you?
passing cell phones around still happens for my family
vel0city 16 hours ago [-]
There are some more traditional home phones which work on 4/5G networks with a DECT handset which talks to a cellular base station. You might look into switching to that model to replace your "cell phone as a home phone" concept. It makes it a bit easier to add another handset to the DECT network and often means convenient cradles to charge the handsets while the base station stays in a good signal spot with plenty of power.
Just a thought when it comes time to change out that device.
aceazzameen 15 hours ago [-]
That's not a bad idea, thank you! I always hate having to retire perfectly good working hardware just because a spec requirement changed.
derefr 13 hours ago [-]
Given that it's a house phone, have you tried enabling Wi-Fi Calling (VoWiFi) in the carrier settings (if you have that option), and then putting the phone in Airplane Mode with wi-fi enabled? AFAIK that should be fairly less impactful on battery.
(Alternately, if you don't have the option to use VoWiFi, you could take literally any phone/tablet/etc; install a softphone app on it; get a cheap number from a VoIP provider and connect to it; and leave the app running, the wi-fi on, and the cellular radio off. At that point, the device doesn't even need a[n e]SIM card!)
chefandy 13 hours ago [-]
Or just port the number to Magic Jack for $20 and get a cordless phone $20 at Target that you have to charge once per week if you don’t just keep it on the dock, and pay like $5/mo or less for service. And you can make/receive calls from that number on a mobile phone using their app.
Gigachad 14 hours ago [-]
I think it's more that mobile transfer speeds are no longer the bottleneck. It's the platforms and services people are using. If you record a minute of video on your phone it ends up as like 1GB. But when you send it on a messaging app it gets compressed down to 30MB. People would notice and appreciate higher quality video. But it's too expensive to host and transfer for the service.
hexator 17 hours ago [-]
A problem for tech companies but not the world.
gambiting 17 hours ago [-]
Get ready to be sold the exact same thing you already own, just "with AI" now.
callc 17 hours ago [-]
And obsolescence via “TPM2”!
yieldcrv 16 hours ago [-]
I’m all for advancements in the size of various models able to be loaded onto devices, and the amount of fast ram available for them
gambiting 7 hours ago [-]
Sure, but why is my Samsung tumble dryer "enhanced with AI" now? You know what it does? You turn it on, it shows "optimising with AI" on the screen....to show you the most used programme. That's not AI, that's a counter in a text file somewhere. But of course everything has to have AI now so my tumble dryer now has AI. The market has spoken.
teamonkey 15 hours ago [-]
A problem when the world’s pension funds contain significant holdings of US tech stocks
sureIy 9 hours ago [-]
I don't understand this "good enough" argument. We never really needed anything we use daily today. Life was "good enough" 100 years ago (if you could afford it), should we have stopped?
4K video reaches you only because it's compressed to crap. It's "good enough" until it's not. 360p TV was good enough at some point too.
rglullis 5 hours ago [-]
> 4K video reaches you only because it's compressed to crap.
Streaming video gets compressed to crap. People are being forced to believe that it is better to have 100% of crap provided in real time instead of waiting just a few extra moments to have the best possible experience.
Here is a trick for movie nights with the family: choose the movie you want to watch, start your torrent and tell everyone "let's go make popcorn!" The 5-10 minutes will get enough of the movie downloaded so you will be able to enjoy a high quality video.
Dalewyn 4 hours ago [-]
>Streaming video gets compressed to crap.
That's because your source video is crap.
I'm not sure if you realize it, but all forms of digital media playback are streaming. Yes, even that MP4 stored locally on your SSD. There is literally no difference between "playing" that MP4 and "streaming" a Youtube or Netflix video.
Yes, even playing an 8K Blu-Ray video is still streaming.
rglullis 3 hours ago [-]
"streaming" was shorthand for "video-on-demand services like Netflix".
Does that help?
theshackleford 7 hours ago [-]
> 4K video reaches you only because it's compressed to crap.
Yes, but I assume when they say the "consumer" they mean everyone, not us. Most people I've had in my home couldn't tell you the difference between a 4K Blu-ray at 2-odd meters on a 65" panel vs 1080p.
I can be looking at a screen full of compression artifacts that seem like the most obvious thing I've ever seen, and I'll be surrounded by people going "what are you talking about?"
Even if I can get them to notice, the response is almost always the same.
"Oh...yeah ok I guess I can see. I just assumed it supposed to look like it shrug its just not noticable to me"
sureIy 7 hours ago [-]
You can't ask someone today to see what they're not used to seeing.
I expect a future of screens that are natively, subtly 3D and where you could see someone's nose hair without focusing. Only then will they notice "why do they look blurry and flat" when comparing it to an old TV.
Today if you get closer to a TV you will see blur. Tomorrow you will see the bird's individual strands of feathers.
"Good enough" is temporary.
theshackleford 1 hour ago [-]
> Only then they will notice "why do they look blurry and flat" when comparing it to an old TV.
They either won’t notice or won’t care and even if they do, it takes far longer than enthusiasts expect for the line to move.
Large numbers of people still say in 2025 that ‘4K is a gimmick,’ so I’m not holding my breath. ‘Good enough’ lasts much longer for the majority than most realise.
Look at displays today: I can’t even buy a modern one with motion quality that matches what I had 20 years ago. Why? Because for the average consumer, what we have is ‘good enough’ and has been for a long time.
> Today if you get closer to a TV you will see blur. Tomorrow you will see the birds individual strands of feathers
No, I’ll see blur. Unless you’re suggesting we’ve magically solved sample and hold induced motion blur in the consumer display space?
Of course, I know you meant in a still frame however if I wanted to stare at a high quality still image, I’d save myself the money and just go with some nice framed artwork instead.
> “Good enough” is temporary.
I’ll grant you this on a long enough timeframe. But it’s got a long tail and it’s gonna be a slow ride.
Back in the day, if you knew the secret menu, you could change the default vocoder to use on the network. The cheap cell company I used defaulted to half-rate. I would set my phone to the highest bitrate, with a huge improvement in quality, but at the expense of the towers rejecting my calls around 25% of the time. When I would call landline phones, people would mention how good it sounded.
hypercube33 10 hours ago [-]
For a short while Verizon enabled this on our super-grandfathered plan. Well, not exactly this; it was some more modern codec dubbed HD, and it sounded so good it was freaky and unnerving.
pjmlp 2 hours ago [-]
That is the thing with capitalist exponential growth: it doesn't last forever. Eventually it flattens out, to the point where new goods only get acquired because existing ones are being replaced, or because newer generations are acquiring their first products.
Businesses should learn to earn just enough to get by.
no_wizard 11 minutes ago [-]
Business culture of the 50s and 60s was closer to this, largely due to how taxes on corporations worked. At least if I'm to believe some business historians I follow.
trevithick 14 hours ago [-]
It won't be a problem for them. They'll find a way to make it not enough - disable functionality that people want or need and then charge a subscription fee to enable it. And more ads. Easy peasy.
ekianjo 16 hours ago [-]
Good enough but most of these devices are not built to last.
anal_reactor 15 hours ago [-]
Due to a bunch of reasons combined, I'm stuck on a plan where I have 5GB of mobile data per month, and honestly, I never really use it up. I stopped browsing so much social media because it's slop. I don't watch YouTube on mobile because I prefer to do it at home on my TV. I don't stream music because I keep my collection on the SD card. Sometimes I listen to the online radio but it doesn't use much data anyway. Otherwise I browse the news, use navigation, and that's pretty much it.
Once every few months I'm in a situation where I want to watch YouTube on mobile or connect my laptop to a mobile hotspot, but then I think "I don't need to be terminally online", or in the worst-case scenario, I just pay a little bit extra for the data I use, but again, it happens extremely rarely. BTW quite often my phone randomly loses connection, but then I think "eh, god is telling me to interact with the physical world for five minutes".
At home though, it's a different situation, I need to have good internet connection. Currently I have 4Gbps both ways, and I'm thinking of changing to a cheaper plan, because I can't see myself benefitting from anything more than 1Gbps.
In any case though, my internet connection situation is definitely "good enough", and I do not need any upgrades.
pr337h4m 19 hours ago [-]
>Regulators may also have to consider whether fewer operators may be better for a country, with perhaps only a single underlying fixed and mobile network in many places—just as utilities for electricity, water, gas, and the like are often structured around single (or a limited set of) operators.
There are no words to describe how stupid this is.
hedora 18 hours ago [-]
It actually works well in most places. Look up the term “common carrier”.
The trick is that the entity that owns the wires has to provide/upgrade the network at cost, and anyone has the right to run a telco on top of the network.
This creates competition for things like pricing plans, and financial incentives for the companies operating in the space to compete on their ability to build out / upgrade the network (or to not do that, but provide cheaper service).
protocolture 15 hours ago [-]
Your second and third paragraph are contradictory.
Common carriers become the barrier to network upgrades. Always. Without fail. Monopolies are a bad idea, whether state or privately owned.
Let me give you 2 examples.
In Australia we had Telstra (formerly Telecom, formerly Auspost). Telstra would wholesale ADSL services to the carriers, and they stank. The carriers couldn't justify price increases to upgrade their networks and the whole thing stagnated.
We had a market review, and Telstra was legislatively forced to sell ULL instead. So the non monopolist is now placing their own hardware in Telstra exchanges, which they can upgrade. Which they did. Once they could sell an upgrade (ADSL2+) they could also price in the cost of upgrading peering and transit. We had a huge increase in network speeds. We later forgot this lesson and created the NBN. NBNCo does not sell ULL, and the pennies that ISPs can charge on top of it are causing stagnation again.
ULL works way better than common carrier. In Singapore the government just runs glass. They have competition between carriers to provide faster GPON: 2gig, 10gig, 100gig, whatever. It's just a hardware upgrade away.
10 years from now Australia will realise it screwed up with NBNCo. Again. But they won't as easily be able to go to ULL as they did in the past. NBN's fibre isn't built for it. We will have to tear out splitters and install glass.
The actual result is worse than you suggest. A carrier had to take the government/NBNCo to court to get permission to build residential fibre in apartment buildings over the monopoly. We have NBNCo strategically overbuilding other fibre providers and shutting them down (it's an offence to compete with the NBN, on the order of importing a couple million bucks of cocaine). It's an absolute handbrake on competition and network upgrades. Innovation is only happening in the gaps left behind by the common carrier.
rstuart4133 13 hours ago [-]
Yeah, I've noticed the same things with roads. They are common carriers owned mostly by the government, so they never get upgraded. That freeway near my house is always clogged.
Oh wait ... the reason that freeway is always clogged is they are ripping it up, doubling its width. And now that I think about it, hasn't the NBN recently upgraded their max speeds from 100 Mb/s, to 250 Mb/s, and now to 1 Gb/s? And isn't the NBN currently ripping out the FttN, replacing it with FttP, at no cost to the customer? Sounds like a major upgrade to me. And wasn't the reason we got the NBN that Telstra point blank refused to replace the monopoly copper infrastructure with fibre?
If I didn't know better, I'd think the major policy mistake Australia made in telecom was the Liberals selling off Telstra. In a competitive market, when a new technology comes along a telecom is forced to upgrade because its competitors would use the new technology to steal its customers. That works fabulously for 5G, where there is competition. But when the Libs sold Telstra it was a monopoly. Telstra just refused to upgrade the copper. The Libs thought they could fix that through legislation, but what happened instead is Telstra fought the legislation tooth and nail and we ended up in the absurd situation of having buildings full of federal court judges and lawyers fighting to get reasonable ULL access. In the end Telstra did give permission to change the equipment at the ends of the wires. But replacing the wires themselves - never. That was their golden goose. No one was permitted to replace them with a new technology.
Desperate to make the obvious move to fibre, the Libs then offered Telstra, then Optus, then anybody money to build a new fibre network - but they all refused to do so unless the government effectively guaranteed monopoly ownership over the new network.
Sorry, what was your point again? Oh, that's right, that public ownership of shared natural monopolies like wires, roads, and water mains is bad. The thing I missed is why a private rent-extracting monopoly, beholden to no one except the profit-seeking shareholders owning those things, is better.
highcountess 2 hours ago [-]
[dead]
celsoazevedo 16 hours ago [-]
Common carriers have some upsides, but one downside is that it sometimes removes the incentive for ISPs to deploy their own networks.
I was stuck with a common carrier for years. I could pick different ISPs, which offered different prices and types of support, but they all used the same connection... which was only stable at lower speeds.
Gigachad 14 hours ago [-]
Feels a lot like whitelabeling. Where you have 200 companies selling exactly the same product at slightly different price points but where there isn't really any difference in the product.
ClumsyPilot 17 hours ago [-]
> This creates competition for things like pricing plans
If the common carrier is doing all the work, what’s the point of the companies on top? What do they add to the system besides cost?
Might as well get rid of them and have a national carrier.
kemitche 17 hours ago [-]
The companies on top provide end user customer support, varied pricing models ("unlimited" data vs pay by the GB, etc), and so on. It allows the common carrier to focus solely on the network hardware.
hedora 16 hours ago [-]
They also sometimes own the machines in the field closets. So, anyone can rent 1U + a bunch of fiber endpoints for the same price. What you do with the slots is up to you. If there's a problem with the power or actual fiber optics, the common carrier fixes it. (Like a colo, sort of.)
L-four 16 hours ago [-]
They add value by producing complicated and convoluted contracts which cannot be compared easily full of gotchas.
odo1242 13 hours ago [-]
"Common carrier" tends to raise prices for minimum service, though. And once the network is built the carrier is just going to keep their monopoly. You bet they're never upgrading to any new piece of technology until they're legally required to.
timewizard 12 hours ago [-]
It actually has nefarious benefits. Look up the term "HTLINGUAL" or "ECHELON." It's certainly nice for the government to have fewer places to shop when destroying our privacy.
The trick is that this is essentially wireless spectrum. Which can be leased for limited periods of time and can easily allow for a more competitive environment than what natural monopolies allow for.
It's also possible to separate the infrastructure operators from the backhaul operators and thus entirely avoid the issue of capital investment costs facing upstarts. When done, there's even less reason to tolerate monopolistic practices on either side.
therein 17 hours ago [-]
It also makes it more vulnerable to legal, bureaucratic and technical threats.
Doesn't make much sense to me to abstract away most of the parts where an entity could build up its competitive advantage and then to pretend like healthy competition could be built on top.
Imagine if one entity did all the t-shirt manufacturing globally but then you congratulated yourself for creating a market based on altered colors and what is printed on top of these t-shirts.
jethro_tell 16 hours ago [-]
This was a common way to do things before the telcos in the USA were deregulated in the 2000s and 2010s. At the time it was both internet and telephone, but due to the timing of deregulation, it never really took off with real high-speed internet, only DSL and dialup.
I used to work at a place that did both on top of the various telcos. We offered ‘premium service’ with 24 hour customer support and a low customer to modem and bandwidth ratio.
Most of our competitors beat us on price but would only offer customer support 9-5, and you might get a busy signal / lower bandwidth in the backhaul during peak hours.
There was a single company that owned the wires and poles, because it's expensive and complex to build physical infrastructure and hard to compete, but they were barred from selling actual services or undercutting providers because of their position. (Which depended on jurisdiction.)
It solved the problem we have now of everyone complaining about their ISP but only having one option in their area.
We have that problem now specifically because we deregulated common carriers for internet right as it took over the role of telephone service.
SSLy 17 hours ago [-]
Real-world practice seems to have this worked out. I am working for such a provider right now and it is neither cash-starved nor suffocating under undue bureaucracy.
computerthings 16 hours ago [-]
And private companies don't even have to be vulnerable; they can just do nasty things willy-nilly, because it might be profitable and they might get away with it. Yeah, there could be ones that don't suck, and then customers could pick those, but when there aren't, when they all collude to be equally shitty and raise prices whenever they can -- which they do -- people have no recourse. They do have recourse when it comes to the government.
And for some things it's just too much duplicated effort and wasted resources, T-shirts are one thing, because we don't really need those, but train lines and utilities etc. are another. I can't tell you where the "boundary" is, but if every electric company had to lay their own cables, there would only be one or two.
And in the opinion of many, including mine, the Deutsche Bundesbahn for example got worse when it got privatized. They kinda exploited the fact that after reunification there were obviously two state railroad systems, and instead of merging them into one state railroad system, it was privatized, because it made more money for some, not because it benefits the public, the customers. Of course the reasoning was the usual neoliberal spiel, "saving money" and "smaller government", but then that money just ends up not making things better to the degree privatization made them worse.
Obviously not everything should be state run, far from it. But privatizing everything is a cure actually even worse than the disease, since state-run sinks and swims with how much say the people have, whereas a 100% privatized world just sinks into the abyss.
grahar64 18 hours ago [-]
In New Zealand we have a single company that owns all the telecommunications wires. It was broken up in the 90's from a service provider because they were a monopoly and abusing their position in the market. Now we have a ton of options of ISPs, but only one company to deal with if there are line faults. BTW the line company is the best to deal with, the ISPs are shit.
Same for mobile infrastructure would be great as well.
kiwijamo 17 hours ago [-]
In NZ we also have the Rural Connectivity Group (RCG) which operates over 400 cellular/mobile sites in rural areas for the three mobile carriers, capital funded jointly by the NZ Government and the three mobile carriers (with operational costs shared between the three carriers I believe). For context the individual carriers operate around 2,000 of their own sites in urban areas and most towns in direct competition with each other. It has worked really well for the more rural parts of the country, filling in gaps in state highway coverage as well as providing coverage to smaller towns that would be uneconomical for the individual carriers to cover otherwise. I'm talking towns of a handful of households getting high speed 4G coverage. Really proud of NZ as this sort of thing is unheard of in most other countries.
wingworks 8 hours ago [-]
Ironically, you often get way faster speeds out on an RCG tower too (probably due to fewer users), vs. in the city, where I often get pretty average speeds, be it 4G or 5G.
Marsymars 18 hours ago [-]
I dunno, it makes conceptual sense. Network infrastructure is largely a commodity utility where duplication is effectively a waste of resources, e.g. you wouldn't expect your home to have multiple natural gas connections from competing companies.
Regulators have other ways to incentivize quality/pricing and can mandate competition at levels of the stack other than the underlying infrastructure.
I wouldn't expect that "only a single network" is the right model for all locations, but it will be for some locations, so you need a regulatory framework that ensures quality/cost in the case of a single network anyway.
newsreaderguy 18 hours ago [-]
IMO this can be neatly solved with a peer-to-peer market based system similar to Helium
https://www.helium.com/mobile.
(I know that helium's original IoT network mostly failed due to lack of pmf, but idk about their 5G stuff)
Network providers get paid for the bandwidth that flows over their nodes, but the protocol also allows for economically incentivizing network expansion and punishing congestion with subsidization / taxing.
You can unify everyone under the same "network", but the infrastructure providers running it are diverse and in competition.
suddenlybananas 19 hours ago [-]
I think that it should be run as a public service like utilities and should be as cheap as humanly possible. Why not?
cogman10 18 hours ago [-]
I personally like the notion of a common public infrastructure that subleases access. We already sort of do that with mobile carriers where the big 3 provide all access and all the other "carriers" (like google fi) are simply leasing access.
Make it easy for a new wireless company to spawn while maintaining the infrastructure everyone needs.
daedrdev 18 hours ago [-]
My public utility is bad at its job because it has literally zero incentive to be cheap, and thus my utilities are expensive
cogman10 18 hours ago [-]
> it has literally zero incentive to be cheap
Do private utilities have any incentive to be cheap?
The reason we have utility regulations in the first place is because utilities are natural monopolies with literally zero incentive to be cheap. On the contrary, they are highly incentivized to push up prices as much as possible because they have their customers over a barrel.
ssl-3 13 hours ago [-]
Utilities do have incentive to be cheap as long as there are competing offerings and a lack of collusion.
...which is unusual with many utilities, but is also pretty common with wireless carriers in much of the world.
natebc 18 hours ago [-]
I believe the idea is that you shouldn't have a corporation provide the utility if there's only going to be one.
"public utility" implies it's owned by the public not a profit seeking group of shareholders.
sweeter 18 hours ago [-]
A private electric grid is a nightmare. Look at Texas. People pay more, and they get less coverage. It's worse by every metric. The conversation should revolve around how we can fix the government so that it isn't 5 corporations in a trench coat systematically defunding public utilities and social safety nets in hopes of breaking them so they can privatize them and make billions sucking up taxpayer money while doing no work. See the billions in tax funding given to AT&T, Google, etc. to put in fiber internet, where they just pocketed the cash and did nothing.
daedrdev 16 hours ago [-]
In Texas electricity is literally less than half the price it is in my state on average (14c/kWh vs 34c/kWh; I live in California).
If you want to say it's worse, perhaps you should check whether it's actually worse first.
dragonwriter 16 hours ago [-]
A big part of the reason that California's average electrical price per kWh is high is that a huge portion of the cost is fixed costs, and California's efficiency push has resulted in the lowest per capita electricity usage (and fourth lowest per capita energy usage) in the USA, so the fixed costs are spread over fewer kWh.
Conversely, Texas has significantly above average use per capita, spreading the fixed costs across more kWh, but still results in higher annual costs per capita, despite lower per kWh rates.
cogman10 14 hours ago [-]
Let's also not forget the cost of the things like the campfire fire. That's a huge bill that needs to be paid and that cost is ultimately going to come out of the kwh rates.
Further, the LA fires might have also been caused by a downed line so that's going to be a fairly big cost to the power company.
dragonwriter 14 hours ago [-]
> Let's also not forget the cost of the things like the campfire fire.
That's, I assume, a reference to the 2018 Camp Fire.
> That's a huge bill that needs to be paid and that cost is ultimately going to come out of the kwh rates.
The Trust established to pay PG&E liabilities for the 2015 Butte, 2017 North Bay, and 2018 Camp Fires, which discharged PG&E's responsibility for them, receives no additional ratepayer funds after its initial funding and is in the wind-down process expecting a single final top-off payment to already approved claimants. So, no, its not a huge bill that will be paid out of future rates.
fny 18 hours ago [-]
Because competition drives innovation. 5G exists as widely as it does because carriers were driven to meet the standard and provide faster service to their customers.
This article is essentially arguing innovation is dead in this space and there is no need for bandwidth-related improvements. At the same time, there is no 5G provider without a high-speed cap or throttling for hot spots. What would happen if enough people switched to 5G boxes over cable? Maybe T-Mobile can compete with Comcast?
javier2 18 hours ago [-]
Well, 5G is unlikely to be built in my area for the next decade, meanwhile 3 operators are building networks in the slightly more populated areas.
kalleboo 14 hours ago [-]
Coverage requirements could be part of the spectrum auction, like when Google managed to get "no SIM locking" part of the spectrum requirements for the 700 MHz band, opening up Verizon phones https://en.wikipedia.org/wiki/2008_United_States_wireless_sp...
dgacmu 18 hours ago [-]
Competition drives innovation, but also, we've generally seen that things like municipal broadband are _more_ innovative than an incumbent monopoly carrier. Large chunks of the US don't have much competition at all in wired services, and if we approach that in wireless, we are likely to see the same effects starting where the local monopoly tries to extract maximum dollars out of an aging infrastructure. Lookin' at you, Comcast, lookin' at you.
fny 18 hours ago [-]
As you say, "incumbent monopoly carrier" is not competition, so a municipal provider which competes with broadband is a great idea. This article, however, is arguing we don't need more bandwidth, and we need more consolidation of major providers: I'm not convinced.
vel0city 17 hours ago [-]
The T-Mobile 5G Rely fixed-wireless home internet plan offers no caps and no throttling.
fny 16 hours ago [-]
It does past a terabyte.
vel0city 16 hours ago [-]
The fine print does say:
> During congestion, customers on this plan may notice speeds lower than other customers and further reduction if using >1.2TB/mo., due to data prioritization
So not really a cap, but a deprioritization. A few friends using it around me routinely use >2TB/mo and haven't experienced degradation, I guess there's not excessive congestion. YMMV.
immibis 18 hours ago [-]
Three things are necessary then:
1. It must be well-run.
2. It must be guaranteed to continue to be well-run.
3. If someone can do it better, they must be allowed to do so - and then their improvements have to be folded into the network somehow if there is to be only one network.
oytis 18 hours ago [-]
Internet is treated this way in Germany, and it's slow and expensive. Eastern European countries that put their bets on competition instead of regulation have more bang for the buck in their network infrastructure
Sabinus 15 hours ago [-]
How come it's failed to provide cost effective internet in the US then?
javier2 18 hours ago [-]
Maybe it is. Building multiple networks for smaller populations comes at enormous cost though. In my country there has been a tradition of this kind of network sharing, where operators are required to allow alternative operators on their physical network for a fee set by the government.
HnUser12 18 hours ago [-]
They should study Canada. We’re already running that experiment.
recursive 16 hours ago [-]
How confusing. Now I can't tell whether it's very stupid, not stupid, or medium stupid. Too bad there were no words.
SSLy 18 hours ago [-]
It works very well in at least two very rich European countries, and one bit less affluent but still not exactly poor.
odo1242 13 hours ago [-]
If regulators do this, it would have to be municipal carriers, like the one in that city in Tennessee.
Spooky23 16 hours ago [-]
You’re thinking legacy. In our new Italian Fascist/Peronist governance model, maximizing return on assets for our cronies is the priority. The regulatory infrastructure that fostered both good and bad aspects of the last 75 years is being destroyed and will not return.
Nationalizing telecom is a great way to reward the tech oligarchs by making the capital investments in giant data centers more valuable. If 10 gig can be delivered cheaply over the air, those hyperscale data centers will end up obsolete if technology continues to advance at the current pace. Why would the companies that represent 30% of the stock markets value want that?
micromacrofoot 18 hours ago [-]
It's not that stupid IMO, they could handle it like some places handle electricity — there's a single distributor managing infra but you can select from a number of providers offering different generation rates
Having 5 competing infrastructures trying to blanket the country means that you end up with a ton of waste and the most populated places get priority as they constantly fight each other for the most valuable markets while neglecting the less profitable fringe
cft 18 hours ago [-]
Who are these "regulators"? Did we vote for them? Were they selected in the process of market competition and attrition?
jay_kyburz 18 hours ago [-]
Having a single provider of utilities is great when owned by the gov and run "at cost". Problem is, dickheads get voted in and they sell the utility to their mates who get an instant monopoly and start running the utility for profit.
fny 19 hours ago [-]
Clearly you did not like playing Monopoly as child.
lambdaone 1 hour ago [-]
Wireless networks' latency problems are almost entirely caused by contention, buffering, and vast over-booking of bandwidth, where raw bandwidth number competition has been over-valued relative to actual network application performance.
L4S is on its way, and may finally be the beginning of the end for bufferbloat and congestion, and vendors of mobile devices are in an almost unique position of being able to roll out network stack changes en masse. And just for once, consumer incentives, vendor incentives and network operator incentives all align - and it's incremental, and lacking in incentives for bad actors.
The development of L4S has been a pincer operation across all levels of the network stack, integrating everything previously understood about latency and congestion in real networks, and one of the most impressive bits of network engineering I've seen in the history of the Internet.
Animats 19 hours ago [-]
> Of course, sophisticated representations of entire 3D scenes for large groups of users interacting with one another in-world could conceivably push bandwidth requirements up. But at this point, we’re getting into Matrix-like imagined technologies without any solid evidence to suggest a good 4G or 5G connection wouldn’t meet the tech’s bandwidth demands.
Open-world games such as Cyberpunk 2077 already have hours-long downloads for some users.
That's when you load the whole world as one download. Doing it incrementally is worse. Microsoft Flight Simulator 2024 can pull 100 to 200 Mb/sec from the asset servers.
They're just flying over the world, without much ground level detail.
Metaverse clients go further. My Second Life client, Sharpview, will download 400Mb/s of content, sustained, if you get on a motorcycle and go zooming around Second Life. The content is coming from AWS via Akamai caches, which can deliver content at such rates.
If less bandwidth is available, things are blurry, but it still works. The level of asset detail is such that you can stop driving, go into a convenience store, and read the labels on the items.
GTA 6 multiplayer is coming. That's going to need bandwidth.
The Unreal Engine 5 demo, "The Matrix Awakens", is a download of more than a terabyte. That's before decompression.
The CEO of Intel, during the metaverse boom, said that about 1000x more compute and bandwidth was needed to do a Ready Player One / Matrix quality metaverse. It's not quite that bad.
SteveNuts 19 hours ago [-]
How many people consuming these services are doing so over a mobile network?
For my area all the mobile network home internet options offer plenty of speed, but the bandwidth limitations are a dealbreaker.
Everyone I know still uses their cable/FTTH as their main internet, and mobile network as a hotspot if their main ISP goes down.
Maakuth 8 hours ago [-]
Here in rural Finland, 4G/5G is the only option available to me. I'm getting 50-150Mbps download speed, but often just a dozen Mbps upload. In the night hours it's better, and that's when I do my game downloads and backup uploads. I think there's going to be another municipal FTTH program; let's see if I get a fixed line at that point.
john_minsk 7 hours ago [-]
Right now - not many. But at some point in the future, if the metaverse is everywhere, you could pull out a phone and the combined data for the room you are in might be 100GB. Would we want to have 6G then?
markedathome 5 hours ago [-]
> The Unreal Engine 5 demo, "The Matrix Awakens", is a download of more than a terabyte. That's before decompression.
The PS5 and Xbox Series S/X both had disks incapable of holding a terabyte at the launch of The Matrix Awakens. Not sure where you are getting that info from, but the demo was about 30GB on disk on both the Series S/X and the PS5, and the later packaged PC release is less than 20GB.
The full PC development system might total a TB with all Unreal Engine, Metahumans, City Sample packs, Matrix Awakens code and assets (audio, mocap, etc) but even then the consumer download will be around the 20-30GB size as noted above.
drawfloat 18 hours ago [-]
Few people play games built for mobiles, let alone looking to play GTA6 on an iPhone
Animats 18 hours ago [-]
For the better games, you'll need goggles or a phone that unfolds to tablet size for mobile use. Both are available, although the folding-screen products still have problems at the fold point.
kmeisthax 17 hours ago [-]
On my current mobile plan (Google Fi[0]) the kind of streaming 3D world they think I would want to download on my phone would get me throttled in less than a minute. 200 MB is about a day's usage, if I'm out and about burning through my data plan.
The reason why there isn't as much demand for mobile data as they want is because the carriers have horrendously overpriced it, because they want a business model where they get paid more when you use your phone more. Most consumers work around this business model by just... not using mobile data. Either by downloading everything in advance or deliberately avoiding data-hungry things like video streaming. e.g. I have no interest in paying 10 cents to watch a YouTube video when I'm out of the house, so I'm not going to watch YouTube.
There's a very old article that I can't find anymore which predicted the death of satellite phones, airplane phones, and weirdly enough, 3G; because they were built on the idea of taking places that traditionally don't have network connectivity, and then selling connectivity at exorbitant prices, on the hope that people desperate for connectivity will pay those prices[1]. This doesn't scale. Obviously 3G did not fail, but it avoided that fate predominantly because networks got cheaper to access - not because there was a hidden, untapped market of people who were going to spend tens of dollars per megabyte just to not have to hunt for a phone jack to send an e-mail from their laptop[2].
I get the same vibes from 5G. Oh, yes, sure, we can treat 5G like a landline now and just stream massive amounts of data to it with low latency, but that's a scam. The kinds of scenarios they were pitching, like factories running a bunch of sensors off of 5G, were already possible with properly-spec'd Wi-Fi access points[3]. Everyone in 5G thought they could sell us the same network again but for more money.
[0] While I'm ranting about mobile data usage, I would like to point out that either Android's data usage accounting has gotten significantly worse, or Google Fi's carrier accounting is lying, because they're now consistently about 100-200MB out of sync by the end of the month. Didn't have this problem when I was using an LG G7 ThinQ, but my Pixel 8 Pro does this constantly.
[1] Which it called "permanet", in contrast to the "nearernet" strategy of just waiting until you have a cheap connection and sending everything then.
[2] I'm told similar economics are why you can't buy laptops with cellular modems in them. The licensing agreements that cover cellular SEP only require FRAND pricing on phones and tablets, so only phones and tablets can get affordable cell modems, and Qualcomm treats everything else as a permanet play.
[3] Hell, there's even a 5G spec for "license-assisted access", i.e. spilling 5G radio transmissions into the ISM bands that Wi-Fi normally occupies, so it's literally just weirdly shaped Wi-Fi at this point.
stkdump 11 hours ago [-]
> I'm told similar economics are why you can't buy laptops with cellular modems in them
I don't know what you mean. My current laptop (Lenovo L13) has a cellular modem that I don't need. And I am certainly a cost conscious buyer. It's also not the first time that this happened as well.
john_minsk 7 hours ago [-]
So true. I remember the first time I got access to 5G was on a short visit to Dubai. I got a SIM card with ~20GB of traffic and was super excited to try a speedtest. My brother told me not to do that because 5G is so fast that, if the speedtest doesn't limit the traffic used for the test, it will consume all of it within the 30 seconds the test runs. Guess what? I didn't run the test because I didn't want to pay another $50 for the data package.
If I have a 1Gbit connection but only 100GB of data, what's the point? I'm still kind of stuck on 4G-era usage if I want 100GB to last me a month...
Peanuts99 5 hours ago [-]
Seems damn expensive. In the UK you can get a SIM card for £20 with unlimited data & calls which runs on one of the larger 5G networks. I usually have the opposite problem though; I barely use 5GB in a given month.
ai-christianson 19 hours ago [-]
We live in a rural location, so we have redundant 5G/Starlink.
It's getting pretty reasonable these days, with download speeds reaching 0.5 gbit/sec per link, and latency is acceptable at ~20ms.
The main challenge is the upload speed; pretty much all the ISPs allocate much more spectrum for download rather than upload. If we could improve one thing with future wireless tech, I think upload would be a great candidate.
sidewndr46 19 hours ago [-]
I can pretty much guarantee you that your 5G connection has more bandwidth for upload than my residential ISP does
ai-christianson 19 hours ago [-]
Yeah?
We're getting 30-50 mbit/sec per connection on a good day.
repeekad 19 hours ago [-]
In downtown Columbus Ohio the only internet provider (Spectrum) maxes out at maybe 5 Mbps up (down is 50-100x that). It's not just a rural issue; non-competitive ISPs even in urban cities want you to pay for business accounts to get any kind of upload whatsoever.
zamadatix 15 hours ago [-]
Check out Breezeline (previously WOW), they cover nearly the entirety of Columbus and many of the surrounding areas. The minimum plan has 10 up but you can get ~50 up with the standard consumer plans. Bit shitty to deal with though, even for an ISP.
fweimer 18 hours ago [-]
Is it really a technical capacity issue, or just market segmentation? Usually, it's possible to get so-called business service with higher upload bandwidth, even at residential addresses.
jacobgkau 18 hours ago [-]
That's exactly what he said, they want you to pay for business accounts.
repeekad 16 hours ago [-]
Yup, I’m saying it’s not a rural issue to have bad upload, often it’s just a way to charge more money even in a major US city (particularly one with limited ISP competition)
In San Francisco monkeybrains is the best ISP I’ve ever used to date; symmetric up/down, great ping and cheaper than any other provider
iamcalledrob 4 hours ago [-]
As a (former) long-time monkeybrains customer, it's a mixed bag. For a casual home internet connection they're great. For WFH where you need something reliable, perhaps not.
If you're lucky and get pointed at a good site, you can get great speeds. I was getting 500Mbps up/down. If you're unlucky, you might only get 50Mbps.
There's always some jitter and packet loss on the connection too -- much more than a wired connection. Online gaming was not great. Congestion control algos suffered because of this. I would see TCP throughput drop by 90% over long distances because congestion control isn't tuned for these network conditions.
And in the rain my connection would drop out every few minutes.
But for $35/month it was great value, and the whole company is friendly and easy to deal with.
sidewndr46 58 minutes ago [-]
were they using WiFi as the wireless layer?
simoncion 6 hours ago [-]
I am also a quite happy Monkeybrains customer, and have been for a couple of years now.
For any folks who are using cable internet, or shitty DSL, I strongly recommend checking to see if Monkeybrains serves your home. If you have symmetric fiber, or are happy with Webpass, then it might not make sense to switch.
nightpool 19 hours ago [-]
Yes, many residential broadband ISPs top out at 1/10th that.
squeaky-clean 14 hours ago [-]
I'm in NYC, Spectrum is my ISP. 500 Mb/s down, 10 Mb/s up. I used to live in a building with symmetric 1G fiber from Verizon, but they don't serve my building.
sidewndr46 51 minutes ago [-]
I'm at 300/10. Actual upload is a bit higher I think, it seems they try and account for IPv4 header overhead or something. If I pay some absolutely bananas monthly fee I can get 30mbps up advertised. That isn't really fast enough for me to notice the difference and the service is unreliable anyways.
ai-christianson 14 hours ago [-]
Maybe once download gets so fast people have a hard time figuring out what to do with it (2-10g+) they'll start finally increasing upload.
sidewndr46 50 minutes ago [-]
That seems unlikely, as it is just more bandwidth they'd have to pay for.
toast0 19 hours ago [-]
> The main challenge is the upload speed; pretty much all the ISPs allocate much more spectrum for download rather than upload.
For 5G, a lot of the spectrum is statically split into downstream and upstream in equal bandwidth. But equal radio bandwidth doesn't mean equal data rates. Downstream speeds are typically higher because multiplexing happens at one fixed point, instead of over multiple, potentially moving transmitters.
orev 19 hours ago [-]
You identified the problem in your statement: "the ISPs allocate…". The provider gets to choose this, and if more bandwidth is available from a newer technology, their incentive is to allocate it to downloads so they can advertise faster speeds. It's not a technology issue.
throwaway2037 13 hours ago [-]
What do you need faster upload speeds for?
cogman10 18 hours ago [-]
This article misses the forest for the trees.
I can grant that typical usage of wireless bandwidth doesn't require more than 10Mbps. So, what does "even faster" buy you?
The answer is actually pretty simple: in any given slice of spectrum there is a limited amount of data that can be transmitted. The more people you have chatting to a tower, the less bandwidth is available for each of them. By having a transmission standard with theoretical capacities of a gigabit per second, 10 gigabits per second, or more, you make it so you can serve 10, 100, or 1,000 times more customers their 10 Mbps of content. It makes the rollout cheaper for the carrier and gives a better experience for the end users.
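Toy numbers (mine, ignoring scheduling overhead, MIMO gains, and everything else real networks do), just to show the scaling:

    per_user_mbps = 10           # what a typical user actually needs at any instant
    sector_capacity_mbps = {     # illustrative aggregate air-interface capacity per sector
        "4G-ish": 300,
        "5G-ish": 3_000,
        "hypothetical 6G": 30_000,
    }
    for gen, capacity in sector_capacity_mbps.items():
        print(f"{gen}: ~{capacity // per_user_mbps} simultaneous users at {per_user_mbps} Mb/s each")

The headline per-device speed is mostly a side effect; the real win is how many people can share one tower before anyone notices.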
chris_va 20 hours ago [-]
With 5G, I have to downgrade to LTE constantly to avoid packet loss in urban canyons. Given the even higher frequencies proposed for 6G, I suspect it will be mostly useless.
Now, it's possible that that raw GB/s with unobstructed LoS is the underlying optimization metric driving these standards, but I would assume it's something different (e.g. tower capex per connected user).
numpad0 19 hours ago [-]
There seem to be some integration issues between 5G Non-Standalone equipment and existing networks. Standalone or not, 5G outside of millimeter wavelength bands ("mmWave") should behave like an all-around pure upgrade compared to 4G with no downsides, in theory.
BenjiWiebe 20 hours ago [-]
5G can also use the same frequency bands as 4G, and when it does, apparently gets slightly increased range over 4G.
msh 20 hours ago [-]
In my part of the world 5g actually is rolled out on lower frequencies than 4g so I actually get better coverage.
thrownblown 19 hours ago [-]
I just leave mine in LTE and upgrade to 5G only when I know i'm gonna DL something big.
the_mitsuhiko 21 hours ago [-]
> Transmitting high-end 4K video today requires 15 Mb/s, according to Netflix. Home broadband upgrades from, say, hundreds of Mb/s to 1,000 Mb/s (or 1 Gb/s) typically make little to no noticeable difference for the average end user.
What I find fascinating is that in a lot of situations mobile phones are now way faster than wired internet for lots of people. My parents never upgraded their home internet despite fiber being available. They have 80 Mbit/s via DSL. Their phones, however, thanks to regular upgrades, now have unlimited 5G and are almost 10 times as fast as their home internet.
maxsilver 20 hours ago [-]
> Transmitting high-end 4K video today requires 15 Mb/s, according to Netflix.
It doesn't really change their argument, but to be fair, Netflix has some of the lowest picture quality of any major streaming service on the market, their version of "high-end 4K" is so heavily compressed, it routinely looks worse than a 15 year old 1080p Blu-Ray.
"High-end" 4K video (assuming HEVC) should really be targeting 30 Mb/s average, with peaks up to 50 Mb/s. Not "15 Mb/s".
nsteel 17 hours ago [-]
It's frustrating the author took this falsehood and ran with it all throughout this article.
Dylan16807 12 hours ago [-]
Why? The conclusion that "somewhere between 100 Mb/s and 1 Gb/s marks the approximate saturation point" wouldn't be any different.
nsteel 4 hours ago [-]
Yes, you've nailed exactly why it's frustrating. They could still have written the piece almost as is, even including the napkin-math extrapolations for future tech, and it would have carried a little more weight.
pak9rabid 20 hours ago [-]
Not to mention I doubt they're even including the bandwidth necessary for 5.1 DD+ audio.
kevin_thibedeau 19 hours ago [-]
Audio doesn't require high data rates. 6 streams of uncompressed 16-bit 48 kHz PCM is 4.6 Mb/s. Compression knocks that down into insignificance.
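The arithmetic is easy to verify (uncompressed PCM, ignoring any container overhead):

    # Uncompressed multichannel PCM bitrate.
    channels, bits_per_sample, sample_rate_hz = 6, 16, 48_000
    bitrate_mbps = channels * bits_per_sample * sample_rate_hz / 1e6
    print(f"{bitrate_mbps:.1f} Mb/s")  # 4.6 Mb/s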
ziml77 20 hours ago [-]
On one hand it's nice that the option for that fast wireless connection is available. But on the other hand it sucks that having it means the motivation for ISPs to run fiber to homes in sparse towns goes from low down to none, since they can just point people to the wireless options. Wireless doesn't beat the reliability, latency, and consistent speeds of a fiber connection.
dageshi 20 hours ago [-]
It doesn't beat it but honestly it's good enough based on my experience using a 4g mobile connection as my primarily home internet connection.
harrall 20 hours ago [-]
5G can be extremely fast. I get 600 MBit over cellular at home.
…and we only pay for 500 MBit for my home fiber. (Granted, also 500 Mbit upload.)
(T-Mobile, Southern California)
throitallaway 20 hours ago [-]
Sure but I'll take the latency, jitter, and reliability of that fiber over cellular any day.
vel0city 19 hours ago [-]
The reliability is definitely a bigger question, jitter a bit more questionable, but as far as latency goes 5G fixed wireless can be just fine. YMMV, but on a lot of spots around my town it's pretty comparable latency/jitter-wise as my home fiber connection to similar hosts. And connecting home is often <5ms throughout the city.
throitallaway 17 hours ago [-]
I was considering my cell phone and hotspot experiences (not 5G fixed wireless.) I suppose that has some amount of prioritization happening in order to provide a "stable" experience. My experiences with LTE/5G/5GUW have varied wildly based on location and time.
vel0city 17 hours ago [-]
Fixed wireless sometimes operates on dedicated channels and priorities.
My experiences on portable devices have also seen some mixture of performance, but I'm also on a super cheap MVNO plan. Friends on more premium plans often get far more consistent experiences.
reaperducer 20 hours ago [-]
5G can be extremely fast. I get 600 MBit over cellular at home.
Is your T-Mobile underprovisioned? Where I am, T-Mobile 5G is 400Mbps at 2am, but slows to 5-10Mbps on weekdays at lunchtime and during rush hours, and on weekends when the bars are full.
Not to mention that the T-Mobile Home Internet router either locks up, or reboots itself at least twice a day.
I put up with the inconvenience because it's either $55 to T-Mobile, $100 to Verizon for even less 5G bandwidth, or $140 the local cable company.
harrall 20 hours ago [-]
Probably. My area used to be a T-Mobile dead zone 5 years ago.
I also have Verizon.
Choice of service varies based on location heavily from my experience. I’m a long time big time camper and I’ve driven through most corners of most Western states:
- 1/3 will have NO cellular service
- 1/3 will have ONLY Verizon. If T-Mobile comes up, it’s unusable
- 1/3 remaining will have both T-Mobile and Verizon
My Verizon is speed capped so I can’t compare that. T-Mobile works better in more urban areas for me, but it’s unpredictable. In a medium-sized coastal town in Oregon, Verizon might be better but I will then get half gigabit T-Mobile in a different coastal town in California.
One thing I have learned is that those coverage maps are quite accurate.
bsimpson 19 hours ago [-]
Verizon Fios sells gigabit in NYC for $80/mo.
They're constantly running promotions: "get free smartglasses/video game systems/etc if you sign up for gigabit." Turns out that gigabit is still way more than most people need, even if it's 2025 and you spend hours per day online.
randcraw 18 hours ago [-]
That's what I pay for FIOS internet 20 miles north of Philly. I suspect that's their standard rate for 1 Gb/s service everywhere in the US.
immibis 18 hours ago [-]
It's not that mobile is fast, it's that home internet is slow. It's the same reason home internet in places like Africa, South Korea and Eastern Europe is faster than in the USA and Western Europe: home internet was built out on old technology (cable/DSL) and never upgraded because (cynically) incumbent monopolies won't allow it or (less cynically) governments don't want to pay to rip up all the roads again.
hocuspocus 15 hours ago [-]
Several Western European countries have deployed XGS-PON at scale, offering up to 10 Gbps, peaking at ~8 Gbps in practice. Hell I even have access to 25 Gbps P2P fiber here in Switzerland.
Also you can deliver well over 1 Gbps over coax or DSL with modern DOCSIS and G.fast respectively. But most countries have started dismantling copper wirelines.
rsynnott 4 hours ago [-]
Very few people have home equipment that can do anything close to 10Gbps, of course; this is all largely future proofing.
Years back, when FTTH started rolling out in Ireland, some of the CPE for the earliest rollouts only had 100Mbit/sec ethernet (on a 1Gbit/sec service)...
dale_glass 20 hours ago [-]
Higher bandwidths are good to have. They're great for rare, exceptional circumstances.
10G internet doesn't make your streaming better, but downloads the latest game much faster. It makes for much less painful transfer of a VM image from a remote datacenter to a local machine.
Which is good and bad. The good part is that it makes it easier for the ISPs to provide -- most people won't be filling that 10G pipe, so you can offer 10G without it raising bandwidth usage much at all. You're just making remote workers really happy when they have to download a terabyte of data on a single, very rare occasion instead of it taking all day.
The bad part is that this comfort is harder to justify. Providing 10G to make life more comfortable the 1% of the time it comes into play still costs money.
kelnos 20 hours ago [-]
I have 1Gbps down, and the only application I've found to saturate it is downloads from USENET (and with that I need quite a few connections downloading different chunks simultaneously to achieve it).
I have never come remotely close to downloading anything else -- including games -- at 1Gbps.
The source side certainly has the available pipe, but most (all?) providers see little upside to allowing one client/connection to use that much bandwidth.
bsimpson 19 hours ago [-]
Part of it is hardware too.
Only the newest routers do gigabit over wifi. If most of your devices are wireless, you'll need to make sure they all have wifi 6 or newer chips to use their full potential.
Even if upgrading your router is a one-time cost, it's still enough effort that most people won't bother.
mikepurvis 20 hours ago [-]
This tracks. I recently upgraded from 100 Mbps to 500 Mbps (cable), and barely anything is different -- even torrents bumped from 5MB/s to barely 10MB/s. And there's no wifi involved there, just a regular desktop on gigabit ethernet.
stkdump 10 hours ago [-]
Same here. My ISP recently did a promo to try out 1G/1G for free for a few months. I decided not to buy it after the free trial and went back to my old 500/200 line instead of paying 40% more. Yeah, it takes a minute longer downloading the latest LLM from huggingface, so what.
kookamamie 19 hours ago [-]
Steam downloads easily saturate my 1 Gb/s. Same for S3 transfers.
__alexs 19 hours ago [-]
Steam downloads can easily max 1Gbps for me.
msh 20 hours ago [-]
Steam and the PS5 store can fill my 1 gigabit connection.
sheepdestroyer 20 hours ago [-]
Steam can fill up much more
I'm getting my Steam games at 2Gbps, and I am suspecting that my aging ISP's "box" is to blame for the cap (didn't want to pay my ISP for the new box that officially supports 8Gbps symmetrical, and just got a SFP+ adapter for the old one). I pay 39€/M for what is supposed to go "up to" 8Gbps/500Mbps on that old box.
Games from Google Drive mirrors are coming at full speed too.
Nice when dling that new Skyrim VR 90GB mod pack refresh
vel0city 17 hours ago [-]
Steam used to max out my internet, but now it's smarter about it and starts to decrypt/expand the download as it's going instead of doing it in phases. This quickly maxes out my IOPS on even NVMe drives at only several hundred megabits for most games I've tried recently.
munhitsu 4 hours ago [-]
My personal bugbear is the network coverage.
Context: London / UK (EE).
Yes, I have 5G at home, but it's just one bar and sometimes even this one bar will disappear. Yes, there is 5G/4G all around the city, but you can't hold an uninterrupted conversation over FaceTime Audio while on the overground train or driving. I'll not even discuss the underground.
However, uninterrupted, low-latency, average bandwidth is a hard market and even harder to design.
zipy124 4 hours ago [-]
Yup, I recently travelled to Hong Kong from the UK and noticed just how much better coverage was in the city. I expect poor coverage when I'm in Yorkshire, but there is no excuse for just how bad London's coverage is. A lot of it seems to come down to which carrier you are on; a quick look at coverage maps [1] and you can see what carrier you are on is really important in some parts of the country.
My impression (from pretty limited recent experience) is that mobile networks in UK cities (at least London and Manchester) are _notably_ worse than in other European cities. I initially thought it was just a London thing, and, okay, London's very, very big, but was recently in Manchester, which has no such excuse, and it was also pretty terrible.
Not sure what's going on there.
Garvi 4 hours ago [-]
Most people are shunning the covid topic for various reasons, so no one is bringing up the fact that the British torched a lot of their 5G infrastructure to "stop the spread of covid". There were over 80 such arson incidents, but I'm sure this number will go down with time.
rsynnott 4 hours ago [-]
I'm guessing that that was probably not in city centres, tho.
(Also, where're you getting 'over 80'? From a quick search there seem to have been maybe 5 masts damaged/destroyed.)
Garvi 3 hours ago [-]
In your search results. It was well over 80 at the time. 80 is the number mostly cited nowadays. I see you're down to 5 damaged. Good for you.
Vast majority of those were not destroyed or damaged. I think some of the tabloids may have conflated "any vandalism or attempted vandalism" with "actually damaged", but, well, that's tabloids for you.
> I see you're down to 5 damaged. Good for you.
Who's 'you' in this context?
Garvi 2 hours ago [-]
That's not really ambiguous.
So it didn't happen, because it was *obviously* made up? The problem with such reasoning is, it does not require substantiation. You should find 20 people who also believe it (you shouldn't have a problem on this platform) and you can start a religion.
rsynnott 41 minutes ago [-]
I mean, I'm unsure why you're so excited about it. Whether it happened or not feels almost irrelevant to me (it'd be far from the worst behaviour by loonies during the pandemic), but there seems to be very little evidence that it actually did happen in significant numbers.
Like, even if it had been 80 masts destroyed, the ongoing impact would be nil; they'd just have been replaced and 80 masts is in any case a trivial number (it's difficult to get numbers on how many exist because the ONS data counts nano cells, but there seem to be over 60,000 'real' masts in the UK, anyway.)
hn_throwaway_99 15 hours ago [-]
> Is that such a foregone conclusion, though? Many technologies have had phases where customers eagerly embrace every improvement in some parameter—until a saturation point is reached and improvements are ultimately met with a collective shrug.
> Consider a very brief history of airspeed in commercial air travel. Passenger aircraft today fly at around 900 kilometers per hour—and have continued to traverse the skies at the same airspeed range for the past five decades. Although supersonic passenger aircraft found a niche from the 1970s through the early 2000s with the Concorde, commercial supersonic transport is no longer available for the mainstream consumer marketplace today.
OK, "Bad Analogy Award of the Year" for that one. Traveling at supersonic speeds had some fundamental problems, primarily being that the energy required to travel at those speeds is so much more than for subsonic aircraft, and thus the price was much higher for supersonic travel, and the problem of sonic booms meant they were forbidden to travel over land. When the Concorde was in service, London to NYC flights were 10-20x more expensive on the Concorde compared to economy class on a conventional jet, meaning the ~4 hours saved flight time was only worth it for the richest (and folks just seeking the novelty of it). There are plenty of people that would still LOVE to fly the Concorde if the price were much cheaper.
That is, the fundamental variable cost of supersonic travel is much higher than for conventional jets (though that may be changing - I saw that pg posted recently that Boom has found a way to get rid of the sonic boom reaching the ground over land), while that's not true for next gen mobile tech, where it's primarily just the upfront investment cost that needs to be recouped.
James_K 20 hours ago [-]
I got a 5G capable phone a few months back, and I can't say I've noticed a difference from my old one. (Aside from the new phone being more expensive, worse UI, slower, heavier, unwieldy, filled with ads, and constantly prompting me to create a "Samsung account".)
fkyoureadthedoc 20 hours ago [-]
What's any of that got to do with 5G? I'm on 2 bars of 5G right now and I get 650 Mbit/s download speed; it's significantly faster than 4G.
James_K 20 hours ago [-]
The last bit is just stuff I wanted to whine about. I obviously know it is faster, you don't need to explain that concept. I have just never had need of any significant internet speed on my phone. I don't download things, and only sometimes stream video. Most of the time I am just checking emails, or calendars, or something trivial like that. Unless I do some kind of benchmark, I can't notice the difference between 4G and 5G.
throw0101c 18 hours ago [-]
> I have just never had need of any significant internet speed on my phone. I don't download things, and only sometimes stream video.
But other people do.
And the main resource that is limited with cell service is air time: there are only so many frequencies, and only so many people can send/receive at the same time.
So if someone wants to watch a video, and a particular segment is (say) 100 Mb, then if a device can do 100 Mb/s, it will take 1 s to do that operation: that's a time period when other people may not be able to do anything. But if the device can do 500 Mb/s, then that segment can come down in 0.2 s, which means there's now 0.8 s worth of time for other people to do other things.
You're not going to see any difference if you're watching the video (or streaming music, or check mail), but collectively everyone can get their 'share' of the resource much more quickly.
Faster speeds allow better resource utilization because devices can get on and off the air in a shorter amount of time.
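The same point as a tiny calculation (hypothetical segment size, ignoring radio scheduling overhead):

    # Airtime needed to deliver one video segment at different link rates.
    segment_megabits = 100  # hypothetical segment size
    for link_mbps in (100, 500):
        print(f"{link_mbps} Mb/s link -> {segment_megabits / link_mbps:.1f} s of airtime per segment")
    # 100 Mb/s -> 1.0 s; 500 Mb/s -> 0.2 s, leaving 0.8 s for other devices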
Marsymars 18 hours ago [-]
You'd figure that would incentivize cell operators not to market segment higher speeds behind higher prices.
It's like I'm paying them extra for the privilege of increasing their network efficiency.
throw0101c 16 hours ago [-]
4G (AIUI) uses different frequencies, is a sunk cost, and 5G needs new gear, so someone has to pay for upgrades and the 5G frequency auctions.
dylan604 19 hours ago [-]
If 5G lived up to everything it was touted to do, you could use a 5G hotspot for your home internet, which could be a huge positive in areas that only have one ISP available. However, 5G does not live up to the promises, and your traffic is much more heavily shaped than on non-wireless ISPs.
rsynnott 4 hours ago [-]
During the pandemic, I was stuck working from home, and for a couple of periods my home internet connection was quite unreliable (it's DOCSIS, and nearby construction kept damaging the cable; eventually the cable company actually strung the cable across the road, _attached it to a tree_, and ran it back, to bypass the construction site). So I regularly had to tether on my (then LTE) phone. And it was _fine_; it was actually surprisingly okay.
vel0city 18 hours ago [-]
I know many households which use 5G hotspots for home internet. They even do cloud-rendered gaming and remote telework on such setups. Consistently get several hundred megabits at pretty decent latency and jitter. I'd say that lives up to many promises.
dylan604 18 hours ago [-]
And I know just as many that have tried but get crap service, so we're even now?
I barely get regular cell service in my house from my provider. There's no way I'd get a hotspot for a service that is a must have. Provider's "coverage" maps are such a joke to make them useless.
vel0city 17 hours ago [-]
I'd say its then more YMMV rather than a blanket "does not live up to the promises".
I can routinely go around most of the metro area on any given day and get hundreds of megabits of throughput back home at <10ms latency on a plan that's costing me ~$30/mo on a device that cost less than $400. I can be in a crowded sports arena and on my regular cellular internet and still manage to pull >50Mbit down despite the crowd. Several years ago, I'd be lucky to even get SMS/MMS out quickly and reliably.
rsynnott 4 hours ago [-]
It's going to depend a _lot_ on your telco and region.
secondcoming 16 hours ago [-]
You're getting bad 5G. I use 5G for home internet. It's perfect except for pings.
dylan604 15 hours ago [-]
I'm getting bad 5G ~= 5G does not deliver as promised.
to the person that is affected by it, it's not a good thing. we can argue all day about it but you're wrong to whoever you're arguing against at that point.
jfengel 19 hours ago [-]
It would matter more if you were in a crowded place, with more users taking up the spectrum. But yeah, as with computer speed, ordinary applications maxed out a while back.
readthenotes1 19 hours ago [-]
That's a very interesting observation -- the bandwidth requirement is more for the transmitter than the receiver as we get more and more connected devices.
agumonkey 17 hours ago [-]
In a similar manner, I got 5G recently and used it as my main link, and I'm still at 150GB downloaded (multiple persons, multiple laptops, regular OS updates, docker pull etc). I'm not even smart about this... Without constant 4K streaming I realized that my needs will rarely exceed 200GB.
secondcoming 16 hours ago [-]
5G allowed me to avoid having to get fibre.
adrr 9 hours ago [-]
What provider? I have yet to get over 300 Mb/sec on either T-Mobile or Verizon. T-Mobile is supposed to have the fastest speeds according to reports.
throwaway2037 13 hours ago [-]
Let's say that 5G is 10x faster than 4G. Why do you need faster than 65MBit download speed on a mobile phone?
That's completely a false sense of security with 5G systems, because the way it achieves that high bandwidth is by literally “steering the beam” to follow you, i.e. precise location surveillance is an implicit part of using it: https://arxiv.org/pdf/2301.13390
“The initial access period includes the transmission of beamformed SSBs that provide the UEs with a basic synchronization signal and demodulation reference signals. This allows for UEs to save power by going inactive and rejoining the network at a later initial access period. At the physical layer, the UE will receive the SSBs from a single antenna or using one or more spatial filters, such as a multi-panel handset used to overcome hand blockage. The UE will use the received SSB for synchronization and determining the control information. The beam reporting stage includes one or more possible SSB CSI reports which are transmitted in the random access channel. The report includes information for the strongest serving cell and may include a set of the next strongest cells within the same band to assist with load balancing. The number of reported additional cells depends on the carrier frequency, the previous state of the UE in the net- work, and the bands being monitored. In a newly-active state, the UE reports the top 6–16 additional cells across each active frequency range. This reporting helps to manage handover and mitigate cell-edge interference. In the final steps, the UE has connected to a serving cell and is ready to start receiving data. Further beam refinement and channel estimation can occur by transmitting reference signals with more precise beams. Although not specified in the standard, a typical CSI-RS would cover smaller portions of the reported SSBs’ directions or combine coherently across a multipath channel. Using more directional or precise beams can increase the SNR–thereby improving the channel estimates and beam alignment. Beam refinement can also be used to adjust the beamforming slightly to track highly mobile UEs.”
Your linked article even agrees: “Carriers can still see which cell towers your device connects to, use the strength and angle of your device’s signal to the tower, and then look up your device’s unique cellular identifier to determine your general location. Your location may never be private when you’re connected to a cellular network”
Sure, but you linked articles about the 5G positioning system, which the UE could choose not to participate in.
atian 19 hours ago [-]
Yeah…
dclowd9901 20 hours ago [-]
I've found 1-bar 4G LTE to actually be enough to do work on at home, to my surprise (in the occasions that my in-the-ground cable connection up and dies on me). Only thing I don't get is Zoom with that, but it's nice to have a good excuse not to be in a meeting.
readthenotes1 19 hours ago [-]
Well, I believe you can try audio only to reduce the bandwidth requirements. That was my excuse for anything below five bars...
jowea 13 hours ago [-]
Does using the internet use less battery?
pjdesno 20 hours ago [-]
Note that existing bandwidth usage has been driven by digitization of existing media formats, for which there was already a technology and industry - first print, then print+images, then audio, then video. People have been producing HD-quality video since the beginning of Technicolor in the 1930s, and while digital technology has greatly affected the production and consumption of video, people still consume video at a rate of one second (and about 30 frames) per second.
There are plenty of things that *could* require more bandwidth than video, but it's not clear that a large number of people want to use any of them.
kfarr 21 hours ago [-]
I was hoping to see some mention of latency. Agree with the premise that for most consumer applications we don’t need much more wireless throughput but latency still seems way worse than Ethernet heyday times in college
ianburrell 21 hours ago [-]
LTE latency is 20-50ms, 5G is 1ms, Gigabit Ethernet is less than 1ms, Wifi is 2-3ms. Overall latency is more about distance, 300km is 1ms, number of hops, and response times.
With mobile, I bet contention and poor signal are more of an issue. 5G is a noticeable improvement over LTE, and I am not sure they can do much better.
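The distance part is easy to sanity-check with speed-of-light arithmetic (propagation only; real paths add routing, queuing and radio scheduling delay):

    # One-way propagation delay over distance.
    C_AIR_KM_PER_S = 300_000    # roughly the speed of light in air/vacuum
    C_FIBER_KM_PER_S = 200_000  # roughly 2/3 c in glass fiber

    for km in (300, 1_000, 5_000):
        air_ms = km / C_AIR_KM_PER_S * 1000
        fiber_ms = km / C_FIBER_KM_PER_S * 1000
        print(f"{km:>5} km: {air_ms:.1f} ms in air, {fiber_ms:.1f} ms in fiber (one-way)")
    # 300 km is ~1 ms line-of-sight, ~1.5 ms through fiber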
yaantc 19 hours ago [-]
LTE total latency is 20-50 ms, and you compare this to the marketing "air link only" 5G latency of 1 ms. It's apples and oranges ;)
FYI, the air link latency for LTE was given as 4-5 ms. FDD as it's the best here. The 5G improvement to 1ms would require features (URLLC) that nobody implemented and nobody will: too expensive for too niche markets.
The latency in a cellular network is mostly from the core network, not the radio link anymore. Even in 4G.
(telecom engineer, having worked on both 4G and 5G and recently out of the field)
porridgeraisin 17 hours ago [-]
Always been interested in this stuff. Where would you recommend a software/math guy learn all this stuff? My end goal is to understand the tech well enough to at least have opinions on it. How wifi works would be great as well if you're aware of any resources for that.
yaantc 16 hours ago [-]
It's a good but hard question... Because cellular is huge.
In a professional context, nobody knows it all in details. There are specializations: core network and RAN, and inside RAN protocol stack vs PHY, and in PHY algos vs implementation, etc.
It's about as readable as it can get. The PHY part is pretty awful by comparison. If you have a PHY interest, you'll need to look for technical books as the specs are quite hermetic (but it's not my field either).
the_mitsuhiko 21 hours ago [-]
> 5G is 1ms
I have never seen this. Where do I have to get 5G service to see these latencies?
supertrope 20 hours ago [-]
1ms to the cell tower. Even on fiber Internet there’s still single digit ms latency to servers in the same metro area. Only T-Mobile has deployed 5G SA (standalone). ATT and Verizon use 5G NSA (non standalone) which is a 4G control channel bonded with 5G channels so it has 4G latency.
tguvot 13 hours ago [-]
If we go by useless endpoints, let's compare apples to apples.
On a fiber network, the equivalent of a cell tower would probably be the splitter. I guess it has sub-1ms latency.
readthenotes1 19 hours ago [-]
Pretty sure splitting up latency by useless endpoints is not a relevant way to do it.
20 hours ago [-]
refulgentis 20 hours ago [-]
This is blatantly false.
I will bet many $$$ that no one in this thread has ever gotten 1ms.
If you search "5g latency", Google's AI answer says 1 ms, followed by another quote lifted from Thales Group™ saying 4G was 20 ms and 5G is 1 ms.
Once you scroll past the automated attempts, you start getting real info.
Spoiler: 23 ms median for fastest provider, T-mobile.
BenjiWiebe 20 hours ago [-]
Latency to where? Speedtest servers or cell towers?
refulgentis 18 hours ago [-]
I assume Speedtest servers, as they wouldn't have a way to get measurements for individual cell towers at scale.
(at least, I don't recall being able to get that sort of info from iOS APIs, nor have I ever seen data that would have required being derived that way)
mbesto 21 hours ago [-]
When 5G first rolled out this was absolutely not the case. Not only was it not 1ms, it was like full 1000's of ms to the point where I actively turned off 5G on my iPhone because it was so bad.
I can only speculate that 5G was so saturated on the initial rollout that it led to congestion, and now it's stabilized. But latency isn't only affected by distance and hops - congestion matters.
rsynnott 20 hours ago [-]
Could be lots of things. I'd go with "your telco was doing something stupid" as a first guess, tbh.
20 hours ago [-]
eber 20 hours ago [-]
I thought they were going to mention L4S (Low Latency, Low Loss, Scalable throughput) over 5G, which seems to be in the latest 3GPP 5G-Advanced Release 18 (2024), but I have no idea what the rollout of that is.
One of the issues with this 5g vs 6g is the long-term-evolution of it all -- I have no idea when/where/if-at-all I will see improvements on my mobile devices for all the sub-improvements of 5g or if it's only reserved for certain use cases
jquery 20 hours ago [-]
I was hoping to see any mention of large file downloads and uploads. Nevermind the article’s ponderous “I can’t imagine any use case for more than 5GB/s”, that’s a use case today where higher speeds above 5GB/s would be helpful. For example, a lot of AAA games are above 100GB, with the largest game in my steam library being over half a terabyte (DCS World). Ideally I wouldn’t have to store these games locally, but I do if I want to have access to them in any reasonable amount of time.
It also takes ages to back up my computer. 18 terabytes of data in my case, and that’s after pruning another 30 terabytes of data as “unnecessary” to back up.
umanwizard 20 hours ago [-]
I don't think the article ever claimed that nobody would ever want speeds above 5G. But you have to admit that your use case is uncommon. Only a tiny fraction of people has anywhere near 18 TiB stored locally and an even smaller group regularly wants to do cloud backups of all of it. There are various solutions for only backing up the diff since the last backup, rather than uploading the full image.
crazygringo 20 hours ago [-]
The article is about mobile bandwidth only.
Are you downloading AAA games or backing up your computer over mobile?
Also, I hope you're doing differential backups, in which case only the initial backup should be slow. Which it always will be for something gargantuan like 18 TB!
msh 20 hours ago [-]
5G home internet is getting common
satellite2 8 hours ago [-]
I find the premise of the article completely out of touch with the growth environment of the last 40 years. Of course the median consumer doesn't have an application for more than what the current infra can offer; that has consistently been the case since broadband's inception. The reason is simple: no one will create applications designed for the median consumer with requirements higher than what the infrastructure can offer, as this is guaranteed to fail. I thought it was clear that this is a typical supply-driven market. Reading this article I'm really afraid that we will shoot ourselves in the foot if we forget this. A bit like what is happening with CPUs.
01HNNWZ0MV43FF 21 hours ago [-]
Oh boy oh boy I'm excited for the 5G shutdown when the phone I haven't even bought yet will quit working :)
SR2Z 20 hours ago [-]
I get the spirit of what you're trying to say (I think) but the truth is that wireless spectrum is an extremely scarce resource. It is bad policy to let inefficient protocols use it without good reason - 2G has the status of "lowest common denominator" and that's probably the only baseline that you should be able to rely on.
There are a ton of other inefficient allocations of spectrum^1, but not all spectrum is suitable for all purposes and the bands for cellular connectivity are highly sought after.
I know you are being sarcastic, but 3G antennas in the US were only just recently shut down. ~20 years isn't bad considering how rapid tech advancement has been in recent decades. Obviously AM and FM radio have been continuing for far longer than that, but there are legal and logistical reasons for that, at least for now.
xattt 20 hours ago [-]
Similar to how Apple moved forward with breaking compatibility for apps with older OS X versions.
AkshayGenius 19 hours ago [-]
Is there a reason we keep trying to use higher frequencies in every new wireless standard (Wi-Fi, 5G, now 6G) instead of trying to increase the maximum possible bitrate we can squeeze into lower frequencies? Have we already reached the physical limits of the amount of data that can be encoded at a particular frequency?
Lower frequencies have the advantage of longer distances and permeating through obstructions better. I suppose limited bandwidth and considerations of the number of devices coexisting is a limiting factor.
tliltocatl 19 hours ago [-]
> Have we already reached the physical limits of the amount of data the can be encoded at a particular frequency?
Basically, yes (if you take into account other consideration like radiated power, transmitter consumed power, multipath tolerance, Doppler shift tolerance and so on). Everything is a tradeoff. We could e. g. use higher-order modulation, but that would result in higher peak-to-average power ratio, meaning less efficient transmitter. We could reduce cyclic prefix length, but that would reduce multipath tolerance. And so on.
Another important reason why higher frequencies are preferred is frequency reuse. Longer distance and penetration is not always an advantage for a mobile network. A lot of radio space is wasted in areas where the signal is too weak to be usable but strong enough to interfere with useful signals at the same frequency. In denser areas you want to cram in more base stations, and if the radiation is attenuated quickly with distance, you would need less spectrum space overall.
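The attenuation side of that trade-off is visible in the free-space path loss formula, a simplified model that ignores obstructions and antenna gain:

    import math

    def fspl_db(distance_km, freq_mhz):
        # Free-space path loss in dB, with distance in km and frequency in MHz.
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    for f_mhz in (700, 3_500, 26_000):  # low band, mid band, mmWave
        print(f"{f_mhz / 1000:>4.1f} GHz over 1 km: {fspl_db(1, f_mhz):.0f} dB")
    # ~89 dB at 0.7 GHz, ~103 dB at 3.5 GHz, ~121 dB at 26 GHz:
    # higher carriers fade faster, which shrinks cells and enables denser reuse.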
linsomniac 18 hours ago [-]
>Longer distance and penetration is not always an advantage
Exactly. When I was running WiFi for PyCon, I kept the radios lower (on tables) and the power levels at the lower end (especially for 2.4GHz, which a lot of devices still were limited to at the time). Human bodies do a good job of limiting the cell size and interference between adjacent APs in that model. I could count on at least a couple people every conference to track me down and tell me I needed to increase the power on the APs. ;-)
consp 47 minutes ago [-]
That works if you control all the radios. If there is some other device screaming into the void you are screwed either way. (been there)
kmlx 19 hours ago [-]
We don’t move to higher frequencies just because we’ve run out of ways to pack more data into lower bands. The main reason is that higher frequencies offer much wider chunks of spectrum, which directly leads to higher potential data rates. Advanced modulation/coding techniques can squeeze more capacity out of lower bands, but there are fundamental physical and regulatory limits, like Shannon’s limit and the crowded/heavily licensed spectrum below 6 GHz that make it harder to keep increasing speeds at those lower frequencies.
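Shannon's limit itself is compact: capacity grows linearly with channel bandwidth but only logarithmically with signal-to-noise ratio, which is why the wide channels available at higher frequencies are so attractive. A sketch with an illustrative 20 dB SNR:

    import math

    def shannon_capacity_mbps(bandwidth_mhz, snr_db):
        # C = B * log2(1 + SNR), returned in Mb/s for B in MHz.
        return bandwidth_mhz * math.log2(1 + 10 ** (snr_db / 10))

    for b_mhz in (20, 100, 400):  # LTE-sized, 5G mid-band, mmWave-sized channels
        print(f"{b_mhz:>3} MHz channel: ~{shannon_capacity_mbps(b_mhz, 20):.0f} Mb/s ceiling")
    # ~133, ~666, ~2663 Mb/s respectively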
spacemanspiff01 19 hours ago [-]
In addition to what others have said, Often from a network perspective you want smaller range.
At the end of the day, there is a total speed limit in bits per second per hertz of spectrum.
For example, in cities, with a high population density, you could theoretically have a single cell tower providing data for everyone.
However, the speed would be slow, as for a given bandwidth the data is shared between everyone in the city.
Alternatively, one could have 100 towers, and then the data would only have to be shared by those within range. But for this to work, one of the design constraints is that a smaller range is beneficial, so that multiple towers do not interfere with each other.
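Putting illustrative numbers on that (assumes the same spectrum can be fully reused in every cell, which is the whole point of keeping ranges short):

    # Area capacity from cell splitting: the same spectrum reused in every cell.
    bandwidth_mhz = 100
    bits_per_s_per_hz = 5        # illustrative per-cell spectral efficiency
    users_in_city = 100_000

    for n_cells in (1, 100):
        total_mbps = n_cells * bandwidth_mhz * bits_per_s_per_hz
        per_user_kbps = total_mbps / users_in_city * 1000
        print(f"{n_cells:>3} cells: {total_mbps} Mb/s citywide, ~{per_user_kbps:.0f} kb/s per user if all active")
    # 1 cell -> 500 Mb/s total; 100 cells -> 50,000 Mb/s total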
protocolture 15 hours ago [-]
5G's trick is MIMO. Basically just using more channel space for more data. In some places that means 3G/4G spectrum + 24GHz + 60GHz. And responding when you close a door and the 60GHz goes away. In some parts of the world where licensing worked out differently, it might just be a couple chunks of old 4G spectrum. Its not a monolith.
hocuspocus 14 hours ago [-]
In most places it's 2G/3G/4G bands, either repurposed or through dynamic spectrum sharing, plus sub-6 bands.
mmWave is a flop.
protocolture 14 hours ago [-]
It's a flop in this circumstance for sure.
I used to have some early engineering material outlining what had been approved for use in each country and 24GHz was pretty damn common. Could be that changed; I haven't kept up.
I do know in Australia we have sweet FA and 5G isn't very interesting at all.
vel0city 19 hours ago [-]
5G can operate at the same low frequencies as 2G/3G/4G. It's not inherently a higher frequency standard.
It just also supports other bands as well.
idolofdust 16 hours ago [-]
We need the freedom to do more on our mainstream pocketable devices, my hotspot for my laptop will be always throttled down to 3G, as if to say
“Hey, this isn’t actually supposed to be used to get work done. Keep doing simple phone stuff!“
dbspin 16 hours ago [-]
That's bonkers. No such restriction here in Ireland. Frequently get work done away from the home office on 4G hotspotted to the laptop.
nick_name 5 hours ago [-]
For VoIP applications, one-way mouth-to-ear delay should ideally be less than 150 ms for natural conversations [1].
Other factors, such as jitter, transmission delay, queuing delay, etc., also impact quality. However, if the delay occurs mid-transmission (e.g., due to network congestion or routing inefficiencies), there’s little that can be done beyond optimizing at the endpoints.
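A rough one-way budget (ballpark figures assumed for illustration, not values from any standard) shows how quickly 150 ms gets used up:

    # Illustrative one-way mouth-to-ear delay budget for VoIP.
    budget_ms = {
        "codec frame + lookahead": 25,
        "packetization": 20,
        "network propagation + queuing": 40,
        "jitter buffer": 40,
        "decode + playout": 10,
    }
    print(sum(budget_ms.values()), "ms of the ~150 ms target")  # 135 ms: not much slack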
What's the point of more bandwidth when virtually all carriers limit your data plans to 50 to 150 GB/month?
Calamityjanitor 17 hours ago [-]
Here in Australia they'll charge more for 5G but limit it to 150 Mbps. That's slower than LTE's max; no wonder 5G uptake is slow.
kjellsbells 12 hours ago [-]
I follow what the operators tell Wall St, because the executives have large personal financial stakes in getting it right (modulo fraud, but lets trust for now).
For example, at the last investor day that AT&T held, they indicated [0:pdf] that their growth plans are in broadband fiber, not building more 5G capacity to serve a surge in traffic. Reading their charts and knowing how AT&T traditionally worked, I believe that they are going to try to cut the expense of running the 5G network via various optimizations and redirect capital heavily to go after the fiber broadband market instead, using convergence (ahem: price bundling) to win subscribers.
(I bet it really stuck in their craw that Comcast mastered the bundle so well that they even built a successful MVNO on the back of their xfinity customer base and parked their tanks on AT&Ts lawn while the latter was futzing about with HBO.)
However, bundling is a price compression game, not an ARPU growth game. If you start at $100 a month and then begin chipping away with autopay discounts, mobile convergence, free visa gift cards and all that nonsense, pretty soon you are selling broadband for $35 a month and can't make it work except with truly cruddy service, which leads to high churn rates. So we'll see how this turns out.
My understanding is that 6G is expected to make use of a good decentralised fibre network, with small radio units at the end points of this network providing the wireless contact point. With interoperability being on the books for 6G, whoever controls the fibre network is in for a lot of money (unless the article we're commenting on is right).
So it might make sense for operators to focus on developing fibre regardless.
kjellsbells 12 minutes ago [-]
Yes, this distributed radio idea is a feature of both 5G and 6G. You stick the dumbest, dullest parts of the radio electronics up on the tower and trunk them all back over fiber to an aggregation/control point a few kms away, and then trunk the agg points back to the packet core a few hundreds of km away. That needs a lot of fiber to carry the backhaul. I don't know if operators are thinking of using this fiber to also incidentally deliver broadband service, but today that doesn't happen: for one, oftentimes the mobile operator doesn't own the fiber, they just lease it.
mecdu92 2 hours ago [-]
For me 3G is still 100% sufficient for my usage.
Also, latency has not been a problem for me, but jitter is what matters. The good thing is that latency is generally pretty stable, so jitter (which is the variation in latency) is minimal.
I already don't understand what 4G added to me, let alone 5G
Nyr 2 hours ago [-]
3G (which nowadays is 3.5G) is sufficient for you only because everyone else is on 4G or 5G.
kinematicgps99 19 hours ago [-]
What we really need is pervasive low data rate backup systems for text messaging and low fidelity emergency calls that don't kill handset batteries. If this means "Starlink" and/or lower frequency bands (<400 MHz): the more options, the merrier for safety. Perhaps there may come a time where no one needs an EPIRB/ELT because that functionality is totally subsumed by smartphones offering equal or superior performance.
srmarm 6 hours ago [-]
The problem I find is with the backhaul rather than the 5G signal. I can still get unreliable internet with a strong 5G signal. Even today I find that for general internet browsing, good 3G is OK and can even support a video stream. My understanding is that 4G/5G allow for better network design, and the faster headline speeds are just a positive for marketing as much as anything else.
consp 7 hours ago [-]
The providers have already fixed this (coming soon near you): my fixed network operator, instead of finally forcing the fiber supplier to honor their contract and fix the final 5m of cable, is pushing me to use a 5G box for fixed internet.
So you want me to trade my horrible 100 Mbit line with OK latency for anything between 10 and 120 Mbit depending on the time of day, with a latency of over 50ms guaranteed. And you also get a CGNAT IP and thus no incoming connections, because who needs IPv6!
nsteel 17 hours ago [-]
Relatedly, Nokia announced a new CEO this week with a datacentre/enterprise background.
There is not a single 5G provider with unlimited high-speed access. Not one.
Perhaps this has something to do with limited mobile bandwidth?
Now imagine we add more bandwidth: what would happen to Comcast and other fiber monopolists if people started replacing fiber with 5G?
angio 6 hours ago [-]
Three UK has unlimited 5G data plans.
Cbzjsj 18 hours ago [-]
In your country.
miki123211 14 hours ago [-]
I think what matters a lot more is the peak data rate per cell, not per device.
Once users stop demanding higher speeds, you can still get a lot more growth by giving them higher data caps or even unlimited plans.
This will make 5G home routers a very attractive option, particularly when the only other options you have available are cable and/or DSL.
I think wireless providers competing in the same playing field as traditional ISPs will turn out to be an extremely good thing for consumers.
nonelog 16 hours ago [-]
Introducing such new technologies is ALWAYS about rendering existing hardware "obsolete" so that new devices can be forced upon the consumer (who most often does not need them).
teleforce 15 hours ago [-]
I wonder why the poster blatantly changed the original title altogether since it's very misleading?
Currently 5G doesn't even meet their very own 3 objectives namely:
1) Enhanced Mobile Broadband (eMBB),
2) Massive Machine Type Communication (mMTC),
3) Ultra-Reliable Low Latency Communication (URLLC)
The article is just focusing on the first part, which arguably can be met by the prior standard, 4G LTE+, if it's only about the ~1 Gbps bandwidth of the 5G lower-range microwave frequency bands (FR1).
For most parts of the world, the 5G higher-range millimeter wave (mmWave) band, or FR2, is not widely deployed and implemented. It can deliver bandwidth much higher than 1 Gbps, but then again the wireless transmission ranges and distances are severely limited for mmWave compared to microwave RF.
One of the notable efforts by 3GPP (the 5G standards consortium) for objectives 2 and 3 is DECT NR+ [1]. It's the first non-cellular 5G standard that can support local mesh wireless networks, but its availability is very limited now, although no base station modifications are required for its usage since it's non-cellular and the backhaul to base stations can be performed over existing 5G connectivity. I'd have imagined it would be available inside most phones by now, since the standard has been out for several years, but it seems only Nordic is interested in it.
The upcoming 6G standards probably need to carry on and focus on objectives 2 and 3 of 5G even more, since machine-to-machine (M2M) traffic is expected to surpass conventional human-to-human (H2H) and even the new human-to-machine (H2M) traffic with the rise of IoT and AI-based systems, for example intelligent transport with vehicle-to-everything (V2X) systems.
[1] DECT NR+: A technical dive into non-cellular 5G (30 comments):
> Instead, the number of homes with sufficient connectivity and percentage of the country covered by 10 Mb/s mobile may be better metrics to pursue as policy goals.
Hard pass.
5G is far from ubiquitous as it is. Though how would we even know? I feel like my phone is always lying about what type of network it's connecting to, and carriers shave the truth with shit like "5Ge" and the like.
I have not, ever, really thought "Yeah, my phone's internet is perfect as-is". I have low-signal areas in my house, if the power goes out the towers are sometimes unusable due to the increased load, etc. I do everything in my power to never use cellular because I find it incredibly frustrating and unreliable.
Cell service has literally unlimited headroom to improve (technologically and business use-cases). Maybe we need more 5G and that would fix the problems and we don't need 6G or maybe this article is a gift to fat and lazy telecoms who are masters at "coasting" and "only doing maintenance".
kalleboo 13 hours ago [-]
It's interesting how different people's experiences are.
Outside of my home (where I admit I'll never give up my 10 Gbit fiber), I'll always default to using 5G. It's always faster and more stable than any kind of "free wifi" at a coffee shop or hotel or anything, if I'm working out of home, I'm tethering to my phone, racking up 120 GB/mo or so of usage.
cadamsdotcom 13 hours ago [-]
As they say: make it work, make it fast, and finally, make it cheap...
Time to make it cheap!
ElectRabbit 19 hours ago [-]
Without 5G SA that standard is an awful battery sucker. My Pixel 6 loses almost 1/3 of its battery capacity when being stuck at 5G NSA.
With my Pixel 8 and 5G SA activated (Telefonica Germany) everything is back to normal.
nwatson 18 hours ago [-]
Someone I know at a mixed-signal company, many of whose chips go into 5G deployments, said their revenue really slowed down last year due to 5G deployment uptake decreasing significantly.
transcriptase 18 hours ago [-]
My understanding was that the economics/range of 5G only really worked in densely populated areas? Or has that changed? If not, once those places are saturated it makes sense that build out would slow.
DannyBee 21 hours ago [-]
Um, the airspeed analogy used up front is remarkably silly.
Nobody would shrug at being able to fly 2x faster.
The reason it stopped is because it made lots of noise and was expensive.
Not because it was not needed.
I think you'd be hard pressed to find anyone who would not want faster flights if it could be done at reasonable cost.
datadrivenangel 21 hours ago [-]
If Tbps-speed internet were the same cost and effort as Gbps/Mbps internet, obviously faster would be better! The marginal returns on speed in most settings drop off pretty quickly.
kelnos 20 hours ago [-]
Yeah I think the level of diminishing returns takes a while to reach when you're talking about flights. I'd be thrilled to drop a 6 hour flight down to 3 hours. Or 1.5. Or 45 minutes. Or 20. Or 10. Or instantaneous teleportation.
But if there are little to no applications for faster Internet speeds -- which for the most part there aren't -- then it just kinda doesn't matter.
14 hours ago [-]
19 hours ago [-]
tomxor 14 hours ago [-]
> for those with good 4G connectivity, 5G makes much less of an improvement on the mobile experience than advertisers like to claim
Complete BS. This misses the critical and common issue of user contention with 4G, which is the main relief 5G brings.
If all you do is compare peak throughput on paper you will miss the real world performance issues. In reality you will never see "good" 4G speeds, there just isn't enough bandwidth to go around in practice due to the lower frequency bands it operates in.
I've got both a 4G and 5G LTE router, both are Cat20 which means in theory can operate at up to 2 Gbit/s. Yet in practice, with any of the carriers in the UK the 4G modem will scrape 40 Mbit/s at best in the wee hours, and drop as low as 3 Mbit/s at peak time. The 5G one will give me 600 Mbit/s all day long... because there is enough bandwidth to go around for all users; this is the key difference.
bluesounddirect 18 hours ago [-]
Title Edit: 4G networks meet consumer needs as mobile data growth slows
rho4 17 hours ago [-]
My need for speed is a long way from saturated. 360° virtual reality calls at a resolution better than my physical senses is just one thing my unimaginative mind can come up with on the spot.
fulafel 7 hours ago [-]
The "fast enough" illusion comes from having consumers adjust for years to flaky wireless connections after we had a good period of steady broadband performance improvements. If the latter had continued at the same rate, we could now rely on home users having at least 10-100 Gbit/s bandwidth and could build entirely different kidns of applications.
IOW: Apps could definitely use high bandwidth & low latency networking (think AI acceleration, remote storage, volumetric telepresence etc) if we had it reliably, but our due to wireless transition apps are adopted to stagnating speeds.
Thaxll 14 hours ago [-]
4g is more than enough on mobile.
bilater 19 hours ago [-]
It's rare I come across something so myopic, unimaginative and laughable. This will age as well as the Paul Krugman prediction: "the Internet's impact on the economy has been no greater than the fax machine's".
We have not even begun to explore the Uber/Airbnb applications of a 6G+ world. And the VR bandwidth ceiling lazy thought is an extension of the limited mindset of this author.
ijidak 11 hours ago [-]
> Consider a very brief history of airspeed in commercial air travel. Passenger aircraft today fly at around 900 kilometers per hour—and have continued to traverse the skies at the same airspeed range for the past five decades
This is a terrible example.
If supersonic mass travel could be provided safely and cheaply, demand would be bonkers.
If I could get to Tokyo in an hour for $50, I would visit every weekend.
Overall, the article is sound.
But what a terrible example of demand not filling supply.
If I had a terabit per second, indeed I probably wouldn't use it.
But you can not make travel fast enough.
pjmlp 21 hours ago [-]
Still using 4G over here.
WeylandYutani 11 hours ago [-]
I am actually using less mobile data since I have fiber. Of course I can also get gigabit speeds on my mobile thanks to 5G... (This country can be pretty nice, although I will deny it under torture).
doublerabbit 16 hours ago [-]
This is all well and good, but how about we remove data caps too? Such a racket.
raverbashing 20 hours ago [-]
Honestly after the big investment (and with a lot of drawbacks introduced by it) from 5G, I don't think telecoms have a lot of appetite for massive investments in such a short term again
tguvot 19 hours ago [-]
it feels like telecoms walked back and started to invest into fiber broadband deployment
temptemptemp111 15 hours ago [-]
[dead]
spintin 14 hours ago [-]
[dead]
mikejhonson 21 hours ago [-]
[dead]
ctoth 21 hours ago [-]
Tell me you've never backed up a NAS or downloaded a multi-hundred-gigabyte game without telling me!
crazygringo 20 hours ago [-]
You shouldn't be doing either of those things over mobile, except as a last resort.
The article is about mobile specifically.
jlokier 19 hours ago [-]
If I want to download a multi-hundred-gigabyte anything, I switch to my mobile and turn on its Wi-Fi hotspot. It's a really noticeable speed improvement, with the caveat that speed varies a lot according to location and time of day.
My phone, a mid-range Android from 3 years ago, usually downloads over 4G much faster than any of the wired or fibre networks I have access to, including the supposedly newly installed dual fibre links at work, or the "superfast broadband" zone at my local library. It's also much faster than the 4G router at home.
I've downloaded terabytes over my phone, including LLM weights, and my provider's "unlimited" data plan seems fine with it. (They say 3TB/month in the small print.)
kelnos 20 hours ago [-]
For practical purposes I agree, but in principle, why not? Especially in places where land-based ISP choice is not abundant or have crap speeds.
For example, my only choice is Comcast's 1.2Gbps/25Mbps service. If I need faster upload I have to tether to my phone. And I rarely get anywhere near that full 1.2Gbps down.
And there are people in areas even less well served by traditional ISPs where their primary Internet connection is wireless.
jquery 20 hours ago [-]
I made the exact comment somewhere else in the thread. I can’t believe OP’s article made zero mention of that.
md_rumpf 21 hours ago [-]
Are there really use cases for faster chips? I can run all models I want on an H100 pod.
No models exist that I can't run with at least 64 H100s. NVIDIA should just stop.
tomcam 20 hours ago [-]
I’m part of the early test team for tachyon-enabled 7G. Obviously it’s only early alpha but I think non-Googlers can get Pixel 10s with it at the Mountain View Google store. If you upgrade the firmware to version 0.3.75A sometimes short text messages arrive before you type them.
mike50 16 hours ago [-]
Idiot author is a kindness. Data rates compared to an obsolete 1970s technology. Author is too old to write articles on the "newfangled internet". IEEE Spectrum needs to kick this guy to the curb hard for writing crap and crossposting his book.
Maybe you are thinking of working around bufferbloat?
If you really want to have some fun, come out to the countryside with me, where 4G is the only option and 120ms is the best end-to-end latency you're going to get. Plus your geolocation puts you half the nation away from where you actually are, which only compounds the problem.
On the other hand I now have an acquired expertise in making applications that are extremely tolerant of high latency environments.
That seems to be the theme across all consumer electronics as well. For an average person, mid-range phones are good enough, bargain-bin laptops are good enough, almost any TV you can buy today is good enough. People may of course desire higher quality and specific segments will have higher needs, but things being good enough may be a problem for tech and infra companies in the next decade.
I say this because we currently use an old 2014 phone as a house phone for the family. It's set to 2G to take calls, and switches to 4G for the actual voice call. We only have to charge it once every 2-3 weeks, if not longer. (Old Samsung phones also had Ultra Power Saving mode, which helps with that)
2G is being shut down though. Once that happens and it's forced onto 4G all the time, we'll have to charge it more often. And that sucks. There isn't a single new phone on the market that lasts as long as this old phone with an old battery.
The same principle is why I have my modern personal phone set to 4G instead of 5G. The energy savings are very noticeable.
There was usually a purpose in this that was more profound than just being bored or lonely or something, but none was really required.
And maybe the person they were trying to find wasn't home right now, but that was OK. It was not weird to talk to whoever for a minute, or to hang out for awhile.
Nowadays, the ubiquity of personal pocket supercomputers means that the closest most people ever get to that kind of generally-welcome (if unexpected) interaction is a text message.
"Hey, I'm in front of your house. Are you home?"
And maybe that works more efficiently between two individuals than knocking on the door did, but otherwise this present way of doing things mostly just serves to further increase our social isolation.
Sometimes it seems that the closer our technology allows us to be, the further apart we actually become.
It does not help with the just-phoning bit, but it does with stopping by.
I do not get that many spam calls but I think that varies a lot, especially between countries with different laws about it.
Similarly, I find that it’s hard to catch family on their cell and much easier when I call their home.
passing cell phones around still happens for my family
Just a thought when it comes time to change out that device.
(Alternately, if you don't have the option to use VoWiFi, you could take literally any phone/tablet/etc; install a softphone app on it; get a cheap number from a VoIP provider and connect to it; and leave the app running, the wi-fi on, and the cellular radio off. At that point, the device doesn't even need a[n e]SIM card!)
4K video reaches you only because it's compressed to crap. It's "good enough" until it's not. 360p TV was good enough at some point too.
Streaming video gets compressed to crap. People are being forced to believe that it is better to have 100% of crap provided in real time instead of waiting just a few extra moments to have the best possible experience.
Here is a trick for movie nights with the family: choose the movie you want to watch, start your torrent and tell everyone "let's go make popcorn!" The 5-10 minutes will get enough of the movie downloaded so you will be able to enjoy a high quality video.
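The arithmetic behind the popcorn trick, as a rough Python sketch (the movie size, download speed, and head start are made-up illustrative numbers, not anything from the thread): as long as the torrent's sustained rate beats the average playback bitrate, the head start only ever grows.

    # Back-of-the-envelope: does a 10-minute head start keep the download ahead of playback?
    # All numbers below are illustrative assumptions, not measurements.
    movie_gb       = 20      # assumed size of a high-bitrate 2-hour file
    movie_hours    = 2.0
    head_start_min = 10
    download_mbps  = 100     # assumed sustained torrent speed, megabits per second

    playback_mbps = movie_gb * 8 * 1000 / (movie_hours * 3600)     # average bitrate, Mb/s
    buffered_gb   = download_mbps / 8 * head_start_min * 60 / 1000

    print(f"average playback bitrate: {playback_mbps:.1f} Mb/s")
    print(f"downloaded while making popcorn: {buffered_gb:.1f} GB of {movie_gb} GB")
    print("download stays ahead of playback" if download_mbps > playback_mbps else "playback will stall")

The obvious catch is that the download rate has to stay above the playback bitrate for the whole runtime; a 5-10 minute buffer only covers short dips.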
That's because your source video is crap.
I'm not sure if you realize it, but all forms of digital media playback are streaming. Yes, even that MP4 stored locally on your SSD. There is literally no difference between "playing" that MP4 and "streaming" a Youtube or Netflix video.
Yes, even playing an 8K Blu-Ray video is still streaming.
Does that help?
Yes, but I assume when they say the "consumer" they mean everyone, not us. Most people I've had in my home couldn't tell you the difference between a 4K Blu-ray at 2-odd meters on a 65" panel vs 1080p.
I can be looking at a screen full of compression artifacts that seem like the most obvious thing I've ever seen, and I'll be surrounded by people going "what are you talking about?"
Even if I can get them to notice, the response is almost always the same.
"Oh...yeah ok I guess I can see. I just assumed it supposed to look like it shrug its just not noticable to me"
I expect a future of screens that are natively subtly 3D and where you could see someone's nose hair without focusing. Only then they will notice "why do they look blurry and flat" when comparing it to an old TV.
Today if you get closer to a TV you will see blur. Tomorrow you will see the bird's individual strands of feathers.
"Good enough" is temporary.
They either won’t notice or won’t care and even if they do, it takes far longer than enthusiasts expect for the line to move.
Large numbers of people still say in 2025 that ‘4K is a gimmick,’ so I’m not holding my breath. ‘Good enough’ lasts much longer for the majority than most realise.
Look at displays today: I can’t even buy a modern one with motion quality that matches what I had 20 years ago. Why? Because for the average consumer, what we have is ‘good enough’ and has been for a long time.
> Today if you get closer to a TV you will see blur. Tomorrow you will see the birds individual strands of feathers
No, I’ll see blur. Unless you’re suggesting we’ve magically solved sample and hold induced motion blur in the consumer display space?
Of course, I know you meant in a still frame however if I wanted to stare at a high quality still image, I’d save myself the money and just go with some nice framed artwork instead.
> “Good enough” is temporary.
I’ll grant you this on a long enough timeframe. But it’s got a long tail and it’s gonna be a slow ride.
Businesses should learn to earn just good enough to get by.
Once every few months I'm in a situation where I want to watch YouTube on mobile or connect my laptop to mobile hotspot, but then I think "I don't need to be terminally online", or in worst-case scenario, I just pay a little bit extra for the data I use, but again, it happens extremely rarely. BTW quite often my phone randomly loses connection, but then I think "eh, god is telling me to interact with the physical world for five minutes".
At home though, it's a different situation, I need to have good internet connection. Currently I have 4Gbps both ways, and I'm thinking of changing to a cheaper plan, because I can't see myself benefitting from anything more than 1Gbps.
In any case though, my internet connection situation is definitely "good enough", and I do not need any upgrades.
There are no words to describe how stupid this is.
The trick is that the entity that owns the wires has to provide/upgrade the network at cost, and anyone has the right to run a telco on top of the network.
This creates competition for things like pricing plans, and financial incentives for the companies operating in the space to compete on their ability to build out / upgrade the network (or to not do that, but provide cheaper service).
Common carriers become the barrier to network upgrades. Always. Without fail. Monopolies are a bad idea, whether state or privately owned.
Let me give you 2 examples.
In Australia we had Telstra (formerly Telecom, formerly Auspost). Telstra would resell ADSL services to carriers, and they stank. The carriers couldn't justify price increases to upgrade their networks and the whole thing stagnated.
We had a market review, and Telstra was legislatively forced to sell ULL instead. So the non-monopolists were now placing their own hardware in Telstra exchanges, which they could upgrade. Which they did. Once they could sell an upgrade (ADSL2+) they could also price in the cost of upgrading peering and transit. We had a huge increase in network speeds. We later forgot this lesson and created the NBN. NBNCo does not sell ULL, and the pennies that ISPs can charge on top of it are causing stagnation again.
ULL works way better than common carrier. In Singapore the government just runs glass. They have competition between carriers to provide faster GPON: 2 gig, 10 gig, 100 gig, whatever. It's just a hardware upgrade away.
10 years from now Australia will realise it screwed up with NBNCo. Again. But they won't as easily be able to go to ULL as they did in the past. NBN's fibre isn't built for it. We will have to tear out splitters and install glass.
The actual result is worse than you suggest. A carrier had to take the government/NBNCo to court to get permission to build residential fibre in apartment buildings over the monopoly. We have NBNCo strategically overbuilding other fibre providers and shutting them down (it's an offence to compete with the NBN on the order of importing a couple million bucks of cocaine). It's an absolute handbrake on competition and network upgrades. Innovation is only happening in the gaps left behind by the common carrier.
Oh wait ... the reason that freeway is always clogged is they are ripping it up, doubling its width. And now I think about it, hasn't the NBN recently upgraded their max speeds from 100 Mb/s, to 250 Mb/s, and now to 1 Gb/s? And isn't the NBN currently ripping out the FttN, replacing it with FttP, at no cost to the customer? Sounds like a major upgrade to me. And wasn't the reason we got the NBN that Telstra point blank refused to replace the monopoly copper infrastructure with fibre?
If I didn't know better, I'd think the major policy mistake Australia made in telecom was the Liberals selling off Telstra. In a competitive market, when a new technology comes along a telecom is forced to upgrade, because their competitors would use the new technology to steal their customers. That works fabulously for 5G, where there is competition. But when the Libs sold Telstra it was a monopoly. Telstra just refused to upgrade the copper. The Libs thought they could fix that through legislation, but what happened instead is Telstra fought the legislation tooth and nail and we ended up in the absurd situation of having buildings full of federal court judges and lawyers fighting to get reasonable ULL access. In the end Telstra did give permission to change the equipment at the ends of the wires. But replacing the wires themselves - never. That was their golden goose. No one was permitted to replace them with a new technology.
Desperate to make the obvious move to fibre, the Libs then offered Telstra, then Optus, then anybody money to build a new fibre network - but they all refused to do so unless the government effectively guaranteed monopoly ownership over the new network.
Sorry, what was your point again? Oh, that's right, public ownership of shared natural monopolies like wires, roads, and water mains is bad. The thing I missed is why a private rent-extracting monopoly beholden to no one except the profit-seeking shareholders owning those things is better.
I was stuck with a common carrier for years. I could pick different ISPs, which offered different prices and types of support, but they all used the same connection... which was only stable at lower speeds.
If the common carrier is doing all the work, what’s the point of the companies on top? What do they add to the system besides cost?
Might as well get rid of them and have a national carrier.
The trick is that this is essentially wireless spectrum. Which can be leased for limited periods of time and can easily allow for a more competitive environment than what natural monopolies allow for.
It's also possible to separate the infrastructure operators from the backhaul operators and thus entirely avoid the issues of capital investment costs by upstart incumbents. When done there's even less reason to tolerate monopolistic practices on either side.
Doesn't make much sense to me to abstract away most of the parts where an entity could build up its competitive advantage and then pretend like healthy competition could be built on top.
Imagine if one entity did all the t-shirt manufacturing globally but then you congratulated yourself for creating a market based on altered colors and what is printed on top of these t-shirts.
I used to work at a place that did both on top of the various telcos. We offered ‘premium service’ with 24 hour customer support and a low customer to modem and bandwidth ratio.
Most of our competitors beat us in price but would only offer customers support 9-5 and you may get a busy signal/ lower bandwidth in the back haul during peak hours.
There was a single company that owned the wires and poles, because it's expensive and complex to build physical infrastructure and hard to compete, but they were barred from selling actual services or undercutting providers because of their position. (Which depended on jurisdiction.)
It solved the problem we have now of everyone complaining about their ISP but only having one option in their area.
We have that problem now specifically because we deregulated common carriers for internet right as it took over the role of telephone service.
And for some things it's just too much duplicated effort and wasted resources. T-shirts are one thing, because we don't really need those, but train lines and utilities etc. are another. I can't tell you where the "boundary" is, but if every electric company had to lay their own cables, there would only be one or two.
And in the opinion of many, including mine, the Deutsche Bundesbahn for example got worse when it got privatized. They kinda exploited the fact that after reunification there were obviously two state railroad systems, and instead of merging them into one state railroad system, the whole thing was privatized - because it made more money for some, not because it benefited the public, the customers. Of course the reasoning was the usual neoliberal spiel, "saving money" and "smaller government", but then that money just ends up not really making things better to the degree privatization made them worse.
Obviously not everything should be state run, far from it. But privatizing everything is a cure actually even worse than the disease, since state-run sinks and swims with how much say the people have, whereas a 100% privatized world just sinks into the abyss.
Same for mobile infrastructure would be great as well.
Regulators have other ways to incentivize quality/pricing and can mandate competition at levels of the stack other than the underlying infrastructure.
I wouldn't expect that "only a single network" is the right model for all locations, but it will be for some locations, so you need a regulatory framework that ensures quality/cost in the case of a single network anyway.
(I know that helium's original IoT network mostly failed due to lack of pmf, but idk about their 5G stuff)
Network providers get paid for the bandwidth that flows over their nodes, but the protocol also allows for economically incentivizing network expansion and punishing congestion with subsidization / taxing.
You can unify everyone under the same "network", but the infrastructure providers running it are diverse and in competition.
Make it easy for a new wireless company to spawn while maintaining the infrastructure everyone needs.
Do private utilities have any incentive to be cheap?
The reason we have utility regulations in the first place is because utilities are natural monopolies with literally zero incentive to be cheap. On the contrary, they are highly incentivized to push up prices as much as possible because they have their customers over a barrel.
...which is unusual with many utilities, but is also pretty common with wireless carriers in much of the world.
"public utility" implies it's owned by the public not a profit seeking group of shareholders.
If you want to say it's worse, perhaps you should check if it's actually worse first.
Conversely, Texas has significantly above average use per capita, spreading the fixed costs across more kWh, but still results in higher annual costs per capita, despite lower per kWh rates.
Further, the LA fires might have also been caused by a downed line so that's going to be a fairly big cost to the power company.
That's, I assume, a reference to the 2018 Camp Fire.
> That's a huge bill that needs to be paid and that cost is ultimately going to come out of the kwh rates.
The Trust established to pay PG&E liabilities for the 2015 Butte, 2017 North Bay, and 2018 Camp Fires, which discharged PG&E's responsibility for them, receives no additional ratepayer funds after its initial funding and is in the wind-down process, expecting a single final top-off payment to already approved claimants. So, no, it's not a huge bill that will be paid out of future rates.
This article is essentially arguing innovation is dead in this space and there is no need for bandwidth-related improvements. At the same time, there is no 5G provider without a high-speed cap or throttling for hot spots. What would happen if enough people switched to 5G boxes over cable? Maybe T-Mobile can compete with Comcast?
> During congestion, customers on this plan may notice speeds lower than other customers and further reduction if using >1.2TB/mo., due to data prioritization
So not really a cap, but a deprioritization. A few friends using it around me routinely use >2TB/mo and haven't experienced degradation, I guess there's not excessive congestion. YMMV.
1. It must be well-run.
2. It must be guaranteed to continue to be well-run.
3. If someone can do it better, they must be allowed to do so - and then their improvements have to be folded into the network somehow if there is to be only one network.
Nationalizing telecom is a great way to reward the tech oligarchs by making the capital investments in giant data centers more valuable. If 10 gig can be delivered cheaply over the air, those hyperscale data centers will end up obsolete if technology continues to advance at the current pace. Why would the companies that represent 30% of the stock markets value want that?
Having 5 competing infrastructures trying to blanket the country means that you end up with a ton of waste and the most populated places get priority as they constantly fight each other for the most valuable markets while neglecting the less profitable fringe
L4S is on its way, and may finally be the beginning of the end for bufferbloat and congestion, and vendors of mobile devices are in an almost unique position of being able to roll out network stack changes en masse. And just for once, consumer incentives, vendor incentives and network operator incentives all align - and it's incremental, and lacking in incentives for bad actors.
See this blog entry: https://www.ietf.org/blog/banishing-bufferbloat/ for more on L4S and bufferbloat. And this: https://datatracker.ietf.org/meeting/105/materials/slides-10... for a proper technical deep dive.
The development of L4S has been a pincer operation across all levels of the network stack, integrating everything previously understood about latency and congestion in real networks, and one of the most impressive bits of network engineering I've seen in the history of the Internet.
Open-world games such as Cyberpunk 2077 already have hours-long downloads for some users. That's when you load the whole world as one download. Doing it incrementally is worse. Microsoft Flight Simulator 2024 can pull 100 to 200 Mb/sec from the asset servers.
They're just flying over the world, without much ground level detail. Metaverse clients go further. My Second Life client, Sharpview, will download 400Mb/s of content, sustained, if you get on a motorcycle and go zooming around Second Life. The content is coming from AWS via Akamai caches, which can deliver content at such rates. If less bandwidth is available, things are blurry, but it still works. The level of asset detail is such that you can stop driving, go into a convenience store, and read the labels on the items.
GTA 6 multiplayer is coming. That's going to need bandwidth.
The Unreal Engine 5 demo, "The Matrix Awakens", is a download of more than a terabyte. That's before decompression.
The CEO of Intel, during the metaverse boom, said that about 1000x more compute and bandwidth was needed to do a Ready Player One / Matrix quality metaverse. It's not quite that bad.
For my area all the mobile network home internet options offer plenty of speed, but the bandwidth limitations are a dealbreaker.
Everyone I know still uses their cable/FTTH as their main internet, and mobile network as a hotspot if their main ISP goes down.
The PS5 and Xbox Series S/X both had disks that were incapable of holding a terabyte at the launch of The Matrix Awakens. Not sure where you are getting that info from, but both the X S/X and PS5 were about 30GB in size on disk, and the later packaged PC release is less than 20GB.
The full PC development system might total a TB with all Unreal Engine, Metahumans, City Sample packs, Matrix Awakens code and assets (audio, mocap, etc) but even then the consumer download will be around the 20-30GB size as noted above.
The reason why there isn't as much demand for mobile data as they want is because the carriers have horrendously overpriced it, because they want a business model where they get paid more when you use your phone more. Most consumers work around this business model by just... not using mobile data. Either by downloading everything in advance or deliberately avoiding data-hungry things like video streaming. e.g. I have no interest in paying 10 cents to watch a YouTube video when I'm out of the house, so I'm not going to watch YouTube.
There's a very old article that I can't find anymore which predicted the death of satellite phones, airplane phones, and weirdly enough, 3G; because they were built on the idea of taking places that traditionally don't have network connectivity, and then selling connectivity at exorbitant prices, on the hope that people desperate for connectivity will pay those prices[1]. This doesn't scale. Obviously 3G did not fail, but it avoided that fate predominantly because networks got cheaper to access - not because there was a hidden, untapped market of people who were going to spend tens of dollars per megabyte just to not have to hunt for a phone jack to send an e-mail from their laptop[2].
I get the same vibes from 5G. Oh, yes, sure, we can treat 5G like a landline now and just stream massive amounts of data to it with low latency, but that's a scam. The kinds of scenarios they were pitching, like factories running a bunch of sensors off of 5G, were already possible with properly-spec'd Wi-Fi access points[3]. Everyone in 5G thought they could sell us the same network again but for more money.
[0] While I'm ranting about mobile data usage, I would like to point out that either Android's data usage accounting has gotten significantly worse, or Google Fi's carrier accounting is lying, because they're now consistently about 100-200MB out of sync by the end of the month. Didn't have this problem when I was using an LG G7 ThinQ, but my Pixel 8 Pro does this constantly.
[1] Which it called "permanet", in contrast to the "nearernet" strategy of just waiting until you have a cheap connection and sending everything then.
[2] I'm told similar economics are why you can't buy laptops with cellular modems in them. The licensing agreements that cover cellular SEP only require FRAND pricing on phones and tablets, so only phones and tablets can get affordable cell modems, and Qualcomm treats everything else as a permanet play.
[3] Hell, there's even a 5G spec for "license-assisted access", i.e. spilling 5G radio transmissions into the ISM bands that Wi-Fi normally occupies, so it's literally just weirdly shaped Wi-Fi at this point.
I don't know what you mean. My current laptop (Lenovo L13) has a cellular modem that I don't need. And I am certainly a cost conscious buyer. It's also not the first time that this happened as well.
It's getting pretty reasonable these days, with download speeds reaching 0.5 Gbit/sec per link, and latency is acceptable at ~20ms.
The main challenge is the upload speed; pretty much all the ISPs allocate much more spectrum for download rather than upload. If we could improve one thing with future wireless tech, I think upload would be a great candidate.
We're getting 30-50 Mbit/sec per connection on a good day.
In San Francisco monkeybrains is the best ISP I’ve ever used to date; symmetric up/down, great ping and cheaper than any other provider
If you're lucky and get pointed at a good site, you can get great speeds. I was getting 500Mbps up/down. If you're unlucky, you might only get 50Mbps.
There's always some jitter and packet loss on the connection too -- much more than a wired connection. Online gaming was not great. Congestion control algos suffered because of this. I would see TCP throughput drop by 90% over long distances because congestion control isn't tuned for these network conditions.
And in the rain my connection would drop out every few minutes.
But for $35/month it was great value, and the whole company is friendly and easy to deal with.
For any folks who are using cable internet, or shitty DSL, I strongly recommend checking to see if Monkeybrains serves your home. If you have symmetric fiber, or are happy with Webpass, then it might not make sense to switch.
For 5G, a lot of the spectrum is statically split into downstream and upstream in equal bandwidth. But equal radio bandwidth doesn't mean equal data rates. Downstream speeds are typically higher because multiplexing happens at one fixed point, instead of over multiple, potentially moving transmitters.
I can grant that a typical usage of wireless bandwidth doesn't require more than 10Mbps. So, what does "even faster" buy you?
The answer is actually pretty simple: at any given frequency you have a limited amount of data that can be transmitted. The more people you have chatting to a tower, the less available bandwidth there is. By having a transmission standard with theoretical capacities of a gigabit, 10 gigabits, or more, you make it so you can serve 10, 100, or 1000 times more customers their 10Mbps content. It makes it cheaper for the carrier to roll out and gives a better experience for the end users.
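A crude way to see that (Python; the cell capacities and user counts are illustrative assumptions, not 5G spec figures) is to treat the cell as a shared pipe and divide by the number of concurrently active users:

    # Crude model of a shared cell: per-user rate ~= cell capacity / active users.
    # Capacities and user counts below are illustrative assumptions.
    def per_user_mbps(cell_capacity_mbps, active_users):
        return cell_capacity_mbps / active_users

    for capacity in (150, 1000, 10000):        # rough "4G-ish", "5G-ish", "future" cell peaks
        for users in (10, 100, 1000):
            rate = per_user_mbps(capacity, users)
            verdict = "ok" if rate >= 10 else "starved"
            print(f"{capacity:>6} Mb/s cell, {users:>4} active users -> {rate:8.1f} Mb/s each ({verdict})")

The headline Gbps number is really a statement about how many simultaneous users can still get their 10 Mbps, not about any single phone needing that much.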
Now, it's possible that that raw GB/s with unobstructed LoS is the underlying optimization metric driving these standards, but I would assume it's something different (e.g. tower capex per connected user).
What I find fascinating is that in a lot of situations mobile phones are now way faster than wired internet for lots of people. My parents never upgraded their home internet despite there being fibre available. They have 80 MBit via DSL. Their phones, however, due to regular upgrades, now have unlimited 5G and are almost 10 times as fast as their home internet.
It doesn't really change their argument, but to be fair, Netflix has some of the lowest picture quality of any major streaming service on the market, their version of "high-end 4K" is so heavily compressed, it routinely looks worse than a 15 year old 1080p Blu-Ray.
"High-end" 4K video (assuming HEVC) should really be targeting 30 Mb/s average, with peaks up to 50 Mb/s. Not "15 Mb/s".
…and we only pay for 500 MBit for my home fiber. (Granted, also 500 Mbit upload.)
(T-Mobile, Southern California)
My experiences on portable devices have also seen some mixture of performance, but I'm also on a super cheap MVNO plan. Friends on more premium plans often get far more consistent experiences.
Is your T-Mobile underprovisioned? Where I am, T-Mobile 5G is 400Mbps at 2am, but slows to 5-10Mbps on weekdays at lunchtime and during rush hours, and on weekends when the bars are full.
Not to mention that the T-Mobile Home Internet router either locks up, or reboots itself at least twice a day.
I put up with the inconvenience because it's either $55 to T-Mobile, $100 to Verizon for even less 5G bandwidth, or $140 to the local cable company.
I also have Verizon.
Choice of service varies based on location heavily from my experience. I’m a long time big time camper and I’ve driven through most corners of most Western states:
- 1/3 will have NO cellular service
- 1/3 will have ONLY Verizon. If T-Mobile comes up, it’s unusable
- 1/3 remaining will have both T-Mobile and Verizon
My Verizon is speed capped so I can't compare that. T-Mobile works better in more urban areas for me, but it's unpredictable. In a medium-sized coastal town in Oregon, Verizon might be better, but I will then get half-gigabit T-Mobile in a different coastal town in California.
One thing I have learned is that those coverage maps are quite accurate.
They're constantly running promotions: "get free smartglasses/video game systems/etc if you sign up for gigabit." Turns out that gigabit is still way more than most people need, even if it's 2025 and you spend hours per day online.
Also you can deliver well over 1 Gbps over coax or DSL with modern DOCSIS and G.fast respectively. But most countries have started dismantling copper wirelines.
Years back, when FTTH started rolling out in Ireland, some of the CPE for the earliest rollouts only had 100Mbit/sec ethernet (on a 1Gbit/sec service)...
10G internet doesn't make your streaming better, but downloads the latest game much faster. It makes for much less painful transfer of a VM image from a remote datacenter to a local machine.
Which is good and bad. The good part is that it makes it easier for the ISPs to provide -- most people won't be filling that 10G pipe, so you can offer 10G without it raising bandwidth usage much at all. You're just making remote workers really happy when they have to download a terabyte of data on a single, very rare occasion instead of it taking all day.
The bad part is that this comfort is harder to justify. Providing 10G to make life more comfortable the 1% of the time it comes into play still costs money.
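Back-of-the-envelope for that rare terabyte day (Python; this assumes the access link is the only bottleneck, which in practice it rarely is):

    # Time to move 1 TB when the access link is the only bottleneck.
    terabyte_bits = 1e12 * 8

    for gbps in (0.1, 1, 10):
        hours = terabyte_bits / (gbps * 1e9) / 3600
        print(f"{gbps:>4} Gb/s -> {hours:5.1f} hours for 1 TB")

Whether roughly 2 hours instead of 22 is worth the monthly cost of the 10G tier is exactly the comfort-vs-cost tradeoff described above.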
I have never come remotely close to downloading anything else -- including games -- at 1Gbps.
The source side certainly has the available pipe, but most (all?) providers see little upside to allowing one client/connection to use that much bandwidth.
Only the newest routers do gigabit over wifi. If most of your devices are wireless, you'll need to make sure they all have wifi 6 or newer chips to use their full potential.
Even if upgrading your router is a one-time cost, it's still enough effort that most people won't bother.
I'm getting my Steam games at 2Gbps, and I suspect that my aging ISP's "box" is to blame for the cap (didn't want to pay my ISP for the new box that officially supports 8Gbps symmetrical, and just got an SFP+ adapter for the old one). I pay 39€/month for what is supposed to go "up to" 8Gbps/500Mbps on that old box.
Games from Google Drive mirrors come at full speed too. Nice when downloading that new Skyrim VR 90GB mod pack refresh.
[1]:https://www.nperf.com/en/map/GB/-/2012851.Three-Mobile/signa...
Not sure what's going on there.
(Also, where're you getting 'over 80'? From a quick search there seem to have been maybe 5 masts damaged/destroyed.)
https://www.mirror.co.uk/news/uk-news/least-90-phone-masts-a...
https://news.sky.com/story/coronavirus-90-attacks-on-phone-m...
https://www.bbc.com/news/uk-england-52164358
https://www.businessinsider.com/77-phone-masts-fire-coronavi...
> I see you're down to 5 damaged. Good for you.
Who's 'you' in this context?
So it didn't happen, because it was *obviously* made up? The problem with such reasoning is that it does not require substantiation. You should find 20 people that also believe it (you shouldn't have a problem on this platform) and you can start a religion.
Like, even if it had been 80 masts destroyed, the ongoing impact would be nil; they'd just have been replaced and 80 masts is in any case a trivial number (it's difficult to get numbers on how many exist because the ONS data counts nano cells, but there seem to be over 60,000 'real' masts in the UK, anyway.)
> Consider a very brief history of airspeed in commercial air travel. Passenger aircraft today fly at around 900 kilometers per hour—and have continued to traverse the skies at the same airspeed range for the past five decades. Although supersonic passenger aircraft found a niche from the 1970s through the early 2000s with the Concorde, commercial supersonic transport is no longer available for the mainstream consumer marketplace today.
OK, "Bad Analogy Award of the Year" for that one. Traveling at supersonic speeds had some fundamental problems, primarily being that the energy required to travel at those speeds is so much more than for subsonic aircraft, and thus the price was much higher for supersonic travel, and the problem of sonic booms meant they were forbidden to travel over land. When the Concorde was in service, London to NYC flights were 10-20x more expensive on the Concorde compared to economy class on a conventional jet, meaning the ~4 hours saved flight time was only worth it for the richest (and folks just seeking the novelty of it). There are plenty of people that would still LOVE to fly the Concorde if the price were much cheaper.
That is, the fundamental variable cost of supersonic travel is much higher than for conventional jets (though that may be changing - I saw that pg posted recently that Boom has found a way to get rid of the sonic boom reaching the ground over land), while that's not true for next gen mobile tech, where it's primarily just the upfront investment cost that needs to be recouped.
But other people do.
And the main resource that is limited with cell service is air time: there are only so many frequencies, and only so many people can send/receive at the same time.
So if someone wants to watch a video, and a particular segment is (say) 100M, then if a device can do 100M/s, it will take 1s to do that operation: that's a time period when other people may not be able to do anything. But if the device can do 500M/s, then that segment can come down in 0.2s, which means there's now 0.8s worth of time for other people to do other things.
You're not going to see any difference if you're watching the video (or streaming music, or check mail), but collectively everyone can get their 'share' of the resource much more quickly.
Faster speeds allow better resource utilization because devices can get on and off the air in a shorter amount of time.
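Spelling out the same arithmetic (Python; the 100 MB segment and the two rates come from the comment above, the framing in whole seconds is just for illustration):

    # Air-time for a 100 MB video segment at the two per-device rates from the comment.
    segment_mb = 100

    for rate_mb_per_s in (100, 500):
        airtime_s = segment_mb / rate_mb_per_s   # time the transfer occupies the shared channel
        print(f"at {rate_mb_per_s} MB/s the segment takes {airtime_s:.1f}s, "
              f"leaving {1 - airtime_s:.1f}s of every second free for other devices")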
It's like I'm paying them extra for the privilege of increasing their network efficiency.
I barely get regular cell service in my house from my provider. There's no way I'd get a hotspot for a service that is a must-have. Providers' "coverage" maps are such a joke as to make them useless.
I can routinely go around most of the metro area on any given day and get hundreds of megabits of throughput back home at <10ms latency on a plan that's costing me ~$30/mo on a device that cost less than $400. I can be in a crowded sports arena and on my regular cellular internet and still manage to pull >50Mbit down despite the crowd. Several years ago, I'd be lucky to even get SMS/MMS out quickly and reliably.
to the person that is affected by it, it's not a good thing. we can argue all day about it but you're wrong to whoever you're arguing against at that point.
https://www.fastcompany.com/90314058/5g-means-youll-have-to-...
https://venturebeat.com/mobile/sk-telecom-will-use-5g-to-bui...
https://www.ericsson.com/en/blog/2020/12/5g-positioning--wha...
“The initial access period includes the transmission of beamformed SSBs that provide the UEs with a basic synchronization signal and demodulation reference signals. This allows for UEs to save power by going inactive and rejoining the network at a later initial access period. At the physical layer, the UE will receive the SSBs from a single antenna or using one or more spatial filters, such as a multi-panel handset used to overcome hand blockage. The UE will use the received SSB for synchronization and determining the control information. The beam reporting stage includes one or more possible SSB CSI reports which are transmitted in the random access channel. The report includes information for the strongest serving cell and may include a set of the next strongest cells within the same band to assist with load balancing. The number of reported additional cells depends on the carrier frequency, the previous state of the UE in the network, and the bands being monitored. In a newly-active state, the UE reports the top 6–16 additional cells across each active frequency range. This reporting helps to manage handover and mitigate cell-edge interference. In the final steps, the UE has connected to a serving cell and is ready to start receiving data. Further beam refinement and channel estimation can occur by transmitting reference signals with more precise beams. Although not specified in the standard, a typical CSI-RS would cover smaller portions of the reported SSBs’ directions or combine coherently across a multipath channel. Using more directional or precise beams can increase the SNR–thereby improving the channel estimates and beam alignment. Beam refinement can also be used to adjust the beamforming slightly to track highly mobile UEs.”
Your linked article even agrees: “Carriers can still see which cell towers your device connects to, use the strength and angle of your device’s signal to the tower, and then look up your device’s unique cellular identifier to determine your general location. Your location may never be private when you’re connected to a cellular network”
Fun fact: modern Wi-Fi standards do this too and it's possible to use the backscattered emissions to see through your walls lol https://www.popularmechanics.com/technology/security/a425750...
There are plenty of things that *could* require more bandwidth than video, but it's not clear that a large number of people want to use any of them.
With mobile, I bet contention and poor signal are more of an issue. 5G is a noticeable improvement over LTE, and I am not sure they can do much better.
FYI, the air link latency for LTE was given as 4-5 ms. FDD as it's the best here. The 5G improvement to 1ms would require features (URLLC) that nobody implemented and nobody will: too expensive for too niche markets.
The latency in a cellular network is mostly from the core network, not the radio link anymore. Even in 4G.
(telecom engineer, having worked on both 4G and 5G and recently out of the field)
In a professional context, nobody knows it all in details. There are specializations: core network and RAN, and inside RAN protocol stack vs PHY, and in PHY algos vs implementation, etc.
You can see all the cellular specs (they're public) from there: https://www.3gpp.org/specifications-technologies/specificati...
5G (or NR) is the series 38 at the bottom. Direct access: https://www.3gpp.org/ftp/Specs/archive/38_series
It's a lot ;) But a readable introduction is the 38.300 spec, and the latest edition for the first 5G release (R15, or "f") is this one: https://www.3gpp.org/ftp/Specs/archive/38_series/38.300/3830...
It's about as readable as it can get. The PHY part is pretty awful by comparison. If you have a PHY interest, you'll need to look for technical books as the specs are quite hermetic (but it's not my field either).
I have never seen this. Where do I have to get 5G service to see these latencies?
If you search "5g latency", Google's AI answer says 1 ms, followed by another quote lift from Thales Group™ saying 4G was 20 ms and 5G is 1ms.
Once you scroll past the automated attempts, you start getting real info.
Actual data is in the "SpeedTest Award Report" PDF, retrieved from https://www.speedtest.net/awards/united_states/ via https://www.speedtest.net/awards/reports/2024/2024_UnitedSta....
Spoiler: 23 ms median for fastest provider, T-mobile.
(at least, I don't recall being able to get that sort of info from iOS APIs, nor have I ever seen data that would have required being derived that way)
I can only speculate that 5G was so saturated on the initial rollout that it led to congestion, and now it's stabilized. But latency isn't only affected by distance and hops - congestion matters.
One of the issues with this 5g vs 6g is the long-term-evolution of it all -- I have no idea when/where/if-at-all I will see improvements on my mobile devices for all the sub-improvements of 5g or if it's only reserved for certain use cases
It also takes ages to back up my computer. 18 terabytes of data in my case, and that’s after pruning another 30 terabytes of data as “unnecessary” to back up.
Are you downloading AAA games or backing up your computer over mobile?
Also, I hope you're doing differential backups, in which case only the initial backup should be slow. Which it's always going to be for something gargantuan like 18 TB!
There are a ton of other inefficient allocations of spectrum^1, but not all spectrum is suitable for all purposes and the bands for cellular connectivity are highly sought after.
1: https://upload.wikimedia.org/wikipedia/commons/c/c7/United_S...
Lower frequencies have the advantage of longer distances and permeating through obstructions better. I suppose limited bandwidth and considerations of the number of devices coexisting is a limiting factor.
Basically, yes (if you take into account other considerations like radiated power, transmitter consumed power, multipath tolerance, Doppler shift tolerance and so on). Everything is a tradeoff. We could e.g. use higher-order modulation, but that would result in a higher peak-to-average power ratio, meaning a less efficient transmitter. We could reduce cyclic prefix length, but that would reduce multipath tolerance. And so on.
Another important reason why higher frequencies are preferred is frequency reuse. Longer distance and penetration is not always an advantage for a mobile network. A lot of radio space is wasted in areas where the signal is too weak to be usable but strong enough to interfere with useful signals at the same frequency. In denser areas you want to cram in more base stations, and if the radiation is attenuated quickly with distance, you would need less spectrum space overall.
Exactly. When I was running WiFi for PyCon, I kept the radios lower (on tables) and the power levels at the lower end (especially for 2.4GHz, which a lot of devices still were limited to at the time). Human bodies do a good job of limiting the cell size and interference between adjacent APs in that model. I could count on at least a couple people every conference to track me down and tell me I needed to increase the power on the APs. ;-)
At the end of the day, there is a total speed limit of Mb/s/Hz.
For example, in cities, with a high population density, you could theoretically have a single cell tower providing data for everyone.
However, the speed would be slow, as for a given bandwidth the data is shared between everyone in the city.
Alternatively, one could have 100 towers, and then the data would only have to be shared by those within range. But for this to work, one of the design constraints is that a smaller range is beneficial, so that multiple towers do not interfere with each other.
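A toy model of that tradeoff (Python; population, spectrum, and spectral-efficiency numbers are assumptions purely for illustration): total capacity scales roughly with the number of non-interfering cells, so more, smaller cells means more per-user throughput from the same spectrum.

    # Toy model: per-active-user rate in a city as the same spectrum is reused across more cells.
    # All inputs are illustrative assumptions, not measured or spec values.
    population   = 1_000_000
    active_share = 0.02          # assumed fraction of people transferring at any instant
    bandwidth_mhz = 100          # assumed spectrum usable per cell
    bits_per_hz   = 5            # assumed average spectral efficiency (b/s/Hz)

    active_users = population * active_share
    cell_mbps    = bandwidth_mhz * bits_per_hz   # ~500 Mb/s per cell under these assumptions

    for cells in (1, 100, 1000):
        per_user = cells * cell_mbps / active_users
        print(f"{cells:>5} cells -> {per_user:7.2f} Mb/s per active user")

The catch, as noted, is that the gain only holds while the cells don't interfere with each other, which is why limited range can be a feature rather than a bug.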
mmWave is a flop.
I used to have some early engineering material outlining what had been approved for use in each country, and 24GHz was pretty damn common. Could be that changed; I haven't kept up.
I do know in Australia we have sweet FA, and 5G isn't very interesting at all.
It just also supports other bands as well.
“Hey, this isn’t actually supposed to be used to get work done. Keep doing simple phone stuff!“
Other factors, such as jitter, transmission delay, queuing delay, etc., also impact quality. However, if the delay occurs mid-transmission (e.g., due to network congestion or routing inefficiencies), there’s little that can be done beyond optimizing at the endpoints.
[1] https://www.wikiwand.com/en/articles/Latency_(audio)#Telepho...
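One of the few endpoint-side knobs is the jitter buffer. A minimal sketch (Python, with an assumed Gaussian jitter distribution, purely illustrative) of the tradeoff it makes: a deeper buffer drops fewer late frames, but every millisecond of buffer adds directly to mouth-to-ear delay.

    # Minimal jitter-buffer sizing sketch: the buffer must cover most arrival jitter,
    # and whatever depth you choose is added to end-to-end delay.
    import random

    random.seed(1)
    jitter_ms = [random.gauss(0, 15) for _ in range(1000)]   # assumed arrival jitter, ms

    for buffer_ms in (20, 40, 80):
        late = sum(1 for j in jitter_ms if j > buffer_ms)    # frames missing their playout slot
        print(f"{buffer_ms:>3} ms buffer: {late / 10:.1f}% late frames, "
              f"+{buffer_ms} ms added to end-to-end delay")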
For example, at the last investor day that AT&T held, they indicated [0:pdf] that their growth plans are in broadband fiber, not building more 5G capacity to serve a surge in traffic. Reading their charts and knowing how AT&T traditionally worked, I believe that they are going to try to cut the expense of running the 5G network via various optimizations and redirect capital heavily to go after the fiber broadband market instead, using convergence (ahem: price bundling) to win subscribers.
(I bet it really stuck in their craw that Comcast mastered the bundle so well that they even built a successful MVNO on the back of their xfinity customer base and parked their tanks on AT&Ts lawn while the latter was futzing about with HBO.)
However, bundling is a price compression game, not an ARPU growth game. If you start at $100 a month and then begin chipping away with autopay discounts, mobile convergence, free visa gift cards and all that nonsense, pretty soon you are selling broadband for $35 a month and can't make it work except with truly cruddy service, which leads to high churn rates. So we'll see how this turns out.
[0:pdf] https://investors.att.com/~/media/Files/A/ATT-IR-V2/reports-...
So it might make sense for operators to focus on developing fibre regardless.
So you want me to trade my horrible 100Mbit line with OK latency for anything between 10 and 120Mbit depending on the time of day, with a latency of over 50ms guaranteed. And you also get a CGNAT IP and thus no incoming connections, because who needs IPv6!
https://www.reuters.com/business/media-telecom/nokia-ceo-ste...
Perhaps this has something to do with limited mobile bandwidth?
Now imagine we add more bandwidth: what would happen to Comcast and other fiber monopolists if people started replacing fiber with 5G?
Once users stop demanding higher speeds, you can still get a lot more growth by giving them higher data caps or even unlimited plans.
This will make 5G home routers a very attractive option, particularly when the only other options you have available are cable and/or DSL.
I think wireless providers competing in the same playing field as traditional ISPs will turn out to be an extremely good thing for consumers.
Currently 5G doesn't even meet its very own 3 objectives, namely:
1) Enhanced Mobile Broadband (eMBB), 2) Massive Machine Type Communication (mMTC), 3) Ultra-Reliable Low Latency Communication (URLLC)
The article is just focusing on the first objective, which arguably can be met by the prior standard, 4G LTE+, if it's only about the ~1 Gbps bandwidth of the 5G lower-range microwave frequency bands, or FR1.
For most parts of the world, the 5G higher-range millimeter wave (mmWave) bands, or FR2, which can deliver bandwidth much higher than 1 Gbps, are not being widely deployed and implemented; then again, wireless transmission ranges and distances are severely limited for mmWave compared to microwave RF.
One of the notable efforts by 3GPP (the 5G standards consortium) toward objectives 2 and 3 is DECT NR+ [1]. It's the first non-cellular 5G standard that can support local mesh wireless networks, but its availability is very limited now, although no base station modifications are required for its usage, since it's non-cellular and the backhaul to base stations can be performed by the existing 5G connectivity. I'd have imagined it would be available inside most phones by now, since the standard has been out for several years, but it seems only Nordic is interested in the new standard.
The upcoming 6G standards probably need to carry on and focus on objectives 2 and 3 of 5G even more, since machine-to-machine (M2M) traffic is expected to surpass the conventional human-to-human (H2H) and even the newer H2M traffic, with the rise of IoT and AI-based systems, for example intelligent transport (vehicle-to-everything, V2X) systems.
[1] DECT NR+: A technical dive into non-cellular 5G (30 comments):
https://news.ycombinator.com/item?id=39905644
Hard pass.
5G is far from ubiquitous as it is. Though how would we even know? I feel like my phone is always lying about what type of network it's connecting to, and carriers shave the truth with shit like "5Ge" and the like.
I have not, ever, really thought "Yeah, my phone's internet is perfect as is". I have low-signal areas of my house, if the power goes out the towers are sometimes unusable due to the increased load, etc. I do everything in my power to never use cellular because I find it incredibly frustrating and unreliable.
Cell service has literally unlimited headroom to improve (technologically and business use-cases). Maybe we need more 5G and that would fix the problems and we don't need 6G or maybe this article is a gift to fat and lazy telecoms who are masters at "coasting" and "only doing maintenance".
Outside of my home (where I admit I'll never give up my 10 Gbit fiber), I'll always default to using 5G. It's always faster and more stable than any kind of "free wifi" at a coffee shop or hotel or anything, if I'm working out of home, I'm tethering to my phone, racking up 120 GB/mo or so of usage.
Time to make it cheap!
With my Pixel 8 and 5G SA activated (Telefonica Germany) everything is back to normal.
Nobody would shrug at being able to fly 2x faster. The reason it stopped is because it made lots of noise and was expensive. Not because it was not needed.
I think you'd be hard pressed to find anyone who would not want faster flights if it could be done at reasonable cost.
But if there are little to no applications for faster Internet speeds -- which for the most part there aren't -- then it just kinda doesn't matter.
Complete BS. This misses the critical and common issue of user contention with 4G, which is the main relief 5G brings.
If all you do is compare peak throughput on paper you will miss the real world performance issues. In reality you will never see "good" 4G speeds, there just isn't enough bandwidth to go around in practice due to the lower frequency bands it operates in.
I've got both a 4G and a 5G LTE router, both Cat20, which means in theory they can operate at up to 2 Gbit/s. Yet in practice, with any of the carriers in the UK, the 4G modem will scrape 40 Mbit/s at best in the wee hours, and drop as low as 3 Mbit/s at peak time. The 5G one will give me 600 Mbit/s all day long... because there is enough bandwidth to go around for all users, this is the key difference.
IOW: Apps could definitely use high bandwidth & low latency networking (think AI acceleration, remote storage, volumetric telepresence etc) if we had it reliably, but due to the wireless transition, apps are adapted to stagnating speeds.
This is a terrible example.
If supersonic mass travel could be provided safely and cheaply, demand would be bonkers.
If I could get to Tokyo in an hour for $50, I would visit every weekend.
Overall, the article is sound.
But what a terrible example of demand not filling supply.
If I had a terabit per second, indeed I probably wouldn't use it.
But you can not make travel fast enough.
The article is about mobile specifically.