The Trouble with 5G (backreaction.blogspot.com)
173 points by Brajeshwar on Sept 4, 2022 | 154 comments


The trouble with 5G is that it will be the final nail in the coffin of location privacy, assuming most people carry their phone with them at most times. LTE is already very "good" at this, but 5G brings centimeter-precision.

https://www.ericsson.com/en/blog/2020/12/5g-positioning--wha... sez —

“The arrival of 5G delivers new enhanced parameters for positioning accuracy down to the meter, decimeter and centimeter.”

“Positioning of users and devices across general indoor environments, such as offices, shops, logistics, etc., was a focus area of 3GPP Release 16.”

https://venturebeat.com/mobile/sk-telecom-will-use-5g-to-bui... sez —

“While current [2019] smartphones can under some circumstances send and receive location data with 3-foot accuracy, it takes an external GNSS receiver to access location services with centimeter-level accuracy.”

https://www.fastcompany.com/90314058/5g-means-youll-have-to-... sez —

“[5G network positioning] data can also enable advertisers and data brokers to see the exact routes you take each day and even which buildings you go into. And anyone with access to your mobile network’s cell tower data will now be able to track your movements in real time.”


>The trouble with 5G is that it will be the final nail in the coffin of location privacy,

And yet it's the first deployed set of standards that encrypts the IMSI? (This is pushing Stingrays toward obsolescence; Harris stopped making them.) https://www.thalesgroup.com/en/worldwide-digital-identity-an... < from one of the world's primary producers of SIM cards.

In a densified enough network (cities), the carrier knows where users are within a few meters by correlating timing advance from the towers. This is independent of any software installed on user equipment. mmWave spectrum in the most densely populated areas (stadiums/train stations) brings this to centimeter level due to physics, but we're splitting hairs. This should be a key area of focus for privacy legislation: who should be allowed to access this data (LEO with a warrant)? Even if the user device lacks the hardware, the network side can have hardware installed to measure the precise location of user equipment.
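As a rough illustration of what the network side can do with nothing but per-tower range estimates, here is a minimal 2D trilateration sketch; the tower coordinates and position are made-up numbers, not anything from a real deployment:

```python
import math

def trilaterate(towers, dists):
    """Solve for a 2D position from three tower positions and range
    estimates (e.g. distances derived from timing advance)."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = dists
    # Subtracting the circle equations pairwise gives a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# One LTE timing-advance step quantizes range to roughly 78 m; 5G NR's
# finer symbol timing (and mmWave bandwidths) shrinks that step a lot.
towers = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
true_pos = (400.0, 300.0)
dists = [math.dist(true_pos, t) for t in towers]
print(trilaterate(towers, dists))  # ~ (400.0, 300.0)
```

With noisy ranges you'd do a least-squares fit over more towers instead of an exact solve, but the principle is the same: the carrier needs no cooperation from the handset beyond it being attached to the network.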

Other wireless technologies independent of the cellular network can be used for correlation; privacy relies on the device randomizing the MAC address it uses for beacons, for both Bluetooth and WiFi. GNSS systems (except for BeiDou, which is disabled in firmware in the US) are one-way comms unless the phone is snitching on you.


https://www.eff.org/deeplinks/2019/01/5g-protocol-may-still-...

> this attack provides a new way to track a user’s location not just over 3G and 4G, but even over 5G. Location tracking seems to be the most common use for IMSI catchers by American law enforcement and this vulnerability could provide the next generation of CSSs with a way to track user location, even over 5G.

> It’s important to keep in mind here that, for cases of lawful intervention from law enforcement agencies, there are better ways than this attack technique to get location information, such as getting a warrant and getting the information directly from the phone companies. People working outside the legal system, such as spies and criminals, cannot get warrants and cannot typically work directly with the phone companies.


2019 article! Quite a bit has changed since then. As I pointed out, Stingrays aren't produced anymore.

As to your second point, law enforcement buys units like this to bypass normal due process (having to go to a judge for every minor crime/investigation). I mentioned in my comment that the network will always have this information; even back in 2G, additional equipment was added to the towers when E911 mandates took effect to get better location information from connected user equipment. Reading about E911 requirements is enlightening if you want to learn more about the 'how':

https://urgentcomm.com/2008/07/01/different-strokes/

https://www.federalregister.gov/documents/2020/01/16/2019-28...


> Stingrays aren't produced anymore.

If there's a 5G protocol vulnerability, it can be exploited with an SDR and custom software, even if no commercial product is sold for this purpose.

2021 Blackhat, https://i.blackhat.com/USA21/Wednesday-Handouts/us-21-5G-IMS...

> IMSI catcher attack is possible in both 5G NSA and SA networks

  • 4G RAN security == 5G NSA (false sense of 5G security)
  • Unfixed radio protocols allow targeted attacks 
  • For end users, no control over choosing the most secure network
  • No security indicators for connected network 
> Lack of enforcement of security features in operational networks allow tracking of 5G users easily

  • Need continuous & proactive security monitoring of 5G RAN configs


5G SA networks have only become common in the past year, and 5G NSA does not benefit from the same enhanced privacy. At the same time that SA is starting to be deployed, legacy networks are being shut down. One of the reasons it took so long was the need for user equipment to support NR carrier aggregation, and we're there now.

But we are getting off track. All the slides you've linked acknowledge the attempts made in 5G to improve privacy, and most of the practical attacks rely on implementation mistakes or downgrade attacks (which should become less feasible as legacy networks disappear). I originally responded to note that, for the first time in a global standard, there was an attempt to encrypt subscriber-identifying data. Earlier standards made no such attempt and were broken in other ways. (For example, GSM's A5/1 and A5/2 ciphers were cracked/backdoored and replaced by A5/3, and the standard never made any attempt to authenticate towers.)


It's good that we have progressed from no protection to broken protection. May we continue to progress in the right direction.

Practically, 5G phones today remain vulnerable to IMSI catching.


Worth noting that the ability to catch an IMSI or SUPI in the clear depends on whether the packet core implements concealed identifiers (SUCI) with real encryption. IIRC there are three possible encryption schemes and one of them is "null encryption". I don't know what happens if the cellular provider simply sets that option: do phones just say "ok" and continue to connect?
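To make the "null encryption" point concrete, here's a conceptual sketch; the function and scheme names are illustrative, not the real 3GPP TS 33.501 structures, but the behavior of the null-scheme (an identity transform) is as specified:

```python
def conceal_supi(msin: str, scheme: str) -> str:
    """Produce the scheme-output part of a SUCI from the MSIN (sketch)."""
    if scheme == "null":
        # The null-scheme applies no encryption: the subscriber identifier
        # crosses the air interface readable, just like a legacy IMSI.
        return msin
    if scheme in ("profile-a", "profile-b"):
        # Real deployments encrypt with the home network's public key
        # (ECIES over X25519 or secp256r1); omitted in this sketch.
        raise NotImplementedError("ECIES profiles not sketched here")
    raise ValueError(f"unknown scheme {scheme!r}")

print(conceal_supi("0123456789", "null"))  # prints 0123456789 - no concealment
```

So if an operator configures the null-scheme, a "SUCI" still goes over the air, but it conceals nothing, which is exactly the gap the parent comment is asking about.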


Related: does the OEM (Apple, Samsung) or baseband radio vendor (Qualcomm) make that decision?


> criminals, cannot get warrants and cannot typically work directly with the phone companies.

Colour me skeptical on the latter. What kind of criminal is tech savvy enough to want location data, but can't think of either using a fake business to buy the data (if it's legally allowed to be sold by the phone company), or bribing a phone company employee, or becoming / having one of your criminal team become an employee there, or commit another crime such as hacking into the phone company to steal data, or...?


IMSI catchers enable realtime scale/volume harvesting.

Blackhat 2020, Detecting Fake 4G Base Stations in Real Time, https://news.ycombinator.com/item?id=32237621


On one hand, this technology has all sorts of good uses, like helping emergency services find someone calling them inside a building or helping you navigate in skyscrapers where GPS doesn't work well. On the other hand, humans are steaming piles of shit and probably can't be trusted to use this technology properly for just "good" uses. I guess there's no way to reverse course, but it makes me sad that my son might grow up without privacy, where his mistakes are never forgotten.


They'll be forgotten unless he decides to criticize the police, the ruling party, or Verizon. Or spends a lot of time walking near people who do.


It’s inevitable that the advance of technology will both improve the world and be a potential vector for misuse.

Ultimately, it has always been like that, since the dawn of civilization. The answer is not fighting technological advances, but regulating the use of technology by law enforcement (and the government, and bad actors) instead.


This only applies to places with UWB (mmWave) which is very few places. Prior to 5G, the network was borderline unusable in these crowded places and people used Wi-Fi (which has similar location tracking concerns). For the average user this hasn't changed anything.

For very privacy-conscious users, you can always turn off UWB.


Privacy vs coverage in crowded places seems like a valid (though uncomfortable) choice.

In crowded places like trains or planes, where it matters most for me personally, you already don't have location privacy once you've boarded: your seat is known.


Can you? Can you really?


Does it matter? If you do, you'll be using Wifi in those places, which has similar privacy problems, right? And if you disable both, you barely have coverage in those places, which doesn't seem better.


Ordinary mobile network use is not all that location-private. Wifi is worse. It looks like mmWave 5G is even worse than that. Exact location privacy rarely matters to me personally, but it doesn't follow that there is no risk of society-scale bad effects, or impact on other people, that I ought to be concerned about. What I questioned, though, was whether it will really be practical for anyone to choose how much they engage with this. It's not so hard to turn wifi off when out and on when at home; lots of people do, more because it can be a nuisance than out of privacy concern. Will it really be equally easy to choose which bits of 5G spectrum a phone uses? If you think so, I have a bridge for sale...


Add to this "smart" washing machines, fridges, dishwashers, air conditioners, irons, stoves, etc., and of course cars and TVs, all happily phoning home the user's personal data with an always-on internal module that nobody can disable or filter, since they no longer use the user's home network.

I expect in a few years even shoes will connect somewhere while being powered and recharged by walking. And of course free gait analysis through internal pressure sensors and accelerometers, apps for running or dancing (turn on a IoT device by doing a tip-tap, find your ideal dancing partner, etc) will be the way manufacturers will sell them to the masses, at the price of more and more personal data being surrendered.

I don't care much about what people do with their privacy, that's their business, but how can I enter someone's house, shop or car knowing that there will be no less than a half dozen devices listening to my voice or shooting a picture of me without asking?

Note that I absolutely love the good that 5G and further technologies can bring. It's just some of their uses that concern me.


How long until we start seeing HN headlines like: "How I turned my home into a Faraday cage to stop all those pesky IoT devices phoning home."


This already exists. I once worked on a 'house of the future' style project, which was built as a Faraday cage with the option of letting selected frequencies through.


It would be very hard to screen against 5G-connected household appliances without also rendering cellphones that use the same frequencies useless.


Microwave your shoes for 5 seconds, kill that chip. ;) For everything else, no idea.


>how can I enter someone's house, shop or car knowing that there will be no less than a half dozen devices listening to my voice or shooting a picture of me without asking?

There are solutions to that problem: https://xkcd.com/1807/


Which 5G are these talking about? The sub-6GHz 5G, aka relabeled LTE, or the mmWave/UWB 5G? Because UWB does not really penetrate anything, so it might work well in the lab but be useless in practice.


> relabeled LTE

To be pedantic, LTE stands for “Long-term evolution” and was always intended to be the foundation of future cell network standards.

I won’t get too into the details, but generations 1-4 dealt primarily with modulation techniques, and OFDM (the technique used in LTE) is more or less the best we know how to do over wireless.


To add some confusion, the “nth generation” features have been described by the International Telecommunication Union (e.g. IMT-2020 for the 5th gen), while the standards are developed by the 3rd Generation Partnership Project. Previously the standards had their own names (UMTS in 3G, LTE in 4G) but now the 3GPP is labeling their current releases of standards “5G” too, and yeah they include some concepts that go back even to 3G.


To add further to the confusion, 3G networks have been made obsolete by 4G/5G operators around the world no longer supporting them, and phones now fall back to EDGE (2.5G) if the wireless signal is bad [1].

[1]https://www.wired.com/story/3g-service-sunset-what-it-means


Lol, it's all about that sweet sweet spectrum


It's not in the general, constant sense that you have to worry about, but rather in specific applications. When you're out in public, "they" will have 3-meter-accuracy, more than enough. But when you're in stores, and malls, and other venues where UWB is set up, then that's where real problems begin. They'll be able to track which advertisements you linger around, and which sections you visit.

It's going to be a whole new category of passive location tracking.


There are companies doing this already, but with their existing CCTV systems. You don't even have to buy equipment, just the software, and you get e-commerce-style metrics for brick-and-mortar stores.


My opinion is that in-building tracking doesn't have to be an issue, and the people who care don't linger watching advertisements in Malls. More power to someone who finds a way to use that data to make buildings like Grocery stores more efficient, like getting room temp products first and frozen things last during the walk.


It works on stadiums and other similar hugely crowded open venues, which is the point of it, AFAICT.


Yeah, accurate positioning is the inevitable end result of massive-MIMO. You need to know where the handset is to point the antenna beam towards it.

And it needs to update fast enough that it can be in a vehicle on the highway or a high speed train and still maintain connectivity.

The need to have more users on the same spectrum requires sharper antenna beamwidths and smaller cells, all of which either require more precise positioning or make it possible.

All of this is tied to the technology used. Sadly, legislation limiting what can be collected and used is the only counter I see to this being exploited.


Besides the standard bandwidth improvements, it's pretty clear that this is why the major telcos were pushing 5G so hard: so they could sell that more precise location data to any and all comers.


I can't believe what I just read from Sabine. But then, if even she doesn't get it, no wonder 99% of HN still doesn't get 5G. Like I said in my HN bubbles post [1], which had lots of upvotes; the only disagreement was on 5G.

5G isn't mmWave, and mmWave isn't "REAL" 5G. This is the most common misconception on the internet, including but not limited to HN. AFAIK, as of late 2021, no country other than the USA had used or planned to use mmWave for its mobile network. The only usage outside the USA was for residential wireless internet, and those were trials only.

Even the biggest proponent of mmWave, Verizon, is backing off mmWave expansion.

5G has lots of things other than mmWave. As a matter of fact, mmWave isn't even 1% of the initial 5G (3GPP R15) spec, and the same goes for 6G, which is NOT about higher frequency either. Another misconception is that you need higher frequency for more bandwidth. That is true per Shannon's law [2], but we don't need more maximum bandwidth; we need higher capacity, i.e. the same bandwidth (e.g. 1Gbps) available to more people at the same time: network capacity. And that is what 5G (massive MIMO) and 6G (distributed MIMO) are about, along with dozens of other things.

[1] https://news.ycombinator.com/item?id=32635268

[2] I still remember my professor telling me the exact same thing in the '90s when I was working on 3G: how we would soon reach those limits. For people who have not worked in the wireless/mobile industry, a lot of what we take for granted today was essentially a miracle, or black magic, not that long ago.
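The capacity-vs-bandwidth point above can be made concrete with the Shannon-Hartley formula; the bandwidth, SNR, and stream counts below are illustrative numbers I picked, not measurements:

```python
import math

def shannon_capacity_mbps(bandwidth_hz, snr_db, streams=1):
    """Shannon limit C = n * B * log2(1 + SNR), in Mbps. Capacity scales
    linearly with bandwidth B and spatial streams n, but only
    logarithmically with SNR."""
    snr = 10 ** (snr_db / 10)  # convert dB to linear ratio
    return streams * bandwidth_hz * math.log2(1 + snr) / 1e6

# 100 MHz mid-band carrier at 20 dB SNR, single stream:
print(round(shannon_capacity_mbps(100e6, 20)))     # 666 (Mbps)
# Same spectrum reused over 4 spatial layers (massive MIMO):
print(round(shannon_capacity_mbps(100e6, 20, 4)))  # 2663 (Mbps)
```

That linear scaling with spatial streams, rather than chasing ever-higher frequencies, is the capacity argument the comment is making for massive/distributed MIMO.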


I have to say I find that Sabine is increasingly pandering to the nutjob crowd. I really enjoyed some of her posts with deep insights into quantum physics and cosmology, areas where she is a subject expert. Now she is posting on all sorts of topics, some where she clearly has very little background and limited understanding.

They also have this strongly science-skeptic undertone. While I appreciate that a healthy amount of skepticism is a good thing, the undertone in her posts is much more a general criticism of science/scientists without any actual arguments.


Could you explain where you get these undertones from? From my perspective she is arguing in favor of meteorologists, who are scientists, and is critical of various organizations, which are economic actors. She is also critical of many of the studies that investigated 5G, which is what a scientist should do. Not every study is good, and being critical of bad studies doesn't make you anti-science but pro-science! She has said multiple times that she does not believe there are health risks to 5G radio.


So, the undertones are not so strong in this article, but I have to say that already the choice of topic is pandering to a certain crowd. As a good scientist she should know what she doesn't know, and her understanding here is clearly limited.

She also clearly plays with the uncertainty: first stating that, yes, she doesn't believe there is an issue, but then starting a discussion about the quality of the publications. Now, the quality of health-impact studies is an interesting discussion, but I don't think it's relevant here. Moreover, the way she summarizes the shortcomings of the studies is somewhat selective/misleading; the full quote is:

> Our meta-analysis showed that the bulk of the studies had a quality score lower than 2 out of a possible 5, with only one study achieving a maximum quality score of 5 [9]. The meta-analysis further showed that studies with a low quality score were more likely to show a greater effect.

I think leaving out that last sentence gives a misleading impression.

Another quote from her video/post:

> So scientists say there’s nothing to worry. Well, they also said that smoking is good for you and alcohol doesn’t cross the placenta and that copies of you live in parallel universes.

If you actually follow the link to "they said smoking is good for you" you find it links to the The Stanford Research into the Impact of Tobacco Advertising (SRITA) collection. That's advertising, not scientists who said that smoking is good for you. Again a misleading quote/citation.

And this is not the first time she has done this. They are always plausibly deniable kinds of sentences, but the repetition of these memes makes me suspect she is purposefully pandering to a specific audience.


She cited sources for her two claims. I didn't read said sources, so it is of course possible she either misrepresented them or cherry-picked, but assuming she didn't, I'm not really sure what more you want from her.


> As a matter of fact mmWave isn't even 1% of the initial 5G ( 3GPP R15 ) spec

What a silly argument. I'm sorry about nitpicking on one minor point of a long post, but the percentage of the text of the standard is a ridiculous thing to bring up and very irrelevant.


> 5G has lots of things other than mmWave.

> ( Along with dozens of other things )

To name a few: Enhanced Mobile Broadband (eMBB), Ultra Reliable Low Latency Communications (URLLC), Massive Machine Type Communications (mMTC), massive MIMO (multiple-input, multiple-output), beam forming, etc.


mmWave has also been deployed here in Japan since 2020. I have a Galaxy Z Fold3 with mmWave (though I don't need it); it seems that only the US/JP models have the mmWave antenna. It's natural to misunderstand it as US-only, since the iPhone supports mmWave only in the US, IIRC. The only place I could test it was a mobile phone shop, because of where I live, and I got 2Gbps. No use case for now.


Small sample size, but I'm typing this on a Fold4 on a Verizon UW cell, inside a hotel in California. Admittedly I could probably hit the tower with a free hotel breakfast muffin thrown hard out the window.

Another data point: I speed-tested at more than 3Gbit/s down on UWB service on the street in Chicago.


If I'm buying into your line of reasoning, then you should concede that 5G is a terrible name for this suite of services, as anyone who has 5G home internet, or who understandably conflates 5G with mmWave, would say that 5G is junk because it cannot penetrate walls (or glass).


Then you are not getting 5G. In FR1 (sub-6GHz) we are not going to get much advantage; in the end there is a limit on the data you can send per Hz, and that does not increase from 4G to 5G (yes, you can use higher-order coding, but only over short distances, otherwise noise negates the advantage). To get the higher bandwidth predicted and expected, the industry has to move to FR2 (24GHz+/mmWave; I don't want to get into the discussion that mmWave formally starts at a higher band, the difference is splitting hairs). To support mmWave, standards bodies defined new standards and adopted technologies like beamforming, larger antenna matrices, etc. And that's really the difference between the 5G and 4G standards releases (along with a cloud-native core network with simpler IP routing, and the use of polar codes for the control plane). In 6G we are going to see THz bands allocated for mobile networks. Saying mmWave is not 5G is not accurate: you would not have needed thousands of pages of standards for FR1; with a little enhancement, LTE would have supported it.


I don't get your post. On the one hand you say that you need to go to FR2 due to higher throughput requirements. On the other hand you point out beamforming, MIMO, etc. as the real difference between 4G and 5G. But that's exactly the technology used to get higher throughput. You don't necessarily need more bandwidth to get higher throughput; there is a spatial degrees-of-freedom term in the MIMO version of the Shannon formula, so you can use several spatial paths to increase throughput.

I doubt we will see _real_ THz comms in 6G, maybe we go up to a few 100s of GHz, but not THz (I know many in the industry use THz to mean anything from ~50GHz, but that's just marketing rubbish).


Beamforming, MIMO, etc. are not exclusive to FR2, but without them you can't deploy it. At this high frequency the range is in the tens of meters with an antenna the size of a coin. To get the gain you need MIMO, and to get the distance you need beamforming. Both MIMO and beamforming have other advantages, but without these technologies it is almost impossible to deploy anything above ~50GHz. We will get mmWave, but it will take time; that's how the mobile world works. The high end of a targeted band is usually used at the end of a particular mobile generation or during the next one. BTW, mmWave is having adoption issues in the mobile world, not in FWA; my understanding is that both Lumen and Verizon are going ahead with mmWave bands for FWA.

Regarding THz, the FCC has already opened bands up to 3THz for 6G testing. I don't track it closely, but the next WRC might be planned for 2024 or 2025 (because of COVID all the schedules are screwed up). The next WRC will kick-start the standardization of 6G, and there is a good chance THz will be part of it. Although, as I mentioned earlier, my guess is it will be more than a decade before THz bands are in use for mobility.


Yeah, I never really got the connection between the two. Nowhere I've lived has mmWave, nor will it ever, probably.

It has no reach. It's for high density applications, which seems like a good idea, but is probably already covered adequately by wifi.


If what you’re saying is true then my meter should not be picking up any signals above 4 GHz correct?

So why is my meter picking up signals up to 40 GHz from the towers in my small tourist town?


that spectrum isn't for phones. https://www.ntia.doc.gov/files/ntia/publications/2003-alloch... will let you know how the spectrum is allocated.


except the diagram shows several mobile blocks right above 40ghz, so what do you mean?


"Mobile" frequency allocations don't necessarily mean mobile phones. It just refers to an allocation for devices that aren't transmitting from a fixed position.


I said "up to".


Probably a tower-to-tower link? It's sometimes cheaper to use a radio link rather than bringing cables to a new tower location. Or sometimes as a resilience measure.


The real problem with 5G is that it only has a few use cases. The big one is stadiums. Tens of thousands of people watching the game on their cell phones, each needing an independent video-bandwidth channel. (Anybody ever consider WiFi multicast for that? Most of them are watching the same stream, after all.) To get all that bandwidth in one place is the use case for line of sight millimeter microwave with large numbers of small base stations. The first places to get 5G base stations were stadiums.

After that, it tails off. Convention centers. Busy downtown intersections.

Of course China is ahead in this. They need it. China has ten cities with more people than New York. Most of the US has nowhere near the population density of coastal China. Not much of a use case for short range millimeter microwave.


5G isn’t just mmWave. The protocol stack is designed to be adaptive and capable of handling frequencies from the traditional 4G spectrum as well as mmWave. It’s also a rearchitecting of how network components are distributed that is intended to allow easier federation of services.

Regarding use cases for mmWave: mmWave exhibits the classic tradeoff of range vs. bitrate. mmWave makes a lot of sense anywhere short-range, high-bitrate communications fit the bill, such as home wireless access, for example. When it comes to RF pollution, the short penetration of mmWave is actually better than the sub-6GHz band of classic Wi-Fi.

Edit: one thing I am curious about is how energy consumption and EM pollution actually compare across a 4G and a 5G stack. I could see it going either way depending on protocol differences alone, but physically speaking, allowing higher frequencies and faster bitrates should serve to (1) reduce EM pollution and (2) improve the energy efficiency of the actual wireless comms.


The USA 600MHz band is a good example of 5G on the very opposite of mmW frequencies.


>Tens of thousands of people watching the game on their cell phones

I don't know that technology is really the solution here. Why bother going to the stadium to watch the game on your phone?


I’ve never heard of people going to a game to watch the same game on their phones. That’s not what it’s for. The real use case is communicating with others and uploading selfie videos.


Probably for things like replays, commentaries etc.


Yea, nobody does that. They do keep track of the other games happening at the same time though.


For the ambiance and the community.


China has not deployed or even allocated mmWave yet: https://5gobservatory.eu/eu-and-china-lagging-behind-in-mmwa... They are deploying mostly massive MIMO. MIMO can waste a lot of energy when capacity is not an issue; to save electricity, carriers would sometimes turn off 5G functionality at night in the beginning (they probably have better fixes now with equipment upgrades): https://news.cgtn.com/news/2021-08-01/Is-5G-a-waste-of-elect...


Those seem like important use cases, no? I was at Pokemon GO Fest in Seattle recently and they had a bunch of 5G antennas. For those that may not know, Pokemon GO Fest is a big in person event that concentrates thousands of people in a small area in order to play a AR phone game. 5G is what kept me and my group online. Previous GO Fests, before 5G was widespread, had lots of connectivity issues.

This argument strikes me as an analog of the "Nobody needs a gigabit line, 25Mbps is enough to stream Netflix in 4k." That position doesn't leave room for future use.


Pokémon Go fest wouldn’t be what I’d cite for importance.


I think the idea is that service providers should be able to support elastic bandwidth consumption including phenomenon such as Pokemon Go festivals, without impacting other, more critical communications.


Stadiums designed from the ground up with Wifi in mind are able to cope in a situation with lots of people using their phone at breaks in the action.

5G ultra wideband has to be one of the most over-hyped technologies in recent times. It has stupendously bad range and has made zero improvement to a person’s daily use of their mobile phone. Embarrassing that it was hyped as much as it was by Verizon and the like (I don’t hear about it nearly as much now).

I live in a metropolitan area and I don’t think I have ever had an ultra wide band connection.

Even 5G promises like putting compute closer to the edge fall short. Edge computing is merely sending packets to a Verizon data center in the local area (https://aws.amazon.com/wavelength/), which is only slightly better than sending to a computer running on something like Cloudflare that works independently of the cell phone network.


5G isn’t just about phones. It’s about IoT and all sorts of mobile (and non-mobile) devices. Having the ability to transmit more bandwidth, albeit at a smaller distance, is a big deal. Edge computing helps on the latency front, which is also a big deal: information needs to travel less distance than when routed over the internet (the laws of physics, particularly the speed of light, are the constraint here). In the communication world I’m not sure what could be more important than bandwidth and latency.


"5G isn’t just about phones. It’s about IoT..."

That line shows up in press releases:

"Imagine a world where car accidents are a thing of the past; where chronic health conditions like diabetes are managed 24-7 without blood sugar highs and lows; where smart homes unlock doors with a face scan, and then automatically adjust lighting and temperature and even order groceries for delivery before you run out of milk."

None of which require 5G. Those are all low-bandwidth applications, or even ones that run locally.


4G was heavily hyped for IoT as well, and even has specific standards: LTE-M and NB-IoT.

IMO the ongoing push to make IoT happen is less about the technology and more about trying to get the next 10B devices online and paying network access fees, software license fees, etc. As you mention, a lot of the proposed use cases are very low data (maybe a few MB/day on the higher end).


The question is really what applications require such low latency?

AR? Gaming? Perhaps.

But most applications are fine with 10-50ms RTT. I think edge compute will happen but it won’t be as ubiquitous as the telcos would like.


The latency to a Cloudflare or Cloudfront POP is going to be very close to a Verizon 5G data center.


There is actually a standard for video broadcasting over LTE: eMBMS.


I've replaced fibre internet with 5G. It's great: short-duration contracts with 500Mbps speeds (I even got 1Gbps at one point while on holiday), and I can take it anywhere.


I use 5G a lot when I'm "working from home", and my area has pretty good coverage. The one thing I've noticed is that unlike LTE there doesn't seem to be a correlation between connection quality and bandwidth. I can have a full 5G connection indicator on my phone in some places, but get slower speed than if I switch to 4G/LTE. I'm guessing my phone shows the connection quality to the local 5G node, but can't tell me when that node's uplink is degraded.


Do you play video games at all? I imagine the latency is much worse on 5G vs fiber. Have you noticed that?


Just did a speedtest.net run on my 5G link- 14ms ping, 1ms jitter, 500Mbit/s downlink. It’s pretty good.


It’s definitely worse, you’re not going to get single digit pings. That said, I used to play Battlefield a fair bit on it and it was largely fine but crapped out occasionally.


When my wifi breaks while playing Dota 2, I quickly USB tether my phone and use my mobile internet to continue playing the same game. Even with 3G connections, I only get an additional 10-15 ms of latency.


If they are at the stadium, why do they need to watch it on their phone? I think that is a misunderstanding of the stadium use case.


Have you been to a football game (or equivalent) in a big stadium? Depending on where you're sitting, the television experience is often quite frankly better than the live experience in many regards. If you're way up in the nose-bleed section, the cameras are definitely going to give you a better view, in terms of up-close shots, and angles you can't see from your seat. And some people want the commentary from the TV announcers as well.

Honestly, watching football on TV is better than the stadium for many reasons. Yet people want to go to the stadium to be part of an immersive experience (and so they can tell their friends "I was there when so and so broke the NFL rushing record", etc.). So going to the game and still watching a broadcast of the game simultaneously is a desirable experience for a lot of people.

And I'm reasonably sure the same basic principle applies to most other sports that are played in large stadiums.


As someone who doesn't go to any sport events - that's so weird. Also feels like an absolutely monumental waste of time.


To clarify a bit... I'm not saying people are watching their screens the entire time or anything. Although some might depending on their view of the field. But it's especially cool for looking at critical plays and potential (or actual) "video review" situations, where everybody wants to see "did he really get his foot down in-bounds?" or whatever. And of course the stadiums usually replay a lot of that kind of stuff on the big jumbo-tron deals that are fixed in place in the stadium. Still, there are plenty of times you might want a closer look at something and want to check the television broadcast (or some stadium specific internal feed, or whatever).

Whether it's a "waste of time" or not probably depends on your perspective. Some people think watching sports in general is a waste of time. Others think playing D&D is a waste of time. Somebody, somewhere, probably thinks posting on HN is a waste of time...


Someone mentioned they had the Formula 1 live stream on their phone as they were sitting in the grandstands. They'd listen to the commentary, and could watch instant replays with better details and possibly rewind if needed. Best of both worlds kinda.


Right but they want to take pictures and videos which then automatically get uploaded to Google etc.

Not a big problem if they can't do that. But since they are trying to do that it means anybody who truly needs to make a phone-call, or video-call, might not be able to do it. That may not sound so critical, but it can mean whether a phone-company keeps a customer or not. Customers pay for perceived value including reliability.


I'm under the impression there is bearer priority, so you would not be able to clog up voice past a certain point, supposing that is how it's configured. But yeah, the use case is basically density.


When I went to games as a kid, you would frequently see people watching the game with a radio and ear piece to get the commentary.


A lot of people are watching highlights of other games when they are at american football games since there's a bit of downtime between plays and most games each week happen at the same time.


Right, I'd say specifically the big problem is the "Gs," which to me appear to be fundamentally marketing and handwaving to distract from the much more efficient ways all of this could work if we could get incumbent telcos et al out of the way.


The text transcript could be improved:

> The fourth Generation of wireless networks, four G for short, is now being extended to five G, and six G is in planning.

Spelling out "four G" does not improve clarity. The sentence should be: The fourth generation of wireless networks, 4G for short, is now being extended to 5G, and 6G is in planning.

> GigaHertz ... Giga Hertz

Must be written as gigahertz.

> four hundred Mega Hertz

Should be written as 400 MHz; using number words doesn't improve clarity.

Also, the factual content could be improved in a few places:

> If you want to transfer more information through a channel with a fixed noise-level, you have to increase either the bandwidth or the power.

There's also beamforming and MIMO.

> If you took all the water in the atmosphere and put it on the ground you’d get about 2.5 cm. The clouds alone merely make a tenth of a millimeter.

To make the comparison easier, it should be written as 25.0 mm and 0.1 mm. Ironically, she linked to an original video that indeed uses millimetres.

> The European Commission has agreed on –42 decibel watts for 5G base stations. The FCC in the US set a limit at –20 decibel watt. This is a logarithmic scale, so this is more than 30 orders of magnitude above the limit the meteorologists ask for.

No, it's 3 orders of magnitude, or 1000×.
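For reference, decibels convert to linear power ratios as 10^(dB/10), so every 10 dB is one order of magnitude. A quick sketch (the 22 dB line is the gap between the two quoted limits; the 30 dB line just shows what three orders of magnitude looks like):

```python
def db_to_ratio(db):
    """Convert a decibel difference to a linear power ratio."""
    return 10 ** (db / 10)

# Gap between the FCC limit (-20 dBW) and the EC limit (-42 dBW):
print(db_to_ratio(-20 - (-42)))  # 22 dB, roughly 158x

# A 30 dB gap is exactly 3 orders of magnitude:
print(db_to_ratio(30))           # 1000x
```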


I think the transcription issues are likely artifacts from computer transcription software


I suppose it's a sign of how good her articles and videos are that the only stuff you can find that're wrong are details.

In the spirit of being pedantic (don't take too seriously):

"There's also beamforming and MIMO" No. Beamforming attempts to increase apparent power by changing parameters. One could just as easily say "moving sender and recipient closer". MIMO is also manipulation of sending and receiving antennae, and therefore irrelevant to the discussion about transmitting through a channel with a fixed noise level.

She did make a mistake about the number of orders of magnitude, though.


> "There's also beamforming and MIMO" No. Beamforming attempts to increase apparent power by changing parameters. One could just as easily say "moving sender and recipient closer". MIMO is also manipulation of sending and receiving antennae, and therefore irrelevant to the discussion about transmitting through a channel with a fixed noise level.

Yes beamforming changes the SNR at the receiver, by shaping the beam to be more "concentrated", i.e. focusing onto the receiver. It is highly relevant because yes by increasing the SNR at your receiver you do increase the throughput (albeit only logarithmically). MIMO is very relevant to the discussion, because instead of increasing bandwidth you increase dimensionality of your channel, which has the same effect (i.e. you increase the term in front of the log in Shannon's formula). I don't know why this should be irrelevant for a channel with fixed noise level (also it's probably more correct to say fixed SNR).
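To make the Shannon-capacity point concrete, here is a minimal sketch (the 100 MHz carrier, 10 dB SNR, and ideal 4-stream MIMO are illustrative assumptions, not numbers from this thread):

```python
import math

def capacity_bps(bandwidth_hz, snr_linear, streams=1):
    """Shannon capacity; `streams` models ideal spatial multiplexing (MIMO)."""
    return streams * bandwidth_hz * math.log2(1 + snr_linear)

B = 100e6  # assume a 100 MHz carrier

base       = capacity_bps(B, snr_linear=10)             # ~10 dB SNR
beamformed = capacity_bps(B, snr_linear=100)            # +10 dB SNR via beamforming
mimo       = capacity_bps(B, snr_linear=10, streams=4)  # 4 ideal spatial streams

# 10x the SNR slightly less than doubles capacity (it sits inside the log),
# while 4 streams exactly quadruple it (the pre-log factor).
print(beamformed / base, mimo / base)
```

This is why the two techniques are usually described as complementary: beamforming buys logarithmic gains, spatial multiplexing buys linear ones.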


Why isn’t beamforming a way of increasing effective bandwidth?


It is! But she's talking about "a channel with a fixed noise level".

Making new antennae, changing current antennae, moving them closer, aiming them differently, replacing them with an ethernet cable are all ways of increasing effective bandwidth, but those are outside of what she's talking about.


The receiver is part of the channel. So if you do beamforming your power at the receiver is higher, i.e. your SNR is higher. Any other interpretation does not make sense, because it's completely irrelevant to the discussion.


Sigh. The receiver is part of the channel. Exactly. Changing the parameters of the channel means you're no longer commenting on her statement - you're now making a new discussion. Her statement wasn't wrong, technically or otherwise, just because you can think of ways to increase information by changing parameters of the channel. I can think of ways to increase information, but absolutely none of them, suggested by you or by me, can change or invalidate her statement:

"If you want to transfer more information through a channel with a fixed noise-level"

Here, since this seems tough for you:

"If you want to execute instructions more quickly in a CPU, you can increase the clock speed."

You're saying, "but if you put in a new CPU that has higher IPC"...


The weather/5G frequency use conflict reminds me of the FAA 5G filter fiasco 8 months ago:

FAA Shows ‘Sample NOTAMs’ for Possible 5G Restrictions https://news.ycombinator.com/item?id=29694085

My comment at the time https://news.ycombinator.com/item?id=29696273

   This is all ridiculous, there's still a 200 MHz band guard between the FAA band and the 5G proposed band.

   Here is what a $1 ESP wifi dongle has to follow:

   https://en.wikipedia.org/wiki/IEEE_802.11

   "The mask requires the signal to be attenuated a minimum of 20 dB from its peak amplitude at ±11 MHz from the center frequency"

   So 2 dB/MHz filter.

   I let you do the math.

   FAA is just ridiculous here if they let old junk radio hardware handle safety landings for airplanes, but well after 737 max what do you expect... 
And obviously this was in line with reality: the FAA finally admitted it didn't do its job of preventing crap filters from being kept in planes for decades:

https://www.faa.gov/newsroom/faa-statements-5g

   Airlines and other operators of aircraft equipped with the affected radio altimeters must install filters or other enhancements as soon as possible."
Now I haven't looked in detail yet at this new frequency use conflict, and Sabine mentioned a scientific study that seemed legit.

One thing is different: around 20 GHz there's lots of frequency available (vs 5 GHz), so we could have a larger guard band without significant impact.
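Doing the math from my quote above, a deliberately naive linear extrapolation of the 802.11 mask's roll-off across the 200 MHz guard band looks like this (real filters flatten out long before this, so treat it as back-of-the-envelope arithmetic, not a filter design):

```python
# 802.11 mask: 20 dB of attenuation at +/-11 MHz from the center frequency.
rolloff_db_per_mhz = 20 / 11   # ~1.8 dB/MHz, the "2 dB/MHz" figure rounded
guard_band_mhz = 200

# Naive linear extrapolation across the whole guard band:
print(rolloff_db_per_mhz * guard_band_mhz)  # ~364 dB of attenuation, on paper
```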


The technical analysis you're putting out here is a bit of a canard. Yes, better filters help, but radars are really, really picky. After all, there's an inverse fourth power relationship between return strength and distance.

So we have our doppler weather radar transmitting at 450kW, traveling out a big distance to a storm (inverse square), and reflecting at very low efficiency, and traveling back (inverse square). Compare to a base station putting out 40W that's closer and just subject to inverse square law. It can easily be 10 orders of magnitude stronger. You need pretty good filtering for this.

> One thing is different: around 20 GHz there's lots of frequencies available (vs 5GHz) so we could have larger guard band without significant impact.

You need a much larger guard band. It's easy to make a 1MHz wide filter at 10MHz, and really hard at 100000MHz.

Another important thing: parasitics start to matter a whole lot, too. You can have a filter that sharply rolls off around your fundamental, but then above the resonant frequencies of your passives/filter elements become transmissive again. It's pretty hard to keep 20GHz out of a receiver that was designed for a lower frequency before 20GHz was a major concern.
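As a rough sanity check of the dynamic-range argument, here is a back-of-the-envelope comparison using the point-target radar range equation and the one-way Friis equation. All numbers are illustrative assumptions (real weather radars observe distributed targets, not a 1 m² point reflector); even with these charitable numbers the interference lands tens of dB above the echo, and weaker reflectivity or closer base stations widen the gap further:

```python
import math

c, f = 3e8, 23.8e9       # water-vapour sounding band (assumed)
lam = c / f              # wavelength, m
G = 10 ** (45 / 10)      # assumed 45 dBi radar dish gain

# Weather-radar echo, radar range equation (point-target form):
#   Pr = Pt * G^2 * lam^2 * sigma / ((4*pi)^3 * R^4)
Pt_radar, sigma, R = 450e3, 1.0, 100e3   # 450 kW, 1 m^2 target, storm at 100 km
echo = Pt_radar * G**2 * lam**2 * sigma / ((4 * math.pi)**3 * R**4)

# Base-station leakage into the radar, one-way Friis equation:
#   Pr = Pt * Gt * Gr * (lam / (4*pi*d))^2
Pt_bs, d = 40.0, 10e3                    # 40 W transmitter, 10 km away
interference = Pt_bs * 1.0 * G * (lam / (4 * math.pi * d))**2

print(10 * math.log10(interference / echo), "dB stronger")
```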


nitpick: it's not inverse fourth power, it's just inverse square with twice the distance, which is 1/((2 * distance) ^ 2) = 1/(4 * distance ^ 2)

(ignoring reflection efficiency, but that isn't determined by distance)


> nitpick: it's not inverse fourth power, it's just inverse square with twice the distance, which is 1/((2 * distance) ^ 2) = 1/(4 * distance ^ 2)

No. For a diffuse reflection, the amount of light hitting your target is inverse square. And then it is scattered and inverse square back on the return journey.

1/x^2 * 1/x^2 = 1/x^4.

https://en.wikipedia.org/wiki/Radar#Radar_range_equation

"In the common case where the transmitter and the receiver are at the same location, Rt = Rr and the term Rt² Rr² can be replaced by R^4, where R is the range."

(This all assumes that your target is smaller than the beam size, of course-- which is not as true for the two cases of a radar altimeter or a doppler radar as it is for e.g. tracking aircraft... but it's still close enough in practice).
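The two-way scaling is easy to sanity-check numerically (a tiny sketch, ignoring all the constant factors in the range equation):

```python
def echo_power(R, k=1.0):
    # Outbound spreading (1/R^2) times return spreading (1/R^2) = 1/R^4.
    return k / R**2 / R**2

# Doubling the range weakens the echo 16x, not 4x:
assert echo_power(2.0) == echo_power(1.0) / 16
```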


I have T-mobile in the Pacific Northwest and noticed very poor service with countless dead spots over the past few years.

I got a tip from a friend last month to try disabling 5G and use LTE instead. It’s cleared up 90+% of the issues I was experiencing.

5G to me is a marketing joke which made my reception significantly worse


Some aspects of 5G do improve normal cell bandwidth which for internet, basically improves reception. Other aspects of 5G really only make a difference when you are right nearby an antenna, like walking down a city street. The problem is that 5G isn't rolled out everywhere and your phone might try to connect to a 5G antenna when a 4G antenna is much closer and would provide a more robust connection. I rarely see more than 3/5 bars when on 5G but can get much better reception wherever when only using 4G. Only exception was a recent road trip on I-95, there was near constant full 5G coverage.


I remember doing the same thing (minus one) during the 4G/LTE transition. I forced 3G for several years for better battery life and more reliable signal.


That's because T-Mobile has been slow to deploy 5G in the PNW. Still one of the slowest markets to deploy, but it is getting better over time.


A trouble with 5G that nobody has mentioned in the comments is that we'll be redeploying hundreds of millions of new devices, which has a significant cost on the environment for very little benefit.


How do those costs and benefits compare to something like, say, coffee? In general, how should society decide whether a technology/product/service has too big an environmental cost?


The sober answer is that if you are living a 'regular' middle-class life in the developed world, your CO2 emissions are likely far far above sustainable thresholds as set by the UN. As such, buying a new phone before the old one has completely died is too big of an environmental cost.


Good question. Ideally we'd integrate environmental cost into prices and the market would determine if it's worth it or not, but that's currently not the case. Also I think the ratio marginal_benefit/environmental_cost for 5G is probably one of the lowest in history. If you think about it, it's almost a caricature.


I think people mention that for literally every single article which involves the physical world…and plenty that don’t.


Posts like this remind me that most of HN just likes to hear itself talk. Health conspiracy, incorrect information, linking posts to incorrect information...


Can water vapor be measured using the 5G background noise as a radiation source (akin to passive radar[0])?

[0] https://en.wikipedia.org/wiki/Passive_radar


this seems to be a well-written and rational article, and it's really too bad that a thing like "5G" is so prone to mass-culture conspiracy hysteria that it's impossible to have a discussion about concerns like these without everyone retreating to their corners, which snuffs out the "middle" where we could ask things like "hey, 5G might give us problems with weather forecasts, how do we work that out?". I can just see the "debunking" blog posts already, "debunking" things like "5G will ruin the weather!" or some idiocy.


> it's impossible to have a discussion about concerns like these without everyone retreating to "Their corners"

Which means that not only do advanced economies fail to reap the full benefit of the technologies they develop (because people resist them for wrong reasons), but also the people in those societies get harmed by flaws in the technology that could have been addressed if a proper open conversation were possible.

This means that conspiracy theories and misinformation end up doing double damage, making them the ideal weapon of authoritarian regimes, which have more control over their country's media and less of a technological advantage (so holding back technology in other countries helps them close the gap).

I'll leave it as an exercise to the reader to imagine how this might apply to the online discourse around Covid and vaccines, but I'll give a clue about which country might have the most to gain from misinformation about both that topic and about 5G technology:

https://www.nbcnews.com/business/consumer/factory-lies-russi...


I see two things that are valuable in 5G. Number one is campus networks (basically run your own cell network company-wide), which makes sense in some use cases (a production plant with a lot of IoT devices), and number two is the (potential) low latency. I feel both could be solved by different technologies, but a cell-based approach isn't a horrible idea.


Yeah, a 200 MHz guard band is massive. You can fit the entire FM band 4-5 times in that range. Also, water has a high attenuation factor at its resonant frequency, so radio waves wouldn't be used there because they would transfer a huge portion of their power into heating water.

As for use cases, it's a faster network in latency terms. There might be fewer use cases for it now, but there were no use cases for WiFi before WiFi existed. Once that infrastructure is built, people will use it. By definition it's impossible to get a latency of less than 10ms on a large portion of LTE networks. If you have a sensor that requires a response that fast, you simply won't use LTE because it's not possible to meet those mission requirements.

5G also has beamforming, whose goal is to reduce congestion and solve the penetration issues, but that is still being proven


> There might be fewer use cases for it now

There are massive use cases for it, but not at the people level. Low-latency tasks such as edge AI classification, IoT interaction, and game streaming are all currently limited to WiFi only.

> Once that infrastructure is built people will use it.

This is a fallacy.


I would argue that that statement is true because it's talking about 5G, whose features vastly outcompete LTE in many areas. I suppose there is a chance people won't use 5G, but I think it's really unlikely considering the standard has already been adopted and is being used by many big-name players.

All of this was to say that 5G has applications even if they might not be apparent to the OP, and that it's only going to be used more once people can actually access 5G technology. It's still in the early stages; even in areas which claim to be on 5G, for the most part it's 5G NSA mode, where the backing core network is still all LTE. I also feel as though OP was really talking down 5G, bringing up nonsensical technical problems and unproven medical problems that have no evidence.

You yourself pointed out several applications, but you're quoting the very advantage I was talking about, which was latency. We both agree it's extremely beneficial, but the overarching point is that we don't know all the things that will benefit from 5G because we have not observed them yet. While IoT, AI, and streaming will absolutely benefit, the benefit does not end there. There absolutely will be more areas that benefit, which is what I'm trying to communicate


Good studies on 5G-human health long term exposure?


https://www.europarl.europa.eu/RegData/etudes/STUD/2021/6900...

The review shows:

1) 5G lower frequencies (700 and 3 600 MHz): a) limited evidence of carcinogenicity in epidemiological studies; b) sufficient evidence of carcinogenicity in experimental bioassays; c) sufficient evidence of reproductive/developmental adverse effects in humans; d) sufficient evidence of reproductive/developmental adverse effects in experimental animals;

2) 5G higher frequencies (24.25-27.5 GHz): the systematic review found no adequate studies either in humans or in experimental animals.

Conclusions:

1) cancer: FR1 (450 to 6 000 MHz): EMF are probably carcinogenic for humans, in particular related to gliomas and acoustic neuromas; FR2 (24 to 100 GHz): no adequate studies were performed on the higher frequencies;

2) reproductive/developmental effects: FR1 (450 to 6 000 MHz): these frequencies clearly affect male fertility and possibly female fertility too. They may have possible adverse effects on the development of embryos, foetuses and newborns; FR2 (24 to 100 GHz): no adequate studies were performed on non-thermal effects of the higher frequencies.


> cancer: FR1 (450 to 6 000 MHz): EMF are probably carcinogenic for humans, in particular related to gliomas and acoustic neuromas

I looked into their data for glioma (see page 51, table 4). They used 8 total studies. 3 studies showed glioma to be more likely in mobile phone users, 2 showed no difference, and 3 showed glioma less likely in mobile phone users. From this they concluded "probably carcinogenic for humans, in particular related to gliomas"?

This is a long paper and I didn't read it all, but I'm not sure I see how that conclusion is supported.


I think you misinterpret the words. Equivocal means that it's unclear from the study if there's a causal association while negative means no association was found, not that EMF reduced incidence of gliomas.

Also, an important quote in the description following that table:

"The association of glioma and acoustic neuroma is stronger among long-term heavy users of mobile phones, which is also the most extensively investigated exposure source, and in some cases the onset of tumours was related to the side on which the device was handled."


As far as we know, non-ionizing radiation is safe for humans if the intensity is not too high.

* if the intensity is not too high

Don't put your head inside a microwave. Don't hug the transmitter of an antenna that broadcasts TV or radio. ...

* non-ionizing radiation

Gamma rays, X-rays and some UV rays are dangerous. Try to avoid them and keep to a low dose for important medical treatments. Use sun protection to block UV rays.


Are you kidding? Didn't you even see the video? You people think that there are all these studies out there saying it's safe, when there aren't any good-quality studies saying it's safe. You have no grounds to say what you did.


From the transcription:

> Now, as I said, there’s no reason to think that five G is harmful. Indeed, there’s good reason to think it’s not [harmful], because millimeter waves have been used in medicine for a long time and for all we know they only enter the upper skin layers.

Are you afraid of red LEDs?

Each photon of a red LED has like 1000x the energy of a photon of 5G. Most of the damage is caused by the energy of each photon, so red photons are more dangerous.

Also, a red LED against your skin has more power than a 5G antenna far away. You can get hurt when there are really a lot of photons, but both have too little power to be dangerous.

And a normal light lamp has even more power and higher-energy photons, and it's safe.

And sunlight has even more power and higher-energy photons, and it's safe if you filter the UV-B rays, which are the ones with more energy.


Since a red LED has a lower wavelength than even 4G, by definition it has less power. I don't know what you're talking about here at all.

And neither of us can say it's safe because, as she pointed out, there aren't enough studies to conclude that it's safe.


A red LED (or really any LED, due to the physics of the type of energy transitions used to produce light with an LED) has a much shorter wavelength (significantly higher energy photons) than any type of EM wave used for cellular connectivity (including the mm-wave stuff that isn't being deployed very widely).


You should read up on electromagnetics.

c = λf. The wavelength of a red LED is ~630 nm, so the frequency is ~480 THz. The energy of a photon is E = hf, so it is proportional to the frequency. Therefore a red LED photon has approximately 20,000 times the energy of a 25 GHz 5G photon.
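The arithmetic checks out (a quick sketch using the 630 nm and 25 GHz figures above):

```python
h = 6.626e-34            # Planck constant, J*s
c = 3.0e8                # speed of light, m/s

E_red = h * c / 630e-9   # red LED photon at ~630 nm: E = h*f = h*c/lambda
E_5g  = h * 25e9         # 25 GHz mmWave photon: E = h*f

print(E_red / E_5g)      # ~1.9e4, i.e. roughly 20,000x
```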


You should read what I said instead of reading physics. I'm talking about the wavelength energy, not the photon energy. The energy of the wavelength is inversely proportional to the energy of the photon. Are you purposely trying to obfuscate this conversation?


Can you explain what you mean by "wavelength energy" or "energy of the wavelength"? I can't find any useful hits on Google.

Admittedly I only have a high-school level knowledge of this, but my understanding is that (as the parent said) higher frequency = higher energy = lower wavelength. Hence visible light has more energy than any sort of microwave.


> The energy of the wavelength is inversely proportional to the energy of the photon.

Did you mean the "length of the wave" instead of the "energy of the wavelength"? In that case we all agree.


Is a photon a wave or a particle? Is its energy in the frequency or the amplitude?

I am using amplitude to describe the energy of a wave; you are using frequency.

EDIT: To expand on this further, you are materialists and think light is a particle, a photon. But light is not a particle, it is a wave and it always is a wave, until we measure it. Objectively it is a wave (a probability), subjectively it is a particle (a certainty).

What you are measuring is what you measure, so you cannot see the effects of what you are not measuring.

Because this radiation is non-ionizing you think it is harmless. That is your folly and not mine.

Because you think the energy of the wave lies only in the photon, you cannot see the energy of the wavelength.


> But light is not a particle, it is a wave and it always is a wave, until we measure it.

It's more complicated. Anyway, when light collides with your skin, it counts as a "measurement". Most of the time, the light collides with an electron of your skin, and the energy that the electron gets is the same energy that it would get in the photoelectric effect. https://en.wikipedia.org/wiki/Photoelectric_effect

> Is it's energy in the frequency or the amplitude?

Both. You can calculate the energy using the amplitude, and that energy determines how many photons you will count in a photoelectric experiment.

The wavelength determines the energy of each photon. The amplitude determines the total amount of energy in all photons.

> Because this radiation is non-ionizing you think it is harmless.

Note that there are two ways in which light can be dangerous; let's call them "cancer" and "cooking".

As far as we know, non-ionizing radiation does not cause cancer.

You can cook something/someone using non-ionizing radiation, but the power of the 5G antennas is not enough at distances where people can go.


Long term is difficult, because it's a relatively new thing. Even 4G and 3G haven't been around very long. So we don't really know, but there are legit concerns about its safety:

https://blogs.scientificamerican.com/observations/we-have-no...

https://pubmed.ncbi.nlm.nih.gov/31991167/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7727890/


The Signal Path has done a really good job breaking this down - skip to 3:00 if you want to avoid the preamble. https://www.youtube.com/watch?v=R0xwyVlqsRo


No, that was not good at all. He’s not talking about any studies, only his theories, which only talk about non-ionizing radiation. There’s already plenty of studies that show there’s reactive oxygen species created in the skin from millimeter wave technology.

https://www.sciencedirect.com/science/article/abs/pii/S14384...


The "studies" of the type you link to typically require exposures of many orders of magnitude higher power than what is produced by a phone in order to observe any effect. This is the crux of Shahriar's analogy of not worrying about getting a sunburn in a dark room.

> He’s not talking about any studies, only his theories, which only talk about non-ionizing radiation.

mmWave, by definition, is non-ionizing radiation. To talk about ionizing effects of mmWave is to fundamentally misunderstand the topic being discussed. And as a scientist specializing in 5G and mmWave technology at Bell Labs, I think he's more than qualified to have an informed opinion on this topic.


There are other ways for EMFs to affect a biological system besides the ionizing radiation. One of them is by disrupting voltage gated ion channels.


At the mmWave power levels used in mobile handsets, the RF power is insufficient to penetrate the human body to any significant degree. Even airport mmWave security scanners can barely scan through clothing - and they operate at much higher power.


You are making several assumptions which have not been scientifically studied, which is the only point I am making.

mmWaves do penetrate the skin and the eyes, that much we know. What effect that has on the body as a whole, we do not. You can make all the assumptions you want, but until they are studied they are just assumptions.

And I do not mind walking through a mmWave security scanner for less than 10 seconds. But would you want to stand in one for eight hours a day while you were at work or at home sleeping?

And again and again I have to point out that saying it is non-ionizing radiation is pointless if there is another method by which it may disrupt biological systems, such as triggering voltage-gated ion channels.


Don't we also need to confirm that long term exposure is also safe for everything else in our ecology? Soil Bacteria, Insect populations, and all the other plants and animals we depend on?


Is 5G good for super-short distances that you can’t for whatever reason use a cable for? Imagine needing to go through a solid wall, (of wood and drywall), and only 2 to 5 inches thick, then Ethernet on either side of that. Would 5G be the ideal way to not drop too much in speed, or is there something better for that sort of scenario?


Seems like Wifi solves this?


WiFi has much lower channel capacity (ever tried dealing with WiFi in a larger apartment block where everyone has at least one AP?)


For those of you that have migrated to 5G, have you found it useful? Do you feel bandwidth limited by 4G? Do you run into oversubscribed areas where you don't also have WiFi as an alternative?

My biggest hurdle has always been access in remote areas, not bandwidth.


Apple's USA iPhone SE3 does not have mmWave radio/antenna support.

Previously, all USA 5G-enabled iPhones included support for mmWave.


Yeah it could hamper weather predictions, but it also enables us to receive emergency notifications in the first place!


Sabine has been so rock solid on everything I've seen from her that I tend to take her seriously.


This post was near the top of page one a few minutes ago; now it's on page 3 or 4. If you think this article, by a renowned scientist, is wrong or misinformation, then the way to deal with that is to address it and explain why. Not bury it.


There's no trouble, just purchase one of my patented tin foil helmets to prevent the transmitter that was installed during your covid vaccination to prevent it from reporting your thoughts to The Agency


Funny, only today I tweeted these parody lyrics to “Welcome to 5G Networks” (a parody of a parody linked in https://twitter.com/rcarmo/status/1566402880504041472?s=21&t...):

Welcome to 5G networks, please enjoy your stay

Endless discussions about the state of play

We’ve got endless features, some good, some weird

And lots of little quirky bugs that we’ve engineered

Welcome to 5G networks, log on and take a chance

You can have your phone roam or do the coverage dance

Your radio is abysmal, it’s… not optimized

But throw it up on 3GPP and we’ll call it standardized

Welcome to 5G networks, you’ll never feel alone

Debug Chinese radios or inspect packets whole

Ericsson? Nokia? Which one do I choose?

Just pick a third party that has the least SKUs

Welcome to 5G networks, be sure to run your fiber

Duplicate an incumbent network at the whim of the regulator

We’ve got timelines and roadmaps and radio test plans

So you can bill for ringtones nobody wants

Edit: why the downvotes?


> why the downvotes?

That gets an instant downvote



