Summary
LinkedIn News recently highlighted my post as an Editors’ Pick in its story, “Research shows telecom weak spots exploited for spying”.
That was flattering. But the more interesting part is why the angle resonated. The post argued that telecom location privacy should not be understood only as an app-permission issue. Sometimes the more important question is not what the phone revealed, but what the network exposed.
That distinction is becoming harder to ignore. Citizen Lab’s latest research, TechCrunch’s reporting, Ofcom’s action on Global Titles and the Telia Norway SIP/IMS case are different stories, with different protocols and different evidence. They should not be collapsed into one claim.
But they do point toward the same infrastructure lesson:
Protocol correctness is not the same as privacy protection.
Why LinkedIn News picked up the angle
LinkedIn News covered Citizen Lab’s research into surveillance campaigns that exploited weaknesses in telecom infrastructure to track people’s locations. The story described vulnerabilities in SS7 and Diameter, and placed the findings in the broader context of surveillance abuse within the telecom sector.
My post was selected as an Editors’ Pick because it made a related point: location privacy is not only about GPS, app permissions or malware on a device. It is also about signaling systems, network metadata and trust boundaries inside mobile infrastructure.
That matters because a user can do everything “right” on the handset and still be exposed if the network layer leaks too much context. The phone may not be the thing leaking your location. The network might be.
What Citizen Lab added
Citizen Lab’s “Bad Connection” report is important because it moves the discussion from theoretical telecom weaknesses to observed surveillance activity. The report identifies two surveillance campaigns and links them to real-world attack traffic and mobile operator signaling infrastructure.
That is the uncomfortable part. This is not only about whether SS7 or Diameter have known weaknesses. That has been discussed for years. The deeper issue is governance: who is trusted inside the telecom ecosystem, who can obtain access to signaling routes, and what kind of sensitive information those routes can expose.
Citizen Lab also points to a structural problem. Telecom signaling protocols were originally designed for a smaller and more trusted operator environment. Today, that environment includes a much larger ecosystem of mobile operators, roaming relationships, intermediaries, SMS providers, private networks, leased identifiers and commercial access arrangements.
A trust model that made sense in a smaller telecom world becomes harder to defend when access paths multiply.
What TechCrunch made visible
TechCrunch’s reporting helped translate Citizen Lab’s findings into a broader public-interest story. The key point was not only that surveillance vendors allegedly abused weaknesses in global cellular infrastructure. It was that location tracking can happen without the familiar consumer-security narrative.
No suspicious app permission prompt. No spyware necessarily installed on the phone. No obvious warning to the user.
That distinction matters. If the public discussion focuses only on malicious apps, device compromise or Pegasus-style spyware, then we miss a deeper infrastructure layer: the systems beneath the handset that make mobile networks work.
That is where the telecom privacy discussion needs to go next.
Where the Telia Norway case fits
The Telia Norway case is not the same as the Citizen Lab campaigns. There is no public evidence that the Telia Norway SIP/IMS issue was connected to SS7, Diameter, Global Titles, Citizen Lab’s campaigns or commercial surveillance vendors.
The connection is architectural, not evidentiary.
But that architectural connection is important. In the Telia Norway case, location-related access-network information was exposed through VoLTE/SIP call signaling. The issue was not that someone installed spyware on a phone. It was that valid telecom signaling carried sensitive contextual information across a boundary where it should not have been exposed.
That makes the case relevant to the broader trust-boundary discussion. It shows that location-sensitive metadata can appear in places where ordinary users, and often even ordinary security controls, are not looking.
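The Telia Norway article does not specify which SIP element carried the access-network context, so treat the following as an illustrative sketch only. A well-known vector for this class of exposure is the SIP P-Access-Network-Info header (RFC 7315), which can carry the serving cell identity in VoLTE signaling. A boundary filter that strips such headers before signaling leaves a trust domain might look like this:

```python
# Illustrative sketch: strip location-bearing headers from SIP signaling
# before it crosses a trust boundary. The header names below (e.g.
# P-Access-Network-Info from RFC 7315) are common examples of this
# exposure class, not confirmed details of the Telia Norway case.

SENSITIVE_HEADERS = {"p-access-network-info", "cellular-network-info"}

def strip_sensitive_headers(sip_message: str) -> str:
    """Remove headers that can leak access-network (cell) context."""
    head, sep, body = sip_message.partition("\r\n\r\n")
    kept = [
        line for line in head.split("\r\n")
        if line.split(":", 1)[0].strip().lower() not in SENSITIVE_HEADERS
    ]
    return "\r\n".join(kept) + sep + body

# Hypothetical VoLTE INVITE carrying a cell identity in its signaling.
invite = (
    "INVITE sip:bob@example.net SIP/2.0\r\n"
    "From: <sip:alice@example.net>\r\n"
    "P-Access-Network-Info: 3GPP-E-UTRAN-FDD; utran-cell-id-3gpp=2420100012345678\r\n"
    "Content-Length: 0\r\n"
    "\r\n"
)
print(strip_sensitive_headers(invite))
```

In real deployments this kind of filtering lives in session border controllers at interconnect points; the point of the sketch is only that the message remains a perfectly valid INVITE after the location-bearing header is removed.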
Different protocols, same class of problem
SS7, Diameter and SIP/IMS are not the same thing. They belong to different parts of the telecom stack and should not be casually mixed. SS7 underpins signaling and roaming in 2G and 3G networks. Diameter carries much of the equivalent inter-operator signaling in 4G. SIP/IMS handles IP-based voice services such as VoLTE.
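The layering can be sketched as a small reference map. The example exposure mechanisms below (MAP AnyTimeInterrogation, Diameter Insert-Subscriber-Data, the SIP P-Access-Network-Info header) are well-known instances from each protocol family's specifications, not claims about the specific cases discussed in this article:

```python
# Reference sketch: where each signaling family sits in the stack and
# one known way each can expose location. The mechanisms named here are
# illustrative examples from the protocol specs, not findings from the
# Citizen Lab, Ofcom or Telia cases.

SIGNALING = {
    "SS7/MAP": {
        "era": "2G/3G core signaling and roaming",
        "example_exposure": "MAP AnyTimeInterrogation returning cell identity",
    },
    "Diameter": {
        "era": "4G inter-operator signaling (e.g. S6a roaming)",
        "example_exposure": "Insert-Subscriber-Data requesting current location",
    },
    "SIP/IMS": {
        "era": "IP-based voice (VoLTE) call signaling",
        "example_exposure": "P-Access-Network-Info header carrying cell ID",
    },
}

for proto, info in SIGNALING.items():
    print(f"{proto}: {info['era']} -> {info['example_exposure']}")
```

Three different stacks, three different message formats, and yet each row ends in the same place: a cell-level location leaving the network.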
The technical details matter. But different protocols can still expose the same class of governance failure:
- too much implicit trust between systems
- insufficient filtering at network boundaries
- sensitive metadata treated as operational context
- protocol-valid messages carrying privacy-sensitive information
- monitoring focused on service availability rather than metadata exposure
- telecom infrastructure behaving correctly while still revealing too much
That is the common thread. The risk is not always that a protocol “breaks”. Sometimes the risk is that it works exactly as designed, but the design assumes a trust environment that no longer exists.
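The distinction between protocol validity and privacy exposure can be made concrete with a toy check. The field names and message shape below are invented for illustration; in practice this separation of concerns lives in signaling firewalls and interconnect policy, not application code:

```python
# Toy sketch of the gap between "protocol-valid" and "privacy-safe".
# All field names are invented for illustration.

REQUIRED_FIELDS = {"msg_type", "origin", "destination"}            # protocol check
LOCATION_BEARING = {"cell_id", "tracking_area", "access_network"}  # privacy check

def protocol_valid(msg: dict) -> bool:
    """The message is well-formed by the (toy) protocol's rules."""
    return REQUIRED_FIELDS.issubset(msg)

def privacy_exposing(msg: dict) -> set:
    """Location-bearing fields the message would carry across the
    boundary, even though the message itself is perfectly valid."""
    return LOCATION_BEARING & msg.keys()

msg = {
    "msg_type": "provide-subscriber-info-response",
    "origin": "partner-network",
    "destination": "home-network",
    "cell_id": "242-01-1234-56789",
}

assert protocol_valid(msg)        # passes every protocol rule...
print(privacy_exposing(msg))      # {'cell_id'}  ...and still leaks location
```

The point of the sketch is that the two checks are independent: a boundary that only asks "is this message valid?" will wave the location through every time.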
The lesson: valid is not safe
A signaling message can be valid. A header can be allowed. A route can work. A system can behave according to specification. And the result can still be a privacy failure.
That is the uncomfortable lesson across these cases.
Telecom privacy cannot be reduced to whether the handset is secure or whether an app requested location permission. The network itself can reveal location-sensitive context if signaling systems, interconnects and operational boundaries are not designed around modern privacy expectations.
This is where telecom security and privacy meet infrastructure architecture. The question is no longer only:
Did an app access your location?
It is also:
What did the network reveal about you?
Why this matters now
The fact that LinkedIn News highlighted this angle is a useful signal. It suggests that telecom privacy is moving out of a narrow specialist discussion and into a broader professional conversation about infrastructure trust.
That is where it belongs.
The future of privacy will not only be decided by app stores, permission dialogs and endpoint security. It will also be decided by the invisible systems that connect devices, operators, countries and networks.
Telecom signaling was built to make mobile networks function across borders. Now it has to be re-examined as a privacy boundary.
Sources and further reading
- LinkedIn News: Research shows telecom weak spots exploited for spying
- Citizen Lab: Bad Connection
- TechCrunch: Surveillance vendors caught abusing access to telcos to track people’s phone locations, researchers say
- Previous article: Telecom signaling is still a location privacy problem
- Telia Norway case: Telia: Location data leaked through telecom signaling