Two things have happened in Norwegian digital governance over the past year that, looked at separately, are easy to defend on their own terms. Looked at together, they describe an architectural drift that should worry anyone who cares about how a constitutional state actually works.
The first is the practice — documented in detail by Wiersholm partner Bettina Banoun in an opinion piece originally published in Dagens Næringsliv and subsequently in Advokatbladet — by which the Norwegian Tax Administration, in administrative book audits of companies, demands access to and copies of mobile phones belonging to employees, board members and managers. There is no court order at the moment of intrusion. There is no requirement of criminal suspicion. The phones are copied in their entirety, including content that has nothing to do with the company under audit. The Tax Administration itself acknowledges that the copy will contain attorney correspondence, personal data and health information, and undertakes to filter this out after the copy has been made. As of April 2026, the practice has been tested before the Supreme Court appeal committee — and upheld, over a notable dissent — on reasoning that explicitly rests on the assumption that the after-the-fact filtering works as described.
The second is the DSOP-Politi service — Digital Samhandling Offentlig Privat, the digital interface between Norwegian banks and the police — which has been operational for fully automated production orders against bank data since version 2.1 of the specification was published in August 2025. The prosecution service issues a digitally signed JSON Web Signature production order. The bank validates the signature and the format. The data flows back through a documented API in seconds. Bits AS, which operates the standard, projects that the volume of these requests will rise from approximately 23,000 per year to approximately 300,000 per year — a roughly thirteen-fold increase.
Each of these has its own legitimacy story. Tax control is a real public interest and the Tax Administration has every right to detect fraud. Police investigation of serious crime needs faster, more reliable access to evidence than the previous paper-based process delivered. Both stories are true. Neither story is the architecture story.
The architecture story is that, in two of the most invasive measures the Norwegian state can take against a person — copying their phone, or pulling a decade of their financial history — the moment of intrusion now happens before any court is in the loop. The court has not been removed. It is still available afterwards, when the citizen objects. But it is no longer a precondition. It has been re-engineered from a gate into an exit ramp.
This piece is about how that drift works in each pipeline, why "filter after the fact" is a technical fiction in both cases, and what a constitutional state actually requires architecturally, as opposed to rhetorically.
Story one: The Norwegian Tax Administration and the unannounced phone copy
The Tax Administration practice that Banoun describes is concrete enough to read at the technical level. In administrative book audits of companies, tax inspectors arrive unannounced and demand access to mobile phones belonging to employees and board members. Banoun, who has acted in several such cases, characterises some of them as "fishing expeditions" — audits conducted without specific suspicion of wrongdoing.
The mechanism is that the entire phone is copied. The Tax Administration's own technical justification, as paraphrased by Banoun and confirmed in the agency's formal response, is that it is "not practically possible" to filter relevant material from irrelevant material at the time of the copy. The filtering is performed afterwards, by Tax Administration staff, before the case handlers see the material.
There is, per Banoun, an unpublished decision from the Norwegian Tax Directorate establishing that the entire phone may be copied even where only one per cent of its content concerns the company under audit. There is a documented case where a phone used purely privately was copied. Employees and board members who refuse to surrender their phones have, according to Banoun, been referred to the police, with the Tax Administration seeking to penalise individuals for non-cooperation with administrative control.
The Tax Administration's defence
The Tax Administration has not been silent on the criticism. On 15 April 2026, the head of the agency's Tax Crime division, Erik Nilsen, published a formal response in Dagens Næringsliv, also posted to the Tax Administration's own press page, explicitly defending the practice. Nilsen's argument, in translation from the Norwegian original: smartphones and PCs that are connected to the company's cloud services and used in the operation of the business are part of the company's archive, and can contain information that is central when the Tax Administration conducts an audit. Access to copy such storage devices is, in Nilsen's framing, absolutely essential for the agency to be able to conduct book audits at companies. The legal authority cited is the Tax Administration Act.
On the privacy concern, Nilsen's response holds that sensitive information — and other content not containing business-related information — is filtered out before the material is made available to case handlers. The translation is faithful to the original; the operative claim is that downstream filtering separates the private content from the business content.
The statutory authority the Tax Administration relies on is the Tax Administration Act § 10-4 second paragraph, last sentence. In English translation: "When going through the company's archives, the tax authorities may copy to a storage medium for later review either at the premises of the party with the disclosure obligation or at the tax authorities."
The courts have so far agreed — with one important dissent
This was tested in court in early 2026. In a case where the Tax Administration copied the personal mobile phone of a managing director during a tax audit, the company sought a temporary injunction barring the Tax Administration from reviewing the copy until the legality of the order had been finally determined.
Borgarting Court of Appeal ruled in January 2026 (LB-2025-208819) that where a company has organised its affairs such that information it is obliged to disclose only exists on a privately-owned device, that information forms part of the company's "archive" within the meaning of the Tax Administration Act. The court further held that this was not a disproportionate interference with the right to privacy under ECHR Article 8.
The managing director appealed. In April 2026, the Supreme Court appeal committee declined to admit the appeal, two votes to one (HR-2026-853-U). Justices Bergljot Webster and Ragnhild Noer formed the majority. Justice Kine Steinsvik dissented.
The reasoning of the majority is the most important detail in this entire architectural debate. In the operative passage of the ruling, the committee wrote — translated here from the Norwegian original:
In its assessment, the appeal committee majority places decisive weight on the fact that, per the description, information which is not part of "the company's archives" under § 10-4 second paragraph — and is therefore not relevant to the control — is to be screened from review. This includes, among other things, private information, and makes the interference with A's right to privacy under ECHR Article 8 markedly smaller than it would otherwise be.
In other words, the majority's proportionality assessment under ECHR Article 8 turns on the assumption that information which is not part of the company's archives — including private information — will be screened from review. This screening, they hold, makes the interference with the right to privacy "markedly smaller than it would otherwise be".
The majority then explicitly conditioned their ruling on this assumption (also translated from the Norwegian):
The committee presupposes that the control is conducted in such a way that the screening, both formally and in reality, covers all private content on the mobile phone, and that only the portion of the information that concerns the companies and is relevant to the control is accessible to the inspector.
The verb the majority uses — "presupposes" — is doing real work here. The committee is assuming, as the load-bearing premise of its reasoning, that the screening covers, both formally and in reality, all private content on the phone, with only company-relevant content accessible to the inspector.
The dissent identifies the gap precisely. Justice Steinsvik writes, in translation: "The minority cannot see that the tax authorities' access to inspect the mobile phones of private individuals during ordinary administrative control, where there is no suspicion of tax evasion, has been considered by the legislator." This is the dissent's central point: that the question of whether ordinary administrative control should include phone access without prior suspicion has never been put to, and resolved by, the Norwegian legislature.
Nilsen, in his Dagens Næringsliv response, expressed the Tax Administration's reading of the broader debate: that copying digital storage devices is something the agency has always done, and that this cannot be compared, in his framing, to the continuous camera surveillance Banoun's piece invoked from Orwell.
Why this makes the architectural argument stronger, not weaker
This is the moment where the architectural critique stops being a policy preference and becomes a load-bearing premise of the legal regime itself.
The Supreme Court appeal committee did not rule that copying a phone wholesale is unobjectionable. They ruled that copying a phone is constitutionally acceptable because the screening procedure exists and works. The proportionality assessment under ECHR Article 8 turns, in their reasoning, on the assumption that the screening reliably separates private content from company content — sufficiently that the actual intrusion experienced by the citizen is "markedly smaller than it would otherwise be".
This means the technical question of whether the "filter-afterwards" arrangement works as advertised is not a side issue. It is the foundation on which the constitutional permissibility of the practice currently rests. If the screening is operationally robust, the committee's reasoning holds. If the screening is, in practice, a soft procedural control that does not eliminate lateral knowledge, contextual information leakage, institutional incentives to use what was seen but ostensibly filtered out, or simple discoverability through future search and retrieval inside the agency's data store, then the committee's reasoning has been built on a premise that does not technically obtain.
This is the legal regime making itself dependent on a technical assertion that nobody outside the Tax Administration has independently verified. The screening procedure is unaudited by the Norwegian Data Protection Authority, unspecified in the Tax Administration Act itself, and operationally defined entirely by the Tax Administration's own internal practice. It is the institutional version of "trust us — we filter".
The dissenting justice anticipates exactly this. Steinsvik's point is that the legislature never considered whether ordinary administrative control should include phone access without prior suspicion. The screening is doing the work that, in the dissent's reading, would otherwise need to come from a fresh legislative judgment about the appropriate balance between effective tax control and the constitutional protection of private life. The screening is the bridge between an old statutory authority and a new technical reality, and the bridge is being built and inspected by the same party.
This is a different kind of structural problem from "the practice is illegal". The position is: even with the Supreme Court appeal committee on its side, the Tax Administration's practice depends, at the constitutional level, on a screening procedure whose technical robustness is currently unverified by any independent body. That is a structural concern, and architectural and technical analysis is precisely the expertise needed to address it.
A modern mobile phone is not a filing cabinet, a binder of receipts, an accounting system, or a company server. It is a single physical object that contains, by current technical reality: BankID and other authentication credentials, two-factor authentication seeds, password manager content, the synchronised contents of Signal, WhatsApp, Telegram and SMS conversations, photographs and videos, location history and movement patterns, calendar entries spanning private and professional life, health and fitness data, communications with attorneys, doctors, family and minors, social media drafts and direct messages, browsing history, and the operational metadata that ties all of this together. It is the physical embodiment of a person's digital life.
The technical claim that "we will filter the irrelevant material out afterwards" is not, on inspection, a meaningful safeguard. Once a forensic image of a phone exists, the data is taken. The custodial party has the data. The filtering is an institutional promise, not a cryptographic guarantee. Even if the filtering is performed by personnel separate from the case handlers, the existence of the unfiltered image inside the agency creates a category of access — and a category of risk — that did not exist before the copy was made. The filtering does not retroactively un-make the data collection. It produces, at best, a smaller second collection inside the larger first one. The first collection still exists.
This is the same principle that European data protection law expresses through the data minimisation requirement of GDPR Article 5(1)(c) — the requirement that personal data be limited to what is necessary in relation to the purposes for which they are processed. Norwegian companies of all sizes are required to demonstrate compliance with this principle. The current Tax Administration practice is its architectural inverse: maximisation, with discretionary minimisation downstream. There is no reading of Article 5(1)(c) that this satisfies.
The burden of the practice falls disproportionately on small and medium-sized businesses. In a small company the same individual is often founder, managing director, finance lead, customer contact, board chair, support, and HR. The same phone is used for company business and for everything else. There is no IT department to issue separate work devices, no compliance team to negotiate the scope of disclosure, no general counsel on retainer. The unannounced demand is functionally non-negotiable in the moment.
That is story one. It happens without a court order, without a suspicion threshold, and at the speed of one inspector with a copy tool.
Story two: DSOP-Politi and the production order in JWS
DSOP — Digital Samhandling Offentlig Privat, in English roughly "Public-Private Digital Cooperation" — is a programme between the Norwegian financial industry and public agencies. Its Politi-Utlevering service ("Police Disclosure") is the operational digital channel for production orders under section 210(3) of the Criminal Procedure Act, directed at banks for account information. The legal authority for these orders rests with the prosecution service. For account information specifically, the prosecution service issues the order on its own authority — police prosecutors and police lawyers acting within the scope of the statute — and does not require a prior court ruling. The court enters only if and when the order is contested.
The Bits AS technical specification for DSOP-Politi describes the mechanics in detail. The prosecution service composes a production order as a JSON Web Signature data object containing the case number, the legal basis cited (the order specifies whether the request is made under § 210(3) first sentence for criminal cases, or second sentence for missing-persons cases), the time window, and the endpoints to which access is granted. The order is digitally signed. It is transmitted to the bank through a Maskinporten-authenticated API call. The bank's compliance task is to validate three things: that the signature on the order is valid, that the format conforms to the specification, and that the request from the prosecution service is consistent with what the order itself authorises. The bank does not, and is not asked to, evaluate the substance of the underlying suspicion.
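The compose-sign-validate flow can be sketched end to end. This is an illustration only, not the Bits AS schema: the payload field names (caseNumber, legalBasis, fromDate, endpoints) are invented for the example, and it uses a symmetric HS256 signature for brevity where the production system presumably uses asymmetric keys over Maskinporten-authenticated channels. What the sketch does capture is the shape of the bank's compliance task: signature validity and format conformance, nothing about substance.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_order(payload: dict, key: bytes) -> str:
    """Produce a compact-serialisation JWS (HS256) over the order payload."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload, sort_keys=True).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def validate_order(token: str, key: bytes) -> dict:
    """The bank's check: valid signature, conformant format. Not substance."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    pad = "=" * (-len(body) % 4)
    payload = json.loads(base64.urlsafe_b64decode(body + pad))
    # Format conformance: required fields must be present.
    for field in ("caseNumber", "legalBasis", "fromDate", "endpoints"):
        if field not in payload:
            raise ValueError(f"missing field: {field}")
    return payload

# Hypothetical order -- field names are illustrative, not the Bits AS schema.
order = {
    "caseNumber": "12345678",
    "legalBasis": "strpl. § 210(3) first sentence",
    "fromDate": "2016-01-15",
    "endpoints": ["accounts", "transactions"],
}
key = b"shared-demo-key"
token = sign_order(order, key)
assert validate_order(token, key) == order
```

Note what is absent: there is no field in which the substance of the suspicion is carried, and nothing in the validation path that could evaluate it. The bank's role is structurally limited to the checks shown.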
Once those validations pass, the bank's API delivers, through a sequence of standardised endpoints, the data described in the order. The endpoints, per the published specification, are: Accounts (the list of every account the subject has owned or held during the time window, with ownership periods and disposition rights), Account Details (current and historical balances, credit lines, currency, status, ownership history), Transactions (every booked and pending transaction in the time window, with counterparty name, country of residence, full postal address including street, building number, postcode, town and country, payment card identifier, currency exchange details where applicable, merchant identifier, and transaction codes), Cards (every payment card associated with the subject, including identifier, holder name, issuer, expiry, status and version), and Roles (every account role the subject has held, with permissions, identity references and date ranges).
The time window — fromDate — extends up to ten years before the current date, with a small additional grace window of 90 days built into the validation rule. This is not theoretical. It is a documented validation constraint published by Bits AS: fromDate ≥ today minus 10 years minus 90 days.
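The validation rule can be written down directly. A minimal sketch: the ten-years-plus-90-days constant is the published Bits AS constraint quoted above, while the function names and the leap-day handling are this example's own.

```python
from datetime import date, timedelta

def earliest_permitted_from_date(today: date) -> date:
    """Oldest fromDate the published validation rule accepts:
    today minus 10 years, minus a 90-day grace window."""
    try:
        ten_years_back = today.replace(year=today.year - 10)
    except ValueError:
        # today is 29 February and the target year is not a leap year
        ten_years_back = today.replace(year=today.year - 10, day=28)
    return ten_years_back - timedelta(days=90)

def from_date_is_valid(from_date: date, today: date) -> bool:
    return from_date >= earliest_permitted_from_date(today)

# Example: on 1 June 2026 the window reaches back to 3 March 2016.
assert from_date_is_valid(date(2016, 3, 3), date(2026, 6, 1))
assert not from_date_is_valid(date(2016, 3, 2), date(2026, 6, 1))
```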
Bits AS publishes the volume forecast directly in the Politi-Utlevering service description. The current annual figure for production orders for account lists, transactions and balances — across all Norwegian police districts, the National Criminal Investigation Service (Norway's serious-crime authority), the National Authority for Investigation and Prosecution of Economic and Environmental Crime, and the Norwegian Enforcement Authority — is estimated at approximately 23,000. The forecast post-rollout figure is approximately 300,000. This is a roughly thirteen-fold increase, openly stated by the operator.
It is the single most concrete piece of evidence we have for the way digital infrastructure quietly changes the operational reality of investigative powers without changing the underlying legal text.
The legal threshold has not changed. The prosecution service still has to satisfy itself that the conditions of § 210(3) are met. The operational threshold — what it costs to issue an order, how much friction stands between an investigator's hypothesis and a decade of someone's bank life — has changed dramatically.
For the avoidance of doubt: in serious criminal cases, faster access to bank data is genuinely useful. Norway's economic-crime authority noted in October 2025 that information that previously took several days to obtain can now be transferred in seconds, and in many cases that speed is necessary. The point is not that DSOP-Politi is illegitimate. The point is that infrastructure of this efficiency, once built, normalises a class of intrusion that previously had to be rationed by friction. When the friction was paper, an investigator would not casually pull ten years of someone's transactional life on a hunch. With a JWS-signed API call returning in seconds, the cost-benefit of the marginal hunch is different.
That is story two. It happens on the prosecution service's own authority, at the speed of an HTTP request, and the court is not in the loop until someone challenges it after the fact.
Foretold in 2022 — and built anyway
When DSOP-Politi went live operationally in October 2025, the official framing was celebratory. National Police Director Håkon Skulstad described it, in a press release reported by Investornytt on 15 October 2025, as a small but real digital revolution for the police. In the English translation of his words: "Access to fast and legally sound disclosure of high-quality information is a very important tool for meeting the new challenges the police face."
The trajectory described in this piece did not arrive unannounced. It was foretold in concrete, specific terms in January 2022 — three years and nine months before that press release — in Norway's two leading financial papers, by people qualified to recognise it.
On 3 January 2022, Dagens Næringsliv published a feature reporting that Norwegian banks were concerned about a tool then under development that would force them to deliver, as the article framed it, "intimate payment details" of customers to police and the Tax Administration "fully automatically". The piece quoted Associate Professor Małgorzata Cyndecka of the Faculty of Law at the University of Bergen on the architectural risk, in translation: "When everything is to happen automatically, there is a high risk that this information ends up where it should not."
On 28 January 2022, Mikkel Toft Gimse and Petter Enholm — both lawyers at the Hjort law firm, Enholm specialising in ICT and data protection law after more than 25 years in the IT industry — published an opinion piece in Finansavisen titled, in translation, "The police should not have an open door to the bank". They described the proposed architecture in detail. They named what was being asked for: direct digital access to "transaction history up to ten years backwards, including direct retrieval of transaction amounts, parties' names, times and places". They observed that the existing manual process required the police to "document and justify their disclosure decision, and at the same time specify to a third party — the bank — which information they want access to". They identified the moderating function that this friction served. They wrote: "We are aware that the police have withdrawn production orders after closer questioning of the actual basis. The banks' control, although superficial, can therefore have a moderating effect, and limit the scope of personal data that is disclosed without sufficient justification."
In a follow-up published on 8 February 2022, after responses from police academy lecturers defending the proposed model, Gimse and Enholm sharpened the locus of their concern with a precision that, four years on, reads as a specification document for what is actually now in operation. In English translation:
We are not concerned about a court's decision — the core of our point is the fear that the prosecution service's own assessments are not sufficiently balanced when the need for access is weighed against the individual's privacy. A one-sided assessment by the police can lead to abuse of authority.
Four years later, the system the Hjort lawyers described is in production. The Bits AS forecast for annual volume is 300,000. The bank's compliance task, per the same operator's specification, is exactly what Gimse and Enholm warned would not be enough: validation of signature and format, not of substance. The court is not in the loop for the issuance of the order in account-information cases. The data delivered, per the published API specification, includes precisely what they identified — transaction amounts, counterparty names, postal addresses including street and building number, times, places. The ten-year time window is exactly the window they pointed to.
This is not hindsight. This is the architecture being built precisely as it was warned about, by named ICT-law specialists, four years before it shipped. The warnings were public, published in mainstream financial press, attributable to qualified people, and specific enough to function as requirements documents in reverse. They were considered. They were responded to. The system was built anyway.
The relevant question, looking forward, is not whether the warnings were right. They were. The relevant question is whether the same pattern is currently playing out for the Tax Administration's mobile-copying practice — whether the warnings now being raised by Banoun will likewise be heard, debated, and overruled, with the system shipping in roughly the form being objected to. The mechanism by which Norway moved from a paper-based production-order system in 2022 to an automated 300,000-per-year API in 2026 is the same mechanism by which an unannounced full-mobile-copy regime, currently contested, becomes normalised infrastructure by 2030.
What the two pipelines have in common
The instinct, looking at these two cases separately, is to evaluate them on their own merits. Each can be defended in isolation. The architectural insight only appears when they are placed side by side.
Both pipelines have the same shape. A state actor — tax inspector, police prosecutor — concludes on its own internal assessment that an intrusive measure is justified. The actor executes the intrusion at machine speed against a target that, in practice, has no meaningful ability to refuse in the moment. The intrusion is bulk: it acquires more data than the stated purpose can possibly justify, with the explicit or implicit reassurance that filtering will happen afterwards. The court is available, but only as a remedy for a citizen who chooses, with full knowledge of the legal landscape, to fight back after the data is already in the agency's hands.
Judicial review has not been abolished. It has been moved from precondition to remedy. That is a different architecture, and it produces different outcomes. A precondition forces the state to articulate, in advance and to a third party, what it intends to do and why. A remedy requires the citizen to discover the action, recognise its illegitimacy, find legal counsel, fund a legal challenge, and absorb the cost of doing so against an institutional opponent. These are not equivalent. They are not even close to equivalent.
They produce different kinds of state behaviour. A pre-judicial regime produces narrow, well-justified intrusions because the cost of articulating the basis is non-trivial. A post-judicial regime produces broad, rarely-challenged intrusions because the cost of articulating the basis is paid only in the small minority of cases that are actually contested.
There is a second layer to the Tax Administration case that sharpens the architectural problem. The Supreme Court appeal committee's ruling in HR-2026-853-U did not validate bulk phone copying in the abstract. It validated bulk phone copying conditional on the screening procedure working as described. The constitutional permissibility of the practice is now formally tethered to a technical premise — that downstream filtering reliably separates private from company content — that nobody outside the Tax Administration has independently verified. The court is, in effect, in the loop after all: just not in the way the Constitution traditionally placed it. It is in the loop as a body that trusts a procedure that no independent body has audited. That is a different role than judicial pre-authorisation. It is closer to retrospective ratification of a technical arrangement that the legislature did not specifically consider.
This is not a Norwegian peculiarity. It is the architectural pattern that every state with the technical capacity will follow if it is permitted to. The pattern is not new. What is new in the two cases described here is the speed with which Norway has adopted it, and the documentary trail — the 2022 warnings, the 2026 court ruling that anchors constitutional permissibility in a technical premise, the published volume forecast — that allows the architecture to be examined directly rather than inferred from outcomes.
Why "filter afterwards" is a technical fiction
Both pipelines rely on the same operational claim: that the bulk acquisition is acceptable because filtering will happen downstream. In the Tax Administration's case, irrelevant phone content is filtered from the case handlers — and as established above, this filtering is now the explicit constitutional premise on which the Supreme Court appeal committee has based the proportionality assessment under ECHR Article 8. In the DSOP-Politi case, the bank delivers a defined dataset and the prosecution service is responsible for using it within the scope of the order — a softer formulation of the same premise.
Neither claim survives technical scrutiny. The Tax Administration variant is the more important to address, because Norway's highest court has tied its constitutional reasoning to it.
Once a dataset has been transferred to a custodial party, the custodial party has the data. There is no cryptographic mechanism for taking it back. There is no mechanism for unseeing it. The "filter" is an institutional process performed by people working inside the same agency that holds the data — and even when those people are organisationally separate from the case handlers, the existence of the unfiltered dataset inside the agency creates information flows, lateral inferences, contextual knowledge, and discoverability through future search and retrieval that no procedural separation eliminates. The filter is, at best, a soft control. The intrusion is a hard fact.
This matters legally now, not just ethically. The Supreme Court appeal committee in HR-2026-853-U conditioned the constitutional acceptability of the practice on the assumption that the screening covers, "both formally and in reality, all private content" on the phone. The committee did not specify, and is not in a position to specify, what "in reality" requires technically. The screening procedure is not described in the Tax Administration Act. It is not subject to mandatory independent audit by the Norwegian Data Protection Authority. It is not specified at the level of detail that would allow an external technical assessor to verify the claim. It is operational practice, internal to the Tax Administration, attested to by the Tax Administration itself. The court has, in effect, accepted an institutional assurance as a constitutional safeguard.
Real data minimisation is not a downstream filter. It is an upstream constraint. It is the difference between asking the bank for "all transactions for this customer in 2018-2026" and asking the bank for "all transactions to or from this specific counterparty between January and June 2024". It is the difference between copying a phone wholesale and using forensic tooling that exposes only the targeted application's data. Both of these alternatives are technically achievable. The choice not to use them is operational, not technical.
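The contrast can be made concrete with invented records. Nothing below is from the DSOP schema or any real system; it is a minimal sketch of the difference between a scope enforced before transfer and a filter promised after it.

```python
from datetime import date

# Invented records standing in for a bank's transaction store.
TRANSACTIONS = [
    {"counterparty": "Acme AS", "date": date(2024, 2, 10), "amount": -1200},
    {"counterparty": "Acme AS", "date": date(2023, 11, 5), "amount": -900},
    {"counterparty": "Pharmacy", "date": date(2024, 3, 1), "amount": -340},
    {"counterparty": "Employer", "date": date(2024, 3, 25), "amount": 52000},
]

def broad_request(store):
    """Downstream-filter model: everything is transferred; minimisation
    is a promise made after the data has changed hands."""
    return list(store)

def narrow_request(store, counterparty, start, end):
    """Upstream-constraint model: the scope is enforced before transfer."""
    return [t for t in store
            if t["counterparty"] == counterparty and start <= t["date"] <= end]

everything = broad_request(TRANSACTIONS)
scoped = narrow_request(TRANSACTIONS, "Acme AS",
                        date(2024, 1, 1), date(2024, 6, 30))
assert len(everything) == 4  # the pharmacy and salary records left the bank
assert len(scoped) == 1      # only the named counterparty, in the window
```

The point the sketch makes is architectural: in the broad model the pharmacy and salary records exist on the requesting side regardless of any later filtering, while in the narrow model they never leave the bank at all.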
In the Tax Administration's case, modern enterprise systems already provide the surgical access that data minimisation demands. The company's email system, accounting system, CRM, document archive and cloud storage all support time-bounded, field-bounded, account-bounded extraction. Independent third-party filtering is a real industry capability. Audit logs, access trails and verifiable deletion of out-of-scope material are all implementable. None of this requires copying a phone. The Borgarting and Supreme Court reasoning that turned on "if it is not allowed, evasion would be too easy" assumes a binary that does not technically exist: either copy the whole phone, or get nothing. That binary is false. There is a wide range of intermediate options, all technically achievable, all more aligned with the data minimisation principle, and none of them currently required.
In the DSOP-Politi case, the API specification itself supports a more minimised request than the orders typically issued. The fromDate field can be any date within the ten-year window — it is not required to be the full window. The onlyPrimaryOwner flag exists specifically to allow the data minimisation that Bits AS recommends in the documentation. The endpoint set for the order can be restricted to what the order actually requires. Whether prosecutors use these constraints, and how often, is operational practice. The infrastructure does not enforce minimisation; it permits it.
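What a minimised order looks like relative to a maximal one can be shown concretely. A sketch under stated assumptions: fromDate and onlyPrimaryOwner are field names from the published specification as described above, while the caseNumber field and the endpoint names are illustrative.

```python
from datetime import date

def build_order(case_number: str, from_date: date, endpoints: list[str],
                only_primary_owner: bool = True) -> dict:
    """Compose an order payload using the minimisation the specification
    permits: a narrowed window, a restricted endpoint set, and the
    onlyPrimaryOwner flag. Endpoint names here are illustrative."""
    return {
        "caseNumber": case_number,
        "fromDate": from_date.isoformat(),
        "onlyPrimaryOwner": only_primary_owner,
        "endpoints": endpoints,
    }

# A maximal order: the full ten-year window, every endpoint.
maximal = build_order(
    "12345678", date(2016, 3, 3),
    ["accounts", "accountDetails", "transactions", "cards", "roles"],
    only_primary_owner=False,
)

# A minimised order for the same case: six months, transactions only.
minimal = build_order("12345678", date(2024, 1, 1), ["transactions"])

assert minimal["onlyPrimaryOwner"] is True
assert minimal["endpoints"] == ["transactions"]
```

Both payloads validate identically against a format-and-signature check; nothing in the pipeline distinguishes them. That is the sense in which the infrastructure permits minimisation without enforcing it.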
The technical capability for a more minimal architecture exists in both cases. What is missing is the requirement to use it, paired with independent verification that it is being used as required.
What a constitutional state actually requires
The architectural argument can be stated cleanly, independently of how the legal questions resolve.
A state's character is not visible in what it does in the easy cases. It is visible in what it forces itself to do before it does the difficult ones. That is what a court order is. It is the procedural cost the state pays in advance for having access to the citizen's private life. It exists not because every individual case is suspicious, but because the architecture of advance review is what keeps the state's exercise of its powers honest in aggregate, across thousands of cases, across decades, across changes in government and changes in the underlying technology.
When you remove that cost, re-engineering the court from a precondition into a remedy, you do not confine abuse to whatever was already legitimate. You prevent it only to the extent that the citizen has the resources, the awareness, and the will to fight back after the fact. That is a much smaller subset of cases. The architectural change does not make abuse less likely. It makes abuse less detectable.
This is the practical, non-rhetorical answer to "if you have nothing to hide, you have nothing to fear". The fear is not that any specific innocent citizen will be persecuted by a specific tax inspector or police prosecutor. The fear is that the architecture, once normalised, has no mechanism for surfacing the cases where the discretion was misused — because the cost of surfacing them has been transferred from the state to the citizen.
The constructive position, then, is not "abolish DSOP-Politi" or "ban the Tax Administration from doing book audits". The constructive position is that:
- Where the constitutional acceptability of bulk acquisition rests on the assumption that downstream screening separates private from operational content, that screening procedure should be independently auditable. Right now, it is not. The Supreme Court appeal committee in HR-2026-853-U has placed decisive constitutional weight on a procedure that the Norwegian Data Protection Authority does not audit, that is not specified in statute, and that is operationally defined entirely by the Tax Administration itself. A constitutional state cannot rest its proportionality assessments on institutional self-attestation. If the screening is the constitutional bridge, the bridge needs an inspector other than the agency that operates it.
- Bulk acquisition with downstream filtering is not data minimisation. The data minimisation principle of GDPR Article 5(1)(c) applies to the public sector, including tax administration and police. A regime that the same regulators would prosecute in the private sector cannot be the regime applied to the citizen by the state.
- Operational efficiency is a public good. It is not a constitutional override. The infrastructure that makes intrusive measures cheap to execute should be paired with infrastructure that makes those measures harder to misuse, not infrastructure that removes the friction of advance judicial review. If DSOP-Politi can pull a decade of bank data in seconds, the system that authorises those pulls should be built to a standard that reflects what is actually being authorised. Self-issuance by the same body that benefits from the issuance is not such a standard.
- Small and medium-sized businesses, where the administrative-control regime hits hardest, deserve specific architectural protection. The current Tax Administration practice falls disproportionately on companies that lack the resources to litigate it. That is not an argument against tax control. It is an argument for tax control conducted at a higher minimum standard of procedural protection, particularly when the device being intruded upon is not, by any honest description, a "company archive" in the sense the legislator originally had in mind.
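The auditability requirement in the first of these points has a well-understood technical shape. Here is a minimal sketch, assuming nothing about the Tax Administration's actual tooling, of a hash-chained screening log: each screening decision is cryptographically linked to the one before it, so an external inspector can verify the record without trusting the agency that wrote it.

```python
import hashlib
import json

def append_entry(log, entry):
    """Append a screening decision, chaining it to the previous entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    digest = hashlib.sha256(
        json.dumps({"entry": entry, "prev": prev}, sort_keys=True).encode()
    ).hexdigest()
    log.append({"entry": entry, "prev": prev, "hash": digest})

def verify_chain(log):
    """An independent auditor recomputes every link; any tampering breaks it."""
    prev = "0" * 64
    for record in log:
        expected = hashlib.sha256(
            json.dumps({"entry": record["entry"], "prev": prev},
                       sort_keys=True).encode()
        ).hexdigest()
        if record["hash"] != expected or record["prev"] != prev:
            return False
        prev = record["hash"]
    return True

log = []
append_entry(log, {"item": "attorney-correspondence-0042", "action": "excluded"})
append_entry(log, {"item": "ledger-export-2024", "action": "released-to-audit"})
assert verify_chain(log)

# Silently rewriting a past screening decision is now detectable.
log[0]["entry"]["action"] = "released-to-audit"
assert not verify_chain(log)
```

The point of the sketch is not this particular construction but the asymmetry it creates: the operator can still make bad screening decisions, but can no longer make them and then quietly edit the record of having made them.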
Closing
The court is not a formality. It is the gear that keeps the architecture honest. The case against engineering it out of digital intrusion is not that any specific case is illegitimate. The case is that the architecture of advance judicial review is the thing that prevents the slow, undocumented drift toward a state that does what its technology permits, rather than what its constitution requires.
Two pipelines, one drift. The Tax Administration copies the phone, and the Supreme Court appeal committee in April 2026 made the constitutional acceptability of that copying explicitly conditional on the technical robustness of an internal screening procedure that no independent body audits. DSOP-Politi pulls the decade in seconds, on a digitally signed production order issued by the prosecution service against itself, with the bank validating only signature and format. In both cases, the court enters the conversation only after the data is already taken — and in the Tax Administration case, the court has now taken on a role it is institutionally ill-suited for: that of a body trusting a technical procedure it cannot itself verify.
In the case of DSOP-Politi, the architecture was foretold in concrete terms in January 2022, by qualified ICT-law specialists writing in mainstream financial press, and built anyway. The Tax Administration's phone-copying practice is now being warned about in the same way, in 2026, by Banoun. The mechanism by which the first warning was overruled is the same mechanism that will determine whether the second warning is heard.
A constitutional state is recognised by what it requires of itself before it acts, not by what it permits its citizens to challenge afterwards — and not by what its highest courts have presumed will happen in the rooms they cannot see into. The drift away from that standard is not abrupt. It is engineered, signed in JWS, and shipped at API speed. And in 2026, it is anchored, for the first time, in a Supreme Court reasoning that treats institutional self-attestation as a constitutional safeguard.
Sources
All sources below are Norwegian-language documents in their original form; the descriptions are given in English.
Tax Administration phone copying
- Bettina Banoun's opinion piece, Advokatbladet, 16 April 2026 (originally published in Dagens Næringsliv)
- Erik Nilsen's response, the Tax Administration press release, 15 April 2026 (also published in Dagens Næringsliv)
- Kjetil Kolsrud's report on the dissent, Rett24, 20 April 2026
- Knut M. Haugland, prior commentary on the practice, Dinbedrift.no, 9 May 2026
Court rulings
- Borgarting Court of Appeal, LB-2025-208819, January 2026, Lovdata
- Supreme Court appeal committee, HR-2026-853-U, April 2026, Lovdata
DSOP-Politi technical specification
- Bits AS, DSOP-Politi solution description, version 2.1.1, January 2026
- Bits AS, DSOP-Politi production order specification
- Bits AS, DSOP-Politi service overview
- Bits AS, circular 2025 No. 8 — new DSOP services
Official rollout framing
- Investornytt report on the operational launch, 15 October 2025 (National Police Director Håkon Skulstad quoted)
2022 warnings
- Agnete Klevstrand's feature on automated bank-data access, Dagens Næringsliv, 3 January 2022
- Mikkel Toft Gimse and Petter Enholm, "The police should not have an open door to the bank", Finansavisen, 28 January 2022
- Mikkel Toft Gimse and Petter Enholm, "Automated access is risky", Finansavisen, 8 February 2022
Statutes and constitutional sources