Something is either public record - in which case it should be on a government website for free, and the AI companies should be free to scrape it to their hearts' desire...
Or it should be sealed for X years and then public record. Where X might be 1 in cases where you don't want to hurt an ongoing investigation, or 100 if it's someone's private affairs.
Nothing that goes through the courts should be sealed forever.
We should give up on the idea of databases which are 'open' to the public, but where you have to pay to access them, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
Free to ingest and make someone's crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.
AI firms have shown themselves to be playing fast and loose with copyrighted works; a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
>”Free to ingest and make someone's crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.”
1000x this. It’s one thing to have a felony for manslaughter. It’s another to have a felony for drug possession. In either case, if enough time has passed, and they have shown that they are reformed (long employment, life events, etc) then I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
There needs to be a statute of limitations just like there is for reporting the crimes.
What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you're 45 applying for a job after going to college, having a family, having a 20-year career, etc.
Also, courts record charges which are dismissed due to having no evidential basis whatsoever and statements which are deemed to be unreliable or even withdrawn. AI systems, particularly language models aggregating vast corpuses of data, are not always good at making these distinctions.
That is a critical point that AI companies want to remove. _They_ want to be the system of record. Except they _can't_. Which makes me think LLMs are just really bad cache layers on the world.
> I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Many countries have solved this with a special background check. In Canada we call this a "vulnerable sector check," [1] and it's usually required for roles such as childcare, education, healthcare, etc. Unlike standard background checks, which do not turn up convictions which have received record suspensions (equivalent to a pardon), these ones do flag cases such as sex offenses, even if a record suspension was issued.
They are only available for vulnerable sectors; you can't ask for one as a convenience store owner vetting a cashier. But if you are employing child care workers in a daycare, you can get them.
This approach balances the need for public safety against the ex-con's need to integrate back into society.
That's the reality in my country, and I think most European countries. And I'm very glad it is. The alternative is high recidivism rates because criminals who have served their time are unable to access the basic resources they need (jobs, housing) to live a normal life.
>That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Thanks, but I don't want to have violent people working as taxi drivers, pdf files in childcare and fraudsters in the banking system. Especially if somebody decided to not take this information into account.
Good conduct certificates are there for a reason -- you ask the faceless bureaucrat to give you one for the narrow purpose and it's a binary result that you bring back to the employer.
No one is forcing you to hire formerly incarcerated nannies but you also aren’t entitled to everyone’s life story. I also don’t think this is the issue you’re making it out to be. Anyone who has “gotten in trouble” with kids is on a registry. Violent offenders don’t have their records so easily expunged. I’m curious what this group is (and how big they are) that you’re afraid of.
I also think someone who has suffered a false accusation of that magnitude and fought to be exonerated shouldn’t be forced to suffer further.
If someone is charged with and found innocent of a crime, you can't just remove that record. If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent. Alternately if they are convicted and served their sentence, they might need to prove that in the future.
Sometimes people are unfairly ostracized for their past, but I think a policy of deleting records will do more harm than good.
Or in the case of, down the road, repeating an offense. The judge sees you had an issue in the past, were good for a while, then repeated, suggesting something has happened or that the individual has lost their motivation to stay reformed. Sentence time for the crime, but then also be able to assist the individual in finding help to get them back on track. We have the systems in place to do this, we just don’t.
Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
The only way to have a system like that is to keep records, permanently, but decision making is limited.
> Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
Should it though? You can buy a piece of real estate without living there, e.g. because it's a rental property, or maybe the school is announced to be shutting down even though it hasn't yet. And in general this should have nothing to do with the bank; why should they care that somebody wants to buy a house they're not allowed to be in?
Stop trying to get corporations to be the police. They're stupendously bad at it and it deprives people of the recourse they would have if the government was making the same mistake directly.
> If someone is charged with and found innocent of a crime, you can't just remove that record. If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent.
Couldn't they just point to the court system's computer showing zero convictions? If the system reliably shows guilty verdicts when they exist, then it showing none is already proof there are none.
That seems compatible with OP's suggestion, just with X being a large value like 100 years, so sensitive information is only published about dead people.
At some point, personal information becomes history, and we stop caring about protecting the owner's privacy. The only thing we can disagree on is how long that takes.
Right, except there are some cases where that information should be disclosed prior to their death. Sensitive positions, dealing with child care, etc. but those are specific circumstances that can go through a specific channel. Like we did with background checks. Now, AI is in charge and ANY record in ANY system is flagged. Whether it’s for a rental application, or a job, or a credit card.
The AI should decide if it's still relevant or not. People should fully understand that their actions reflect their character and this should influence them to always do the right thing.
I find this a weird take. Are you saying you _want_ unaccountable and profit driven third party companies to become quasi-judicial arbiters of justice?
> What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you're 45 applying for a job after going to college, having a family, having a 20-year career, etc.
I'd go further and say a lot of charges and convictions shouldn't be a matter of public record that everyone can look up in the first place, at least not with a trivial index. File the court judgement and other documentation under a case number, ban reindexing by third parties (AI scrapers, "background check" services) entirely. That way, anyone interested can still go and review court judgements for glaring issues, but a "pissed on a patrol car" conviction won't hinder that person's employment perspectives forever.
In Germany for example, we have something called the Führungszeugnis - a certificate by the government showing that you haven't been convicted of a crime that warranted more than three months of imprisonment or the equivalent fine in daily rates based on your earnings. Most employers don't even request that; only employers in security-sensitive environments, public service or anything to do with children do (the latter get a certificate also including a bunch of sex pest crimes in the query).
France has a similar system to the German Führungszeugnis. Our criminal record (casier judiciaire) has 3 tiers: B1 (full record, only accessible by judges), B2 (accessible by some employers like government or childcare), and B3 (only serious convictions, the only one you can request yourself). Most employers never see anything. It works fine, recidivism stays manageable, and people actually get second chances. The US system of making everything googleable forever is just setting people up to fail.
The UK has common law: the outcomes of previous court cases and the arguments therein determine what the law is. It’s important that court records be public then, because otherwise there’s no way to tell what the law is.
It is the outcome of appellate court cases and arguments that determine law in common law jurisdictions, not the output of trial courts. Telling what the law is in a common law system would not be affected if trial court records were unavailable to the public. You only actually need appellate court records publicly available for determining the law.
The appellate court records would contain information from the trial court records, but most of the identifying information of the parties could be redacted.
> It’s important that court records be public then, because otherwise there’s no way to tell what the law is.
So anyone who is interested in determining if a specific behavior runs afoul of the law not only has to read through the law itself (which is, "thanks" to being a centuries-old tradition, very hard to read) but also wade through court cases from, in the worst case (very old laws dating to before the founding of the US), two countries.
Frankly, that system is braindead. It worked back when it was designed as the body of law was very small - but today it's infeasible for any single human without the aid of sophisticated research tools.
You are correct which is why I recently built such a tool. Well, an evidence management tool.
The premise here is, during an investigation, a suspect might have priors, might have digital evidence, might have edge connections to the case. Use the platform and AI to find them, if they exist.
What it doesn’t do: “Check this video and see if this person is breaking the law”.
What it does do: “Analyze this person's photos and track their movements, see if they intersect with Suspect B, or if Suspect B shows up in any photos or video.”
It does a lot more than that but you get the idea…
The interpretation of the law is up to the courts. The enforcement of it is up to the executive. The concept of the law is up to Congress. That’s how this is supposed to work.
> Free to ingest and make someone's crimes a permanent part of AI datasets resulting in forever-convictions?
You're conflating two distinct issues - access to information, and making bad decisions based on that information. Blocking access to the information is the wrong way to deal with this problem.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
> Blocking access to the information is the wrong way to deal with this problem.
Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in ways that are harmful. Historically, just having to go in person to a courthouse and request to view records was enough to keep most people from abusing the public information they had. It's perfectly valid to say that we want information accessible, but not accessible over the internet or in AI datasets. What do you think the "right way" to deal with the problem is? We already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
> Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in ways that are harmful. Historically, just having to go in person to a courthouse and request to view records was enough to keep most people from abusing the public information they had.
If all you care about is preventing the information from being abused, preventing it from being used is a great option. This has significant negative side effects though. For court cases it means a lack of accountability for the justice system, excessive speculation in the court of public opinion, social stigma and innuendo, and the use of inappropriate proxies in lieu of good data.
The fact that the access speedbump which supposedly worked in the past is no longer good enough is proof that an access speedbump is not a good way to do it. Let's say we block internet access but keep in person records access in place. What's to stop Google or anyone else from hiring a person to go visit the brick and mortar repositories to get the data exactly the same way they sent cars to map all the streets? And why are we making the assumption that AI training on this data is a net social ill? While we can certainly imagine abuses, it's not hard to imagine real benefits.
> What do you think the "right way" to deal with the problem is? We already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
We've been dealing with people making bad decisions from data forever. As an example, there was redlining, where institutions would refuse to sell homes or guarantee loans for minorities. Sometimes they would use computer models which didn't track skin color but had some proxy for it. At the end of the day you can't stop this problem by trying to hide what race people are. You need to explicitly ban that behavior. And we did. Institutions that attempt it are vulnerable to both investigation by government agencies and liability to civil suit from their victims. It's not perfect, there are still abuses, but it's so much better than if we all just closed our eyes and pretended that if the data were harder to get the discrimination wouldn't happen.
If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them. If an AI rejects my loan application, you better be able to prove that the AI isn't doing so based on my skin color. If you can do that, you should also be able to prove it's not doing so based off an expunged record. If evidence comes out that the AI has been using such data to come to such decisions, those who made it and those who employ it should be liable for damages, and depending on factors like intent, adherence to best practices, and severity potentially face criminal prosecution. Basically AI should be treated exactly the same as a human using the same data to come to the same conclusion.
> Blocking access to the information is the wrong way to deal with this problem.
That's an assertion, but what's your reasoning?
> This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
All across the EU, that information would be available immediately to journalists under exemptions for the Processing of Personal Data Solely for Journalistic Purposes, but would be simultaneously unlawful for any AI company to process for any other purposes (unless they had another legal basis like a Government contract).
The actions of the government should always be publicly observable. This is what keeps it accountable. The fear that a person might be unfairly treated due to a long past indiscretion does not outweigh the public's right to observe and hold the government to account.
Alternatively consider that you are assuming the worst behavior of the public and the best behavior of the government if you support this and it should be obvious the dangerous position this creates.
"court records are public forever" and "records of crimes expunged after X years" are incompatible.
Instead, we should make it illegal to discriminate based on criminal conviction history. Just like it is currently illegal to discriminate based on race or religion. That data should not be illegal to know, but illegal to use to make most decisions relating to that person.
Even if made illegal, how does enforcement occur? The United States, at least, is notorious for HR being extremely opaque regarding hiring decisions.
Then there's cases like Japan, where not only companies, but also landlords, will make people answer a question like: "have you ever been part of an anti-social organization or committed a crime?" If you don't answer truthfully, that is a legal reason to reject you. If you answer truthfully, then you will never get a job (or housing) again.
Of course, there is a whole world outside of the United States and Japan. But these are the two countries I have experience dealing with.
The founders of modern nation-states made huge advancements with written constitutions and uniformity of laws, but in the convenience of the rule of law it is often missed that the rule of law is not necessarily the prevalence of justice.
The question a people must ask themselves: we are a nation of laws, but are we a nation of justice?
The parent comment is not presenting a false dichotomy but is making precisely the point that it is how you apply the laws that matter; that just having laws is not enough.
This is one of the ways they keep crime so low. Being convicted destroys your reputation in a country where reputation is extremely important. Everyone loves saying it would be great to have crime as low as Japan's, but very few would really want the system that achieving that requires.
>Instead, we should make it illegal to discriminate based on criminal conviction history
Absolutely not. I'm not saying every crime should disqualify you from every job, but convictions are really a government-officialized account of your behavior. Knowing a person has trouble controlling their impulses, leading to aggravated assault or something, very much tells you they won't be good for certain roles. As a business you are liable for what your employees do; it's in both your interests and your customers' interests not to create dangerous situations.
This is an extremely thorny question. Not allowing some kind of blank slate makes rehabilitation extremely difficult, and it is almost certainly a very expensive net social negative to exclude someone from society permanently, all the way up to their death at (say) 70, for something they did at 18. There is already a legal requirement to ignore "spent" convictions in some circumstances.
However, there's also jobs which legally require enhanced vetting checks.
> However, there's also jobs which legally require enhanced vetting checks.
I think the solution there is to restrict access and limit application to only what's relevant to the job. If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
>If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
Welcome to the world of certificates of good conduct and criminal record extracts.
Other people have rights like freedom of association. If you’re hell-bent on violating that, consider the second-order effects. What is the net social negative when non-criminals freely avoid working in industries in which criminals tend to be qualified to work?
Assuming you're asking in good faith, the parent could be referring to the 'market for lemons' in employment, where in lieu of being able to easily determine worker quality, employers start using second- or third-order- proxies for questions about, say, a candidate's likelihood of having a criminal record.
> "court records are public forever" and "records of crimes expunged after X years" are incompatible.
Exactly. One option is for the person themselves to be able to ask for a LIMITED copy of their criminal history, which is otherwise kept private, but no one else.
This way it remains private; HR cannot force the applicant to provide a detailed copy of their criminal history and discriminate based on it. They can only get a generic document from the court via Mr Doe that says, "Mr Doe is currently eligible to be employed as a financial advisor" or "Mr Doe is currently ineligible to be employed as a school teacher".
Ideally it should also be encrypted by the company's public key and then digitally signed by the court. This way, if it gets leaked, there's no way to prove its authenticity to a third party without at least outing the company as the source.
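The encrypt-then-sign idea above can be sketched with textbook RSA. This is a toy illustration only, assuming tiny primes and a numeric stand-in for the certificate (all invented for the example; real systems would use a proper cryptography library with padding). The point it shows: because the court signs the ciphertext, a leaked plaintext by itself does not verify against the court's signature.

```python
# Toy textbook RSA with tiny primes -- illustration only, NOT secure.
def make_keypair(p, q, e):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)  # modular inverse (Python 3.8+)
    return (e, n), (d, n)  # (public key, private key)

company_pub, company_priv = make_keypair(61, 53, e=17)   # n = 3233
court_pub, court_priv = make_keypair(101, 113, e=3)      # n = 11413

def rsa(m, key):
    """Apply an RSA key (encrypt/decrypt/sign are all modular exponentiation)."""
    exp, n = key
    return pow(m, exp, n)

def verify(m, sig, pub):
    return rsa(sig, pub) == m

certificate = 42  # hypothetical encoding of "currently eligible"

# Court encrypts to the company's public key, then signs the *ciphertext*.
ciphertext = rsa(certificate, company_pub)
signature = rsa(ciphertext, court_priv)

# The company can check authenticity and read the certificate...
assert verify(ciphertext, signature, court_pub)
assert rsa(ciphertext, company_priv) == certificate

# ...but the bare plaintext does not match the signature, so a third party
# can't tie a leaked plaintext to the court without the ciphertext, which
# only the company could have decrypted.
assert not verify(certificate, signature, court_pub)
```

In practice the same property comes from signing the ciphertext with a standard scheme (e.g. RSA-PSS over an OAEP ciphertext); the toy version just makes the ordering of the two operations visible.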
In many sane countries, companies can ask you to provide a legal certificate that you did not commit X category of crime. This certificate will then either say that you did not commit any crimes in that category, or it will say that you did commit one or more of them. The exact crimes aren't mentioned.
Coincidentally these same countries tend to have a much much lower recidivism rate than other countries.
Everything should remain absolutely private until after conviction.
And only released if it's in the public interest. I'd be very very strict here.
I'm a bit weird here though. I basically think the criminal justice system is very harsh.
Except when it comes to driving. With driving, at least in America, our laws are a joke. You can have multiple at fault accidents and keep your license.
DUI, keep your license.
Run into someone because watching Football is more important than operating a giant vehicle, whatever you might get a ticket.
I'd be quick to strip licenses over accidents and if you drive without a license and hit someone it's mandatory jail time. No exceptions.
By far the most dangerous thing in most American cities is driving. One clown on fan duel while he should be focusing on driving can instantly ruin dozens of lives.
But we treat driving as this sacred right. Why are car immobilizers even a thing?
No, you can not safely operate a vehicle. Go buy a bike.
There is an entire world where arrests are not a matter of the public record and where people don't get disappeared by the government. And then there is US where it is a matter of public record and (waves hand at the things happening).
Let's say a cop kills somebody in your neighborhood. Some witnesses say it looked like murder to them, but per your wishes the government doesn't say who the cop was and publishes no details about the crime.. for two years, when they then say the cop was found not guilty. And as per your wishes again, even then they won't say anything about the alleged crime, and never will. Is this a recipe for public trust in their government?
It is also possible to apply a higher standard to the government employees and force greater transparency on them, up to treating them as de-facto slaves of the society.
Yeah okay, different standard just for government employees... So consider the same scenario above except instead of a cop it's the son of a politician or the nephew of a billionaire. Not government employees. Are you comfortable with the government running secret trials for them too? Are you confident that the system can provide fair and impartial judgments for such people when nobody is allowed to check their work?
So here in the U.S., the Karen Read trial recently occupied two years of news cycles— convicted of a lesser crime on retrial.
Is the position that everyone who experienced that coverage, wrote about it in any forum, or attended, must wipe all trace of it clean, for “reasons”? The defendant has sole ownership of public facts? Really!? Would the ends of justice have been better served by sealed records and a closed courtroom? Would have been a very different event.
Courts are accustomed to balancing interests, but since the public usually is not a direct participant they get short shrift. Judges may find it inconvenient to be scrutinized, but that’s the ultimate and only true source of their legitimacy in a democratic system.
right, for example someone convicted of killing their parents should fit right into an elderly care home staff team and convicted child rapists should not be barred from working in an elementary school, protecting honest and innocent people from criminals is basically the same thing as racism!
Problem is it's very hard to prove what factors were used in a decision. Person A has a minor criminal record, person B does not? You can just say "B was more qualified" and as long as there's some halfway credible basis for that nothing can really be done. Only if one can demonstrate a clear pattern of behavior might a claim of discrimination go anywhere.
If a conviction is something minor enough that might be expungable, it should be private until that time comes. If the convicted person hasn't met the conditions for expungement, make it part of the public record, otherwise delete all history of it.
Curious, why should conviction history not be a factor? I could see the argument that previous convictions could indicate a lack of commitment to no longer committing crimes.
I couldn't parse the intended meaning of "lack of commitment to no longer committing crimes", so here's a response that just answers the question raised.
Do you regard the justice system as a method of rehabilitating offenders and returning them to try to be productive members of society, or do you consider it to be a system for punishment? If the latter, is it Just for society to punish somebody for the rest of their life for a crime, even if the criminal justice considers them safe to release into society?
Is there anything but a negative consequence for allowing a spent conviction to limit people's ability to work, or to own/rent a home? We have carve-outs for sensitive positions (e.g. working with children/vulnerable adults)
Consider what you would do in that position if you had genuinely turned a corner but were denied access to jobs you're qualified for?
The short answer is that it's up to a judge to decide that, up to the law what it's based on and up to the people what the law is.
Sure, there is still some leeway between only letting a judge decide the punishment and full-on mob rule, but it's not a slippery slope fallacy when the slope is actually slippery.
It's fairly easy to abuse the leeway to discriminate to exclude political dissidents for instance.
Because we as a society decided it creates externalities we don't want to deal with. With a list of exceptions where it actually is important because risk-reward balance is too much.
You'd need so many exceptions to such a law it would be leakier than a sieve. It sounds like a fine idea at ten thousand feet but it immediately breaks down when you get into the nitty gritty of what crimes and what jobs we're talking about.
> Instead, we should make it illegal to discriminate based on criminal conviction history.
Good luck proving it when it happens. We haven't even managed to stop discrimination based on race and religion, and that problem has only gotten worse as HR departments started using AI which conveniently acts as a shield to protect them.
Which is why in any country where using criminal history is considered discrimination, this information is simply not provided. Because these companies have learned over the years that "please don't do X" just doesn't work with corporations.
What we do here in sweden is that you can ask the courts for any court document (unless it is confidential for some reason).
But the courts are allowed to do it conditionally, so a common condition if you ask for a lot of cases is to condition it to redact any PII before making the data searchable. Having the effect that people that actually care and know what to look for, can find information. But you can't randomly just search for someone and see what you get.
There is also a second registry, separate from the courts, that keeps track of people who have been convicted during the last n years and is used for background checks etc.
Thanks, it’s super refreshing to hear this take. I fear where we are headed.
I robbed a drug dealer some 15-odd years ago while strung out. No excuses, but I paid my debt (4-11 years in state max, did min) yet I still feel like I have this weight I can’t shake.
I have worked for almost the whole time, am no longer on parole or probation. Paid all fines. I honestly felt terrible for what I did.
At the time I had a promising career and a secret clearance. I still work in tech as a 1099 making way less than I should. But that is life.
What does a background check matter when the first 20 links on Google is about me committing a robbery with a gun?
Edit: mine is an extreme and violent case. But I humbly believe, to my benefit surely, that once I paid all debts it should be done. That is what the 8+ years of parole/probation/counseling was for.
Fully agree. The AI companies have broken the basic pacts of public open data. Their ignoring of robots.txt files is but one example of their lack of regard.
With the open commons being quickly pillaged we’ll end up in a “community member access only” model. A shift from “grab any books here you like, just get them back in a month” to “you’ll need to register as a library member before you can borrow.” I see that’s where we’ll end up. Public blogs and websites will suffer and respond first, is my prediction.
Can you explain your reasoning about “forever convictions”, and for full disclosure, do you have a conviction and are thereby biased?
Additionally, do you want a special class of privileged people, like a priestly class, who can interpret the data/bible for the peasantry? That mentality seems the same as that behind the old Latin bibles and Latin mass that people were abused to attend, even though they had no idea what was being said.
So to whom would you bequeath the privileges of doing “research”? Only the true believers who believe what you believe, so you wouldn’t have to be confronted with contradictions?
And how would you prevent data exfiltration? Would you have your authorized “researchers” maybe go to a building, we can call it the Ministry of Truth, where they would have access to the information through telescreen terminals like how the DOJ is controlling the Epstein Files and then monitoring what the Congressmen were searching for? Think we would have discovered all we have discovered if only the Epstein people that rule the country had access to the Epstein files?
Yes, convictions are permanent records of one’s offenses against society, especially the egregious offenses we call felonies in the USA.
Should I, as someone looking for a CFO or just an accountant, not have the right to know that someone was convicted of financial crimes, which are usually long preceded by other transgressions and things like “mistakes” everyone knows weren’t mistakes? How would any professional association limit certification if that information is not accessible? Should Madoff have been able to get out and continue being involved in finance and investments?
The names of minors should never be released in public (with a handful of exceptions).
But why shouldn't a 19 year old shoplifter have that on their public record? Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?
Would you want the first thing to show up after somebody googles your name to be an accusation of improper conduct around a child? In theory, people could dig deeper and find out you won in court and were acquitted, but people here should know that nobody ever reads the article...
If you were hiring a childminder for your kids, would you want to know that they had 6 accusations of improper conduct around children in 6 different court cases - even if those were all acquittals?
As a parent, I would want to know everything about anyone who's going to be around my children in any capacity. That doesn't mean I have a right to it, though.
If you prohibit the punishment of minors, you create an incentive for criminals to exploit minors.
Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Unfortunately reputational damage is part of the punishment (I have a criminal record), but maybe it's moronic to create a class of people who can avoid meaningful punishment for crimes?
> If you prohibit the punishment of minors, you create an incentive for criminals to exploit minors.
This - nearly all drug deliveries in my town are done by 15-year-olds on overpowered electric bikes. Same with most shoplifting. The real criminals just recruit schoolchildren to do the work because they know schoolchildren rarely get punished.
But you create an incentive for organized crime to recruit youth to commit crimes and not have to suffer the consequences.
At a certain point, poorly thought out "protections" turn into a system that protects organized crime, because criminals aren't as stupid as lawmakers, and they exploit the system.
There is a big difference between making a mistake as a kid that lands you in trouble, and working as an underling for organized crime, committing robberies, drug deals, and violent crime, and not having to face responsibility for one's actions.
The legal system has so many loopholes for youth, for certain groups, that the law is no longer fair, and that is its own problem, contributing to the decline of public trust.
> working as a underling for organized crime to commit robberies, drug deals, and violent crime
Have you ever considered that these children are victims of organized crime? That they aren't capable of understanding the consequences of what they're doing and that they're being manipulated by adult criminals?
The problem here isn't the lack of long term consequences for kids.
What's the alternative? A 14 year old steals a pack of gum, and he's listed as a shoplifter for the rest of his life?
Just because exceptions are exploitable, doesn't mean we should just scrap all the exceptions. Why not improve the wording and try to work around the exceptions?
If you don't think this crime is a big deal, then why do you think this crime would matter if it was in the public record tied to their name? These two ideas you have are not compatible.
> Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Protect victims and criminals. Protect victims from the harm done to them by criminals, but also protect criminals from excessive, or, as one might say, cruel and unusual punishment. Just because someone has a criminal record doesn't mean that anything that is done to them is fair game. Society can, and should, decide on an appropriate extent of punishment, and not exceed that.
Totally agree. And it goes beyond criminal history. Just because I choose to make a dataset publicly available doesn't mean I want some AI memorizing it and using it to generate profit.
AI isn't the problem here. Once something goes on the internet it lives forever (or should be treated as such). So has it always been.
If something is expungable it probably shouldn't be public record. Otherwise it should be open and scrapable and ingested by both search engines and AI.
Names and other PII can be replaced with aliases in bulk data, unsealed after ID verification on specific requests and within quotas. It’s not a big problem.
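A minimal sketch of what consistent aliasing could look like: a keyed hash maps each name to a stable pseudonym, so records about the same person stay linkable in the bulk data without exposing who they are. The key, field names, and alias format here are all hypothetical, and a real scheme would need to cover every PII field, not just names.

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-me-periodically"  # hypothetical key held by the records office


def alias(name: str) -> str:
    """Replace a name with a deterministic pseudonym.

    Normalizing the input first means 'Jane Doe' and ' jane doe '
    map to the same alias, keeping records linkable.
    """
    digest = hmac.new(SECRET_KEY, name.strip().lower().encode(), hashlib.sha256)
    return "PERSON-" + digest.hexdigest()[:12].upper()


# Hypothetical record shape for illustration only.
record = {"defendant": "Jane Doe", "charge": "theft"}
record["defendant"] = alias(record["defendant"])
```

Because the mapping is keyed rather than a plain hash, outsiders can't confirm a guessed name by hashing it themselves, while the records office can still unseal specific identities on verified requests.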
>Free to ingest and make someones crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.
Is this the UK thing where PII is part of the released dataset? I know that Ukrainian rulings are all public, but the PII is redacted, so you can train your AI on largely anonymized rulings.
I think it should also be against GDPR to process sensitive PII like health records and criminal convictions without consent, but once it hits the public record, it's free to use.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
On the other hand, perpetrating crime is a GREAT predictor of perpetrating more crime -- in general most crime is perpetrated by past perps. Why should this info not be available to help others avoid troublemakers?
Jail time is not always the entire punishment, especially on the enlightened continent, where jail time is used sparingly. Keeping the conviction on record is a thing, because consecutive convictions often come with higher punishment. So depending on the crime category, there are X years after which the record no longer counts, but it's not 0.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
The idea that society is required to forget crime is pretty toxic honestly.
Society does a poor job of assessing the degree of crime. It's too binary for people: You're either a criminal or not. There are too many employers who would look at a 40 year old sitting in front of them applying for a job, search his criminal record, find he stole a candy bar when he was 15, and declare him to be "a criminal" ineligible for employment.
No, public doesn't mean access should be limited to academics of acceptable political alignment, it means open to the public: everybody.
That is the entire point of having courts, since the time of Hammurabi. Otherwise it's back to the clan system, where justice is made by avenging blood.
Making and using any "profiles" of people is an entirely different thing than having court rulings accessible to the public.
I believe it would be more accurate to say: "I believe in free speech but only from accredited researchers. Oh btw the government can also make laws to control such accreditation"
We should remember that local journalism has been dead for a decade in most of the UK, largely due to social media.
Any tool like this that can help important stories be told, by improving journalist access to data and making the process more efficient, must be a good thing.
I think the right balance is to air gap a database and allow access to the public by your standard: show up somewhere with a USB.
I think it's right to prevent random drive-by scraping by bots/AI/scammers. But it shouldn't inhibit citizens who want to use it to do their civic duties.
>We should give up with the idea of databases which are 'open' to the public, but you have to pay to access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
No. Open is open. Beyond DDoS protections, there should be no limits.
If load on the server is a concern, make the whole database available as a torrent. People who run scrapers tend to prefer that anyway.
This isn't someone's hobby project run from a $5 VPS - they can afford to serve 10k qps of readonly data if needed, and it would cost far less than the salary of 1 staff member.
Pedantically: rate limiting is DoS prevention, not DDoS prevention. If you rate limit per IP, you're not mounting effective protection against a distributed attack. If you're rate limiting globally, you're taking your service offline for everyone.
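The per-IP point can be illustrated with a sketch of the standard token-bucket approach (the class and parameters here are illustrative, not any particular server's implementation): it throttles one abusive client, but a distributed attack where thousands of IPs each stay under the limit sails straight through.

```python
import time
from collections import defaultdict


class PerIpRateLimiter:
    """Per-IP token bucket.

    Each IP gets its own bucket that refills at `rate` tokens/second
    up to `burst`. Effective against a single greedy client, but not
    against a distributed attack: every attacking IP has a fresh bucket.
    """

    def __init__(self, rate: float, burst: int):
        self.rate = rate
        self.burst = burst
        # bucket state per IP: (tokens remaining, last refill time)
        self.buckets = defaultdict(lambda: (float(burst), time.monotonic()))

    def allow(self, ip: str) -> bool:
        tokens, last = self.buckets[ip]
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at burst
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        if tokens >= 1:
            self.buckets[ip] = (tokens - 1, now)
            return True
        self.buckets[ip] = (tokens, now)
        return False
```

A global bucket would be one shared state instead of a `defaultdict`, which is exactly the "taking your service offline for everyone" failure mode: the attack consumes all the tokens and legitimate users get refused too.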
You're talking about a tragedy of the commons situation. There is an organic query rate of this based on the amount of public interest. Then there is the inorganic vacuuming of the entire dataset by someone who wants to exploit public services for private profit. There is zero reason why the public should socialize the cost of serving the excess capacity caused by private parties looking to profit from the public data.
I could have my mind changed if the public policy is that any public data ingested into an AI system makes that AI system permanently free to use at any degree of load. If a company thinks that they should be able to put any load they want on public services for free, they should be willing to provide public services at any load for free.
The issue with that is that people can then flood everything with huge piles of documents. That's bad enough when it's all clean, OCR'd digital data you can quickly download in its entirety, but if you're stuck waiting between document downloads, you'll never find out what they don't want you to find out.
It's like having you search through sand, it's bad enough while you can use a sift, but then they tell you that you can only use your bare hands, and your search efforts are made useless.
This is not a new tactic btw and pretty relevant to recent events...
Systems running core government functions should be set up to be able to efficiently execute their functions at scale, so I'd say it should only restrict extreme load, ie DoS attacks
If the rate limit is reasonable (allows full download of the entire set of data within a feasible time-frame), that could be acceptable. Otherwise, no.
> Something is either public record - in which case it should be on a government website for free, and the AI companies should be free to scrape to their hearts desire... Or it should be sealed for X years and then public record.
OR humans should be allowed to access the public record for free, with fees charged to scrapers
Spoken like someone who's never spent thousands of dollars and literal years struggling to get online records corrected to reflect an expungement. Fuck anything that makes that process even more difficult which AI companies certainly will.
I don't know what the particular issue is in this case but I've read about what happens with Freedom of Information (FOI) requests in England: apparently most of the requests are from male journalists/writers looking for salacious details of sex crimes against women, and the authorities are constantly using the mental health of family members as an argument for refusing to disclose material. Obviously there are also a few journalists using the FOI system to investigate serious political matters such as human rights and one wouldn't want those serious investigations to be hampered but there is a big problem with (what most people would call) abuse of the system. There _might_ perhaps be a similar issue with this court reporting database.
England has a genuinely independent judiciary. Judges and court staff do not usually attempt to hide from journalists stuff that journalists ought to be investigating. On the other hand, if it's something like an inquest into the death of a well-known person which would only attract the worst kind of journalist they sometimes do quite a good job of scheduling the "public" hearing in such a way that only family members find out about it in time.
A world government could perhaps make lots of legal records public while making it illegal for journalists to use that material for entertainment purpose but we don't have a world government: if the authorities in one country were to provide easy access to all the details of every rape and murder in that country then so-called "tech" companies in another country would use that data for entertainment purposes. I'm not sure what to do about that, apart, obviously, from establishing a world government (which arguably we need anyway in order to handle pollution and other things that are a "tragedy of the commons" but I don't see it happening any time soon).
I should clarify that I was talking about the FOI requests submitted to a particular authority: I think it was the National Archives or some subsection thereof. If you're talking about all FOI requests submitted to all authorities then probably most of them don't relate in any way to criminal cases. I think we don't really need precise numbers to observe that public access to judicial data can be abused, which is all I wanted to say, really. I wrote too many words.
One of the problems with open access to these government DBs is that it gives out a lot of information that spammers and scammers use.
Eg if you create a business then that email address/phone number is going to get phished and spammed to hell and back again. It's all because the government makes that info freely accessible online. You could be a one man self-employed business and the moment you register you get inundated with spam.
The idea that an individual being able to look up any case they want is the same thing as a bot being able to scrape and archive an entire dataset forever is just silly.
One individual could spend their entire life going through one by one recording cases and never get through the whole dataset. A bot farm could sift through it in an hour. They are not the same thing.
I don't think all information should be easily accessible.
Some information should be in libraries, held for the public to access, but have that access recorded.
If a group of people (citizens of a country) have data stored, they ought to be able to access it, but others maybe should pay a fee.
There is data in "public records" that should be very hard to access, such as evidence of a court case involving the abuse of minors that really shouldn't be public, but we also need to ensure that secrets are not kept to protect wrongdoing by those in government or in power.
Totally agreed! This is yet another example of reduced friction due to improved technology breaking a previously functional system without really changing the qualities it had before. I don't understand why this isn't obvious to more people. It's been said that "quantity has a quality all its own", and this is even more true when that quantity approaches infinity.
Yes, license plates are public, and yes, a police officer could have watched to see whether or not a suspect vehicle went past. No, that does not mean that it's the same thing to put up ALPRs and monitor the travel activity of every car in the country. Yes, court records should be public, no, that doesn't mean an automatic process is the same as a human process.
I don't want to just default to the idea that the way society was organized when I was a young person is the way it should be organized forever, but the capacity for access and analysis when various laws were passed and rights were agreed upon are completely different from the capacity for access and analysis with a computer.
Yep, these are commonly sealed records and having worked with family law lawyers there are things that happened to the victims that should never be unsealed.
Even outside of family law there are many justifiable reasons for sealing and even expunging (deletion) of records. I’m a believer that under the correct circumstances criminal records should be sealed & even in some cases expunged as well. People deserve a second chance.
Family law is just the most obvious and unarguable example.
Confused what to make of the comments here. Access to court lists has always been free and open, it's just a pain in the ass to work with. The lists contain nothing much of value beyond the names involved in a case and the type of hearing.
It's not easy to see who to believe. The MP introducing it claiming there is a "cover up" is just what MPs do. Of course it makes him look bad, a service he oversaw the introduction of is being withdrawn. The rebuttal by the company essentially denies everything. Simultaneously it's important to notice the government are working on a replacement system of their own.
I think this is a non-event. If you really want to read the court lists you already can, and without paying a company for the privilege. It sounds like HMCTS want to internalise this behaviour by providing a better centralised service themselves, and meanwhile all the fuss appears to be from a company operated by an ex-newspaper editor who just had its only income stream built around preferential access to court data cut off.
As for the openness of court data itself, wake me in another 800 years when present day norms have permeated the courts. Complaining about this aspect just shows a misunderstanding of the (arguably necessary) realities of the legal system.
I think you're underestimating how important "just a pain in the ass to work with" may be.
An analogy would be Hansard and theyworkforyou.com. The government always made Hansard (record of parliamentary debates) available. But theyworkforyou cleaned the data, and made it searchable with useful APIs so you could find how your MP voted. This work was very important for making parliament accessible; IIRC, the guys behind it were impressive enough that they eventually were brought in to improve gov.uk.
> “We are also working on providing a new licensing arrangement which will allow third parties to apply to use our data. We will provide more information on this in the coming weeks.
Seems quite absurd that they would shut down the only system that could tell journalists what was actually happening in the criminal courts under the pretext that they sent information to a third-party AI company (who doesn’t these days). Here’s a rebuttal by one of the founders i believe: https://endaleahy.substack.com/p/what-the-minister-said
Absolutely fucking crazy that you typed this out as a legitimate defense of allowing extremely sensitive personal information to be scraped.
> only system that could tell journalists what was actually happening in the criminal courts
Who cares? Journalism is a dead profession and the people who have inherited the title only care about how they can mislead the public in order to maximize profit to themselves. Famously, "journalists" drove a world-renowned musician to his death by overdose with their self-interest-motivated coverage of his trial[1]. It seems to me that cutting the media circus out of criminal trials would actually be massively beneficial to society, not detrimental.
Absolutely fucking crazy that you call accurately describing the reality of AI scraping "absolutely fucking crazy" while at the same time going "who cares?" on attacks against journalism and free speech.
>Oh no, some musician died, PASS THE NATIONAL SECURITY ACT, LOCK DOWN ALL INFORMATION ABOUT CRIMINALS, JAIL JOURNALISTS!!!!
The government provided data to a private company. The private company resold access to a third party for AI ingestion. It's a plain case of tough titties to the private company.
That said I don't know why the hell the service concerned isn't provided by the government itself.
Perhaps that is true, but the response linked by GP claims exactly the opposite:
"We hired a specialist firm to build, in a secure sandbox, a safety tool for journalists. They are experts in building privacy-preserving AI solutions - for people like law firms or anyone deeply concerned with how data is held, processed, and protected. That’s why we chose them. Their founders are not only respected academics in addition to being professionals, they have passed government security clearance and DBS checks in the past, and have worked on data systems for the National Archives, the Treasury, and other public agencies. They’ve published academic papers on data compliance for machine learning.
"The Minister says we ‘shared data with an AI company”... as if we were pouring this critically sensitive information into OpenAI or some evil aggregator of data. This is simply ridiculous when you look at what we do and how we did it.
"We didn’t “share” data with them. We hired them as our technical contractor to build a secure sandbox to test an idea, like any company using a cloud provider or an email service. They worked under a formal sub-processor agreement, which means under data protection law they’re not even classified as a “third party.” That’s not our interpretation. It’s the legal definition in the UK GDPR itself. ...
"And “for commercial purposes”? The opposite is true. We paid them £45,000 a year. They didn’t pay us a penny. The money flowed from us to them. They were prohibited, in writing, from selling, sharing, licensing, or doing anything at all with the data other than providing the service we hired them for.. and they operated under our supervision at all times. They didn’t care what was in the data - we reviewed, with journalists, the outputs to make sure it worked."
If this is true, it does seem that the government has mischaracterized what happened.
It sounds very reasonable. But it's also directly contradicted by the government information about this case, which was very specific even about the number of breaches:
> Our understanding is that some 700 individual cases, at least, were shared with the AI company. We have sought to understand what more may have been shared and who else may have been put at risk, but the mere fact that the agreement was breached in that way is incredibly serious.
> ... the original agreement that was reached between Courtsdesk and the previous Government made it clear that there should not be further sharing of the data with additional parties. It is one thing to share the data with accredited journalists who are subject to their own codes and who are expected to adhere to reporting restrictions, but Courtsdesk breached that agreement by sharing the information with an AI company.
When there is a risk of feeding sensitive data to the AI giants the first reaction should be to pull the plug. I'm impressed the government acted quickly and decisively for once. Maybe the company involved will think twice before entering an agreement with an AI company. Notice in the whole rant it is never mentioned which AI giant they were feeding.
But when the conspiracy involves lack of prosecution or inconsistent sentencing at scale, and then the Ministry of Justice issues a blanket order to delete one of the best resources for looking into those claims...? That significantly increases the legitimacy of the claims.
I assumed it was the usual conspiracy stuff up until this order.
They raise the interesting point that "publicly available" doesn't necessarily mean its free to store/process etc:
> One important distinction is that “publicly available” does not automatically mean “free to collect, combine, republish and retain indefinitely” in a searchable archive. Court lists and registers can include personal data, and compliance concerns often turn on how that information is processed at scale: who can access it, how long it is kept, whether it is shared onward, and what safeguards exist to reduce the risk of harm, especially in sensitive matters.
I can’t believe that this even needs to be said. There are plenty of things which are publicly available but not free to share, and definitely not allowed to be monetized.
The company in question had a direct relationship with HM Courts & Tribunals Service, and disputes that they sold/distributed any data to any 'AI third party' - says what they actually did was to hire AI-focussed contractors to build some new tool/feature for the platform.
The counter claim by the government is that this isn't "the source of truth" being deleted but rather a subset presented more accessibly by a third party (CourtsDesk) which has allegedly breached privacy rules and the service agreement by passing sensitive info to an AI service.
Coverage of the "urgent question" in parliament on the subject here:
House of Commons, Courtsdesk Data Platform Urgent Question
They made the data accessible though. From what I can gather, before them, the data was only accessible via old windows apps. If the source of truth is locked and gated, what good is it?
You can get the daily court listings for free online. I would assume this service just provided a ux-friendly front end,etc. which in the clip the minister says they will be replacing.
This is odd; this is supposed to be public information, isn't it? I suspect it's run into bureaucratic empire-defending rather than a nefarious scheme to conceal cases.
Relatedly, there's an extremely good online archive of important past cases, but because it disallows all crawlers in its robots.txt (https://www.bailii.org/robots.txt), not many people know about it. Personally I would prefer it if all reporting on legal cases linked to the official transcript, but seemingly none of the parties involved finds it in their interest to make that work.
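The mechanism is simple: a compliant search-engine crawler checks robots.txt before fetching, and a page disallowed there never enters the index, which is why the archive stays obscure. A sketch using Python's stdlib `urllib.robotparser` (the robots.txt content below is illustrative of a blanket disallow, not a verbatim copy of bailii's file):

```python
from urllib import robotparser

# Illustrative robots.txt that disallows all crawlers everywhere,
# parsed from a string rather than fetched over the network.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler calls can_fetch() before requesting a page;
# under a blanket "Disallow: /" this returns False for every URL,
# so nothing from the site gets indexed.
allowed = rp.can_fetch("MyCrawler/1.0", "https://www.bailii.org/ew/cases/")
```

Note that robots.txt is purely advisory: it keeps compliant search engines out, but a scraper that ignores it (as several AI crawlers have been accused of doing upthread) faces no technical barrier.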
It is public information. The private company whose database is being deleted processed the public information into an easier to search and consume dataset
This kind of logic does more disservice than people realize. You can combat bigotry towards immigrants (issue #1), without covering up for criminal immigrants (issue #2) in fear of increase of issue #1 among the natives. It only brings up more resentment and bigotry.
You can also insinuate that decisions completely unrelated to immigrants (issue #3) are a coverup to "protect immigrants" in order to use the popularity of bigotry towards immigrants (issue #1) to make the issue salient to bigots that have literally no interest in the rights or wrongs of third party court databases, which anyone with the slightest level of political understanding can see is going on here.
Crimes are committed by individuals. "Immigrants" is a group.
Prosecution of sexual assault is often handled extremely badly. It needs to be done better, without fear or favor, including people who are friends with the police or in positions of power. As we're seeing the fallout of the Epstein files.
> Crimes are committed by individuals. "Immigrants" is a group.
Great. How does it change the substance of my comment?
Perhaps, instead of arguing about whether “immigrants” is always a group as a collective, or a certain number of individuals acting together, you would focus on the high level implications of government’s action or inaction?
What do Epstein files have to do with anything right now? Stop shifting the goal posts.
- Trying to claim things are just a series of isolated incidents with absolutely nothing in common
- Claiming there are wider problems (that should be addressed in a manner that would take years and isn't even defined well enough to claim measure as being "better")
Actually that’s what data and the preponderance of victims allege: an intersection of immigration and policing which interlocked to systematically deprioritize the investigation into abuse of working-class white girls by an over represented ethnic group.
In the local data that the audit examined from three police forces, they identified clear evidence of “over-representation among suspects of Asian and Pakistani-heritage men”.
It’s unfortunate to watch people and entire countries twist themselves into logic pretzels to avoid ever suggesting that immigration has any ills, and we’re just being polite about it here.
This database exposed half a million weekend cases which were heard with zero press notification. Many grooming gang trials were heard this way. The database is being deleted weeks before the national inquiry into the grooming gang cover up begins, and the official reason for deleting the data is nonsensical.
Can you explain your reasoning? Horrible crimes committed by foreign men against native children were covered up for political reasons in the UK. This is common knowledge.
In the case of Rotherham, I believe that most of those committing the crimes were not foreign; most were British born British nationals. Ethnicity is not the same as immigration status.
The question is: Would the crimes have been covered up by authorities if the predators were ethnically English and the victims were children of foreigners?
Anyone who's actually paid any attention to the many documented failings of the child protection services in those cases knows the answer to that question is "yes".
The problem with that argument is that, IIRC, there is direct evidence that one reason the abuse was covered up was that authorities were afraid of being accused of racism and/or of stirring up ethnic tensions. I don't think that, to accept this, you need to believe that CPS is always perfect when this issue is absent.
This is one of the factors that led to crimes not being investigated better at the time, but then fear of tensions would hardly apply to cases where the alleged perpetrators were white and the victims/witnesses non-white, as in the OP's question. (And more generally, teenage girls from working-class backgrounds got far less sympathy and far more scepticism than they should have done from the police and CPS and even social workers when they talked about being sexually exploited, regardless of race. Certainly no evidence was found that they were much more keen to listen to non-white victims or prosecute white people...)
> ... the agreement restricts the supply of court data to news agencies and journalists only.
> However, a cursory review of the Courtsdesk website indicates that this same data is also being supplied to other third parties — including members of @InvestigatorsUK — who pay a fee for access.
> Those users could, in turn, deploy the information in live or prospective legal proceedings, something the agreement expressly prohibits.
This aligns with the explanation given in response to the urgent question. What looks like a simple breach-of-contract issue is being weaponized in bad faith by politicians who spend far too much time on the shadier parts of the internet.
They believe that they exist to control us. And let's be honest, British people are a meek bunch who have done little to disabuse them of that notion, at times positively encouraging our own subjugation.
In other countries, interference with the right to a fair trial would have led to widespread protest. We don't hold our government to account, and we reap the consequences of that.
I think there is a legitimate argument that the names of people who go to court and are either victims or are found innocent of the charges, should not be trivially searchable by anyone.
Though I'm not sure stopping this service achieves that.
Also - even in the case that somebody is found guilty - there is a fundamental principle that such convictions have a lifetime, after which they stop showing up on police searches etc.
If some third party ( not applicable in this case ), holds all court cases forever in a searchable format, it fundamentally breaches this right to be forgotten.
This presumably also falls under the Data (Use and Access) Act 2025 which forbids this kind of citizen data being relayed to third parties without permission. The company don't have a leg to stand on here, which is why it is basing its public appeals now on the impact to its users (journalists). But no company has a right to flout data protection regulations or its agreed conditions of use without serious consequences. Since the data has already been passed on, the breach itself can't be fixed, so it is totally proportionate to order the service to be closed and its data deleted. Frankly, fuck companies with the arrogance to behave this way - cheating agreements and responsibilities in order to make more money, and then expecting indulgence because of the uniqueness of their service.
The political answer is that open justice provides ammunition for their political opponents, and that juries also tend to dislike prosecutions that feel targeted against political opponents. See Palestine Action as a left-wing example and Jamie Michael's racial hatred trial as a right-wing example.
Obviously the government Ministry of Justice cannot make other parts of government more popular in a way that appeases political opponents, so the logical solution is to clamp down on open justice.
FYI, apparently there was a data breach, but it would seem better to fix the issue and continue with this public service than to shut it down completely. Here is the journalists' organisation in the UK responding:
"The government has cited a significant data protection breach as the reason for its decision - an issue it clearly has a duty to take seriously."
Along with the attempt to prevent jury trials for all but the most serious criminal cases, this is beginning to look like an attempt to prevent reporting on an upcoming case. I can think of one happening in April, involving the prime minister. Given he was head of the CPS for 5 years, he would know exactly which levers to pull.
Why do you think "they" are trying to suppress reporting on a Russian-recruited Ukrainian national carrying out arson attacks against properties the PM is "linked to" but does not live in? What's the supposed angle?
And how exactly is eliminating a third party search tool for efficiently searching lots of obscure magistrates court proceedings going to stop journalists from paying attention to a spicy court case linked to foreign agents and the PM?
5 Ukrainians. People have traced what some of them were doing professionally when the PM would've been living there. It could be nothing, but we need transparency.
Courtsdesk are rather misrepresenting this situation.
Quoting from an urgent question in the House of Lords a few days ago:
> HMCTS was working to expand and improve the service by creating a new data licence agreement with Courtsdesk and others to expand access to justice. It was in the course of making that arrangement with Courtsdesk that data protection issues came to light. What has arisen is that this private company has been sharing private, personal and legally sensitive information with a third-party AI company, including potentially the addresses and dates of birth of defendants and victims. That is a direct breach of our agreement with Courtsdesk, which the Conservatives negotiated.
Digital access to UK court records was already abysmal. And we're somehow going even further backwards. At least in the US you have initiatives like https://www.courtlistener.com/.
My guess is that the company running this were found to be collaborating with contentious partners, and so the government is shutting down the collaboration as risk-mitigation, in order to internalize decisions within government.
Ministry of Justice in UK has always struck me as very savvy, from my work in the UK civic tech scene. They're quite self-aware, and I assume this is more pro-social than it might seem.
In many cases government texts are not covered by copyright, so it may not even be relevant here, regardless of whether copying the data is allowed or not.
In the UK government records are generally covered by Crown Copyright (which is its own slightly more restrictive weird thing) rather than in the public domain. I haven't checked to see what the status of the court listings are, but the default is very different to the US.
Hard to see how this isn't totally asinine.
I haven't confirmed it, but I've also seen journalist friends of mine complain that the deletion period for much data is about 7 years, so victims are genuinely shocked that they can't access their own transcripts even from 2018... Oh, and if they can, they're often extremely expensive.
I've looked into the Courtsdesk service. It's a stream of events from the courts, as they happen. They claim up to 12,000 updates in a 24-hour period, aggregated, filtered and organised. While court judgements are public, I don't know if the information Courtsdesk provides is. This is a worrying direction.
If you don't "know about them from another source" you can't effectively find/access the information and you might not even know that there is something you really should know about.
The service bridged the gap by providing a feed about what is potentially relevant for you depending on your filters etc.
This means that with the change:
- a lot of research/statistics become impossible to do/create
- journalists are prone to only learning about potentially very relevant cases when they are already over and no one was there to cover them
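The kind of filtering such a feed enables is simple to sketch. Here's a minimal, hypothetical version in Python; the field names ("court", "summary") and the filter shape are assumptions, not the actual Courtsdesk schema:

```python
# Hypothetical sketch of a keyword/court-filtered live listings feed.
# The event fields and filter structure are illustrative assumptions.

def matches(event: dict, filters: dict) -> bool:
    """True if a listing event passes the subscriber's saved filters."""
    court_ok = not filters.get("courts") or event["court"] in filters["courts"]
    keyword_ok = not filters.get("keywords") or any(
        kw.lower() in event["summary"].lower() for kw in filters["keywords"]
    )
    return court_ok and keyword_ok

def filtered_feed(events: list, filters: dict) -> list:
    """Keep only the events a subscriber asked to see; empty filters pass all."""
    return [e for e in events if matches(e, filters)]
```

The value of the service wasn't the filtering logic itself, which is trivial, but having a single structured stream to run it against.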
I kept digging and reached the service https://www.courtserve.net. Seems like an old-school Windows application that receives the data, but I need more time to explore. They've been working with the MoJ for 20 years (their claim).
Initially I thought they had people at the courts live reporting, but that's a bit of a stretch...
These decisions will only cease when the people involved in the upkeep of these systems start saying no. These problems all happen through our acquiescence.
> HMCTS acted to protect sensitive data after CourtsDesk sent information to a third-party AI company.
(statement from the UK Ministry of Justice on Twitter; CourtsDesk had run the database)
but it's unclear how much this was an excuse to remove transparency and how much it actually reflects genuine worry about how AI could misuse this information
Shutting down the only working database is the proof point that perfect is the enemy of good.
Of course, this gives cover to the "well, we're working on something better" and "AI companies are evil"
Fine, shut it down AFTER that better thing is finally in prod (as if).
And if the data was already leaked because you didn't do your due diligence and didn't have a good contract, that's on you. It's out there now anyway.
And, really, what's the issue with AI finally having good citations? Are we really going to try to pretend that AI isn't now permanently embedded in the legal system and that lawyers won't use AI to write extremely formulaic filings?
This is either bureaucracy doing what it does, or an actual conspiracy to shut down external access to public court records. It doesn't actually matter which: what matters is that this needs to be overturned immediately.
The subject of the article is the public right to access the daily schedule of England's courts.
In the United States our Constitution requires the government conduct all trials in public (with the exception of some Family Court matters). Secret trials are forbidden. This is a critical element to the operation of any democracy.
this is mostly not about historic data but about live data
Though the way AI companies could severely/negligently mishandle the data, potentially repeatedly destroying innocent people's lives, is about historic data.
It's not just the historical data - they provided what is effectively a live stream of events from the courts in real time, allowing you to aggregate and filter it. This is not trivial.
That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
They are only available for vulnerable sectors; you can't ask for one as a convenience store owner vetting a cashier. But if you are employing child care workers in a daycare, you can get them.
This approach balances the need for public safety against the ex-con's need to integrate back into society.
[1] https://rcmp.ca/en/criminal-records/criminal-record-checks/v...
Thanks, but I don't want to have violent people working as taxi drivers, pdf files in childcare and fraudsters in the banking system. Especially if somebody decided to not take this information into account.
Good conduct certificates are there for a reason -- you ask the faceless bureaucrat to give you one for the narrow purpose and it's a binary result that you bring back to the employer.
I also think someone who has suffered a false accusation of that magnitude and fought to be exonerated shouldn’t be forced to suffer further.
This made me pause. It seems to me that if something is not meant to inform decision making, then why does a record of it need to persist?
Sometimes people are unfairly ostracized for their past, but I think a policy of deleting records will do more harm than good.
Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
The only way to have a system like that is to keep records, permanently, but decision making is limited.
Should it though? You can buy a piece of real estate without living there, e.g. because it's a rental property, or maybe the school is announced to be shutting down even though it hasn't yet. And in general this should have nothing to do with the bank; why should they care that somebody wants to buy a house they're not allowed to be in?
Stop trying to get corporations to be the police. They're stupendously bad at it and it deprives people of the recourse they would have if the government was making the same mistake directly.
I'm not sure we can write that much more COBOL.
Couldn't they just point to the court system's computer showing zero convictions? Since it records guilty verdicts, showing none is already proof there are none.
At some point, personal information becomes history, and we stop caring about protecting the owner's privacy. The only thing we can disagree on is how long that takes.
As if "character" was some kind of immutable attribute you are born with.
The UK does not have a statute of limitations
I'd go further and say a lot of charges and convictions shouldn't be a matter of public record that everyone can look up in the first place, at least not with a trivial index. File the court judgement and other documentation under a case number, and ban reindexing by third parties (AI scrapers, "background check" services) entirely. That way, anyone interested can still go and review court judgements for glaring issues, but a "pissed on a patrol car" conviction won't hinder that person's employment prospects forever.
In Germany for example, we have something called the Führungszeugnis - a certificate by the government showing that you haven't been convicted of a crime that warranted more than three months of imprisonment or the equivalent in monthly earning as a financial fine. Most employers don't even request that, only employers in security-sensitive environments, public service or anything to do with children (the latter get a certificate also including a bunch of sex pest crimes in the query).
The appellate court records would contain information from the trial court records, but most of the identifying information of the parties could be redacted.
So anyone who is interested in determining if a specific behavior runs afoul of the law has to not just read through the law itself (which, "thanks" to being a centuries-old tradition, is very hard to read) but also wade through court cases from, in the worst case (very old laws dating to before the founding of the US), two countries.
Frankly, that system is braindead. It worked back when it was designed as the body of law was very small - but today it's infeasible for any single human without the aid of sophisticated research tools.
The premise here is, during an investigation, a suspect might have priors, might have digital evidence, might have edge connections to the case. Use the platform and AI to find them, if they exist.
What it doesn’t do: “Check this video and see if this person is breaking the law”.
What it does do: "Analyze this person's photos and track their movements, see if they intersect with Suspect B, or if Suspect B shows up in any photos or video."
It does a lot more than that but you get the idea…
The interpretation of the law is up to the courts. The enforcement of it is up to the executive. The concept of the law is up to Congress. That’s how this is supposed to work.
You're conflating two distinct issues - access to information, and making bad decisions based on that information. Blocking access to the information is the wrong way to deal with this problem.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in ways that are harmful. Historically, just having to go in person to a courthouse and request to view records was enough to keep most people from abusing the public information they had. It's perfectly valid to say that we want information accessible, but not accessible over the internet or in AI datasets. What do you think the "right way" to deal with the problem is? Because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
If all you care about is preventing the information from being abused, preventing it from being used is a great option. This has significant negative side effects though. For court cases it means a lack of accountability for the justice system, excessive speculation in the court of public opinion, social stigma and innuendo, and the use of inappropriate proxies in lieu of good data.
The fact that the access speedbump which supposedly worked in the past is no longer good enough is proof that an access speedbump is not a good way to do it. Let's say we block internet access but keep in person records access in place. What's to stop Google or anyone else from hiring a person to go visit the brick and mortar repositories to get the data exactly the same way they sent cars to map all the streets? And why are we making the assumption that AI training on this data is a net social ill? While we can certainly imagine abuses, it's not hard to imagine real benefits.
> What do you think the "right way" to deal with the problem is? Because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
We've been dealing with people making bad decisions from data forever. As an example, there was redlining, where institutions would refuse to sell homes or guarantee loans for minorities. Sometimes they would use computer models which didn't track skin color but had some proxy for it. At the end of the day you can't stop this problem by trying to hide what race people are. You need to explicitly ban that behavior. And we did. Institutions that attempt it are vulnerable to both investigation by government agencies and liability to civil suit from their victims. It's not perfect, there are still abuses, but it's so much better than if we all just closed our eyes and pretended that if the data were harder to get the discrimination wouldn't happen.
If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them. If an AI rejects my loan application, you better be able to prove that the AI isn't doing so based on my skin color. If you can do that, you should also be able to prove it's not doing so based off an expunged record. If evidence comes out that the AI has been using such data to come to such decisions, those who made it and those who employ it should be liable for damages, and depending on factors like intent, adherence to best practices, and severity potentially face criminal prosecution. Basically AI should be treated exactly the same as a human using the same data to come to the same conclusion.
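The auditability point can be made concrete with a toy example. This sketch (the decision functions, field names, and threshold are hypothetical, not any real lender's model) flags any decision procedure whose output changes when only the disputed attribute changes:

```python
# Toy audit: feed the model pairs of applications identical except
# for the disputed attribute, and flag it if any decision flips.
# All names and values here are hypothetical stand-ins.

def audit(decide, applications, attribute, values):
    """Return the applications whose outcome depends on `attribute`."""
    flagged = []
    for app in applications:
        outcomes = {decide({**app, attribute: v}) for v in values}
        if len(outcomes) > 1:  # decision changed when only `attribute` changed
            flagged.append(app)
    return flagged

# A biased toy model: rejects anyone with an expunged record.
def biased_decide(app):
    return "approve" if app["income"] > 30000 and not app["expunged_record"] else "reject"

# A fair toy model: ignores the expunged record entirely.
def fair_decide(app):
    return "approve" if app["income"] > 30000 else "reject"
```

Real audits are much harder (proxies, correlated features), but the black-box principle is the same: the regulator needs query access to the model, not just its marketing material.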
That's an assertion, but what's your reasoning?
> This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
All across the EU, that information would be available immediately to journalists under exemptions for the Processing of Personal Data Solely for Journalistic Purposes, but would be simultaneously unlawful for any AI company to process for any other purposes (unless they had another legal basis like a Government contract).
Alternatively consider that you are assuming the worst behavior of the public and the best behavior of the government if you support this and it should be obvious the dangerous position this creates.
Instead, we should make it illegal to discriminate based on criminal conviction history. Just like it is currently illegal to discriminate based on race or religion. That data should not be illegal to know, but illegal to use to make most decisions relating to that person.
Then there's cases like Japan, where not only companies, but also landlords, will make people answer a question like: "have you ever been part of an anti-social organization or committed a crime?" If you don't answer truthfully, that is a legal reason to reject you. If you answer truthfully, then you will never get a job (or housing) again.
Of course, there is a whole world outside of the United States and Japan. But these are the two countries I have experience dealing with.
The question a people must ask themselves: we are a nation of laws, but are we a nation of justice?
Absolutely not. I'm not saying every crime should disqualify you from every job, but convictions are really a government-officialized account of your behavior. Knowing a person has trouble controlling their impulses, leading to aggravated assault or something, very much tells you they won't be good for certain roles. As a business you are liable for what your employees do; it's in both your interests and your customers' interests not to create dangerous situations.
However, there's also jobs which legally require enhanced vetting checks.
I think the solution there is to restrict access and limit application to only what's relevant to the job. If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
Welcome to the world of certificates of the good conduct and criminal record extracts:
- https://www.justis.nl/en/products/certificate-of-conduct
- https://diia.gov.ua/services/vityag-pro-nesudimist
Or, you might just be doing the meme: https://x.com/MillennialWoes/status/1893134391322308918?s=20
Exactly. One option is for the person themselves to be able to ask for a LIMITED copy of their criminal history, which is otherwise kept private, but no one else.
This way it remains private; HR cannot force the applicant to provide a detailed copy of their criminal history and discriminate based on it. They can only get a generic document from the court via Mr Doe that says, "Mr Doe is currently eligible to be employed as a financial advisor" or "Mr Doe is currently ineligible to be employed as a school teacher".
Ideally it should also be encrypted by the company's public key and then digitally signed by the court. This way, if it gets leaked, there's no way to prove its authenticity to a third party without at least outing the company as the source.
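As a sketch of that encrypt-then-sign envelope — with stdlib HMAC standing in for a real court signature and a toy XOR cipher standing in for public-key encryption; a real system would use something like Ed25519 plus RSA-OAEP or HPKE, and the key handling here is purely illustrative:

```python
import base64
import hashlib
import hmac
import json
import secrets

# Hypothetical sketch of the "sealed certificate" idea above.
# HMAC stands in for the court's digital signature; the XOR "cipher"
# stands in for encryption to the company's public key. NOT secure.

COURT_SIGNING_KEY = secrets.token_bytes(32)  # held only by the court
COMPANY_KEY = secrets.token_bytes(32)        # stands in for the company's key pair

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def issue_certificate(person: str, role: str, eligible: bool) -> dict:
    """Court encrypts the verdict to the company, then signs the ciphertext."""
    payload = json.dumps({"person": person, "role": role, "eligible": eligible}).encode()
    ciphertext = xor_encrypt(payload, COMPANY_KEY)
    signature = hmac.new(COURT_SIGNING_KEY, ciphertext, hashlib.sha256).digest()
    return {"ciphertext": base64.b64encode(ciphertext).decode(),
            "signature": base64.b64encode(signature).decode()}

def company_verify_and_open(cert: dict) -> dict:
    """Company checks the court's signature, then decrypts the verdict."""
    ciphertext = base64.b64decode(cert["ciphertext"])
    signature = base64.b64decode(cert["signature"])
    expected = hmac.new(COURT_SIGNING_KEY, ciphertext, hashlib.sha256).digest()
    assert hmac.compare_digest(signature, expected), "not signed by the court"
    return json.loads(xor_encrypt(ciphertext, COMPANY_KEY))
```

The ordering matters: because the signature covers the ciphertext, a leaked certificate proves nothing to outsiders who can't decrypt it, which is exactly the property the comment above is after.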
Coincidentally these same countries tend to have a much much lower recidivism rate than other countries.
And only released if it's in the public interest. I'd be very very strict here.
I'm a bit weird here though. I basically think the criminal justice system is very harsh.
Except when it comes to driving. With driving, at least in America, our laws are a joke. You can have multiple at-fault accidents and keep your license.
DUI, keep your license.
Run into someone because watching football is more important than operating a giant vehicle? Whatever, you might get a ticket.
I'd be quick to strip licenses over accidents and if you drive without a license and hit someone it's mandatory jail time. No exceptions.
By far the most dangerous thing in most American cities is driving. One clown on fan duel while he should be focusing on driving can instantly ruin dozens of lives.
But we treat driving as this sacred right. Why are car immobilizers even a thing?
No, you can not safely operate a vehicle. Go buy a bike.
But the Internet's memory means that something being public at time t1 means it will also be public at all times after t1.
To me this brings in another question, when the discussion should be focused on to what extent general records should be open.
Is the position that everyone who experienced that coverage, wrote about it in any forum, or attended, must wipe all trace of it clean, for “reasons”? The defendant has sole ownership of public facts? Really!? Would the ends of justice have been better served by sealed records and a closed courtroom? Would have been a very different event.
Courts are accustomed to balancing interests, but since the public usually is not a direct participant they get short shrift. Judges may find it inconvenient to be scrutinized, but that’s the ultimate and only true source of their legitimacy in a democratic system.
If a conviction is something minor enough that might be expungable, it should be private until that time comes. If the convicted person hasn't met the conditions for expungement, make it part of the public record, otherwise delete all history of it.
Sometimes you can't prove B was more qualified, but you can always claim some BS like "B was a better fit for our company culture".
Do you regard the justice system as a method of rehabilitating offenders and returning them to try to be productive members of society, or do you consider it to be a system for punishment? If the latter, is it Just for society to punish somebody for the rest of their life for a crime, even if the criminal justice considers them safe to release into society?
Is there anything but a negative consequence for allowing a spent conviction to limit people's ability to work, or to own/rent a home? We have carve-outs for sensitive positions (e.g. working with children/vulnerable adults)
Consider what you would do in that position if you had genuinely turned a corner but were denied access to jobs you're qualified for?
Sure there is still some leeway between only letting a judge decide the punishment and full on mob rule, but it's not a slippery slope fallacy when the slope is actually slippy.
It's fairly easy to abuse the leeway to discriminate to exclude political dissidents for instance.
Good luck proving it when it happens. We haven't even managed to stop discrimination based on race and religion, and that problem has only gotten worse as HR departments started using AI which conveniently acts as a shield to protect them.
If it's not supposed to be public then don't publish it. If it's supposed to be public then stop trying to restrict it.
But the courts are allowed to do it conditionally, so a common condition if you ask for a lot of cases is to condition it to redact any PII before making the data searchable. Having the effect that people that actually care and know what to look for, can find information. But you can't randomly just search for someone and see what you get.
There is also a second registry separate from the courts that used to keep track of people that have been convicted during the last n years that is used for backgrounds checks etc.
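A minimal sketch of that kind of pre-indexing redaction pass — the patterns below are illustrative only (a UK-ish date of birth, postcode, and National Insurance-style number), nowhere near production-grade, and redacting names reliably is a much harder problem:

```python
import re

# Hypothetical redaction pass run on a case record before it is made
# bulk-searchable. Patterns are illustrative assumptions, not a real
# court system's rules.

PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DOB REDACTED]"),
    (re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}\b"), "[POSTCODE REDACTED]"),
    (re.compile(r"\b[A-Z]{2}\d{6}[A-Z]\b"), "[NI NO REDACTED]"),
]

def redact(record: str) -> str:
    """Replace each matched span of PII with a labelled placeholder."""
    for pattern, replacement in PATTERNS:
        record = pattern.sub(replacement, record)
    return record
```

The point of the design, as described above, is that the substance of the ruling stays searchable while the person stops being the search key.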
I robbed a drug dealer some odd 15 years ago while strung out. No excuses, but I paid my debt (4-11 years in state max, did min) yet I still feel like I have this weight I can’t shake.
I have worked for almost the whole time, am no longer on parole or probation. Paid all fines. I honestly felt terrible for what I did.
At the time I had a promising career and a secret clearance. I still work in tech as a 1099 making way less than I should. But that is life.
What does a background check matter when the first 20 links on Google are about me committing a robbery with a gun?
Edit: mine is an extreme and violent case. But I humbly believe, to my benefit surely, that once I paid all debts it should be done. That is what the 8+ years of parole/probation/counseling was for.
Additionally, do you want a special class of privileged people, like a priestly class, who can interpret the data/bible for the peasantry? That mentality seems the same as that behind the old Latin bibles and Latin mass that people were abused to attend, even though they had no idea what was being said.
So who would you bequeath the privileges of doing “research”? Only the true believers who believe what you believe, so you wouldn’t have to be confronted with contradictions?
And how would you prevent data exfiltration? Would you have your authorized “researchers” maybe go to a building, we can call it the Ministry of Truth, where they would have access to the information through telescreen terminals like how the DOJ is controlling the Epstein Files and then monitoring what the Congressmen were searching for? Think we would have discovered all we have discovered if only the Epstein people that rule the country had access to the Epstein files?
Yes, convictions are permanent records of one’s offenses against society, especially the egregious offenses we call felonies in the USA.
Should I, as someone looking for a CFO or just an accountant, not have the right to know that someone was convicted of financial crimes, which is usually long preceded by other transgressions and things like “mistakes” everyone knows weren’t mistakes? How would any professional association limit certification if that information is not accessible? So Madoff should have been able to get out and continue being involved in finance and investments?
Please explain
But why shouldn't a 19 year old shoplifter have that on their public record? Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?
great moral system you have there
[1] https://en.wikipedia.org/wiki/Disclosure_and_Barring_Service
Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Unfortunately reputational damage is part of the punishment (I have a criminal record), but maybe it's moronic to create a class of people who can avoid meaningful punishment for crimes?
This - nearly all drug deliveries in my town are done by 15-year-olds on overpowered electric bikes. Same with most shoplifting. The real criminals just recruit the schoolchildren to do the work because they know schoolchildren rarely get punished.
At a certain point, we say someone is an adult and fully responsible for their actions, because “that’s who they are”.
It’s not entirely nuanced—and in the US, at least, we charge children as adults all the time—but it’s understandable.
At a certain point, poorly-thought-out "protections" turn into a system that protects organized crime, because criminals aren't as stupid as lawmakers, and exploit the system.
There is a big difference between making a mistake as a kid that lands you in trouble, and working as an underling for organized crime to commit robberies, drug deals, and violent crime without having to face responsibility for your actions.
The legal system has so many loopholes for youth, for certain groups, that the law is no longer fair, and that is its own problem, contributing to the decline of public trust.
Have you ever considered that these children are victims of organized crime? That they aren't capable of understanding the consequences of what they're doing and that they're being manipulated by adult criminals?
The problem here isn't the lack of long term consequences for kids.
Just because exceptions are exploitable, doesn't mean we should just scrap all the exceptions. Why not improve the wording and try to work around the exceptions?
Protect victims and criminals. Protect victims from the harm done to them by criminals, but also protect criminals from excessive, or, as one might say, cruel and unusual punishment. Just because someone has a criminal record doesn't mean that anything that is done to them is fair game. Society can, and should, decide on an appropriate extent of punishment, and not exceed that.
Yes
If something is expungable it probably shouldn't be public record. Otherwise it should be open and scrapable and ingested by both search engines and AI.
Is this the UK thing where PII is part of the released dataset? I know that Ukrainian rulings are all public, but the PII is redacted, so you can train your AI on largely anonymized rulings.
I think it should also be against GDPR to process sensitive PII like health records and criminal convictions without consent, but once it hits the public record, it's free to use.
On the other hand, perpetrating crime is a GREAT predictor of perpetrating more crime -- in general most crime is perpetrated by past perps. Why should this info not be available to help others avoid troublemakers?
https://bjs.ojp.gov/library/publications/returning-prison-0
https://www.prisonpolicy.org/graphs/sex_offense_recidivism_2...
https://usafacts.org/articles/how-common-is-it-for-released-...
https://pmc.ncbi.nlm.nih.gov/articles/PMC3969807/
https://ciceroinstitute.org/research/the-case-for-incarcerat...
https://bjs.ojp.gov/topics/recidivism-and-reentry
I think this is wrong; it should remain fully reportable for at least 5 years after the fact.
The idea that society is required to forget crime is pretty toxic honestly.
That is the entire point of having courts, since the time of Hammurabi. Otherwise it's back to the clan system, where justice is done by blood vengeance.
Making and using any "profiles" of people is an entirely different thing than having court rulings accessible to the public.
It's not about any post-case information.
Any tool like this that can help important stories be told, by improving journalist access to data and making the process more efficient, must be a good thing.
I think it's right to prevent random drive-by scraping by bots/AI/scammers. But it shouldn't inhibit consumers who want to use it to do their civic duties.
How about rate limited?
If load on the server is a concern, make the whole database available as a torrent. People who run scrapers tend to prefer that anyway.
This isn't someone's hobby project run from a $5 VPS - they can afford to serve 10k qps of readonly data if needed, and it would cost far less than the salary of 1 staff member.
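For what it's worth, "rate limited but open" is usually just a per-client token bucket; a minimal sketch (all names hypothetical, with an injectable clock so it can be tested deterministically):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: `rate` requests/sec, bursts up to `capacity`."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.clock = clock
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Keep one bucket per API key (or IP) and the database stays scrapeable without anyone being able to hammer it.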
I’d then ask OpenAI to be open too since open is open.
I could have my mind changed if the public policy is that any public data ingested into an AI system makes that AI system permanently free to use at any degree of load. If a company thinks that they should be able to put any load they want on public services for free, they should be willing to provide public services at any load for free.
It's like being made to search through sand: bad enough when you can use a sieve, but then they tell you that you can only use your bare hands, and your search efforts become useless.
This is not a new tactic btw and pretty relevant to recent events...
OR it should be allowed for humans to access the public record but charge fees for scrapers
England has a genuinely independent judiciary. Judges and court staff do not usually attempt to hide from journalists stuff that journalists ought to be investigating. On the other hand, if it's something like an inquest into the death of a well-known person which would only attract the worst kind of journalist they sometimes do quite a good job of scheduling the "public" hearing in such a way that only family members find out about it in time.
A world government could perhaps make lots of legal records public while making it illegal for journalists to use that material for entertainment purposes, but we don't have a world government: if the authorities in one country were to provide easy access to all the details of every rape and murder in that country, then so-called "tech" companies in another country would use that data for entertainment purposes. I'm not sure what to do about that, apart, obviously, from establishing a world government (which arguably we need anyway in order to handle pollution and other "tragedy of the commons" problems, but I don't see it happening any time soon).
E.g. if you create a business, then that email address/phone number is going to get phished and spammed to hell and back. It's all because the government makes that info freely accessible online. You could be a one-man self-employed business and the moment you register you get inundated with spam.
They have ability to seal documents until set dates and deal with digital archival and retrieval.
I suspect some of this is it's a complete shit show and they want to bury it quickly or avoid having to pay up for an expensive vendor migration.
Why? They generate massive traffic, why should they get access for free?
One individual could spend their entire life going through one by one recording cases and never get through the whole dataset. A bot farm could sift through it in an hour. They are not the same thing.
I don't think all information should be easily accessible.
Some information should be in libraries, held for the public to access, but have that access recorded.
If a group of people (citizens of a country) have data stored, they ought to be able to access it, but others maybe should pay a fee.
There is data in "public records" that should be very hard to access, such as evidence of a court case involving the abuse of minors that really shouldn't be public, but we also need to ensure that secrets are not kept to protect wrongdoing by those in government or in power.
Yes, license plates are public, and yes, a police officer could have watched to see whether or not a suspect vehicle went past. No, that does not mean that it's the same thing to put up ALPRs and monitor the travel activity of every car in the country. Yes, court records should be public, no, that doesn't mean an automatic process is the same as a human process.
I don't want to just default to the idea that the way society was organized when I was a young person is the way it should be organized forever, but the capacity for access and analysis when various laws were passed and rights were agreed upon are completely different from the capacity for access and analysis with a computer.
What about family law?
Family law is just the most obvious and unarguable example.
It's not easy to see who to believe. The MP introducing it claims there is a "cover up", but that is just what MPs do. Of course it makes him look bad: a service he oversaw the introduction of is being withdrawn. The rebuttal by the company essentially denies everything. Simultaneously, it's important to notice that the government are working on a replacement system of their own.
I think this is a non-event. If you really want to read the court lists you already can, and without paying a company for the privilege. It sounds like HMCTS want to internalise this behaviour by providing a better centralised service themselves, and meanwhile all the fuss appears to be from a company operated by an ex-newspaper editor who just had its only income stream built around preferential access to court data cut off.
As for the openness of court data itself, wake me in another 800 years when present day norms have permeated the courts. Complaining about this aspect just shows a misunderstanding of the (arguably necessary) realities of the legal system.
An analogy would be Hansard and theyworkforyou.com. The government always made Hansard (record of parliamentary debates) available. But theyworkforyou cleaned the data, and made it searchable with useful APIs so you could find how your MP voted. This work was very important for making parliament accessible; IIRC, the guys behind it were impressive enough that they eventually were brought in to improve gov.uk.
> “We are also working on providing a new licensing arrangement which will allow third parties to apply to use our data. We will provide more information on this in the coming weeks.
Absolutely fucking crazy that you typed this out as a legitimate defense of allowing extremely sensitive personal information to be scraped.
> only system that could tell journalists what was actually happening in the criminal courts
Who cares? Journalism is a dead profession and the people who have inherited the title only care about how they can mislead the public in order to maximize profit to themselves. Famously, "journalists" drove a world-renowned musician to his death by overdose with their self-interest-motivated coverage of his trial[1]. It seems to me that cutting the media circus out of criminal trials would actually be massively beneficial to society, not detrimental.
[1] https://www.huffpost.com/entry/one-of-the-most-shameful_b_61...
>Oh no, some musician died, PASS THE NATIONAL SECURITY ACT, LOCK DOWN ALL INFORMATION ABOUT CRIMINALS, JAIL JOURNALISTS!!!!
If it is public, it will be scraped, AI companies are irrelevant here.
If information is truly sensitive, do not make it public, and that's completely fine. This might have been the case here.
That said I don't know why the hell the service concerned isn't provided by the government itself.
"We hired a specialist firm to build, in a secure sandbox, a safety tool for journalists. They are experts in building privacy-preserving AI solutions - for people like law firms or anyone deeply concerned with how data is held, processed, and protected. That’s why we chose them. Their founders are not only respected academics in addition to being professionals, they have passed government security clearance and DBS checks in the past, and have worked on data systems for the National Archives, the Treasury, and other public agencies. They’ve published academic papers on data compliance for machine learning.
"The Minister says we 'shared data with an AI company'... as if we were pouring this critically sensitive information into OpenAI or some evil aggregator of data. This is simply ridiculous when you look at what we do and how we did it.
"We didn’t “share” data with them. We hired them as our technical contractor to build a secure sandbox to test an idea, like any company using a cloud provider or an email service. They worked under a formal sub-processor agreement, which means under data protection law they’re not even classified as a “third party.” That’s not our interpretation. It’s the legal definition in the UK GDPR itself. ... "And “for commercial purposes”? The opposite is true. We paid them £45,000 a year. They didn’t pay us a penny. The money flowed from us to them. They were prohibited, in writing, from selling, sharing, licensing, or doing anything at all with the data other than providing the service we hired them for... and they operated under our supervision at all times. They didn’t care what was in the data - we reviewed, with journalists, the outputs to make sure it worked."
If this is true, it does seem that the government has mischaracterized what happened.
> Our understanding is that some 700 individual cases, at least, were shared with the AI company. We have sought to understand what more may have been shared and who else may have been put at risk, but the mere fact that the agreement was breached in that way is incredibly serious.
> ... the original agreement that was reached between Courtsdesk and the previous Government made it clear that there should not be further sharing of the data with additional parties. It is one thing to share the data with accredited journalists who are subject to their own codes and who are expected to adhere to reporting restrictions, but Courtsdesk breached that agreement by sharing the information with an AI company.
(from https://hansard.parliament.uk/Commons/2026-02-10/debates/037...)
But when the conspiracy involves lack of prosecution or inconsistent sentencing at scale and then the Ministry of Justice issues a blanket order to delete one of the best resources to look into those claims...? Significantly increases the legitimacy of the claims.
I assumed it was the usual conspiracy stuff up until this order.
https://www.tremark.co.uk/moj-orders-deletion-of-courtsdesk-...
They raise the interesting point that "publicly available" doesn't necessarily mean it's free to store/process etc.:
> One important distinction is that “publicly available” does not automatically mean “free to collect, combine, republish and retain indefinitely” in a searchable archive. Court lists and registers can include personal data, and compliance concerns often turn on how that information is processed at scale: who can access it, how long it is kept, whether it is shared onward, and what safeguards exist to reduce the risk of harm, especially in sensitive matters.
The counter claim by the government is that this isn't "the source of truth" being deleted but rather a subset presented more accessibly by a third party (CourtsDesk) which has allegedly breached privacy rules and the service agreement by passing sensitive info to an AI service.
Coverage of the "urgent question" in parliament on the subject here:
House of Commons, Courtsdesk Data Platform Urgent Question
https://www.bbc.co.uk/iplayer/episode/m002rg00
Relatedly, there's an extremely good online archive of important past cases, but because it disallows crawlers in robots.txt (https://www.bailii.org/robots.txt), not many people know about it. Personally I would prefer it if all reporting on legal cases linked to the official transcript, but seemingly none of the parties involved finds it in their interest to make that work.
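For anyone curious how a blanket disallow plays out in practice: Python's stdlib can evaluate robots.txt rules directly. The rules below are illustrative (a site that blocks all crawlers), not BAILII's literal file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative blanket-disallow rules, similar in effect to a site
# that blocks every crawler (not BAILII's actual robots.txt).
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks before fetching any URL:
print(rp.can_fetch("MyBot", "https://example.org/ew/cases/"))  # prints False
```

Which is exactly why a well-behaved search engine never indexes such an archive, and people never stumble onto it.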
https://x.com/CPhilpOfficial/status/2021295301017923762
https://xcancel.com/CPhilpOfficial/status/202129530101792376...
This kind of logic does more disservice than people realize. You can combat bigotry towards immigrants (issue #1) without covering up for criminal immigrants (issue #2) out of fear of increasing issue #1 among the natives. Covering up only breeds more resentment and bigotry.
That’s why the government should be transparent.
Prosecution of sexual assault is often handled extremely badly. It needs to be done better, without fear or favor, including for people who are friends with the police or in positions of power, as we're seeing with the fallout from the Epstein files.
Great. How does it change the substance of my comment?
Perhaps, instead of arguing about whether “immigrants” is always a group as a collective, or a certain number of individuals acting together, you would focus on the high level implications of government’s action or inaction?
What do Epstein files have to do with anything right now? Stop shifting the goal posts.
- Policing language to distract from the topic.
- Trying to claim things are just a series of isolated incidents with absolutely nothing in common
- Claiming there are wider problems (that should be addressed in a manner that would take years and isn't even defined well enough to claim measure as being "better")
In the local data that the audit examined from three police forces, they identified clear evidence of “over-representation among suspects of Asian and Pakistani-heritage men”.
It’s unfortunate to watch people and entire countries twist themselves into logic pretzels to avoid ever suggesting that immigration has any ills, and we’re just being polite about it here.
https://www.aljazeera.com/news/2025/6/17/what-is-the-casey-r...
https://celina101.substack.com/p/the-uks-rape-gang-inquiry
Whilst we're on Rotherham:
"...by men predominantly of Pakistani heritage" [0]
https://www.bbc.com/news/uk-england-south-yorkshire-61868863
Their parents or grandparents were immigrants...
So it would seem that you're the one straying from the topic.
> ... the agreement restricts the supply of court data to news agencies and journalists only.
> However, a cursory review of the Courtsdesk website indicates that this same data is also being supplied to other third parties — including members of @InvestigatorsUK — who pay a fee for access.
> Those users could, in turn, deploy the information in live or prospective legal proceedings, something the agreement expressly prohibits.
https://hansard.parliament.uk/Commons/2026-02-10/debates/037...
Then they start jailing people for posts.
Then they get rid of juries.
Then they get rid of public records.
What are they trying to hide?
In other countries, interference with the right to a fair trial would have led to widespread protest. We don't hold our government to account, and we reap the consequences of that.
Though I'm not sure stopping this service achieves that.
Also, even in the case that somebody is found guilty, there is a fundamental principle that such convictions have a lifetime, after which they stop showing up on police searches etc.
If some third party (not applicable in this case) holds all court cases forever in a searchable format, it fundamentally breaches this right to be forgotten.
Obviously the government Ministry of Justice cannot make other parts of government more popular in a way that appeases political opponents, so the logical solution is to clamp down on open justice.
"The government has cited a significant data protection breach as the reason for its decision - an issue it clearly has a duty to take seriously."
https://www.nuj.org.uk/resource/nuj-responds-to-order-for-th...
ETA: They didn't ship data off to e.g. ChatGPT. They hired a subcontractor to build them a secure AI service.
Details in this comment:
https://news.ycombinator.com/item?id=47035141
leading to this:
https://endaleahy.substack.com/p/what-the-minister-said
The government is behaving disgracefully.
They don't have a budget for that. And besides, it might be an externalized service, because self hosting is so 90s.
https://www.bbc.co.uk/news/articles/c20dyzp4r42o
Traced what? Innuendo is not a substitute for information.
Quoting from an urgent question in the House of Lords a few days ago:
> HMCTS was working to expand and improve the service by creating a new data licence agreement with Courtsdesk and others to expand access to justice. It was in the course of making that arrangement with Courtsdesk that data protection issues came to light. What has arisen is that this private company has been sharing private, personal and legally sensitive information with a third-party AI company, including potentially the addresses and dates of birth of defendants and victims. That is a direct breach of our agreement with Courtsdesk, which the Conservatives negotiated.
https://hansard.parliament.uk/Commons/2026-02-10/debates/037...
Ministry of Justice in UK has always struck me as very savvy, from my work in the UK civic tech scene. They're quite self-aware, and I assume this is more pro-social than it might seem.
you _really_ shouldn't be allowed to train on information without having a copyright license explicitly allowing it
"publicly available" isn't the same as "anyone can do whatever they want with it", just anyone can read it/use it for research
I haven't confirmed it, but journalist friends of mine have also complained that the retention period for much data is about 7 years, so victims are genuinely shocked that they can't access their own transcripts, even from 2018... And when they can, they're often extremely expensive.
If you don't "know about them from another source" you can't effectively find/access the information and you might not even know that there is something you really should know about.
The service bridged the gap by providing a feed about what is potentially relevant for you depending on your filters etc.
This means that with the change:
- a lot of research/statistics become impossible to do
- journalists are prone to learning about potentially very relevant cases only when they are already over and no one was there to cover them
> HMCTS acted to protect sensitive data after CourtsDesk sent information to a third-party AI company.
(statement from the UK Ministry of Justice on Twitter; CourtsDesk had run the database)
but it's unclear how much this was an excuse to remove transparency and how much it actually reflects worry about how AI could misuse this information
Shutting down the only working database is the proof point that perfect is the enemy of good.
Of course, this gives cover to the "well, we're working on something better" and "AI companies are evil"
Fine, shut it down AFTER that better thing is finally in prod (as if).
And if the data was already leaked because you didn't do your due diligence and didn't have a good contract, that's on you. It's out there now anyway.
And, really, what's the issue with AI finally having good citations? Are we really going to try to pretend that AI isn't now permanently embedded in the legal system and that lawyers won't use AI to write extremely formulaic filings?
This is either bureaucracy doing what it does, or an actual conspiracy to shut down external access to public court records. It doesn't actually matter which: what matters is that this needs to be overturned immediately.
In the United States our Constitution requires the government conduct all trials in public ( with the exception of some Family Court matters ). Secret trials are forbidden. This is a critical element to the operation of any democracy.
Though the concern that AI companies could "severely/negligently mishandle the data, potentially repeatedly destroying innocent people's lives" is about historic data.
This is why having decent standards in politics and opposition matters so much.
We all got together to vote out the last wankers, only to find that the current lot are of the same quality but in different ways.
And to think... the 'heads up their arses while trying to suppress a Hitler salute' brigade (Reform) are waiting in the wings to prove my point.