
I'm trying to imagine a more "real-world" example of this to see how I feel about it. I dislike that there is yet another loophole to gain access to people's data for legal reasons, but this does feel like a reasonable approach and a valid goal to pursue.

I guess it's like if someone noticed you had a case shaped exactly like a machine gun, told the police, and they went to check if it was registered or not? I suppose that seems perfectly reasonable, but I'm happy to hear counter-arguments.



The main factual components are as follows: Party A has rented out property to Party B. Party A performs surveillance on or around the property with Party B's knowledge and consent. Party A discovers very high probability evidence that Party B is committing crimes within the property, and then informs the police of their findings. Police obtain a warrant, using Party A's statements as evidence.

The closest "real world" analogy that comes to mind might be a real estate management company uses security cameras or some other method to determine that there is a crime occurring in a space that they are renting out to another party. The real estate management company then sends evidence to the police.

In the case of real property -- rental housing and warehouse/storage space in particular -- this happens all the time. I think that this ruling is eminently reasonable as a piece of case law (i.e., the judge got the law as it exists correct). I also think this precedent strikes a healthy policy balance (i.e., the law as it exists, interpreted the way the judge in this case interprets it, produces a good policy outcome).


Is there any such thing as this surveillance applying to the inside of the renter's bedroom, bathroom, or filing cabinet with medical or financial documents, or political ones for that matter?

I don't think there is, and I don't think you can reduce reality to something as simple as "owner has more rights over the property than the renter." The renter absolutely has at least a few rights, in at least a few defined contexts, over the owner, because the owner "consented" to accept money in trade for use of the property.


> Is there any such thing as this surveillance applying to the inside of the renter's bedroom, bathroom, or filing cabinet with medical or financial documents, or political ones for that matter?

Yes. Entering property for regular maintenance. Any time a landlord or his agent enters a piece of property, there is implicit surveillance. Some places are more formal about this than others, but anyone who has rented, owned rental property, or managed rental property knows that any time maintenance occurs there's an implicit examination of the premises also happening...

But here is a more pertinent example: the regular comings and goings of people or property can be and often are observed from outside of a property. These can contribute to probable cause for a search of those premises even without direct observation. (E.g., large numbers of disheveled children moving through an apartment, or an exterior camera shot of a known fugitive entering the property.)

Here the police could obtain a warrant on the basis of the landlord's testimony without the landlord actually seeing the inside of the unit. This is somewhat similar to the case at hand, since Google alerted the police to a hash match without actually looking at the image (i.e., entering the bedroom).

> I don't think you can reduce reality to something as simple as "owner has more rights over the property than the renter"

But I make no such reduction, and neither does the opinion. In fact, quite the opposite -- this is part of why the court determines a warrant is required!


> ...Google alerted the police to a hash match without actually looking at the image (i.e., entering the bedroom).

Google cannot have calculated that hash without examining the data in the image. They, or systems under their control, obviously looked at the image.
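To make that concrete, here is a minimal sketch of what "calculating the hash" involves -- my own illustration, not Google's actual pipeline (which reportedly uses proprietary, perceptual-style matching), and the reference list below is a placeholder:

    import hashlib

    # Hypothetical reference list of known-bad SHA-256 digests
    # (placeholder value; real providers match against curated databases).
    KNOWN_BAD_HASHES = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def scan_stored_file(path: str) -> bool:
        """Return True if the file's hash matches a known-bad entry.

        Computing the digest requires reading every byte of the file --
        the "machine eyes" do examine the data, even if no human sees it.
        """
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                digest.update(chunk)
        return digest.hexdigest() in KNOWN_BAD_HASHES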

It should not legally matter whether the eyes are meat or machine... if anything, machine inspection should be MORE strictly regulated, because of how much easier and cheaper it tends to make surveillance (mass or otherwise).


> It should not legally matter whether the eyes are meat or machine

But it does matter, and, perhaps ironically, it matters in a way that gives you STRONGER (not weaker) fourth amendment rights. That's the entire TL;DR of the fine article.

If the court accepted this sentence of yours in isolation, then the court would have determined that no warrant was necessary in any case.

> if anything, machine inspection should be MORE strictly regulated, because of how much easier and cheaper it tends to make surveillance (mass or otherwise).

I don't disagree. In particular: I believe that the "Reasonable Person", to the extent that we remain stuck with the fiction, should be understood as having stronger privacy expectations in their phone or cloud account than they do even in their own bedroom or bathroom.

With respect to Google's actions in this case, this is an issue for your legislator and not the courts. The fourth amendment does not bind Google's hands in any way, and judges are not lawmakers.


> Yes. Entering property for regular maintenance.

In every state that I've lived in they must give advance notice (except for emergencies). They can't just show up and do a surprise check.


Only in residential properties, typically. There are also states that have no such requirement even on residential rentals.

In any case, I think it's a bit of a red herring and that the "regular comings and goings" case is more analogous.

But also, at this point in the thread, we have reached the point where analogy stops being helpful and the actual thing has to be analyzed.


The point of the analogy is that the contents of one's files should be considered analogous to the contents of one's mind.

Whatever reasons we had in the past for deciding that financial or health data, or conversations with attorneys, or bathrooms and bedrooms, are private, those reasons should apply to one's documents, which include one's files.

Or at least if not, we should figure out and be able to show exactly how and why not with some argument that actually holds water.

Only after that does it make any sense to either defend or object to this development.


Fair enough.


If I import hundreds of pounds of poached ivory and store it in a shipping yard or move it to a long term storage unit, the owner and operator of those properties are allowed to notify police of suspected illegal activities and unlock the storage locker if there is a warrant produced.

Maybe the warrant uses some abstraction of the contents of that storage locker like the shipping manifest or customs declaration. Maybe someone saw a shadow of an elephant tusk or rhino horn as I was closing the locker door.


Pretty much all rental storage, shipping container, 3rd party semi trailer pool, safe deposit box type services and business agreements stipulate that the user of the arbitrary box gets to deny the owner of the arbitrary box access so long as they're holding up their end of the deal. The point is that the user is wholly responsible for the security of the contents of the arbitrary box and the owner bears no liability for the contents. This is why (well run) rental storage places make you use your own lock and if you don't pay they add an additional lock rather than removing yours.


I don't think that argument supports the better analogy of breaking into a computer or filing cabinet owned by someone renting the space. Just because someone is renting space doesn't give you the right to do whatever you want to them. Cameras in bathrooms of a rented space would be another example.


But he wasn’t running a computer in a rented space, he was using storage space on google’s computers.

In an older comment I argued against using analogies to rationalize this. I honestly think it is possible to evaluate the goodness or badness of the decision at face value.


> In an older comment I argued against using analogies to rationalize this. I honestly think it is possible to evaluate the goodness or badness of the decision at face value.

I generally do agree that analogies became anti-useful in this thread relatively quickly.

However, I am not sure that avoiding analogies is actually possible for the courts. I mean, they can try, but at some point analogies are unavoidable because most of the case law -- and, hell, the fourth amendment itself -- is written in terms of the non-digital world. Judges are forced to reason by analogy, because legal arguments will be advanced in terms of precedent that is inherently physical.

So there is value in hashing out the analogies, even if at some point they become tenuous, primarily because demonstrating the breaking points of the analogies is step zero in deviating from case law.


Yes, that is why I presented an alternative to the analogy of "import hundreds of pounds of poached ivory and store it in a shipping yard or move it to a long term storage unit".

Just as we have the right to avoid being videoed in the bathroom, we have the right to avoid unreasonable search of our files by the authorities, whether they are stored locally or in the cloud.


Wait until you hear about third party doctrine.

I have this weird experience where people that get all their legal news from tech websites have really pointed views about fourth amendment jurisprudence and patent law.


The issue of course being the government then pressuring or requiring these companies to look for some sort of content as part of routine operations.


I agree. This is a case where the physical analogy leads us to (imo) the correct conclusion: compelling major property management companies to perform regular searches of their tenants' properties, and then to report any findings to the police, is hopefully something that most judges understand to be a clear violation of the fourth amendment.


> The issue of course being the government then pressuring or requiring these companies to look for some sort of content as part of routine operations.

Was that the case here?


Not requiring, but certainly pressure. See https://www.nytimes.com/2013/12/09/technology/tech-giants-is... for example. Also all of the heat Apple took over rolling back its perceptual hashing.


> Party A discovers very high probability evidence that Party B is committing crimes within the property ...

This isn't accurate: the hashes were purposefully compared to a specific list. They didn't happen to notice it, they looked specifically for it.

And of course, what happens when it's a different list?


>> Party A discovers very high probability evidence that Party B is committing crimes within the property ...

> This isn't accurate: the hashes were purposefully compared to a specific list. They didn't happen to notice it, they looked specifically for it.

1. I don't understand how the text that comes on the right side of the colon substantiates the claim on the left side of the colon... I said "discovers", without mention of how it's discovered.

2. The specificity of the search cuts in exactly the opposite direction than you suggest; specificity makes the search far less invasive -- BUT, at the same time, the "everywhere and always" nature of the search makes it more invasive. The problem is the pervasiveness, not the specificity. See https://news.ycombinator.com/user?id=aiforecastthway

> And of course, what happens when it's a different list?

The fact that the search is targeted, that the search is highly specific, and that the conduct is plainly criminal are all, in fact, highly material. The decision here is not relevant to most of the "worst case scenarios" or even "bad scenarios" in your head, because prior assumptions would have been violated before this moment in the legal evaluation.

But with respect to your actual argument here... it's really a moot point. If the executive branch starts compelling companies to help them discover political enemies on basis of non-criminal activity, then the court's opinions will have exactly as much force as the army that court proves capable of raising, because such an executive would likely have no respect for the rule of law in any case...

It is reasonable for legislators to draft laws on a certain assumption of good faith, and for courts to interpret law on a certain assumption of good faith, because without that good faith the law is nothing more than a sequence of forceless ink blotches on paper anyways.


I don't think that changes anything. I think it's entirely reasonable for Party A to be actively watching the rented property to see if crimes are being committed, either by the renter (Party B) or by someone else.

The difference I do see, however, is that many places do have laws that restrict this sort of surveillance. If we're talking about an apartment building, a landlord can put cameras in common areas of the building, but cannot put cameras inside individual units. And with the exception of emergencies, many places require that a landlord give tenants some amount of notice before entering their unit.

So if Google is checking user images against known CSAM image hashes, are those user images sitting out in the common areas, or are they in an individual tenant's unit? I think it should be obvious that it's the latter, not the former.

Maybe this is more like a company that rents out storage units. Do storage companies generally have the right to enter their customers' storage units whenever they want, without notice or notification? Many storage companies allow customers to put their own locks on their units, so even if they have the right to enter whenever they want, in practice they certainly don't do so regularly.

But like all analogies, this one is going to have flaws. Even if we can't match it up with a real-world example, maybe there's still no inconsistency or problem here. Google's ToS says they can and will do this sort of scanning, users agree to it, and there's no law saying Google can't do that sort of thing. Google itself has no obligation to preserve users' 4th Amendment rights; they passed along evidence to the police. I do think the police should be required to obtain a warrant before gaining access to the underlying data; the judge agrees on this, but the police get away with it in the original case due to the bullshit "good faith exception".


This is an excellent example, I think I get it now and I'm fully on-board. Thanks.

I could easily see an AirBNB owner calling the cops if they saw, for instance, child abuse happening on their property.


Ok. But that would also be an invasion of privacy. If the property you rented out was being used for trafficking and you don't want to be involved with trafficking, then the terms would have to first explicitly set what is not allowed. Then it would also have to explicitly mention what measures are taken to enforce it and what punishments are imposed for violations. It should also mention steps that are taken for compliance.

Without full documentation of compliance measures, enforcement measures, and punishments imposed, violations of the rule cannot involve law enforcement who are restricted to acting on searches with warrants.


> If the property you rented out was being used for trafficking and you don’t want to be involved with trafficking, then the terms would have to first explicitly set what is not allowed.

I don't believe that's the case. You don't need to state that illegal activities are not allowed; that's the default.

> Then it would also have to explicitly mention what measures are taken to enforce it

When Airbnb used to allow cameras indoors, they did -- after some backlash -- require hosts to disclose the presence of the cameras.

> ... and what punishments are imposed for violations.

No, I don't think that is or should be necessary. If you do illegal things, the possible punishments don't need to be enumerated by the person who reports you to the police.

Put another way: if I'm hosting someone on Airbnb in the case where I'm living in the same property, and I walk into the kitchen to see my Airbnb guest dealing drugs, I am well within my rights to call the police, without having ever said anything up-front to my guest about whether or not that's acceptable behavior, or what the consequences might be. Having the drug deal instead caught on camera is no different, though I would agree that the presence of the cameras should have to be disclosed beforehand.

In Google's case, the "camera" (aka CSAM scanning) appears to have been disclosed beforehand.


> You don't need to state that illegal activities are not allowed; that's the default

Technically you would have to say so, to be able to walk away from accusations of complicity.


>Without full documentation of compliance measures, enforcement measures, and punishments imposed, violations of the rule cannot involve law enforcement who are restricted to acting on searches with warrants.

That's not the only way police get information...


In the case of in-progress child abuse, that wouldn’t require a warrant as entry to prevent harm to a person is an exigent circumstance and falls under the Emergency Aid doctrine. If they found evidence or illegal items within plain view, that evidence would be permitted under the plain view doctrine. However, if they went and searched drawers or opened file cabinets, evidence discovered in that circumstance would not be allowed (opening a file cabinet isn’t required to solve the emergency aid situation typically.)

What’s really fascinating is that Child Protective Services acts as if they never need a warrant even if there is not an exigent circumstance. To my knowledge there hasn’t been a Supreme Court case challenging that, and the circuits are split. There's interesting reading about that if anyone is interested:

https://family.jotwell.com/ending-cps-home-searches-evasion-...

(The 4th Amendment is not limited to actual police BTW.)


With their hidden camera in the bathroom.


I just meant it as an analogy, not that I'm specifically on-board with AirBNB owners putting cameras in bathrooms.

Anyways, that's why I just rent hotel rooms, personally. :)


I think the real-world analogy would be to say that the case is shaped exactly like a machine gun and the hotel calls the police, who then open the case without a warrant. The "private search" doctrine allows the police to repeat a search done by a private party, but here (as in the machine gun case), the case was not actually searched by a private party.


But this court decision is a real world example, and not some esoteric edge case.

This is something I don’t think needs analogies to understand. SA/CP image and video distribution is an ongoing moderation, network, and storage issue. The right to not be under constant digital surveillance is somewhat protected in the constitution.

I like speech and privacy and am paranoid of corporate or government overreach, but I arrive at the same conclusion as you taking this court decision at face value.


Wait until Trump is in power and corporations are masterfully using these tools to “mow the grass” (if you want an existing example of this, look at Putin’s Russia, where people get jail time for any pro-Ukraine mentions on social media).


Yeah, I’m paranoid like I said, but in this case it seems like the hash of a file on Google’s remote storage was flagged as a potential match, and that was used as justification to request a warrant. That seems like common sense and did not involve employees snooping pre-warrant.

The Apple CSAM hash detection process, whose launch was rolled back, concerned me mainly because it ran on-device with no opt-out. If this is running on cloud storage then it sort of makes sense. You need to ensure you are not aiding or harboring actually harmful illegal material.

I get there are slippery slopes or whatever but the fact is you cannot just store whatever you wish in a rental. I don’t see this as opening mass regex surveillance of our communication channels. We have the patriot act to do that lol.


I think the better option is a system where the cloud provider cannot decrypt the files, and they’re not obligated to lift a finger to help the police because they have no knowledge of the content at all.
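A minimal sketch of that idea, assuming client-side symmetric encryption where the key never leaves the user's device (an illustration, not how Google Drive actually works today):

    from cryptography.fernet import Fernet  # pip install cryptography

    # The key is generated and kept on the user's device only; the provider
    # receives ciphertext it cannot decrypt or scan against any hash list.
    key = Fernet.generate_key()
    box = Fernet(key)

    plaintext = b"contents of a private file"
    ciphertext = box.encrypt(plaintext)  # this opaque blob is what gets uploaded

    # Later, back on the user's device:
    assert box.decrypt(ciphertext) == plaintext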


In my opinion, despite the technical merits of an algorithm, encryption is only as trustworthy as the computer that generates and holds the private key.

I would personally not knowingly use a cloud provider to commit a crime. It is a fairly naive take to assume that, because your browser uses HTTPS, data at rest and in processing isn’t somehow observable.

And I see where you’re coming from, but I am afraid that position severely overestimates the will of the US people to trade security for freedom/privacy, and the will of the legislature to hold citizens’ privacy in such high regard.


I only worry that, if renting becomes a roundabout way of granting more oversight ability to the government, then as home ownership rates decrease, government surveillance power increases.

Sure, it's facilitated through a third party (the owner), but the extrapolated pattern seems to be: "1. Only people in group B will have fewer rights, so people in group A shouldn't worry" followed closely by "2. Sorry, you've been priced out of group A."

In the case of renting, we end up in the situation where those who have enough wealth to own their own home are afforded extra privileges of privacy.

Now to bring this back to the cloud; the cynical part of me looks towards a future of cheap, cloud-only storage devices. Or an intermediate future of devices where cloud is first party and local storage is just enough of a hassle that people don't use it. And the result is that basically everyone now has the present day equivalent of local storage scanning.

If renting de-facto grants fewer rights, then in the future where "you'll own nothing and be happy", you'll also have no rights, and all the way people will say "as a renter, what did you expect?"


OK, I agree with you about setting a precedent that future storage will be scanned by default. Additionally, who will control the reference hash list, since making one necessitates hashing that illicit material?

I only hope the court system escalates it and manages to protect free speech, or against unreasonable search and seizure, or self-incrimination, or whatever, if the CSAM hash comparisons are used against political opponents or music piracy or tax evasion or whatever.

Good point.


> You need to ensure you are not aiding or harboring actually harmful illegal material.

Is this actually true, legally speaking?


I’m unsure; I wrote that from, like, an ethics standpoint. The Silk Road guy was got on conspiracy for attempted murder and not drug or human trafficking charges. So I’m unsure of the legal side.

I think if you knowingly provided a platform to distribute SA/CP/CSAM and the feds become involved you will be righteously fucked.

Reddit clamped down on the creepy *bait subreddits years ago. Maybe it was self-preservation on the business side or maybe it was forward looking about legal issues.

I’m not a lawyer; I was just mentioning things that I would follow for ethics, morals, and my sense of self-preservation.


I'm reasonably certain Reddit's decision to ban /r/jailbait and the like was driven by business/reputation. It was widely discussed for some time before it was banned and was, IIRC, given a "worst of" award by the admins at one point. Once it got major media coverage, Reddit got its first real content policy.


> The Silk Road guy was got on conspiracy for attempted murder and not drug or human trafficking charges

Actually, the murder stuff was not part of his sentencing or what they tried him for.

https://en.m.wikipedia.org/wiki/Ross_Ulbricht




> The only one sounding like Putin is Hillary Clinton and her numerous acolytes in the government. So sure, let’s talk more about Trump.

Except she's no longer running for office, last we checked.

> under the Biden/Harris regime,

Except it wasn't a "regime", and neither was Trump's administration. Both were democratically elected governments, whether you happen to like them or not.

Meanwhile, if you live and breathe according to emotionally manipulative language like this, then you already have a "regime" of sorts installed in your head.


It is worse. Trump will actually put people in concentration camps! Glenn Greenwald explains the issue here:

https://www.youtube.com/watch?v=8EjkstotxpE


It's like a digital 'smell'; Google is a drug sniffing dog.


I don't think the analogy holds for two reasons (which cut in opposite directions from the perspective of fourth amendment jurisprudence, fwiw).

First, the dragnet surveillance that Google performs is very different from the targeted surveillance that can be performed by a drug dog. Drug dogs are not used "everywhere and always"; rather, they are mostly used in situations where people have a less reasonable expectation of privacy than the expectation they have over their cloud storage accounts.

Second, the nature of the evidence is quite different. Drug-sniffing dogs are inscrutable and non-deterministic and transmit handler bias. Hashing algorithms can be interrogated and are deterministic and do not have such bias transferal issues; collisions do occur, but are rare, especially because the "search key" set is so minuscule relative to the space of possible hashes. The narrowness and precision of the hashing method preserves most of the privacy expectations that society is currently willing to recognize as objectively reasonable.
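Back-of-the-envelope, assuming an idealized 256-bit cryptographic hash (perceptual hashes, which are designed to survive re-encoding, have meaningfully higher false-positive rates), the chance that an innocent file matches any entry in even a very large reference list is negligible:

    HASH_BITS = 256
    N_KNOWN = 10**7  # assume ten million entries in the reference list

    # Union bound: P(a random file's hash hits any list entry) <= N / 2^bits
    p_false_match = N_KNOWN / 2**HASH_BITS
    print(f"{p_false_match:.1e}")  # ~8.6e-71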

Here we get directly to the heart of the problem with the fictitious "reasonable person" used in tests like the Katz test, especially in cases where societal norms and technology co-evolve at a pace far more rapid than that of the courts.


This analogy can have two opposite meanings. Drug dogs can be anything from a prop used by the police to search your car without a warrant (a cop can always say in court the dog "alerted" them) to a useful drug detection tool.


>yet another loophole

What's the new legal loophole? I believe what's described above is the same as it's been for decades, if not centuries.

Disclosure: I work at Google but not on anything related to this.



