Google Authenticator cloud sync: Google can see the secrets, even while stored (defcon.social)
408 points by Signez on April 26, 2023 | 141 comments


It's a tradeoff. They could let (or require) a password be entered to encrypt/decrypt it on each device, but then people would be ticked off when they forget their password and can't recover their 2FA stuff.

They should have handled it the same way they do Sync in chrome, and I expect they will eventually. But, as always, unless a service advertises that it's full E2EE and you can verify that, assume it's not.

One part of this that's funny to me:

> Also, 2FA QR codes typically contain other information such as account name and the name of the service (e.g. Twitter, Amazon, etc). Since Google can see all this data, it knows which online services you use, and could potentially use this information for personalized ads.

I guarantee you, Google knows which online services you use in about 800 other ways, it doesn't need to scrape it from your 2FA accounts.


I don't think Google will use those secrets to look into your other accounts, but they can be politely requested by some governments to divulge the secrets, and not tell you about it. Then those governments would have no problems looking into your other accounts. And I'm not talking only about the US government, other governments can have dubious standards for requesting user data, such as failure to parrot the "facts" approved by their ministry of truth (China, Russia, and everyone in their sphere).

Not to mention Google can be hacked.


> I don't think Google will use those secrets to look into your other accounts, but they can be politely requested by some governments to divulge the secrets, and not tell you about it

A lot of companies and institutions use GA for 2FA on their secure systems too. Google doesn't even need to share willingly: as soon as extracting the data is possible, it paints a target on itself. External attacks become extremely low-hanging fruit with that unencrypted traffic. And internal attacks, like getting an employee with access to the unencrypted data to provide it (knowingly or unknowingly), are relatively low effort and don't require the overhead of any legal proceedings or complex exploits. And if a request goes through official/legal channels, Google doesn't have one single shred of protection between that data and what authorities can (shockingly legally) ask for.

Having the data is a liability for Google. Transferring it insecurely is also a liability for the user.


I appreciate this. A lot of people think the only parties involved are you and the provider, but they forget about the governments that have power over those providers.


Exactly - one of the major reasons American Big Tech is so invasive in collecting our personal data today is that the US government has shown them how valuable it is. Project PRISM has evolved and is a huge success ...


The problem is that now they know your TOTP secrets, they are only one password away from pretending to be you.

And actually, they host your email, so the password is moot for most sites today.


They always knew your TOTP secrets. The algorithm requires both parties to know the plaintext secret as it’s an input to the HMAC. It’s not a public key operation and they can’t store it as a hashed representation.

It’s possible to have 2FA methods that are verify only (usually using public keys and signing), but TOTP is not one of them.
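To make the symmetry concrete, here's a minimal RFC 6238 TOTP sketch in Python (standard library only, function name my own). The verifier must run exactly the same computation as the authenticator app, which is why it has to hold the plaintext key:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, now=None, digits=6, step=30):
    """RFC 6238 TOTP. Both prover and verifier need the plaintext key:
    the secret is the HMAC key, so it cannot be stored as a hash."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if now is None else now) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59, digits=8))  # -> 94287082
```

Anyone who learns the base32 secret — the site, Google's backup servers, or an attacker — can generate valid codes forever.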


The website you log into with TOTP has always known the TOTP secret. Now, Google also knows your TOTP secret.


Ah! I misunderstood this being applied to non-Google accounts. Yes that’s scary.


The whole writeup seems to miss the forest for the trees - Google has root access on the Android device where this app runs, so they can certainly see everything if they want ....


Google pushing malicious updates would leave forensic traces, not to mention it'd be difficult to establish a legal framework allowing a government to force Google to do so.

In contrast, subpoena'ing data from the cloud is routine for police in countries all over the world.


This may sound like a naive question, but what stops countries with flexible ethical standards from abusing this power?

For example, in the past I've worked with an AI company with presence in China, where the data of their Chinese clientele must be stored on a separate data centre operated by a local enterprise.

Despite the provider being ISO compliant and holding internationally recognised certs, is there realistically a chance that data could be accessed without the permission or consent of the users?


Depends on your threat model. Practically, the cloud admin would be able to dump that data in clear during processing even if that data is encrypted at rest and in transit.


So your threat model is a sovereign state able to subpoena cloud data.

Under this model, if Google gets a court order to root-break a specific phone (push malicious update), they will be forced to, and that's all the legal framework necessary, so end-to-end encryption doesn't protect you in this case either.


In the US at least, there’s not a lot of precedent for forcing a company to do something like that. (Yet.) Saying hand over your user’s files is demanding information. Compelling them to write and distribute malware is a lot closer to compelling speech. A 1st amendment problem.


That's a lot more detectable, and has a high risk of coming out, see e.g. Pegasus. Cloud access is nearly invisible.


There is a lot of difference between being as secure as practically possible with password data, and allowing anyone with root access to see all your password data.


But it's the same thing - the threat model described is not safeguarded by end to end encryption when the encryption device itself is compromised.

"secure as practically possible" depends a lot on what practically means, and that is just a function of cost vs threat level. If you believe that Google will access your private data, you should not be using a Google-developed application and OS in the first place.


You are so correct. People don't even question this anymore.

Everyone also might have "Google Play Protect" enabled on their phones, which allows Google to pack up and ship each and every app at regular intervals (no mention anywhere of whether that includes user data) to their servers for threat analysis.


So because things are hard, you just reduce 2fa to 2 of the same factors. And make it worthless.

It’s typically not a disaster if you lose the 2fa keys and if it is, you should carefully save the recovery codes. But the keys get lost all the time so just about every service has a recovery procedure. So there is no need to store the secrets in such a way they can be recovered without the password.


> Google knows

TBF, knowing which online services you bothered to activate 2FA on is far more interesting information than, say, which mailing list you forgot to unsubscribe from.

Now I don't think they'd use it for ads; I'd assume it would be more long term, like knowing which service to buy next, or where the trends are going.


Is decrypting via a password literally not how Authy, BitWarden et al handle it?


> Google know

www.reddit.com/r/degoogle


> if someone obtains access to your Google Account, all of your 2FA secrets would be compromised.

This overlooks the fact that Google itself also has access to your 2FA secrets, which could be even worse considering Google could be requested to peer not just into the user's Google account, but into accounts they have with other companies/organisations too.


I think this is a little far fetched as a scenario.

Under which assumptions should Google be forced to "peer into accounts a user has with other services"?

This is not only not enforceable, it would be illegal.

Companies cannot be enlisted to do the kinds of things governmental agencies do. How should a company decide what to look for? Google is not the police and cannot be made an investigator just for fun. The FBI's search engine?!

Also, you need the first factor. Do you expect Google would also send the "password reset"-request, reset the password, use your 2nd factor... Just to be nice to the authorities?

Wild theory if you ask me...


Legality doesn't matter when the authorities pull out the magic National Security Letter, slap you with a gag order, and fine you an amount doubling from $50,000 per day until you comply.


Wow.

Fine for what?

Gag order? Does not help them.

Legality? That would count, if it was something I could order them to do, but again, what do you think would happen there?

"Dear Google, we know you have user TechBro8615@gmail.com, could you please:

- Go through all your data, and gather which Accounts for which services TechBro8615 has

- Go through all these accounts TechBro8615 has with every possible service and reset all his passwords with these accounts (without him noticing)

- And use the second factor TechBro8615 has to login

- Make a user data takeout for all the data TechBro8615 has with all these services

- Create an index of this data, because, well we ask you to, although we can't make you do that

- Tell us if TechBro8615 likes Cranberry juice???

Legality does matter if you request something from third parties. Why on earth should Google ever cooperate beyond step 1? Would you do that?!


My comment is referencing what was documented to happen during 2013: the NSA compelled tech companies to turn over user data under threat of jailing the executives and fining the company huge amounts.

They don't need to ask Google for data from other companies. They can compel them to provide the passwords or authentication codes which are stored on Google's servers. Or they could just ask for a list of which accounts have a saved password, so they know who to target next with an NSL.


Dude. Everything you said is correct, but:

I responded to

> Google could be requested to peer not just into the user's google account, but into accounts they have with other companies/organisations too

Your response to my response should somehow relate to that.


Well in that case, after rereading your comment I guess I agree with you. I don't think the government would deputize Google to effectively hack into other companies on their behalf. I wouldn't put it past them to ask for a backdoor into a user's device, but I doubt Google would comply with that (although there is precedent for Google remotely installing Covid tracking apps on devices in Massachusetts).


Under what conditions do you suggest "Google itself also has access to all your 2FA secrets"?

(Without cloud backup, & without the installation of a malicious version of 'Google Authenticator', how would they – especially, say, on iOS?)


I think they are referring to the scenario where cloud backup is enabled.


Aha, thanks. But if the 2FA secrets were end-to-end encrypted as the top link suggests, Google wouldn't in fact have them - so the ggp-comment's accusation that the link "overlooks" this factor is nonsensical.

(And if Google were denied access to the cleartext 2FA secrets this way, then briefly compromising someone's Google account – say by hacking or abuse of legal process – wouldn't automatically compromise all other 2FA-key-protected accounts.)


Hey, I am about to start writing a position paper covering Social login providers the company should enable/support. Do you have any references you can share for the above comment please ?


Why would Google have access to that material? Is their general secret mechanism not E2EE? I'm fairly cynical about Google's approach to privacy, but I would be shocked if their normal syncing isn't actually secure and private.


That this is indeed the case is the topic of TFA.


Chrome passwords are encrypted with your Google password by default, it's just not e2ee. This isn't even encrypted in that way it seems.

The only real "threat" is your Google account itself being compromised by a third party able to phish their way into your account or bypass your 2fa mechanisms (e.g. by SMS sim swapping). As always, https://landing.google.com/advancedprotection/

The people here saying "privacy" are speaking of some doomsday scenario where Google itself leaks all of this data, which would be unprecedented and is unlikely with how many safeguards there are for employees to access any user data at all within Google.


> doomsday scenario where Google itself leaks all of this data, which would be unprecedented and is unlikely ... (emphasis added)

From the linked article [1]:

> December 2018: Google+ Bug Exposes 52.5 Million Users’ Data — Google+ faced its second big breach of 2018 when a November update created an API bug that exposed data from 52.5 million Google+ accounts. Google fixed the bug within six days, and moved up Google+’s burial date from August to April 2019.

> Google originally decided to terminate Google+ after another breach became public earlier in 2018

and an earlier Google+ bug that was reported in WSJ [2]

> Google Exposed User Data, Feared Repercussions of Disclosing to Public

> Google opted not to disclose to users its discovery of a bug that gave outside developers access to private data. It found no evidence of misuse.

[1]: https://firewalltimes.com/google-data-breach-timeline

[2]: https://www.wsj.com/articles/google-exposed-user-data-feared... (Oct, 2018)



Chrome sync has let you set a separate password for end to end encryption for as long as I can remember, though badly until January 2020. https://bugs.chromium.org/p/chromium/issues/detail?id=820976


That's not end-to-end, that's just regular old symmetric encryption.


Asymmetric keys aren't a requirement for E2EE. In fact, in most cases, asymmetric encryption is only used to exchange a symmetric key for data decryption anyway. Asymmetric cryptography is far too inefficient to encrypt and decrypt more than short secrets.

The whole reason you'd enable this feature is for when you lose your phone and need to provision a replacement. There's not really any way to do the whole key exchange dance if you don't have access to the original source. A password derived key is essential in this case.
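A minimal sketch of that password-derived-key approach (function name hypothetical; a real implementation would feed this key into an AEAD cipher): any device that knows the passphrase and the public salt re-derives the same symmetric key, so the sync server only ever stores salt plus ciphertext.

```python
import hashlib
import os

def derive_backup_key(passphrase, salt, iterations=600_000):
    # PBKDF2-HMAC-SHA256; the salt can be stored next to the backup,
    # but the passphrase never leaves the user's devices.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)  # stored alongside the encrypted backup
old_phone = derive_backup_key("correct horse battery staple", salt)
new_phone = derive_backup_key("correct horse battery staple", salt)
assert old_phone == new_phone  # both ends get the same key; the server never can
```

This is exactly why the replacement phone doesn't need a key-exchange dance with the lost one: the passphrase in the user's head is the shared state.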


"End to end encryption" can mean either a scheme in which each device has its own key and arranges to trust others, or a scheme in which a key is shared across devices (symmetric).

Just because there is only one key doesn't disqualify the solution from e2ee status. If the middleman does not have the key, it's still end to end secure.


>Chrome passwords are encrypted with your Google password by default, it's just not e2ee.

Source?


Chrome settings -> sync -> encryption


I believe that's off by default (even when you turn on sync), and I believe that if you do turn it on, it's not using your Google password, but a separate password of your choosing.


How does https://passwords.google.com work then?

I think the default behaviour is not E2EE.


It's a dual-facing problem. Not only do users have no defence against Google snooping, but Google has no defence against requests to snoop. Apple seems to drive harder toward "we'd help if we could, but we can't: to us it's just blobs".


Apple regularly gives up customers' private data when requested, and they keep logs of it themselves[1].

[1] https://www.apple.com/legal/transparency/us.html


But Apple now has Advanced Data Protection, which adds E2EE for the majority of iCloud data, and they only keep the keys on your devices. Not that I have absolute trust in Apple, but Google doesn't even have that.


Google had that four years before Advanced Data Protection existed. https://security.googleblog.com/2018/10/google-and-android-h...

Warning: this blog post has meaningless marketing-speak. It starts by saying Android is about choice but instead of announcing the ability to set your own backup provider, it just says how Google's Android backup service works, which is wholly unrelated to that first sentence.


If you are talking about device backup, Apple had it encrypted before Advanced Data Protection.

If you are talking about other data, Google doesn't have it encrypted even today.


> If you are talking about device backup, apple had it encrypted before Advanced data protection.

Not end to end: https://www.wired.com/story/apple-end-to-end-encryption-iclo...


Parent talks about device backup. Your link talks about iCloud backup. Different things



So far apple has not been compelled by the courts to make a tool so they could decrypt the e2e stuff which now pretty much includes all iCloud content. As far as we publicly know.


Prior to ADP, agencies like FBI could actually access a lot of information from Apple, primarily because Apple stored keys alongside your data. Which meant even though iMessage content was encrypted (and therefore useless), they could still get a lot of information from a request to Apple -- and Apple hands over quite a lot. FBI had a document all about that in 2021.

Also important to note, ADP is opt-in currently.


Users have defence against snooping, it's not using Google in the first place if this is a consideration in their threat model.

Does or should grandma care? No.

Should a political dissenter living in an oppressive regime think twice? Yes.


After my phone was stolen last month, I switched to https://2fas.com and couldn't be any happier.

It's free, open source and has tons of great features.


How does this unknown Delaware company support 12 employees working on a free mobile app? There's zero verifiable information available about its history, and the founder seems to be heavily involved in cryptocurrency.


My guess is they're all contractors and work as needed


Yup - that! Plus we have donations. We know it's not a perfect way to earn money, but our priority is to deliver the best, most secure and private 2FA app out there!


Thank you! That's awesome!


Looks good at a first glimpse. Please don't write "it's free". That's a non-message many companies give, Google of course one of them. We know that it means you pay by providing your data. Other models could be "run by volunteers" or "fully funded by donations".


But it is free, both by the casual definition (zero cost) and by FSF definition (Free Software).


He wants to distinguish between "free: you pay with your privacy and we share your data with whoever wants it!" or "free: but only basic features, want more? pay" and "free: because people like you help it being 100% free and we have no pressure to use your data and everything is open so you can look at the code"


Many open source enthusiasts are really 'dont like paying money for things' enthusiasts.


That’s certainly a big part of it for me. But it’s not about saving $2; I think that free is sustainable and therefore likely to still be running in 50 or 100 years, whereas any non-zero price decreases that probability.

So if I don’t pay money for things there’s not a service element, or a phone home to activate element, or other things that require an ongoing cost.


Is there anyone who operates an authentication service which:

- Has a contractual obligation to keep your data secure.

- Accepts financial responsibility for data compromise.

- Carries insurance and bonding to back that responsibility.

- Does not require binding arbitration or forbid class actions.

- Has their employees bonded in the way bank employees are bonded.

Well?


How much would you pay for it?


Huh, can't I get this service for free or at most $1 a month...What are you saying? :)


Outside of the price issue, this service would also be a prime target to get compromised: I'd assume it would get the juiciest users, and national agencies would have the strongest incentives to backdoor it for later use.

We'd need a bunch of services to get to that level first to see any meaningful choice IMHO. I have no idea how that would happen.


It would make sense as a service offered by banks. They already have to verify ID. They're usually required to take financial responsibility for their errors, too.


So, just like okta.com, used by lots of huge companies.


But all that juicy data they could steal would just be going to waste when they did this.

I was really annoyed when iDrive, the backup service, pulled this stunt. Originally, they didn't have access to your encryption key. Then they put a dark pattern on their site to encourage users to hand over the encryption key, to support "the Cloud interface". Then you needed to give them the encryption key for some support functions.


I can recommend Aegis Authenticator - https://getaegis.app/

It has an option for encrypted, automated backups to Google or Nextcloud.


> likely even while they’re stored on their servers.

I'm all for castigating Google for not encrypting the TOTP seed, which is (apparently) transmitted in the clear, but there's no actual proof (one way or the other) about whether the secrets are stored encrypted. Thus the "even while stored" claim is a bit much.


Yeah, there's no such thing as an unencrypted disk at Google. Most things are encrypted multiple times in different layers before hitting physical media. Not E2EE, which is a serious concern, but definitely encrypted in some form in transit (with exceptions for intra-datacenter transfers) and at rest.


If it is not E2E encrypted, 3-letter agencies can put their tap somewhere in the Google infrastructure.


Imagine your Google account getting deleted because you got banned from Google, and then suddenly you lose all your 2FA secrets because they were part of that account.


You would have to lose your phone simultaneously.


Someone will, of course, claim Google would never do this, but this presumably would make it trivial for Google itself to log into all of your accounts. In many cases they are already syncing a copy of your passwords.


Chrome passwords are encrypted with your password (just not e2ee) so it'd have to be a targeted attack where they log your password the next time you log in and then use that to decrypt your chrome passwords. Chrome also allows you to set your own sync passphrase different from your Google account password.


Google says it's encrypted "with your Google account", not with your password. I don't know what the former means, but they do not say they're using your password to encrypt it.


I hope someone has set up a honeypot by backing up a load of secrets with this service, and then seeing if anyone ever uses any of those secrets.


I’d love it if google did this to me. I wouldn’t settle for anything less than a 9 figure payout.


Bear in mind, you would probably not see this as a covert/criminal act, but something sold as a feature. "Our artificial intelligence now can assist you by analyzing your transaction habits with your bank and can help manage your Facebook account."

The sort of thing that would horrify many of us in the tech crowd, but the public would largely go "oh neato".


It's 2FA so it would only get them half way there right?


> In many cases they are already syncing a copy of your passwords.

No, that gets them the full way there. They have your 2FA codes, and if you use Chrome and opt into it syncing passwords for you (passwords.google.com), this gives them both pieces of the puzzle.


If you use chrome, they could just download your cookies if they really wanted to.


FreeOTP recently gained support for backups to local file storage or cloud providers. The backups are encrypted with a passphrase, so the cloud provider can't obtain your OTP keys.


What? Why would this not get the same end-to-end encryption as Android backups? They'd have to do extra work to make this less secure.

Edit: oh I guess because it supports syncing between Android and iOS? Still lame, they should at least have an option to use the normal Android backup system. Which should have been the default since the start.


Even if they synchronize cross-platform, there's no reason they can't take the Android backup algorithm and put it in their iOS app. Google owns both sides of the connection here, and the Android code is even open source.


Why don’t people use their own TOTP provider, like KeepassXC/Strongbox, storing the DB in an encrypted manner on a cloud of their choice.

Then use across multiple devices.

It took time for this to sink in, so maybe that's why so many others do not see that there is really no need to have a third party involved in this pattern?


It's fairly easy to make your own TOTP provider.

I wrote a simple CLI TOTP utility that works using an AES encrypted lookup table of secrets.

I piggybacked access to this off an unrelated web site and it is now readily available from any device if you have the decrypt key and know the URL.


I think Mysk just described "sync the secrets to your Google account." E2E would rather imply that there's a second E, but that's not the use case here.

If you already have a second E, just use the QR export/import feature.


tl;dr if a hacker gets access to your Google account it'll be like you didn't have 2FA at all

to be fair, storing your 2FA seeds in 1Password is about the same, except 1Password supposedly can't see your secrets. but if a hacker gets access to your unlocked 1Password data it's the same

tl;dr2 use offline TOTP or similar for real 2FA


Now I wonder whether this is just a bad netsec beginner's mistake (dev, tester, PM all being careless), or whether it's unencrypted on purpose? Neither option is thrilling.


TOTP (the six digit codes) is bad and outdated 2FA anyway. It's vulnerable to phishing.

Use WebAuthn with security keys.


Until people stop using SMS codes, it's still far safer against cell-phone cloning attacks.


People should have stopped using SMS codes when NIST told them to stop six years ago. The fact that there are websites that still support it is an abomination and should come with hefty legal penalties.


People? It's banks, who all insist on SMS, not people.


And Apple


Where does Apple use SMS-based 2FA? Do you have a Windows computer, maybe?


If you don't log into an Apple OS with your account

Apple specifically says for Apple School Manager they want: a work email address that is not associated with an App Store or iCloud account, and has not been used as an Apple ID for any other Apple service or website


The Australian Government's national website for its citizens to interact with gov services uses SMS based 2FA, and doesn't appear to support any other type. :/


Progress, not perfection.

SMS should never be used or offered, and needs congressional action to be stopped as a practice.

TOTP at least avoids turning wireless carriers into security providers and is "good enough" for nearly everything.

And yes, WebAuthn/U2F is the top of the totem pole and what we should be striving for, for nearly everything.


> Sms should never be used or offered

It's better than nothing.



No, it's WORSE than nothing. You're turning a 3rd party [wireless provider] into a security service that can authenticate you without your knowledge.


you could say the same about any shitty security, but most people don't want shitty security


From this thread and elsewhere, I'd argue that's not the case.


The problem with security keys is that they're expensive and you have to carry them around.

TOTP is cheap and much better 2FA than OTP over SMS.


This is sort of willfully missing the point, I concede, but I have my U2F token physically embedded in my arm and have minimal fear of losing it/being without it. Right now it runs OpenPGP and a Yubikey U2F emulator, but it can run just about any flavor of MFA with the appropriate companion app (full subdermal Java Card platform).

https://dangerousthings.com/product/flexsecure/

Hard agree, though, TOTP >>>>>> OTP via SMS


That's some wild Bourne Identity stuff. The thing permanently bricks itself after a specific number of failed attempts. About 3cm long and does TOTP and PGP. Wild.


Security keys can be built into the phone and still provide a reasonable expectation of security, e.g. Apple's Passkeys.

Obviously, a YubiKey would be better, but Passkeys don't require you to carry an additional thing and are still more secure than TOTP apps.


> can be built into the phone

I don’t like this idea. As a person whose had a phone break, like many others, tying auth to something so fragile should not be preferable. I’ll never forget my phone breaking and the process of trying to order a new one: the online shopping here, Shopee, demanded SMS 2FA (only option) which I needed to purchase a new phone so I found a different vendor but then my bank required SMS 2FA (only option) to do a transfer.

At least with these hardware security tokens, they’re pretty ‘dumb’ and often covered in epoxy or other weather-resistant material that makes them quite rugged & durable. Mine have gone through the wash and dangle from my motorbike’s keychain during the monsoons without issue.

> YubiKey

Please just use a generic term like “hardware security key/token” rather than endorsing a singular brand–especially one that is closed-source and looking to “go public” (https://news.ycombinator.com/item?id=35625065). If you think the closed nature of Google Authenticator is bad, consider an open hardware token option rather than a closed one.


who’s* had a phone break


WebAuthn doesn't work with apps. At the moment, FIDO2 relies on either your Big Tech account of choice or an expensive USB key. TOTP is free. There are no backups, so you also need to manually get your second USB key out of that safe place (the fire resistant safe in another physical location people talk about) every time you need to register a 2FA device.

WebAuthn and friends are definitely an improvement, but it's not quite a perfect replacement yet.

Besides, even some banks still use SMS for 2FA at this point. TOTP would make a lot of sites more secure already.


What do you all like to do for security key backups?

I use yubikeys wherever I can, but I've got to admit that fetching all 5+ keys from their various locations and then replacing them every time I need to enroll in something gets old fast, especially when you get into locations like "buried in the mountains".

The phishability of TOTP is indeed a big problem but being able to just save the seed (theoretically, somewhere not next to the other credential) is really nice.


Tangent: why can’t I backup the whole key? I have two OnlyKeys. I like have the second as a backup, and it covers the plaintext and TOTP slots well, but it can’t copy the U2F/FIDO2 slot. Naïvely I thought it would work like this and when I retrieved the backup, I was locked out of the account needing to fallback to TOTP anyhow.


It might not be optimal, but it is infinitely better than the no-2FA most people use, or the SMS that is all too common among 2FA. It is fine for most people and most things.


Weird no mention of authy


Authy gets this right. Not sure why anyone would trust Google with their 2FA secrets.


Authy has custom derivation time and makes it very difficult to export keys for use elsewhere.

I will only use 2fa applications that do standard 30 second TOTP. Bonus points if it can set custom times and hash algos. A good application like this is Aegis Authenticator for mobile.


How exactly?


I guess the person meant this: encrypt-then-upload of backups, with a backup passkey managed by yourself; details e.g. in this blog post: https://authy.com/blog/how-the-authy-two-factor-backups-work...


I suspect the vast majority of Authy backups use passwords trivially susceptible to brute-force attacks despite only 1000 (!!!) iterations of PBKDF2. If Authy wanted to do things right, it would generate local encryption keys instead of asking normies for file encryption passphrases.


argon2/scrypt with significantly larger costs sounds like the right fix. Asymmetric crypto can keep backing up cheap; who cares if restoring takes 30 seconds.
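As a rough sketch of that fix using Python's stdlib bindings (parameters illustrative, not Authy's actual format): swapping 1,000 rounds of PBKDF2 for a memory-hard KDF like scrypt raises the per-guess cost enormously without changing what gets stored.

```python
import hashlib

password, salt = b"S3cr3tP@s$w0rd", b"per-user-random-salt"

# Authy-era cost: 1,000 PBKDF2-SHA1 iterations -- cheap to brute-force on GPUs.
weak_key = hashlib.pbkdf2_hmac("sha1", password, salt, 1000)

# Memory-hard alternative: scrypt with n=2**14, r=8 needs ~16 MiB per guess,
# which is what blunts massively parallel cracking hardware.
strong_key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1,
                            maxmem=2**26, dklen=32)
```

Either way, a guessable password stays guessable; the KDF only buys time per guess, which is why generating the key locally (as the parent suggests) is the stronger design.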


That would be a great improvement for technical users. But also consider that the target audience for Authy is the average mobile user. I suspect the typical backup password looks like S3cr3tP@s$w0rd, which no amount of key stretching will fix.


Yes. What I meant was that Authy does encrypted backups. You can criticize how it's implemented, but it's there, it works, and your 2FA secrets aren't just sitting in the cloud.

I think I'll refrain from posting on topics about Google — clearly there is a huge pro-Google sentiment among HN readers and anything detracting from that gets instantly downvoted.


You can just log out in the app to stop the synchronization. Problem solved.


Can the title get changed?

- "E2E" in this case is nebulous: this isn't a chat or email client, and it isn't client <-> client with Google acting as an intermediary. It is between Google's software on your device and your Google account on Google's servers.

- It isn't clear whether it's encrypted in transit (e.g. HTTPS?); the post doesn't seem to be about that (or if it is, the evidence and technical detail are lacking). The term "E2EE" typically refers to encryption from client <-> client through a blind intermediary, but again that doesn't describe the relationship here.

The actual complaint SEEMS to be:

> Google Authenticator backup isn't encrypted at rest on Google's Servers

My big complaint is that this is a misuse of the term "E2E" (E2EE). It simply doesn't apply in this situation. That doesn't mean it isn't discussion worthy (e.g. not using HTTPS is a major red flag, and not encrypting at rest on Google's Servers is discussable).

In general the linked post doesn't do a good job describing what they found and how they found it.


E2EE is a valid term. Just because both ends are controlled by the same person doesn’t make it not “client ↔ client”. Just because Google wrote the software and stores the backups does not mean that those backups should be readable to anyone (e.g. Google’s servers) with access only to the backup. E2EE means that no one other than the end users can see the data; in this case, that is just the one user. Neither transit encryption nor encryption at rest provides that.

Encryption at rest is not really part of the discussion. There’s no way to verify client-side that it is happening, and it does not prevent Google’s servers from seeing the plaintext backup.

> In general the linked post doesn't do a good job describing what they found and how they found it

Seemed pretty clear: they MITM'd the connection to bypass any transit encryption and saw the plaintext secrets being sent, and thus Google's servers can see all the secrets.


End-to-end is being used to mean encrypted on the client and decrypted only on other clients. "Encrypted at rest" is overloaded: it often refers to secrets stored encrypted by an HSM or similar, where the key is still accessible to whoever stores the data. That's pretty common for compliance, where the threat model is someone stealing a hard drive from a data center, but it's not very useful when you want a secret to remain secret even from the server operator.

This is pretty common usage of the term E2EE as far as I can tell. That all of the "clients" between which the encryption would be end-to-end are run by the same user is not really a big deal; I've seen it used similarly for things like "end-to-end encrypted notes" and that sort of thing. Example: https://standardnotes.com/


> The actual complaint SEEMS to be: Google Authenticator backup isn't encrypted at rest on Google's Servers

Encryption at rest would NOT solve the problem being described here. Even if the data were encrypted both in transit and at rest, that does not mean Google is incapable of accessing it. The data needs to be encrypted from the moment it leaves the device until the moment it arrives back on the user's device (i.e. client-to-client E2EE), which is a stronger criterion than encryption in transit plus encryption at rest.


The "not encrypted at rest" part seems to be pure speculation: "As shown in the screenshots, this means that Google can see the secrets, likely even while they’re stored on their servers." Is there some actual evidence for this claim that I'm not noticing?


That sentence doesn't imply anything about whether they're encrypted at rest or not. Even if they were encrypted at rest (but not E2E encrypted), Google could just decrypt the secrets to see them. The problem here doesn't involve encryption at rest in any way; nothing is being claimed about it either way.


There's really no reason Google couldn't have implemented (as the article suggests) a prompt for a one-time password when the user initiates the credential transfer. And it's clear the transfer isn't completely protected. This is a sloppy product change.


Also, all your private cloud storage photos and data are stored at publicly accessible URLs.


The people who are downvoting this comment: can you prove otherwise?



