Internals: decryption process difference between traditional and business accounts?

mickael
Community Member
edited May 2019 in Lounge

Hi,

I'm trying to improve my op-local CLI tool to manage multi-account setups.

So far it's working fine with a simple business account, but I run into trouble when I add a personal account and try to decrypt personal data.

My decryption process is the following:

  1. Grab session key from env variable
  2. Grab encrypted session private key from temp file
  3. Decrypt the encrypted session private key using the session key
  4. Get the encrypted symmetric key from the related account (business or personal)
  5. Decrypt the encrypted symmetric key using the session private key
  6. Get the encrypted master unlock key from the related account (business or personal)
  7. Decrypt the master unlock key using the symmetric key

I am currently blocked at the last step with the personal account, with the error "Invalid Tag".

Am I missing something?

Thanks,


1Password Version: Not Provided
Extension Version: Not Provided
OS Version: Not Provided
Sync Type: Not Provided

Comments

  • mickael
    Community Member
    edited May 2019

    OK, got it: it looks like it's the symmetric key of the 'main' account that should be used to decrypt the master unlock key.

  • Glad to hear you got it sorted. :)

    Ben

  • jpgoldberg
    1Password Alumni

    @mickael,

    You've hit upon something that individual clients vary on and that we feel free to change as needed for the local clients. (And so it isn't "standard" or externally documented.)

    So some 1Password clients will derive a "master key" from your Master Password, and that key is used to decrypt the master unlock keys and SRP-x keys for all of your accounts. The idea is that you should be able to unlock all of your accounts with your "primary" account's Master Password. But we don't want to put the other account secrets into your primary account, because you may wish to have different accounts available on different devices. Also keep in mind that without some additional layer, the defender needs to go through the key derivation twice, once for the SRP-x and once for the MUK. (An attacker guessing at Master Passwords would only need to go through once for each guess.)

    This all goes back to well before "accounts". If you were synchronizing multiple OPVaults or Agile Keychains, the same problems arose. The fine details also developed differently on different platforms. That was deliberate, as we could take advantage of what different platforms offer in terms of key derivation and data protection. This extra layer of indirection also allowed for a gentler propagation of Master Password changes.

    Clients are going to differ from each other. Our web app is going to have very different ways of handling things than, say, 1Password on iOS. The needs are different, as are the tools available. And as those tools develop, we may find ways to tune individual clients. So this makes it very hard for us to document how individual clients do things.

    And this is also why you might find the rug pulled out from under you if you are trying to build tools around the local storage. We need the freedom to change those. These have been quite stable since the release of 1Password 7, and I don't anticipate changes soon, but suppose that some day we wanted to add Argon2 or scrypt for that local additional layer of key derivation. We wouldn't be able to do it for the web clients (well, not until WASM is more mature), and we would definitely want different tunings for mobile versus desktop clients if we were to do this on mobile clients at all. This is a freedom we have by having this extra layer. I'm happy that you are doing what you are doing, but we are going to tinker with that local, client-specific, undocumented layer without worrying how that will affect your project.

  • mickael
    Community Member

    @jpgoldberg, thanks for the details.

    I understand that building tools on top of undocumented local storage is not a reliable way to build something stable.

    To be honest, I hope that my tool will be deprecated soon, or at least before you heavily change how the desktop client works, because your official CLI will have made a big step forward in performance.

    But, currently, waiting 3 to 4 s to retrieve a secret is totally unacceptable for my use case. I used to use hosted secrets in my Ansible playbooks (a provisioning tool), and I currently have around 100 secrets. Each time the file containing the secrets needs to be parsed, that makes roughly 100 × 4 s ≈ 7 minutes of waiting instead of 10 seconds.

    So, I hope to see the op CLI come with its own syncing feature in the near future, and use its own local storage, at least for the decryption process.

    But, BTW, since you explained that things differ between clients, could you tell me whether the decryption process and database structure are currently the same on the Mac and Windows desktop clients?

  • could you tell me if the decryption process and database structure are currently the same on the Mac and Windows desktop clients?

    They're similar, but not the same. What differs is mostly how we store what's effectively the "master account". i.e. the top of the pyramid, the one you need to decrypt first before you can decrypt the others.

    Rick

  • So, I hope to see in a near future the op CLI coming with it's own syncing feature and use it's own local storage at least for the decryption process.

    It's something we talk about internally. It's not coming in the short term, but it's something that would be cool for us to do one day. Syncing of data adds a lot of complexity though.

    Rick

  • mickael
    Community Member

    @rickfillion I agree that bidirectional syncing of data adds a lot of complexity.

    But, a first step could be to use the local database only for read-only / query / list / get operations on items.

    So the sync could be one-direction only, which should be far easier to implement, I guess.

    But, finally, the feature request is "performance improvements", not "use a local database". Using a local database is only one way to improve performance; maybe you could find another solution, like improving your current server infrastructure, or whatever ;)

  • a first step could be to use the local database only for read-only / query / list / get operations

    Yup. That too adds complexity, but you're right, it's less so.

    But, finally, the feature request is "performance improvements" not "use a local database".

    I agree 100%. There are improvements coming in 0.6 that should help. Right now the best thing you can do to help with performance is to reference both items and vaults by uuids. That'll net you a win today, and will net you an even bigger win once 0.6 is out.

    Another optimization you can do on your end is to make sure that the user you're running commands as has access to a limited set of vaults and items. 0.6 will help with that, and we have ideas for how to make things much better there.

    Are you doing op signin for each op get item? If so I'd recommend you find a way to avoid that. The sign-in process is relatively expensive, purposefully.

    Rick

  • mickael
    Community Member

    @rickfillion I only sign in when needed, and to be honest, it is currently a rather long process: around 40 s to sign in to my 2 accounts. And since there seems to be a problem with session management, I need to re-authenticate every 30 minutes, even when I use the CLI extensively.

    When I do queries, I always specify an item uuid, but I don't necessarily specify the vault. Those requests take around 8-10 s under normal conditions, which is really, really slow, and sometimes they even take around 30 s...

  • Those timings sound really bad. Sign-in should take about a second or so + network latency.

    I need to reauth every 30 minutes even if I use extensively the CLI

    Hrmm.... that's odd. It's supposed to automatically push that back.

    When I do some queries this is always by specifying an item uuid, but I don't specify the vault necessary.

    Specifying the vault should help.

    Those times seem really abnormal though. We'd love to work with you to understand where the time is being spent. Unfortunately both Connor and I are taking time off for the next 2 weeks, but when we're back would you be willing to work with us to figure out what's going on?

    Rick

  • mickael
    Community Member
    edited May 2019

    @rickfillion, I understand. I would be pleased to help you debug that. Don't hesitate to get back to me when you have some time to investigate.

    Have a nice break!

    Thanks!

  • Will do!

    Rick

This discussion has been closed.