bdl

HOWTO export from Enpass 6 JSON to CSV


I had to export a bunch of password records from Enpass to CSV (and then into another password manager app; though I'm still using and prefer Enpass for my own stuff!).

Here's how I did it:

  1. Move all the to-be-exported records to a new vault (yay, Enpass 6's support for multiple vaults); I ought to have done this when upgrading to Enpass 6 from 5
  2. Use the Enpass app's File > Export feature to export the vault as a JSON file
  3. Use the `jq` script below to convert the JSON to a CSV file

The conversion isn't entirely lossless - the new app / CSV import doesn't support attachments, and metadata such as timestamps, credential history and record types are lost (e.g. the destination app only supports three record types, nowhere near the flexibility of Enpass).

Enpass' flexible schema is great, but trying to shoehorn it into a simple CSV is a bit of a squeeze. In this case, the CSV file had to be of the form:

Folder,Title,Login,Password,Website,Notes,Subfolder,Custom Fields

Where "Folder" is a fixed string (e.g. "enpass import"), and title, login, password, website and notes map to the Enpass record's title and so on (there's an important nuance here that I'll get to in a moment). I'm ignoring the Subfolder (set it to a fixed "" in the CSV), and finally Custom Fields is a list of arbitrary key,value pairs that will get stored as such in the new app.
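To make that concrete, here's a single (entirely made-up) row in the target format, with one Custom Fields key,value pair tacked on the end after the empty Subfolder column:

```csv
import,GitHub,bdl@example.com,hunter2,https://github.com,some notes,,Security Question,favourite colour
```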

So, seems straightforward: extract the relevant fields from the Enpass JSON export, print them in the right order, and Bob's your Uncle.

Not so.

Problem #1 is that Enpass' schema is mightily flexible: you could have zero, one or a dozen "username"s associated with a record, ditto passwords and other secrets, not to mention other data such as phone numbers, TOTP codes, arbitrary text fields, etc. (Aside: as far as I can tell none of the schema is documented by Enpass...). Plus you could have attachments, and finally records that don't have any fields at all (e.g. a secure note). This makes processing the JSON a little complicated.
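For reference, here's roughly the shape the script below relies on (reconstructed by eyeballing my own export; since Enpass doesn't document the schema, treat these field names as observed, not guaranteed):

```json
{
  "items": [
    {
      "title": "Example login",
      "note": "free-form notes",
      "fields": [
        { "label": "Username", "value": "bdl" },
        { "label": "Password", "value": "hunter2" },
        { "label": "URL", "value": "https://example.com" }
      ]
    }
  ]
}
```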

Problem #2, related to #1, is that you might have data in Enpass that just doesn't fit in the above CSV, for example security question and answer pairs. You don't want to lose these, so they should be included as Custom Fields in the CSV. Alas, not everything is going to make it across - e.g. attachments, which the target CSV just doesn't support in any meaningful way.

Fortunately there is an awesome tool available for slicing and dicing JSON data - jq. And here's a jq script that will finagle an Enpass vault exported in JSON format into the above-mentioned CSV format:

  • for each Enpass record, it copies over the record title and notes fields; for the login/password/website triumvirate it looks for Enpass fields labelled "Username", "Password", and "URL" to map over to the CSV; if it can't find a field it leaves it blank
  • any other fields that contain content get stuffed in to the Custom Fields list
    • that is, it should retain all field data, though a password field labelled "Secret" will turn up as a Custom Field, not in Password
% jq -r '[
.items[] | [ . as $item 
  | {"Username": "", "Password": "", "URL": "", "E-mail": "" } +
    ( [(.fields // [])[] | {(.label|tostring): .value}] | add )
  | . as $all_fields
  | delpaths([["Username"], ["Password"], ["URL"]])
  | . as $rem
  | "import", $item.title,
    $all_fields."Username", $all_fields."Password", $all_fields."URL",
    $item.note, "",
    ($rem | to_entries[] | select(.value | length > 0) | [.key, .value][]?)
] ][] | @csv' < enpass.json > other_app.csv

I'm not going to try to explain that in blow-by-blow detail here, but suffice to say it deals with the fact that not every Enpass record has all the required target CSV fields (lines 3-5), or even has a fields list at all (the `// []` on line 4). It also has to dump out exactly the first few fields ("Username", etc.) in exactly that order - even when those fields are not present in the Enpass input - and then "all the rest" of the fields. It does that by merging (adding) a "default" object with each record's `fields` object (lines 3-4), and stashing that merged object away in `$all_fields`. Next it deletes the keys associated with the fixed part of the CSV output (line 6) to leave only the remaining fields (i.e. "all the rest", destined to become the Custom Fields) in `$rem`. That takes us to line 7 if I'm counting right.
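If jq's object arithmetic is unfamiliar, here's the same defaults-then-split trick sketched in Python (made-up values, just for illustration - the jq script above is the real deal):

```python
# The jq defaults trick in Python: merge a record's fields over a
# defaults object so the fixed CSV columns always exist, then split
# off everything else.
record_fields = {"Password": "hunter2", "URL": "https://example.com",
                 "Security Question": "favourite colour"}

# {"Username": "", ...} + fields in jq: later keys win, missing ones stay ""
all_fields = {**{"Username": "", "Password": "", "URL": ""}, **record_fields}

# delpaths([["Username"], ["Password"], ["URL"]]) in jq: drop the fixed
# columns, leaving only "all the rest"
rem = {k: v for k, v in all_fields.items()
       if k not in ("Username", "Password", "URL")}

print(all_fields["Username"])  # empty string: the default survived the merge
print(rem)                     # only the leftover "Security Question" remains
```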

From line 8 we get to start spitting data out, constructing an array of fields: first a fixed string "import", which is the folder in the new app where the imported data will end up, then the item/record title, lifted directly from the corresponding field in the Enpass JSON.

Next comes the username through URL from the $all_fields object we stashed earlier. Then the item note (direct translation), and an empty field (don't care about sub-folders).

The next little bit has to take the remains, the `$rem` object, and gather every non-empty field (every label:value pair where the length of the value string is > 0) and then turn that into a flat list of "Key", "Value" pairs.
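That gather-and-flatten step, again sketched in Python for anyone squinting at the jq:

```python
# The jq "to_entries[] | select(.value | length > 0)" step: keep only
# non-empty leftover fields and flatten them into a flat
# key, value, key, value... list for the Custom Fields columns.
rem = {"E-mail": "", "Security Question": "favourite colour",
       "Security Answer": "blue"}

custom_fields = []
for key, value in rem.items():
    if len(value) > 0:                 # select(.value | length > 0)
        custom_fields += [key, value]  # [.key, .value][] flattens the pair

print(custom_fields)  # the empty "E-mail" field is dropped entirely
```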

Finally the whole array's closed out, expanded and fed to the CSV formatter. Note the use of the `-r` command-line flag to emit raw output; without it, each CSV line would come out wrapped in JSON string quoting.
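For anyone wanting to sanity-check the result, here's the whole pipeline condensed into a few lines of Python, using the same assumed schema as above (a sketch only - quoting details differ slightly, since jq's @csv quotes every string while Python's csv module quotes only when needed):

```python
import csv
import io
import json

# One made-up record in the assumed export shape:
# items[].title / .note / .fields[].label / .value
export = json.loads("""{
  "items": [{
    "title": "Example login",
    "note": "some notes",
    "fields": [
      {"label": "Password", "value": "hunter2"},
      {"label": "Security Question", "value": "favourite colour"}
    ]
  }]
}""")

out = io.StringIO()
writer = csv.writer(out)
for item in export["items"]:
    # defaults + record fields (jq: {"Username": "", ...} + ...)
    fields = {"Username": "", "Password": "", "URL": ""}
    for f in item.get("fields") or []:  # jq: .fields // []
        fields[str(f["label"])] = f["value"]
    # everything not in the fixed columns becomes Custom Fields
    rem = {k: v for k, v in fields.items()
           if k not in ("Username", "Password", "URL")}
    row = ["import", item["title"],
           fields["Username"], fields["Password"], fields["URL"],
           item["note"], ""]
    for k, v in rem.items():
        if v:  # jq: select(.value | length > 0)
            row += [k, v]
    writer.writerow(row)

print(out.getvalue().strip())
```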

Easy, eh?

 
