Account import & export #2356
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
base: main
Changes from all commits: `b4a7b63`, `b43b267`, `bef2016`, `6e1cd2f`, `cb1f4de`, `b1c3391`, `59c3e68`, `2d696db`, `fec5819`
**`ImportsController`** (new file, `@@ -0,0 +1,38 @@`)

```ruby
class ImportsController < ApplicationController
  disallow_account_scope

  layout "public"

  def new
  end

  def create
    account = create_account_for_import

    Current.set(account: account) do
      @import = account.imports.create!(identity: Current.identity, file: params[:file])
    end

    @import.perform_later
    redirect_to import_path(@import)
  end

  def show
    @import = Current.identity.imports.find(params[:id])
  end

  private
    def create_account_for_import
      Account.create_with_owner(
        account: { name: account_name_from_zip },
        owner: { name: Current.identity.email_address.split("@").first, identity: Current.identity }
      )
    end

    def account_name_from_zip
      Zip::File.open(params[:file].tempfile.path) do |zip|
        entry = zip.find_entry("data/account.json")
        JSON.parse(entry.get_input_stream.read)["name"]
      end
    end
end
```
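The name for the newly created account comes straight out of the uploaded archive. A minimal sketch of the JSON half of `account_name_from_zip`, with the zip entry's contents stubbed as a string (`entry_contents` is a hypothetical stand-in for what `entry.get_input_stream.read` returns):

```ruby
require "json"

# Hypothetical stand-in for the contents of the data/account.json entry
# that account_name_from_zip reads out of the uploaded archive.
entry_contents = '{"name":"Acme Co","join_code":{"code":"abc"}}'

# Same extraction as the controller performs on the real entry.
account_name = JSON.parse(entry_contents)["name"]
```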
**`Users::DataExportsController`** (new file, `@@ -0,0 +1,33 @@`)

```ruby
class Users::DataExportsController < ApplicationController
  before_action :set_user
  before_action :ensure_current_user
  before_action :ensure_export_limit_not_exceeded, only: :create
  before_action :set_export, only: :show

  CURRENT_EXPORT_LIMIT = 10

  def show
  end

  def create
    @user.data_exports.create!(account: Current.account).build_later
    redirect_to @user, notice: "Export started. You'll receive an email when it's ready."
  end

  private
    def set_user
      @user = Current.account.users.find(params[:user_id])
    end

    def ensure_current_user
      head :forbidden unless @user == Current.user
    end

    def ensure_export_limit_not_exceeded
      head :too_many_requests if @user.data_exports.current.count >= CURRENT_EXPORT_LIMIT
    end

    def set_export
      @export = @user.data_exports.completed.find_by(id: params[:id])
    end
end
```
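The `ensure_export_limit_not_exceeded` guard leans on a `current` scope that this diff doesn't show. A plain-Ruby sketch of what the check amounts to, under the assumption (not confirmed by the PR) that `current` means "created within the last 24 hours":

```ruby
require "time"

CURRENT_EXPORT_LIMIT = 10

# Hypothetical stand-ins for @user.data_exports records.
exports = [
  { created_at: Time.now - 3_600 },   # one hour old: assumed current
  { created_at: Time.now - 90_000 }   # ~25 hours old: assumed not current
]

# Assumed meaning of the `current` scope: created within the last 24 hours.
cutoff = Time.now - 24 * 3_600
current = exports.select { |e| e[:created_at] >= cutoff }

limit_exceeded = current.count >= CURRENT_EXPORT_LIMIT
```

With only one recent export, the guard would let the request through; once `current.count` reaches the limit, the controller responds with `429 Too Many Requests`.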
**`ImportAccountDataJob`** (new file, `@@ -0,0 +1,19 @@`)

```ruby
class ImportAccountDataJob < ApplicationJob
  include ActiveJob::Continuable

  queue_as :backend

  def perform(import)
    step :validate do
      import.validate \
        start: step.cursor,
        callback: proc { |record_set:, record_id:| step.set! [ record_set, record_id ] }
    end

    step :process do
      import.process \
        start: step.cursor,
        callback: proc { |record_set:, record_id:| step.set! [ record_set, record_id ] }
    end
  end
end
```
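The `start:`/`callback:` contract is what lets an interrupted job resume mid-record-set rather than starting over: the callback persists a cursor after each record, and the next attempt passes the saved cursor back as `start:`. A plain-Ruby sketch of that contract (the record walk and `process` method here are hypothetical, not the actual `Import#process`):

```ruby
RECORD_IDS = %w[a b c d e]

# Walks the records, skipping everything up to and including `start`,
# and reports each processed id through `callback` (the cursor update).
def process(start:, callback:)
  start_index = start ? RECORD_IDS.index(start) + 1 : 0
  RECORD_IDS[start_index..].each { |id| callback.call(id) }
end

cursor = nil
first_run = []
process(start: nil, callback: ->(id) { cursor = id; first_run << id })

# Simulate a retried job resuming from a cursor that was left at "c":
# only the records after the cursor are processed again.
resumed = []
process(start: "c", callback: ->(id) { resumed << id })
```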
**`ExportMailer`** (modified, `@@ -1,8 +1,19 @@`)

```ruby
class ExportMailer < ApplicationMailer
  helper_method :export_download_url

  def completed(export)
    @export = export
    @user = export.user

    mail to: @user.identity.email_address, subject: "Your Fizzy data export is ready for download"
  end

  private
    def export_download_url(export)
      if export.is_a?(User::DataExport)
        user_data_export_url(export.user, export)
      else
        account_export_url(export)
      end
    end
end
```
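Since the mailer now serves both per-user and whole-account exports, `export_download_url` dispatches on the export's type. A self-contained sketch of that dispatch, with hypothetical structs and URL strings standing in for the Rails models and URL helpers (the real helpers are `user_data_export_url` and `account_export_url`; the paths below are guesses for illustration):

```ruby
# Hypothetical stand-ins for the two export types.
UserDataExport = Struct.new(:user_id, :id)
AccountExport  = Struct.new(:id)

# Mirrors the type check in the mailer's export_download_url.
def export_download_url(export)
  if export.is_a?(UserDataExport)
    "/users/#{export.user_id}/data_exports/#{export.id}"  # user_data_export_url
  else
    "/account/exports/#{export.id}"                       # account_export_url
  end
end

user_url    = export_download_url(UserDataExport.new(5, 9))
account_url = export_download_url(AccountExport.new(3))
```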
**`ImportMailer`** (new file, `@@ -0,0 +1,9 @@`)

```ruby
class ImportMailer < ApplicationMailer
  def completed(identity)
    mail to: identity.email_address, subject: "Your Fizzy account import is complete"
  end

  def failed(identity)
    mail to: identity.email_address, subject: "Your Fizzy account import failed"
  end
end
```
**`Account::DataTransfer::AccessRecordSet`** (new file, `@@ -0,0 +1,48 @@`)

```ruby
class Account::DataTransfer::AccessRecordSet < Account::DataTransfer::RecordSet
  ATTRIBUTES = %w[
    accessed_at
    account_id
    board_id
    created_at
    id
    involvement
    updated_at
    user_id
  ].freeze

  private
    def records
      Access.where(account: account)
    end

    def export_record(access)
      zip.add_file "data/accesses/#{access.id}.json", access.as_json.to_json
    end

    def files
      zip.glob("data/accesses/*.json")
    end

    def import_batch(files)
      batch_data = files.map do |file|
        data = load(file)
        data.slice(*ATTRIBUTES).merge("account_id" => account.id)
      end

      Access.insert_all!(batch_data)
    end

    def validate_record(file_path)
      data = load(file_path)
      expected_id = File.basename(file_path, ".json")

      unless data["id"].to_s == expected_id
        raise IntegrityError, "Access record ID mismatch: expected #{expected_id}, got #{data['id']}"
      end

      missing = ATTRIBUTES - data.keys
      if missing.any?
        raise IntegrityError, "#{file_path} is missing required fields: #{missing.join(', ')}"
      end
    end
end
```

> **Collaborator** (on `ATTRIBUTES`): Is the set of attributes in the transfer different than the table's columns? If it's not, could we determine these from the schema automatically?

> **Collaborator** (on `ATTRIBUTES`): Also, what are your thoughts about how to handle situations where data has been migrated and the columns differ in some way? I imagine there would be some cases where this would make an import impossible, but there may be others where it's OK (added nullable column or usable defaults, etc.).
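On the first comment's suggestion: a plain-Ruby sketch of deriving the attribute list from the schema instead of hand-maintaining it. Here `column_names` is a stand-in for ActiveRecord's `Model.column_names`; whether any columns would need excluding from the transfer is the open question the comment raises.

```ruby
require "json"

# Stand-in for Access.column_names (ActiveRecord exposes this per model).
column_names = %w[id account_id user_id board_id involvement accessed_at created_at updated_at]

# Derived attribute list, sorted for stable output, replacing the
# hand-maintained ATTRIBUTES constant.
attributes = column_names.sort.freeze

# The same missing-field validation as validate_record, driven by the
# derived list rather than a hand-maintained constant.
data = JSON.parse('{"id":"42","account_id":"7","user_id":"9"}')
missing = attributes - data.keys
```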
**`Account::DataTransfer::AccountRecordSet`** (new file, `@@ -0,0 +1,54 @@`)

```ruby
class Account::DataTransfer::AccountRecordSet < Account::DataTransfer::RecordSet
  ACCOUNT_ATTRIBUTES = %w[
    join_code
    name
  ]

  JOIN_CODE_ATTRIBUTES = %w[
    code
    usage_count
    usage_limit
  ]

  private
    def records
      [ account ]
    end

    def export_record(account)
      zip.add_file "data/account.json", account.as_json.merge(join_code: account.join_code.as_json).to_json
    end

    def files
      [ "data/account.json" ]
    end

    def import_batch(files)
      account_data = load(files.first)
      join_code_data = account_data.delete("join_code")

      account.update!(name: account_data.fetch("name"))
      account.join_code.update!(join_code_data.slice("usage_count", "usage_limit"))
      account.join_code.update(code: join_code_data.fetch("code"))
    end

    def validate_record(file_path)
      data = load(file_path)

      unless (ACCOUNT_ATTRIBUTES - data.keys).empty?
        raise IntegrityError, "Account record missing required fields"
      end

      unless data.key?("join_code")
        raise IntegrityError, "Account record missing 'join_code' field"
      end

      unless data["join_code"].is_a?(Hash)
        raise IntegrityError, "'join_code' field must be a JSON object"
      end

      unless (JOIN_CODE_ATTRIBUTES - data["join_code"].keys).empty?
        raise IntegrityError, "'join_code' field missing required keys"
      end
    end
end
```
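A worked example of what `validate_record` checks against a well-formed `data/account.json` payload (the payload values here are made up; only the key structure matters):

```ruby
require "json"

ACCOUNT_ATTRIBUTES   = %w[join_code name]
JOIN_CODE_ATTRIBUTES = %w[code usage_count usage_limit]

# Hypothetical but structurally valid data/account.json contents.
data = JSON.parse('{"name":"Acme","join_code":{"code":"xyz","usage_count":0,"usage_limit":10}}')

# The two set-difference checks from validate_record: the account record
# must carry every account attribute, and join_code must be an object
# carrying every join-code attribute.
account_ok   = (ACCOUNT_ATTRIBUTES - data.keys).empty?
join_code_ok = data["join_code"].is_a?(Hash) &&
               (JOIN_CODE_ATTRIBUTES - data["join_code"].keys).empty?
```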
> **Collaborator:** I quite like the structure of having the `RecordSet` types, but I notice that most of them are very similar. The bulk of them boil down to "get all the records by account ID, store into a path based on the name". Which, I think, is what we were originally hoping the whole import/export process could be.
>
> I realise that as we get deeper into it, some of the models might deviate from that a bit. But maybe we could try to really push back on the number of deviations where possible, and then have hooks for anywhere we do need to handle something in a different way. That way the bulk of the process could use the same pattern without having to restart parts of it on each model.
>
> So if the places where it strays from convention are few and minor, do you think we could push some of this up into the base class, so that all the "normal" tables could avoid needing to implement most of the methods here?
>
> Or, alternatively, could we avoid deriving individual classes at all, except in the cases where a table has some special behaviour? So if a model needs a special join, or some pre/post processing, we can specify just those parts?
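The convention-plus-hooks alternative this comment sketches might look something like the following plain-Ruby sketch. Class and method names here are hypothetical, not from the PR, and in the app this would wrap an ActiveRecord model and use ActiveSupport's `#pluralize` rather than the naive version below:

```ruby
# Convention-driven record set: the storage path is derived from the model
# name, so only genuine deviations need a subclass or an override.
class GenericRecordSet
  def initialize(model_name, records)
    @model_name = model_name
    @records = records
  end

  # Conventional storage path: data/<pluralized model name>/<id>.json
  def path_for(record)
    "data/#{pluralized}/#{record[:id]}.json"
  end

  # Maps every record to its conventional path.
  def export
    @records.to_h { |record| [path_for(record), record] }
  end

  private
    # Naive pluralization, enough for "access" -> "accesses".
    def pluralized
      name = @model_name.downcase
      name.end_with?("s") ? "#{name}es" : "#{name}s"
    end
end

accesses = GenericRecordSet.new("Access", [ { id: 1 }, { id: 2 } ])
paths = accesses.export.keys
```

With this shape, a table with special behaviour would override just `path_for`, a query hook, or a pre/post-processing hook, while the "normal" tables need no class of their own at all.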