
CSV import with multiple files prompts "There is already a table of that name..." for every one #1121

Closed
justinclift opened this issue Sep 6, 2017 · 11 comments
Labels
bug Confirmed bugs or reports that are very likely to be bugs.

Comments

@justinclift
Member

justinclift commented Sep 6, 2017

Just went to test the new and improved CSV import code (6432517) with a bulk import of many files (366) from the Backblaze drive stats data release.

The table structure is prepopulated, to ensure the fields have the correct data type.

CREATE TABLE drive_stats (
    date TEXT NOT NULL,
    serial_number TEXT NOT NULL,
    model TEXT NOT NULL,
    capacity_bytes INTEGER (8) NOT NULL,
    failure INTEGER (1) NOT NULL,
    smart_1_normalized INTEGER,
    smart_1_raw INTEGER,
    smart_2_normalized INTEGER,
    smart_2_raw INTEGER,
    smart_3_normalized INTEGER,
    smart_3_raw INTEGER,
    smart_4_normalized INTEGER,
    smart_4_raw INTEGER,
    smart_5_normalized INTEGER,
    smart_5_raw INTEGER,
    smart_7_normalized INTEGER,
    smart_7_raw INTEGER,
    smart_8_normalized INTEGER,
    smart_8_raw INTEGER,
    smart_9_normalized INTEGER,
    smart_9_raw INTEGER,
    smart_10_normalized INTEGER,
    smart_10_raw INTEGER,
    smart_11_normalized INTEGER,
    smart_11_raw INTEGER,
    smart_12_normalized INTEGER,
    smart_12_raw INTEGER,
    smart_13_normalized INTEGER,
    smart_13_raw INTEGER,
    smart_15_normalized INTEGER,
    smart_15_raw INTEGER,
    smart_22_normalized INTEGER,
    smart_22_raw INTEGER,   
    smart_183_normalized INTEGER,
    smart_183_raw INTEGER,
    smart_184_normalized INTEGER,
    smart_184_raw INTEGER,
    smart_187_normalized INTEGER,
    smart_187_raw INTEGER,
    smart_188_normalized INTEGER,
    smart_188_raw INTEGER,
    smart_189_normalized INTEGER,
    smart_189_raw INTEGER,
    smart_190_normalized INTEGER,
    smart_190_raw INTEGER,
    smart_191_normalized INTEGER,
    smart_191_raw INTEGER,
    smart_192_normalized INTEGER,
    smart_192_raw INTEGER,
    smart_193_normalized INTEGER,
    smart_193_raw INTEGER,
    smart_194_normalized INTEGER,
    smart_194_raw INTEGER,
    smart_195_normalized INTEGER,
    smart_195_raw INTEGER,
    smart_196_normalized INTEGER,
    smart_196_raw INTEGER,
    smart_197_normalized INTEGER,
    smart_197_raw INTEGER,
    smart_198_normalized INTEGER,
    smart_198_raw INTEGER,
    smart_199_normalized INTEGER,
    smart_199_raw INTEGER,
    smart_200_normalized INTEGER,
    smart_200_raw INTEGER,
    smart_201_normalized INTEGER,
    smart_201_raw INTEGER,
    smart_220_normalized INTEGER,
    smart_220_raw INTEGER,
    smart_222_normalized INTEGER,
    smart_222_raw INTEGER,
    smart_223_normalized INTEGER,
    smart_223_raw INTEGER,
    smart_224_normalized INTEGER,
    smart_224_raw INTEGER,   
    smart_225_normalized INTEGER,
    smart_225_raw INTEGER,
    smart_226_normalized INTEGER,
    smart_226_raw INTEGER,  
    smart_240_normalized INTEGER,
    smart_240_raw INTEGER,
    smart_241_normalized INTEGER,
    smart_241_raw INTEGER,
    smart_242_normalized INTEGER,
    smart_242_raw INTEGER,
    smart_250_normalized INTEGER,
    smart_250_raw INTEGER,
    smart_251_normalized INTEGER,
    smart_251_raw INTEGER,
    smart_252_normalized INTEGER,
    smart_252_raw INTEGER,
    smart_254_normalized INTEGER,
    smart_254_raw INTEGER,
    smart_255_normalized INTEGER,
    smart_255_raw INTEGER
    );

Steps followed:

  1. Open the (empty) database, created prior using the sqlite3 command line and the above SQL statement in a script.
  2. Launch the CSV import dialog
  3. Select all 366 files in the import dialog
  4. Type in the table name drive_stats
  5. Deselect the "Separate tables" option
  6. Click OK
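For comparison, the same bulk append can be sketched outside the GUI. This is an illustrative sketch only (DB4S itself is C++/Qt, and `bulk_import` is a hypothetical helper, not anything in the DB4S codebase): loop over the CSV files and append every data row into the one pre-created table, assuming each file has a header row and the same column count as the table.

```python
import csv
import sqlite3

def bulk_import(db_path, csv_paths, table):
    """Append every CSV file into one existing table, skipping header rows.

    `table` is assumed to come from a trusted source (the schema), since it
    is interpolated directly into the SQL.
    """
    con = sqlite3.connect(db_path)
    try:
        for path in csv_paths:
            with open(path, newline="") as f:
                rows = list(csv.reader(f))
            if not rows:          # skip empty files
                continue
            header, data = rows[0], rows[1:]
            placeholders = ",".join("?" * len(header))
            con.executemany(
                f"INSERT INTO {table} VALUES ({placeholders})", data
            )
        con.commit()
    finally:
        con.close()
```

Committing once at the end, rather than per file, keeps a 366-file import inside a single transaction, which is also much faster in SQLite.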

The import then begins, displaying the "Decoding..." dialog. After the decode finishes, a dialog appears asking:

[Screenshot: "There is already a table of that name..." dialog]

I clicked "Yes", after which the insert happened; then a new "Decoding..." dialog appeared, and the "There is already a table of that name..." dialog popped up again.

It seems like this would pop up for all 366 files, so I clicked No and then Cancel to stop the process. DB4S crashed too (leaving a -journal file behind). Oops.

The "There is already a table..." dialog should probably appear before the first decoding step, and the answer should be kept for all of the imported CSVs so it doesn't hassle a user who does want all of them in the existing table. 😄
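The ask-once behaviour could look something like this sketch (all names here are hypothetical, not DB4S code): prompt on the first file that hits an existing table, and if the user picks "Yes for all", remember that answer for the rest of the batch.

```python
def confirm_imports(files, table_exists, ask_user):
    """Decide which files to import into an existing table.

    ask_user(f) returns "yes", "yes_all", or "cancel". Once "yes_all" is
    chosen, the remaining files are accepted without prompting again.
    """
    accepted = []
    remembered = None  # set to True once the user picks "Yes for all"
    for f in files:
        if not table_exists:
            accepted.append(f)
            continue
        if remembered is None:
            reply = ask_user(f)
            if reply == "cancel":
                break
            if reply == "yes_all":
                remembered = True
            if reply in ("yes", "yes_all"):
                accepted.append(f)
        elif remembered:
            accepted.append(f)
    return accepted
```

With 366 files and a "Yes for all" answer, the user gets exactly one prompt instead of 366.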

@justinclift justinclift added the bug Confirmed bugs or reports that are very likely to be bugs. label Sep 6, 2017
@justinclift
Member Author

As an aside, trying to open the database when a matching -journal file is present didn't work. DB4S complained:

[Screenshot: error message saying the database is read only, due to the journal file]

The error message seems a bit misleading: a user likely wouldn't be able to work out "why is this database read only? How do I fix it?" from that. Looking at the file permissions (on this Linux system), the file wasn't read only. After just deleting the -journal file, the database opened fine. We should probably figure out a more helpful approach for this situation. 😄
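One possible friendlier approach (a sketch only; `journal_warning` is a hypothetical helper, and the real DB4S behaviour may differ): detect the leftover rollback journal next to the database and tell the user what it means. Note that SQLite will normally roll a hot journal back automatically when the database is next opened with write access, so the stale journal mainly matters when the file is opened read-only.

```python
import os

def journal_warning(db_path):
    """Return a user-facing hint if a rollback journal sits next to the DB."""
    journal = db_path + "-journal"
    if os.path.exists(journal):
        return (
            f"A journal file ({journal}) was left behind, probably by a "
            "session that didn't shut down cleanly. Opening the database "
            "with write access should recover it automatically."
        )
    return None
```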

@iKlsR
Member

iKlsR commented Sep 7, 2017

Agreed. This was sane behaviour before batch imports existed, so it's built on old assumptions, but a "Yes for all" button would be nice. (Currently swamped with investor demo work.) Should be a quick fix this weekend, Sunday at the latest.

@justinclift
Member Author

@iKlsR Awesome. 😄

MKleusberg added a commit that referenced this issue Sep 27, 2017
Add a 'Yes for all' button to the "There is already a table of
that name..." message box in the Import CSV dialog.

See issue #1121.
@MKleusberg
Member

@justinclift Just pushed a commit which adds a 'Yes for all' button to that message box. Can you check if that's good enough for your use case? 😄

@justinclift
Member Author

Thanks @MKleusberg. Yep, it's "good enough".

As a data point, the crash can still be triggered by pressing a key while this is happening. Not sure if that was looked at in the commit (haven't eyeballed the commit code yet).

In case it's useful, the data set I'm testing it with is this:

    https://f001.backblaze.com/file/Backblaze-Hard-Drive-Data/data_2013.zip

That's an ~80 MB zip file, filled with a bunch of CSVs. Makes for a really good test case for this. 😄

@justinclift
Member Author

Hmm... if you're up for being a perfectionist about the dialog, maybe check if the table already exists before doing the first import? If it doesn't already exist, then automatically set dontAskForExistingTableAgain to true? That way it'll only ask for tables which really are in the database already. 😄
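The existence pre-check suggested here boils down to one query against SQLite's schema table. A minimal sketch (the `table_exists` helper is hypothetical; `dontAskForExistingTableAgain` is the flag named above, in DB4S's own C++ code):

```python
import sqlite3

def table_exists(con, name):
    """True if a table of that name is already in the database."""
    row = con.execute(
        "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = ?",
        (name,),
    ).fetchone()
    return row is not None
```

If this returns False before the first import, the dialog never needs to appear at all for that table name.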

MKleusberg added a commit that referenced this issue Sep 27, 2017
@MKleusberg
Member

Ah, good idea. I'm not 100% sure how best to implement this, but I've just pushed a commit that feels like a good approach to me. It also distinguishes between different table names now.

I haven't yet investigated the crash issue. I'll leave that for issue #1072 😃

@justinclift
Member Author

justinclift commented Sep 27, 2017

Cool!

Testing it now, it's working really well. 😀

As a data point, that slowdown over time isn't happening here. It looks like it's related to importing into separate tables, so it doesn't occur when everything goes into just the one.

Just realised we're also missing a Cancel button for this "There is already a table named ..." dialog. At the moment there's no way for the user to go "Oops, wrong table name. I need to cancel out". 😄

MKleusberg added a commit that referenced this issue Sep 27, 2017
@MKleusberg
Member

The cancel button was a bit trickier but should be working now as well 😄

@justinclift
Member Author

Yep, it's all working really well now. Thanks @MKleusberg, well done. 😄

@MKleusberg
Member

Cool, thanks 😃
