Speed up JSONL to PostgreSQL import
- Status: Open
- Prize: €120
- Entries Received: 4
Contest Brief
The goal of this project is to significantly speed up an existing data import script (see import_data.sh).
The import consists of the following steps:
- Setting up the tables needed for the import (a Postgres init script is provided; see init.sql)
- Downloading/fetching the large (~24 GB) .JSONL data dump
- Importing and converting this single .JSONL dump into two tables in a PostgreSQL instance running in a Docker container
Requirements:
- The resulting tables must be identical to those produced by the original import script.
- The import time must be significantly lower than that of the original import script on the same hardware.
- Everything should be easy to run: clearly document how to run it.
How you achieve this is completely up to you (Bash + SQL, pure SQL, Python, …).
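Since the approach is open, one common direction is to stream the JSONL through a converter into PostgreSQL's COPY protocol instead of issuing per-row INSERTs. Below is a minimal Python sketch of the conversion step only; the column names here are hypothetical placeholders, and the real layout must match the tables defined in init.sql:

```python
import json

# Hypothetical column layout for illustration; the real schema comes from init.sql.
COLUMNS = ["id", "name", "payload"]

def _escape(value):
    # COPY text format: NULL is \N; backslash, tab, and newline must be escaped.
    if value is None:
        return r"\N"
    s = value if isinstance(value, str) else json.dumps(value)
    return (s.replace("\\", "\\\\")
             .replace("\t", "\\t")
             .replace("\n", "\\n")
             .replace("\r", "\\r"))

def jsonl_to_copy_lines(jsonl_lines, columns=COLUMNS):
    """Convert JSONL records to tab-separated rows ready for COPY ... FROM STDIN."""
    for line in jsonl_lines:
        if not line.strip():
            continue  # skip blank lines in the dump
        record = json.loads(line)
        yield "\t".join(_escape(record.get(col)) for col in columns)
```

The generated rows can be piped into `psql -c "\copy my_table FROM STDIN"` or fed through psycopg2's `copy_expert`; bulk COPY is typically much faster than individual INSERT statements, which is where most of the speedup in a task like this tends to come from.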