Speed up JSONL to PostgreSQL import

  • Status: Open
  • Prize: €120
  • Entries Received: 4

Contest Brief

The goal of this project is to significantly speed up an existing data import script (see import_data.sh).

The steps needed to import the data consist of:
- Set up the tables required for the import (a PostgreSQL init script is provided, see init.sql)
- Download/fetch the large (±24 GB) JSONL data dump
- Import and convert this single JSONL dump into two PostgreSQL tables running in a Docker container
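For bulk loads of this size, the usual win is to stream the dump into PostgreSQL's `COPY ... FROM STDIN` instead of issuing row-by-row `INSERT`s. As a minimal sketch (the real column names and the split into two tables depend on init.sql, which is not shown here, so the columns below are placeholders), a hypothetical helper converting one JSONL record into a line of COPY text format could look like:

```python
import json

def jsonl_to_copy_row(line, columns):
    """Convert one JSONL record into one line of PostgreSQL COPY text format.

    `columns` lists the JSON keys in table-column order; these names are
    placeholders -- the actual schema comes from init.sql.
    """
    record = json.loads(line)
    fields = []
    for col in columns:
        value = record.get(col)
        if value is None:
            fields.append(r"\N")  # COPY's text-format NULL marker
        else:
            text = str(value)
            # Escape the backslash first, then the characters COPY's
            # text format treats specially (tab, newline, carriage return).
            text = (text.replace("\\", "\\\\")
                        .replace("\t", "\\t")
                        .replace("\n", "\\n")
                        .replace("\r", "\\r"))
            fields.append(text)
    return "\t".join(fields) + "\n"

# Streaming these lines into `COPY <table> FROM STDIN` (e.g. via psql or
# psycopg2's copy_expert) avoids materializing the 24 GB dump in memory.
```

This is a sketch of the general technique, not the contest solution itself; whether jq, Python, or pure SQL drives the transform is up to the entrant.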

Requirements:
- The resulting tables must be identical to those produced by the original import script.
- The import time must be significantly lower than that of the original script on the same hardware.
- Everything should be easy to run: clearly document how to run it.
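To check the first requirement, an entrant can compare an order-independent fingerprint of each table as produced by the original script versus the optimized one. A minimal sketch, assuming rows can be fetched as tuples (e.g. `SELECT * FROM <table>` through any client):

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent digest of an iterable of row tuples.

    XOR-combining per-row SHA-256 digests means two result sets match
    regardless of physical row order. A row repeated an even number of
    extra times would cancel out in the XOR, so the row count is
    returned alongside the digest as a cheap sanity check.
    """
    acc = 0
    count = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
        count += 1
    return count, format(acc, "064x")
```

Asserting equal (count, digest) pairs for both tables gives an equality check without sorting 24 GB of data; it is a verification aid under the stated assumptions, not part of the required deliverable.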

How you achieve this is completely up to you (Bash + SQL, pure SQL, Python, …).

Public Clarification Board

  • blui88 • 9 hours ago

    May I help you in this regard?

    Let's chat to know more about the issue and about probable improvements in the script.
    Regards,
    Santosh

  • kreativesystem91 • 17 hours ago

    Hello, I can help you with your project; contact me.
