Python Tool for Big Data Processing
- Status: Open
- Prize: $173
- Entries Received: 30
Contest Description
We are seeking a skilled Python developer to design and optimize a robust tool capable of importing and processing hundreds of thousands of rows of data while interacting with an internal API. The tool must achieve a throughput of at least 10,000 transactions per second (TPS), with a preference for even higher performance.
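To make the throughput target concrete, here is a minimal sketch of batched, concurrent submission to an internal API. The endpoint URL, batch size, and JSON payload format are placeholder assumptions; the real API contract would be supplied during the task. Batching rows and keeping many requests in flight is what makes 10,000+ TPS realistic.

# Minimal sketch of a high-throughput ingest loop, assuming a hypothetical
# internal endpoint that accepts JSON batches (URL below is a placeholder).
import asyncio
import aiohttp

API_URL = "https://internal.example/api/ingest"   # placeholder, not the real API
BATCH_SIZE = 1_000                                # rows per request
CONCURRENCY = 20                                  # parallel in-flight requests

async def send_batch(session: aiohttp.ClientSession,
                     batch: list[dict],
                     sem: asyncio.Semaphore) -> None:
    async with sem:                               # cap concurrent requests
        async with session.post(API_URL, json=batch) as resp:
            resp.raise_for_status()

async def ingest(rows: list[dict]) -> None:
    sem = asyncio.Semaphore(CONCURRENCY)
    async with aiohttp.ClientSession() as session:
        tasks = [
            send_batch(session, rows[i:i + BATCH_SIZE], sem)
            for i in range(0, len(rows), BATCH_SIZE)
        ]
        await asyncio.gather(*tasks)              # many 1,000-row batches in flight

# asyncio.run(ingest(list_of_row_dicts))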
The data import process will be semi-automatic and requires real-time updates to the imported content, as the source file will dynamically grow or shrink as data is added or removed. Once the import and processing begin, the tool should produce two output tables (see the sketch after the list below):
1. A table containing the processed data.
2. A table with manually selected entries.
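A rough sketch of this semi-automatic loop, assuming the source is a CSV file with an "id" column and that manual selection is represented by a set of chosen ids (file name, column name, and polling interval are all placeholder assumptions):

# Re-read the source file whenever its size changes and rebuild both tables.
import os
import time
import pandas as pd

SOURCE = "source.csv"          # placeholder path for the growing/shrinking file
selected_ids: set[str] = set() # populated by the manual-selection step

def process(df: pd.DataFrame) -> pd.DataFrame:
    # placeholder transformation; the real processing rules come from the task spec
    return df.dropna()

def watch(poll_seconds: float = 1.0) -> None:
    last_size = -1
    while True:                                    # simple polling watch loop
        size = os.path.getsize(SOURCE)
        if size != last_size:                      # file grew or shrank
            last_size = size
            df = pd.read_csv(SOURCE)
            processed = process(df)                                   # table 1: processed data
            selected = processed[processed["id"].isin(selected_ids)]  # table 2: manual selections
            processed.to_csv("processed.csv", index=False)
            selected.to_csv("selected.csv", index=False)
        time.sleep(poll_seconds)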
For your entry, please submit a big-data example of your choice that showcases:
- Efficient handling of large-scale datasets.
- High performance and scalability.
- Interaction with APIs to fetch or update data.
This will serve as the foundation for further refinement and development during the task.
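Purely as an illustration of the kind of memory-efficient, large-scale handling an entry could demonstrate, here is a minimal sketch of chunked CSV processing; the file name and the per-chunk aggregation are placeholder assumptions.

# Process a large CSV in fixed-size chunks so memory stays bounded.
import pandas as pd

CHUNK_ROWS = 100_000   # rows per chunk

def summarize(path: str) -> pd.DataFrame:
    partials = []
    # stream the file chunk by chunk instead of loading it all at once
    for chunk in pd.read_csv(path, chunksize=CHUNK_ROWS):
        # placeholder aggregation: sum the numeric columns of each chunk
        partials.append(chunk.select_dtypes("number").sum())
    # combine the per-chunk partial sums into one summary table
    return pd.concat(partials, axis=1).T.sum().to_frame("total")

# print(summarize("big_input.csv"))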
Key Requirements:
- Advanced proficiency in Python, particularly for large-scale data processing.
- Proven experience working with big data and developing high-performance solutions.
- Expertise in creating efficient, scalable, and maintainable data processing pipelines.
Suggested Skills