ETL PowerCenter project jobs
The problem is as follows: my previous programmer set up a Task Scheduler job for the ETL in SQL Server, and it no longer runs automatically. It used to run normally, but then the automatic scheduling stopped working. Please examine the SQL Server.
I'm in need of an Azure Data Engineer with substantial experience working with SAP HANA as a data source in Azure Data Factory. The primary focus of this project is to use the ODBC connector in ADF to read the source data. I look forward to your bids. Please ensure you meet the requirements before applying. The expected duration of the project is less than 1 day. The project requires basic ETL transformations.
... disaster recovery and business continuity strategies: • Acts on disaster recovery tests to perform failover and failback. • Interventions scheduled on weekends and during non-working hours. Profile skills required: • Technical skills for a Production Support Engineer • Linux Red Hat (version 6.2 onwards, mandatory), CentOS • KSH scripting and SSH skills • Windows Server (2012 onwards) • Experience with ETL tools (Informatica) • Experience with databases (Oracle, Teradata, PostgreSQL) • Experience with BI tools (MSBI, Business Objects, Power BI) • Experience with file transfer tools (Tom, SFTP) • Experience with scheduling tools (Control-M, Jenkins) • Experience with automation (Jenkins, Ansible, Nexus, Terraform) • Experience on n...
I am looking for a skilled professional who can assist with using Python and Terraform. The project involves automating infrastructure on the cloud, specifically AWS Key Responsibilities: - Automate cloud-based infrastructure using Terraform - Process data using Python - Ensure efficient and streamlined operations Ideal Skills: - Strong proficiency in Python - Extensive experience with Terraform - Familiarity with AWS - Background in data processing Please bid if you have the relevant experience and skills. The expected timeline for this project is less than 1 month. The project will involve ETL (Extract, Transform, Load) processes for data handling. The expected volume of data to be processed is small. Provide regular progress updates during the automati...
...optimize robust data pipelines for structured, semi-structured, and unstructured data. The ideal candidate will have extensive experience in ETL processes, database architecture, distributed data processing, and cloud platforms (AWS, Azure, GCP). This role is a full-time freelance position, requiring 8-hour workdays as part of a Scrum team, with occasional meetings in Pacific Time but most work done in IST. Key Responsibilities: Data Pipeline & Architecture: Design and architect scalable data pipelines for various data types. Develop, manage, and optimize databases, including RDBMS (MySQL, PostgreSQL), NoSQL (MongoDB), and data lakes (S3). Implement efficient ETL processes using PySpark and Hadoop to transform and prepare data for analytics and AI use cases. Optimize d...
1. Project Overview The AI Agent will be designed to automatically pull data from multiple sources, including different projects, documents, cloud storage services (Dropbox, S3), URLs, and media content (YouTube videos). The agent will process, clean, and store the extracted data for further analysis, search, and retrieval. 2. Objectives Develop an AI-driven system to aggregate data from various sources. Enable automated extraction, transformation, and loading (ETL) processes. Ensure seamless integration with cloud storage, document repositories, and media platforms. Implement Natural Language Processing (NLP) for document comprehension and YouTube transcription. Provide an API interface for querying and accessing the processed data. Support integration with various L...
...central data platform for our Epiko Hub project. Epiko Hub is a comprehensive ecosystem unifying gaming, NFTs, phygital assets, and social features, serving both Web2 and Web3 users. The successful candidate will design, implement, and optimize data pipelines, storage, and processing workflows using Snowflake’s advanced capabilities. Key Responsibilities: Architectural Planning: Design a scalable, secure, and efficient data architecture with Snowflake as the core repository. Develop detailed technical documentation outlining integration points and data flows across Epiko Hub modules. Data Ingestion & Processing: Set up and configure Snowpipe for real-time data ingestion from game servers, blockchain APIs, and external marketplaces. Develop and optimize ETL pipe...
...IT project and I need a seasoned data engineer to assist me. The project revolves around extensive data engineering tasks with a focus on DBT and Snowflake data warehousing. Key Responsibilities: - Leveraging Snowflake for Data Warehousing - Implementing ELT strategies - Utilizing DBT for data transformations - Overseeing the entire Data Warehouse (DWH) setup Data Type: - Primarily dealing with transactional data The ideal candidate would have: - Extensive experience with DBT and Snowflake - Deep understanding of data engineering principles - Proven track record in setting up databases and creating ETL pipelines - Skills in optimizing data queries - Ability to work with transactional data Please let me know if you're the right fit for this project. ...
I'm looking for a Power BI expert who can link my Power BI dashboard with Timegate and Coredinate. I need to visualize and analyze specific data from these platforms. Data Requirements: - Attendance data - Task management data - Performance metrics This project involves not only connecting these platforms to Power BI, but also handling the entire data extraction, transformation, and loading (ETL) process. Ideal Skills: - Proficiency in Power BI - Experience with Timegate and Coredinate - Strong ETL skills - Data visualization expertise
I'm looking for a PowerCenter DEVELOPER who is a native Spanish speaker, to work full-time or part-time. Long project duration.
...skilled StreamSets ETL Tech Consultant with a strong background in data integration, pipeline development, and real-time data streaming. The successful candidate will be responsible for the design, development, and optimization of ETL/ELT pipelines using StreamSets Data Collector (SDC). Key Responsibilities: - Integrate with Oracle databases - Develop ETL pipelines that interact with Clickhouse cloud platform - Utilize streamsets for big data technologies Ideal Skills and Experience: - Proven experience with StreamSets Data Collector (SDC) - Strong expertise in Oracle database - Proficiency in developing ETL pipelines for Clickhouse - Experience with streamsets in big data environments - Excellent skills in pipeline development and real-time data streaming ...
I am looking for...and timely, using data integration best practices. Technical requirements: Extensive experience in database integration and data analytics solutions. Solid knowledge of SQL, along with experience connecting to and managing relational databases (MySQL, PostgreSQL, SQL Server, etc.). Mastery of data integration tools and technologies (ETL, RESTful APIs, web services, etc.). Experience configuring secure database access environments, including user authentication and authorization. Ability to work with data visualization tools and other analytics syste...
*Description*: We are seeking an experienced consultant to integrate *JD Edwards* and *Epicor Kinetic ERP* systems, specifically focusing on *Accounts Receivable (AR)* and *Accounts Payable (AP)* modules. The ideal candidate should have expertise in ERP integration, data mapping, API development, and ETL processes. *Key Requirements*: • Proven experience with JD Edwards and Epicor Kinetic ERP systems. • Strong knowledge of AR/AP workflows and data structures. • Proficiency in API integration and data security best practices. • Ability to deliver within a defined timeline. *Deliverables*: • Seamless integration of AR/AP modules between the two systems. • Documentation of the integration process. • Post-integrat...
...with the implementation of SAP Analytics Cloud for Planning (SAC Planning) Scope of project is to deliver a Planning & Reporting solution which includes Financial Statements Planning and Revenue Planning aligning with the S/4 go-live The solution will be used by approximately 40 users and such work will incorporate the tasks to define, develop, test and rollout the planning system Offshore role responsibilities include: • Understanding of integrating SAP SAC Planning with other SAP solutions such as SAP S/4HANA, SAP S/4HANA Cloud using standard integration methods and APIs • Experience in integrating SAP SAC Planning with other SAP and non-SAP systems for data extraction, transformation, and loading (ETL) • Experience in configuring Import Jobs for Ma...
...details for every execution. Documentation and Guidance: - Provide clear instructions for running and maintaining the scripts. - Minimal guidance to help me perform small tasks like testing queries or troubleshooting basic issues in BigQuery if needed. Requirements: - Strong experience with SQL and Python. - Proficiency in Google BigQuery for setting up and managing datasets. - Familiarity with ETL/ELT processes and cloud platforms like Google Cloud. Deliverables: Python scripts that: - Extract data from SQL and load it into BigQuery. - Run automatically at scheduled intervals. - Organized BigQuery datasets ready for analysis. - Documentation for maintaining the process and handling common issues....
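A minimal sketch of the extraction half of that request, using an in-memory SQLite database as a stand-in for the production SQL source; the `sales` table, its columns, and the BigQuery dataset name are all hypothetical, and the load call (shown as a comment) would additionally require the google-cloud-bigquery client and credentials:

```python
import sqlite3

def extract_rows(conn, query):
    """Pull rows from a SQL source as a list of dicts, the JSON-friendly
    shape that BigQuery's load_table_from_json expects."""
    cur = conn.execute(query)
    cols = [d[0] for d in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

# Demo with an in-memory SQLite stand-in for the production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

rows = extract_rows(conn, "SELECT id, amount FROM sales")

# Loading step (needs google-cloud-bigquery and credentials), sketched:
# from google.cloud import bigquery
# client = bigquery.Client()
# client.load_table_from_json(rows, "my_dataset.sales").result()
```

The scheduled-interval requirement would then be a cron entry or Cloud Scheduler job invoking this script.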
I'm looking for a seasoned data engineer to assist with my data warehousing project on Amazon Redshift. Key Responsibilities: - Design and implement an efficient and scalable data warehousing solution - Ensure seamless integration of data from various sources - Optimize the data warehouse for performance and cost-efficiency Ideal Skills: - Extensive experience with Amazon Redshift - Proficient in ETL pipeline development - Knowledgeable in data warehousing best practices - Strong problem-solving skills and attention to detail.
Overview Design and implement ETL processes on GCP, optimize data workflows, ensure data accuracy and integrity, troubleshoot pipeline issues, and collaborate with cross-functional teams to support data integration and analytics for the EDW program. Job Description Responsibilities: Design, develop, and implement ETL processes on Google Cloud Platform (GCP) to ensure efficient data extraction, transformation, and loading for the EDW program. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions on GCP. Utilize Google Cloud services, such as native services, BQSQL, DAG, Cloud Composer, Dataflow, and Cloud Storage, to optimize data processing and storage workflows. Create and maintain documentation for E...
**Project Title:** Power BI Report for Sales and Marketing Data **Description:** We are seeking an experienced Power BI developer to create a comprehensive report for our sales and marketing data. The ideal candidate should have a strong background in data visualization and analysis, with a proven track record of delivering high-quality Power BI reports. **Project Requirements:** 1. **Data Integration:** - Integrate sales and marketing data from various sources (e.g., CRM, ERP, Excel files). - Ensure data accuracy and consistency. 2. **Report Design:** - Create visually appealing and interactive dashboards. - Include key metrics such as sales performance, marketing campaign effectiveness, customer segmentation, and ROI analysis. - Design reports that are ea...
We are looking for strong candidates for Automation QA. Excellent communication is mandatory. Automation QA experience required: 5+ years. JOB DESCRIPTION: Automation QA — strong in Java programming; strong at creating automation frameworks; good at web and API automation; mobile automation is good to have. Python QA — 5+ years of experience; remote; strong command of Python and SQL; ETL is good to have; ability to work with databases using Python.
I am seeking a seasoned Data Engineer with over 4 years of experience. The ideal candidate should possess strong skills in ETL processes, Data warehousing, and Big data technologies. Proficiency in tools like Apache Spark, Hadoop, and SQL is a must. Key Requirements: - Extensive experience in ETL processes, Data warehousing and Big data technologies - Proficient in Apache Spark, Hadoop and SQL - Seeking for a Full-time employment Skills and Experience: - 4+ years in Data Engineering - Mastery of relevant tools and technologies - Ready for Full-time commitment If you're a dedicated professional with the required skills and experience, I look forward to your application.
Job Title: Power BI Freelancer Needed for Data Visualization and Reporting J...measures and columns. Ensure data accuracy and troubleshoot issues within the data model. Optimize dashboards for performance and user experience. Collaborate with stakeholders to gather requirements and deliver solutions. Required Skills: Proficiency in Power BI and its components (Power Query, DAX, etc.). Strong knowledge of data visualization best practices. Experience with data modeling and ETL processes. Familiarity with integrating multiple data sources. Good communication skills for understanding requirements and presenting insights. Preferred Qualifications: Experience in [specific industry, if relevant]. Knowledge of SQL, Python, or other relevant tools. Previous experience in creating similar d...
I'm looking for a skilled Data Engineer/Full Stack Developer with extensive knowledge and experience in database structuring and implementation on AWS. The project revolves around a financial platform, necessitating the design and implementation of a robust ETL (Extract, Transform, Load) pipeline for our data warehouse. Key Responsibilities: - Creating the ETL pipeline - Data extraction primarily from vendor APIs with occasional web scraping - Data loading to various databases Ideal Skills: - Strong expertise in AWS and cloud-based database implementation - Proficiency in designing and optimizing ETL pipelines - Experience in integrating vendor APIs - Knowledge in database structuring and optimization Good to Have: - Proficiency in AI and prompt engineer...
1. Project Overview Objective: Build a robust backtesting pipeline for options trading strategies using AI/ML techniques. The pipeline will incorporate: GDFL 1-minute OHLCV data. Level 2 order book data (market depth). Computed Greeks (Delta, Gamma, Theta, Vega, etc.). AI/ML models for generating trading signals and evaluating performance. Scope: Acquire, clean, and store the relevant data. Calculate and integrate Greeks for each option contract. Engineer features from OHLCV, Level 2 order book, and Greeks. Develop AI/ML models for predictive signals or strategy optimization. Backtest and analyze strategy performance. Provide documentation, deliverable code, and a final report. 2. Key Deliverables Data Management & Integration Scripts/Processes to fetch and organize GDFL 1-m...
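For the "Computed Greeks" step above, the closed-form Black-Scholes delta is a common starting point; this is a standard textbook formula, and the parameter values below are illustrative only (the project's GDFL feed and contract conventions would determine the real inputs):

```python
from math import log, sqrt, erf

def norm_cdf(x):
    # Standard normal CDF via the error function (no SciPy needed).
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_delta(S, K, T, r, sigma):
    """Black-Scholes delta of a European call: N(d1)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm_cdf(d1)

# An at-the-money call with a year to expiry has delta a little above 0.5:
delta = bs_call_delta(S=100, K=100, T=1.0, r=0.0, sigma=0.2)
```

The other Greeks (gamma, theta, vega) have analogous closed forms and would be computed per contract per bar before feature engineering.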
We have two SQL databases: a source database and a target database with completely different structures (tables, field names/types, and relations). We're doing migration planning from source to target and need consulting advice on how best to map tables and fields for the ETL migration. We are looking for a solution that uses AI and/or ML so the schema mapping doesn't have to be done entirely by hand. Please put "SCHEMA123" at the start of your proposal so we can filter out automated responses. Please list any details of your experience in doing...
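As one illustration of what automated schema mapping could look like, a simple string-similarity baseline (far short of full AI/ML, but the same idea) can suggest candidate column matches for human review; all column names below are made up:

```python
from difflib import SequenceMatcher

def match_columns(source_cols, target_cols, threshold=0.6):
    """Suggest source->target column mappings by name similarity.
    A real project might add type-compatibility checks or
    embedding-based semantic matching on top of this baseline."""
    mapping = {}
    for s in source_cols:
        best, score = None, threshold
        for t in target_cols:
            r = SequenceMatcher(None, s.lower(), t.lower()).ratio()
            if r > score:
                best, score = t, r
        if best:
            mapping[s] = best
    return mapping

src = ["cust_name", "order_dt", "total_amt"]
tgt = ["CustomerName", "OrderDate", "TotalAmount", "Region"]
mapping = match_columns(src, tgt)
```

Unmatched columns (here, `Region`) fall through to manual mapping, which keeps the human in the loop for the ambiguous cases.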
We are migrating data from our current ERP system (stored in a SQL database) to Microsoft Dynamics Business Central Cloud. The source database structure is different from Business Central's. We have been able to reverse engineer both databases and create ERD models and DDL scripts. We also know how to create Dynamics APIs and use AL to interact with Dynamics. We are looking for consulting advice on how best to migrate the data into Dynamics: for example, ideas on how best to match tables and fields, what software tools to use, and the best process for this ETL project. Please describe any experience you have with ETL/migrating data between different systems. Please add the word DYMANICS123 to the start of your proposal so we can weed out autom...
... o Data wrangling using SAC Modeler (acquired and live connections). o SAC Planning: Complex data actions, aggregations, and input schedules. o Finance Planning: Supporting budgeting, forecasting, and financial reporting processes. • Design and optimize SAP BW/4HANA models, including: o DSOs, ADSOs, Composite Providers, and InfoObjects. o Develop BW queries using BEx Query Designer. o Implement ETL pipelines for structured data loading and transformation. • Support integration between SAC and SAP BW using live and acquired connections. • Assist in system testing, debugging, and troubleshooting. • Prepare and maintain technical documentation for analytics solutions. Specific Skills Required: • Proficiency in SAC Story Design, data blending, and linked ana...
...Utilize relational databases and ORMs (e.g., SQLAlchemy) effectively for data management. Technical Skills Must-Have Skills ● Python Programming (Advanced): ○ Comprehensive knowledge of syntax, semantics, multi-threading, multi-processing, regular expressions, and exception handling. ○ Expertise in libraries such as Pandas and NumPy. ● ETL Pipelines: ○ Proficiency in designing and implementing ETL pipelines for large-scale data projects. ● Object-Oriented Programming (OOP) (Intermediate): ○ Understanding of classes, objects, inheritance, polymorphism, and encapsulation. ● Relational Databases (Advanced): ○ Experience in query optimization, query building, and managing databases such as PostgreSQL, MSSQL, and Oracle. Expected Proficiency ● AWS Cloud Services: ○ Fami...
I am looking for an experienced Data Integration Specialist for a 3-month contract, based in Pune or Bangalore. The ideal candidate will have over 6 years of experience, with a strong focus on Informatica PowerCenter, SQL, and Microsoft SQL Server. Key Responsibilities: - Creating and maintaining end-to-end source to target flows using Informatica PowerCenter. - Conducting data analysis and reporting, utilizing SQL expertise. - Basic Unix commands and shell scripting to support tasks. - Direct communication with users to understand and fulfill requirements. Compensation will be based on experience. Strong communication skills are essential, as the role involves independent task completion based on user requirements. Production monitoring and issue handling experience will ...
...embeddings, sentiment analysis, RESTful APIs, OAuth2, Pinecone, PostgreSQL, MongoDB, scraping tools, ETL pipelines, and multi-agent systems. Looking for someone passionate about AI who can work thoughtfully and independently. The first project we are working on is an automated workflow process, based on the Dream 100 concept, to identify, track, and build relationships with influential content creators. Here is an outline of the project: please review carefully and then reply with your relevant qualifications and, MORE IMPORTANTLY, your specific observations on this specific project along with budget and timeline. We're looking for a long-term relationship, so we will schedule meetings
Job Title: Freelance Data Engineer/Programmer for ETL Development Job Description: We are seeking an experienced Data Engineer/Programmer to develop robust ETL (Extract, Transform, Load) processes to seamlessly integrate our Warehouse Management System (WMS) with our Enterprise Resource Planning (ERP) system. This role is critical in ensuring smooth and reliable data flow between these systems to enhance operational efficiency and decision-making. Key Responsibilities: - Design, develop, and maintain ETL pipelines to connect WMS and ERP systems. - Extract data from various sources, transform it to meet business requirements, and load it into the target system or data warehouse. - Work with Python and Pandas for data manipulation and automation of workflows. - Levera...
...Description: This project focuses on designing and deploying a robust data pipeline using Delta Live Tables (DLT) on Databricks. The pipeline ensures high data quality through defined expectations within DLT or an equivalent framework. Once the pipeline is developed and validated, it will be deployed using Databricks Asset Bundles, enabling seamless and scalable management of data workflows. The project will cover the following steps: Pipeline Development with Delta Live Tables: Creating ETL pipelines to process and transform data incrementally. Data Quality Assurance: Implementing expectations to validate and monitor data at every stage of the pipeline. Deployment with Databricks Asset Bundles: Packaging and deploying the pipeline for operational use in productio...
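Inside Databricks, such expectations are declared with decorators like `@dlt.expect_or_drop` on a Delta Live Tables table definition; outside that runtime, the drop-on-failure semantics can be mimicked in plain Python to show the idea (illustration only, not the DLT API, and the record shape is made up):

```python
def expect_or_drop(rows, name, predicate):
    """Mimic DLT's expect_or_drop: keep rows that pass the predicate
    and count the failures, so quality can be monitored per stage."""
    kept = [r for r in rows if predicate(r)]
    dropped = len(rows) - len(kept)
    return kept, {name: dropped}

raw = [{"id": 1, "price": 10.0}, {"id": 2, "price": -5.0}, {"id": None, "price": 3.0}]
clean, metrics = expect_or_drop(
    raw, "valid_row",
    lambda r: r["id"] is not None and r["price"] >= 0,
)
```

In real DLT, these counts surface automatically in the pipeline event log, which is what makes the validation-at-every-stage requirement practical to monitor.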
I'm looking for a seasoned Python Engineer with extensive Flask experience to build a robust API service. Key responsibilities will include: - Web development with Flask: Your primary focus will be creating a scalable and efficient API service that meets our project's requirements. - Data Handling and ETL: While the bulk of the work will be web development, a strong capability in data handling, extraction, transformation, and loading is essential. Experience with Pandas is a must. - Integration of Data Analytics: The API will need to have data analytics capabilities. Prior experience in this area will be beneficial. - Implementing Real-time Updates: The API service should be able to support real-time updates. Experience with similar features in previous projects is a plus....
I am in need of a highly skilled Senior Python Engineer to assist with the backend of a web application. The ideal candidate should have substantial experience with Python, MongoDB...Handling data processing and storage. - Developing robust APIs. Critical Skills: - Proficiency in Python and Flask. - Extensive knowledge of MongoDB and PostgreSQL. - Experience with object-oriented programming. - Understanding of scalable architecture. - Proficient in data extraction, transformation, and loading (ETL) using Pandas. This role will focus heavily on data processing and storage, as well as API development, so experience and skills in these areas are vital. The project requires a full backend development, so a full-stack developer would also be considered, but the primary focus is...
I'm looking for a freelance resume writer who specializes in tech resumes, particularly for data e...in the UK. My goal is to create a compelling resume that highlights my skills and experiences effectively for mid-level roles. Key Skills to Highlight: - Proficiency in Python and SQL - Experience with Cloud platforms like Azure - Familiarity with Databricks, PySpark, Power BI, Snowflake, DBT Cloud, Azure Data Factory, Synapse Analytics, ETL, CI/CD, Azure DevOps, C# .NET Most Recent Job Experience: - Data Engineering The ideal candidate for this project should have: - Proven experience in writing tech resumes, particularly for data engineering roles - In-depth understanding of the data engineering field - Ability to articulate and present technical skills and experience...
I am looking for someone to mentor, train, and provide project support on Talend and complex SQL tasks. This is at least a 3-6 month arrangement. I would require the right person to give me a demo before we can start the project together. Confidentiality is an important part of the process.
Develop a comprehensive dashboard in Tableau to track and analyze sales performance across multiple regions and stores, enabling stakeholders to make data-driven decisions. Key Responsibilities: Data Integration: Gather data from multiple sources, including SQL Server, SAP, and Excel files. Design and optimize ETL pipelines using Alteryx to clean, transform, and load data into a centralized data warehouse. Dashboard Development: Build interactive Tableau dashboards showcasing KPIs such as total sales, profit margin, sales trends, and regional performance. Incorporate advanced features like drill-downs, filters, and trend analysis to provide actionable insights. Create custom visualizations to highlight underperforming regions and top-selling products. Data Modeling: Desi...
I am looking for an experienced ETL tester with strong skills in various ETL testing tools and methodologies. Your expertise in data warehousing and cloud platforms will be highly beneficial for this project. SHARE RESUME Key Requirements: - Proficient in ETL testing tools like Informatica PowerCenter, IICS, and Informatica DVO - Familiar with Agile Scrum, Waterfall, and STLC methodologies - Experienced with cloud platforms, particularly GCP (Google Cloud Platform), AWS Cloud, and Azure Cloud - Proficient in databases such as MySQL, Oracle, SQL Server, and PostgreSQL - Comfortable working with various operating systems, including Windows XP/7/10, UNIX, and macOS - Skilled in SQL, PL/SQL, and Python - Experienced with test management tools like HP ALM / Qualit...
...Data Analytics and Data Science. The target trainees are absolute beginners, so the trainer must be adept at simplifying complex concepts. The trainer should cover all required topics, not limited to the ones mentioned below, and should also execute an end-to-end hands-on project with the trainee. Please refer to the attached training topic requirements. The training should cover: - Python and R - SQL and Excel - Tableau and Power BI - Data Extracting, Transforming, and Loading (ETL) processes - Large data platforms Ideal candidates are those with extensive experience dealing with corporate clients and delivering projects. A proven track record of training success is a plus. Your ability to engage with beginners and make learning enjoyable a...
I'm looking for a Senior Java Backend Application Developer, primarily focused on integration and development with Kafka. Key Responsibilities: - Spearhead integration development tasks - Potential for application development tasks Essential Skills: - Over 5...Experience with cloud platforms such as AWS, Azure, or Google Cloud. - Proficiency with container orchestration tools like Kubernetes or Docker Swarm. - Proficiency in scripting languages such as Python, Bash, or PowerShell. - Knowledge of API management tools and practices. - Understanding of security protocols and best practices in application integration. - Experience with data transformation and ETL tools. If you have a strong background in integration development and are eager to work with Kafka, I would like to...
I am in need of an ETL engineer who can assist with transferring data from our Oracle database to a MySQL server. The project involves: - **Data Transfer**: The volume of data to be transferred is less than 10 GB. - **Frequency**: The data will be transferred on a weekly basis for one case and as and when available for other case. Your script should fire automatically to perform the transfer. - **Data Transformation**: Some level of data transformation is required during the transfer process. - Incorporate data validation checks to ensure the accuracy and integrity of the transferred data. - Implement robust error handling and logging mechanisms to capture any issues during the ETL process. - Set up and configure a scheduling mechanism to automate the weekly a...
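A minimal sketch of the validation, error handling, and logging this posting asks for, wrapped around one transfer run; the extract/transform/load callables here are injected stand-ins (in production they would wrap the Oracle and MySQL drivers, e.g. oracledb and mysql-connector), and the weekly scheduling itself would live in cron or Windows Task Scheduler:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def validate_counts(source_count, loaded_count):
    """Post-load check: the run only counts as successful when every
    extracted row reached the target."""
    if source_count != loaded_count:
        raise ValueError(f"row count mismatch: extracted {source_count}, loaded {loaded_count}")

def run_transfer(extract, transform, load):
    """Hypothetical driver for one scheduled run (e.g. cron: 0 2 * * 0)."""
    try:
        rows = [transform(r) for r in extract()]
        loaded = load(rows)
        validate_counts(len(rows), loaded)
        log.info("transfer ok: %d rows", loaded)
        return loaded
    except Exception:
        log.exception("transfer failed")  # captured in the log for follow-up
        raise

# Demo with in-memory stand-ins for the Oracle source and MySQL target:
target = []
n = run_transfer(
    lambda: [{"id": 1}, {"id": 2}],                      # extract
    lambda r: {**r, "migrated": True},                   # transform
    lambda rows: (target.extend(rows), len(target))[1],  # load, returns count
)
```

Separating the three stages behind callables also makes the "as and when available" case a matter of swapping the trigger, not the script.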
...pipeline for a project I'm currently working on. Key Project Details: - Data Type: The project primarily deals with structured data, so familiarity and expertise with processing structured datasets is crucial. - Data Storage: The processed data will be stored in a SQL database. Experience with SQL and database management is necessary. - Data Source: The primary source of the data is APIs. Proficiency in working with APIs and extracting data from them is important. Ideal Skills and Experience: - Strong background in data engineering - Proven experience in building end-to-end data pipelines - Proficient in SQL and managing SQL databases - Experienced in processing structured data - Skilled in working with APIs - Knowledge of data warehousing concepts would be ...
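A minimal sketch of the load stage under these assumptions, using SQLite for brevity and a stubbed record list in place of the API response; the `events` table and its columns are hypothetical:

```python
import sqlite3

def load_records(conn, records):
    """Load structured API records into a SQL table. The upsert on the
    primary key makes re-running the pipeline safe (idempotent loads)."""
    conn.execute("CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY, payload TEXT)")
    conn.executemany(
        "INSERT INTO events (id, payload) VALUES (:id, :payload) "
        "ON CONFLICT(id) DO UPDATE SET payload = excluded.payload",
        records,
    )
    conn.commit()

# In production the records would come from the API
# (e.g. requests.get(url).json()); here a stub stands in for the HTTP call.
records = [{"id": 1, "payload": "a"}, {"id": 2, "payload": "b"}]
conn = sqlite3.connect(":memory:")
load_records(conn, records)
load_records(conn, records)  # re-run: upsert keeps the table at 2 rows
count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
```

Making loads idempotent like this is a common defensive choice when the API source can re-deliver the same records.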
...configuration of Apache NiFi. Designing and managing data flow pipelines. Proficiency in NiFi Processors, such as: Data ingestion: GetFile, GetHTTP, GetKafka, etc. Data transformation: ConvertRecord, ReplaceText, etc. Data distribution: PutFile, PutKafka, PostHTTP, etc. Knowledge of NiFi Registry for version control and deployment of flow definitions. 2. Data Integration and Management: Experience with ETL (Extract, Transform, Load) processes. Understanding of data streaming and batch processing. Ability to connect to various data sources (APIs, databases, file systems, cloud storage). 3. System Administration: Proficiency in server setup and management on Linux/Windows. Knowledge of networking and firewalls for secure NiFi access. Experience with SSL/TLS for secure data transmis...