Web scraping using SQL jobs
I need an expert in ASP.NET and SQL Server to assist with migrating my website and database from GoDaddy to a new host, Ionos.com. A malware scan is also needed. I'm not sure of the exact number of tables, but it's around 20; it's not a large database. SSL needs to be activated.
Key Requirements:
- Full migration of ASP.NET files and SQL Server database
- Ensure all data is accurately transferred and the website functions seamlessly on the new host
Skills and Experience:
- Proven experience with ASP.NET
- Extensive knowledge and experience with SQL Server
- Familiarity with GoDaddy and Ionos hosting services
- Strong troubleshooting skills to address any potential issues during migration
For our second project, we need to modify the UI of an existing GitHub repository that isn’t ours. We already have a Figma design prepared (some screenshots are attached, and detailed files will be shared during our discussions). The front-end technology stack is Vue.js. After updating the UI, the modified project should first be deployed on the developer’s own test server for evaluation. Once everything is confirmed to be functioning correctly, it will then be deployed to our server. Please ensure that the front-end and logo remain consistent, with no appearances of the old logo or name upon refresh. The GitHub repository can be found here:
I'm looking for a PHP expert to create a script or class that generates PDF/A-1a compliant files from the text stored in my database. The content of the PDF will primarily be text, with some logos and icons as images, as well as tables and bullet lists.
Key Features:
- The PDFs will gener...an existing PDF/A-1a document at the end.
Additionally, I would like to know whether it's also possible to have the document exportable to Word with the same characteristics, for future saving as a PDF/A-1a. Ideal candidates should be well-versed in PHP/MySQL and have experience with PDF generation and compliance standards. Attached is a simple example of a document to create as PDF/A-1a and the SQL of the tables with document name, index entry, text, and tables used in the PDF, and can be...
...Twitter, LinkedIn)
- Manage strong, targeted social media campaigns.
- Oversee and manage paid advertisements.
- Assist in purchasing followers to increase visibility.
- Email marketing (in charge of scraping details for target markets)
- Also creating music and uploading it to YouTube, Spotify, and other music platforms; in charge of creating AI-generated music videos for multiple countries (faceless YouTube channels).
This project involves two app launches, each targeting a different market. Therefore, flexibility and understanding of various market segments is crucial.
Ideal Skills:
- Proficient in web development
- Experienced in social media management
- Skilled in content creation and digital marketing
- Knowledgeable in managing paid adverts and purchasing followe...
I need a freelancer who can help me scrape users from specific public Telegram groups and add them to my Telegram group.
I need a professional web scraper to extract contact information from multiple pages of a website and export the data into an Excel spreadsheet. The specific contact information to be extracted includes:
- Process number
- Face value
- Creditor's name
- Creditor's CPF
Ideal candidates for this project should have:
- Proven experience in web scraping
- Proficiency in Excel
- Attention to detail to ensure all data is accurately captured.
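As a rough illustration of the kind of deliverable this post describes, here is a minimal Python sketch, assuming the records sit on a paginated HTML listing; the URL, page count, and every CSS selector are hypothetical placeholders that would have to be adapted to the real site.

```python
# Minimal sketch: paginated scrape into an Excel file (requires openpyxl for to_excel).
# The URL, page range, and CSS selectors below are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup
import pandas as pd

BASE_URL = "https://example.com/processes"  # placeholder
rows = []

for page in range(1, 51):  # assumed number of pages
    resp = requests.get(BASE_URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    for item in soup.select("div.record"):  # placeholder selector
        rows.append({
            "process_number": item.select_one(".process-number").get_text(strip=True),
            "face_value": item.select_one(".face-value").get_text(strip=True),
            "creditor_name": item.select_one(".creditor-name").get_text(strip=True),
            "creditor_cpf": item.select_one(".creditor-cpf").get_text(strip=True),
        })

pd.DataFrame(rows).to_excel("creditors.xlsx", index=False)
```
A production version would also need handling for missing fields and a polite delay between page requests.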
I'm seeking a professional adept in web scraping and macro automation. The project primarily involves:
- Scraping financial data from company websites.
- Extracting contact details from these websites.
Ideal candidates should have extensive experience in data scraping, with a strong understanding of financial data and company websites. Proficiency in macro automation is a must. Please provide examples of similar projects you've completed in your proposal.
1) Collect data, examine it, and identify any duplicates or outliers, ensuring its completeness and accuracy.
2) Create a table and organize the data in a clear and easy-to-understand format.
3) Provide statistical analysis using mean, median, mode, variance, standard deviation, and other relevant metrics.
4) Present charts that illustrate the relationships between variables.
5) Present charts that help predict future trends, identify gaps, and improve performance and operational efficiency.
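A brief sketch of how steps 1 and 3 are commonly handled with pandas; the file name and the "value" column are placeholder assumptions, not part of the brief.

```python
# Sketch of the duplicate/outlier checks and descriptive statistics with pandas.
# "data.csv" and the column name "value" are placeholders.
import pandas as pd

df = pd.read_csv("data.csv")

# Completeness and duplicates
print(df.isna().sum())        # missing values per column
print(df.duplicated().sum())  # number of fully duplicated rows

# Simple outlier flag using the 1.5 * IQR rule on a numeric column
q1, q3 = df["value"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["value"] < q1 - 1.5 * iqr) | (df["value"] > q3 + 1.5 * iqr)]
print(len(outliers), "potential outliers")

# Descriptive statistics: mean, median, mode, variance, standard deviation
summary = pd.DataFrame({
    "mean": [df["value"].mean()],
    "median": [df["value"].median()],
    "mode": [df["value"].mode().iloc[0]],
    "variance": [df["value"].var()],
    "std_dev": [df["value"].std()],
})
print(summary)
```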
I'm looking for a one-time web scrape of an eBay page containing 5,500 products. The products do not have any variants, which should simplify the task.
Data to be scraped:
- Product name, price, category, subcategory, brand
- Product description and images
The final deliverable should be an Excel/CSV file containing all the scraped data.
Ideal skills for this project include:
- Proficient in web scraping
- Experienced in data organization and Excel/CSV
Please, only bid if you can deliver quality work in a timely manner. Thank you!
I'm looking for a skilled data scraper who can collect both organic and paid Google search results for me. The data will need to be delivered in CSV format.
Ideal Skills and Experience:
- Proficient in web scraping tools and techniques
- Experience with data extraction from Google
- Knowledge of SEO and understanding of organic vs paid results
- Ability to deliver data in specified formats, particularly CSV
- Prior experience with similar projects would be an advantage.
Please include examples of your previous work with Google data scraping in your proposal.
...code for a program which needs to be completed if there are any missing pieces.
2. You'll be responsible for programming the database, which will be SQL Server.
3. Finally, the completed program will need to be uploaded to a server, where you will also run it.
The code might need improvement in the following aspects:
- Functionality/features
- User interface
- Error handling and debugging
Part II: Apps Developer
I need skilled apps developers to create Android and iPhone apps based on the aforementioned program. Ideal skills and experience for the job include:
- Proficiency in .NET programming
- Extensive experience with SQL Server and MySQL databases
- Strong skills in Android and iPhone app development
- Able to improve code functionality and user interface...
I need a seasoned Octoparse 8 expert who can create various bots for scraping different e-commerce stores daily. The data will be used to stock my own WordPress store, so accuracy and timely updates are crucial.
Requirements:
- Develop different bots for different product categories, including Electronics, Home and Garden, Kitchen, and Makeup.
- Scrape specific data fields such as product title and description, price, and availability.
- Manage scraped data to include correct data fields and images.
The ideal candidate should have:
- Proven experience with Octoparse 8
- Previous work in e-commerce data scraping
- Familiarity with WordPress store setup
- Attention to detail for data accuracy
- Ability to deliver on a daily schedule.
I need column AB of this list populated: email addresses for the people in each row.
I'm in need of a reliable and efficient web scraper skilled in Selenium. The task will be to scrape an e-commerce site for clothing information on a daily basis.
Key Responsibilities:
- Scraping data on a daily basis
- Capturing product name and price, product images, and product descriptions
Ideal Skills:
- Proficiency in Selenium
- Experience in web scraping, particularly from e-commerce sites
- Ability to deliver consistent, reliable data on a daily basis
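For illustration only, a minimal Selenium sketch along these lines, assuming a single static listing page; the URL and CSS selectors are hypothetical, and the daily schedule would come from cron or Windows Task Scheduler rather than the script itself.

```python
# Minimal Selenium sketch for a daily product scrape; URL and selectors are placeholders.
import csv
from datetime import date
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # Selenium 4 manages the driver binary automatically
driver.get("https://example-shop.com/clothing")  # placeholder URL

products = []
for card in driver.find_elements(By.CSS_SELECTOR, "div.product-card"):  # placeholder selector
    products.append({
        "name": card.find_element(By.CSS_SELECTOR, ".title").text,
        "price": card.find_element(By.CSS_SELECTOR, ".price").text,
        "image": card.find_element(By.CSS_SELECTOR, "img").get_attribute("src"),
        "description": card.find_element(By.CSS_SELECTOR, ".description").text,
    })
driver.quit()

with open(f"clothing_{date.today()}.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "price", "image", "description"])
    writer.writeheader()
    writer.writerows(products)
```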
Scrape data from an auto parts website and create a CSV. You will also source images without copyright restrictions. The project must meet Shopify standards and be ready to upload. Scraping may cover over 15k SKUs.
I'm seeking a skilled data scraper who can gather product descriptions and pricing information from a specified auto parts website.
Key Requirements:
- Scraping data for more than 2000 products
- Creating a well-structured CSV file
- Formatting product descriptions in plain text
Ideal Skills:
- Proficient in web scraping tools and techniques
- Experienced in data organization and CSV creation
- Attention to detail to ensure accurate data collection
Please only apply if you have previous experience with web scraping and can provide examples of similar projects.
I am looking for a Python expert who can help me automate sending direct messages on Instagram and other social platforms.
Ideal Skills:
- Proficient in Python
- Experience with Instagram's API or similar automation tools
- Understanding of Python libraries for web scraping and data handling
Key Requirements:
- Create a script that can send pre-defined messages to a list of users at specified intervals to avoid being flagged as spam.
- Ensure the script can handle potential errors, such as users with private accounts or those who have blocked the sender.
- Provide a user-friendly guide for running the script on a local machine.
Please note, adherence to Instagram's terms of service is a must. Your proposal should highlight any relevant projects you've compl...
...possible, with minimal overhead on the database. I will provide an Excel spreadsheet with simple formulas required to replicate the logic. The extension involves implementing approximately 9 additional calculations and updating a different table using simple IF statements based on the calculated values.
Key Requirements:
• Extend the existing MySQL cursor, which already has a loop set up for fetching values.
• Implement about 9 additional calculations based on simple formulas provided in an Excel spreadsheet.
• Update a different table using simple IF conditions based on the calculated values.
• Ensure the solution is optimized for minimal database overhead and runs as efficiently as possible.
• Deliver the source code with minimal documenta...
I need an experienced creator for a 5-minute 4K 3D animation video of the Airbus A350. This video should include scenes of passenger boarding, takeoff, and landing.
Key Requirements:
- High level of detail and accurate engineering representation.
- Inclusion of a modern/contemporary background score.
- Seamless integration of sound effects and a professional voiceover.
Deliverables:
- A 5-minute 4K video.
- Source files.
- A draft for review.
- Step-by-step documentation of the generative AI process used.
Ideal Skills:
- Proficient in 3D engineering animation.
- Strong understanding of the Airbus A350.
- Excellent audio-visual integration skills.
- Experience with creating detailed source files and documentation.
I'm looking for a skilled database programmer to create a stored procedure for data retrieval across five tables. This procedure should handle complex joins and maintain foreign key relationships.
Key Requirements:
- Design a stored procedure primarily for data retrieval.
- Proficient in managing joins across multiple tables.
- Understands and can maintain foreign key relationships.
Ideal Skills:
- Proficiency in SQL and experience with stored procedures.
- Strong understanding of database management and relational databases.
- Excellent problem-solving skills to handle complex data r...
...identify profitable wallets worth copying, based on reliable data metrics such as Profit and Loss (PnL), win rate, and performance over weekly/monthly timeframes. The current implementation is plagued by inaccurate or false data, resulting in wallets with near-zero balances being reported as highly profitable. Despite trying multiple APIs (Solana public API, GMGN API, and BirdEye API) and even scraping data from relevant websites, the results have been inconsistent and unreliable. This project requires a solution that not only filters out fake or non-credible wallets but also provides a robust mechanism to analyze and verify real-world wallet performance.
Key Objectives:
1. Data Accuracy:
- The output data must accurately reflect the true wallet performance on th...
I'm in need of an AI agent tailored for market research purposes. This agent will be responsible for scraping specific product pricing from Australian supermarkets, focusing on household items. The data collected should encompass a comprehensive set of product details including price, promotional price, weight, dimensions, colour, and additional product information. Ideal Skills: - AI/ML Development - Web Scraping Expertise - Data Analysis - Market Research Understanding The AI should collect this data on a monthly basis. Prior experience in developing similar projects would be highly regarded, as well as a deep understanding of market research methodologies and requirements.
I'm in need of a proficient freelancer or a professional agency capable of curating and delivering a detailed database of 7,000 US companies. Each company pr...Owner email, Founder email etc.) along with the website.
Key Requirements:
1. All data must be current and verified for precision.
2. Data must be sourced from credible platforms to ensure uniqueness, eliminating any duplication.
3. Final data should be presented in an orderly Excel or Google Sheets format.
4. Timely delivery within 7 days from the project start date.
5. Proficient use of data scraping and verification tools and platforms.
Please ensure to read the project description carefully before bidding. If you're not in agreement with the budget or are unable to fulfill the requirements, kindly refrain from...
...scripts using AWS Lambda to fetch data from APIs. Handle API rate limits, errors, and retry mechanisms. Store raw data temporarily in Amazon S3.
Data Parsing: Clean and structure raw data for further processing. Ensure data validity and store parsed data in S3.
Data Transformation: Enrich and reshape parsed data for analysis or database upload. Store intermediate results in S3 for staging.
Data Upload to SQL Database: Save transformed data into Amazon RDS (PostgreSQL/MySQL) with optimized database inserts. Ensure data integrity and performance.
Data Analysis and SQL Transformations: Execute SQL procedures and queries in RDS for additional analysis and derived insights. Store final results in the database for website access.
RESTful API Development: Build a se...
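As a hedged sketch of a single fetch-stage-load Lambda in this pipeline, assuming a MySQL-flavoured RDS instance and a Lambda layer that provides the third-party packages; the API URL, bucket, table, and credentials are placeholders, and the retry logic is deliberately simplified.

```python
# Sketch of one Lambda stage: fetch with retries, stage raw JSON in S3, load rows into RDS.
import json
import time
import boto3
import pymysql   # placeholder: assumes a MySQL-flavoured RDS instance
import requests

s3 = boto3.client("s3")

def handler(event, context):
    # Fetch with a simple retry/backoff loop to respect API rate limits
    for attempt in range(3):
        resp = requests.get("https://api.example.com/data", timeout=30)  # placeholder API
        if resp.status_code != 429:
            break
        time.sleep(2 ** attempt)
    resp.raise_for_status()
    payload = resp.json()

    # Stage the raw response in S3
    s3.put_object(Bucket="my-raw-bucket", Key="raw/data.json", Body=json.dumps(payload))

    # Parse and load rows into RDS; table and columns are placeholders
    conn = pymysql.connect(host="my-rds-endpoint", user="app", password="***", database="analytics")
    with conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO records (id, value) VALUES (%s, %s)",
            [(r["id"], r["value"]) for r in payload.get("records", [])],
        )
    conn.commit()
    conn.close()
    return {"rows_loaded": len(payload.get("records", []))}
```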
I need a skilled programmer to automate login on a specific website. The primary goal is to extract data, with the volume of requests ranging from 20,000 to 100,000 per week.
Key Requirements:
- The automation script must be able to interact with a website.
- The automation must be capable of bypassing DataDome CAPTCHA challenges.
Ideal Skills:
- Proficient in web scraping and automation tools
- Experience dealing with CAPTCHA bypass solutions.
- Understanding of ethical and legal considerations in data extraction.
The goal is to extract data systematically and efficiently. Please ensure your proposal outlines your relevant experience and approach to this project.
I'm looking for a quick Python script that can help me automate the reading and databasing of LAS file format headers using the LASIO library. The product tables should be exported in CSV format.
- Read a library of LAS files
- Ingest and export CSV files
The task is straightforward: there's no need for specific processing or data transformations on the files before exporting to CSV. The ideal freelancer for this job should have:
- Proficiency in Python, particularly in file handling and CSV manipulation
- Experience with automating tasks using Python scripts
- Ability to deliver a simple and effective script in a short timeframe.
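A minimal sketch of the requested script, assuming the headers of interest are the ~Well section items that lasio exposes; the folder and output file names are placeholders.

```python
# Read every LAS file in a folder with lasio and dump the ~Well header items to one CSV.
import csv
import glob
import lasio

rows = []
for path in glob.glob("las_library/*.las"):  # placeholder folder
    las = lasio.read(path)
    for item in las.well:  # header items from the ~Well section
        rows.append({
            "file": path,
            "mnemonic": item.mnemonic,
            "unit": item.unit,
            "value": item.value,
            "description": item.descr,
        })

with open("las_headers.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["file", "mnemonic", "unit", "value", "description"])
    writer.writeheader()
    writer.writerows(rows)
```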
...process of registering vehicle license plates for visitors at our parking platform. Currently, the process is manual, and we would like to integrate voice-to-text and SMS-based automation to streamline registrations and improve customer experience.
Key Responsibilities:
Voice-to-Text Integration:
- Set up a system that can handle incoming calls, convert the spoken license plate information into text (using Twilio Speech Recognition), and automatically register the details in our backend system.
- Send an automated confirmation SMS or email to the visitor after successful registration.
SMS Registration Automation:
- Integrate Twilio’s SMS API to automatically process incoming SMS with the license plate number.
- Parse the SMS content to extract the license plate number and valida...
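To illustrate the shape of this integration, a hedged Flask sketch of the two Twilio webhooks; the endpoint paths, the register_plate() backend call, and the reply wording are placeholder assumptions, not the platform's actual API.

```python
# Sketch of a voice webhook (speech Gather) and an SMS webhook for plate registration.
from flask import Flask, request
from twilio.twiml.voice_response import VoiceResponse, Gather
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

def register_plate(plate: str, caller: str) -> None:
    ...  # placeholder: write the plate to the parking platform's backend

@app.route("/voice", methods=["POST"])
def voice():
    vr = VoiceResponse()
    gather = Gather(input="speech", action="/voice/result", timeout=5)
    gather.say("Please say the license plate you would like to register.")
    vr.append(gather)
    return str(vr)

@app.route("/voice/result", methods=["POST"])
def voice_result():
    plate = (request.values.get("SpeechResult") or "").replace(" ", "").upper()
    register_plate(plate, request.values.get("From", ""))
    vr = VoiceResponse()
    vr.say(f"Thank you. Plate {plate} has been registered.")
    return str(vr)

@app.route("/sms", methods=["POST"])
def sms():
    plate = request.values.get("Body", "").strip().upper()
    register_plate(plate, request.values.get("From", ""))
    reply = MessagingResponse()
    reply.message(f"Plate {plate} registered for your visit.")
    return str(reply)
```
Twilio posts the transcribed speech back as SpeechResult when the Gather verb uses speech input, which is what the /voice/result handler reads.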
I need an expert in data scraping to create Parsehub templates for extracting product information from a retailer's website. The specific product details I want captured include:
- Product names and prices
- Product descriptions and reviews
- Product images and specifications
The data will be used for catalog comparison.
Ideal skills for this job include:
- Proficiency in using Parsehub
- Experience in scraping data from e-commerce websites
- Knowledge in handling product-related data
- Ability to create effective templates for recurring tasks
Please bid only if you have the relevant experience and skills. Thank you.
I'm looking for a skilled data scraper who can extract information from exhibitor profiles on a trade website. The specific information I need includes:
- State and Local Destination Marketing Organisations: name and contact details (primarily email addresses and phone numbers)
I would prefer freelancers with prior experience in web scraping who are familiar with data extraction software. Please include examples of similar projects you've completed in your bid. Confirm your price, and only apply if you can complete by Monday 9am GMT.
I'm in need of a proficient data collector who specializes in web scraping. The primary focus will be on gathering market data from various e-commerce sites.
Target Audience:
- Ages 18 to 35 years old
- Office job professionals
- Followers of global and economic news
- Interested in cryptocurrencies and trading
- Previously or currently involved in online work
Ideal Skills and Experience:
- Extensive experience in web scraping, particularly from e-commerce platforms
- Strong understanding of market data and its significance
- Proficiency in data analysis and interpretation
- Familiarity with data collection tools and software
- Ability to deliver accurate and timely data
I just need the Name and Email. Please provide examples of similar projects you've ...
...with new design) in terms of colors and modern aesthetics, with an emphasis on ease of navigation and responsiveness.
Old website:
Figma design:
Key Requirements:
- All components from the old site must be retained and linked to the same DB, but with a new identity.
- The design must be compatible with the Figma design and implemented using .NET 8, SQL Server DB, JavaScript, and AJAX (in search).
- Two language versions are required: Arabic and English.
- All website functionalities
Please see the attached image "" in the brief. We need a similar graphic design using our TWO logos, so we will need two different files for this.
Logos: Red Devil, GI Block
Our colors are Scarlet (Red) and Grey.
On the Red Devil version, please put GROSSE ILE right under the graphic, as in sample 2.
Please take the Red Devil logo and make a similar graphic to sample 1. Then take the GI block letter logo and make a similar graphic to the sample 1 photo. The mascot should not be a lion on the submissions.
Additional note:
Graphic 1: It should be a devil mascot with “THEY NOT LIKE US”.
Graphic 2: Then a GI Block logo with “THEY NOT LIKE US”.
I'm looking for a proficient Terraform developer to create a script for Google Cloud. We need the script to create these items:
1. Networking
2. GKE cluster with one node
3. Persistent volume and claim for GKE
4. PostgreSQL service - needs to be accessible from GKE
5. API Gateway - if the API URL is /example1, run a lambda function; otherwise forward the API request to /endpoint/{incoming URL} (a service running in GKE)
6. IAM Role Binding for cluster access
...evident. Please apply only if you meet ALL the requirements below.
Tech Requirements
1. Fluent English (native speaker preferred).
2. Minimum 4 years’ experience with PHP/Symfony, including developing complex websites and backend booking systems.
3. Proven experience with booking systems involving calculations, GPS tracking, and dynamic graphical displays.
4. Advanced knowledge of scraping technologies, including techniques to bypass IP blocking and Captcha.
5. Independent developer: I expect you to work solely with me (no outsourcing or staff substitutions).
Role Expectations
- Collaborative and technically engaging work environment.
- No urgent deadlines; the focus is on high-quality code.
- Knowledge sharing: I expect to learn from you...
I need an existing PCB design re-created from provided Gerber files. The project involves a PCB with 220x LEDs.
You will be supplied with:
- Gerber, component position, and BOM files of the design
- A reference design to emulate
The expected output is:
- Comprehensive KiCAD files (including schematics and PCB design files)
- The original design with added appropriate heatsink pads
Ideal skills and experience for this job include:
- Proficiency in using KiCAD
- Experience in PCB design
- Ability to incorporate heatsink pads based on thermal requirements
Please note, while you will have access to a reference design, I expect you to use your best judgment and expertise to create a functional and effective PCB design. Duration to comple...
I'm seeking a developer who can create a Google Chrome web extension to scrape mobile numbers from various sources such as web pages, social media profiles, and databases. The extension should also be equipped with the following features:
- Data Export: The ability to export the scraped data to a CSV file.
- Automated Data Cleaning: The extension should have a feature for automating the cleaning of the scraped data.
Ideal candidates should have prior experience in developing web extensions and should be adept at data scraping. Knowledge of data cleaning techniques will be an added advantage.
I'm in urgent need of a cyber security specialist who can perform thorough penetration testing on my React web application. The primary goal is to identify any potential security vulnerabilities, particularly:
- Cross-Site Scripting (XSS)
- SQL Injection
- Cross-Site Request Forgery (CSRF)
Your expertise in assessing these specific types of vulnerabilities will be crucial. The ideal candidate would also have experience in Quality Assurance, particularly with automated testing tools such as Selenium and Cypress. Your role will not only involve identifying security vulnerabilities, but also ensuring the overall code quality through automated testing. Please include in your proposal your relevant experience and any examples of previous work that may be applicable. Tha...
I'm looking for an expert in Human-Computer Interaction (HCI) and Raspberry Pi development to create a gesture recognition system that interprets hand movements for controlling devices.
Key Requirements:
- Develop a gesture recognition system on the Raspberry Pi 4B.
- System should accurately recognize and interpret hand movements and body movements.
- Integrate functionality to control various devices through hand gestures.
- Development of the system using AI and ML.
Ideal Skills:
- Proficient in Raspberry Pi programming.
- Extensive experience in developing HCI systems.
- Knowledge in gesture recognition technologies.
- Strong background in interactive ...
I need a freelancer to set up a daily backup of my Microsoft SQL Server database from a dedicated Windows server to AWS S3 storage. The backup files need to be compressed.
Ideal Skills and Experience:
- Proficiency in Microsoft SQL Server
- Experience with AWS S3
- Knowledge of Windows Server
- SQL database management
- Data compression techniques
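One possible shape for the daily job, sketched in Python: a native compressed backup via T-SQL, then an upload with boto3. The server, database, local path, and bucket names are placeholders, and WITH COMPRESSION assumes an edition of SQL Server that supports it; scheduling would be handled by Windows Task Scheduler or SQL Server Agent.

```python
# Compressed native SQL Server backup, then upload the .bak file to S3.
import datetime
import boto3
import pyodbc

db = "MyDatabase"                              # placeholder database name
stamp = datetime.date.today().isoformat()
backup_path = rf"D:\Backups\{db}_{stamp}.bak"  # local path on the Windows server

# BACKUP DATABASE cannot run inside a transaction, so open the connection with autocommit
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;Trusted_Connection=yes;",
    autocommit=True,
)
conn.execute(f"BACKUP DATABASE [{db}] TO DISK = N'{backup_path}' WITH COMPRESSION, INIT;")
conn.close()

# Upload the compressed backup to S3
boto3.client("s3").upload_file(backup_path, "my-backup-bucket", f"sqlserver/{db}_{stamp}.bak")
```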
I'm in need of a proficient web developer who can create a web scraping bot for me. The bot should scrape text data from multiple websites and store it in an HTML file.
Key Requirements:
- The bot should only scrape text, not images.
- It should be capable of extracting data from various websites.
- The bot's operation is a one-time task, so it should be efficient and able to complete the task without supervision.
Ideal Skills:
- Strong experience in web scraping and bot development.
- Proficiency in HTML and understanding of its structure.
- Ability to work with multiple websites and understand their layout.
I'm looking for someone who can deliver a reliable and effective scraping bot. Please include examples of similar projects...
Critical requirements: OpenAI, Embeddings, Vector databases, Scraping
*** If you are interested, you must be able to show me a previous related (OpenAI embedding) project in your portfolio. Please send me a notification with a link to this project ****
General Concept
We are building a data-driven content creation system for our agency / clients. The project involves extracting content from client websites, structuring it into JSON packets, tagging it with metadata via a user-friendly web interface, and using OpenAI's embeddings with a vector database and GPT APIs to retrieve and generate new content. This will be a new independent prototype project to get the integrations and workflow set up and working correctly. Full-Stack Developer (Preferred One-Dev Soluti...
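A minimal embed-and-retrieve sketch of the core loop, using the OpenAI Python SDK with a plain NumPy cosine search standing in for the vector database; the model name and the sample content packets are placeholders.

```python
# Embed content packets, then retrieve the most relevant one for a query by cosine similarity.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

# Example "content packets" extracted from a client site (placeholders)
packets = [
    "About us: boutique travel agency specialising in alpine tours.",
    "Service page: guided glacier hikes with certified mountain leaders.",
]
index = embed(packets)

query_vec = embed(["Write a blog intro about glacier hiking"])[0]
scores = index @ query_vec / (np.linalg.norm(index, axis=1) * np.linalg.norm(query_vec))
print("Most relevant packet:", packets[int(np.argmax(scores))])
```
In the full system a vector database would replace the in-memory index, and the retrieved packets would be passed to a GPT call as context for generation.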
...warehousing. The project involves integrating various data sources, primarily AWS S3 and SQL databases, with Snowflake.
Key Responsibilities:
- Data Warehousing: You'll be primarily focusing on the data warehousing aspect of Snowflake.
- Data Integration: Integrating AWS S3, SQL databases, DBT, and Snowpark with Snowflake.
- Data Transformation: Your main use of dbt will be for data transformation.
Ideal Skills:
- Proficient in Snowflake, dbt, and AWS.
- Experience with data warehousing and integration.
- Hands-on experience with live projects using these technologies.
- Strong understanding of data transformation processes.
Successful candidates will have previous experience working on live projects using all these technologies. Please include examp...
I need a Python Data Analyst skilled in processing data from an SQL database and conducting descriptive statistics.
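A brief sketch of that workflow, assuming SQLAlchemy for the connection and a SQLite file purely for illustration; the connection string, table, and column names are placeholders.

```python
# Pull a table from a SQL database into pandas and run descriptive statistics.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///analytics.db")  # placeholder connection string
df = pd.read_sql("SELECT * FROM sales", engine)   # placeholder table

print(df.describe())  # count, mean, std, min, quartiles, max for numeric columns
print(df.groupby("region")["amount"].agg(["mean", "median", "std"]))  # placeholder columns
```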
...background in web scraping and web development, with proficiency in various Python libraries and frameworks such as BeautifulSoup, Requests, Scrapy, and Selenium WebDriver.
Responsibilities:
- Build and maintain automated web scraping solutions using Python and Selenium.
- Scrape data from websites and efficiently process it.
- Develop and integrate databases with frontend applications.
- Work closely with the frontend team to ensure seamless integration and functionality.
- Collaborate in building and maintaining the business’s web-based systems.
- Work on web development projects using Python and associated tools.
- Ensure code is scalable, maintainable, and efficient.
Required Skills:
- Strong experience i...
I need assistance with manual data entry into an SQL database. The project involves working with database entries, so experience with SQL and data handling is essential.
Ideal Skills:
- Proficient in SQL
- Manual data entry experience
- Attention to detail
- Data handling skills
- Able to work independently and meet deadlines
I'm in need of a Python expert who can help scrape data from various e-commerce websites. The primary focus will be on gathering product details and pricing information on a weekly basis.
Key responsibilities:
- Scrape product details and pricing information from e-commerce websites
- Perform this data extraction on a weekly schedule
Ideal skills:
- Proficiency in Python and web scraping tools
- Experience with data extraction from e-commerce websites
- Ability to work on a consistent schedule
I'm looking for a professional who can help me build a database by scraping sales data from a password-protected website. I attached an example. I need the dashboard to self-populate with the data that is held in the "Clearview" and "TimInsights" tabs. The data in both of these tabs needs to be screen-scraped from its respective website. Once the data is created, I want it to be able to create automated PowerPoint presentations with the respective data.
Key Requirements:
- Proficient in screen scraping techniques
- Experience with building and managing databases
- Ability to handle data from password-protected sites
I have the necessary login credentials for the website. The data needs to be scraped on a weekly basis. Please reach out if you have ...
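A hedged sketch of the two halves of this task: an authenticated download with a requests session, and a one-slide PowerPoint built with python-pptx. The login URL, form fields, and the assumption that the Clearview data sits in the first HTML table are all placeholders.

```python
# Log in with a session, pull a tabular page, and render it into a PowerPoint slide.
import io
import requests
import pandas as pd
from pptx import Presentation
from pptx.util import Inches

session = requests.Session()
session.post("https://portal.example.com/login",            # placeholder login URL
             data={"username": "user", "password": "***"})  # placeholder form fields
html = session.get("https://portal.example.com/clearview").text
df = pd.read_html(io.StringIO(html))[0]  # assumes the data is in the first HTML table

prs = Presentation()
slide = prs.slides.add_slide(prs.slide_layouts[5])  # title-only layout
slide.shapes.title.text = "Clearview weekly summary"
table = slide.shapes.add_table(len(df) + 1, len(df.columns),
                               Inches(0.5), Inches(1.5), Inches(9), Inches(5)).table
for c, col in enumerate(df.columns):
    table.cell(0, c).text = str(col)
for r, row in enumerate(df.itertuples(index=False), start=1):
    for c, value in enumerate(row):
        table.cell(r, c).text = str(value)
prs.save("weekly_report.pptx")
```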
I need a Python-based web scraper to collect comprehensive product information from various Australian websites including , , , , and davidjones.com.au. The specific data points I need include:
- Product names and prices
- Product descriptions and reviews
- Product availability and stock levels
- Images
The scraped data should be delivered in a JSON format on a daily basis.
Ideal skills and experience for this project:
- Proficiency in Python, particularly in web scraping libraries such as BeautifulSoup and Scrapy
- Experience in scraping e-commerce sites
- Knowledge of JSON and data structuring
- Ability to set up a daily automated scraping task
Please provide examples of similar projects you've completed in your proposal.
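For scale, a minimal sketch of one daily run writing a dated JSON file; the retailer URL and CSS selectors are hypothetical, and the daily trigger would come from cron or a task scheduler rather than the script.

```python
# One daily scrape of a listing page, written to a dated JSON file.
import json
from datetime import date
import requests
from bs4 import BeautifulSoup

def scrape_listing(url: str) -> list[dict]:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    items = []
    for card in soup.select("div.product-tile"):  # placeholder selector
        items.append({
            "name": card.select_one(".name").get_text(strip=True),
            "price": card.select_one(".price").get_text(strip=True),
            "availability": card.select_one(".stock").get_text(strip=True),
            "image": card.select_one("img")["src"],
        })
    return items

if __name__ == "__main__":
    data = scrape_listing("https://www.example-retailer.com.au/sale")  # placeholder URL
    with open(f"products_{date.today()}.json", "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2)
```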