
Freelance Data Scraping Engineer (Python)

Mindrift
Posted 5 days ago, valid for 14 days
Location

New York, NY 10008, US

Salary

$32 per hour

Contract type

Part Time

By applying, a Sonicjobs account will be created for you. Sonicjobs's Privacy Policy and Terms & Conditions will apply.

Sonic Summary

  • Mindrift is seeking Python Data Scraping Engineers for the Tendem project, focusing on specialized data scraping workflows within a hybrid AI and human system.
  • Candidates should have a minimum of 3 years of relevant experience in data engineering, web scraping, automation, or software development.
  • This part-time remote freelance role offers compensation of up to $32 per hour, depending on contribution level and expertise.
  • Key responsibilities include managing end-to-end data extraction workflows and ensuring data quality through validation and verification processes.
  • The position requires strong Python web scraping skills and offers the opportunity to work flexibly while contributing to innovative AI projects.

Mindrift is looking for highly skilled Python Data Scraping Engineers to join the Tendem project and drive specialized data scraping workflows within our hybrid AI + human system.

In this role, which Mindrift refers to as an AI Pilot, you'll collaborate with Tendem Agents that handle repetitive tasks while you provide the critical thinking, domain expertise, and quality control needed to deliver accurate, actionable results.

This part-time remote opportunity is ideal for technical professionals with hands-on experience in web scraping, data extraction and processing.

What We Do

The Mindrift platform connects specialists with AI projects from major tech innovators. Our mission is to unlock the potential of Generative AI by tapping into real-world expertise from across the globe.

About the Role

This is a freelance role on the Tendem project. As a Python Data Scraping Engineer, you'll handle data scraping tasks that demand technical precision in web extraction and processing, using provided tools such as Apify and OpenRouter alongside your own approaches.

Key Responsibilities

  • Own end-to-end data extraction workflows across complex websites, ensuring complete coverage, accuracy, and reliable delivery of structured datasets.
  • Leverage internal tools (Apify, OpenRouter) alongside custom workflows to accelerate data collection, validation, and task execution while meeting defined requirements.
  • Ensure reliable extraction from dynamic and interactive web sources, adapting approaches as needed to handle JavaScript-rendered content and changing site behavior.
  • Enforce data quality standards through validation checks, cross-source consistency controls, adherence to formatting specifications, and systematic verification prior to delivery.
  • Scale scraping operations for large datasets using efficient batching or parallelization, monitor failures, and maintain stability against minor site structure changes.
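The batching, retry, and failure-monitoring responsibilities above can be sketched in a few lines of Python. This is a minimal illustration only, not the project's actual tooling: `fetch_page` is a hypothetical stub standing in for a real HTTP request or Apify actor call, and the URLs are invented.

```python
import concurrent.futures
import time
from typing import Optional


def fetch_page(url: str) -> str:
    """Hypothetical stub standing in for a real HTTP request or Apify call."""
    if url.endswith("/flaky"):
        raise ConnectionError("transient failure")
    return f"<html><title>{url}</title></html>"


def fetch_with_retry(url: str, retries: int = 3, delay: float = 0.0) -> Optional[str]:
    """Retry transient failures with exponential backoff; give up gracefully."""
    for attempt in range(retries):
        try:
            return fetch_page(url)
        except ConnectionError:
            time.sleep(delay * (2 ** attempt))
    return None  # the failure stays visible instead of being silently dropped


def scrape_batch(urls: list, workers: int = 4) -> dict:
    """Fetch a batch of URLs in parallel, recording the result per URL."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(urls, pool.map(fetch_with_retry, urls)))
```

Because failed URLs come back as `None` rather than raising, a monitoring pass can re-queue or report them without losing the rest of the batch.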

Compensation

On this project, contributors can earn up to $32 per hour equivalent, depending on their level and pace of contribution.

Compensation varies across projects depending on scope, complexity, and required expertise. Please note that other projects on the platform may offer different earning levels based on their requirements.

How to get started

Simply apply to this post, qualify, and get the chance to contribute to projects that match your technical skills, on your own schedule. From coding and automation to fine-tuning AI outputs, you’ll play a key role in advancing AI capabilities and real-world applications.

Requirements

  • At least 3 years of relevant experience in data engineering, web scraping, automation, or software development (required).
  • Bachelor's or Master’s Degree in Engineering, Applied Mathematics, Computer Science, or related technical fields is a plus.
  • Strong experience in Python web scraping (BeautifulSoup, Selenium or similar), including dynamic content (JS, AJAX, infinite scroll) and APIs via proxies.
  • Proven ability to extract data from complex structures (hierarchies, archived pages, inconsistent HTML).
  • Solid background in data cleaning, normalization, and validation, delivering structured datasets (CSV, JSON, Google Sheets).
  • Hands-on experience with LLMs and AI frameworks to enhance automation and problem-solving.
  • Strong attention to detail and commitment to data accuracy.
  • Self-directed work ethic with ability to troubleshoot independently.
  • A link to your GitHub profile is a plus.
  • English proficiency: Upper-intermediate (B2) or above (required).
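The cleaning, normalization, and validation skills listed above can be illustrated with a small, self-contained sketch that turns raw scraped records into structured CSV and JSON. The field names (`name`, `price`, `url`) and validation rules are invented for the example, not taken from the project.

```python
import csv
import io
import json

REQUIRED = ("name", "price", "url")  # example schema, not project-defined


def normalize(record: dict):
    """Clean one scraped record; return None if it fails validation."""
    row = {k: str(record.get(k, "")).strip() for k in REQUIRED}
    if not all(row.values()):
        return None  # reject records missing a required field
    try:
        # strip currency symbol and thousands separators, then coerce
        row["price"] = float(row["price"].lstrip("$").replace(",", ""))
    except ValueError:
        return None  # reject malformed prices
    return row


def to_outputs(records: list):
    """Deliver only validated rows, as a CSV string and a JSON string."""
    rows = [r for r in (normalize(rec) for rec in records) if r is not None]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=REQUIRED)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue(), json.dumps(rows)
```

Running the validation before delivery, rather than after, is what keeps inconsistent HTML and malformed fields from reaching the structured dataset.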

Why this freelance opportunity might be a great fit for you

  • Work fully remote on your own schedule with just a laptop and stable internet connection.
  • Gain hands-on experience in a unique hybrid environment where human expertise and AI agents collaborate seamlessly — a distinctive skill set in a rapidly growing field.
  • Participate in performance-based bonus programs that reward high-quality work and consistent delivery.


