Duration: Long Term Contract
W2 only
Work Arrangement:
- Hybrid: 3 days onsite at the American Express Tower, New York, NY.
Responsibilities:
- Design, develop, and deploy ETL pipelines in Python from the ground up.
- Write efficient, reusable, and modular code adhering to OOP principles.
- Design and optimize data models including schemas, entity relationships, and transformations.
- Develop and analyze SQL queries across multiple RDBMS platforms (SQL Server, DB2, Oracle).
- Work with data warehouses such as BigQuery and Databricks Delta Lakehouse for ingestion, cleansing, governance, and reporting.
- Create, manage, and monitor Airflow DAGs for scheduling and workflow automation.
- Collaborate with cross-functional teams to ensure data consistency, accuracy, and accessibility.
- Troubleshoot data issues and optimize ETL performance.
Qualifications:
- 6+ years of experience in Data Engineering, ETL Development, and Data Warehousing.
- Strong programming skills in Python with proven experience in ETL pipeline design.
- Expertise in SQL and RDBMS platforms (MS SQL Server, DB2, Oracle).
- Hands-on experience orchestrating workflows with Airflow.
- Solid understanding of OOP concepts and data modeling principles.
- Experience with BigQuery, Databricks Delta Lakehouse, or similar data warehouse technologies.
- Familiarity with Spark is a strong plus.
- Experience with IBM Apptio platform and cost management data workflows.
- Knowledge of Node.js for integration scripting or data API handling.
Flexible work from home options available.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
