
Database Developer

Greater Kansas City Community Foundation
Posted 4 days ago, valid for 16 days
Location

Kansas City, MO 64188, US

Salary

Competitive

Contract type

Full Time

By applying, a SonicJobs account will be created for you. SonicJobs' Privacy Policy and Terms & Conditions will apply.

Sonic Summary

  • The Database Developer position involves designing, building, and maintaining data integration pipelines for enterprise reporting and analytics.
  • Candidates should have 3-6 years of relevant experience and a bachelor's degree, with an emphasis on ETL/ELT processes and data pipeline management.
  • This full-time, exempt role offers opportunities for remote work after a successful training period and requires local residency in Kansas City, MO.
  • The salary for this position is competitive, commensurate with experience, and specific figures may be provided during the interview process.
  • Preferred qualifications include experience with Azure Data Factory and familiarity with cloud data warehouses or analytics platforms.

Description

The Database Developer is responsible for designing, building, and maintaining reliable data integration pipelines that power enterprise reporting, analytics, and operational systems. This role focuses on the consistent and secure movement of data across applications, warehouses, and cloud platforms.

The Database Developer works across the organization to provide reliable, analytics-ready data that informs effective decision-making. This is a full-time, exempt, salaried position reporting to the Director of Data Technologies. Candidates must be local to Kansas City, MO, and after a successful training period, there are opportunities to work remotely.

Requirements

Data Pipelines & Integrations

  • Design, build, and maintain scalable batch and near-real-time data pipelines.
  • Develop integrations using tools such as Azure Data Factory, Fabric Data Factory, SSIS, or equivalent technologies.
  • Maintain clear, well-documented data processes that ensure secure, reliable data delivery.

ETL / ELT Development

  • Develop structured ETL and ELT processes that support data warehouse models and downstream analytics.
  • Partner with staff to ensure data structures align with reporting and semantic model needs.

Scheduling & Automation

  • Manage orchestration, scheduling, and dependencies across data workflows.
  • Implement automation to improve reliability, monitoring, and recovery from failures.

AI Platform Readiness

  • Partner with leadership to evaluate, govern, and leverage AI-enabled capabilities within the data platform ecosystem. 
  • Leverage AI-assisted tooling to improve platform reliability and operations efficiencies. 

Data Reliability & Monitoring

  • Monitor pipeline execution and proactively identify failure patterns.
  • Implement improvements to increase resiliency, observability, and operational predictability.

Source System Integration

  • Work with application owners (e.g., financial systems, CRM platforms) to understand data structures, APIs, and integration requirements.
  • Maintain clear documentation for data flows, lineage, integration logic, and operational processes.
  • Support ingestion of files, APIs, JSON/XML payloads, and database sources.

Education & Experience

  • 3-6 years of related experience and a bachelor's degree; an equivalent combination of education and experience will be considered.

Required Technical Background

  • Hands-on experience developing and supporting ETL/ELT pipelines in Azure, SQL Server, or comparable data environments.
  • Strong SQL skills with the ability to troubleshoot data quality, transformation, and performance issues.
  • Experience integrating data from APIs, SaaS platforms, files, and relational databases (including JSON and XML payloads).
  • Experience managing job orchestration, scheduling, documentation, retries, and failure recovery for data workflows.
  • Familiarity with version control (Git) and deployment practices for data pipelines and integration code. 
  • Ability to monitor pipeline execution and proactively identify and remediate reliability issues. 

Preferred Background

  • Experience with Azure Data Factory, Fabric Data Factory, SSIS, or similar tools.
  • Exposure to cloud data warehouses or analytics platforms.
  • Experience working in regulated, audit-conscious, or highly governed environments.

Physical Requirements

  • Office & Computer Work: Ability to work regularly at a computer terminal in a fast-paced environment with frequent interruptions.
  • Noise & Communication: Able to work in an office with moderate noise levels. Ability to communicate and interpret detailed information effectively.

This job description serves as a summary of the employment-at-will relationship and is not a contract. Responsibilities may evolve, and other duties may be assigned as needed.




Learn more about this Employer on their Career Site

Apply now in a few quick clicks
