
Cloud Data Engineer

RadNet
Posted 19 days ago, valid for 15 days
Location

Owings Mills, MD 21117, US

Salary

$90,000 - $125,000 per year

Contract type

Full Time

By applying, a SonicJobs account will be created for you. SonicJobs' Privacy Policy and Terms & Conditions will apply.

Sonic Summary

  • RadNet is seeking a Cloud Data Engineer with over 5 years of data engineering experience, particularly in GCP.
  • The role involves developing cloud ELT pipelines, optimizing BigQuery, and implementing data quality measures using dbt.
  • Candidates should have strong skills in Python and SQL, along with hands-on experience with BigQuery, Dataflow, and Pub/Sub.
  • The position offers a salary range of $90,000.00 to $125,000.00 per year and includes comprehensive benefits such as medical, dental, and vision coverage.
  • A passion for patient care and effective communication skills are essential for success in this role.
Responsibilities

Artificial Intelligence; Advanced Technology; The very best in patient care. With decades of expertise, RadNet is Leading Radiology Forward. With dynamic cross-training and advancement opportunities in a team-focused environment, the core of RadNet's success is its people and their commitment to a better healthcare experience. When you join RadNet as a Cloud Data Engineer, you will join a dedicated team of professionals who deliver quality, value, and access in the 21st century and align all stakeholders (patients, providers, payors, and regulators) to achieve the best clinical outcomes.

 

You Will:

  • Own end-to-end development of cloud ELT pipelines — from CDC/API ingestion (Airbyte) and batch loads (Cloud Build/Composer) into BigQuery landing tables to dbt transformations that publish curated Silver/Gold datasets.
  • Design and document Medallion-style models (Bronze → Silver → Gold) in dbt with clear naming, sources, exposures, and owners; maintain dbt project structure, packages, and environments.
  • Implement data quality with dbt tests (generic/custom), freshness checks, and documentation (dbt docs); publish artifacts to the Analytics Hub/catalog for discoverability.
  • Optimize BigQuery for dbt: partitioning, clustering, materializations (incremental/merge), cost controls (slots/quotas), job monitoring, and query performance tuning.
  • Support streaming use cases by landing real-time data via Pub/Sub with BigQuery subscriptions (or Dataflow/Beam where required) and shaping near-real-time models with dbt incremental strategies.
  • Build orchestration and CI/CD for dbt using dbt Cloud or dbt Core (Cloud Build/Composer), with code review, automated tests, and artifact promotion across Dev/Test/Prod.
  • Partner with BI and AI/ML teams to expose trusted datasets and features; publish contract-backed schemas and semantic conventions aligned to enterprise KPIs.
  • Migrate and reconcile legacy on-prem pipelines (e.g., SQL Server CDC) into GCP; validate row-level fidelity and handle late-arriving data and schema evolution scenarios.
  • Implement security, privacy, and governance (IAM, CMEK, BigQuery row/object-level security, HIPAA); contribute to auditable data lineage (dbt exposures + warehouse lineage).
  • Establish monitoring/alerting for pipeline reliability (Cloud Monitoring/Logging), with SLAs/SLOs, retries/backfills, and incident runbooks; participate in on-call as needed.
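For candidates less familiar with the incremental/merge materialization mentioned above: it amounts to an upsert by unique key, as BigQuery's MERGE statement performs. A minimal Python sketch of the semantics (table names and keys here are hypothetical illustrations, not RadNet's schema):

```python
# Sketch of dbt's incremental "merge" semantics: rows in a new batch are
# upserted into the existing table by a unique key, matching what a
# BigQuery MERGE does under the hood.

def merge_incremental(target, new_rows, unique_key):
    """Upsert new_rows into target (both lists of dicts) on unique_key."""
    by_key = {row[unique_key]: row for row in target}
    for row in new_rows:
        by_key[row[unique_key]] = row  # update if the key exists, else insert
    return list(by_key.values())

# Example: a hypothetical "silver" orders table receiving a CDC batch.
silver_orders = [
    {"order_id": 1, "status": "pending"},
    {"order_id": 2, "status": "shipped"},
]
cdc_batch = [
    {"order_id": 2, "status": "delivered"},  # update to an existing row
    {"order_id": 3, "status": "pending"},    # brand-new row
]
merged = merge_incremental(silver_orders, cdc_batch, "order_id")
```

In a dbt model the same behavior is declared with `materialized='incremental'`, `incremental_strategy='merge'`, and a `unique_key` config rather than written by hand.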

If You Are:

  • Passionate about patient care, with sound judgment and the ability to remain professional in all situations.
  • Effective and professional in your communication and interpersonal interactions with patients, guests, and colleagues.
  • Structured in your work approach, able to understand complex problems and to prioritize work in a fast-paced environment.

To Ensure Success in This Role, You Must Have:

  • 5+ years in data engineering with GCP experience.
  • Strong Python and SQL.
  • Hands-on with BigQuery, Dataflow, Pub/Sub, and Dataproc.
  • Experience with dbt/Airflow/Composer for orchestration.
  • Experience integrating APIs and SaaS sources using Airbyte.
  • Knowledge of data formats (Parquet, Avro, JSON, Delta Lake).
  • Experience with MongoDB Atlas or Neo4j AuraDB preferred.
  • Familiarity with vector search or graph analytics preferred.
  • Healthcare data experience (HL7, FHIR) preferred.

We Offer:

  • Comprehensive Medical, Dental and Vision coverage.
  • Health Savings Accounts with employer funding.
  • Wellness dollars.
  • 401(k) Employer Match
  • Free services at any of our imaging centers for you and your immediate family.

 

Pay Range: $90,000.00 – $125,000.00 per year



