
Data Warehouse Engineer

Cadence Design Systems
Posted 2 months ago, valid for 17 days
Location

San Jose, CA 95103, US

Salary

$90,000 - $108,000 per year

Contract type

Full Time

By applying, a SonicJobs account will be created for you. SonicJobs' Privacy Policy and Terms & Conditions will apply.


Sonic Summary

  • Cadence is seeking a highly motivated Data Warehouse Engineer with at least 7 years of experience in the DW/BI space, including two end-to-end implementations.
  • The role requires expertise in SQL and Azure technologies, specifically Azure SQL and Azure Data Factory, along with strong data management and ETL skills.
  • Key responsibilities include data modeling, ETL development, designing data pipelines, and optimizing database performance to ensure data accuracy and integrity.
  • The position is located in San Jose, CA, and is an onsite opportunity that involves collaboration with business users to gather requirements and develop solutions.
  • The listing shows a salary range of $90,000 - $108,000 per year.

At Cadence, we hire and develop leaders and innovators who want to make an impact on the world of technology.

We are seeking a hardworking, highly motivated Data Warehouse Engineer with strong collaboration skills and the ability to work across multiple domains and software disciplines to achieve results. This is an experienced, hands-on Data Warehouse developer position that involves research, analysis, design, and development of solutions per our business requirements.

You will work closely with business users to collect business requirements, prepare design documents, develop ETL logic and validation scripts to ensure 100% data accuracy, design security controls, and deploy and monitor solutions in production.

Role: Data Warehouse Engineer
Location: San Jose, CA (onsite position)

Key Responsibilities
•    Data Modeling: Designing and developing data models to store and retrieve information efficiently.
•    ETL Development: Creating and optimizing ETL processes to extract, transform, and load data from various sources, ensuring data consistency and quality.
•    Data Pipelines: Designing, implementing, and maintaining data pipelines using Azure Data Factory (ADF).
•    Database Design and Implementation: Designing and implementing relational and multidimensional database structures within the data warehouse.
•    SQL Query Development: Writing efficient SQL queries, stored procedures, and functions to extract, manipulate, and analyze data.
•    Performance Optimization: Optimizing database performance through indexing, query tuning, and other optimization techniques.
•    Data Management: Ensuring data integrity, security, and consistency across multiple environments.
•    Troubleshooting and Problem Solving: Troubleshooting and resolving data warehouse-related issues and errors.
•    Reporting and Analysis: Developing and providing reports and dashboards based on data warehouse data.

Skills and Qualifications:
•    7+ years of experience in the DW/BI space, covering both data model design and analytics development, with at least two end-to-end implementations and a minimum of 2 years working in Azure SQL and Azure Data Factory (ADF).
•    SQL / Relational Database Proficiency: Strong knowledge of SQL syntax, database design principles, and SQL Server, Oracle, or other database systems.
•    Data Management: Experience with data modeling, data warehousing, and ETL (Extract, Transform, Load) processes.
•    Proficient in Infor Omni-Channel Campaign Management (Enterprise Marketing Suite): creating jobs, ETL processes in Infor Admin, setting up jobs, creating reports, and creating/updating users and groups.
•    Preferred: Understanding of IBM WAS (WebSphere Application Server), including node agents, the deployment manager, and applications.
•    Strong understanding of Azure Data Factory: Expert-level knowledge of its features, capabilities, and best practices.
•    Mastery of the Azure SQL stack architecture (SQL Server, SQL Data Warehouse, and Azure Data Factory).
•    Understanding of data analysis principles and reporting techniques. 
•    Preferred: Knowledge of Python, Pentaho, or shell scripting.
•    Problem-solving and Analytical Skills: Ability to identify and resolve data warehouse issues and analyze complex data sets. 
•    Communication Skills: Ability to communicate technical concepts to both technical and non-technical audiences. 
•    Functional knowledge of common business processes such as quote-to-cash, clickstream analysis, revenue, bookings, and billing is desirable.

Education: Bachelor’s degree in computer science, a related field, or equivalent experience.

We’re doing work that matters. Help us solve what others can’t.



