The Data Warehouse Engineer will play a significant role in the implementation, maintenance, and continuous improvement of enterprise data platforms, working closely with business stakeholders as well as software development and support teams.
KEY RESPONSIBILITIES:
- Design reporting facts and dimensions using data warehousing best practices.
- Design and build data extraction, transformation, and loading processes by writing custom data pipelines.
- Develop and operate data pipelines and data wrangling procedures using SQL and/or Python.
- Collaborate with engineers and business customers to understand business domain data needs (batch and real-time), and capture requirements to deliver advanced analytical solutions.
- Develop technical standards and specifications for database models, data security and data warehouse performance.
- Prepare technical documentation for current and proposed data modeling tool sets.
- Tune application and query performance using performance profiling tools and SQL.
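As an illustration of the dimensional-modeling and pipeline work described above, a minimal star-schema load might look like the following sketch. All table and column names here are hypothetical, and Python's built-in sqlite3 module stands in for a real warehouse engine:

```python
import sqlite3

# Hypothetical star schema: one date dimension plus one sales fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
        full_date TEXT NOT NULL,
        year      INTEGER NOT NULL,
        month     INTEGER NOT NULL
    );
    CREATE TABLE fact_sales (
        date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
        product_id   INTEGER NOT NULL,
        quantity     INTEGER NOT NULL,
        amount_cents INTEGER NOT NULL   -- integer cents avoid float rounding
    );
""")

# Extract: raw source rows (in practice, from files or an upstream system).
raw = [("2024-01-15", 101, 2, 1998), ("2024-01-15", 102, 1, 549)]

# Transform: derive the surrogate date key from the business date.
def date_key(d):
    return int(d.replace("-", ""))

# Load: dimension first (deduplicated), then the fact rows.
conn.executemany(
    "INSERT OR IGNORE INTO dim_date VALUES (?, ?, ?, ?)",
    {(date_key(d), d, int(d[:4]), int(d[5:7])) for d, *_ in raw},
)
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
    [(date_key(d), p, q, a) for d, p, q, a in raw],
)

# Sanity check: revenue per month via a fact/dimension join.
total = conn.execute("""
    SELECT d.year, d.month, SUM(f.amount_cents)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.year, d.month
""").fetchone()
print(total)  # (2024, 1, 2547)
```

A real pipeline would add surrogate-key management for slowly changing dimensions and idempotent loads, but the extract/transform/load separation is the same.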
POSITION REQUIREMENTS/QUALIFICATIONS:
- 3+ years of experience with ETL/ELT design, implementation, and maintenance.
- 3+ years of experience with data warehouse schema design and data modeling.
- Strong SQL and shell-scripting expertise, with production-level experience.
- Ability to maintain a data dictionary and enterprise conceptual data models.
- Experience in analyzing data to identify deliverables, gaps, and inconsistencies.
- Experience with building large scale data processing systems.
- Solid understanding of data design patterns and best practices.
- Excellent organizational, time management and communication skills.
- Ability to maintain confidentiality of proprietary, financial, and personal data.
- Familiarity with agile software development practices.
- Proven ability and desire to mentor others in a team environment.
PREFERRED QUALIFICATIONS:
- Experience with cloud technologies such as Google Cloud Platform (GCP), Microsoft Azure, and Amazon Web Services (AWS), including Redshift.
- Python development experience.
- Experience and knowledge of reporting platforms such as Microsoft Power BI, Looker, Tableau, Data Studio, etc.
- Experience with batch and stream processing technologies (e.g., Apache Kafka, Confluent Platform, Amazon Kinesis).
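The batch-versus-stream distinction in the qualifications above can be sketched without any broker: the same transformation is applied to a complete dataset in one pass, or record by record as events arrive. The function and field names below are illustrative only; in production the event iterable would come from a Kafka or Kinesis consumer:

```python
from typing import Iterable, Iterator

def transform(record: dict) -> dict:
    # Illustrative transformation: normalize a field and flag bad amounts.
    return {**record, "user": record["user"].lower(), "valid": record["amount"] >= 0}

def batch_job(records: list) -> list:
    # Batch: the full dataset is available up front; process it in one pass.
    return [transform(r) for r in records]

def stream_job(events: Iterable) -> Iterator:
    # Streaming: each record is transformed as it arrives, without waiting
    # for the dataset to be complete.
    for event in events:
        yield transform(event)

events = [{"user": "Alice", "amount": 10}, {"user": "Bob", "amount": -3}]
results = list(stream_job(iter(events)))
assert results == batch_job(events)  # same logic, different delivery model
```

The practical difference lies in the delivery model (latency, ordering, replay), not in the transformation logic itself.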
EDUCATION AND EXPERIENCE:
- Bachelor’s degree in computer science, information systems, or related field.