Data Engineer

Contract

Role: Data Engineer

Location: Hybrid in Oakland, CA / Alpharetta, GA (local candidates only)

Duration: 6 months


The Senior Data Engineer will design, build, and implement data integration solutions, including data pipelines, data APIs, and ETL jobs, to meet the data needs of applications, services, data assets, and business intelligence and analytical tools. Working with business users, data architects, application development teams, data analytics teams, business analysts, product managers, and the data governance COE, the Senior Data Engineer will design and develop ELT between databases, data assets, external partners, and third-party systems across a combination of cloud and on-premises platforms.


How will you make an impact:

·         Develops and maintains scalable data pipelines and builds out new integrations to support continuing increases in data volume and complexity

·         Designs and develops scalable ELT packages/pipelines to extract, load, and transform data between source systems and to integrate data into various data assets, including the data lake and fit-for-purpose data repositories, both on-premises and in the cloud

·         Collaborates with data architects to understand data structures and concepts, including conceptual, logical, and dimensional models, and provides early feedback

·         Designs and develops data migrations in support of enterprise application and system implementations from legacy systems

·         Writes functional specifications for data pipelines and APIs and writes and performs unit/integration tests

·         Performs the data analysis required to troubleshoot data-related issues and assists in their resolution

·         Assists in planning, coordinating, and executing engineering projects

·         Supports and collaborates with other Engineers through evaluation, design analysis, and development phases

·         Maintains current knowledge, ensures competency and compliance with policies and procedures, and serves as the technical expert while collaborating with cross-functional teams

·         This list is not all-inclusive, and you are expected to perform other duties as requested or assigned


Skills/Experience


·         Bachelor’s degree in Computer Science, Mathematics, Statistics, or other related technical field. 5+ years of related experience.

·         5+ years of experience with ETL tools and cloud technologies such as Azure Data Factory and Informatica

·         3+ years of experience in Matillion and Snowflake.

·         Experience in data profiling, data analysis, and building data quality metrics

·         Experience with reference data management tools such as EBX, Informatica MDM, or related tools

·         Experience building ETL processes to move data from disparate source systems into a data lake and dimensional models

·         Expertise in SQL (NoSQL experience is a plus) against relational and cloud data structures. Experience with data virtualization; TIBCO Data Virtualization preferred.

·         Experience documenting and implementing best practices for building data transformation and load processes

·         Knowledge of best practices and IT operations in an always-up, always-available service

·         Experience with or knowledge of Agile Software Development methodologies

·         Excellent problem solving and troubleshooting skills

·         Process-oriented with strong documentation skills

·         Excellent oral and written communication skills with a keen sense of customer service
