Job Description
Our Business Technology Data Services team is looking for a Sr. Data Engineer who will enable self-service business intelligence and advanced analytics teams (Marketing, Sales, Finance, and Services) to explore outliers, and who will build scalable data lakehouse and data vault systems that differentiate us from our competition. You will work closely with other team members, including enterprise architects, domain solution architects, technical leads, and business analysts, to understand what the business is trying to achieve, move data from source to target, and craft efficient data models. This hands-on technical role demands excellent knowledge of industry-standard processes and the ability to demonstrate them. The ideal candidate will have extensive knowledge of modern data platforms (data lakehouse, data vault, data mesh) and will be able to craft database solutions using the latest tools and open-source frameworks.

Basic Qualifications:
- 8+ years of experience building scalable data solutions for very large-scale Data Lakehouse, Data Warehouse, Data Quality, Data Analytics/BI, Data Enrichment, Data Integration/ETL, Data Security, Data Governance, and Data Engineering projects.
- 8+ years of experience designing and building scalable, robust data pipelines that enable data-driven decisions for the business.
- 5+ years of experience working with the Snowflake cloud data warehouse and/or Databricks lakehouses.
- 5+ years of experience in one or more programming languages for processing large data sets, such as Python and/or Scala.

Other Qualifications:
- Ability to build reliable, efficient, testable, and maintainable data pipelines.
- Experience designing and developing data pipelines using metadata-driven ETL tools and open-source data processing frameworks.
- Solid understanding of database systems and data mining.
- Ability to build enterprise data models, including star and snowflake schemas, for data consumption.
- Experience with SQL and NoSQL databases.
- Proficiency in writing advanced SQL and expertise in SQL performance tuning.
- Experience working on a fast-paced Agile team, with excellent verbal communication and collaboration skills; familiarity with Jira and other collaboration tools.
- Ability to mentor, guide, and lead engineers on the team.
- Excellent organizational and analytical abilities.
- Experience with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation) for automating infrastructure provisioning and management.
- Hands-on experience with source version control and continuous integration, and experience with release/change management delivery tools.
- Experience developing low-latency data processing solutions with technologies such as AWS Kinesis, Apache Kafka, Apache Spark stream processing, and other data integration tools.
- Bachelor’s degree or higher in Computer Science, Engineering, or a related field.