Senior Data Engineer

Position Summary

The Senior Data Engineer builds, manages, and optimizes reusable enterprise data pipelines for onboard systems. They apply technical and analytical skills to solve business problems for onboard Cruise Products while ensuring compliance with data governance and data security requirements. Senior Engineers also mentor junior engineers in designing optimal, efficient solutions for preparing and storing data for analytical and operational use cases.


Essential Duties and Responsibilities 

  • Participate in requirements gathering, data mapping, and designing while considering the interconnectivity of systems onboard 
  • Build and maintain data pipelines from disparate sources that meet functional and non-functional business requirements 
  • Create new ETL processes and maintain and reuse existing ones, employing a variety of data integration and data preparation tools 
  • Identify, design, and implement internal process improvements, e.g., automating manual processes, optimizing data delivery, and re-designing pipelines for greater scalability
  • Work with stakeholders, including Product, Data, and Business teams, to assist with data-related technical issues and support their data needs 
  • Create datasets for operational reports, KPIs/metrics, analytics, and data science to uncover insights helping the business make objective decisions and gain a competitive edge 
  • Write, debug, and implement complex queries involving multiple tables or databases across platforms 
  • Collaborate with the Enterprise Architecture team to ensure alignment on data standards and processes 
  • Work with data and analytics experts to strive for greater functionality in data systems  
  • Provide on-call and off-hours support as required 
  • Create and maintain technical design documentation  


Qualifications, Knowledge, and Skills 

  • Bachelor of Science in Computer Science, Information Technology, or related field 
  • 5+ years of experience in data/cloud engineering, building datasets for a data warehouse, and working with ETL development tools (Azure Data Factory, Databricks, SQL, and Python) 
  • Experience using best practices in designing, building, and managing data pipelines that require data transformations as well as metadata and workload management 
  • Experience building and optimizing data pipelines, pipeline architectures, and integrated datasets from large, heterogeneous sources using traditional and modern data integration technologies (such as ETL, ELT, data replication, change data capture, message-oriented data movement, API design, stream data integration, and data virtualization) 
  • Experience performing root cause analysis on internal and external data and processes to identify issues and opportunities for improvement 
  • Expert-level knowledge of programming languages such as Python, SQL, PL/SQL, and T-SQL, and of relational databases such as Oracle and SQL Server 
  • NoSQL database experience is a plus
  • Proven ability to collaborate with technical peers to best support cross-functional teams in a dynamic environment  
  • Capable of working independently
  • Willingness to mentor and guide junior engineers  
  • Experience with continuous integration and continuous deployment practices 
  • Ability to approach complex problems creatively, with strong analytical and problem-solving skills 
  • Curiosity about the data within the specific area of responsibility 
  • Knowledge and experience working with agile methodologies and tools (such as Jira) are a plus 
  • Familiarity with onboard system vendor data models such as Fidelio, BriefCam, SAIA, Silverwhere, Xcontrol, Tritan, Hybris 


Nearest Major Market: Miami
