**P&C Insurance Experience Is Required For This Role**
About the Role
The Senior Data Engineer will play a key role in designing, implementing, and optimizing Mission’s data infrastructure as part of our modern data platform initiative. This hands-on engineering role will focus on building scalable data pipelines, enabling a centralized enterprise data warehouse, and supporting business reporting needs. The ideal candidate will collaborate across technology, operations, product, and analytics teams to create high-quality, governed, and reusable data assets while supporting a long-term architecture aligned with Mission’s growth. This role is highly technical and execution-focused, making it ideal for a data engineer who thrives in fast-paced environments and is passionate about data quality, performance, and scalability.
What You'll Do
- Design and implement scalable data pipelines to ingest, transform, and store data from third-party vendors and internal systems using APIs, files, and databases.
- Build and maintain a cloud-based data warehouse solution in partnership with architects, ensuring clean, normalized, and performant data models.
- Establish and monitor reliable data ingestion processes across systems with varying grain and cadence, ensuring data quality and completeness.
- Collaborate with API and integration teams to develop secure, robust data exchange processes with external vendors and internal services.
- Set up and manage data connections from the warehouse to BI tools (e.g., Power BI, Looker, Tableau) to enable self-service analytics and dashboarding.
- Document data flows, schemas, and definitions, and help drive data governance and standardization efforts across the organization.
- Implement data validation, cleansing, and enrichment processes to ensure high-quality data for financial reporting and analytics.
- Ensure compliance with data standards, regulatory requirements (e.g., NAIC, SOX), and data security best practices.
- Provide technical leadership in data engineering best practices, including version control, CI/CD for data pipelines, testing, and observability.
- Build initial dashboards in BI tools.
- Provide direction to data engineers to ensure that projects are completed on time and to a high standard.
Required Qualifications
- 5+ years of experience in data engineering, data warehousing, or a related field with at least 1-2 years leading implementation projects.
- 3+ years of experience in the commercial insurance industry.
- Hands-on experience with Azure data services.
- Strong SQL and performance tuning skills in Microsoft SQL Server environments.
- Experience designing data warehouses or data marts with a focus on dimensional modeling and analytics-ready schemas.
- Fluent in a range of data ingestion methods: RESTful APIs, JSON/XML ingestion, flat files, message queues, and CDC patterns.
- Understanding of data governance, metadata management, and access control.
- Familiarity with version control (GitLab) and CI/CD pipelines for data workflows.
- Experience working with BI tools like Power BI, Looker, or Tableau, including building dashboards or semantic layers.
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent experience.
- Demonstrated ability to translate business needs into scalable data systems and design.
Preferred Qualifications
- Experience with data cleanup, fuzzy matching, or AI-driven normalization.
- Familiarity with Python, dbt, or Azure Functions is a plus.
- Strong communication skills and ability to work cross-functionally with product, engineering, and analytics teams.
- Ability to travel up to 10% of the year.
Knowledge, Skills and Abilities
- Deep understanding of data warehousing principles, dimensional modeling, and ETL/ELT architecture, with hands-on experience building analytics-ready structures in Azure SQL and SSMS.
- Understanding of core business concepts within the commercial property and casualty insurance industry to guide data modeling and transformation efforts.
- Proficient in developing scalable data pipelines using Azure Data Factory and SQL, with strong debugging and optimization skills.
- Skilled in integrating with diverse data sources, including REST APIs, files, and third-party systems, and handling complex data ingestion across varying levels of granularity.
- Familiar with core Azure services and their role in secure, cloud-native data engineering workflows.
- Able to connect and model data for downstream BI tools (Power BI, Tableau, Looker), supporting self-service reporting and building initial dashboards when needed.
- Strong communicator and collaborator who can translate business needs into technical solutions, document data workflows, and mentor team members on best practices.
Additional Information
This is a remote position. Planned, in-office activities may be required on occasion (typically 2-4x per year).
You must live in the United States and be authorized to work in the United States without requirement of employment sponsorship/visa.