Data Engineer 6
Lansing, MI
Hybrid: on-site 2 days per week
Duration: 1 year with possible extension.
Job Description:
• Lead the design and development of scalable and high-performance solutions using AWS services.
• Experience with Databricks, Elasticsearch, Kibana, and S3.
• Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
• Write clean, maintainable, and efficient code in Python/Scala.
• Experience with AWS cloud-based application development.
• Experience with Electronic Health Records (EHR) HL7 solutions.
• Implement and manage Elasticsearch for efficient data retrieval and analysis.
• Experience with data warehousing, data visualization tools, and data integrity.
• Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.
• Excellent knowledge of designing both logical and physical database models.
• Develop database objects, including stored procedures and functions.
• Extensive knowledge of source control tools such as Git.
• Develop software design documents and work with stakeholders for review and approval.
• Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of system requirements.
• Experience working on large agile projects.
• Experience or knowledge of creating CI/CD pipelines using Azure DevOps.
Required Skills:
Developing complex database systems.
Databricks.
Elasticsearch, Kibana.
Python/Scala.
Oracle.
Experience with Extract, Transform, and Load (ETL) processes and developing Data Pipelines.
Experience with AWS.
Experience with data warehousing, data visualization tools, and data integrity.
CMM/CMMI Level 3 methods and practices.
Experience implementing agile development processes, including test-driven development.
Experience or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).