A leading kidney care management organization is looking for a skilled Data Engineer to support their mission to reimagine healthcare. This company partners closely with physicians to improve the lives of patients through advanced data systems and compassionate innovation. As a Data Engineer, you’ll play a critical role in building and optimizing data pipelines, supporting cloud migration efforts, and contributing to the overall architecture of a modern, scalable data ecosystem.
This is an opportunity to join a fast-growing, mission-driven team focused on transforming how healthcare data is leveraged to improve outcomes. You’ll collaborate cross-functionally, innovate constantly, and directly impact systems that support patients nationwide.
Benefits
- Meaningful Work: Contribute to improving the lives of patients through advanced data solutions in the healthcare space.
- Growth-Oriented Culture: Collaborate with experienced professionals and receive mentorship opportunities.
- Modern Tech Stack: Work hands-on with tools like Azure Data Factory, Databricks, SQL Server, and Python in a cloud-native environment.
- Innovation-First Environment: Drive improvements and suggest enhancements that shape the future of the company’s data infrastructure.
- Remote Flexibility: Work remotely while contributing to projects that make a real-world impact.
Responsibilities
- Collaborate with development teams to ensure timely and successful project delivery.
- Design, review, and deploy SQL code, while assisting with schema design and query optimization.
- Monitor, maintain, and troubleshoot data pipelines to resolve bottlenecks and ensure performance.
- Provide mentorship to junior engineers and guide best practices in development methodologies and frameworks.
- Contribute to the design and development of solutions within the existing data warehouse and lakehouse architecture.
- Ensure proper documentation and knowledge sharing within the team to support smooth project hand-offs and scalability.
- Participate in defining methodologies and building tools within the lakehouse and data warehouse infrastructure.
- Research and propose new technologies and enhancements for ongoing process improvement.
- Support cloud migration initiatives and the integration of data systems with external applications.
Requirements
Education & Experience:
- Bachelor’s degree with 3–5 years of experience in data engineering, or Master’s degree with 3+ years of experience.
Technical Expertise:
- Strong command of Azure-based tools including Azure Data Factory, Azure SQL Database, and Databricks.
- Deep experience with the Microsoft SQL Server stack (including SSIS) and Azure Data Factory (ADF).
- Proficient in Python for data manipulation and pipeline support.
Architecture & Integration:
- Understanding of data lake and lakehouse principles, data modeling, and data warehousing best practices.
- Experience with cloud migration projects and integrating with third-party applications.
Communication & Collaboration:
- Excellent verbal and written communication skills.
- Strong presentation skills for stakeholder meetings, requirement gathering, and feedback sessions.
Work Ethic:
- Proven ability to meet tight deadlines without sacrificing quality.
- Self-driven and collaborative, with a continuous improvement mindset.