
Infrastructure Engineer

datma
๐Ÿ“ USA ๐Ÿ’ผ full_time
Apply Now ๐Ÿ“… 1 week ago

Job Description

Join Datma: Shaping the Future of Healthcare Data

At Datma, we’re revolutionizing how healthcare data is understood and utilized. We believe the untapped potential within this data is immense, and we’re building the platform to unlock it. Backed by Transformation Capital and Generator Ventures, we’re an early-stage company on a mission to simplify the application of complex data to real-world health challenges. If you’re passionate about making a difference in healthcare through innovative technology, we want to hear from you!

Opportunity: DevOps and Data Infrastructure Engineer

We’re seeking a talented and versatile engineer to play a crucial role in bridging our DevOps and data engineering functions. As our DevOps and Data Infrastructure Engineer, you’ll be responsible for building and maintaining a scalable, secure, and reliable data infrastructure while ensuring the smooth deployment and operation of our data systems. We encourage applications from candidates with a strong background in either DevOps or Data Engineering.

Key Responsibilities:

This role offers a unique blend of responsibilities, challenging you to leverage your expertise in both DevOps and Data Engineering. You’ll be instrumental in ensuring our platform’s performance, security, and scalability.

DevOps Functions:
  • Architect, deploy, and manage Kubernetes clusters in customer cloud environments (AWS, Azure, GCP).
  • Develop and maintain robust Infrastructure-as-Code (IaC) templates (e.g., Terraform, Helm) for repeatable deployments.
  • Implement scaling, monitoring, disaster recovery strategies, and observability solutions (metrics, logging, tracing) for proactive infrastructure management.
  • Automate deployment processes for data pipelines, ML models, and analytics applications, including automated testing.
  • Manage containerization and orchestration of data services and workloads using Docker and Kubernetes.
  • Troubleshoot performance and reliability issues across environments.
  • Evaluate and recommend infrastructure solutions through cost-benefit analyses.
  • Implement and maintain security controls aligned with HIPAA and HITRUST frameworks.
  • Partner with compliance teams to ensure infrastructure supports ongoing certification and audit requirements.
  • Configure secure networking, identity and access management (IAM), encryption, and audit logging.

Data Engineering Functions:
  • Build infrastructure to host in-house AI models and integrate with external AI services (e.g., OpenAI APIs).
  • Optimize data pipelines and storage for AI training and inference workloads.
  • Support GPU-based compute environments for ML workloads when required.
  • Design and manage scalable API gateways and authentication mechanisms for external data consumers.
  • Ensure API infrastructure can handle high-throughput, low-latency access to sensitive healthcare datasets.
  • Collaborate with the data/applications team to develop and optimize data processing pipelines using orchestration tools such as Prefect or cloud-native alternatives, and support diverse client integrations (Python, R, SQL, BI tools, etc.).

What You’ll Bring:

We’re looking for a passionate problem-solver with a strong understanding of cloud infrastructure and data engineering principles. If you meet the following technical requirements, we encourage you to apply:

  • 3+ years of experience in cloud infrastructure engineering, preferably in a regulated data environment.
  • Deep expertise with Kubernetes and container orchestration in production.
  • Strong proficiency in Infrastructure as Code tools (Terraform, Helm, Ansible, etc.).
  • Experience with cloud security best practices and regulatory frameworks (HIPAA, SOC 2, or HITRUST).
  • Hands-on experience with CI/CD pipelines and monitoring tools (e.g., Prometheus, Grafana, ELK).
  • Proficiency in Python and/or Go, SQL, and bash scripting.
  • Understanding of data modeling, warehousing concepts, and data pipeline orchestration tools.

Bonus Points:
  • Experience deploying in customer-owned cloud environments.
  • Familiarity with secure API design and management (OAuth2, JWT, API gateways).
  • Knowledge of machine learning infrastructure and MLOps practices.
  • Background involving healthcare data and associated interoperability standards (FHIR, HL7).
  • Prior work supporting HITRUST certification efforts.
  • Experience with multi-tenant architecture design.

Our Commitment to You:

At Datma, we’re committed to fostering a supportive and inclusive environment where everyone can thrive. We prioritize the well-being of our team and offer comprehensive benefits focused on physical, financial, and emotional health. These benefits include healthcare coverage (including vision and dental), retirement plans, generous PTO, family leave, and more. We value authenticity and strive to create a compassionate and caring work experience for all.

Ready to Make a Difference?

If you’re excited about the opportunity to contribute to a company that’s transforming healthcare data, we encourage you to apply! Join us at Datma and help us unlock the power of data to improve lives.
