
DevOps Engineer, Data Platform

National University
Full-time
Remote
United States
$78,496 - $105,974 USD yearly
Azure

The DevOps Engineer is a critical hands-on role responsible for building, automating, and managing the university's modern data platform. They will partner directly with data scientists and engineers to create a scalable, reliable, and secure environment, enabling the rapid development and deployment of analytics and machine learning models.

The DevOps Engineer will own our CI/CD processes, primarily using GitHub Actions, and will be responsible for establishing and scaling our Infrastructure as Code (IaC) practices for Microsoft Fabric and the wider Azure ecosystem. They will ensure our platform is robust, cost-efficient, and optimized to support the university's data-driven mission.

Essential Functions:

  • Designs, builds, and maintains our CI/CD pipelines using GitHub Actions for data engineering workloads, analytics, and machine learning model deployment.
  • Develops, deploys, and manages our Infrastructure as Code (IaC) to automate the provisioning and configuration of Azure and Microsoft Fabric resources (e.g., using Bicep, ARM Templates, or Terraform).
  • Administers, monitors, and optimizes our core data platforms, including Microsoft Fabric and Azure Synapse Analytics, ensuring high availability and performance.
  • Implements and manages comprehensive monitoring, logging, and alerting solutions to ensure platform health, security, and cost-efficiency.
  • Collaborates closely with data scientists and engineers to troubleshoot issues, optimize data pipelines, and support the deployment of ML models (using tools such as MLflow).
  • Implements and enforces data governance and security best practices for identity, access, and data protection within the cloud environment.
  • Assists in managing and integrating with secondary cloud infrastructure on Google Cloud Platform (GCP) as needed.
  • Creates and maintains high-quality documentation for our platform architecture, automation, and operational procedures.
  • Performs other duties as assigned.  

Supervisory Responsibilities: N/A

Requirements:

Education & Experience:

  • Bachelor's degree in Computer Science, Information Technology, Engineering, or an equivalent combination of education and experience; Master's degree preferred.
  • Three (3) to five (5) years of hands-on experience in a DevOps, SRE, or Data Engineering role with a strong focus on automation.
  • All skills, abilities, and education will be considered toward the minimum qualifications.

Competencies/Technical/Functional Skills:

  • Strong, demonstrable experience with the Microsoft Azure cloud platform.
  • Proven experience building and managing CI/CD pipelines, preferably with GitHub Actions.
  • Hands-on experience with Infrastructure as Code (IaC) tools (e.g., Bicep, ARM templates, Terraform).
  • Proficiency in scripting languages (e.g., Python, PowerShell, Bash).
  • Experience with data platforms like Azure Synapse, Microsoft Fabric, or Databricks is highly desirable.
  • Working knowledge of SQL and data warehousing/data lake concepts.
  • Familiarity with Google Cloud Platform (GCP) is a plus.
  • Experience with containerization (Docker, Kubernetes) is a plus.
  • A strong "automation-first" and "infrastructure-as-code" mindset.
  • Excellent problem-solving and troubleshooting skills.
  • Strong written and oral communication skills and strong interpersonal skills.
  • Strong project management, organizational, and prioritization skills.
  • Ability to work independently, manage competing priorities, and collaborate effectively with technical and non-technical stakeholders.
  • Embraces diverse people, thinking, and styles.

Location: Remote, USA

Travel: up to 10%

#LI-Remote
