Senior Data DevOps Engineer
WFA Digital Insight
The demand for skilled Data DevOps Engineers has surged in the remote job market, with a 25% increase in 2025 alone. Hard Rock Digital is at the forefront of this trend, seeking a seasoned expert to drive technical decisions and mentor engineering teams as the online gaming industry continues to grow.
Job Description
About the Role
As a Senior Data DevOps Engineer at Hard Rock Digital, you will be part of a team that is redefining the online gaming experience. You will work closely with data science, machine learning, and reporting teams to deploy and support cutting-edge data applications. Your primary focus will be on architecting scalable data infrastructure, leading technical initiatives, and mentoring engineering teams.

The Senior Data DevOps Engineer role is a critical component of Hard Rock Digital's strategy to become the best online sportsbook, casino, and social gaming company in the world. You will be responsible for driving strategic technical decisions, collaborating with cross-functional teams, and establishing best practices for data infrastructure and DevOps.
Hard Rock Digital is a dynamic and fast-paced environment that values passion, learning, and innovation. As a Senior Data DevOps Engineer, you will have the opportunity to work with a talented team of professionals who are dedicated to delivering exceptional results.
What You Will Do
- Architect and lead the design of complex, enterprise-scale data pipelines using Airflow, DBT, and Databricks
- Define and implement strategies for pipeline performance optimization to support real-time and batch processing at scale
- Lead the design and optimization of AWS-based data infrastructure, including S3, Lambda, and Snowflake architecture
- Establish and enforce best practices for cost-efficient, secure, and scalable data processing across the organization
- Design and optimize AWS SageMaker environments for ML teams, ensuring optimal performance and resource utilization
- Lead cross-functional collaboration with ML, Data Science, and Reporting teams to establish data strategy and ensure seamless data accessibility
- Design and implement comprehensive data pipeline monitoring, alerting, and logging frameworks to proactively detect failures and performance bottlenecks
- Architect automation solutions for data quality, lineage tracking, and schema evolution management
- Lead incident response efforts, performing complex troubleshooting and root cause analysis for critical data issues
- Champion and evolve Data DevOps best practices, driving automation, reproducibility, and scalability across the organization
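To give a concrete flavor of the monitoring and alerting work described above: at its simplest, proactive failure detection for a pipeline task can be a wrapper that retries transient errors, logs each attempt, and raises an alert once retries are exhausted. This is a minimal, stdlib-only sketch for illustration; the function names, retry policy, and alert channel are hypothetical and not Hard Rock Digital's actual stack:

```python
import logging
import time


def run_with_monitoring(task, *, name, retries=3, backoff_s=1.0, alert=print):
    """Run a pipeline task with retry, logging, and alerting.

    Illustrative only: a real framework would also emit metrics,
    track lineage, and page an on-call rotation.
    """
    for attempt in range(1, retries + 1):
        try:
            result = task()
            logging.info("task %s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            logging.warning(
                "task %s failed (attempt %d/%d): %s", name, attempt, retries, exc
            )
            if attempt == retries:
                # Retries exhausted: fire the alert, then re-raise so the
                # orchestrator (e.g. Airflow) still marks the task failed.
                alert(f"ALERT: task {name} failed after {retries} attempts: {exc}")
                raise
            time.sleep(backoff_s * attempt)  # linear backoff between attempts
```

In production this logic typically lives inside the orchestrator (Airflow task retries, on-failure callbacks) rather than hand-rolled code, but the same concerns apply: retry policy, structured logging, and a clear escalation path when a task stays down.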
What We Are Looking For
- 5+ years of experience in data engineering, DevOps, or a related field
- Strong expertise in cloud-based data infrastructure, including AWS, Azure, or Google Cloud
- Proficiency in languages such as Python, Java, or Scala
- Experience with data pipeline tools such as Airflow, DBT, or Databricks
- Strong understanding of data architecture, data modeling, and data governance
- Experience with machine learning and data science technologies, including SageMaker, TensorFlow, or PyTorch
- Strong communication and collaboration skills, with the ability to work with cross-functional teams
- Experience with agile development methodologies and version control systems such as Git
- Strong problem-solving skills, with the ability to troubleshoot complex data issues
Nice to Have
- Experience with Excel, including data analysis and visualization
- Knowledge of data security and compliance, including GDPR, HIPAA, or CCPA
- Experience with containerization technologies such as Docker or Kubernetes
- Familiarity with data warehousing and business intelligence tools, including Snowflake or Tableau
Benefits and Perks
- Competitive salary and benefits package
- Opportunity to work with a talented team of professionals in a dynamic and fast-paced environment
- Flexible working hours and remote work options
- Professional development and training opportunities
- Access to the latest technologies and tools
- Recognition and reward for outstanding performance
- Comprehensive health insurance and wellness programs
- Generous paid time off and holiday policies
- Retirement savings plan and matching contributions
How to Stand Out
- To stand out as a candidate, be prepared to discuss your experience with cloud-based data infrastructure and DevOps initiatives. Highlight your ability to architect scalable data pipelines and lead cross-functional teams.
- Make sure to review the company's technology stack and be familiar with the tools and technologies they use.
- Be prepared to provide examples of your problem-solving skills, including troubleshooting complex data issues and performing root cause analysis.
- Consider creating a portfolio or repository of your work, including examples of your data engineering and DevOps projects.
- When negotiating salary, be sure to research the market rate for your role and location, and be prepared to discuss your expectations and requirements.
- Be wary of companies that lack a clear vision or strategy for their data infrastructure and DevOps initiatives. Look for signs of a strong company culture and values that align with your own.
This is a remote position listed on WFA Digital, the platform for professionals who work from anywhere.