Data Engineering Manager
WFA Digital Insight
As demand for skilled data engineers continues to rise, with a 25% growth in job postings over the past year, Airalo is looking for a seasoned Data Engineering Manager to spearhead its data organization. With the increasing importance of cloud-native technologies, professionals with expertise in major cloud platforms such as GCP, and data warehouses like BigQuery, are in high demand. Airalo stands out by offering a unique blend of remote work flexibility and a dynamic, growing environment. Before applying, candidates should be prepared to showcase their experience in managing data engineering teams, scaling cloud-based data infrastructure, and driving innovation through AI-integrated workflows.
Job Description
About the Role
The Data Engineering Manager role at Airalo is a pivotal position, responsible for overseeing the foundational backend of the company's data organization. This involves managing a team of skilled Data Engineers, shaping the hiring roadmap as the function grows, and ensuring the seamless integration of data engineering processes with the broader organizational goals. Reporting to senior leadership, the Data Engineering Manager will play a crucial role in designing and implementing the data infrastructure that will drive Airalo's continued growth and success. The ideal candidate will have a deep understanding of major cloud platforms, with a preference for Google Cloud Platform (GCP), and experience with cloud-native data warehouses such as BigQuery. The role is based in Romania, Spain, or the United Kingdom, offering the flexibility of remote work that aligns with Airalo's forward-thinking approach to talent acquisition and retention.
What You Will Do
- Lead a team of Data Engineers, providing guidance on data engineering best practices, overseeing project execution, and mentoring team members to enhance their skills.
- Develop and implement the strategic roadmap for data engineering, aligning with the company's overall vision and objectives.
- Design, build, and manage large-scale data systems, including data pipelines, data warehouses, and data lakes, leveraging cloud-native technologies.
- Collaborate with cross-functional teams to integrate data insights into product development, marketing strategies, and business operations.
- Manage the planning, execution, and maintenance of data infrastructure, ensuring scalability, reliability, and performance.
- Implement and manage data governance policies, ensuring data quality, security, and compliance with regulatory requirements.
- Oversee the adoption and integration of new technologies and tools, staying abreast of industry trends and innovations.
- Develop and manage budgets for data engineering initiatives, ensuring cost-effectiveness and ROI.
- Foster a culture of innovation, encouraging experimentation and learning within the data engineering team.
What We Are Looking For
- 7+ years of professional experience in data or software engineering, with at least 2 years directly managing and scaling data engineering teams.
- Strong background in cloud computing, preferably with Google Cloud Platform (GCP), including experience with cloud-native data warehouses like BigQuery.
- Experience with orchestration tools such as Airflow or Dagster, ELT tooling such as Fivetran and dbt, and distributed data processing frameworks such as Apache Spark or Flink.
- Hands-on experience using AI tools to accelerate engineering workflows, including code generation, code review, pipeline debugging, and documentation.
- Advanced coding skills in Python (and/or Scala) and SQL, across both relational and non-relational databases.
- Experience with implementing CI/CD, Infrastructure as Code, and observability/monitoring for data pipelines.
- Proven ability to manage and lead high-performing teams, with excellent communication and project management skills.
Nice to Have
- Experience with other cloud platforms such as AWS or Azure, and their respective data services.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Certification in data engineering or a related field, such as GCP's Professional Data Engineer certification.
- Experience with Agile development methodologies and version control systems like Git.
Benefits and Perks
- Generous PTO policy, allowing for a healthy work-life balance.
- Wellness and learning allowances to support personal and professional development.
- Annual Airalo Away retreat, providing opportunities for team bonding and networking.
- Flexible remote work arrangements, allowing you to work from anywhere.
- Access to cutting-edge technologies and tools, enabling you to stay at the forefront of data engineering.
- Collaborative and dynamic work environment, with a team passionate about innovation and growth.
- Opportunities for professional growth and career advancement within a rapidly expanding company.
How to Stand Out
- Ensure your resume and cover letter highlight specific experiences with cloud platforms, data warehouses, and data engineering tools.
- Be prepared to discuss your approach to managing and scaling data engineering teams, including strategies for talent development and pipeline optimization.
- Showcase your understanding of data governance, security, and compliance, especially in cloud-based environments.
- Demonstrate hands-on experience with AI tools and technologies, and how you've applied them to improve engineering workflows.
- Prepare to walk through your experience with CI/CD, Infrastructure as Code, and observability/monitoring for data pipelines, highlighting successes and challenges.
- Highlight any certifications or continuing education in data engineering or related fields to demonstrate your commitment to professional development.
This is a remote position listed on WFA Digital, the platform for professionals who work from anywhere.