Backend Engineer — Ingestion
WFA Digital Insight
Demand for skilled backend engineers has grown significantly, driven by the need for scalable, efficient data pipelines. As the remote job market evolves, companies like PostHog are leading the way in product analytics and customer data management. With the rise of big data, experts who can process and analyze vast amounts of information are increasingly sought after. In 2025, demand for backend engineers grew by 25%, and the trend is expected to continue. PostHog's commitment to transparency, autonomy, and shipping fast makes it an attractive employer for those looking to make a real impact in the industry. Before applying, candidates should familiarize themselves with the company's distinctive culture and values, which prioritize productivity, ambition, and doing things differently.
Job Description
About the Role
PostHog is seeking a highly skilled Backend Engineer to join its team, focusing on ingestion pipeline development. As a key member of the engineering team, you will design, build, and maintain the systems that process billions of data points. Your work will directly support the company's mission to provide a comprehensive operating system for businesses.

The role entails working closely with cross-functional teams to identify and prioritize project requirements, designing and implementing scalable data pipelines, and collaborating with other engineers to ensure seamless integration with existing systems. You will be part of a team that values autonomy, with the freedom to choose what to work on next based on what will have the biggest impact on customers.
PostHog's engineering team is built around small, autonomous groups of skilled engineers who can outship much larger companies. This structure enables greater efficiency, productivity, and innovation. As a backend engineer at PostHog, you will work on complex problems, develop your skills, and contribute to the company's ambitious goals.
What You Will Do
- Design, build, and maintain scalable data ingestion pipelines to process billions of data points
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Work closely with other engineers to ensure seamless integration with existing systems
- Develop and implement data processing algorithms and data storage solutions
- Participate in code reviews and contribute to the improvement of the codebase
- Troubleshoot and resolve issues with data pipelines and processing systems
- Develop and maintain technical documentation for data pipelines and systems
- Collaborate with the product team to develop new features and functionality
- Work with the DevOps team to ensure smooth deployment and operation of data pipelines
- Participate in the development of the company's data strategy and architecture
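To give a flavor of the responsibilities above, here is a deliberately simplified sketch of the validate-then-batch pattern common in event ingestion. This is not PostHog's actual pipeline; the field names (`distinct_id`, `event`, `properties`) and the in-memory batching are illustrative assumptions only.

```python
import json
from dataclasses import dataclass, field

@dataclass
class Event:
    """A minimal analytics event (illustrative schema, not PostHog's)."""
    distinct_id: str
    name: str
    properties: dict = field(default_factory=dict)

def parse_events(raw_lines):
    """Parse newline-delimited JSON, dropping malformed or incomplete records.

    A production pipeline would route rejects to a dead-letter queue
    instead of silently discarding them.
    """
    for line in raw_lines:
        try:
            payload = json.loads(line)
        except json.JSONDecodeError:
            continue  # unparseable record: dropped in this sketch
        if "distinct_id" not in payload or "event" not in payload:
            continue  # missing required fields: dropped in this sketch
        yield Event(payload["distinct_id"], payload["event"],
                    payload.get("properties", {}))

def batch(events, size):
    """Group events into fixed-size batches for efficient downstream writes."""
    buf = []
    for ev in events:
        buf.append(ev)
        if len(buf) >= size:
            yield buf
            buf = []
    if buf:
        yield buf  # flush the final partial batch

raw = [
    '{"distinct_id": "u1", "event": "pageview"}',
    'not json',
    '{"distinct_id": "u2", "event": "signup", "properties": {"plan": "free"}}',
    '{"event": "orphan"}',
    '{"distinct_id": "u3", "event": "pageview"}',
]
batches = list(batch(parse_events(raw), size=2))
```

Of the five input lines, two are rejected (one unparseable, one missing `distinct_id`), so the three surviving events land in two batches. Real ingestion systems add durability, backpressure, and retries on top of this basic shape.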
What We Are Looking For
- 5+ years of experience in backend engineering, with a focus on data pipelines and processing systems
- Strong programming skills in languages such as Python, Java, or Scala
- Experience with data processing frameworks such as Apache Beam, Apache Spark, or Apache Flink
- Knowledge of data storage solutions such as relational databases, NoSQL databases, or data warehouses
- Experience with containerization using Docker and container orchestration using Kubernetes
- Strong understanding of software development principles, including testing, continuous integration, and continuous deployment
- Experience working with agile development methodologies and version control systems such as Git
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams
- Experience with cloud-based infrastructure and services such as AWS or GCP
Nice to Have
- Experience with machine learning or artificial intelligence
- Knowledge of data science and analytics
- Experience with DevOps tools such as Terraform or Ansible
- Familiarity with PostHog's products and technology stack
- Experience working in a remote or distributed team environment
Benefits and Perks
- Competitive salary and equity package
- Opportunity to work on complex and challenging problems
- Collaborative and dynamic work environment
- Flexible working hours and remote work options
- Professional development and growth opportunities
- Access to cutting-edge technologies and tools
- Comprehensive health insurance and benefits package
- Generous parental leave policy
- Annual budget for conferences, training, and education
- Free meals and snacks in the office (for non-remote employees)
How to Stand Out
- Highlight your experience with data pipelines and processing systems in your resume and cover letter.
- Be prepared to discuss your approach to designing and implementing scalable data pipelines during the interview process.
- Showcase your proficiency in programming languages such as Python, Java, or Scala.
- Demonstrate knowledge of data storage solutions and experience with containerization using Docker.
- Be ready to discuss your experience with agile development methodologies and version control systems such as Git.
- Be ready to talk about your experience working in a remote or distributed team environment and how you handle communication and collaboration in such settings.
This is a remote position listed on WFA Digital, the platform for professionals who work from anywhere. Browse more remote jobs across all categories.