Senior Data Engineer - Data Engineering

Plaid · Remote (San Francisco HQ)
Software Development

WFA Digital Insight

Demand for skilled data engineers has surged in recent years as companies like Plaid continue to reshape the fintech landscape. With Plaid's commitment to empowering financial transformation, this role offers the opportunity to work with a talented team and make a real impact. Candidates should be prepared to showcase their expertise in SQL, Python, and modern data engineering tools, as well as their ability to drive business growth through data-driven decision-making.

Job Description

About the Role

The Senior Data Engineer role at Plaid is a critical part of the company's mission to empower financial transformation. As a key member of the Data Engineering team, you will be responsible for building robust golden data sets to power business goals and drive data-driven decision making. You will work closely with cross-functional teams, including engineering, product, and business intelligence, to design and implement scalable data systems that meet the company's growing needs.

The ideal candidate will have a strong background in data engineering, with expertise in SQL, Python, and data engineering tools such as dbt, Airflow, and Redshift. You will be self-motivated and curious, with a passion for using data to drive business growth and improvement.

What You Will Do

  • Design and implement scalable data systems to meet the company's growing needs
  • Build and maintain large-scale data pipelines and architectures
  • Collaborate with cross-functional teams to design and implement data-driven solutions
  • Develop and maintain data quality metrics and monitoring systems
  • Work with engineers, product managers, and business intelligence teams to identify and prioritize data needs
  • Develop and maintain technical documentation for data systems and pipelines
  • Participate in the development of the company's data strategy and roadmap
  • Collaborate with the data science team to develop and implement machine learning models
  • Stay up-to-date with industry trends and emerging technologies in data engineering
  • Participate in code reviews and contribute to the improvement of the codebase
  • Collaborate with the DevOps team to ensure smooth deployment of data systems
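
To give a flavor of the data-quality work described above, here is a minimal sketch of the kind of check a pipeline might run before publishing a "golden" data set. The field names, thresholds, and metric names are illustrative assumptions, not Plaid's actual stack or conventions.

```python
# Hypothetical data-quality gate for a pipeline step. All names and
# thresholds are invented for illustration.

def quality_metrics(rows, required_fields):
    """Compute simple completeness metrics for a batch of records."""
    total = len(rows)
    null_counts = {
        field: sum(1 for r in rows if r.get(field) is None)
        for field in required_fields
    }
    return {
        "row_count": total,
        "null_rate": {f: c / total for f, c in null_counts.items()} if total else {},
    }

def passes_checks(metrics, min_rows=1, max_null_rate=0.01):
    """Gate publication: enough rows, and few enough nulls per field."""
    if metrics["row_count"] < min_rows:
        return False
    return all(rate <= max_null_rate for rate in metrics["null_rate"].values())

batch = [
    {"account_id": "a1", "balance": 100.0},
    {"account_id": "a2", "balance": 250.5},
]
m = quality_metrics(batch, ["account_id", "balance"])
print(passes_checks(m))  # a clean batch passes
```

In practice checks like these would live in a framework (e.g. dbt tests or an Airflow sensor) rather than hand-rolled functions, but the shape of the logic is the same.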

What We Are Looking For

  • 5+ years of experience in data engineering, with a focus on building scalable data systems
  • Strong expertise in SQL, Python, and data engineering tools such as dbt, Airflow, and Redshift
  • Experience with cloud-based data platforms such as AWS or GCP
  • Strong understanding of data modeling, data warehousing, and ETL processes
  • Experience with data quality metrics and monitoring systems
  • Strong collaboration and communication skills, with the ability to work with cross-functional teams
  • Strong problem-solving skills, with the ability to analyze complex data systems and identify areas for improvement
  • Experience with agile development methodologies and version control systems such as Git
  • Strong understanding of data security and compliance principles
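
The SQL and ETL skills above often show up in exercises like deduplicating records to keep only the latest row per key. A hedged sketch using Python's built-in sqlite3 module (the table and column names are invented for this example):

```python
import sqlite3

# Illustrative only: schema and data are made up for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (account_id TEXT, amount REAL, updated_at TEXT);
    INSERT INTO txn VALUES
        ('a1', 10.0, '2025-01-01'),
        ('a1', 12.5, '2025-02-01'),
        ('a2', 99.0, '2025-01-15');
""")

# Keep only the most recent row per account: a common ETL dedup pattern
# expressed with a window function.
rows = conn.execute("""
    SELECT account_id, amount FROM (
        SELECT account_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY account_id ORDER BY updated_at DESC
               ) AS rn
        FROM txn
    ) WHERE rn = 1
    ORDER BY account_id
""").fetchall()
print(rows)  # [('a1', 12.5), ('a2', 99.0)]
```

The same windowed-dedup pattern translates directly to Redshift or a dbt model.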

Nice to Have

  • Experience with machine learning or data science technologies such as TensorFlow or scikit-learn
  • Experience with containerization technologies such as Docker
  • Experience with additional cloud platforms such as Azure or IBM Cloud
  • Certification in data engineering or a related field
  • Experience with data visualization tools such as Tableau or Power BI

Benefits and Perks

  • Competitive salary and equity package
  • Comprehensive health, dental, and vision insurance
  • 401(k) matching program
  • Flexible PTO and vacation policy
  • Remote work stipend and home office setup allowance
  • Professional development opportunities and conference sponsorships
  • Access to cutting-edge technologies and tools
  • Collaborative and dynamic work environment
  • Recognition and reward programs for outstanding performance

How to Stand Out

  • Highlight your experience with data engineering tools such as dbt, Airflow, and Redshift in your resume and cover letter.
  • Showcase your ability to build and maintain large-scale data pipelines and architectures by providing examples of your previous work.
  • Emphasize your understanding of data quality metrics and monitoring systems, and be prepared to discuss your approach to ensuring data accuracy and reliability.
  • Demonstrate your ability to collaborate with cross-functional teams by providing examples of your experience working with engineers, product managers, and business intelligence teams.
  • Be prepared to discuss your experience with agile development methodologies and version control systems such as Git.
  • Research the company's technology stack and be prepared to discuss your experience with similar tools and technologies.
  • Sharpen your problem-solving skills by reviewing common data engineering interview questions and rehearsing your responses.
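
One classic topic worth rehearsing for interviews like this is incremental (watermark-based) loading. A minimal stdlib sketch; every name here is invented for illustration, not part of any real pipeline:

```python
# Hypothetical incremental-load helper: extract only rows newer than the
# last watermark, then advance the watermark. All names are invented.

def incremental_extract(source_rows, watermark):
    """Return rows updated after the watermark, plus the new watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "updated_at": "2025-01-01"},
    {"id": 2, "updated_at": "2025-03-01"},
]
rows, wm = incremental_extract(source, "2025-02-01")
print(len(rows), wm)  # 1 2025-03-01
```

Being able to explain why the watermark must only advance after a successful load is exactly the kind of reasoning interviewers probe for.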

This is a remote position listed on WFA Digital, the platform for professionals who work from anywhere. Browse more remote jobs across all categories.