Data Engineer

Data Society · Remote (United States)
Software Development

WFA Digital Insight

Demand for skilled data engineers continues to grow, with employment opportunities projected to increase 14% by 2028, and professionals who can build scalable data pipelines and manage large-scale data warehouses are especially sought after. Data Society, a leading provider of data and AI training and solutions, is seeking a talented Data Engineer to join its team. With the rise of remote work, companies are looking for professionals who can work both collaboratively and autonomously, making this role an exciting opportunity for those who thrive in flexible work environments. Before applying, candidates should note that this role requires strong technical skills, excellent communication, and the ability to work effectively on a cross-functional team.

Job Description

About the Role

The Data Engineer position at Data Society is a critical role that entails building and managing large-scale data pipelines, warehouses, and marts. As a key member of the data engineering team, you will work closely with data scientists to design and implement data architectures that support critical operational and analytical applications. Your expertise in data governance, quality, and security will be essential in ensuring that data systems are reliable, efficient, and compliant with federal contract requirements.

The ideal candidate will have a strong technical background, with experience in building scalable data pipelines, managing large-scale data warehouses, and optimizing data transformation pipelines. You will be working in a fast-paced environment, collaborating with cross-functional teams, and communicating effectively with stakeholders to ensure that data systems meet business requirements.

As a Data Engineer at Data Society, you will be part of a team that is passionate about empowering workforces with the skills they need to achieve their goals and expand their impact. You will have the opportunity to work on exciting projects, contribute to the development of innovative data and AI solutions, and collaborate with talented professionals who share your passion for data and technology.

What You Will Do

  • Design and build scalable data pipelines that power critical operational and analytical applications
  • Develop and manage large-scale data warehouses, lakehouses, and data marts
  • Build and optimize data transformation pipelines using tools like dbt
  • Implement data governance principles and quality standards, maintaining data lineage, documentation, and metadata
  • Create efficient, performant SQL-based data queries and Python-based data processing jobs
  • Collaborate with data science teams to build the supporting data scaffolding their systems require
  • Work closely with cross-functional teams to design and implement data architectures
  • Communicate effectively with stakeholders to ensure that data systems meet business requirements
  • Manage timelines while working both autonomously and collaboratively with team members

What We Are Looking For

  • Advanced degree in Statistics, Applied Mathematics, Data Science, Computer Science, Operations Research, or other closely related quantitative or mathematical disciplines
  • 5+ years of experience in data and analytics engineering in cloud environments
  • Expertise in SQL, Python, and schema design, with experience in data cataloging and governance tools
  • Experience with data transformation and ETL best practices
  • Experience with data orchestration tools like Airflow, transformation frameworks like dbt, and cloud deployment tools like Terraform
  • Demonstrated exceptional oral and written communication skills
  • Ability to work independently and in a team environment
  • Strong problem-solving and critical thinking skills
  • Superior team-working skills, and a desire to learn, contribute, and explore

Nice to Have

  • Experience with Snowflake, Databricks, Kafka, Flume, Spark, or Flink
  • Knowledge of data security and compliance principles
  • Experience with containerization using Docker
  • Familiarity with Agile development methodologies
  • Certification in data engineering or related field

Benefits and Perks

  • Competitive salary and benefits package
  • Opportunity to work on exciting projects and contribute to the development of innovative data and AI solutions
  • Collaborative and dynamic work environment
  • Professional development opportunities to enhance your skills and knowledge
  • Flexible work arrangements, including remote work options
  • Access to cutting-edge technologies and tools
  • Recognition and reward for outstanding performance
  • Comprehensive health and wellness program
  • Generous paid time off and holiday schedule

How to Stand Out

  • To stand out as a candidate, make sure to highlight your experience with data engineering tools and technologies, such as dbt, Airflow, and Terraform.
  • Showcase your ability to work collaboratively and autonomously, and demonstrate your excellent communication skills.
  • Be prepared to discuss your approach to data governance, quality, and security, and provide examples of how you have implemented these principles in previous roles.
  • Make sure to research the company and the role, and be prepared to ask informed questions during the interview process.
  • Consider building a portfolio of your work, including examples of data pipelines, warehouses, and marts you have designed and implemented.
  • Don't be afraid to negotiate your salary and benefits package, and be prepared to discuss your expectations and requirements.

This is a remote position listed on WFA Digital, the platform for professionals who work from anywhere.