PP - Data Engineer - 175

Thaloz · Remote (Brazil)
Software Development
Excel

WFA Digital Insight

Demand for skilled data engineers has surged in recent years, with a 25% increase in job postings in the last quarter alone. As companies like Thaloz expand their digital presence, professionals who can design, build, and maintain scalable data pipelines are in ever greater demand, and the rise of remote work means they can now collaborate with global teams on projects that directly shape business decision-making. Candidates should bring a strong foundation in SQL, Python, and data modeling, along with experience in ETL processes and data warehousing. Thaloz values innovation and technical excellence, making this role an attractive opportunity for anyone who wants to work at the forefront of data engineering.

Job Description

About the Role

Thaloz is seeking a highly skilled and motivated Data Engineer to join its Credit Platform Data team in Brazil. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable, reliable data pipelines and ETL processes that empower internal business units with efficient data processing and utilization. Your expertise will directly impact the quality and accessibility of data, enabling data-driven decision-making across the organization.

The Credit Platform Data team is a dynamic and collaborative group that works closely with product managers, analysts, and stakeholders to drive business growth and improvement. As a Data Engineer, you will be an integral part of this team, translating complex data requirements into robust engineering solutions that handle large volumes of data with high performance and scalability.

Thaloz is committed to staying at the forefront of data engineering technologies and best practices. In this role, you will work with cutting-edge tools and contribute to the development of new data pipelines and systems.

What You Will Do

  • Design, develop, and maintain scalable data pipelines and ETL workflows that support the ingestion, transformation, and storage of large datasets from diverse sources
  • Implement automated data quality checks and validation processes to ensure the accuracy, consistency, and reliability of data across systems
  • Collaborate with product managers, data analysts, and business stakeholders to gather and understand data requirements, translating them into technical specifications and actionable engineering tasks
  • Continuously monitor and optimize data systems for performance, scalability, and cost-efficiency, ensuring that data infrastructure meets evolving business needs
  • Diagnose and resolve data-related issues promptly, providing root cause analysis and implementing preventive measures
  • Maintain comprehensive documentation of data pipelines, ETL processes, and system architecture
  • Participate in design and code reviews to uphold high engineering standards
  • Stay abreast of emerging data engineering technologies, tools, and best practices to drive innovation and continuous improvement within the team
  • Provide guidance and mentorship to junior data engineers, fostering a culture of knowledge sharing and technical excellence
  • Work closely with the data warehousing team to design and implement data models that support efficient querying and reporting
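The automated data-quality checks mentioned above can start as simple row-level validations run after each batch load. A minimal sketch in Python with Pandas, purely illustrative (the column names, such as `customer_id` and `loan_amount`, and the specific checks are hypothetical, not part of the role's actual schema):

```python
import pandas as pd

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Run basic data-quality checks on a freshly loaded batch.

    Returns a list of human-readable failure messages (empty if clean).
    """
    failures = []
    # Completeness: key columns must not contain nulls.
    for col in ("customer_id", "loan_amount"):
        nulls = int(df[col].isna().sum())
        if nulls:
            failures.append(f"{col}: {nulls} null values")
    # Uniqueness: the primary key must not repeat.
    dupes = int(df["customer_id"].duplicated().sum())
    if dupes:
        failures.append(f"customer_id: {dupes} duplicate rows")
    # Range check: loan amounts must be positive.
    bad = int((df["loan_amount"] <= 0).sum())
    if bad:
        failures.append(f"loan_amount: {bad} non-positive values")
    return failures

batch = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "loan_amount": [1000.0, None, 250.0, -50.0],
})
print(validate_batch(batch))
```

In practice, checks like these are usually wired into the pipeline itself (or expressed in a dedicated framework) so that a failing batch is quarantined before it reaches downstream consumers.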

What We Are Looking For

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • Expert-level proficiency in SQL for querying, manipulating, and optimizing relational databases
  • Strong programming skills in Python, including experience with data processing libraries such as Pandas
  • Proficiency with PySpark for distributed processing of large-scale datasets
  • Solid understanding of ETL concepts, with hands-on experience designing and implementing ETL pipelines
  • Expertise in data modeling techniques to design logical and physical data models that support efficient querying and reporting
  • Familiarity with relational database management systems (RDBMS) such as Oracle, MySQL, or similar platforms
  • Knowledge of database design, indexing, and query optimization
  • Experience working with Unix/Linux operating systems for managing data workflows, scripting, and system monitoring
  • Ability to write shell scripts to automate routine tasks, manage data pipelines, and integrate with other system components
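To illustrate the kind of ETL and SQL fluency this list describes, here is a toy extract-transform-load step using Python's built-in sqlite3 module. It is a sketch only: the table and column names (`raw_payments`, `fact_payments`, `amount_cents`) are invented for the example and do not reflect Thaloz's systems.

```python
import sqlite3

# Extract: pull raw rows from a source table (an in-memory DB for the sketch).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_payments (id INTEGER, amount_cents INTEGER, status TEXT)")
src.executemany(
    "INSERT INTO raw_payments VALUES (?, ?, ?)",
    [(1, 1250, "settled"), (2, 900, "failed"), (3, 4300, "settled")],
)
rows = src.execute("SELECT id, amount_cents, status FROM raw_payments").fetchall()

# Transform: keep settled payments only and convert cents to a decimal amount.
clean = [(pid, cents / 100.0) for pid, cents, status in rows if status == "settled"]

# Load: write the transformed rows into a warehouse-style target table.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE fact_payments (id INTEGER PRIMARY KEY, amount REAL)")
dst.executemany("INSERT INTO fact_payments VALUES (?, ?)", clean)
total = dst.execute("SELECT SUM(amount) FROM fact_payments").fetchone()[0]
print(total)  # 55.5
```

A production pipeline would replace the in-memory databases with real sources and a warehouse, add the kind of quality checks listed above, and run under an orchestrator, but the extract/transform/load shape is the same.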

Nice to Have

  • Experience with automation testing frameworks for data pipelines and ETL processes
  • Familiarity with data warehousing concepts, architectures, and best practices
  • Knowledge of cloud-based data platforms such as AWS or Google Cloud
  • Experience with agile development methodologies and version control systems such as Git

Benefits and Perks

  • Competitive salary and benefits package
  • Opportunity to work with a global team and contribute to projects that have a real impact on business decision-making
  • Professional development and growth opportunities, including training and education assistance
  • Flexible working hours and remote work arrangements
  • Access to cutting-edge tools and technologies, including PySpark, Pandas, and SQL
  • Collaborative and dynamic work environment with a team of experienced data engineers and analysts
  • Recognition and rewards for outstanding performance and contributions to the team

How to Stand Out

  • To stand out as a candidate, make sure to highlight your experience with data pipelines, ETL processes, and data modeling in your resume and cover letter.
  • Be prepared to provide examples of your work, including code snippets and data visualizations, to demonstrate your skills and expertise.
  • Familiarize yourself with Thaloz's products and services, and be prepared to discuss how your skills and experience align with the company's goals and objectives.
  • Practice your SQL and Python skills, as these will be essential for the role.
  • Consider creating a personal project or contributing to open-source projects to demonstrate your skills and passion for data engineering.
  • Research the company culture and values, and be prepared to discuss how you would contribute to and thrive in a remote work environment.
  • Ask questions during the interview process, and come prepared to offer feedback and suggestions for improving data engineering processes and systems.

This is a remote position listed on WFA Digital, the platform for professionals who work from anywhere. Browse more remote jobs across all categories.