Job Description
The Offer
- Work alongside & learn from best-in-class talent
- Join a well-known brand within IT Services
- Excellent career development opportunities
The Job
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and orchestration workflows.
- Collaborate with cross-functional teams to define and implement data models that support key business use cases.
- Optimize data architecture for performance and reliability using technologies like Databricks and Snowflake.
- Write clean, maintainable code using Java, Scala, or Python.
- Ensure high data quality and performance through automated testing and monitoring.
- Engage in code reviews, architecture discussions, and provide input into system design decisions.
Additional Requirements:
- Must be eligible to work in the U.S.; H-1B transfers possible (no OPT).
The Profile
MUST HAVE Skills & Qualifications:
- 5+ years of experience in software or data engineering.
- Strong coding experience in Java, Scala, or Python.
- Hands-on experience with Databricks, Airflow, and AWS.
- Proficient in data warehousing and orchestration tools.
- Strong understanding of data modeling and transformation at scale.
- Clear communicator with a strong sense of ownership and attention to detail.
NICE TO HAVE Qualifications:
- Experience supporting machine learning pipelines and recommendation systems.
- Familiarity with A/B testing frameworks or product performance evaluation techniques.
- Prior experience with Snowflake in a production environment.
- Exposure to bandits or other experimentation mechanisms.
The Employer
Our client is a passionate team committed to revolutionizing how you experience recruitment and HR.
Job Tags
Full time, H-1B