- Posted 06 July 2022
- Salary €55-80k
- Job type Permanent
- Contact Name: Kai Wilton-Ali
Does the idea of systemising insights that historically have been manual sound exciting to you?
Do you want to make an impact at the world's leading Human Intelligence platform?
If so, read on!
Who you are
- Fluent in Python (Pandas, Numpy, & PySpark are essential)
- Experience with Spark, AWS Glue, Athena, Delta Lake, & Redshift
- Knowledgeable on deployment in a cloud environment (AWS preferable)
- Experienced with SQL and Git
- Understanding of an infrastructure-as-code environment with Terraform
- Familiar with containers and scheduling tools (we use Docker on ECS)
- Familiar with CI/CD pipeline tools such as CircleCI, Jenkins, etc.
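As a purely illustrative sketch of the kind of work the Pandas/NumPy stack above is used for (the data and column names here are hypothetical, not taken from the role):

```python
import numpy as np
import pandas as pd

# Toy example: fill missing metric values per client, then aggregate
# into a per-client summary of the sort a dashboard pipeline might emit.
events = pd.DataFrame({
    "client": ["acme", "acme", "globex", "globex"],
    "score": [0.8, np.nan, 0.6, 0.9],
})

# Impute missing scores with the per-client mean.
events["score"] = events.groupby("client")["score"].transform(
    lambda s: s.fillna(s.mean())
)

# Aggregate to one row per client.
summary = events.groupby("client", as_index=False)["score"].mean()
print(summary)
```

In a production setting the same transformation would typically run over PySpark DataFrames rather than in-memory Pandas, but the shape of the logic is similar.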
What you'll do
- Develop and maintain data pipelines for production client dashboards, ML training loops and data operations workflows
- Work closely with Client Strategists to support their activities with data transformations and new data sources
- Participate actively in pull requests, branch management, code reviews, etc.
- Work closely with developers and product managers to ensure that data engineering requirements are implemented correctly and efficiently
- Take an active part in the group of engineers reviewing and making key tech design decisions, and coordinating work and operations on our platform architecture
What you'll get
- Flexible working hours, fully remote or hybrid, dependent on your location
- A budget to foster your own personal learning and development
- Paid Volunteer days for any charity of your choice
- Competitive, regularly reviewed salary
- 24 days holiday, with the option to roll over unused days