Your role as part of Global Analytics and Insights Technology will be to build last-mile global analytics and reporting solutions that depict player evolution year-over-year in world-class titles.
You will help build a platform that processes petabytes of data through pipelines feeding user-facing environments, and help us compete for the industry-leading data program – all while serving the needs of teams on the cutting edge of game development using DevOps methodologies.
- Maintain and enhance our core data infrastructure and ETL framework.
- Develop and maintain data sets to support reporting and exploratory analytics.
- Develop and own core tools, frameworks, and our Kubernetes-based platform.
- Assess and recommend available and emerging big data technologies.
- Contribute to post-implementation reviews, helping to demonstrate success.
- Hold a Bachelor’s or Master’s degree in computer science, software engineering, or similar.
- Are fluent in Python and some flavour of SQL.
- Write clean, effective, re-usable code that can be used to scale applications.
- Have experience with cloud platforms such as AWS and GCP.
- Are comfortable on the command line and working with servers.
- Have built and used CI/CD pipelines – Docker experience is essential.
- Proactively keep your skills sharp and stay expert in current technologies.
- Have basic knowledge of other languages such as Java, Scala, and C#.
- Have actively used Kubernetes and Helm, plus other orchestration tools such as Apache Airflow.
- Are comfortable using infrastructure-as-code tools such as Terraform.
- Bring an “if it can be automated, it will be automated” mentality.
- Have experience adapting purpose-built applications to cloud and on-premises platforms.
- Have spent time tuning multiple systems to meet SLA performance targets.
- Understand the physical and logical design of cloud-native databases such as Snowflake.