Senior Data Engineer (Contract)
Square Enix has an internal cloud-based platform (SGV), which provides our Analytics & Insight team and other groups across the business with a single data lake pooling game telemetry, sales and marketing data, web analytics and other information.
- Optimising, refining and enhancing the data acquisition pipelines
- Working with client teams to ensure robust capture of high-quality data
- Supporting data analysts and other users of the data via training and technical assistance
- Providing staff development for others in the Data Engineering team
This position requires a driven and talented person who can help the team progress:
- Ensure the Data Engineering team delivers on requests from client teams to the agreed specifications and timelines.
- Ensure open and regular communication with stakeholders about the status of their projects.
- Work to ensure the Data Engineering team is equipped to deliver against its responsibilities. Maintain a learning culture within the team so that individual team members can continue to grow professionally and develop their skills.
- Ensure data is robust and of high quality.
- Provide data access and querying support to users both within the team and across the business.
- Have a good understanding of the scope, potential and limitations of the datasets maintained by the Data Engineering team, remaining alert to any opportunity to further employ our data to benefit the business.
- Evangelise the use of customer data across the organisation so that we better understand our customers.
- Maintain strong relationships with technical partners at Google, Amazon, Microsoft, Sony etc. to ensure Square Enix's capability remains at the forefront of the industry.
- Represent the team professionally at all times, both internally and externally.
Head of Digital Channels, Online Development Director, Director of Analytics & Insight
Knowledge & Experience
- High level of professional experience with cloud-based data engineering platforms, ideally Google Cloud Platform (Dataflow, BigQuery, Pub/Sub, GCS)
- Expertise with the lambda architecture and other approaches to capturing and processing data at scale to provide real-time analytics capability
- Excellent programming skills in Java (8 preferable) & Python essential; other languages an advantage
- Experience modelling ETLs using Apache Beam (a minimal sketch follows this list).
- Experience writing near real-time ETLs.
- Experience with multiple build tools, preferably Gradle
- Expert SQL skills
- Comfortable working with large data sets
- Excellent problem solving & analytical skills.
- Familiarity with macOS or Linux environments (shell scripting, basic system administration etc.)
- Experience with managing a code base and using source control/collaboration tools such as GitHub, Bitbucket or GitLab.
- Familiarity with collaboration and communication tools such as JIRA, Confluence, Slack etc.
- BSc or higher-level degree in Computer Science, STEM subject or a similar field of study
- Experience with a variety of systems offering aggregation frameworks, such as MongoDB and Elasticsearch
- Experience with DAG-based workflow management systems, ideally Apache Airflow (a minimal sketch follows this list)
- Interest in statistical methodologies and models.
- Experience with Hadoop technologies.
- Experience writing ETLs in Spark.
- Knowledge of functional programming languages, such as Scala or Kotlin
- Knowledge of Data Protection laws and best practices
- Experience working in a data protection regulated environment would be beneficial
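For illustration, here is a minimal sketch of the kind of near real-time ETL referenced above: an Apache Beam streaming pipeline (shown with the Python SDK for brevity, though the role prefers Java) that reads telemetry events from Pub/Sub and streams them into BigQuery. The project, topic, table and field names are hypothetical placeholders, not SGV's actual resources.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON telemetry event into a BigQuery-ready row."""
    event = json.loads(message.decode("utf-8"))
    return {
        "player_id": event["player_id"],   # hypothetical schema
        "event_name": event["event_name"],
        "event_ts": event["timestamp"],
    }


def run() -> None:
    # streaming=True marks the pipeline as unbounded (Pub/Sub source).
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadTelemetry" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/game-telemetry")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.game_events",
                # assumes the destination table already exists
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

In production such a pipeline would typically run on the Dataflow runner (--runner=DataflowRunner), matching the GCP stack listed above.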
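Likewise, a minimal sketch of a DAG-based workflow in Apache Airflow: a daily job with an extract task and a downstream load task. The dag_id, schedule and task bodies are illustrative placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_sales() -> None:
    print("extract: pull yesterday's sales feed")  # placeholder work


def load_warehouse() -> None:
    print("load: append the feed to the warehouse")  # placeholder work


with DAG(
    dag_id="daily_sales_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_sales",
                             python_callable=extract_sales)
    load = PythonOperator(task_id="load_warehouse",
                          python_callable=load_warehouse)

    # The >> operator defines the DAG edge: load runs only after extract succeeds.
    extract >> load
```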
Competencies, Skills & Attributes
- Ability to quickly learn and employ new technologies and methodologies.
- Highly numerate.
- Strong documentation skills.
- Ability to articulate and present ideas and information with ease and clarity.
- Ability to work on own initiative and as part of a team.
- Strong interest in technology.
- Ambition to drive self-development.
- Excellent attention to detail.
- Ability to work under pressure and to deadlines.
- Follower of industry trends and developments.