
Data Engineer PySpark, Remote

Job Location: London
Education: Not Mentioned
Salary: £60,000 - £75,000 per annum
Industry: Not Mentioned
Functional Area: Not Mentioned
Job Type: Permanent, full-time

Job Description

DATA ENGINEER - PYSPARK
FULLY REMOTE - 2-3 DAYS IN THE OFFICE PER MONTH
SALARY DETAILS ON REQUEST

This company are a young insurance business who have been challenging the status quo since inception. They have recently gone through a large-scale data transformation, which has led to a data-driven CEO/CPO being hired. A key component of this transformation is a new, greenfield data platform that is being built. As a Data Engineer, you will join this project at an early stage and contribute heavily to the build.

COMPANY: This company build and develop insurance products for the digital age - they are currently in the middle of an exciting tech transformation to become a fully fledged fintech/insurtech business. The decision has been made to transform the entire IT estate, which includes building a greenfield data platform from the ground up to enable both batch and real-time data usage. This platform will be built within an AWS infrastructure and relies heavily on Spark and PySpark. You will also build real-time data pipelines using Kafka/Kinesis.
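For illustration only, the kind of real-time pipeline described above might look like the following minimal PySpark Structured Streaming sketch, which reads events from a Kafka topic and lands them in S3. The broker address, topic name, schema, bucket and checkpoint paths are all assumptions, not details from the role.

# Minimal sketch: consume a hypothetical "policy-events" Kafka topic and write to S3.
# Assumes the spark-sql-kafka connector is on the classpath; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("policy-events-stream").getOrCreate()

event_schema = StructType([
    StructField("policy_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "policy-events")               # placeholder topic
    .load()
)

events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-data-lake/events/")  # placeholder bucket
    .option("checkpointLocation", "s3a://example-data-lake/checkpoints/events/")
    .start()
)
query.awaitTermination()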

ROLE:
  • Design and build scalable data pipelines using PySpark (a batch sketch follows this list).
  • Build data infrastructure within AWS, utilising a number of AWS services.
  • Orchestrate data workflows using Databricks (on AWS).
  • Bring experience of Agile ways of working and software engineering best practices to the team.
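As a purely illustrative companion to the first bullet above, here is a minimal batch-style PySpark sketch that reads raw records from S3, applies a simple transformation, and writes partitioned Parquet back to S3. Every path and column name is an assumption.

# Illustrative batch pipeline: hypothetical S3 paths and columns, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("policy-batch-pipeline").getOrCreate()

# Read raw JSON records from a placeholder landing zone.
raw = spark.read.json("s3a://example-raw-zone/policies/")

# Drop records without an id and normalise the date column.
cleaned = (
    raw.filter(col("policy_id").isNotNull())
    .withColumn("start_date", to_date(col("start_date")))
)

# Write curated, partitioned Parquet to a placeholder output path.
(
    cleaned.write.mode("overwrite")
    .partitionBy("start_date")
    .parquet("s3a://example-curated-zone/policies/")
)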
YOUR SKILLS & EXPERIENCES:
  • Excellent Python software engineering skills
  • Experience with PySpark
  • Experience with AWS or other cloud environments
  • Experience with Docker/Databricks/Airflow (an orchestration sketch follows this list)
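Since Airflow is listed among the desirable tools, the following minimal Airflow DAG sketch shows one way such a batch job could be scheduled daily via spark-submit. The DAG id, start date, schedule and script path are placeholders rather than anything specified in the role.

# Minimal Airflow 2.x DAG sketch: placeholder dag_id, schedule and script path.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="policy_batch_pipeline",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_pipeline = BashOperator(
        task_id="run_pyspark_job",
        bash_command="spark-submit /opt/jobs/policy_batch_pipeline.py",  # placeholder path
    )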
HOW TO APPLY: Please register your interest by sending your CV to Joshua Carter via the apply link on this page. (The company have outlined a fully remote interview process and have a remote on-boarding policy in place.)

Required skills
  • Python
  • Airflow
  • Data Engineer
  • AWS
  • Remote
  • PySpark
