Apache Airflow Specialist (ETL) Remote UK

Job Location: London
Education: Not Mentioned
Salary: £55,000 - £95,000 per annum, negotiable
Industry: Not Mentioned
Functional Area: Not Mentioned
Job Type: Permanent, full-time

Job Description

Job: Apache Airflow Specialist (ETL) - Remote UK

This is an opportunity to join a fast-growing data intelligence business that works across the media and creative industries. You will join their mission to consolidate and rationalise their diverse portfolio of industry-leading media analysis & planning tools, and be a pivotal player in the design and delivery of their next-generation SaaS Media Planning Platform.

The Role:

  • In the role of Apache Airflow Specialist, you will be responsible for designing, expanding, automating and optimising data pipeline architecture, as well as optimising data flow and collection. You will own the Airflow architecture used to design, build and manage automated data pipelines on AWS environments, and will pioneer ways of working that other team members can adopt.
  • Using Apache Airflow to build respondent survey pipelines, you will automate components of a currently manual load and add them to a pipeline. The aim is to reduce manual intervention and increase processing speed, so the company can repeatedly process more data, at higher frequencies, in the future. Where needed, you may choose to add manual human tasks to the automated workflow (a minimal sketch of this pattern follows these bullets).
  • You will work with Data Engineers, Python Engineers and Product Managers in an Agile environment to deliver secure, performant and maintainable data automation. You will use best-practice continuous integration and continuous deployment methodologies to ensure that build and deployment pipelines are fast, robust and secure. You will champion good code quality and architectural practices.
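
For illustration of the kind of workflow described above, the following is a minimal sketch of an Airflow DAG, assuming Airflow 2.x, that automates an extract/transform/load sequence for a hypothetical survey feed and models a manual human sign-off as a sensor task. All DAG, task and file names here are invented for the example and do not describe the company's actual pipeline.

```python
# Minimal sketch only, assuming Airflow 2.x. All names and callables are
# hypothetical placeholders for the real survey-loading routines.
from datetime import datetime
from pathlib import Path

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.sensors.python import PythonSensor


def extract_survey_data():
    # Placeholder: pull raw respondent survey files from the source system.
    print("extracting raw survey files")


def transform_survey_data():
    # Placeholder: clean and reshape the raw files into the load format.
    print("transforming survey data")


def load_survey_data():
    # Placeholder: load the transformed data into the warehouse.
    print("loading survey data")


def manual_signoff_received():
    # A manual human task inside the automated workflow: the DAG pauses
    # here until an analyst drops an approval marker file.
    return Path("/tmp/survey_signoff.ok").exists()


with DAG(
    dag_id="survey_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # programmatic scheduling ("schedule" on newer releases)
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_survey_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_survey_data)
    signoff = PythonSensor(
        task_id="await_manual_signoff",
        python_callable=manual_signoff_received,
        poke_interval=300,   # re-check every 5 minutes
        mode="reschedule",   # free the worker slot while waiting
    )
    load = PythonOperator(task_id="load", python_callable=load_survey_data)

    # Pipeline edges: extract -> transform -> manual sign-off -> load
    extract >> transform >> signoff >> load
```

Authoring DAGs like this is what "programmatically author, schedule and monitor workflows" means in practice; the reschedule mode on the sign-off sensor keeps the waiting step from occupying a worker slot.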

Main Responsibilities and Tasks:

  • Advise on the extension and optimisation of the automated data ingestion platform, using Apache Airflow to automate data loading routines
  • Automate and industrialise pipelines for regular execution in business-critical scenarios
  • Programmatically author, schedule and monitor workflows within Apache Airflow
  • Define custom ETL processes in the design and implementation of data pipelines (a sketch of a custom operator follows these bullets)
  • Identify and resolve data quality and other related issues arising from the ingestion of data from source systems
  • Elicit requirements, describe scope, define data flows in flowcharts and be able to transform them into stories per internal standards
  • Read existing source code and scripting repositories to determine the functional state of current implementations
  • Document the current and functional state of workflows for the wider technical team, e.g. the automation pipeline and the utilities used within it
  • Treat security as a primary concern in the architecture, secure coding, build and deployment of solutions
  • Collaborate with the wider team, in particular Infrastructure Engineers, to deploy automation
  • Convert, parse and manipulate data files using various database programs and utilities
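
As a hedged illustration of the "custom ETL processes" bullet above, the sketch below defines a hypothetical custom Airflow operator, again assuming Airflow 2.x, that wraps a bespoke file-conversion step; the class name, parameters and conversion logic are invented for this example rather than taken from the advertised platform.

```python
# Illustrative only: a hypothetical custom operator for a bespoke ETL step,
# assuming Airflow 2.x. Names and conversion logic are invented.
import csv
import json

from airflow.models.baseoperator import BaseOperator


class SurveyFileConvertOperator(BaseOperator):
    """Convert a delimited survey export to JSON lines before loading."""

    def __init__(self, *, src_path: str, dest_path: str, delimiter: str = ",", **kwargs):
        super().__init__(**kwargs)
        self.src_path = src_path
        self.dest_path = dest_path
        self.delimiter = delimiter

    def execute(self, context):
        # Read the delimited export and write one JSON object per row.
        rows = 0
        with open(self.src_path, newline="") as src, open(self.dest_path, "w") as dest:
            for record in csv.DictReader(src, delimiter=self.delimiter):
                dest.write(json.dumps(record) + "\n")
                rows += 1
        self.log.info("Converted %d rows from %s to %s", rows, self.src_path, self.dest_path)
        return rows  # pushed to XCom for downstream tasks to inspect
```

In a DAG this would be instantiated like any built-in operator, e.g. SurveyFileConvertOperator(task_id="convert_export", src_path="/data/export.csv", dest_path="/data/export.jsonl"), and wired into the pipeline with >>.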

Education, Experience, Knowledge, and Skills:

  • Significant experience of large-scale implementations using Apache Airflow
  • Expert understanding of DAGs (Directed Acyclic Graphs) and Operators for scheduling jobs
  • Working knowledge of message queuing, stream processing, and highly scalable "big data" data stores.
  • Experience of manipulating, processing and extracting value from large disconnected datasets.
  • Prior experience with customer data platforms.
  • Experience in performing root cause analysis on internal/external data and processes.
  • Prior experience with data analysis and data warehousing
  • Technical expertise with data models, data mining, and segmentation techniques
  • Proficiency in scripting languages (especially Python)
  • Ability to investigate current data loading procedures and plan the pipelines and steps required to automate data extraction, transformation and loading (ETL) processes
  • Proficiency with GitHub for source code repositories, maintaining the daily operation, integrity and security of source code
  • Experience of conducting code reviews against acceptance criteria
  • Knowledge of Amazon Web Services (AWS) infrastructure & services e.g. Redshift, EC2, RDS, S3, Lambda, EMR, Batch or Athena
  • Excellent Linux scripting skills
  • Experience with data modelling, data processing and ETL
  • A working knowledge of SQL and query authoring, and familiarity with a variety of relational databases
  • Experience with Agile methodologies and change management tools (e.g. JIRA), and the ability to define technical acceptance criteria for stories
  • Experience working with external partners to drive product delivery.

Other Stuff:

  • £’s: fully DoE (depending on experience) up to £95k + benefits
  • Location: based from your home / remote (with very occasional meetings at their London offices as/when required); i.e. you will be given autonomy in defining your working location and times
  • Interested? Please send your (Word doc) CV
  • Please only apply if you are already eligible to work in the UK (indefinitely & without sponsorship)
  • Not for you, but you know someone suitable? Profile 29 can pay a referral fee
  • Visit our website for lists of all current job opportunities
  • In accordance with GDPR, by applying you give Profile 29 consent to use your data for recruitment purposes only (details of Profile 29’s privacy policy can be found at: profile-29 .com/privacy)

Profile 29 recruitment keywords: Apache Airflow home remote WFH London UK
