Job Location | Cardiff |
Education | Not Mentioned |
Salary | 60,000 - 65,000 per annum |
Industry | Not Mentioned |
Functional Area | Not Mentioned |
Job Type | Permanent, full-time, work from home |
A highly established fintech company is looking to expand its team with an experienced Data Engineer. The role plays a key part in fulfilling business needs and will define and build data pipelines that enable better, data-informed decision-making, both within the business and for its customers. As the Data Engineer, you'll have the opportunity to gain exposure to big data architectures and MPP (massively parallel processing) systems. You will work closely with the database and data engineering teams, building systems that facilitate the extraction and transformation of data.

On offer is a competitive salary, the flexibility to work from home for the majority of the week, private medical cover, discounts and a wide range of benefits, including the chance to learn many new skills. Maintaining a motivating and nourishing culture is essential for this client: they encourage a healthy work-life balance and organise yoga sessions, games tournaments and mental health awareness days at the office, with occasional social gatherings to meet like-minded professionals.

Responsibilities include:
- Provide mentorship to team members on code maintainability and performance
- Work in an Agile team to develop, test and maintain high-quality systems
- Extract data from various sources, for example relational databases, files and APIs
- Help evolve the data platform
- Apply practices such as continuous integration and test-driven development to enable the rapid delivery of working code
- Design and build metadata-driven data pipelines using Python and SQL, in accordance with guidelines set by the Data Architect
- Ship medium to large features independently using industry-standard processing patterns

Experience required:
- Strong development experience creating production-grade ETL pipelines in Python
- Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery (Redshift preferred)
- Hands-on experience with data orchestrators such as Airflow, Prefect, Dagster or Luigi (Airflow preferred)
- Knowledge of Agile development methodologies and automated delivery processes
- Awareness of cloud technology, particularly AWS
- Experience designing and building autonomous data pipelines
- Hands-on experience of engineering best practices (handling and logging errors, system monitoring, and building human-fault-tolerant applications)
- Ability to write efficient code; comfortable undertaking system optimisation and performance-tuning tasks
- Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL and MariaDB (PostgreSQL preferred)

Please apply if you feel this role could be a fit for you. If you're experienced with a good number of the criteria listed but your experience doesn't meet every requirement, we are happy to talk about this position and whether it could still be the right match for you. Please apply now to be considered, or contact Rachael Maule for a confidential chat: