Data Platform Data Engineer

Posted 11 days ago

Purpose of Job

We are seeking an experienced and skilled Data Platform Data Engineer to join the Data Platform Delivery Team in EMEA Data & Analytics Technology. The successful candidate will play a key role in designing, implementing, and maintaining robust data infrastructure and pipelines to support our organization's data-driven initiatives, ensuring optimal performance, scalability, flexibility, and reliability. The data platform will be built in a hybrid cloud environment, with Azure as the primary platform and some components residing in our on-prem data centre. The platform will be built on a greenfield basis, with legacy capability migrated to it. The role will include the following:

- Understanding business and data requirements and translating them into data pipeline and systems designs.
- Implementing agreed designs.
- Mentoring and guiding other members of the team through collaboration and peer review.
- Building production-quality components.
- Aligning the implementation of the Data Platform to the agreed strategy for the platform.
- Delivering excellence in data ingestion and distribution for SMBC EMEA.

Background

The newly created EMEA Data & Analytics Technology team has been put in place to deliver data integration, reporting, and analytics services to the SMBC EMEA business. It exists within the wider ITSD department, which is responsible for all IT delivery to the SMBC EMEA enterprise. The team delivers data and analytics solutions to FO P&L business lines, cross-business-line departments (e.g. Finance, Risk, Operations), and the Data Office as a key client. This is within a complex systems environment which has been built over several years and is being re-architected for the next phase of SMBC EMEA growth.

Facts / Scale

- Will be working in a team of 40-50 IT staff within a department of 250+.
- Will deliver robust, reliable, performant, and maintainable components of the Data Platform for use across the SMBC EMEA entity, covering capital markets and corporate banking activities.
- Will attend the London office at least 2 days a week.

Accountabilities & Responsibilities

- Design and implement scalable and efficient data pipelines, ETL processes, and data integration solutions to collect, process, and store large volumes of data.
- Develop data models, schema designs, and data architecture frameworks to support diverse analytical and reporting needs.
- Build and optimize data processing workflows using distributed computing frameworks available on Azure, our preferred cloud provider.
- Implement data transformation logic to cleanse, validate, and enrich raw data for analysis and consumption by downstream applications.
- Integrate data from various internal and external sources, including databases, APIs, and streaming platforms, into centralized data repositories and data warehouses.
- Ensure data quality and consistency by implementing validation checks, data profiling, and error-handling mechanisms.
- Optimize data pipelines and processing algorithms for performance, reliability, and scalability to handle growing data volumes and user demands.
- Identify and resolve performance bottlenecks, optimize SQL queries, and fine-tune system configurations for improved efficiency.
- Implement data governance policies, access controls, and encryption mechanisms to ensure the confidentiality, integrity, and availability of sensitive data.
- Collaborate with security and compliance teams to adhere to regulatory requirements and industry best practices.
- Monitor data pipelines, job schedules, and system health metrics to proactively identify issues and ensure the availability of data infrastructure.
- Perform routine maintenance tasks, data backups, and disaster recovery procedures to mitigate risks and ensure data continuity.
- Create and maintain technical documentation, including system architectures, data flow diagrams, and operational procedures.
- Share knowledge and best practices with team members through code reviews, technical presentations, and training sessions.

Knowledge, Skills, Experience & Qualifications

Essential:

- Minimum of 5 years of experience working as an analyst developer or data engineer in a data-centric environment.
- Proven experience in designing and implementing end-to-end data solutions, from ingestion to consumption.
- Strong experience with Azure data PaaS services and data pipeline delivery on the Azure platform.
- Experience of delivering data pipelines with Databricks.
- Experience delivering data platforms with C#, JSON, XML, APIs, and message bus technology.
- Strong knowledge of database systems, data modelling, and data integration technologies.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex issues and propose effective solutions.
- Ability to analyse data requirements, identify patterns, and produce scalable data designs that meet business needs.
- Strong experience of different data architectures, including hub and spoke, data lakes, data fabrics, and lakehouses.
- Strong SQL skills and experience with relational databases (e.g. Oracle, MS SQL) and NoSQL databases (e.g. MongoDB, Cassandra).
- Good communication skills to explain designs and gain buy-in from other analyst developers in the team and other stakeholders.
- Collaborative and curious working practices to develop strong, well-thought-through solutions.
- Curiosity to learn new ways of delivering solutions and openness to new designs and other people's ideas.

Desirable:

- Knowledge of complex systems estates with diverse data needs, preferably in capital markets, corporate banking, or asset management.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proficiency in programming languages such as Python, Scala, or Java for data processing and scripting tasks.
- Experience of Agile practices, especially scrum of scrums, and of Atlassian tooling.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Experience with data warehouse virtualisation technologies, e.g. Denodo, Snowflake.
- Coaching / mentoring of more junior staff.
- Advanced certifications in data engineering, cloud computing, or related fields would be a plus.