At PwC, we are significantly growing our Data & Analytics (D&A) practice to meet new market demands and opportunities. We recognise that most cutting-edge market solutions depend on strong Data Engineering and Data Science capabilities.
Within D&A, our Data Engineering team, based across the UK, has specialists in all sectors who pioneer the development of solutions to client problems through the use of data analytics, tools and platforms. The team’s core proposition is to develop innovative subscription-based products and services that combine the latest technologies with PwC’s deep subject matter expertise and industry knowledge.
As part of the D&A team, you will gain exposure to a broad and high-profile client base, including FTSE 250, financial services and public sector clients. We have ambitious growth plans and are looking for highly motivated and skilled individuals who can architect, build, support and maintain technology solutions in Microsoft Azure, Google Cloud Platform (GCP) or AWS.
A data engineer is a software engineering role specialising in the large distributed systems that produce end-user analytics. The role is typically a hybrid of a solution or data architect, a data analyst and a data scientist, and increasingly calls for practical insight and deep expertise in cloud architecture, big data architecture and the management of data workflows, pipelines and ETL processes.
About the role
We are looking for motivated, enthusiastic individuals to join our team: people with the right blend of hands-on data and programming experience, a passion for innovation and an understanding of the markets in which we work.
As a Senior Manager in our practice you’ll have the opportunity to develop your career whilst driving the growth of our business. As part of a market-leading D&A team, you’ll help new and existing clients deliver technology and data solutions that streamline their data management processes, gather insights and deliver engaging reporting tools and data visualisations.
This is a fast-growing team where you’ll be responsible for the execution and delivery of work across a combination of audit and advisory engagements. Your responsibilities will include the following:
Maintaining, improving, cleaning and manipulating data in the business’s operational and analytics databases. The Data Engineer works with business analysts, data scientists and programme managers to understand and aid the implementation of system requirements, analyse performance and troubleshoot existing issues.
Understanding of cloud, PaaS, IaaS and hybrid systems.
Understanding of MI systems and Data Warehouses.
Expertise in SQL development, providing support to the team in database design, data flow and modelling activities. The Data Engineer also plays a key role in the development and deployment of innovative big data platforms for advanced analytics and data processing.
Defining and building the data pipelines and integration mechanisms that enable faster, better organised data pools.
Coaching junior data engineering staff, bringing them up to speed and helping them develop a better understanding of the overall data ecosystem.
Analysing large, complex data sets to solve business challenges, and presenting insights to audiences without an analytics background.
Advising on best-of-breed options for client solutions.
Demonstrate extensive skills and success in the implementation of technology projects within a professional environment, with a particular focus on data engineering.
Experience delivering medium to large enterprise data projects, working on deep-tech and VLDB (very large database) systems.
Proven experience in the performance optimisation of complex systems and in producing cost-optimised solution deliveries.
Azure, AWS and GCP experience, including but not limited to MPP systems, database systems, ETL and ELT systems, and data-flow compute.
Good-to-have skills:
A high-level understanding of AI and/or blockchain technologies, and a desire to develop your skills in these areas of emerging technology.
Experience with NoSQL databases (Cosmos DB, MongoDB) and Hadoop.
Connected systems, IoT sensing, hyperscale compute, TensorFlow, GPU computing and massive batch loads.
Minimum of a Bachelor’s degree (2:1) in Economics, Statistics, Operations Research, Computer Science, Information Systems, Engineering or a similar quantitative discipline.