Made by Team Ness
E.T.L: Extract. Transform. Loch Ness Monster.
A project to create a data platform that extracts data from an operational database, archives it in a data lake and makes it available in a remodelled OLAP data warehouse.
The Team
Paul Sandford
A focussed individual with a love of learning who thrives on the challenge of continuous self-improvement. Enjoys working collaboratively in pursuit of ambitious and meaningful goals. Currently pursuing a career change from Day Trader to Data Engineer, having recently graduated from the Northcoders Data Engineering bootcamp. Stimulated by solving complex problems, thinking analytically and writing beautiful code.
Liam Dearlove
Passionate data engineer currently transitioning from the retail sector to a career in data engineering. Strong interpersonal and communication skills from time as an optical assistant. Recently immersed in a transformative data engineering bootcamp, learning programming, data handling and advanced problem-solving skills. Eager to leverage this unique blend of skills and experience to contribute to innovative data engineering projects.
Inna Teterina
Drawing on my foundational background as a school teacher, I am embarking on a new professional journey into the field of data engineering. My career shift into this new domain was kick-started by the Northcoders Data Engineering bootcamp, which laid a solid foundation in data manipulation and analysis. I'm impressed by how data can help solve big challenges, and I'm excited to bring my skills and passion to this new field.
Rahul Aneesh
A highly motivated BSc Mathematics graduate with a strong foundation in mathematical concepts, along with analytical abilities and problem-solving skills. A passion for honing skills beneficial to a variety of software projects. A creative writer with a passion for books, and excellent customer service and organisational skills. With a keen eye for detail and a passion for numbers and logical thinking, I am seeking a role where I can leverage my skills and knowledge to contribute to the success of an organisation.
Muhammed Irfan
Analytically driven graduate with a solid background in Data Science, specialising in statistical analysis, data wrangling and visualisation. Proficient in Python, R, SQL and data visualisation tools, with a keen eye for transforming complex data into actionable insights. Transitioning from Data Analyst to aspiring Data Engineer, I have recently completed bootcamp training to fortify my skills in cloud computing, DevOps and database management, including AWS, Terraform and Postgres. This career shift is propelled by a passion for technology and a drive to innovate within dynamic tech environments. Excited to contribute analytical ability and a collaborative spirit to fuel data-driven strategies and propel business growth as a Data Engineer.
Muhammad Raza
Dedicated Junior Data Engineer at Northcoders: I am actively honing my skills to transition into a dynamic career in data. With a robust background in finance and healthcare, I bring a unique blend of analytical expertise and industry knowledge. Eager to leverage my strong quantitative skills, programming proficiency and passion for problem-solving.
Tech Stack

We used:

- Python – including pytest, bandit, safety, coverage, pandas, pg8000, SQLAlchemy and autopep8 – because it is a powerful and flexible programming language, well suited to the tasks and challenges involved in data engineering.
- Bandit and Safety – to make sure there were no security issues in our code or installed libraries.
- Coverage – to ensure that our tests provided sufficient coverage, i.e. above 90%.
- Terraform – to allow us to build and alter AWS infrastructure with speed and flexibility.
- GitHub Actions – to build an efficient and robust CI/CD pipeline.
- AWS – including S3, Lambda, CloudWatch, Systems Manager and EventBridge – because Amazon Web Services is an accessible and widely used cloud computing platform.
- PostgreSQL.
- Trello – we broke the project down into granular tickets. This allowed us to focus on achievable tasks and to manage the workflow of the team as a whole.
- Pair programming – we worked in pairs frequently, to foster a collaborative and supportive working environment and to make sure we were producing high-quality code.
- Daily stand-up meetings – we conducted regular stand-ups to make sure we could overcome any blockers that pairs were facing.
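To illustrate the kind of "transform" step the pipeline performs, here is a minimal pure-Python sketch of remodelling operational rows into a warehouse dimension table. The table and column names (`staff_id`, `department`, etc.) are hypothetical, not the project's actual schema, and our real transform app used pandas rather than plain dicts:

```python
# Hypothetical sketch of a transform step: remodelling OLTP rows into
# a dimension table for the OLAP warehouse. Column names are invented
# for illustration only.
def build_staff_dimension(rows):
    """Deduplicate operational staff records into dimension rows,
    keyed by staff_id, keeping the latest version of each record.
    Assumes rows are ordered oldest to newest."""
    dimension = {}
    for row in rows:
        dimension[row["staff_id"]] = {
            "staff_id": row["staff_id"],
            "name": f'{row["first_name"]} {row["last_name"]}',
            "department": row["department"],
        }
    return list(dimension.values())
```

The same reshape is a one-liner in pandas (`drop_duplicates` keeping the last record per key), which is why it was in our stack.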
Challenges Faced
- Differences between column names on the provided ERD and the actual data warehouse – this led to us having to refactor parts of our transform app, but was a good experience, as it enabled us to practise being flexible and responding to changing conditions.
- Deploying the load app using SQLAlchemy, psycopg2 and pandas – this made our deployment package too large for an AWS Lambda, so we had to learn how to use SQLAlchemy with pg8000 rather than psycopg2.
- Mock testing – in order to test our functions sufficiently, we had to learn and practise testing techniques we weren't particularly familiar with before the start of the project, which proved to be a very useful challenge.
- GitHub – making sure we were on separate branches and regularly merging to and pulling from main.
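The mock-testing approach mentioned above can be sketched with the standard library alone. Here, `archive_to_lake`, the bucket name and the key are hypothetical stand-ins for our real ingestion code, and a `MagicMock` replaces the boto3 S3 client so the test never touches AWS:

```python
from unittest.mock import MagicMock

def archive_to_lake(s3_client, bucket, key, body):
    """Write raw extracted data to the data lake bucket and
    return the object's S3 URI. (Illustrative helper only.)"""
    s3_client.put_object(Bucket=bucket, Key=key, Body=body)
    return f"s3://{bucket}/{key}"

# A MagicMock stands in for the boto3 S3 client, so we can assert
# on the call our code made without any real AWS interaction.
mock_s3 = MagicMock()
uri = archive_to_lake(mock_s3, "ness-lake", "sales/2024-01.json", b"{}")
mock_s3.put_object.assert_called_once_with(
    Bucket="ness-lake", Key="sales/2024-01.json", Body=b"{}"
)
```

As for the psycopg2-to-pg8000 swap: on the SQLAlchemy side this is largely a matter of changing the connection URL dialect from `postgresql+psycopg2://…` to `postgresql+pg8000://…`, since pg8000 is a pure-Python driver with no compiled dependencies to bloat the Lambda package.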