Data Analytics Engineer
Permanent - Full Time | £50,000 / year
Job Overview
At Jersey Post, we’re on a mission to transform the way we use data—and we need a Data Analytics Engineer to help us make it happen.
In this role, you’ll design and build robust, scalable data pipelines and analytics solutions that empower the business to make smarter, faster decisions. You’ll be the bridge between operational systems, cloud platforms, and cutting-edge analytics tools—ensuring that high-quality, well-structured data is always at our fingertips for reporting, predictive modelling, and optimisation.
This isn’t just about moving data. It’s about driving impact: improving customer experience, boosting operational efficiency, and unlocking strategic insights that shape the future of Jersey Post. You’ll operate with autonomy, continuously enhancing the reliability and maturity of our data platform, and playing a key role in our digital transformation journey.
Key Responsibilities
- Design and Deliver: Translate core business processes into clear, trusted data models and datasets for reporting and analytics.
- Data Engineering: Build and maintain scalable data pipelines across on-premises and Azure platforms, using tools such as Databricks, Fabric, SSIS, Azure Data Factory, APIs, and event-driven patterns.
- Analytics Development: Create and optimise SQL transformations, Power BI datasets/reports (including DAX), and Databricks or Microsoft Fabric notebooks using SQL and Python.
- Quality and Performance: Apply data quality, validation, and observability practices; troubleshoot and optimise pipelines, jobs, and reports for reliability and cost efficiency.
- Integration: Ensure interoperability between legacy systems and modern cloud solutions.
- Collaboration: Work closely with stakeholders to gather requirements and deliver solutions aligned with business outcomes.
- Operational Support: Participate in monitoring, incident response, and service desk support for data platforms and analytics products.
- Continuous Improvement: Identify opportunities for automation and enhanced data availability; contribute to CI/CD pipelines and follow best practices using Git and Azure DevOps.
- Team Contribution: Share knowledge, maintain documentation, and mentor associate engineers through peer reviews and guidance.
Skills, Knowledge and Expertise
Impact & Mindset
- Delivers high-quality data pipelines and analytics solutions with minimal supervision.
- Owns key data domains end-to-end, improving reliability, performance, and automation.
- Enables insight across customer, operational, and sustainability initiatives.
- Recognised as a dependable contributor with a curious, proactive mindset.
Essential Technical Skills
- Strong SQL skills (query optimisation, data modelling), including dimensional modelling expertise.
- Experience with Microsoft data platforms and tools (SSIS, Azure Data Factory, Databricks/Microsoft Fabric).
- Working knowledge of Python/PySpark for data transformation and automation.
- Integration experience with APIs and Power BI (datasets, semantic models, DAX).
- Familiarity with Azure Data Lake/Fabric storage, Git, and CI/CD pipelines.
- Understanding of scalability, observability, security, and cost control.
Desirable
- Near-real-time/event-driven ingestion; forecasting and optimisation use cases.
- Exposure to machine learning concepts and Python libraries (pandas, numpy, scikit-learn).
- Awareness of AI techniques (NLP, anomaly detection) and their business value.
- Certifications in Fabric or Databricks.
Behaviours
- Takes ownership and accountability for outcomes.
- Communicates clearly and collaborates effectively across teams.
- Proactive in learning and receptive to feedback.