Analytics Engineer
We usually respond within a week
Shine exists to help freelancers and small business owners reclaim the joy of working for themselves.
Running a business shouldn't mean drowning in financial admin - it should be inspiring and rewarding. Our app brings banking, invoicing, accounting and admin together in one place, so entrepreneurs can focus on what matters most: growing their business and enjoying the freedom of working for themselves.
We're a multicultural team of over 400 people across France, Germany, Denmark and the Netherlands. By bringing together leading European fintechs like Shine, Kontist and Tellow, we've built a single, intuitive platform designed for simplicity, speed and accuracy - backed by local, award-winning support.
Your hiring experience matters
Just as we respect our customers' time, we respect yours. Your experience with Shine should feel simple, transparent and genuinely supportive.
If this sounds like somewhere you want to grow, we'd love to hear from you.
The Data team at Shine
Our Data team is organised into three complementary pillars: Data Engineering, Analytics Engineering, and Product/Revenue Analytics. This structure ensures analysts spend less time fixing broken models and more time generating high-value insights.
Analytics Engineering sits at the core of this model, acting as the bridge between raw data infrastructure and business-ready analytics. We are now looking for a Senior Analytics Engineer to help build and scale a unified, dbt-based modelling framework on Snowflake, enabling consistent, cross-entity analytics across Shine's growing portfolio of European products.
Your role as an Analytics Engineer
What's in it for you?
Platform-Level Impact: Your models will power the analytical output of the entire Data organisation, from executive steering dashboards to self-serve analytics for Product, Revenue, CS, and Banking teams.
Greenfield Architecture: Join at a pivotal moment as we consolidate SB & CPA data sources into a unified Shine reporting layer, with significant influence over how the foundation is built.
Cross-functional Visibility: Work closely with Data Engineers, Data Analysts, and senior stakeholders to shape how data is modelled, governed, and consumed across the organisation.
A Stack in Progress: We work with Snowflake, dbt, Python, and Omni — and we're actively building and improving our data foundation. You'll have real ownership over how the stack evolves, not just maintain what's already been figured out.
A hybrid work policy: Enjoy a balanced mix of remote work and office collaboration.
Modern, centrally located offices: Work from modern office spaces in prime city locations.
An international environment: Join a diverse team with colleagues from across the globe.
Mobility across locations: Opportunity to work from our offices in Copenhagen, Paris, Amsterdam, or Berlin.
Lunch on us: When you're in the office, we've got lunch covered — a daily perk to keep you fuelled and give the team a moment to connect.
Your responsibilities will include:
Unified Data Modelling: Design, build, and maintain scalable dbt Core models on Snowflake that support cross-entity analytics across SB & CPA domains, including revenue, NPS, and customer activity.
Analyst Enablement: Ensure Data Analysts can deliver the majority of their analytical work through well-documented, reliable, and reusable dbt models, reducing ad-hoc data wrangling and accelerating insight delivery.
Data Governance & Quality: Own model testing, documentation, and data quality frameworks. Define and enforce dbt best practices across the Analytics Engineering team.
Tooling & Automation: Drive AI-assisted workflows within the AE function, including automated code review via GitHub Copilot and LLM context management for the dbt repository.
Cross-team Collaboration: Work closely with Data Engineering on ingestion pipelines and with Data Analysts on business use case requirements, ensuring the modelling layer serves both technical and business needs.
Standards & Best Practices: Evolve and enforce internal standards for SQL styling, model architecture, peer review, and documentation across the team.
📍 Job located in Berlin or Copenhagen, with possibility of two remote working days per week
About you
Experience: 5+ years in Analytics Engineering or a closely related role (e.g. Data Engineering with strong modelling focus), ideally in a fast-paced, multi-product environment.
Technical Excellence: Expert-level SQL and deep hands-on experience with dbt — including modelling patterns, testing frameworks, and documentation standards.
Platform Proficiency: Strong working knowledge of Snowflake or similar cloud data platforms (e.g. BigQuery, Redshift); familiarity with data ingestion tools and BI platforms (e.g. Omni, Looker) is a strong plus.
Engineering Mindset: You treat data models as production-grade software — with version control, CI/CD, and maintainability built in from the start.
Communication: Able to engage effectively with both technical peers and non-technical stakeholders, translating business requirements into robust data models.
Language: Fluent in English, both written and spoken; excited to work in a diverse, international setting.
Equal Opportunity Employer
We follow the principle of equal treatment when considering all job applicants and, in accordance with applicable law, do not discriminate on the basis of gender, sexual orientation, colour, racial or ethnic origin, religion, or disability.
Our recruitment process
1️⃣ Screening Call: An initial screening call with a TA partner.
2️⃣ Hiring Manager Call (30’): A discussion with August (Analytics Engineering Manager) about your previous experience and the role we’re offering.
3️⃣ Case Study Presentation (60’): Evaluation of your modelling approach, architectural thinking, and stakeholder communication.
4️⃣ Logical & Personality Assessment (30–45’): Followed by a meeting with a few team members and a discussion with the VP of Data & Analytics about collaboration and strategic vision.
- Department: Product & Technology
- Role: Platform
- Locations: Berlin, Copenhagen