About the Role
We are looking for a skilled and motivated Senior Analytical Data Engineer to join our team and drive the development of scalable, efficient data solutions. You will play a crucial role in designing and building ETL/ELT pipelines, optimizing data architectures across source and target systems, and enabling business insights through effective data models. This role offers the opportunity to collaborate with cross-functional teams and work on high-impact projects that align data engineering practices with business goals.
Technologies you will be working with:
• SQL, Python, Google Cloud Platform (GCP), and Google BigQuery
• Data pipeline tools such as Apache Airflow and dbt
• Infrastructure tools like Terraform
• Data modelling concepts, including dimensional and star schema design
• Data visualization tools like Looker
• NoSQL and SQL databases
• Data pipelines for Braze (Customer Engagement Platform) and Segment CDP (Customer Data Platform)
Role Responsibilities
• Design, implement, test and maintain scalable and efficient ETL/ELT pipelines to ingest and transform large datasets from diverse sources.
• Architect, build, test and optimize data warehouses and data lakes, focusing on data consistency, performance, and best practices for analytics.
• Work closely with data science, analytics, and business intelligence teams to understand data needs, translate requirements into data models, and develop reliable data sources.
• Implement data quality checks, validation rules, and monitoring to ensure data accuracy, integrity, and security. Contribute to data governance initiatives.
• Continuously monitor and enhance database and query performance, identifying
opportunities to streamline and improve data retrieval times.
• Define and design frameworks for monitoring data quality and data pipelines.
• Monitor Google BigQuery consumption.
• Evaluate, select, and integrate new tools and technologies to enhance data capabilities. Automate manual processes to improve efficiency and accuracy.
• Mentor junior engineers, sharing best practices in data engineering and analysis, and helping to build a culture of technical excellence within the team.
• Adhere to development guidelines and quality standards.
What you will be doing
• Within 3 months you will:
• Gain an understanding of our data ecosystem, technologies, and business requirements.
• Develop foundational ETL/ELT pipelines and start contributing to existing projects.
• Collaborate with cross-functional teams to translate business needs into technical requirements.
• By 3-6 months you will:
• Lead the development of data models and pipelines, ensuring scalability and efficiency.
• Implement data quality checks, validation, and monitoring tools to enhance data accuracy.
• Start optimizing database performance and query efficiency.
• Collaborate closely with data analysts from the Marketing Operations team to gain insights, share knowledge, and ensure alignment with business goals.
• By 6-12 months you will:
• Take ownership of key projects and drive initiatives to enhance data capabilities.
• Evaluate new technologies to improve data architecture and integrate them into existing systems.
• Mentor junior team members, contributing to a collaborative and innovative team culture.
About You
You are a technically skilled and analytical individual with 6+ years of experience in
data engineering. You excel in building scalable data solutions and thrive in fast-paced
environments. You are comfortable working with large datasets, optimizing data flows, and
translating complex data needs into actionable data models. You bring a blend of technical
expertise and problem-solving skills and are eager to contribute to the team’s success.
Technical Skills & Experience
• Strong experience in designing and building scalable, efficient ETL/ELT pipelines to
ingest, process, and transform data from multiple sources.
• Strong proficiency in SQL, Python, and data pipeline tools such as Apache Airflow, dbt, or similar.
• Strong understanding of relational databases, and experience with both SQL and
NoSQL databases.
• Strong experience with Google Cloud Platform and Google BigQuery.
• Strong skills in data modelling concepts, including dimensional modelling, star/snowflake schema design, and building scalable, performant data architectures.
• Ability to work with large, complex datasets efficiently and optimize data processing workflows.
• Knowledge of Terraform or other infrastructure-as-code tools.
Analytical Skills
• Strong understanding of business needs and ability to translate data requirements into actionable solutions.
• Familiarity with metrics/KPIs relevant to the business and ability to develop data models that support analysis and decision-making.
• Experience with Looker to support data visualization and deliver data-driven insights.
• Ability to communicate complex data findings to technical and non-technical stakeholders effectively.
• An understanding of basic statistical methods, predictive modelling, and data science concepts is beneficial.
• Familiarity with machine learning workflows and how they interact with data pipelines is also beneficial.
• Understanding of Data Mesh concepts.
Soft Skills
• Strong analytical and problem-solving skills.
• Ability to work in a fast-paced environment, managing multiple priorities.
• Proven ability to work closely with data scientists, analysts, product teams, and other stakeholders to align data strategies with business objectives.
• Strong communication skills to explain data engineering concepts to non-technical
stakeholders.
• Experience in mentoring junior data engineers, setting coding standards, and promoting best practices in data engineering.
• Proactively shares knowledge within the team, contributing to a culture of learning and technical growth.
At Springer Nature, we value the diversity of our teams and work to build an inclusive culture, where people are treated fairly and can bring their differences to work and thrive. We empower our colleagues and value their diverse perspectives as we strive to attract, nurture and develop the very best talent.
Education: Bachelor's degree
Skills: Data Engineering, Data Modelling, ETL, NoSQL, Python, SQL
Industry: Publishing