
Lead Data Engineer
- Limassol / Nicosia
- Permanent
- Full-Time
- Employ state-of-the-art technologies for data architecture, storage, and processing.
- Design, build, and maintain scalable, reliable data structures, pipelines, and ETL processes using SQL/NoSQL and other relevant tools to acquire and process large volumes of data from various sources.
- Develop and maintain data warehouse, data lake, and other relevant solutions to support the data science team's wide range of data requirements.
- Establish methodologies for working with large-scale datasets, applying efficient aggregation and feature engineering.
- Collaborate with cross-functional teams to define and enforce data governance policies and data security best practices.
- Ensure data quality, integrity, and security by implementing appropriate data validation and monitoring techniques.
- Identify and address performance bottlenecks in data processing and recommend optimizations to enhance system efficiency.
- Troubleshoot and resolve issues related to data quality, data pipeline failures, and data infrastructure.
- Research and stay up to date with emerging trends and technologies in data engineering and data management, and recommend innovative solutions to enhance our data infrastructure.
- Communicate results to the rest of the team and other departments.
- Guide and mentor multiple teams, fostering innovation and technical excellence.
- Present insights and model outcomes to non-technical stakeholders, translating data into actionable recommendations.
- PhD or MSc in Data Science, Physics, Mathematics, Computer Science, or a related discipline.
- 12+ years of professional experience in data science, with a strong emphasis on data engineering and fintech solutions.
- Expertise in data warehousing and working with both relational and NoSQL databases (e.g., MongoDB).
- Strong command of AWS services (MSK, Lambda, Glue, SageMaker) and infrastructure provisioning using Terraform.
- Practical experience with cloud-native data platforms.
- Familiarity with workflow orchestration tools such as Airflow or Kubeflow.
- Experience implementing CI/CD pipelines using GitLab.
- Advanced programming skills in Python, SQL, and at least one other high-level language.
- Proficient in Unix/Linux environments.
- Strong analytical and problem-solving abilities, with keen attention to detail.
- Demonstrated success in leading data science or data engineering teams.
- Effective collaborator in cross-functional, multi-stakeholder settings.
- Excellent verbal and written communication skills in English.
- Attractive remuneration package
- Private health insurance
- Corporate pension fund
- Intellectually stimulating work environment
- Continuous personal development and international training opportunities
- Let's Connect - Intro Chat with Talent Acquisition
- Deep Dive - First Interview with Your Future Team
- Bring It to Life - Role-Specific Take-Home Task
- Final Connection - Final Interview