Data Engineer

Tallinn, Estonia
1 day average response time from company
Recruiter
Viktoriya Bliznyuk
Roles:
Data
Must-have skills:
Python, SQL, Azure, Docker, Kubernetes, Kafka
Considering candidates from:
Estonia
Work arrangement: Onsite
Industry: Software Development
Language: English
Level: Senior
Required experience: 4+ years
Relocation: Not paid
Visa support: Provided
Size: 51 - 200 employees

Cargoo is a digital supply chain solution that provides full transparency from source to market. It facilitates communication, data sharing, and execution across your whole network, giving you greater control of logistics from start to finish. Cargoo seamlessly executes your supply chain plan, anticipates disruptions proactively, reduces manual work, and gives you constant access to performance data, so you can work smarter, not harder. With Cargoo, no news really is good news.
Tasks:
  • Shape and contribute to Cargoo’s data engineering strategy, including the design, architecture, and implementation of scalable data solutions
  • Build, maintain, and evolve the Data Platform to support advanced analytics, machine learning, and real-time decision-making
  • Collaborate closely with AI engineers, BI analysts, and business stakeholders to translate data needs into robust technical solutions
  • Identify data opportunities across the organization and enable teams to extract maximum value from data assets
  • Design and implement ETL/ELT pipelines, including batch and streaming data workflows
  • Develop and maintain data models using dimensional and advanced modeling techniques
  • Support deployment, monitoring, and lifecycle management of data applications
  • Apply DataOps best practices to ensure reliability, scalability, and quality of data pipelines
  • Participate in code reviews and promote high engineering standards
  • Take ownership of projects end-to-end, ensuring timely delivery and measurable impact
Must-have:
  • Solid proficiency in Python (runtime environment, package management) and SQL (DML, DDL)
  • Hands-on experience with SQL Server / Azure SQL Server
  • Experience working with cloud platforms, preferably Microsoft Azure
  • Familiarity with the modern data stack, including tools such as dbt and orchestration frameworks (Airflow, Dagster, or similar)
  • Strong understanding of ETL/ELT concepts, data architecture, and data modeling
  • Experience with streaming technologies (Kafka or equivalent)
  • Experience with Docker and container orchestration technologies
  • Experience deploying and monitoring applications on Kubernetes (K8s)
  • Knowledge of application lifecycle management
  • Understanding and application of DataOps practices
  • Strong project management, execution skills, and a clear sense of ownership and accountability
Benefits and conditions:
  • Trial period: 4 months 
  • Work with a cutting-edge data stack to power real-time, reliable, and beautifully orchestrated data workflows
  • Enjoy Stebby perks or health insurance, plus snacks and free parking
  • Work in a modern city-center office with great tools and a collaborative atmosphere
  • Team events and a culture that values balance 
Interview process:
  1. Intro call with Toughbyte
  2. Culture fit interview with the company's recruiter 
  3. A call with the tech lead and local HR
  4. A meeting in the office 