Data Engineer

Vienna, Austria
Recruiter:
Caroline Gachigua
Roles:
Data
Must have skills:
Python, SQL
Nice to have skills:
Spark
Considering candidates from:
Austria
Work arrangement:
Onsite only
Industry:
Investment
Language:
English
Level:
Middle
Relocation:
Not paid
Visa support:
Not provided
Company size:
11-50 employees
Data Engineer

Vienna, Austria
Massar is a technology-led hedge fund founded in 2015, passionate about collecting, cleaning, transforming, and storing data to feed its trading strategies. Its data-driven trading models draw on a variety of open-source technologies and multiple data sources, from vendor APIs to a fleet of web crawlers running 24/7, to make trading decisions in global markets. The firm manages assets on behalf of a diverse range of clients, including pension plans, insurance companies, financial institutions, family offices, and qualified individual investors.
Role Overview
Partnering with the investment team and data scientists, you will develop solutions that enable the team to efficiently extract insights from data. This includes owning data ingestion (web scrapes, S3/FTP sync, API collection), transforming the data into actionable insights (Spark, SQL, Kafka, Python), storing it, and designing the interfaces (APIs) through which it is consumed.
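For context only, a minimal Python sketch of the ingest-transform-store pattern described above might look like the following; the API endpoint, field names, and table are hypothetical assumptions, not details of the actual role or stack:

# Hypothetical ingest -> transform -> store example (illustrative only).
import sqlite3
import requests

def ingest(url: str) -> list[dict]:
    """Collect raw records from a (hypothetical) vendor API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records: list[dict]) -> list[tuple]:
    """Standardize and cleanse: keep only rows with a symbol and a price."""
    return [
        (r["symbol"].upper(), float(r["price"]))
        for r in records
        if r.get("symbol") and r.get("price") is not None
    ]

def store(rows: list[tuple], db_path: str = "prices.db") -> None:
    """Persist cleaned rows so downstream consumers can query them via SQL."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS prices (symbol TEXT, price REAL)")
        conn.executemany("INSERT INTO prices VALUES (?, ?)", rows)

if __name__ == "__main__":
    store(transform(ingest("https://example.com/api/prices")))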

Tasks:
  • Design and implement scalable ETL solutions (structured and unstructured; batch and streaming).
  • Build data pipelines and standardize and cleanse data for the investment team.
  • Define and set best-practice standards for data (e.g., data modeling, database design, ETL design, job scheduling, and monitoring).
Must-have:
  • Bachelor’s degree in a technology-related field (e.g., Engineering or Computer Science) is required; a Master’s or Ph.D. would be ideal
  • Proficiency in relational and NoSQL database engineering is required
  • Strong programming background with experience coding in Python
  • Knowledge of developing highly scalable distributed systems using open-source technologies, such as Spark, Dask, or Hadoop
  • Ability to succinctly present recommendations at an executive level
  • Strong analytic and strategic thinking skills
Nice-to-have:
  • Experience with any of the following systems/tools is a plus: Apache Airflow, Jupyter, Kafka, Docker, Kubernetes
  • Experience with CI/CD
  • Familiarity with the mechanics of securities markets and exchanges is a plus but not required
Benefits and conditions:
  • Annual bonus
  • Health benefits 
Interview process:
  1. Intro call with Toughbyte
  2. Coderbyte challenge (75 min)
  3. Call with the Hiring Manager (30 min - 1 hr) 
  4. Call with the Tech lead (30 min - 1 hr)
  5. Call with the Founder (30 min - 1 hr) 
  6. Technical Interview