Remote NLP Engineer

Daria Elliott
Considering candidates from:
Belarus, Ukraine, and the Schengen Area
Work arrangement:
Remote only
Industry:
Developer tools
Company size:
2-10 employees
Archipelo is building an intelligent code discovery platform that provides the best tools for developers to discover code in any form—and benefit through insights, recognition, and greater productivity. They are transforming code search to improve the practice of modern programming—using a graph-based approach drawing on data from the entire open source ecosystem. They're on a mission to build the world's best code discovery engine. Archipelo is well-funded by top investors in Silicon Valley, including the first investors of Google, Twitter, Zoom, LinkedIn, and Uber. Their team has backgrounds from NASA, LinkedIn, Facebook, Amazon, AWS, Cisco and MIT, Harvard, Stanford, and Berkeley.
Right now, they are seeking a Senior NLP Engineer to lead technology development on the frontier of code discovery and developer productivity. A successful applicant is an expert in data science, machine learning, and complex data analysis spanning natural language, code syntax, and networks.
Responsibilities:
  • Build complete data processing systems that drive products, systems, or applications
  • Lead experimentation processes that accelerate prototyping and maximize resource utilization
  • Build and operate data pipelines for machine learning: scheduling, ETL, dataflow programming, SQL, data labeling, representation learning, hyperparameter tuning, and model management
  • Produce and deploy internal and external APIs
  • Design and implement predictive models on multiple decision platforms
  • Apply the latest techniques from research and academia to real-world problems in the production environment
Must-have skills:
  • Expertise in Natural Language Processing and Understanding (NLP & NLU)
  • Expertise in microservices and cloud computing—in at least one cloud platform
  • Familiarity with distributed systems and with orchestrating large numbers of independent commodity machines into complete, functional systems that handle diverse workloads
  • Expertise performing data science research
  • Expertise writing world-class Python code
  • Experience coding in Go
  • 10+ years of professional data science or software engineering experience (or somewhat less with highly relevant experience)
Nice-to-have skills:
  • PhD in computer science, artificial intelligence, machine learning, or a related technical field
  • Advanced working knowledge of information retrieval and search technologies, including hands-on experience setting up and querying open-source search systems
  • Experience with many of the following technologies:
    • Elasticsearch, Solr, or equivalent
    • Kubernetes
    • Machine learning infrastructure
    • Deep learning
    • Relevance engineering
    • CircleCI, GitHub Actions, Jenkins, or equivalent
    • Any graph database
Benefits:
  • Stock options
  • Paid vacation and sick leave
  • A strong remote work culture that includes group activities and local gatherings
Interview process:
  1. Vision and Opportunity Interview 
  2. Culture and Operations Interview 
  3. First Technical Interview with Technical Challenge (20-25 mins)
  4. Second Technical Interview
  5. (Optional) Third Technical Interview
  6. Compensation discussion 
  7. (Optional) Meeting with Investor to Close
Check out the answers to frequent questions about this position below. Can't find the answer you're looking for? Ask us via email or try the company page.

Is part-time work possible?

Unfortunately, no. The company is looking to hire a full-time employee.
