DevOps Engineer

Location: Fully remote, Ukraine

On behalf of Adthena, Efisco is looking for a DevOps Engineer to join the fully remote team on a full-time basis. We are seeking an experienced engineer to help evolve and improve our existing architecture and data infrastructure.

About Adthena

Adthena’s mission is a world of search transparency where precise ads connect marketers to consumers. This mission is central to our ethos at Adthena and is backed by our Whole Market View technology: a dynamic, AI-driven data model that is unique to each advertiser and represents their entire relevant search landscape.

Powered by its patented machine learning technology, Whole Market View provides the comprehensive data scope and quality required by the world’s leading advertisers to precisely assess competitive opportunities at scale across their entire market, without limitations. Adthena indexes information hourly, processing over 10TB of new data, 500 million adverts and 200 million keywords across 15 different languages each day. 

Things that make Adthena unique

  • Machine-Learned Whole Market View
  • Supervised and Unsupervised Learning
  • Convolutional Neural Networks
  • Natural Language Processing
  • Word Vector Embeddings (Word2Vec)
  • Built for Client Value and Outcomes
  • World-Class Customer Success

Reasons why you should join the Adthena team @ Efisco

  • Startup engineering culture
  • Good work/life integration
  • Your work is visible and tangible
  • Your input is heard and implemented
  • Vacations: annual leave plus the Christmas week
  • Paid sick leave
  • Individual coaching programs
  • Monthly hackdays
  • Monthly socials and company-wide retreats (once pandemic restrictions ease)
  • Free trainers when you join our team
  • Social activities to join
  • An extensive training base

As a DevOps Engineer, you will work across our entire stack, so a genuine passion for driving the product and technology forward is something we value. Your responsibilities will include shaping a vision for the future architecture of this complex data system and contributing innovative ideas that use cutting-edge technology. You will work closely with the Data, Data Science, and Web teams to deliver high-quality architecture and automation solutions.

Responsibilities

  • Create, monitor, and maintain Spark clusters
  • Build and maintain complex ETL workflows (Luigi, Airflow)
  • Automate the provisioning, management, and monitoring of Adthena’s data platforms and architecture
  • Manage the deployment, orchestration, and monitoring of the data processing and dissemination platform
  • Design and implement CI/CD pipelines covering infrastructure, code build, and deployment
  • Continuously strive for operational excellence by automating away manual processes
  • Develop internal tooling and capabilities to support our engineers and SaaS solutions
  • Work closely with other teams to create ultra-scalable, flexible, and highly reliable systems
  • Troubleshoot, diagnose, and fix production software issues
  • Proactively perform software maintenance and configuration
  • Work with AWS technologies and improve infrastructure security and resiliency
  • Collaborate with the Technology Leadership team to define our DevOps strategy and roadmap
  • Collaborate with teams to develop, maintain, and optimize our operational systems, services, and infrastructure
  • Effectively communicate data findings and help the team and the business make data-driven decisions
  • Take ownership of and pride in the products we build, and ensure they are always of the highest standard
  • Be empathetic towards team members and customers

Requirements

  • Bachelor’s degree in Computer Science, a similar technical field of study, or equivalent practical experience
  • Proven experience as a DevOps engineer in a distributed data environment
  • Commercial experience with the Spark and Hadoop stack
  • Confidence writing production-quality code, preferably in Python and SQL
  • Experience with Ansible, Terraform, or other configuration management and infrastructure-as-code tools
  • Experience building and automating deployment pipelines using CI/CD tools
  • Experience working on complex distributed systems and with version control systems
  • Exceptional analytical, quantitative, problem-solving, critical thinking and communication skills
  • Good knowledge of the core technologies we use 

Tech stack used by the Data Engineering team

Java, Scala, JavaScript, Python, SQL, Bash, Spark (Scala), DropWizard, React, Backbone, Akka, Play Framework (Scala), Flask, PostgreSQL, AWS (S3), Redshift, Redis, MongoDB, Cassandra, RabbitMQ (messaging), Quartz (scheduling), Maven, TeamCity, Jenkins, Docker, Kubernetes, Ansible, Terraform, ELK, Grafana, Prometheus, Git (GitHub), IntelliJ IDEA, Jira, Kibana.

If you are interested, please send your detailed CV to job@efisco.net or fill in the form below.