DevOps / Infrastructure Engineer, Datapane | #python | #jobs

Job Title

DevOps / Infrastructure Engineer

Job Description

Datapane is looking for an exceptional and experienced DevOps / infrastructure engineer to join our team. This role covers DevOps, infrastructure, and site reliability engineering; you can think of it as involving everything around what runs, where it runs, and keeping it running!

Our scale and workloads present unique challenges, which makes this role hugely important as we grow. Datapane’s public API is the most popular way to share and embed data science documents on the web. Additionally, we provide both a managed/automated cloud solution and an on-premise solution for customers, which includes serverless execution of their data science workloads and hosting of their internal data science reports.

The role has many facets, including:

  • As a DevOps engineer, you’ll be working on and supporting the development and deployment of Datapane and its associated components. This covers working on our CI/CD systems (using Pulumi and GitHub Actions), including those for our open-source repos; developing build and deployment scripts based on container technologies (e.g. buildah and podman); backups; testing; and more.
  • On the SRE side, configuring and maintaining the managed Datapane PaaS, which includes our highly available public product and managed organisational private workspaces – all automated with Python + Pulumi and running on Kubernetes on GKE, in conjunction with other GCP components including Postgres, Redis, and blob storage backends.
  • Developing a system for shipping on-premise solutions to customers who require Datapane in their private environments, such as VPCs (on AWS / Azure) or bare metal. This is currently built on Docker Compose along with cloud-specific components and provisioning systems (e.g. Terraform, CloudFormation).
  • Collaborating with the platform team on infrastructure features, such as Linux sandboxing technologies to safely run users’ code, or integrations with other platforms, e.g. hosted data warehouses.

This will be a flexible role with a lot of potential. You might be orchestrating our CI/CD system to handle building and deploying a product used by tens of thousands of public users and organisations; optimising container build and distribution workflows; working on on-premise and cloud solutions; or pairing on new plugins to integrate Datapane with different data-stores and big data tools. You’ll also be helping to improve our automated deployment, management, and monitoring of customer workloads on the Datapane hosted platform.

The core Datapane platform is a Python system comprising an OSS library and a free / managed Django server. On the infrastructure side, we try to stay as cloud-agnostic as possible, and can run on any IaaS, but are focusing on our own hosted PaaS on GCP.

Our tech stack:

Our tech stack is Python, Django, and Kubernetes on the backend. On the front-end, we use a mixture of React, HTMX, Alpine.js, and TailwindCSS. Under the hood, the work we do introduces interesting challenges across distributed systems, containerisation, scalability, and large-scale data processing. We try to wrap this up as a simple and delightful product which users love.

Company Benefits

  • Fully remote
  • Flexi-working
  • Share options
  • Individual Learning and Development budget
  • Laptop of choice
  • Individual annual remote working budget
  • Co-working allowance
  • Workplace pension
  • Quarterly team hackathons at locations across the UK


Requirements

  • Linux and Docker/container knowledge are essential, as is experience with the main cloud platforms – we have several customers running automated setups on AWS, Azure, and GCP
  • Prior Kubernetes experience is highly desirable but not required; however, you should be ready and willing to learn K8s as an administrator and operator
  • Working Python knowledge – we try to stick to only a few languages across our stack to maximise reuse, hence our product and most of our deployment scripts and commands use Python
  • General systems knowledge, for instance, Postgres setup, Redis, networking, load balancing, HTTP caching, SSL cert management, etc. to help maintain the Datapane PaaS
  • Knowledge of the modern data stack and interest in data engineering would be highly useful, as we grow to make use of tools such as Apache Airflow and data formats such as Apache Arrow and Parquet

You should be interested in developing best practices for running highly available and highly performant data services, and excited to pick up new technologies and skills to do so. Lastly, you should like the idea of releasing to real customers regularly, and prioritise getting a great product into users’ hands for feedback and iteration. You will have extensive scope to build and architect the backend, and to help grow the team in the future.

Note this role is remote, but we are hiring in the UK and Europe only. Please do not apply if you are not within these geographies.

Interview Process

  • Application
  • Introductory video call with CTO (15m)
  • Technical Interview with CTO and another Engineer (60m)
  • Values Interview with Founders (60m)
  • Offer!

About the Company

Datapane is the frontend for the data science ecosystem. Our open-source library helps data scientists use the tools they love to create reports, dashboards, and apps for non-technical end-users.

We are backed by some of the top investors in the world, and have grown to be the most popular way to create and share data science reports. We are proud to put the power of the open-source ecosystem in the hands of over 50,000 end-users each month.

Take a look at our Employee Handbook to learn about the application process and how we work.
