Remote job
Data Engineer

Location

Worldwide

Company

Web Summit

Job Description

About us:

In the words of Inc. magazine, “Web Summit is the largest technology conference in the world”. Forbes says Web Summit is “the best tech conference on the planet”, Bloomberg calls it “Davos for geeks”, Politico “the Olympics of tech”, and the Guardian “Glastonbury for geeks”.

Whatever Web Summit is, it wouldn’t be possible without an incredible team of nearly 200 employees, including world-class engineers, data scientists, designers, producers, marketers, salespeople and more.

We’ve disrupted an old industry by building incredible software and designing mind-blowing events. We’ve revolutionised how people come together in our world. We started with one event: Web Summit. But now we’re creating category-defining events all over the world, from Asia to North America.

We’re just getting started.

About the team:

We’re looking for a Data Engineer to join our growing Data Science team at Web Summit to build out our data platform and deliver key data services for our internal users. The team builds custom data pipelines, data models, and dashboards, and manages all the underlying infrastructure.

You will work closely with end-users to help them get insights and automate parts of their deliverables across all our conferences: Web Summit in Lisbon, RISE in Hong Kong and Collision in Toronto.

What you’ll achieve at Web Summit:

  • You’ll work closely with the Data Science team to build data products for Web Summit teams.
  • You’ll work across our data stack (Python, Presto, Superset and Metabase) to deliver quality, maintainable and reliable data products that support decisions for the best conferences in the world.
  • You’ll collaborate and learn with other team members via mentoring, code reviews, and technical talks.
  • You’ll learn about and champion best practices in code, architecture and development processes.

Who you are:

  • You have an analytical mind and love solving problems.
  • You care deeply about data engineering culture and the quality of products you and your team build.
  • You love working with data and new technologies.
  • You’re knowledgeable about a broad spectrum of products and recent developments, and can make informed choices about which technologies to adopt.

Skills and abilities we’re looking for:

  • 3+ years of experience in designing and building data pipelines (ETL or ELT), preferably using Python.
  • 3+ years of experience in writing complex SQL queries.
  • 3+ years of experience building data models from different data sources.
  • Comfortable building BI dashboards, data visualisations and/or automated reporting (including requirement gathering).
  • Ability to understand large, complex distributed systems with many moving, interrelated parts.
  • Linux experience is ideal.
  • Good understanding of AWS.
  • DevOps experience is a big plus, especially with Terraform and Kubernetes.

Ready to Apply?

Apply Now