This vacancy has been archived; applications are no longer accepted.

Senior Software Engineer, Database Infrastructure Service

at Airbnb
šŸ“ United States
SENIOR
āœ… Remote


Used Tools & Technologies

Not specified

Required Skills & Competences

Kafka (4), Kubernetes (4), Python (4), Scala (4), Spark (4), GCP (4), Java (4), Airflow (4), Distributed Systems (6), Flink (4), Leadership (4), AWS (4), Communication (7), Debugging (4)

Details

Airbnb was born in 2007 when two hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

Responsibilities

  • Build and operate data ingestion systems that enable various ways of accessing data at Airbnb, including ingesting database data into the warehouse in various formats and at various frequencies, and streaming change data capture (CDC) in near real time.
  • Be hands-on with coding, design, and testing, and collaborate with cross-team partners (internal customers, dependencies, and leadership) to deliver multi-month projects in a timely manner.
  • Raise operational standards by proactively identifying, debugging, and fixing operational issues. Participate in the on-call rotation for the DBExports platform.
  • Mentor junior engineers on the team.
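To make the CDC responsibility above concrete: a CDC pipeline replays an ordered stream of insert/update/delete events from a source database into a warehouse-side copy. The sketch below is a minimal, hypothetical illustration in Python (one of the languages the posting lists); the event shape and field names are assumptions for illustration, not Airbnb's actual schema or tooling.

```python
# Hypothetical sketch of applying CDC events to a warehouse-side table,
# modeled here as a dict keyed by primary key. Real systems (e.g. Debezium
# events consumed from Kafka) carry richer metadata, but the core replay
# logic is an upsert-or-delete per event.

def apply_cdc_event(table, event):
    """Apply one CDC event (insert/update/delete) keyed by primary key."""
    op = event["op"]
    pk = event["key"]
    if op in ("insert", "update"):
        table[pk] = event["row"]   # upsert the latest row image
    elif op == "delete":
        table.pop(pk, None)        # tolerate deletes for already-absent keys
    else:
        raise ValueError(f"unknown op: {op}")
    return table

# Replaying the ordered stream rebuilds the current source-table state.
events = [
    {"op": "insert", "key": 1, "row": {"id": 1, "city": "SF"}},
    {"op": "update", "key": 1, "row": {"id": 1, "city": "NYC"}},
    {"op": "insert", "key": 2, "row": {"id": 2, "city": "LA"}},
    {"op": "delete", "key": 2, "row": None},
]
table = {}
for e in events:
    apply_cdc_event(table, e)
# table now holds only key 1, with its latest row image
```

In practice, event ordering per key and delete handling are where most of the operational complexity (and on-call debugging) in such a platform lives.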

Requirements

  • 5+ years of experience building and operating large-scale core backend distributed systems such as storage, data ingestion, backup and restore, streaming.
  • Ability to own and dive deeply into a complex code base.
  • Experience maintaining, analyzing, and debugging production systems.
  • Skill in writing clean, readable, testable, and maintainable code.
  • Strong collaboration and communication skills in a remote-working environment.
  • Strong ownership and a consistent record of timely delivery.
  • Experience with Java, Scala, or Python.

Nice to Haves

  • Experience building large-scale data exports and ingestion platforms.
  • Experience with large-scale distributed databases.
  • Experience with AWS and/or GCP.
  • Experience working with Spark, Kafka, Flink, Kubernetes, Airflow, AWS Aurora, or TiDB.

Benefits

  • Role may be eligible for bonus, equity, benefits, and Employee Travel Credits.
  • Remote-eligible within the United States, with restrictions in certain states.
  • Occasional office or offsite attendance as agreed with manager.
  • Commitment to inclusion and belonging with a disability-inclusive application and interview process.