Senior Software Engineer, Data Platform (Streaming)
Used Tools & Technologies
Not specified
Required Skills & Competences
Kafka @ 4, DevOps @ 4, Python @ 7, Spark @ 4, GCP @ 4, Java @ 7, Flink @ 4, Machine Learning @ 4, AWS @ 4, Streaming Data Processing @ 4, Databricks @ 4
Details
Ready to be pushed beyond what you think you’re capable of?
At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us every day, as we build the emerging onchain platform — and with it, the future global financial system.
To achieve our mission, we’re seeking a very specific kind of candidate: someone who is passionate about crypto and blockchain technology, eager to leave their mark, enjoys working with high-caliber colleagues, and actively seeks feedback in order to improve. The work culture is intense, and although the company is remote-first, the role requires in-person participation several times a year.
The Data Platform team builds and operates the systems that centralize internal and third-party data for analytics, machine learning, and end-user experiences. The engineer will contribute across the full spectrum, from foundational processing and data storage to scalable pipelines and tools that make data easily accessible.
Responsibilities
- Support and operate Kafka clusters, integrating the services and features needed for full self-service and self-management (an illustrative sketch of self-service topic provisioning follows this list).
- Become an expert in operating and managing Databricks clusters as a stable, secure platform for high-volume streaming data processing and analysis.
- Build and maintain infrastructure for diverse streaming use cases, both internal and external.
- Evangelize best practices via documentation, tutorials, examples, and implementations.
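The self-service emphasis in the Kafka responsibility above can be pictured as programmatic topic provisioning. The sketch below uses the confluent-kafka Python client; the broker address, topic name, and settings are placeholders, not details from the posting.

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Minimal self-service topic provisioning sketch; broker and topic are placeholders.
admin = AdminClient({"bootstrap.servers": "broker-1:9092"})

topic = NewTopic(
    "payments.events",                      # hypothetical topic name
    num_partitions=6,
    replication_factor=3,
    config={"retention.ms": str(7 * 24 * 60 * 60 * 1000)},  # 7-day retention
)

# create_topics is asynchronous and returns a future per topic.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()                     # raises if creation failed
        print(f"created topic {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```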
Requirements
- 7+ years of relevant work experience.
- Strong backend development skills in Golang, Java, or Python.
- Experience with data systems and pipelines.
- Expertise in Cloud platforms (AWS/GCP), Kafka, Spark/Flink for scaling streaming infrastructure.
- Extensive experience building streaming applications and platforms, both as a client and as an operator.
- Well-versed in architectural design and patterns.
- Experience with core AWS services (S3, IAM, autoscaling groups) or equivalent DevOps experience.
- Hands-on experience building streaming data systems with Spark/Flink on Kafka (an illustrative sketch follows this list).
- Experience with change data capture pipelines using Debezium and Databricks.
- Experience standing up, operating, and maintaining Kafka and Spark clusters.
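As a concrete picture of the Spark-on-Kafka requirement above, the sketch below reads JSON events from a Kafka topic with Spark Structured Streaming and appends them to a Delta table. The broker address, topic, schema, and paths are hypothetical, and the Delta sink assumes a Delta-enabled runtime (for example, Databricks).

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-to-delta-sketch").getOrCreate()

# Hypothetical event schema; a real pipeline would source this from a schema registry.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("ts", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
    .option("subscribe", "payments.events")               # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers the payload as bytes; cast to string and parse the JSON value.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

(events.writeStream
    .format("delta")                                      # assumes a Delta-enabled runtime
    .option("checkpointLocation", "/tmp/checkpoints/payments")
    .outputMode("append")
    .start("/tmp/tables/payments")
    .awaitTermination())
```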
Nice to Haves
- Stateful data processing concepts and implementation (state stores, retry/idempotency logic, distributed counters/aggregators); a windowed-count sketch follows this list.
- Tiered-storage architecture for Kafka.
- Familiarity with open data lake table formats such as Apache Iceberg or Delta Lake.
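For the stateful-processing item above, a common minimal example is a per-key count over event-time windows, where a watermark bounds how long Spark keeps window state. The topic, key, and window sizes below are illustrative, not taken from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, window

spark = SparkSession.builder.appName("stateful-counter-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
    .option("subscribe", "user.clicks")                   # placeholder topic
    .load()
    # The Kafka source exposes key/value as bytes plus a broker timestamp column.
    .selectExpr("CAST(key AS STRING) AS user_id", "timestamp")
)

counts = (
    events
    .withWatermark("timestamp", "10 minutes")             # bounds the state store; drops very late data
    .groupBy(window(col("timestamp"), "5 minutes"), col("user_id"))
    .agg(count("*").alias("clicks"))
)

(counts.writeStream
    .outputMode("update")                                 # emit counts as windows are updated
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/clicks")
    .start()
    .awaitTermination())
```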
Benefits
- Medical, dental, and vision plans with generous employee contributions.
- Health Savings Account with company contributions.
- Disability and life insurance.
- 401(k) plan with company match.
- Wellness, mobile/internet, and connections stipends.
- Volunteer time off.
- Fertility counseling and benefits.
- Generous time off/leave policy.
- Option to get paid in digital currency.
Answers to crypto-related questions may be used to evaluate onchain experience.