Used Tools & Technologies
Not specified
Required Skills & Competences
Docker @ 1, Kubernetes @ 1, Google Sheets @ 4, Python @ 4, SQL @ 4, Looker @ 6, Statistics @ 3, Tableau @ 6, GCP @ 1, dbt @ 4, ETL @ 4, Airflow @ 4, GitHub @ 3, CI/CD @ 3, Data Science @ 4, AWS @ 1, Communication @ 7, ELT @ 4, Databricks @ 4, Snowflake @ 7

Details
Ready to be pushed beyond what you think you’re capable of?
At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system.
To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems.
Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be.
Responsibilities
- Develop and maintain foundational data models serving as the single source of truth for analytics across the organization.
- Empower stakeholders by translating business requirements into scalable data models, dashboards, and tools.
- Partner with engineering, data science, product, and business teams for alignment on priorities and data solutions.
- Build frameworks, tools, and workflows for efficient data usage while maintaining high data quality and performance.
- Use modern development and analytics tools to deliver value quickly with long-term maintainability.
- Build subject matter expertise in specific business and data domains, understanding full data flows.
- Build data pipelines and collaborate with partners to deliver insights and fix data gaps.
- Interface with stakeholders to generate business value from data directly or indirectly.
- Develop new abstractions to support scalable data workflows and infrastructure.
- Use established tools like Google Sheets and SQL to quickly deliver impact.
Requirements
- Strong understanding of modular and reusable data modeling best practices (star/snowflake schemas; see the illustrative sketch after this list).
- Expertise in prompt engineering and design for large language models (LLMs) such as GPT.
- Proficiency in advanced SQL techniques for querying and optimization.
- Intermediate to advanced Python skills including OOP and building scalable frameworks.
- Strong communication skills to translate technical concepts into business value.
- Experience building and optimizing ETL/ELT pipelines using tools such as dbt and Airflow.
- Proficiency in data visualization tools such as Looker, Tableau, Superset, or Python libraries (Matplotlib, Plotly).
- Familiarity with GitHub, CI/CD, and modern development workflows.
- Knowledge of modern data architectures (e.g., Snowflake, Databricks).
- Business acumen and familiarity with data analysis and statistics.
- Bonus: Experience with cloud platforms (AWS, GCP) and containerization (Docker, Kubernetes).
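To make the data modeling and SQL expectations above more concrete, here is a minimal, illustrative sketch (not part of the formal job description) of a star-schema style aggregation. It uses Python's built-in sqlite3 as a stand-in for a warehouse such as Snowflake or Databricks; all table and column names (dim_asset, fct_trades, usd_volume) are hypothetical.

```python
# Illustrative only: a star-schema style join and aggregation, using
# Python's built-in sqlite3 in place of a real data warehouse.
# Table and column names below are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical dimension and fact tables.
cur.execute("CREATE TABLE dim_asset (asset_id INTEGER PRIMARY KEY, symbol TEXT)")
cur.execute("CREATE TABLE fct_trades (trade_id INTEGER, asset_id INTEGER, usd_volume REAL)")
cur.executemany("INSERT INTO dim_asset VALUES (?, ?)", [(1, "BTC"), (2, "ETH")])
cur.executemany(
    "INSERT INTO fct_trades VALUES (?, ?, ?)",
    [(100, 1, 250.0), (101, 1, 125.5), (102, 2, 80.0)],
)

# Join the fact table to its dimension and aggregate -- the shape of query
# a foundational data model typically exposes to dashboards.
for symbol, volume in cur.execute(
    """
    SELECT d.symbol, SUM(f.usd_volume) AS total_usd_volume
    FROM fct_trades AS f
    JOIN dim_asset AS d ON d.asset_id = f.asset_id
    GROUP BY d.symbol
    ORDER BY total_usd_volume DESC
    """
):
    print(symbol, volume)

conn.close()
```

In practice, a model like this would typically live as a dbt SQL model materialized in the warehouse and surfaced through tools such as Looker or Tableau, rather than being run from a Python script.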
Benefits
- Medical, dental, and vision plans with generous contributions
- Health Savings Account with company contributions
- Disability and life insurance
- 401(k) plan with company match
- Wellness, mobile/internet, and connections stipends
- Volunteer time off
- Fertility counseling and benefits
- Generous time off/leave policy
- Option to be paid in digital currency
*Note: Applying for a specific role does not guarantee consideration for that exact position. Leveling and team matching are assessed throughout the interview process.