Required Skills & Competences
Google Sheets (3), Python (3), SQL (3), Looker (3), Statistics (2), Tableau (3), GCP (3), dbt (3), ETL (3), Airflow (3), GitHub (2), CI/CD (2), Data Science (3), AWS (3), Communication (6), Data Engineering (3), ELT (3), Databricks (3), Snowflake (6)
Ready to be pushed beyond what you think you’re capable of?
At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system.
To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems.
Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be.
The CX Analytics Engineering team bridges the gap between data engineering, data science, and business analytics by building scalable, impactful data solutions. We transform raw data into actionable insights through robust pipelines, well-designed data models, and tools that empower stakeholders across the organization to make data-driven decisions. As an Analytics Engineer on our team, you will act as a force multiplier, enabling Analytics and Operations to work seamlessly at scale. You’ll have the opportunity to translate complex technical and operational requirements into easily consumable front-end data solutions, while also heavily influencing the overarching strategy for CX Analytics and its partners.
Our team combines technical expertise with a deep understanding of the business to unlock the full potential of our data. We prioritize data quality, reliability, and usability, ensuring stakeholders can rely on our data to drive meaningful outcomes.
Responsibilities
- Develop and maintain foundational data models that serve as the single source of truth for analytics across the organization.
- Empower stakeholders by translating business requirements into scalable data models, dashboards, and tools.
- Partner with engineering, data science, product, and business teams to ensure alignment on priorities and data solutions.
- Build frameworks, tools, and workflows that maximize efficiency for data users, maintaining high standards of data quality and performance.
- Use modern development and analytics tools to deliver value quickly, while ensuring long-term maintainability.
- Build subject matter expertise in specific business and data domains.
- Understand data flows end to end, from creation through ingestion, transformation, and delivery.
- Interface with stakeholders to deliver maximum commercial value from data.
- Develop abstractions such as UDFs, Python packages, and dashboards to support scalable workflows (an illustrative sketch follows this list).
- Use established tools (e.g., Google Sheets, SQL) with mastery to deliver impact quickly.
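As a hedged illustration of the kind of reusable abstraction this role builds, the sketch below shows a small Python helper that could live in an internal package. The package idea, the metric definition, and the column names (case_id, contact_ts, resolved) are assumptions for illustration only, not an existing Coinbase artifact.

```python
# Hypothetical helper from an internal metrics package; the metric
# definition and column names are illustrative assumptions.
import pandas as pd


def first_contact_resolution_rate(contacts: pd.DataFrame) -> float:
    """Share of support cases resolved on the first contact.

    Assumes one row per contact with columns `case_id`, `contact_ts`,
    and a boolean `resolved` flag; a case counts as first-contact-resolved
    if its earliest contact resolved it.
    """
    first_contacts = (
        contacts.sort_values("contact_ts")          # order contacts in time
        .groupby("case_id", as_index=False)
        .first()                                    # keep earliest per case
    )
    return float(first_contacts["resolved"].mean())


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "case_id": [1, 1, 2, 3],
            "contact_ts": pd.to_datetime(
                ["2024-01-01", "2024-01-02", "2024-01-01", "2024-01-03"]
            ),
            "resolved": [True, False, True, False],
        }
    )
    print(first_contact_resolution_rate(sample))  # 0.666...
```

Packaging a definition like this once, rather than re-deriving it in every dashboard or ad hoc query, is one way an analytics engineer keeps a metric consistent for all stakeholders.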
Requirements
- Strong understanding of best practices for modular and reusable data models (e.g., star schemas, snowflake schemas).
- Expertise in prompt engineering and prompt design for LLMs (e.g., GPT), including optimizing prompts for reliable outputs.
- Advanced SQL proficiency for data transformation, querying, and optimization.
- Strong communication skills to translate technical concepts to business value.
- Experience building, maintaining, and optimizing ETL/ELT pipelines using tools such as dbt and Airflow (see the sketch after this list).
- Familiarity with version control (GitHub), CI/CD, and modern development workflows.
- Knowledge of modern data lake/warehouse architectures (e.g., Snowflake, Databricks) and transformation frameworks.
- Business acumen to frame and address commercial challenges through analytics engineering.
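For a sense of what an ELT workflow along these lines might look like, here is a minimal, hedged Airflow 2.x DAG sketch that runs and tests a dbt project on a daily schedule. The DAG id, schedule, project path, and model selector are illustrative assumptions rather than a prescribed setup.

```python
# Minimal Airflow 2.x DAG sketch: a daily dbt build feeding downstream marts.
# The dag_id, schedule, project path, and selector are assumptions for
# illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="cx_analytics_daily_elt",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["cx", "dbt"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # Builds the (hypothetical) CX mart models, e.g., a star schema of
        # one fact table plus its dimensions.
        bash_command="cd /opt/dbt/cx_analytics && dbt run --select cx_marts+",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/cx_analytics && dbt test --select cx_marts+",
    )

    dbt_run >> dbt_test
```

Running tests immediately after the build is one way to keep "single source of truth" models trustworthy before dashboards and stakeholders read from them.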
Nice to Haves
- Intermediate-to-advanced Python experience, including scripting, automation, OOP, and building scalable frameworks.
- Data visualization skills using Looker, Tableau, Superset, or Python libraries (Matplotlib, Plotly).
- Experience with cloud platforms such as AWS or GCP.
- Familiarity with statistics and probability.
Benefits
- Medical, dental, and vision plans with generous employee contributions.
- Health Savings Account with company contributions each pay period.
- Disability and life insurance.
- 401(k) plan with company match.
- Wellness stipend.
- Mobile/Internet reimbursement.
- Connections stipend.
- Volunteer time off.
- Fertility counseling and benefits.
- Generous time off/leave policy.
- Option to receive payment in digital currency.