Design, build, and maintain analytics-focused data pipelines using modern data stacks
Develop and optimize ELT/ETL workflows to support analytics, BI, and experimentation
Build reliable data models and transformations (e.g. dbt-style workflows)
Partner with product, growth, and engineering teams to define metrics, KPIs, and data standards
Enable data-driven decision-making through well-designed dashboards and analytics tools
Improve data quality, governance, and observability, including tracking and event design
Automate manual analytics workflows and reduce operational overhead
Contribute to analytics best practices, documentation, and internal enablement
6+ years of experience in data engineering, analytics engineering, or advanced data analytics
Strong proficiency in Python and SQL
Hands-on experience with modern data stacks, such as:
Cloud data warehouses (BigQuery, Snowflake, Redshift, etc.)
Workflow orchestration (Airflow or similar)
Data modeling / transformation frameworks (e.g. dbt)
Solid understanding of analytics systems, event tracking, and data modeling
Experience building business-facing analytics or BI solutions (Looker, Power BI, Tableau, etc.)
Ability to translate business questions into scalable data solutions
Strong ownership mindset and ability to work independently in a remote environment
Experience with growth, marketing, or product analytics
Background in data governance, tracking design, or server-side analytics
Experience building analytics tools or internal data products
Exposure to event-driven architectures or streaming systems