Role: Senior Data Engineer
Location: Remote / Hybrid (San Francisco, CA)
Tech Stack: Python, SQL, Spark, Airflow, Snowflake, AWS

What You’ll Do (Responsibilities)
- Pipeline Architecture: Design, build, and scale robust ETL/ELT pipelines.
- Data Modeling: Structure our "Single Source of Truth" in Snowflake for BI teams.
- Data Quality: Implement automated testing and observability (Great Expectations / Monte Carlo).

What You’ll Need (Requirements)
- Technical Depth: 4+ years of experience with Python and Big Data tools (Spark/Flink).
- SQL Mastery: Expert-level analytical SQL and experience with dimensional modeling.
- Cloud Native: Hands-on experience with AWS (S3, Redshift, Lambda) or GCP.

What You’ll Get (Benefits)
- Comp: $150k – $190k + meaningful equity package.
- Setup: $2,000 home-office stipend + latest MacBook Pro.
- Wellness: 4-day work weeks once per month + premium health coverage.