Senior Python Data Engineer (Pandas, Snowpark, FastAPI, Kubernetes)
Chicago, IL - USA
Job Summary
Role: Senior Python Data Engineer (Pandas, Snowpark, FastAPI, Kubernetes)
Location: Chicago, IL
Model: Onsite 3 days per week
Anchor days: Mondays, plus 2 flex days
Hours: 8:00 am to 5:00 pm CST
Contract role
Project Overview/Role:
We are seeking a Python Developer with strong expertise in data transformation, pandas, and modern data engineering practices. The ideal candidate will design and implement scalable data pipelines and APIs leveraging Snowflake Snowpark and containerized environments. Experience with FastAPI and Kubernetes is essential. Familiarity with the financial services industry is a plus.
Experience Level: 3 (Senior)
Required Skills & Experience
Python expertise with deep experience in pandas for ETL/ELT and data wrangling (e.g., vectorization, memory management, I/O, time series).
Hands-on with Snowflake (SQL performance tuning, warehouse configuration) and Snowpark (Python) for scalable transformations.
Strong FastAPI experience building production services (dependency injection, Pydantic models, async I/O).
Practical knowledge of Kafka (consumer groups, offsets, partitions, schema management) and designing event-driven microservices.
Proficiency with Docker and Kubernetes (deployment strategies, networking, volumes; service meshes a plus).
Familiarity with GitHub Copilot or similar AI-assisted coding tools to accelerate development and improve code quality.
Solid understanding of software engineering fundamentals: testing, code quality, design patterns, API design, and clean architecture.
Experience with CI/CD (GitHub Actions, GitLab CI, Azure DevOps, or similar) and IaC (Terraform or Helm preferred).
Familiarity with data modeling and SQL.
Strong communication skills and the ability to work in a cross-functional, agile environment.
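To illustrate the kind of pandas fluency expected above, here is a minimal sketch of vectorized transformation and memory management; the DataFrame contents and column names (`qty`, `price`, `notional`) are hypothetical examples, not part of the role description.

```python
import pandas as pd

# Hypothetical trade data, purely for illustration.
df = pd.DataFrame({
    "qty": [10, 20, 30],
    "price": [100.0, 101.5, 99.25],
})

# Vectorized: one columnar multiplication instead of a Python-level loop.
df["notional"] = df["qty"] * df["price"]

# Downcasting numeric columns is a common memory-management lever
# when wrangling large frames.
df["qty"] = pd.to_numeric(df["qty"], downcast="integer")
```

The same columnar style carries over to Snowpark, where transformations are expressed against DataFrame-like objects and pushed down to Snowflake for execution.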
Nice to Have
Financial Services industry exposure
What You'll Do - Tasks & Responsibilities:
Design & Build Data Pipelines: Create reliable, testable data transformation workflows using Python (pandas, PySpark/Snowpark), optimizing for performance and maintainability.
Snowflake Engineering: Implement Snowflake objects (tables, stages, tasks), write efficient SQL, and develop Snowpark-based transformations; manage performance (clustering, warehouses, caching) and cost.
Service Development (FastAPI): Build RESTful/JSON APIs and backend services in FastAPI to expose data and business logic; implement authentication/authorization, rate limiting, and request validation.
Containerization & Orchestration: Package services with Docker and deploy/operate them on Kubernetes; manage manifests, Helm charts, ConfigMaps/Secrets, health probes, autoscaling, and observability.
Event-Driven Architecture: Produce/consume Kafka topics; design schemas (Avro/JSON/Protobuf), ensure idempotency, and implement exactly-once/at-least-once semantics where appropriate; apply stream-processing patterns.
Quality & Reliability: Write unit/integration tests, data validation checks, and contract tests; implement CI/CD (linting, type checks, security scans, test automation) and support blue/green or canary releases.
Observability & Operations: Instrument services with logging, metrics, and tracing (e.g., OpenTelemetry); build dashboards and alerts.
Collaboration: Partner with product, analytics, and platform teams; document designs, APIs, SLAs, and runbooks; participate in reviews and sprint ceremonies.
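The idempotency expectation under Event-Driven Architecture can be sketched in plain Python; this is a simplified illustration only (a real consumer would use a Kafka client library, and the `handle_event` name and event shape are assumptions for the example).

```python
# Minimal sketch of at-most-once application of at-least-once deliveries.
# Duplicate deliveries are detected by a unique event_id and skipped.
processed_ids: set[str] = set()
totals: dict[str, float] = {}

def handle_event(event: dict) -> bool:
    """Apply an event at most once, keyed on its unique event_id.

    Returns True if the event was applied, False if it was a duplicate.
    """
    event_id = event["event_id"]
    if event_id in processed_ids:
        return False  # duplicate delivery from at-least-once semantics: skip
    totals[event["account"]] = totals.get(event["account"], 0.0) + event["amount"]
    processed_ids.add(event_id)
    return True
```

Redelivering the same event leaves `totals` unchanged, which is the property that makes at-least-once delivery safe for downstream state.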