Calling the bold.

Uncork-backed companies are hiring explorers, builders, and operators ready to help chart new territory.


Infrastructure and Automation Engineer

Dual Bird

Software Engineering, Other Engineering
Multiple locations
Posted on Mar 13, 2026

DualBird is building a next-generation data acceleration platform that brings hardware-level performance to cloud infrastructure with software-level simplicity.

We’re looking for an Infrastructure & Automation Engineer to build automated systems that validate our Spark acceleration layer at large scale.

The role

You will design and operate the CI/CD pipelines for our core framework. This is a software engineering role focused on building infrastructure and internal tools on AWS (EMR/EKS) to ensure data correctness, system stability, and performance parity.

What you’ll do

  • Framework Development: Build automation frameworks to build, validate, orchestrate, and release our software suites.
  • CI/CD Integration: Integrate automated tests into GitHub Actions CI pipelines, ensuring the correctness and performance of every release.
  • System and CI Monitoring: Implement observability (Prometheus/Grafana/CloudWatch) to monitor our software health.
  • Cross-Team Collaboration: Work with low-level system architects to provide automated feedback loops for hardware-level optimizations.
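To give a flavor of the validation work described above, here is a minimal, hypothetical sketch (not DualBird's actual framework) of the kind of correctness check such an automation suite might run: comparing query results from a baseline engine against an accelerated path, order-insensitively and with tolerance for floating-point drift. All names and data here are illustrative.

```python
def rows_match(baseline_rows, accelerated_rows):
    """Compare two query result sets, ignoring row order and
    small floating-point differences between engines."""
    if len(baseline_rows) != len(accelerated_rows):
        return False

    def normalize(row):
        # Round floats so engine-level rounding differences don't
        # register as correctness failures.
        return tuple(
            round(v, 12) if isinstance(v, float) else v for v in row
        )

    return sorted(map(normalize, baseline_rows)) == sorted(
        map(normalize, accelerated_rows)
    )


# Hypothetical usage: both engines answer the same query,
# in different row order and with minor float noise.
baseline = [(1, "a", 0.30000000000000004), (2, "b", 1.5)]
accelerated = [(2, "b", 1.5), (1, "a", 0.3)]
print(rows_match(baseline, accelerated))  # prints True
```

In practice a check like this would sit inside a Pytest suite triggered by the CI pipeline, with the two result sets produced by real query runs rather than literals.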

What you bring

  • 4–7+ years of experience: Proven track record in software development, QA automation, or DevOps within cloud or distributed data environments.
  • Advanced Python: Ability to build complex, modular automation tools (Pytest, custom SDKs, build tools, debug tools).
  • Cloud & K8s: Hands-on experience with AWS (EMR, S3, EC2) and EKS (Kubernetes) networking/orchestration.
  • CI/CD & Observability: Experience building automated release pipelines and monitoring distributed systems as well as software health.
  • Advantage (Spark & Data Lakes): Technical knowledge of Apache Spark (shuffles, memory tuning, executors) and Apache Iceberg.
  • Linux/Systems: Strong CLI skills and comfort debugging failures across the OS and container layers.
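For context on the Spark knowledge mentioned above, these are the standard Spark configuration knobs that "shuffles, memory tuning, executors" typically refers to. This is an illustrative config fragment only; the values are placeholders, and the job script name is hypothetical.

```shell
# Illustrative spark-submit invocation showing common executor,
# memory, and shuffle tuning parameters (placeholder values).
spark-submit \
  --conf spark.executor.instances=8 \
  --conf spark.executor.memory=8g \
  --conf spark.executor.cores=4 \
  --conf spark.sql.shuffle.partitions=400 \
  --conf spark.memory.fraction=0.6 \
  my_job.py
```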