Data Engineer at Ceres Imaging
Oakland, CA, US
Ceres Imaging is a venture-backed company developing technology that helps conserve water and fertilizer on farms. Specifically, we use aerial imagery and spectral image processing to monitor crop variables. The primary delivery mechanism for our imagery is a web application that allows customers to view imagery of their fields over time (see the demo on our website).
 
We help farmers use this data to improve crop management practices such as fertilizer application, irrigation scheduling, and stress/problem detection. We are well funded and have paying customers throughout California and Australia.
 
THE ROLE:
We are seeking a data engineer to help plan and implement a scalable processing pipeline and storage system. Ceres has an amazing set of images and image processing tools that produce products our customers love. We need an ambitious engineer to help us build systems that run these tools at massive scale and with high reliability. In this role, you’ll work with scientists and computer vision engineers to design and implement clean interfaces and systems that churn through data today and enable the development of next-generation analysis tomorrow. The ideal candidate has experience developing or working with distributed systems, ETL pipeline frameworks, RESTful API services, and relational and/or non-relational database systems.

Essential Duties and Responsibilities:

    • Assess and improve the image processing pipeline for throughput, reliability, and transparency.
    • Develop architecture for rapid deployment of new image processing pipelines as new analytical techniques are developed.
    • Work with the web app and science teams to develop data storage systems and data access interfaces.

Desirable skills:

    • Ability to write clean, modular, testable and well-documented code (we work in Python).
    • Expertise working with ETL frameworks (e.g., Airflow, Luigi).
    • Experience working with business process management frameworks.
    • Experience working with AWS or GCP + Linux.
    • Ability to clearly communicate with and document the needs of engineering and scientific teams.
    • Demonstrated ability in designing SQL and/or NoSQL stores.
    • Degree in CS, engineering, or a quantitative science and 3+ years of experience working in a data engineering context.
On offer is the chance to join a world-changing startup, working alongside an outstanding team of scientists, engineers, and go-to-market professionals.
 
If you are interested in learning more, please don't hesitate to apply now.