
south washington watershed district: retrieval kernels hit timeout errors #188

@westonslaughter

Description


SWWD retrieval depends on static download URLs, and the largest of these (in my experience on Duke and home wifi) tend to fail repeatedly before succeeding; MS-1 and Trout Brook discharge are the worst offenders.

This means that, as currently written, these kernels are likely to fail during a full acquisition run without supervision, so they should be amended with that in mind. Brute-forcing retries with a "while" loop is not a good solution, because the website will freeze you out after a certain volume of requests, though I did have some success adding a system sleep call after each download. Unfortunately, I think the simplest way to let this domain retrieve without supervision is to allow a set number of tries, after which the kernel falls back to downloading a static copy from our Google Drive (sketched below).
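A minimal sketch of what that could look like, assuming the kernels are R functions that call download.file() directly. The wrapper name, argument names, and the backup_url parameter are hypothetical; the actual Google Drive URL would need to be filled in per product.

    # Hypothetical retry wrapper: sleep between attempts, then fall back
    # to a static Google Drive copy after max_tries failures.
    swwd_download_with_retry <- function(url, destfile, backup_url = NULL,
                                         max_tries = 3, sleep_s = 30){

        for(i in seq_len(max_tries)){

            ok <- tryCatch({
                download.file(url, destfile, mode = 'wb', quiet = TRUE)
                TRUE
            }, error = function(e) FALSE,
               warning = function(w) FALSE)

            if(ok) return(invisible(destfile))

            # back off between attempts so the server doesn't freeze us out
            Sys.sleep(sleep_s)
        }

        # all attempts failed: retrieve the static copy from Google Drive
        if(! is.null(backup_url)){
            download.file(backup_url, destfile, mode = 'wb', quiet = TRUE)
            return(invisible(destfile))
        }

        stop('SWWD download failed after ', max_tries,
             ' tries and no backup URL was provided')
    }

The sleep interval and try count here are placeholders; whatever values we settle on should be tuned against however aggressively the SWWD site rate-limits.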
