Ship with a local DB as a possible source/destination #120

@fleupold

Description

This is probably out of scope for milestone 3, but given we have a use case for it, I'd rate it more important than S3 buckets (which were already labelled for a yet-to-be-negotiated milestone 4).

We would like to be able to do the following:

  1. Run a query on some DB1 and mirror it into a temporary table/storage
  2. Run another query on some DB2 and mirror it into a temporary table/storage
  3. Combine the two results and upload the final table to Dune

Currently this is not possible. Using CSV files (which I assume would be part of #59), a strict "append" combination could potentially be achieved.

However, to be maximally flexible, I wonder whether the container could ship with a built-in local SQLite or Postgres DB that could be used as both a data source and a sink.

Then, we could order jobs accordingly to make sure the local table is populated before it gets synced to Dune.
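A minimal sketch of how this could look with a bundled SQLite file as the intermediate store (pandas + SQLAlchemy purely for illustration; the connection strings, table names, and the `upload_to_dune` helper are placeholders, not existing dune-sync functionality):

```python
"""Sketch: a local SQLite file as intermediate source/sink for the three steps above."""

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical upstream databases (DB1 and DB2 from the steps above).
db1 = create_engine("postgresql://user:pass@db1.example.com/orders")
db2 = create_engine("postgresql://user:pass@db2.example.com/analytics")

# The proposed built-in local DB, here a SQLite file shipped with the container.
local = create_engine("sqlite:///local_sync.db")

# Step 1: mirror a query from DB1 into a temporary local table.
pd.read_sql_query("SELECT order_uid, owner FROM orders", db1).to_sql(
    "tmp_orders", local, if_exists="replace", index=False
)

# Step 2: mirror a query from DB2 into another temporary local table.
pd.read_sql_query("SELECT order_uid, volume_usd FROM trades", db2).to_sql(
    "tmp_trades", local, if_exists="replace", index=False
)

# Step 3: combine the two local tables and ship the result to Dune.
combined = pd.read_sql_query(
    """
    SELECT o.order_uid, o.owner, t.volume_usd
    FROM tmp_orders o
    JOIN tmp_trades t ON t.order_uid = o.order_uid
    """,
    local,
)


def upload_to_dune(df: pd.DataFrame, table_name: str) -> None:
    """Placeholder for the existing Dune destination (e.g. a CSV upload job)."""
    df.to_csv(f"{table_name}.csv", index=False)


upload_to_dune(combined, "combined_result")
```

The same idea would work with a local Postgres instead of SQLite; the point is simply that jobs can treat the local DB as a destination in steps 1–2 and as a source in step 3.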
