Database dump

Posted under General

I've found that just dropping tables/columns is much faster than updating them, so I can probably do this semi-regularly. But if you just want to profile different queries, this dump should be adequate for a while, because the distributions don't change much over the months.
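
For anyone curious what that scrubbing step might look like, here's a minimal sketch, run against a throwaway copy of the database rather than production. The table and column names below are hypothetical stand-ins, not the actual list that was dropped:

```sh
# Make a disposable copy, then drop the private bits before dumping.
createdb -T danbooru danbooru_dump
psql danbooru_dump <<'SQL'
DROP TABLE IF EXISTS user_password_reset_nonces;  -- hypothetical table name
ALTER TABLE users DROP COLUMN IF EXISTS email;    -- hypothetical column name
SQL
```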

Awesome, thanks. Could you sum up what has been dropped?

Also, it'd seem beneficial to have it content-encoded with gzip; right now it's going uncompressed, which is a silly waste of bandwidth and time.
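
For what it's worth, whether a transfer is actually gzipped is easy to check from the response headers; something like this works (the dump URL here is a placeholder, not the real path):

```sh
# Ask the server for gzip and see whether it answers with Content-Encoding.
curl -sI -H 'Accept-Encoding: gzip' https://danbooru.donmai.us/db_dump.sql \
  | grep -i '^content-encoding'
```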

@albert

Yeah, I agree that the distributions don't change quickly enough to matter for most things. If the update cycle were once every month or two, that'd probably be sufficient for everything I have in mind right now.

If I really needed newer post data for the most recent month or so, the API should be able to handle that just fine.
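
For reference, pulling the newest posts through the API looks something like this (the /posts.json endpoint with limit/page parameters is the standard one; adjust to taste):

```sh
# Fetch the 100 most recent posts as JSON; increment `page` for older ones.
curl -s 'https://danbooru.donmai.us/posts.json?limit=100&page=1'
```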

@hazuki

If you try compressing the dump, it actually doesn't compress well at all. Compression is already baked into Postgres' pg_dump, and it looks like Albert has used it here.
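
To make that concrete: pg_dump's custom format compresses its output by default, and -Z sets the level explicitly, which is why a second gzip pass gains almost nothing. A quick way to verify (database and file names are placeholders):

```sh
# Dump in the compressed custom format at maximum compression.
pg_dump -Fc -Z 9 danbooru > danbooru.dump

# Re-gzipping the already-compressed dump barely shrinks it.
gzip -9 -c danbooru.dump | wc -c    # compare with: wc -c < danbooru.dump
```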
