I've found that just dropping tables/columns is much faster than updating them, so I can probably do this semi-regularly. But if you just want to profile different queries, then this dump should be adequate for a while, because the distributions don't change much over the months.
Yeah, I agree that the distributions don't change quickly enough to matter for most things. If the update cycle were once every month or two, that'd probably be sufficient for everything I have in mind right now.
If I really needed newer post data for the most recent month or so, the API should be able to handle that just fine.
If you try compressing the dump, it actually doesn't compress well at all. It looks like compression is already built into Postgres's pg_dump (its custom archive format compresses by default), and it looks like Albert has used it here.