Danbooru

Auto-resized images

Posted under General

Thanks much to whoever implemented the auto-resizing of large images! The "resized" images are still fairly large themselves, and actually end up being just the right size to be viewed completely in the browser at a 1600x1200 screen resolution.

I often found images higher than this resolution overkill anyway. I'd end up waiting for the long download, then resizing it with the browser... making a smaller but pixelated image. This is just right!

Also, I'm curious... Does your code have a hard limit, where anything even one pixel larger than the limit gets resized down to it? Or does it have a bit of leeway? (Example: the limit is 960px and an image is 962px. The image gets a resized version at 960px, and there is virtually no difference between the two.
Or, with leeway: the limit is 960px and there are two images, at 970px and 1070px. 970px is not significantly [let's say 10%] larger, so it passes and is not resized. However, 1070px is significantly [> 10%] larger than the limit, so it gets resampled down to 960px.)
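For what it's worth, the leeway rule described above comes down to a couple of lines. A minimal sketch, assuming the 960px limit and 10% leeway from the example (all names hypothetical):

```python
# Hypothetical sketch of the leeway rule: resample only when the image
# exceeds the limit by more than the leeway factor.

LIMIT_PX = 960   # the limit from the example above
LEEWAY = 0.10    # 10% leeway

def needs_resample(width: int) -> bool:
    """True only when the image is significantly over the limit."""
    return width > LIMIT_PX * (1 + LEEWAY)

# 962px and 970px pass through untouched; 1070px gets resampled to 960px.
```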

Updated by creaothceann

If you're talking about leeway vs. a hard limit, the advantages are this:

For the user:
Leeway - The user will always know that there's a significant size difference when a "This image has been resized" message is present. The minimum significance is merely determined by the % of leeway you give.
Hard Limit - The user might see the resized message, yet the original image could be many times larger than the resized one or merely a single pixel larger.

For the host:
Leeway - Not as many images will have to be stored, nor will as much bandwidth need to be consumed.
Hard Limit - Bandwidth will be consumed by users loading original-size pictures that could be as little as one pixel over the hard limit (pointless, since the user would likely not notice any difference between images so close in size), and the same goes for storage space.

Basically, increasing a hard limit will never get rid of the "one pixel larger" problem (really, the "only slightly larger image problem"), which leeway does deal with. It does, however, mean that some non-resampled pics will be displayed at a higher res than resampled pics. But that's the way it's supposed to work.

Also, I was wondering if it's at all possible to display the actual resolution of the "original image" next to the link in every oversize post?

I just want the fact that the image has been resized to be more prominent. Particularly when the resize amount is relatively small.

I know there's that message up top, but that's a bit much to see repeatedly and I've turned it off.

Albert, can you bold the "Original image (xxx.x KB)" link? I think even that small change should make me a lot less likely to miss it.

Is there a way to turn it off?

I'd rather just have the full size image load every time instead of loading an intermediate just to get to it.

I don't like extra steps really. I think it only really makes sense on sites like moe.imouto where there's crazy large images. Most of the pictures on Danbooru aren't 6-25 MB.

-Eruru

jxh2154 said:
Albert, can you bold the "Original image (xxx.x KB)" link? I think even that small change should make me a lot less likely to miss it.

A more evident indication of it would help, yes. But I don't think bolding "Original image ..." will be enough.
Perhaps a not so intrusive change of the color of the box on top? If it's the general grey it is now, I'd be prone to take it for a parent or child -- thus, not care much. The box for new PMs catches my attention.

I have a rather high resolution, and I don't mind scrolling a bit. The resizer is really useful for absurdres pictures, though, as I tend to zoom out to see what the hell a picture is of. And they load dirt slow because Danbooru only yields about 10-120 KB/s for me.
My resolution is 1920x1200, and I could see 2500x2000, or 1.5 MB, as a potential lower limit. Five minutes ago, it took 15 seconds to load 1 MB of an image. Just now, it took 20 seconds to load 300 KB. I move between a 100 Mbps and a 10 Mbps connection; the speed from Danbooru stays in the same range either way. Weak link to Europe?

chiisana said:
I'd like to second an option to turn it off; I prefer my big highres vector traces the way they are: big, high res, and still vector trace, not small, down sized, and rasterized.

url?

It seems that note sizes and positions are not adjusted when clicking the "Original image" link.

Eruru said:
I'd rather just have the full size image load every time instead of loading an intermediate just to get to it.

I think the same, and would really like to see an option to turn it off.

I thought about adding a threshold to the resizing when I implemented it, of course, but opted to keep it simple. Anyway, it ends up doing something silly either way. For example, if images over 1000x1000 are resized to 800x800 (so 801x801 images are left alone), then a 900x900 source ends up being displayed larger than a resized 1001x1001 image, which is even weirder.

There's also a size threshold (512k); that could do the same, but it'd complicate the code and isn't really worth it--most images that are under the resolution threshold but over the filesize threshold are PNGs, so they usually get compressed pretty well.
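As a sketch of the scheme described here (the actual code isn't shown, so the names and the exact 800px target are assumptions): an image is resampled when it crosses either the resolution threshold or the 512k filesize threshold, and the resample scales the longer edge down to the target.

```python
# Sketch of the two-threshold scheme described above; names and the
# exact target size are assumptions, not Danbooru's actual code.

RES_THRESHOLD = 1000          # px, per dimension
SIZE_THRESHOLD = 512 * 1024   # bytes (the 512k threshold mentioned)
TARGET = 800                  # px, assumed resample target

def should_resample(width, height, file_size):
    return (width > RES_THRESHOLD or height > RES_THRESHOLD
            or file_size > SIZE_THRESHOLD)

def resampled_size(width, height):
    """Scale the longer edge down to TARGET, preserving aspect ratio."""
    scale = TARGET / max(width, height)
    return round(width * scale), round(height * scale)
```

This also reproduces the oddity described above: a 1001x1001 source is served at 800x800, while a 900x900 source is left alone and therefore displays larger.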

chiisana said:
I'd like to second an option to turn it off; I prefer my big highres vector traces the way they are: big, high res, and still vector trace, not small, down sized, and rasterized.

Danbooru doesn't support SVG, so they're rasterized whether they're resized or not...

I'd also like a way to turn it off. I think it's great to have on large, multi-megabyte images. But more often than not I've found having to click on "View Original Image" to be a pain.

What I'd most like to see, if it were possible, is to allow users to set their own thresholds. Obviously, the server would set the lower limit (I guess 1000x1000 and 512 KB right now?). Users could then raise the thresholds. For example, a user might set her thresholds to 2000x3000 and 1 MB. In that case, the resampled image would only be displayed if the original was larger than 2000 horizontal, 3000 vertical, or 1 MB.

If bandwidth is a concern, the amount the user could raise the thresholds could be capped. I'm not sure what a suitable cap would be. In the most extreme scenario, a user would always choose to view the original image, thus using up the bandwidth of the original image and the resized image.
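A sketch of how user-raised thresholds with a server-side cap could work (the floor values come from this thread; the cap value and all names are purely illustrative):

```python
# Hypothetical per-user thresholds: the server defaults act as a floor,
# and a bandwidth cap acts as a ceiling; the user picks in between.

SERVER_FLOOR = (1000, 1000, 512 * 1024)    # current defaults per the thread
SERVER_CAP = (4000, 4000, 4 * 1024 ** 2)   # illustrative cap only

def effective_thresholds(user_w, user_h, user_bytes):
    """Clamp the user's chosen thresholds between floor and cap."""
    prefs = (user_w, user_h, user_bytes)
    return tuple(min(max(p, lo), hi)
                 for p, lo, hi in zip(prefs, SERVER_FLOOR, SERVER_CAP))
```

Under this sketch, the user in the example who picks 2000x3000 and 1 MB gets exactly those thresholds, while anything below the site defaults or above the cap gets clamped.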

--------------
When imposing any limit, I think you have to consider how many images will be "stupidly affected", resized for little to negative benefit. For example, the current hard limit is 1000x1000. If a large number/proportion of images fall between 1001 and, say, 1200 in some dimension, there are going to be a large number of images "stupidly affected" by the limit. It would be interesting to do an analysis of the current database, considering image dimensions and limits. Perhaps you could even factor in the popularity of an image, giving it more weight in the analysis if it has a higher view rate.

----------
I think leeway is superior to a hard limit.

petopeto said:
For example, if images over 1000x1000 are resized to 800x800 (so 801x801 images are left alone), then a 900x900 source ends up being displayed larger than a resized 1001x1001 image, which is even weirder.

That isn't an issue at all. As StarlitVoyager already said, it's supposed to work that way. Resampling an image for little gain just wastes hard drive space and can hurt bandwidth (depending on how aggressively you compress the resamples). For example, if an original image is 1050x1050 and 200 KB, perhaps the resampled image is 1000x1000 and 180 KB. In that case, if more than 10% of people choose to view the original image, you've actually hurt bandwidth.
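The break-even arithmetic here is worth making explicit: with resampling on, every view costs the resample, and viewers who click through also cost the original on top of it.

```python
# Expected bytes served per view, using the numbers from the example:
# a 200 KB original with a 180 KB resample.

ORIGINAL_KB = 200
RESAMPLE_KB = 180

def expected_kb(p_click_original):
    """Average KB served per view when a fraction also fetch the original."""
    return RESAMPLE_KB + p_click_original * ORIGINAL_KB

# Break-even at p = (200 - 180) / 200 = 10%; above that, resampling
# actually costs bandwidth.
```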

Another hilarious example is post #209733. The resampled image is actually LARGER than the original, 244 KB to 145 KB. A sanity check should be introduced; I've opened ticket #209.
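The check ticket #209 asks for is tiny; a sketch with hypothetical names:

```python
# Sanity check: keep the resample only when it actually saves space;
# otherwise serve the original directly.

def pick_served_file(original_bytes, resample_bytes):
    if resample_bytes is None or resample_bytes >= original_bytes:
        return "original"
    return "resample"

# For post #209733 (145 KB original, 244 KB resample), the original wins.
```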

Updated

Eruru said:
Is there a way to turn it off?

I'd rather just have the full size image load every time instead of loading an intermediate just to get to it.

I don't like extra steps really. I think it only really makes sense on sites like moe.imouto where there's crazy large images. Most of the pictures on Danbooru aren't 6-25 MB.

-Eruru

This. I don't enjoy having to go to another link in almost every pic to save it, instead of just going directly to the page and saving it there.

Marshall_Banana said:
I don't enjoy having to go to another link in almost every pic to save it

You can just right-click the "View Original" link in the top frame and save that. That's the same number of clicks.

But yeah, the "resize image" thing in the settings menu most likely needs to be updated to reflect the new resize system. I'm guessing it'll be added next time.

I like the idea of custom size limitations as well, although I have no idea if that'll be hard to do or whatever.

petopeto said:
I thought about having it threshold resizing when I implemented it, of course, but opted to keep it simple.

Yes. We're telling you that you opted wrong. :-)

Anyway, it ends up doing something silly either way. For example, if images over 1000x1000 are resized to 800x800 (so 801x801 images are left alone), then a 900x900 source ends up being displayed larger than a resized 1001x1001 image, which is even weirder.

It's not weird, it's how it should be. The scaling method is not very advanced, and it degrades quality considerably when scaling by a factor close to 1. The quality loss starts making sense when it's associated with a considerable size reduction.

surasshu said:

You can just right-click the "View Original" link in the top frame and save that. That's the same amount of clicks.

Not for me. I save images by dragging them to a folder, which I find more convenient than using the contextual menu. Dragging a link, however, creates a link file, so I need to click, wait for the image to load, and then drag (yes, it's still more convenient than using the menu).

By the way, the double-load for saving is another reason why you only want to resize when there's a significant gain to be had.

But yeah, the "resize image" thing in the settings menu most likely needs to be updated to reflect the new resize system. I'm guessing it'll be added next time.

As long as we don't lose the ability to resize the full version on the client side.

Updated

I would also second the notion that a cookie-based option to turn automatic resizing on and off would be a good idea. In some cases it makes a lot of sense (especially absurdres images), but in most cases, where it's scaling something down by only 20%, it's a bit annoying.

I imagine part of the rationale behind its development is that the resampled pics contribute to very large bandwidth savings for the site, so definitely weigh the complaints against the benefits. But I just don't know how much benefit there is to minor resizes.

Another thing I was wondering: how does the database account for the MD5 digests of the resized images? Technically we should block them out, since the resampled image will have a different MD5; someone could download the resampled version, post it on 4chan, and someone else could end up re-posting it here. It's a similar problem to the one we have with people posting duplicates of the JPEG-artifact-riddled resamples from Aerisdies.
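One way to "block them out" would be to record each resample's MD5 alongside the original's, so an upload of a saved resample is caught as an exact dupe. A minimal sketch (not Danbooru's actual schema; all names hypothetical):

```python
# Hypothetical dupe check by exact hash: index both the original's MD5
# and its resample's MD5 under the same post.

import hashlib

known_md5s = {}  # md5 hex digest -> post id

def register_post(post_id, original_bytes, resample_bytes=None):
    known_md5s[hashlib.md5(original_bytes).hexdigest()] = post_id
    if resample_bytes is not None:
        known_md5s[hashlib.md5(resample_bytes).hexdigest()] = post_id

def find_exact_dupe(upload_bytes):
    """Return the post id if this exact file (or a known resample) exists."""
    return known_md5s.get(hashlib.md5(upload_bytes).hexdigest())
```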

There's already an option to turn it off in the code; it's just disabled here.

It's not weird, it's how it should be.

I think it's fairly weird. Anyway, I consider this a minor issue. I may get to it at some point, but I'm not in a rush.

I imagine part of the rationale behind its development is that the resampled pics contribute to very large bandwidth savings for the site, so definitely weigh the complaints against the benefits. But I just don't know how much benefit there is to minor resizes.

That was just a nice bonus; the major impetus was making high-res images less painful to view in-browser.

Technically we should block them out, since the resampled image will have a different MD5; someone could download the resampled version, post it on 4chan, and someone else could end up re-posting it here.

Rather, there should be a dupe detector. We have one running on moe, using piepsy's image search API. I haven't sent that patch yet, since there are a few things left that need fixing.
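piepsy's API isn't shown in this thread, but the general idea behind similarity-based dupe detection can be illustrated with a toy average hash: near-identical images yield bit strings that differ in only a few positions.

```python
# Toy average-hash, for illustration only (real systems downscale the
# image to a fixed grid first; this is not piepsy's actual API).

def average_hash(pixels):
    """pixels: 2D list of grayscale values -> bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Flag a pair as likely dupes when the Hamming distance between their
# hashes is under a small threshold.
```

This is also why LaC's concern below is real: a retouched repost hashes almost identically to the original, so a similarity detector can't distinguish "dupe" from "legitimate variant" on its own.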

petopeto said:
That was just a nice bonus; the major impetus was making high-res images less painful to view in-browser.

I had thought the major impetus was bandwidth savings. If that's not the case, then I think a good short term solution would be to offer users an option to turn off this resampling nonsense.* I've already mentioned it in the thread, but the current 1000x1000 hard limit sucks. Too many images are stupidly affected by it.

  • Although, it seems like changing from a hard limit to a XX% limit would be even easier. Wouldn't that just be changing a few lines of code? Like

image.width > Limits.width * (1 + Limits.percent_leeway)
instead of
image.width > Limits.width

petopeto said:
Rather, there should be a dupe detector. We have one running on moe, using piepsy's image search API. I haven't sent that patch yet, since there are a few things left that need fixing.

Be careful with that. If an artist posts an updated version of a picture (with a typo corrected, for instance, or some shadows retouched, and so on), we want to have it, even though it would probably trip a similarity-based duplicate detector.

LaC said:
Be careful with that. If an artist posts an updated version of a picture (with a typo corrected, for instance, or some shadows retouched, and so on), we want to have it, even though it would probably trip a similarity-based duplicate detector.

It's not really an issue; it depends on how the danbooru is used. It can still be a nice tool for finding similar posts and small variations.

As long as we don't lose the ability to resize the full version on the client side.

Seconded. I (imho) hate resized versions of images, so I vote for a user switch and for the Resize script to remain where it has always been.
