Description
When a large array (here 8000 x 1000) containing "sharp" data is displayed with imshow in a window whose pixel resolution is lower than the array's, the result looks blurred.
I question whether this behaviour is correct. Below is an example of the matplotlib imshow output, produced with a standard line of code.
Code for reproduction
ax.imshow(field, aspect='auto', interpolation='bicubic', cmap='jet', origin='lower')
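The original field array is not included in the report, so the following self-contained sketch substitutes a synthetic 8000 x 1000 array with sharp one-pixel-wide lines; the array shape and the imshow call come from the report, everything else is an assumption.

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-in for the original data: sharp one-pixel-wide
# lines every 100 rows in an 8000 x 1000 array.
field = np.zeros((8000, 1000))
field[::100, :] = 1.0

# The figure window is far smaller than 8000 pixels tall, so imshow
# must downsample the array for display.
fig, ax = plt.subplots()
ax.imshow(field, aspect='auto', interpolation='bicubic', cmap='jet',
          origin='lower')
plt.show()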
Actual outcome
However, the downsampled version (rendered without matplotlib via direct drawing, using an actual bicubic algorithm for the downsizing) shows the sharp lines I expected.
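For reference, here is a rough approximation of that comparison, using Pillow's bicubic resize as a stand-in for the direct-draw downsizer; the target size and the synthetic field are assumptions, not taken from the original report.

import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

field = np.zeros((8000, 1000))
field[::100, :] = 1.0

# Downsample to roughly the on-screen pixel extent with a genuine
# bicubic resampler before handing the data to imshow.
target_w, target_h = 500, 800  # hypothetical display size in pixels
img = Image.fromarray((field * 255).astype(np.uint8))
small = np.asarray(img.resize((target_w, target_h), resample=Image.BICUBIC))

fig, ax = plt.subplots()
# interpolation='none' leaves the pre-downsampled pixels untouched.
ax.imshow(small, aspect='auto', interpolation='none', cmap='jet',
          origin='lower')
plt.show()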
Expected outcome
So is matplotlib downsampling to less than the display resolution and then interpolating back up via bicubic interpolation? Otherwise, how can the output be blurred when the input array is much larger than the display and is "sharp"?
Do not misunderstand me: I am not interested in "nearest" etc.
Unfortunately, the available imshow/bicubic examples only demonstrate interpolation while upsampling.
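One way to check the suspicion above is to compare the array shape against the axes' actual pixel extent. The diagnostic sketch below (figure size and data are assumptions) prints the effective downsampling factor imshow has to apply:

import numpy as np
import matplotlib.pyplot as plt

field = np.zeros((8000, 1000))

fig, ax = plt.subplots()
ax.imshow(field, aspect='auto', interpolation='bicubic', cmap='jet',
          origin='lower')
fig.canvas.draw()  # force a render so the axes extent is final

bbox = ax.get_window_extent()
print(f"array:       {field.shape[0]} x {field.shape[1]} (rows x cols)")
print(f"axes extent: {bbox.height:.0f} x {bbox.width:.0f} px")
print(f"vertical downsampling factor: {field.shape[0] / bbox.height:.1f}x")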
Thank you,
Tom
Matplotlib version
Operating system: Windows 10
Matplotlib version: 3.1.3
Matplotlib backend (print(matplotlib.get_backend())): TkAgg
Python version: 3.7.7
Installed Anaconda; packages were installed via conda in a separate environment.