This is one of those things that is very, very, very simple, but which a lot of people have trouble understanding because they keep trying to overthink it.

There are three properties at issue.

Logical Size is the amount of information. This is the number of pixels.
Physical Size is the size of a physical image, be it a printed image, or one displayed on your screen.
Resolution is a ratio between these two. In the world of graphics, it's usually measured in dots/pixels per inch, or per centimetre (DPI, PPI, px/cm, etc.). "Pixel" and "dot" are pretty much interchangeable here. Resolution is a property of the process of converting between image data and a physical image.

Resolution = Logical Size / Physical Size

If you get that and understand all the implications, that's all you need to know. Most of the rest of this post is going to be trying different ways of explaining that one very simple little equation.
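To make the ratio concrete, here is the equation worked through with made-up numbers (a 3000-pixel-wide image printed 10 inches wide; both values are arbitrary examples):

```python
# Resolution = Logical Size / Physical Size
logical_width_px = 3000    # logical size: number of pixels across (example value)
physical_width_in = 10.0   # physical size: width of the print in inches (example value)

resolution_dpi = logical_width_px / physical_width_in
print(resolution_dpi)  # 300.0 dots per inch
```

The same equation rearranges freely: pixels = DPI × inches, and inches = pixels / DPI.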

You can adjust the number of pixels. This process is called "resampling", and it produces artefacts (errors). Increasing the pixel count ("upsampling") also produces problems, because the software has to guess at what was "in between" the original pixels. In most cases the artefacts take the form of blurriness. There are many different algorithms for resampling (shifting from one grid of pixels to another) and interpolation (guessing what should go between pixels), and they all have different strengths and weaknesses. Downsampling is usually OK if you do it by a large factor: reducing to 10% as many pixels on a side should look fine; reducing to 90% as many pixels on a side will not look good.
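A minimal sketch of the crudest resampling algorithm, nearest-neighbour, just to show what "shifting from one grid of pixels to another" means. Real software uses smarter interpolation (bilinear, bicubic, Lanczos); the function name and the tiny 4×4 test image here are my own invention:

```python
def resample_nearest(pixels, new_w, new_h):
    """Nearest-neighbour resample of a 2D list of pixel values.
    Each output pixel just copies the closest source pixel, which is
    exactly where the blocky artefacts come from."""
    old_h, old_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# Downsample a 4x4 gradient to 2x2: three quarters of the information
# is simply thrown away.
src = [[x + 4 * y for x in range(4)] for y in range(4)]
print(resample_nearest(src, 2, 2))  # [[0, 2], [8, 10]]
```

Upsampling with the same function just duplicates pixels — no new information appears, which is why upsampling can never restore detail.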

So, for a file, focus on the logical size. Changing that means resampling, which is generally a bad thing (except for large downsampling).

Physical size is hopefully fairly simple. Take out a ruler or measuring tape and measure. That's the physical size.

Resolution is, as I said, a property of the process of converting between physical and logical.

Many image files have a little tag attached to them that indicates a resolution. This is used to either indicate what resolution the image was scanned at, or to suggest what resolution/size it should be displayed/printed at. This may be wrong, ignored, or just irrelevant. It's really not that important. If you don't have a use for it, it's best to leave this blank.

Generally, when you display an image on a computer monitor, it is displayed at the resolution of the monitor (a typical modern desktop monitor is about 100 DPI). A lot of people will go on and on about the magical significance of 72 DPI with regard to monitor resolution, particularly on Macs. This is nonsense. For images where printing/scanning isn't important, you should generally leave the resolution blank to indicate that it's not important. Failing that, use the resolution of the monitor you created it on (probably approximately 100 DPI).

Some software knows to look at the DPI tag, and will resample the image to appear at the correct physical size on the screen (assuming the OS reports the screen resolution to the software correctly). As there is resampling going on, this can look a bit blurry, unless the resolution specified in the image is much higher than that of the screen (in which case you get downsampling by a large factor). In GIMP, for instance, you can turn off "Dot For Dot" mode in the "View" menu to get this behaviour. This mostly matters if you plan on printing the image and want to see how big it will be.
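The size-on-screen arithmetic that this behaviour implies can be sketched in a few lines (the 300 DPI tag and 100 DPI monitor are assumed example values, not anything a specific program guarantees):

```python
image_width_px = 1200   # logical size of the image
image_dpi_tag = 300     # resolution tag stored in the file (assumed example)
monitor_dpi = 100       # roughly a typical modern desktop monitor

# Physical width the tag suggests:
physical_width_in = image_width_px / image_dpi_tag   # 4.0 inches

# Screen pixels needed to show the image at that physical size,
# so the software must resample from 1200 px down to this:
screen_width_px = physical_width_in * monitor_dpi    # 400.0 pixels
print(physical_width_in, screen_width_px)  # 4.0 400.0
```

Note the image tag (300) is well above the monitor resolution (100), so this is downsampling by a large factor and should look fine.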

Resolution can also be thought of as the physical density of information, which is one factor affecting quality. Denser information allows more detail and more precision. But remember that the resolution tag on the image is just a hint as to how big to print it. Increasing the resolution tag just means the image will be printed smaller. That certainly makes it denser, but it's probably not what you are after. You can compensate by upsampling, but this is counterproductive: you are still just stretching the same information over the same area. The interpolation will "cancel out" the higher resolution, leaving you at best no better off, and quite possibly with a worse image.
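Here is the same point as arithmetic: bumping the resolution tag shrinks the print but never adds information (the 2400-pixel width is an arbitrary example):

```python
pixels = 2400  # logical size is fixed; example value

for dpi in (100, 200, 300):
    print_width_in = pixels / dpi
    print(f"{dpi} DPI -> {print_width_in} inches wide")
# 100 DPI -> 24.0 inches, 300 DPI -> 8.0 inches: a higher resolution
# tag means a smaller, denser print of exactly the same pixels.
```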

If you scanned an A4 sheet and printed it at twice the linear size (A4 to A2; note that A4 to A3 only doubles the area, which scales each dimension by √2), you would have the same number of pixels but twice the physical size, and so half the resolution. If you scanned at 600 DPI, you would be able to print at 300 DPI, which is probably good enough. If you scanned at 150 DPI, you would print at 75 DPI, which would look noticeably blocky unless viewed from a distance.
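The scan-and-enlarge arithmetic above, as a sketch: the pixel count is fixed at scan time, so doubling the linear size of the print halves the resolution.

```python
scale = 2.0  # printing at twice the linear size of the original page

scan_dpi = 600
print(scan_dpi / scale)  # 300.0 - probably good enough

scan_dpi = 150
print(scan_dpi / scale)  # 75.0 - blocky unless viewed from a distance
```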