The voltage at the CU should be known, let's say 230V. A diagnostic tool plugged into a socket should be capable of drawing a known, fixed current and measuring the voltage at the socket, giving the voltage drop and hence the resistance of the cables.
Indeed.
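For illustration, a minimal sketch of that measurement arithmetic in Python (the 230V nominal figure is as stated above; the 10A test current, the 225.4V reading and the function name are illustrative assumptions):

```python
# Estimate the loop resistance of the wiring from a loaded-socket measurement.
# Assumed figures: 230 V nominal at the CU, a hypothetical 10 A test load.

V_NOMINAL = 230.0   # volts at the consumer unit (assumed known)

def loop_resistance(v_loaded: float, i_test: float) -> float:
    """Resistance of the cable run (line + neutral) in ohms, from the
    voltage measured at the socket while the tool draws i_test amps."""
    v_drop = V_NOMINAL - v_loaded
    return v_drop / i_test

# Example: socket reads 225.4 V while the tool draws 10 A
print(f"{loop_resistance(225.4, 10.0):.3f} ohm")  # 0.460 ohm
```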
If you also give it an estimate of the cable length, this should tell you whether the resistance is within acceptable limits, no?
As I've said, the voltage drop, together with length and current, will enable you to determine the size (cross-sectional area) of the cable, and thus whether it is adequately protected (from overload/overheating) by the MCB protecting it. It is not 'resistance' that has 'acceptable limits'.
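A rough sketch of that inference, assuming copper conductors at about 20°C and treating the run as a simple radial line-plus-neutral loop (a ring final circuit has parallel paths and would need different treatment); the resistivity figure and the example numbers are assumptions for illustration:

```python
# Infer conductor cross-sectional area from voltage drop, current and length.
# Assumes copper at ~20 C and a simple radial line+neutral loop.

RHO_CU = 1.72e-8          # resistivity of copper, ohm*metre (approximate)
STANDARD_CSA = [1.0, 1.5, 2.5, 4.0, 6.0]   # common cable sizes, mm^2

def estimated_csa_mm2(v_drop: float, current: float, length_m: float) -> float:
    loop_resistance = v_drop / current          # ohms
    loop_length = 2 * length_m                  # out on line, back on neutral
    area_m2 = RHO_CU * loop_length / loop_resistance
    return area_m2 * 1e6                        # convert m^2 -> mm^2

def nearest_standard(csa: float) -> float:
    return min(STANDARD_CSA, key=lambda s: abs(s - csa))

# Example: 4.6 V drop at 10 A over an estimated 20 m run
csa = estimated_csa_mm2(4.6, 10.0, 20.0)
print(f"~{csa:.2f} mm^2, nearest standard size {nearest_standard(csa)} mm^2")
```

Once the size is estimated, whether it is adequately protected then depends on the rating of the MCB in front of it, as above.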
Furthermore, the tool should be able to detect whether the resistance is constant over time, or increases in a manner consistent with an overheating cable somewhere.
At a given temperature, the resistance of the conductors of the cable will remain constant, regardless of what current is flowing through them. However, any current will result in at least some rise in temperature, and that will, in turn, result in a (fairly small) increase in resistance.
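To picture that behaviour, here is a toy first-order thermal model; the time constant and the equilibrium temperature rise are illustrative assumptions, not properties of any real cable:

```python
# Toy model: conductor temperature (and hence resistance) under a constant
# current rises and then levels off as thermal equilibrium is reached.
# All thermal figures here are illustrative assumptions.

import math

ALPHA_CU = 0.0039      # fractional resistance change per degree C for copper
T_AMBIENT = 20.0       # C
T_RISE_EQ = 30.0       # assumed equilibrium temperature rise at this current, C
TAU_S = 600.0          # assumed thermal time constant, seconds

def resistance_at(t_s: float, r20: float) -> float:
    """Resistance after t_s seconds at constant current, per the toy model."""
    temp = T_AMBIENT + T_RISE_EQ * (1 - math.exp(-t_s / TAU_S))
    return r20 * (1 + ALPHA_CU * (temp - T_AMBIENT))

r20 = 0.46   # ohms, cold loop resistance from the earlier example
for t in (0, 300, 600, 1800, 3600):
    print(f"t={t:>5d}s  R={resistance_at(t, r20):.4f} ohm")
# Resistance climbs and plateaus about 11.7% above the cold value.
```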
So let's say you plug this tool into every socket, and have it draw as close as possible to the rated current of the circuit at every socket.....
As I've said, with most sockets circuits, if you plugged such a load into every socket outlet you would seriously overload the circuit (and cable) and cause the MCB to trip. A sockets circuit would commonly have at least 8 double sockets (hence 16 13A socket outlets) - that's a potential load of 208A, far, far too much to be 'allowed' to flow for very long by a 32A MCB (which would trip within an hour at a modest overload, much faster with higher currents and within milliseconds at 160A).
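The arithmetic behind that, as a quick check (the roughly-5x-rating instantaneous trip threshold is typical of a Type B MCB and is an assumption here):

```python
# Potential load of a fully-loaded sockets circuit versus its MCB rating.

OUTLETS = 16          # e.g. 8 double sockets
OUTLET_RATING = 13    # amps per socket outlet
MCB_RATING = 32       # amps

potential_load = OUTLETS * OUTLET_RATING
print(f"Potential load: {potential_load} A "
      f"({potential_load / MCB_RATING:.1f}x the {MCB_RATING} A MCB rating)")
# A Type B MCB trips magnetically (within milliseconds) at roughly 5x its
# rating, i.e. about 160 A here - well below the 208 A potential load.
```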
If the resistance isn't rising, or plateaus at some acceptable value, all is well in theory?
As above, the resistance will rise a little as the cable warms up but, for any particular constant current, will eventually reach a 'plateau' when thermal equilibrium is achieved. The resistance of a copper conductor will increase by about 0.39% for each degree C rise in temperature. Hence if the temperature rose from an ambient temp of, say, 20°C to the maximum permissible working temp of the cable (typically 70°C), the resistance would increase by about 19.5%. However, this is not a way of investigating anything, not least because that temperature rise is merely an indication/measure of the amount of current flowing (which will be known and/or can be measured).
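For completeness, the 19.5% figure falls straight out of that coefficient:

```python
# Verify the resistance rise quoted above: 0.39%/C over a 20 C -> 70 C rise.
ALPHA_CU = 0.0039            # per degree C
rise = ALPHA_CU * (70 - 20)  # = 0.195
print(f"{rise:.1%}")         # 19.5%
```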
I still don't really understand what makes you suspect/fear that there might be a risk of 'overheating' in your installation.