I apologize, because I know I'm terrible at putting my thoughts into words. I do understand the difference between the bump map (generated from texture) and noise/detail (not generated from texture).
In my mind, I was drawing similarities between your "detail" noise map and the Detail Textures from Half-Life. Only instead of having an actual texture for the "detail", you'd use a generated, scaleable noise map.
Basically, I thought it would be nice if this noise map could be different for each texture, similar to how MadVR's dithering noise differs between ordered and random (and between one frame of random and the next). This would help trick the mind into believing skin and clothing have an actual different texture (as in the visual appearance of a physical texture) by reducing the tiling and repetition of the "detail" portion.
It's a super minor change, but IMO it can make a big difference, again like ordered vs random dithering.
Ex: using Detail sliders to get "facial pores". This would get applied not only to the skin, but to clothing as well. Our eyes detect differences in light far better than differences in color, so tricks like this help image fidelity and immersion. Again, it's not a HUGE difference, and everyone is different: some people are annoyed by tiled textures or repeated texture patterns but are fine with clothing or weapons clipping through models, and there are people who are the exact opposite.
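To make the per-texture part a bit more concrete, here's a minimal sketch of what I'm picturing (all names and numbers are made up, nothing from actual Dolphin/Ishiiruka code): hash each texture's ID into a seed, then use that seed to generate a small grayscale detail tile, so no two textures share the exact same noise pattern.

```cpp
#include <algorithm>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical sketch: derive a per-texture seed so every texture gets its
// own noise pattern instead of one shared, visibly repeating pattern.
uint64_t SeedFromTextureID(const std::string& texture_id)
{
  // FNV-1a hash of the texture's ID/filename.
  uint64_t hash = 14695981039346656037ull;
  for (unsigned char c : texture_id)
  {
    hash ^= c;
    hash *= 1099511628211ull;
  }
  return hash;
}

// Fill a small grayscale "detail" tile with pseudo-random luminance offsets.
// Luminance-only, since we notice differences in light more than in color.
std::vector<uint8_t> GenerateDetailTile(uint64_t seed, int size, float strength)
{
  std::vector<uint8_t> tile(size * size);
  uint64_t state = seed ? seed : 1;  // xorshift needs a non-zero state
  for (auto& texel : tile)
  {
    // xorshift64* PRNG; any cheap deterministic generator would do.
    state ^= state >> 12;
    state ^= state << 25;
    state ^= state >> 27;
    const float noise = ((state * 2685821657736338717ull) >> 40) / float(1 << 24);  // [0, 1)
    const float value = 128.0f + (noise - 0.5f) * 255.0f * strength;  // centered on mid-gray
    texel = static_cast<uint8_t>(std::clamp(value, 0.0f, 255.0f));
  }
  return tile;
}
```

The only point of the sketch is that the seed comes from the texture itself, so the pattern doesn't repeat across textures the way a single shared tile would.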
And, because why not, let's say (hypothetically speaking, of course) we users were able to fine-tune multiple "detail" textures. We could have a small-scale detail that looks like the facial pores I mentioned above, and a large-scale detail for slight, smooth shadows with more coverage to enhance the existing clothing texture (so clothing doesn't look fold-less, like spandex).
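To illustrate that two-layer idea, here's a rough sketch (again, every name and value is invented) of a fine "pores" layer and a coarse "soft shading" layer, each with its own strength slider, summed into a single luminance offset on top of the base texture:

```cpp
#include <cmath>

// Hypothetical sketch: two detail layers at different scales, each with its
// own intensity slider, combined into one luminance offset.
struct DetailSettings
{
  float fine_scale = 32.0f;    // high frequency, e.g. facial pores
  float fine_strength = 0.05f;
  float coarse_scale = 4.0f;   // low frequency, e.g. soft fabric shading
  float coarse_strength = 0.10f;
};

// Stand-in noise (a real implementation would use proper tileable noise);
// returns a pseudo-random value in [0, 1) for a UV coordinate and frequency.
float SampleNoise(float u, float v, float frequency)
{
  const float n = std::sin(u * frequency * 12.9898f + v * frequency * 78.233f) * 43758.5453f;
  return n - std::floor(n);
}

// Returns a small luminance offset to add on top of the base texture's color.
// 0 means "leave the base texture alone".
float CombinedDetail(float u, float v, const DetailSettings& s)
{
  const float fine = (SampleNoise(u, v, s.fine_scale) - 0.5f) * s.fine_strength;
  const float coarse = (SampleNoise(u, v, s.coarse_scale) - 0.5f) * s.coarse_strength;
  return fine + coarse;
}
```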
After writing that, I'm still not satisfied with my attempt at describing what's in my head.
And again, I don't know anything about the technical stuff going on here so I'm just spewing nonsense.

After thinking about it a little more, this might introduce more stuttering (see texture, read texture, look for matching textures, generate random seed noise for each unique texture, process and apply to each texture, continue on with the rendering chain). The generation and caching of shaders causes enough stuttering already.
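Then again, I'd imagine the per-texture noise only needs to be generated once and then cached (same spirit as the shader cache), so the cost is paid the first time a texture shows up rather than every frame. A rough sketch, reusing the generator from the earlier example (all names made up):

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// The per-texture generator from the earlier sketch.
std::vector<uint8_t> GenerateDetailTile(uint64_t seed, int size, float strength);

// Hypothetical cache: a texture's detail tile is generated the first time
// that texture is seen, then reused, so the generation doesn't have to
// happen every frame.
class DetailTileCache
{
public:
  const std::vector<uint8_t>& Get(uint64_t texture_hash, int size, float strength)
  {
    auto it = m_tiles.find(texture_hash);
    if (it == m_tiles.end())
      it = m_tiles.emplace(texture_hash, GenerateDetailTile(texture_hash, size, strength)).first;
    return it->second;
  }

private:
  std::unordered_map<uint64_t, std::vector<uint8_t>> m_tiles;
};
```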
*Warning - Another of Kami's spontaneous "ideas"*
This is one of those "Epiphany Ideas" we all get while sitting on the toilet.
For the rest of this post I'll be referencing "detail textures" as they were implemented in Half-Life.
First, for illustration purposes only, I'm going by what the Material Map section of the Ishiiruka GitHub wiki says. Let us assume all material maps have the same <TextureID> as the base texture.
Bump map = <TextureID>.bump.extension
Normal map = <TextureID>.nrm.extension
Specular/Phong map = <TextureID>.spec.extension
In my head I saw two things:
I) Two "not for gaming" options. By "not for gaming", think of options like texture overlay or texture/shader/vertex dumping. This option is for those who want to make custom textures.
The first option would identify each texture, à la the "texture format overlay".
This identification indicator is a "quality of life" improvement for the artists and mspaint warlocks.
The indicator texture can be dumped along with the base texture.
When dumped, and following the naming scheme from the wiki, the indicator texture is:
Detail map = <TextureID>.detail.extension.
This is just a starting template that inherits the filename and texture dimensions of the base texture.
The only thing missing is love from an artist.
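Just to picture that template, something like the following could write a flat mid-gray <TextureID>.detail file next to the dumped base texture, inheriting its name and dimensions. Everything here is invented, and I'm using a plain PPM only so the sketch doesn't need an image library; the real dumper would obviously use whatever format it already dumps.

```cpp
#include <fstream>
#include <string>

// Hypothetical sketch: when dumping, write a neutral mid-gray "detail"
// template next to the base texture, inheriting its name and dimensions.
bool DumpDetailTemplate(const std::string& texture_id, int width, int height,
                        const std::string& dump_dir)
{
  const std::string path = dump_dir + "/" + texture_id + ".detail.ppm";
  std::ofstream file(path, std::ios::binary);
  if (!file)
    return false;

  file << "P6\n" << width << " " << height << "\n255\n";
  for (int i = 0; i < width * height; ++i)
  {
    // Mid-gray = "no detail yet"; the only thing missing is love from an artist.
    const unsigned char gray[3] = {128, 128, 128};
    file.write(reinterpret_cast<const char*>(gray), 3);
  }
  return file.good();
}
```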
II) For the rest of us lowly, non-artist peons (or the lazy artist lol), there is a new section in the backend options.
In this section we have a list of all textures currently loaded in memory (or dumped).
When selecting one, we get some sliders and noise/pattern options to play with.
Since we can do this while the game is running, we can make changes in real time, and when satisfied we could save the result as a custom texture. I'm assuming that by saving it we'd get better performance vs real-time generation. And it could be shared with others, too.