So you're looking for a 4K monitor that can do 120Hz or 144Hz. They do exist, but at a cost, and is it worth it? The whole point of console gaming is to get acceptable performance relatively cheaply. The amount of money you'd spend on a display (TV or monitor) that can do 4K120 will only disappoint you when you see the results.
Now, if 4K120 were cheap (relative to the price of a PS5), I'd tell you to dive in, no problem, but it's not mainstream yet. If you're an enthusiast to whom money doesn't matter, go the PC route first, then get an appropriate monitor like the Gigabyte AORUS FV43U, LG 27GN950 (also available in 38"), ASUS ROG Swift PG32UQX, ACER XV282K, or similar.
Otherwise, get a 4K HDR TV with low input lag. The ping on any multiplayer game you play here in Pakistan is already disadvantageous enough that you won't feel the difference between a 17ms TV and a 1ms monitor. TVs are more versatile and cheaper, with better HDR, but have higher input lag than monitors. You can get a UHD (4K) TV with a 1080p 120Hz mode, one with a 1080p 144Hz mode, or one with a 2160p 120Hz mode.
https://youtu.be/XwfOPQVgRu8
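To put rough numbers on the input lag point (all figures below are my own illustrative assumptions, not measurements), here's a quick back-of-envelope sketch:

```python
# Back-of-envelope latency comparison (all numbers are illustrative assumptions).
# The idea: display input lag is only a small slice of the total delay you feel online.

network_ping_ms = 80          # assumed typical ping to overseas servers from Pakistan
game_processing_ms = 30       # assumed controller + game engine + console pipeline delay

for display_name, display_lag_ms in [("TV (game mode)", 17), ("Gaming monitor", 1)]:
    total = network_ping_ms + game_processing_ms + display_lag_ms
    print(f"{display_name}: ~{total} ms total input-to-action delay")

# TV (game mode): ~127 ms, Gaming monitor: ~111 ms -- roughly a 13% difference,
# dwarfed by the swings in your network ping alone.
```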
Now that we've got answers to your question, let's address the whole 4K gaming debate, especially on a console. Higher resolution makes graphics sharper and crisper. The need to increase resolution came as TV/monitor screens started to get bigger: we needed more pixels on a bigger screen to keep the same PPI (Pixels Per Inch). We need higher resolution when we increase the screen size or decrease the viewing distance. Here's how much screen size and viewing distance affect our ability to actually discern the higher resolution:
https://youtu.be/ehvz3iN8pp4
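If you want to put numbers on that, PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the screen sizes here are just examples I picked):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = diagonal resolution in pixels / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Example sizes (assumptions): a 27" monitor vs a 55" TV.
print(f'1080p @ 27": {ppi(1920, 1080, 27):.0f} PPI')   # ~82 PPI
print(f'4K    @ 27": {ppi(3840, 2160, 27):.0f} PPI')   # ~163 PPI
print(f'1080p @ 55": {ppi(1920, 1080, 55):.0f} PPI')   # ~40 PPI
print(f'4K    @ 55": {ppi(3840, 2160, 55):.0f} PPI')   # ~80 PPI
```

Notice how a 4K 55" TV lands at roughly the same PPI as a 1080p 27" monitor, which is why viewing distance matters so much.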
4K gaming in demanding AAA titles is already tough for the most powerful PCs, IF YOU DO IT RIGHT. The whole reason for going to 4K (3840x2160) is to get sharper textures, crisper graphics, and higher detail. But if you have to lower the graphics settings just to get the game running at decent frame rates in 4K, then it doesn't make sense. There is already a visible difference between PC and consoles (and between the PS5 and Xbox Series X themselves): smaller crowds, less grass, less scene population, etc. on consoles. The PS5 dips below 60FPS running Cyberpunk 2077 in "Performance Mode" at 4K and doesn't even offer a Quality Mode for it, while the Xbox Series X has a Quality Mode for Cyberpunk 2077 at 4K but locks the FPS at 30.
https://youtu.be/l0aZocGxy3I
On a side note, Ray Tracing is another marketing gimmick like 4K gaming that is not ready for prime time yet; the hardware hasn't caught up to these technologies. With Ray Tracing enabled on consoles, NONE of the AAA titles run natively at 4K. Most are rendered at a much lower resolution, sometimes lower than 1080p, then upscaled to 4K, on top of having lower graphics settings (fewer objects populating the scene, etc.).
https://youtu.be/RnZAN91pUiw
On PC, if you turn RTX ON for the eye candy, it almost cuts the FPS in half. Then you have to turn DLSS ON to get those FPS back, but now the game is rendered at a much lower resolution and upscaled by AI, so it's not as sharp, which makes the whole point of turning on the eye candy moot. It's the same on consoles.
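To see why upscaling gives the FPS back, compare the pixel counts involved (the internal render resolutions below are just illustrative of what "performance"-style upscaling modes tend to use):

```python
# How many pixels actually get rendered at various internal resolutions,
# relative to native 4K. Render resolutions here are illustrative assumptions.

native_4k = 3840 * 2160

for label, w, h in [("Native 4K", 3840, 2160),
                    ("1440p render, upscaled to 4K", 2560, 1440),
                    ("1080p render, upscaled to 4K", 1920, 1080)]:
    pixels = w * h
    print(f"{label}: {pixels / native_4k:.0%} of the 4K pixel load")

# Native 4K: 100%, 1440p: 44%, 1080p: 25% -- the GPU shades a quarter to half
# the pixels, which is where the recovered frame rate comes from.
```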
Back to the topic of resolution vs frame rate vs eye candy. Higher FPS makes for a buttery smooth experience, especially in a shooting game. There is a pretty discernible difference going from 60 to 120Hz, and 144Hz just sweetens it further. As gamers, we always try to get the maximum resolution and frame rate out of a game by first lowering the settings for Anti-Aliasing / Lighting / Shadows / Reflections / Environmental Details / Ambient Occlusion / Motion Blur, etc., before we touch Texture Resolution / Model Details.
I would suggest getting a 4K monitor/TV that can also do 120/144Hz at 1080p/1440p, and go for a display with decent HDR for the PS5.
Anyway, bottom line: chasing the latest technology will bankrupt you. Be smart about what actually matters and at what cost.