BTW, regarding the GeForce 4000 series, the newest rumors are now pointing towards August rather than the July that was earlier predicted. The current rumors also say that Nvidia will launch with the 4090 first and work down, so if you're not someone who wants a 450W-600W monster then your purchase might be even later... at which point one starts wondering whether the Radeon 7000 series' release won't become a factor (it has consistently been rumored to be a Q4 release and to, if not straight-up beat Nvidia in performance, at least match them while definitively beating them in power efficiency).
(06-03-2022, 03:43 PM)LeBoulet Wrote: [ -> ]And all this for only 4 euros more.
I mean, technically there's also the cheaper 12600KF, which is the same chip but with no iGPU. I personally like having integrated graphics since I prefer its low heat output compared to a dGPU, but it's up to you whether you want to eliminate the iGPU altogether.
For reference, the cheapest 12th gen i7 seems to be roughly 90 Euros more.
(06-04-2022, 03:14 AM)Nintendo Maniac 64 Wrote: [ -> ]BTW, regarding the GeForce 4000 series, the newest rumors are now pointing towards August rather than the July that was earlier predicted. The current rumors also say that Nvidia will launch with the 4090 first and work down, so if you're not someone who wants a 450W-600W monster then your purchase might be even later... at which point one starts wondering whether the Radeon 7000 series' release won't become a factor (it has consistently been rumored to be a Q4 release and to, if not straight-up beat Nvidia in performance, at least match them while definitively beating them in power efficiency).
I mean, technically there's also the cheaper 12600KF, which is the same chip but with no iGPU. I personally like having integrated graphics since I prefer its low heat output compared to a dGPU, but it's up to you whether you want to eliminate the iGPU altogether.
For reference, the cheapest 12th gen i7 seems to be roughly 90 Euros more.
I've been a lifelong Intel and Nvidia user. I wanted to try AMD for the first time with the Ryzen 9 5900HX, but fate decided otherwise! Here I am back at Intel, and as a result there's a good chance I'll stay with Nvidia for the 4000 series. Now, if the new cards from AMD are much better than Nvidia's (EVGA), I might reconsider my position.
I currently have a 750 W power supply in my main setup; hopefully that will be enough.
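As a rough sanity check, here's a quick power-budget sum in Python - a sketch with illustrative wattages I'm assuming (ballpark peak draws for a 9900KF and a factory-overclocked 2080), not measured figures for this build:

```python
# Rough PSU headroom estimate. All wattages are illustrative assumptions
# (ballpark peak draws), not measured values for this specific build.
components = {
    "CPU (i9-9900KF, stock, peak)": 170,
    "GPU (RTX 2080 XC Ultra, factory OC)": 250,
    "Motherboard / RAM / SSDs / fans": 75,
}
psu_capacity = 750  # watts

total_draw = sum(components.values())
headroom = psu_capacity - total_draw
print(f"Estimated peak draw: {total_draw} W")
print(f"Headroom on a {psu_capacity} W PSU: {headroom} W "
      f"({headroom / psu_capacity:.0%})")

# Swapping in a rumored 450-600 W next-gen card would put the total at
# roughly 695-845 W, which is why 750 W starts to look marginal.
```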
I also prefer to have an integrated GPU: if there's a problem with my graphics card, it leaves me the option of using the machine while I look for another graphics solution. In any case, the price difference is on the order of 25 euros; it's not nothing, but it's not expensive for insurance.
(06-04-2022, 04:03 AM)KHg8m3r Wrote: [ -> ]I was asking more because you might want to buy a Z690 motherboard to support overclocking, like the ASRock Z690M-ITX/ax: https://www.amazon.fr/ASRock-Z690M-ITX-AX-Mini-ITX-LGA1700-Sockel/dp/B09N6Y6DLH/ref=sr_1_1?__mk_fr_FR=%C3%85M%C3%85%C5%BD%C3%95%C3%91&crid=3JHHMBN9SHDPE&keywords=ASRock+Z690M-ITX%2Fax&qid=1654278558&sprefix=gigabyte+z690i%2Caps%2C754&sr=8-1
Slightly more expensive, but it'll be more useful for you down the line if you want to overclock or add another M.2 drive. It also has a better Wi-Fi card (Wi-Fi 6E) and a PCIe 5.0 slot for future GPUs that use it.
Thank you for your interest.
I saw this motherboard model during my research, and I hit an initial snag when I noticed that, for the price, it doesn't come with an "I/O shield"; it seems you have to buy one separately. (I couldn't find one.)
It's scandalous.
After reflection, the "Gigabyte B660I Aorus Pro DDR4" will be enough for me, insofar as this PC is intended primarily for emulation, I won't be overclocking, and I never intend to change graphics cards again. The 2080 XC Ultra will see out its mission in this build.
This configuration seems more than sufficient to emulate up to the PS3/Xbox 360 generation. I'd carry on with real consoles for generations after that.
On the other hand, you've just made me realize that the Z390-H Gaming (PCIe 3.0 x16) in my main PC won't be advanced enough for future generations of graphics cards, so if I change cards, I'll have to change processor, RAM, motherboard, and power supply...

Compared to even just 5 years ago, overclocking on both K-model (Intel) and X-model (AMD) CPUs like the 12600K brings far smaller gains than it used to, unless you're ready to throw power consumption and thermal efficiency out the window with 360mm radiators and/or dual-tower air coolers - something that's impractical for SFF PCs. In general the main gains are in all-core workloads (hence the existence of things like "multicore enhancement"), which is less applicable to emulation, and those gains exist largely because stock clocks for highly-threaded workloads are lower due to, once again, power and heat output.
Nowadays you really only tend to gain a decent benefit in normal "gaming" PCs by overclocking non-K or non-X CPUs but, at least for Intel, non-K specifically means a locked multiplier, which makes overclocking difficult except in certain situations (e.g. BCLK overclocking, or the rare non-K unlocked multiplier via a special BIOS or a special CPU like my G3258). The tl;dr is that AMD's non-X CPUs are the main CPUs that gain a substantial benefit from overclocking without going big on cooling and/or power consumption (and even then, AMD's own "Precision Boost Overdrive" does a good job of getting you most of the way there, much like "multicore enhancement" does on Intel). There is one exception for the many-core Ryzen CPUs under all-core workloads, where the all-core clocks are considerably lower due to, again, power and heat constraints - but overclocking those tends to hurt gaming performance, since the stock turbo clocks are almost always higher than what you can achieve with an all-core overclock.
Anyway, about PCIe: thus far PCIe 3.0 at x16 isn't really a bottleneck, but who knows when that will change; keep an eye out for reviews (this is something Hardware Unboxed / TechSpot has tested with regard to PCIe 4.0 vs PCIe 3.0 on current-gen GPUs).
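To put rough numbers on that, here's a quick sketch of nominal PCIe link bandwidths (one direction; real-world throughput is a bit lower due to protocol overhead):

```python
# Nominal one-direction PCIe bandwidth per lane, in GB/s.
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding; each later
# generation roughly doubles the per-lane rate.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [(3, 16), (4, 16), (5, 16), (5, 8)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")

# Note that PCIe 5.0 x8 equals PCIe 4.0 x16 -- the lane-cut scenario
# where a 5.0 slot would actually matter for a GPU.
```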
(06-04-2022, 04:49 AM)LeBoulet Wrote: [ -> ]...
I saw this motherboard model during my research, and I hit an initial snag when I noticed that, for the price, it doesn't come with an "I/O shield"; it seems you have to buy one separately. (I couldn't find one.)
It's scandalous.
Maybe Google Translate isn't working very well on the Amazon page, but the only place I see "no I/O shield" mentioned is in a review where they say there isn't an integrated I/O shield. According to the manufacturer's website, there is an I/O shield included in the box: https://www.asrock.com/mb/Intel/Z690M-ITXax/#Specification (under the Accessories section of the Specifications tab). That just means the shield is a removable piece that you have to remember to install when building. I've built several computers with ASRock boards, and that's how they've all been in my experience.
Despite the lower-end B-series chipset, the Gigabyte board in question looks to be of substantially higher quality. For one thing, with an AIO there's not going to be much (if any?) airflow over the VRMs, and the Gigabyte looks to have substantially higher-rated VRMs as well as better cooling on them - and we're dealing with an SFF PC with a factory-overclocked 2080, so you'd better believe there's going to be some heat in that case.
Also the rear I/O and port selection on the Gigabyte is just plain better unless you really want an analog audio input jack built into the motherboard itself (the Gigabyte has an optical audio output instead).
That being said, the ASRock does have two M.2 slots while the Gigabyte only has one, but even on the ASRock they're still just PCIe 4.0. The PCIe 5.0 on the ASRock is only for the x16 slot, which is honestly of questionable benefit unless some future PCIe 5.0 GPUs cut down to x8 lanes (M.2 SSDs and general I/O seem to be the main beneficiaries of PCIe 5.0).
So it looks to me like Gigabyte put more money into overall build quality and rear I/O, while ASRock put more money into the chipset (which covers PCIe lanes - hence the second M.2 - PCIe bandwidth, and overclocking support).
EDIT: Question - with the GeForce 4000 series looking to be August at the earliest and you planning to "steal" the 2080 from your existing PC, are you planning to just have your existing PC forgo a dGPU for the time being?
Basically, if you don't plan to build this ITX PC until the GeForce 4000 series is available (and/or Radeon 7000, depending on how long Nvidia actually takes, as I've mentioned previously), then it seems a bit hasty to be looking at parts now - especially with Zen 4/Ryzen 7000 launching in autumn and Intel 13th gen seemingly launching before that (September at the earliest? If so, that could very well coincide with a new GPU purchase anyway).
(06-04-2022, 07:14 AM)Nintendo Maniac 64 Wrote: [ -> ]Anyway, about PCIe: thus far PCIe 3.0 at x16 isn't really a bottleneck, but who knows when that will change; keep an eye out for reviews (this is something Hardware Unboxed / TechSpot has tested with regard to PCIe 4.0 vs PCIe 3.0 on current-gen GPUs).
About PCIe 3.0 x16: rather than eyeing the 4000 series, given a possible limitation from my PCIe slot (I'll check anyway), I'm currently looking at the 3000 series, which could be a good alternative - with the 4000 series' release approaching, 3000-series prices are naturally likely to fall.
And the 3000 series is compatible with PCIe 3.0 x16; the only problem is that I have a relatively large bottleneck with my i9-9900KF processor:
around 25% at 1080p with an RTX 3090.
around 21% at 1080p with an RTX 3080.
around 13% at 1080p with an RTX 3070.
around 6.7% at 1080p with an RTX 3060 Ti.
around 0.6% at 1080p with an RTX 3060.
(These figures are for the "general tasks" workload category.)
If I understand the principle of a bottleneck correctly, I'm limited to the RTX 3070 at most within the 3000 series.
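For what it's worth, here's how I'd read those percentages - a sketch assuming the calculator's number is the fraction of the GPU's potential frame rate that the CPU can't feed (the baseline FPS values below are made up purely for illustration):

```python
# Illustrative only: assumes the calculator's "bottleneck %" means the
# share of the GPU's potential frame rate left unused by the CPU.
# The baseline FPS figures are hypothetical, not benchmarks.
bottleneck_pct = {
    "RTX 3090": 25.0,
    "RTX 3080": 21.0,
    "RTX 3070": 13.0,
    "RTX 3060 Ti": 6.7,
    "RTX 3060": 0.6,
}
hypothetical_fps = {  # made-up 1080p potential averages, for illustration
    "RTX 3090": 200,
    "RTX 3080": 185,
    "RTX 3070": 150,
    "RTX 3060 Ti": 130,
    "RTX 3060": 110,
}

for gpu, pct in bottleneck_pct.items():
    potential = hypothetical_fps[gpu]
    delivered = potential * (1 - pct / 100)
    print(f"{gpu}: ~{delivered:.0f} of a potential {potential} FPS "
          f"({pct}% lost to the CPU)")

# Note: even with a larger percentage, a faster GPU can still deliver
# more absolute FPS, so the percentage alone doesn't cap you to a tier.
```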
Could overclocking the processor help the situation?
(06-04-2022, 08:45 AM)Nintendo Maniac 64 Wrote: [ -> ]EDIT: Question - with the GeForce 4000 series looking to be August at the earliest and you planning to "steal" the 2080 from your existing PC, are you planning to just have your existing PC forgo a dGPU for the time being?
Basically, if you don't plan to build this ITX PC until the GeForce 4000 series is available (and/or Radeon 7000, depending on how long Nvidia actually takes, as I've mentioned previously), then it seems a bit hasty to be looking at parts now - especially with Zen 4/Ryzen 7000 launching in autumn and Intel 13th gen seemingly launching before that (September at the earliest? If so, that could very well coincide with a new GPU purchase anyway).
No, I can't do without a graphics card in my main computer while I wait to buy a new one.
In other words, you're right: the hardware may evolve by the time I'm ready to take action.
But the configuration seems capable enough right now to achieve what I'm aiming for, which has its own limits anyway. So I'm not too worried about the hardware evolving.
Unless, because of that evolution, the components I'm currently looking to acquire are no longer available for purchase (mainly the motherboard).
(06-04-2022, 08:51 AM)LeBoulet Wrote: [ -> ]I'm currently looking at the 3000 series, which could be a good alternative - with the 4000 series' release approaching, 3000-series prices are naturally likely to fall.
Just keep in mind that, unless you specifically want DLSS (at least where FSR 2.0 isn't yet available) or Nvidia's currently-better raytracing performance, Radeon 6000 pricing is staying considerably less inflated than GeForce 3000 pricing, resulting in silly price match-ups where a budget Radeon 6600 XT can be had for the same price as a higher-end GeForce 3050 (not a typo).
(06-04-2022, 08:51 AM)LeBoulet Wrote: [ -> ]I have a relatively large bottleneck with my i9-9900KF processor:
around 25% at 1080p with an RTX 3090.
around 21% at 1080p with an RTX 3080.
around 13% at 1080p with an RTX 3070.
around 6.7% at 1080p with an RTX 3060 Ti.
around 0.6% at 1080p with an RTX 3060.
(These figures are for the "general tasks" workload category.)
What is this "general tasks" category you speak of, though? Dolphin, or some other program?
Different workloads will have different CPU requirements, and that's going to make all the difference for whether you'd be CPU-bottlenecked or not. I mean, Gamers Nexus even used a 10700K @ 5.1GHz for their 3090 Ti review, and the 9900K(F) really is virtually the exact same CPU with just slightly-tweaked stock clocks.
(06-04-2022, 09:13 PM)LeBoulet Wrote: [ -> ]In other words, you're right: the hardware may evolve by the time I'm ready to take action.
But the configuration seems capable enough right now to achieve what I'm aiming for, which has its own limits anyway. So I'm not too worried about the hardware evolving.
One thing I was thinking is that Intel 13th gen should be drop-in compatible with any LGA1700 motherboard, with the only requirement being a BIOS update. Both the Gigabyte and ASRock motherboards support Q-Flash/Flashback (which lets you update the BIOS without a CPU installed), so you could boot a 13th-gen CPU without ever needing a 12th-gen one. And since Intel historically doesn't really drop prices on their CPUs, I wouldn't be surprised if a "13600K" ended up similarly priced to the 12600K (and that's assuming something like a "13600 non-K" doesn't end up having E-cores), at which point it'd be silly to get a 12600K - barring any weird Intel platform-segmentation shenanigans that make the idea impractical.
Of course, I would imagine that 700-series motherboards would also launch alongside Intel 13th gen...