Dolphin, the GameCube and Wii emulator - Forums

Full Version: Hardware Discussion Thread
I didn't know Lenovo was considered such a bad brand, but I got a Lenovo Edge 15 this past October and so far this laptop has been good. i7-4510U, 8 GB of RAM, 840M. It isn't for gaming, but it can handle it. The build quality is great. I like the keyboard. The trackpad was meh. The only problems were that the speakers are underneath the laptop rather than above the keyboard, and it came with some features I didn't care for (the screen rotates backwards and it's a touchscreen). I got it for $500 from a friend.
@Garteal, if you still haven't bought anything yet, you should give this laptop a look:
https://www.youtube.com/watch?v=jJPQfWCuxAo
That 4K version looks very tempting. I'll still have to add an SSD and replace the thermal paste, though.
I have been experimenting with 4K 10-bit HEVC video rips lately, in preparation for UHD Blu-ray releasing later this year and Netflix transitioning to it. At the moment, the Nvidia GTX 960/950 (second-gen Maxwell) and the Nvidia Tegra X1 seem to be the only GPUs that support full hardware decoding for it. Intel will not support it until Kaby Lake and AMD will not support it until 3rd-gen GCN, both of which release later this year. VP9 full hardware decode is supported by Skylake, the Nvidia Tegra X1, and the Nvidia GTX 960/950.

Without hardware acceleration my HTPC's CPU (dual-core Sandy Bridge, 2.9 GHz, no HT) chokes on 4K HEVC content, pulling a measly 5-10 fps at 100% CPU load. My desktop with its Core i5-3570K (Ivy Bridge, quad-core, no HT, 3.9 GHz Turbo) pulls 24 fps at 40% CPU load thanks to Kepler's hybrid decode support. I would upgrade my HTPC's GPU, but I can't justify $180 for a GTX 950.
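If anyone wants to reproduce this kind of comparison, here's a rough sketch that times a full decode with and without hardware acceleration. It assumes ffmpeg is installed and on your PATH, and the filename is just a placeholder for whatever test clip you use:

Code:
import subprocess
import time

def time_decode(extra_args):
    # Decode the whole file and throw the output away (-f null), timing the run.
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-loglevel", "error", *extra_args,
         "-i", "sample_4k_hevc.mkv", "-f", "null", "-"],
        check=True,
    )
    return time.time() - start

sw_seconds = time_decode([])                    # pure software decode
hw_seconds = time_decode(["-hwaccel", "auto"])  # let ffmpeg pick DXVA2/NVDEC/VAAPI
print(f"software: {sw_seconds:.1f}s   hardware: {hw_seconds:.1f}s")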

Future-proofing for the 4K era won't really be viable until next year. Still, it's exciting to see things finally starting to change. We've been using 16:9 1080p 24/30 fps H.264 BT.709 8-bit YCbCr 4:2:0 SDR with 7.1 surround as the gold standard for way too long. I just hope content distributors will respond by raising bitrates instead of overcompressing. The bitrates used for 1080p web streaming are generally way too low as it is.
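To put "bitrates are way too low" into perspective, here's a back-of-the-envelope comparison. The 16 Mbit/s figure is just an assumed ballpark for 4K web streaming, not any particular service's number:

Code:
# Raw data rate of 4K 10-bit 4:2:0 video at 24 fps vs. a typical streaming bitrate.
width, height, fps = 3840, 2160, 24
bits_per_pixel = 10 * 1.5            # 10-bit samples, 4:2:0 = 1.5 samples per pixel
raw_mbps = width * height * bits_per_pixel * fps / 1e6
stream_mbps = 16                     # assumed ballpark for 4K web streaming
print(f"raw: {raw_mbps:.0f} Mbit/s, streamed: ~{stream_mbps} Mbit/s "
      f"(~{raw_mbps / stream_mbps:.0f}:1 compression)")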
HDR? Even more brightness (10x higher brightness levels than current LDR TVs)? @o@
But most TVs are already excessively bright, especially if you watch your media in a darkened room.
Do you *really* want to experience being blinded by really bright light, just like in real life?
7000 nits is now the absolute minimum a TV should have to pass as "HDR-ready".

They did a similar thing with "HDR" audio: new high-end DACs with over 130 dB of dynamic range. But to really tell whether there's any difference between these new DACs and a standard DAC, you'd have to turn up the volume to "insane" levels (130 dB SPL, the threshold of pain) and permanently damage your hearing in the process.
110 dB is more than enough even for the most sensitive (or damaged) ears, because no sane person would turn the volume up above that :)
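To put those dB figures in perspective, an ideal N-bit converter has a dynamic range of roughly 6.02*N + 1.76 dB (the quantization-noise limit). A quick sanity check, purely illustrative:

Code:
# Ideal dynamic range of an N-bit converter: ~6.02*N + 1.76 dB.
def dynamic_range_db(bits):
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.0f} dB")
# 16-bit: ~98 dB, 20-bit: ~122 dB, 24-bit: ~146 dB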

The good thing about these new HDR TVs is the potentially lower power consumption at minimum brightness / contrast settings (and maybe longer backlight lifetimes because of this) :)

What most people need now is very low black levels (nearly infinite contrast, like a CRT) *and* efficient screen polarizers (this is just as important as, if not more important than, the black levels), so the black stays pitch-black at any room brightness level and doesn't turn grey or washed out the way a CRT does.

8-bit color [with dithering] is also more than enough for most people (no visible banding at all).
What we need is not more bits (12-bit), but better-quality streams / an updated standard that finally makes full use of the bits we already have (4:4:4 instead of 4:2:0).
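If you're curious what 4:4:4 actually costs in raw data, here's a rough comparison at 1080p (purely illustrative numbers):

Code:
# Samples per frame at 1080p: 4:2:0 vs 4:4:4 chroma.
width, height = 1920, 1080
luma = width * height
chroma_420 = 2 * (width // 2) * (height // 2)   # two chroma planes, quarter resolution
chroma_444 = 2 * width * height                 # two chroma planes, full resolution
print(f"4:2:0: {luma + chroma_420:,} samples/frame")
print(f"4:4:4: {luma + chroma_444:,} samples/frame (2x the raw data)")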

+1 for higher bitrates for streamed content. When 1080i broadcasts look like 480i, it's not even funny. It's also sad that the 480p SD version of the same stream, post-processed by the TV's enhancements, looks better than the 720p (HD) one.

What I also want to see is *affordable* TVs with REAL refresh rates higher than 60 Hz (e.g. 100 Hz or 120 Hz) and an option to enable enhancements even in PC mode.
Most (all?) TVs have neither support for, nor an option to switch to, the sRGB color space in 'Game' or 'TV' mode.

About fixed-function hardware video acceleration (DXVA):
Hardware-accelerated playback is unreliable (too many issues, especially with web browsers, high-framerate videos, and AMD drivers). Things break after GPU driver updates, and it goes out of date quickly as new additions to the spec are released and more demanding levels / higher bitrates / framerates come into use.
There's also the issue of fixed (read: lower) quality scaling and color reproduction when using DXVA.
And finally, HW acceleration often breaks once the GPU / drivers reach legacy status.
The only advantage hardware acceleration has over SW playback is low power consumption.
Software playback wins hands down.
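If you do want to see which acceleration backends your setup actually exposes before relying on them, something like this works (assuming an ffmpeg build on your PATH):

Code:
import subprocess

# List the hardware decode backends this ffmpeg build was compiled with
# (DXVA2, D3D11VA, NVDEC/CUVID, VAAPI, ...).
result = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                        capture_output=True, text=True)
print(result.stdout)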

The optimal choice for UHD Blu-ray playback would be a multi-core CPU (8 cores minimum) with high IPC and an integrated GPU with HBM2 and full next-gen HSA support... so we're still waiting for that 2017 AMD Zen APU and the next-gen 8-core Intel Core iX CPU.

Is VP9 acceleration *that* important? Aside from YouTube and a few Linux users, nobody else cares about VP9. Everyone and their dog who cares about small file sizes and image quality uses H.265/HEVC for anything above 1080p (4K/5K/8K).
Oh for fuck's sake. Can you please just leave us alone already? Haven't you gotten the message yet, with everyone trying to ignore your posts? Nobody here cares about your increasingly ludicrous and unsubstantiated complaints about everything. I used to try ignoring you wherever you went, but that's become increasingly difficult as you now seem to pop in just about everywhere instead of just development discussion. I don't have time to write paragraph-long explanations to correct each of your points. If this had been posted a few years ago, when I didn't have a full-time job and school, I would have given you that benefit, as anyone here can attest, but I no longer have that luxury.

I know this post didn't break any rules but please mods just ban him already. He's done enough in the past to warrant a permaban at this point and all of the devs can't stand putting up with him.
I swear HDR has to do with colors and contrast...
(02-09-2016, 09:03 AM)DatKid20 Wrote: I swear HDR has to do with colors and contrast...

and capturing shadows and highlights
@DatKid20

Think dark blacks and bright whites at the same time (high static contrast) without washing out details in dark or bright areas of a scene (in other words, it maintains high greyscale accuracy). Having calibrated "high-end" HDTVs before, I can't believe it took this long for this to be standardized.

For anyone interested in learning about this, here is the white paper for Dolby Vision, the most popular HDR solution in home theater right now: http://www.dolby.com/us/en/technologies/dolby-vision/dolby-vision-white-paper.pdf

I'll warn you that the first half is mostly marketing fluff. And here's a decent article for the lazier among us: http://gizmodo.com/how-dolby-vision-works-and-how-it-could-revolutionize-1594894563

Fun fact: since Kirby's post made me curious, I looked up the brightness of sunlight when staring directly at the sun. It's about 1.6 billion nits under optimal conditions, just slightly higher than the 1000 nits he claims is going to cause us to "suffer eye damage".
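For scale, that ratio expressed in photographic stops (quick and dirty):

Code:
import math

ratio = 1.6e9 / 1000          # direct sunlight vs. a 1000-nit HDR display
print(f"{ratio:,.0f}x brighter (~{math.log2(ratio):.1f} stops)")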

Also, it appears I was wrong earlier in predicting that Intel would maintain support for BCLK OC on non-K Skylake. They have officially patched it out in the latest UEFI updates: http://www.guru3d.com/news-story/intel-halts-cheap-non-k-overclocking-by-closing-skylake-bios-loophole.html

If you're planning on doing BCLK OC on a non-K Skylake CPU with a new mobo, check your UEFI version and roll back if necessary.
(02-10-2016, 04:02 PM)NaturalViolence Wrote: Also, it appears I was wrong earlier in predicting that Intel would maintain support for BCLK OC on non-K Skylake. They have officially patched it out in the latest UEFI updates: http://www.guru3d.com/news-story/intel-halts-cheap-non-k-overclocking-by-closing-skylake-bios-loophole.html

If you're planning on doing BCLK OC on a non-K Skylake CPU with a new mobo, check your UEFI version and roll back if necessary.

BOOOOOOHHHHH!!!!