TSM Episode 368: The Ill-Made Knight

Lancelot's opinion of himself is one not shared by others and, in fact, it is unclear whether there is anything noticeably wrong with his appearance at all. This version of him, from 'Merlin' (BBC TV, 2008-2012), may be close to the one White imagined.

Le Chevalier Mal Fet

The Starlight Megaphone
Download: Released 2016.03.21

Because Australians have not yet invented the concept of time, the switch to Daylight Saving Time completely befuddles SiliconNooB, leaving Lusipurr in the lurch. Blitzmage quickly steps into the breach with a dizzying array of non-Cricket news.

16 Comments

  1. Sebastian
    Posted 2016.03.21 at 02:55 | Permalink

    Re: PS4.5

    The issue Sony (and Microsoft, for that matter) will have with these iterative updates they’d like to have is their reliance on AMD. They both went with custom implementations of an AMD APU (accelerated processing unit – combines CPU and graphics onto one chip), and the 2013-era architecture (Jaguar) hasn’t been updated with any significance yet. AMD’s next microarchitecture (Zen) has yet to be released, and until it’s ready (and it very well could be later this year), we won’t see a hardware bump.

    APUs are great for small form-factor systems that need moderate graphics performance, but they are not ‘high performance’ compared to a standalone CPU and (even more so) a discrete GPU. The compute power per clock cycle of AMD’s current architecture trails Intel’s by almost eight years of progress, and it consumes much more power in the process. The voltage (and therefore power) required to render HD graphics at even low-to-medium quality settings (by desktop standards) is enough to push an APU to 100% utilization, and pegging it inside a slim enclosure like the PS4 creates a lot of heat. (There is a back-of-the-envelope power sketch at the end of this comment.)

    The only way AMD can provide Sony and Microsoft with more power within the same thermal limits is a process change combined with improved IPC (instructions per clock). A smaller process node (moving from 28nm down to 16nm, for example) would be essential, and AMD has been significantly behind Intel here as well.

    Bottom line, unless Zen is the world-beater that AMD would love it to be, the PS4 – be it 4 or 4.5 – will not be able to power VR. The most it could do under its current power is AR (augmented reality). 4K or full VR would require full-sized components. Not an APU, even if it does have 8GB of GDDR5.
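
    To put some rough numbers on the thermal point: dynamic power in a chip scales roughly as C × V² × f, so voltage is the lever that matters. The figures in the sketch below are purely illustrative placeholders (not actual PS4 or Jaguar specifications), but they show why a process shrink that permits a lower voltage buys back far more headroom than clock changes alone.

        # Illustrative sketch of CMOS dynamic power scaling: P ~ C * V^2 * f.
        # All figures are placeholder values, not real PS4/APU specifications.

        def dynamic_power(capacitance, voltage, frequency_ghz):
            """Relative dynamic power, proportional to C * V^2 * f."""
            return capacitance * voltage ** 2 * frequency_ghz

        baseline = dynamic_power(capacitance=1.0, voltage=1.10, frequency_ghz=1.6)

        # Raising the clock 25% at the same voltage raises power by ~25%.
        overclocked = dynamic_power(1.0, 1.10, 2.0)

        # A shrink that allows ~10% lower voltage alongside that 25% higher clock
        # lands back near the original power budget, because power scales with V^2.
        shrunk = dynamic_power(1.0, 0.99, 2.0)

        for label, power in [("baseline", baseline),
                             ("overclocked", overclocked),
                             ("shrunk + overclocked", shrunk)]:
            print(f"{label:22s} relative power: {power / baseline:.2f}x")

    Without the node change there is no free lunch: extra clock speed comes straight out of the thermal budget.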

  2. SiliconNooB
    Posted 2016.03.21 at 09:44 | Permalink

    I’m not really sure which direction Sony want to take this in. If they shrink the fabrication process then they could potentially allow themselves some headroom for overclocking. Otherwise they’d have to commission AMD to design a completely new APU, so I dunno… Maybe AMD are already designing a successor APU for use in laptop computers?

  3. Wolfe
    Posted 2016.03.21 at 17:08 | Permalink

    I’m in full seriousness when I say that if the PS4.5 turns out to be a thing, I’m just going to move to PC gaming. There will be no bloody difference at that point as I’ll constantly be expected to spend hundreds of dollars to keep up with the Joneses. Hell, I might just have to stop gaming, I can barely swing this hobby as is.

  4. Lane
    Posted 2016.03.21 at 22:04 | Permalink

    AMD does not have the R&D budget to compete with nVidia in the GPU space, and APUs will always remain a niche product in the PC sector: for similar money, a discrete GPU (even one of AMD’s) will always be superior, and in most gaming-related situations low-end Intel CPUs mop the floor with AMD’s CPUs. The “savings” one gets by combining CPU and GPU onto a single chip rarely pay off for AMD, and indeed even Intel has an “APU” (though they don’t call it that) that matches or exceeds AMD’s capabilities in the PC sector (look at the “Iris Pro” line of integrated graphics).

    In fact, only the massive optimization for consoles brings them close to “par” with moderate PC systems.

    If Microsoft or Sony are serious about SFF PC-style gaming on an x86 platform, they would be better served by going on their knees to Intel, asking for a cheap i3 chip without integrated graphics, and then begging nVidia to let them use something akin to a 970M or 980M in a new generation of console. Not only would that improve performance over the Franken-AMD processors in current consoles, it would also probably significantly lower the thermal envelope in which that performance could be obtained, by separating the CPU and GPU heat-generating sources and permitting engineers to come up with a better cooling system. Trying to move a lot of hot air off a single chip is more difficult than moving slightly less hot air from two separate chips. If they could integrate that with HBM instead of GDDR5, it would help significantly.

    But there’s no way they’ll drive current-gen consoles into obsolescence. That would be a blunder on par with the “New 3DS,” and Microsoft and Sony will not ride the Nintendo train to failtown (haha, who are we kidding, if they saw a buck in it they totally would).

  5. Sebastian
    Posted 2016.03.22 at 09:47 | Permalink

    @Lane

    I couldn’t agree more. AMD has been floundering for years, and buying into their APU platform was a bit shortsighted, considering how long it takes AMD to develop a new architecture. The PS4 is using laptop-caliber hardware, and an Intel Core i3 with high-end discrete graphics (NVIDIA being preferable given the thermal constraints) would yield a system capable of 4K support (GTX 900-series cards natively support HDMI 2.0 as well – see the bandwidth arithmetic at the end of this comment) that runs cooler and quieter in the process.

    People love to hate Intel, but they charge whatever they like for their CPUs because AMD hasn’t been competitive since the Athlon 64. AMD’s highest-end CPU product (the FX-9590 CPU) requires a shocking 220 W – and still underperforms next to even the entry-level Haswell Core i5. I wrote up a big article on CPU/GPU gaming performance last year for PC Perspective, and it illustrates (if nothing else) that a Core i3 is more than enough for most games, even with a midrange or higher GPU.

    (The article is available here, but it was never as interactive as I would have liked, and the charts are comically large: http://www.pcper.com/reviews/Systems/Quad-Core-Gaming-Roundup-How-Much-CPU-Do-You-Really-Need )

    Ultimately, the choice to go with AMD made sense a few years ago when AMD was suggesting they would have enthusiast-level graphics from future APUs, but it just hasn’t happened and their CPU performance holds them back significantly.
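
    On the HDMI 2.0 point, the bandwidth arithmetic is quick. The link figures below are the nominal rates (10.2 Gbps for HDMI 1.4 and 18 Gbps for HDMI 2.0, with roughly 80% of that usable after 8b/10b encoding), and blanking intervals are ignored, so treat this as an approximation only.

        # Rough check: can a given HDMI revision carry 4K at 60 Hz?
        # Effective rates assume 8b/10b encoding; blanking overhead is ignored,
        # so these are approximations rather than exact TMDS timings.

        def video_bandwidth_gbps(width, height, fps, bits_per_pixel=24):
            """Raw (uncompressed) pixel bandwidth in Gbps."""
            return width * height * fps * bits_per_pixel / 1e9

        uhd_60 = video_bandwidth_gbps(3840, 2160, 60)   # ~11.9 Gbps
        uhd_30 = video_bandwidth_gbps(3840, 2160, 30)   # ~6.0 Gbps

        hdmi_effective_gbps = {
            "HDMI 1.4 (10.2 Gbps link)": 8.16,
            "HDMI 2.0 (18 Gbps link)": 14.4,
        }

        for name, capacity in hdmi_effective_gbps.items():
            verdict_60 = "fits" if uhd_60 <= capacity else "does not fit"
            verdict_30 = "fits" if uhd_30 <= capacity else "does not fit"
            print(f"{name}: 4K60 {verdict_60} ({uhd_60:.1f} Gbps), 4K30 {verdict_30}")

    That is the whole reason 4K at 60 Hz needs HDMI 2.0: the uncompressed signal simply does not fit down an HDMI 1.4 link.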

  6. Lane
    Posted 2016.03.22 at 11:37 | Permalink

    Intel has been anti-competitive and had some shady business practices w/r/t AMD in the past, and nVidia is hardly a champion of openness and fair business practices.

    On the other hand, Intel and nVidia have the performance to back it up. I’ve been totally mercenary with my own PCs, but currently I have an i7 and 2x GTX 980s in SLI because of the performance gains and the thermal envelope I run in. While I don’t doubt two Fury Xs would run faster (and I have a home-built watercooling system, so cooling isn’t that much of an issue for me), I also have G-Sync monitors. I know it’s proprietary technology, but damn if it doesn’t work and work well. And nVidia GameWorks, as unfair as it is to AMD cards, seems to help out the games I enjoy playing more (e.g., Witcher 3 ran like shit on AMD cards for a while after release, and HairWorks still slows those cards to a crawl. Geralt’s flowing locks are an important part of the experience for me, apparently).

    In other words, I know PC gaming has made deals with its devils. While I enjoy AMD’s can-do attitude and ethos, the fact of the matter is I can’t get better FPS from good feelings, and while the Fury X may barely dethrone the 980 Ti at large resolutions currently (I don’t play in 4K because very few cards can run newer games at acceptable framerates; 2560×1440 works well enough for me at the moment), nVidia is probably a month or two away from releasing a 16nm GPU on a FinFET process, which will mean the ecosystem I’ve already invested in will once again produce superior framerates.

    I can well imagine nVidia NOT wanting to cooperate with MS and Sony, however. Their Shield STB attempts to bring cloud-based PC gaming and PC streaming to the living room, and while this tech is in its infancy and highly dependent on a fast connection, the time is coming when high-throughput wifi in homes and near-universal gigabit Internet will be a reality for the majority of consumers, and I expect a model like that of GameStream to become the norm rather than a techy niche. (Rough bitrate numbers at the end of this comment.)
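
    Some rough numbers on the streaming point, since the whole model lives or dies on compression. The 20 Mbps figure below is only a ballpark I am assuming for a 1080p60 H.264 game stream, not a published GameStream specification.

        # Rough illustration of why game streaming depends on heavy compression.
        # The 20 Mbps stream bitrate is an assumed ballpark, not a vendor figure.

        def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
            """Uncompressed video bitrate in Mbps."""
            return width * height * fps * bits_per_pixel / 1e6

        raw_1080p60 = raw_bitrate_mbps(1920, 1080, 60)   # ~2986 Mbps uncompressed
        assumed_stream_mbps = 20                          # hypothetical H.264 stream

        print(f"Uncompressed 1080p60:      {raw_1080p60:.0f} Mbps")
        print(f"Assumed compressed stream: {assumed_stream_mbps} Mbps "
              f"(~{raw_1080p60 / assumed_stream_mbps:.0f}:1 compression)")
        print(f"Share of a 100 Mbps line:  {assumed_stream_mbps / 100:.0%}")
        print(f"Share of a gigabit line:   {assumed_stream_mbps / 1000:.0%}")

    On a gigabit connection the throughput is a rounding error; the hard parts are latency and encode quality, not bandwidth.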

  7. SiliconNooB
    Posted 2016.03.22 at 18:30 | Permalink

    “they would be better served by going on their knees to Intel, asking for a cheap i3 chip without integrated graphics, and then begging nVidia to let them use something akin to a 970M or 980M in a new generation of console.”

    Why not just throw in a couple of eight core i7s and quad GTX Titans? That way the PS4.5 could calculate the question to the meaning of life while we play our video games.

    – On a completely unrelated note, have you seen Sony’s finances recently?

  8. Lane
    Posted 2016.03.23 at 02:33 | Permalink

    I missed the last quarterly report.

    The i3 line and the M line from nVidia make it into SFF PCs (like the vapor-ish Steam Boxes) for not much more than a console, and might actually be able to run an HMD.

  9. SiliconNooB
    Posted 2016.03.23 at 09:19 | Permalink

    The cheapest Steam box that I can see advertised [ http://store.steampowered.com/sale/steam_machines ] is $50 more than the PS4 launch price. Raising the price of the PS4 might work for the cucks in Canadia, but I don’t think the rest of the world’s gamers will be quite so accepting at a time when the price of the console should be coming down. Then again, Sony make their Pâté out of geese who lay golden eggs, so anything’s possible.

  10. Lane
    Posted 2016.03.23 at 10:44 | Permalink

    That also assumes they couldn’t negotiate some sort of bulk purchase of lower-binned i3 CPUs (ones that would otherwise go to budget PC manufacturers). The GPU would likely be the bottleneck, I agree, because nVidia doesn’t do anything cheap, ever, but even a discrete Radeon GPU (say, a 380X or so) would actually have the graphical muscle to run HMDs with any sort of fidelity, and let’s be honest, AMD would take whatever scraps Sony deigned to throw.

    I don’t think Sony would do this, because it would render the existing PS4 market obsolete and they’re winning. Microsoft can talk about needing to upgrade the Xbone, but that’s because they made stupid design decisions in the first place. I’m just saying it wouldn’t surprise me if Sony feels like they got a raw deal with their custom-built APU on the PS4 and wishes they had negotiated more with Intel and nVidia, since those two companies seem to be putting out higher performance-per-dollar parts.

    That said, I am watching the next generation of discrete GPUs in the PC market with curiosity, because the buzz surrounding Pascal and AMD’s newest parts, especially w/r/t HBM, suggests we will see a larger jump in graphical fidelity between generations than we have seen since Fermi, and it’s about time. My 980s, while an improvement over the 780s I had previously, were a much more modest step up, and the 780s were only a marginal improvement over the 760s I had before that.

  11. SiliconNooB
    Posted 2016.03.23 at 11:32 | Permalink

    I’m just talking out of my arse here – but I get the feeling that Sony got exactly the kind of graphical hardware that they paid for…

  12. Sebastian
    Posted 2016.03.23 at 13:54 | Permalink

    I’m with SiliconNooB on this point – AMD doesn’t post loss after loss for no reason. AMD offers an outdated, largely inferior product at a very attractive price to these console makers.

    As an example, the desktop A10-7850K is $115 and features “supercharged AMD Radeon™ R7 graphics” that can barely manage 15 FPS at 1080p in a lot of games. Wonder why APU-powered consoles end up having to run games at 900p or below? (The pixel arithmetic is at the end of this comment.) Even when you throw ultra-fast memory at an APU’s graphics cores (a must), you still don’t have the raw horsepower of even a budget discrete option. It’s a false economy when quality is your goal.
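
    The 900p arithmetic, for anyone curious:

        # Pixel-count arithmetic behind the 900p-versus-1080p point.
        pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
        pixels_900p = 1600 * 900     # 1,440,000 pixels per frame

        ratio = pixels_900p / pixels_1080p
        print(f"900p renders {ratio:.1%} of the pixels of 1080p "
              f"(a ~{1 - ratio:.0%} cut in per-frame shading work)")

    Dropping to 900p shaves roughly 30% off the per-frame workload, which is exactly the kind of relief an underpowered GPU needs to hold its framerate.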

  13. Sebastian
    Posted 2016.03.23 at 14:05 | Permalink

    The only company that has genuinely innovated in the SFF space in the last few years is Apple, with a completely different take on thermal dissipation that actually works. MSI just released their version – the Vortex – which, once internal photos were shared online, turned out to be a mess of heatsinks and fans inside. The iMac handles thermals extremely well in its larger 5K chassis as well, with a single 1200–2800 RPM fan cooling a desktop CPU up to the i7-6700K and a discrete mobile GPU up to the AMD M395X (chosen because of OpenCL – Apple doesn’t have a great relationship with NVIDIA).

    I only mention this because consoles have been hampered by thermals for a long time. My PS3 runs ridiculously hot, and many of the early models failed. The XBox 360 was a thermal nightmare. These current consoles are doing better, but are running laptop hardware, which is significantly easier to cool. If VR is the goal, mobile hardware isn’t good enough anymore.

    I have no interest in VR, but Sony obviously does or they wouldn’t be jumping into the market with their me-too headset. It’s a fad just like 3D TV was. I was at CES in January, and 3D wasn’t. It’s all 4K, and will then move to 8K and/or HDR in the coming years. Fads are needed to sell TVs; VR is apparently needed to avoid the problem of originality and quality in gaming.

  14. Sebastian
    Posted 2016.03.23 at 14:06 | Permalink

    I was referencing the 2013 Mac Pro with my rambling about the SFF innovation from Apple, by the way. Failed to mention that.

  15. Lane
    Posted 2016.03.23 at 14:58 | Permalink

    VR can work if they develop a killer app. I think a good HMD (read: one that is much smaller and lighter, with better integrated audio) has a market: travelers who want to work in relative privacy on public transport, parents who want kids to watch a movie and shut up in the car, or workplaces where occasional PC use is necessary but there is no space for a monitor (like a shop floor).

    Notice that “games” aren’t in there. While I don’t doubt some types of games (racing or space sims, for example) would benefit immensely from VR, we’ve seen what kinds of trouble those games can get into (Driveclub, Star Citizen, Elite: Dangerous, etc.). Eve Valkyrie may be an interesting arcade-style experience, but I’m willing to bet it won’t have any sticking power beyond an Oculus tech demo.

    I could see myself using an HMD… to read or watch movies while reclining in bed on a lazy Sunday morning. I don’t see myself firing up FFXIV to run a few dungeons through an HMD, or raiding tombs in one, or playing Titanfall 2 (although, if done right, that one does show some promise). I just don’t believe it can or will be done right, largely due to hardware fragmentation and the sheer resources needed to run it. While my PC could, my PC isn’t actually representative of what most consumers are rocking.

    Also, speaking as someone whose SO used to work for Apple and had one of those i5s, while they “ran” in the sense of “does what it says on the tin,” their performance got throttled to keep heat manageable. SFF and silicon aren’t the best of friends. Maybe 10ish years down the road when we’re using graphene-based quantum chips, sure.

  16. SiliconNooB
    Posted 2016.03.23 at 15:37 | Permalink

    I just don’t think that consumers are going to want to use a contraption that makes the 32X look tidy and well-ordered by comparison.