Commentary on a Disc-Free Nintendo NX and the Khronos Membership

A few weeks ago, various reports surfaced surrounding patents filed by Nintendo that reveal a potential console lacking a disc drive. In the time since that news broke there have been other developments, few of which were actually surprising, and some of which might actually point to the NX's killer feature.

To review the recent past: earlier this year, news broke that Nintendo was looking at Google's Android as the OS base for the NX. The reaction from investors and analysts was overwhelmingly positive, with Nintendo's stock jumping multiple points on the news. Then one of the weirdest events in gaming history occurred: Nintendo broke a literally decades-old policy and directly commented on the rumor to deny it. My outside shot? The NX actually is running some kind of Linux, likely Android, and Nintendo's execs decided to turtle up rather than admit to a successful leak and simply ride the wave of positive sentiment from investors and analysts. The actual reaction from investors and analysts in the wake of that catastrophic tactical error? Overwhelmingly negative.

Later on, news seemed to break that Nintendo was not going to take steps to ensure that the NX console was at least as powerful as the Sony PS4. Multiple sources, myself included, got more than a little mileage out of noting that advances in chip fabrication, memory technologies like High Bandwidth Memory, newer and in-development ARM processors, ARM's Mali GPU reference design and its various spin-offs, as well as AMD's own Zen designs, had drastically lowered the barrier to reaching PS4 levels of rendering performance within the thermal envelope Nintendo had aimed for across the Wii and WiiU. Again, investor and analyst reactions were overwhelmingly negative. In my own terms: I would not say that the decision not to match the PS4 in rendering and processing power was the worst mistake Nintendo had ever made, but it was certainly in the Top 2.

Fast forward, and the "breaking" news of a console lacking a disc drive just wasn't that surprising. What the preceding rumors and reactions had established is that Nintendo was not only certainly changing the Instruction Set Architecture (ISA) of the processor, but also changing base operating systems. The change in base OS removed much of the desire or need for disc-based backwards compatibility. Fair shot: some people will pipe up and scream that the Xbox One has 360 disc-based compatibility! No, it doesn't. Putting a 360 disc in an Xbox One console triggers a download that includes, in most cases, at least a source recompile (think ported code) or, at minimum, a binary-translation layer (think QEMU rather than WINE) for the disc-based game's executable files.
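
To make that distinction concrete, here is a minimal sketch in C of what the simplest form of a translation layer looks like: an interpreter loop that decodes each guest instruction and performs an equivalent operation on the host. The toy ISA, opcodes, and register file below are invented purely for illustration; the actual Xbox One layer decodes real PowerPC binaries and is vastly more sophisticated.

    #include <stdint.h>
    #include <stdio.h>

    /* A toy "guest" ISA with two registers and four opcodes.
     * Purely illustrative; real translation layers decode a real
     * ISA (e.g. PowerPC) and usually recompile rather than interpret. */
    enum { OP_LOAD_IMM, OP_ADD, OP_PRINT, OP_HALT };

    typedef struct {
        uint8_t op;   /* opcode                     */
        uint8_t reg;  /* destination register (0/1) */
        int32_t imm;  /* immediate value            */
    } GuestInsn;

    /* Execute a guest program by performing an equivalent
     * host operation for each guest instruction. */
    static void run_guest(const GuestInsn *pc)
    {
        int32_t regs[2] = { 0, 0 };

        for (;; pc++) {
            switch (pc->op) {
            case OP_LOAD_IMM: regs[pc->reg] = pc->imm;         break;
            case OP_ADD:      regs[pc->reg] += regs[!pc->reg]; break;
            case OP_PRINT:    printf("r%d = %d\n", pc->reg, regs[pc->reg]); break;
            case OP_HALT:     return;
            }
        }
    }

    int main(void)
    {
        const GuestInsn program[] = {
            { OP_LOAD_IMM, 0, 40 },  /* r0 = 40        */
            { OP_LOAD_IMM, 1, 2  },  /* r1 = 2         */
            { OP_ADD,      0, 0  },  /* r0 += r1       */
            { OP_PRINT,    0, 0  },  /* prints r0 = 42 */
            { OP_HALT,     0, 0  },
        };
        run_guest(program);
        return 0;
    }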

For Nintendo, switching both the base operating system and the hardware ISA is a bitter pill to swallow. Nintendo had firmly established the expectation of keeping physical game compatibility while moving to the subsequent generational platform. Shortly after the WiiU's launch, Nintendo faced a harsh position in the tech market. IBM was not interested in continuing to develop the PowerPC line leveraged across the GCN, Wii, and subsequently the WiiU. None of the other processor vendors involved in OpenPOWER were interested in pursuing POWER processors as a contender in the form factor Nintendo's home consoles were invested in. Nintendo was faced with the very real fact that its next console was going to have to change the hardware ISA. As a company, Nintendo has expended vast resources on tailoring its software toolchains to both the ARM and PowerPC architectures, including the emulation and/or direct "new code" porting of Nintendo platform titles. The necessary change in hardware ISA effectively force-retired Nintendo's extensive software investment in PowerPC. The only question at the time was whether the WiiU's successor would be based on ARM or x86-64, which in turn would further refine the remaining questions surrounding the software and hardware options. Whichever option Nintendo chose would invariably result in a massive amount of internal re-organization, re-development, and re-training.

With respect to supporting previous software releases on a new hardware ISA, Nintendo already leverages emulation and/or binary-translation techniques, much like those used by Microsoft, in supporting game re-releases through its Virtual Console. Like Microsoft, Nintendo has stressed to investors that such re-releases are resource-consuming. Again, like Microsoft, such techniques are no guarantee of a successfully operational game. The monumental nature of the task is outlined as much by the limited number of titles released through the Virtual Console as by the very existence of the Xbox One's compatible-games list. Sony, on the other hand, played around with including the previous generation's hardware as an on-system chip in the early PS3 models. While an effective method of supporting backwards compatibility, the fact that neither Sony nor Microsoft bothered bringing a POWER+x86-64 system design to market with the Xbox One or PS4 is probably indication enough that such an approach is not viable. Ensuring that previous disc-based releases would work out of the box on a completely new hardware ISA is, at best, a daunting proposal. At worst, it's a nightmare worthy of Cthulhu.

Other market factors and advances have likewise contributed to the lowered need for physical disc-based releases. Nintendo's digital distribution and single sign-on systems have improved to the point where a Steam-like purchasing system is a completely viable option. Likewise, solid-state memory prices have cratered just as fast as capacities have increased. While 1TB high-capacity Blu-Ray discs are hardly pointless for distributing massive amounts of data, most retail-release games barely touch those kinds of space requirements. Case in point: Dying Light features textures and models for QHD/UHD resolutions, but doesn't even top a 20GB install size. According to Steam, the largest install among games I currently have is Bethesda's Wolfenstein: The New Order, which weighs in somewhere around 46GB. On the PS4, the largest digitally distributed game is Destiny, which consumes somewhere north of 60GB. These aren't exactly out-of-reach capacities for solid-state storage, where capacities as high as 512GB are currently available in the SD card format.

Nintendo was likely also influenced by the form factor of the Wii and WiiU consoles, and their strict thermal and electrical-consumption profiles. Removing the disc drive would leave more room in the thermal envelope for an improved processor and GPU. Additionally, the physical space occupied by the disc drive in the WiiU chassis could be re-purposed for other features, such as front-facing USB ports.

Looking to the successor to the WiiU, then, there were few arguments for retaining a disc drive. Combined with other ongoing, known factors, a potential picture of what Nintendo could be aiming to achieve takes shape. Keys here could be found in multiple smaller stories, such as the partnership with mobile software developer DeNA. Other keys exist in the rapid competition within the ARM processor space. As referenced in the earlier commentary on the decision not to ensure the NX is as capable as the PS4, a variety of vendors now offer 64-bit ARMv8 processors with up to 8 cores that can offer similar, if not vastly improved, compute-per-watt performance relative to the AMD Jaguar used in the PS4. While Nintendo could very realistically field a console with greater compute capability, the PS4's dominance has largely defined gaming within the 1080p resolution space. Unless Nintendo moon-shot for QHD rendering, there'd be little point to out-performing the PS4; and QHD+ gaming is well outside the thermal envelope and power requirements that Nintendo has aimed for across the Wii and WiiU.

Yet further keys to Nintendo's potential goals exist in the maturation of ARM's big.LITTLE processor technology. To dwell on big.LITTLE for a moment: there are fair arguments to be made that the processor cores in and of themselves are not the sole culprits of battery drain in mobile devices. Electrical-drain factors such as the display, memory, and system bus can offset the potential power gains from shifting processor loads around. Such concerns are not as prevalent in home devices like consoles, where thermal restrictions and electrical-drain requirements are not as limiting. Considering when the PS4, and in turn the PS4-derived Xbox One (not sorry), went into development, SMP scheduling for x86-64 processors like AMD's Jaguar was fairly mature and well implemented within the BSD and NT kernels. However, big.LITTLE has forced the Linux kernel scheduler, as used in Android, to improve load balancing and processor distribution in ways that neither the BSD nor NT kernels support. Now, imagine for a second that a big.LITTLE processor configuration was physically split: the low-electrical-consumption cores in one chassis, and the higher-performing cores in a different chassis. Would that be interesting? Hold onto that thought.
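
As a concrete illustration of the mechanism involved, here's a minimal Linux sketch in C that explicitly migrates the calling thread between core clusters using sched_setaffinity(). The cluster numbering (CPUs 0-3 as the LITTLE cluster, CPUs 4-7 as the big cluster) is an assumption for the example and varies by SoC; the whole point of the scheduler work described above is that the kernel performs these migrations automatically, without the application lifting a finger.

    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    /* Confine the calling thread to a contiguous range of CPUs.
     * On a hypothetical big.LITTLE SoC, CPUs 0-3 might be the
     * low-power cluster and CPUs 4-7 the high-performance cluster. */
    static int move_to_cluster(int first_cpu, int last_cpu)
    {
        cpu_set_t mask;
        CPU_ZERO(&mask);
        for (int cpu = first_cpu; cpu <= last_cpu; cpu++)
            CPU_SET(cpu, &mask);

        /* pid 0 == the calling thread */
        return sched_setaffinity(0, sizeof(mask), &mask);
    }

    int main(void)
    {
        /* Light workload: stay on the low-power cores. */
        if (move_to_cluster(0, 3) == 0)
            printf("on LITTLE cluster, currently CPU %d\n", sched_getcpu());

        /* Demanding workload arrives: migrate to the big cores. */
        if (move_to_cluster(4, 7) == 0)
            printf("on big cluster, currently CPU %d\n", sched_getcpu());

        return 0;
    }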

Another key likely exists in the news that Nintendo has joined Khronos as a Contributor. For those unaware of the significance of the Vulkan API, I'd suggest watching Khronos's 2015 API State of the Union, with emphasis on Valve's contribution around the 1:41:00 mark: https://youtu.be/quNsdYfWXfM?t=6050. Starting with the GameCube, Nintendo consoles have leveraged a graphics API roughly derived from OpenGL called GX. The GX API was intended to accomplish the same goals as 3dfx's Glide or Khronos's OpenGL ES: constrain elements of the OpenGL API to better suit the requirements of a computationally limited device. Which was fine when the GCN launched with fixed-function graphics in 2001. That API became a sticking point on the original Wii console, as pretty much the entire third-party developer industry had moved to programmable shaders as exposed through Microsoft's DirectX 9 and the OpenGL Architecture Review Board's OpenGL 2.0. The Wii, despite launching well after the move to programmable shaders, retained fixed graphics functions that would have been at home with the OpenGL 1.x APIs of the late '90s and early 2000s. One of the major points of interest on the WiiU was that not only was the console finally joining the modern era of programmable shader hardware, but that it would also support a more recent OpenGL API release. However, to date, no Nintendo-signed developer has openly stated what API the WiiU supports outside of GX and what has loosely been referred to as a GX2 API, extended to include OpenGL 3.x-class features.

Given that Nintendo has only indirectly worked with OpenGL development in the past, through Khronos members such as AMD, there are likely three points to take away from the public membership. The first is a likely clear sign that the NX console will leverage Khronos's Vulkan as its default graphics API. This fits with the selection of an Android derivative as the new base operating system, given that Google has also selected Vulkan for Android. The second is that Nintendo engineers, who through GX have ample experience manipulating low-overhead graphics APIs, intend to have input on the maturation of the Vulkan API. The third point is that Nintendo likely intends to leverage Vulkan's development as a graphics API for mobile devices, and either support the API on the 3DS/2DS… or launch a new mobile hardware device.
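
For a taste of what "low-overhead and explicit" means in practice, below is roughly the smallest meaningful piece of Vulkan: creating an instance. Even this first step has the application spell out its identity and target API version rather than lean on driver defaults. This is a generic sample assuming the standard Vulkan headers and loader are installed; it implies nothing NX-specific.

    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void)
    {
        /* Vulkan is explicit from the first call: the application
         * declares who it is and which API version it targets. */
        VkApplicationInfo app_info = {
            .sType              = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .pApplicationName   = "hello-vulkan",
            .applicationVersion = VK_MAKE_VERSION(1, 0, 0),
            .pEngineName        = "none",
            .engineVersion      = VK_MAKE_VERSION(1, 0, 0),
            .apiVersion         = VK_API_VERSION_1_0,
        };

        VkInstanceCreateInfo create_info = {
            .sType            = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app_info,
        };

        VkInstance instance;
        if (vkCreateInstance(&create_info, NULL, &instance) != VK_SUCCESS) {
            fprintf(stderr, "failed to create a Vulkan instance\n");
            return 1;
        }

        printf("Vulkan instance created\n");
        vkDestroyInstance(instance, NULL);
        return 0;
    }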

So, time to add these factors up into an interesting idea. Nintendo's not interested in including a disc drive in the device the patents were filed for. Networked digital distribution is a very real possibility, and retail releases can be met with the same type of memory-card distribution as the existing mobile (X)DS platforms. Android offers big.LITTLE processing as well as Vulkan graphics.

What if the memory card that consumers bought in a store could work on both their home console… and a mobile device?

What if the application or game that consumers bought in Nintendo’s Online store could work on both their home console… and a mobile device?

What if the application or game could be paused, and shunted in real time, from a mobile device to a console, or vice versa?

What if the application or game had assets for both high-resolution / high-polygon rendering and low-resolution / low-polygon displays?

What I envision is a console-and-tablet combination much like the WiiU, only the tablet is just a tad smarter than the WiiU's remote-display device. I envision NX owners taking the NX tablet with them, playing their games in lowered-asset modes. When in range of the NX home base, though, I envision the game cranking up the details and leveraging the greater processing capabilities. Such a concept could also work alongside an updated (X)DS device: like the NX tablet, gamers could simply link to the NX home base to play their games in high-asset mode, and still get the full dual-screen functionality on the go. The demand for a mobile device with a proper gamepad and buttons is huge, something the 3DS keeps proving software release after software release.

The software to support this kind of transparent device gaming is now largely in place. The improvements big.LITTLE has brought to Linux kernel processor scheduling dramatically simplify shifting the execution of an application from low-powered processors to higher-powered processors and back. The benefits Vulkan brings not only open up the home console for awe-inspiring visuals; they also promise a playable experience on lower-class hardware devices. The only real difficulty would be maintaining a pause state while shifting devices: a not entirely insurmountable obstacle. If this is indeed what Nintendo is attempting to achieve, then they could very well revolutionize mobile and console gaming… all over again.
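
To be clear about how speculative this is: nothing below reflects anything Nintendo has announced. But as a purely hypothetical sketch of the "pause and shunt" idea in C, the handoff reduces to freezing the session, serializing a snapshot of its state, and restoring that snapshot on the other device. Every field and the file-based transport here are invented for illustration; a real implementation would capture far more state and move it over the network.

    #include <stdint.h>
    #include <stdio.h>

    /* A hypothetical snapshot of a paused game session. A real
     * handoff would cover far more state; the principle is the
     * same: freeze, serialize, transfer, restore. */
    typedef struct {
        uint32_t level;
        float    player_x, player_y;
        uint32_t elapsed_ms;
        uint8_t  asset_tier;  /* 0 = low-asset mobile, 1 = high-asset console */
    } PauseState;

    static int save_state(const PauseState *s, const char *path)
    {
        FILE *f = fopen(path, "wb");
        if (!f) return -1;
        size_t n = fwrite(s, sizeof(*s), 1, f);
        fclose(f);
        return n == 1 ? 0 : -1;
    }

    static int load_state(PauseState *s, const char *path)
    {
        FILE *f = fopen(path, "rb");
        if (!f) return -1;
        size_t n = fread(s, sizeof(*s), 1, f);
        fclose(f);
        return n == 1 ? 0 : -1;
    }

    int main(void)
    {
        /* "Tablet" side: pause mid-game and hand off. */
        PauseState out = { .level = 3, .player_x = 12.5f, .player_y = 7.0f,
                           .elapsed_ms = 90000, .asset_tier = 0 };
        if (save_state(&out, "handoff.bin") != 0) return 1;

        /* "Console" side: restore the session in high-asset mode. */
        PauseState in;
        if (load_state(&in, "handoff.bin") != 0) return 1;
        in.asset_tier = 1;
        printf("resumed level %u at (%.1f, %.1f), asset tier %d\n",
               in.level, in.player_x, in.player_y, in.asset_tier);
        return 0;
    }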

The only reason I’m not sure I buy my own theory? This kind of revolution would require Nintendo to actually use up-to-date ARM hardware… and well… Nintendo isn’t exactly known for choosing the bleeding edge.
