The Nintendo Wii U has taken a lot of heat over its computing power. Its CPU, in particular, is less powerful than those in the PlayStation 4 and Xbox One, but the system as a whole uses a pretty unique hardware setup, and the little console might be much smarter than many initially judged.
This fresh insight comes courtesy of the Renesas semiconductor factory that Sony recently bought out. As it turns out, that same factory made the Xbox 360’s semiconductors when that console first came out several years ago. Computing technology has improved significantly since the Xbox 360, as technology is wont to do, and it is safe to assume the parts Renesas is pumping out have improved as well.
In short, the Wii U was found to have as much eDRAM as the Xbox One has eSRAM, Nintendo’s console has a rather unique GPU structure, and it is speculated that the Wii U could have more than three times the Xbox One’s internal bandwidth. Surprising to some, no doubt, but less so considering that the Wii U has so far had little trouble running full 1080p graphics at a steady 60 fps.
As a brief introduction to some of the techno mumbo-jumbo in this piece, eDRAM and eSRAM stand for “embedded dynamic random access memory” and “embedded static random access memory.” DRAM and SRAM are functionally the same, but DRAM needs fewer electronic parts to store each bit of data (one transistor and one capacitor, versus six transistors for SRAM), which usually makes it cheaper and more space-efficient. SRAM has its charms too, chiefly speed. More likely than not, the RAM in your computer is non-embedded DRAM.
Consoles use eDRAM to increase internal bandwidth, and by extension the speed of the whole system. According to some literature out of Silicon Valley, the Xbox 360 made good use of this technology back in its day:
“Using external memory, the GPU would be limited to a 32- or 64-bit interface. The NEC Electronics eDRAM expands the on-chip memory interface past 1000 bits in width to support the GPU’s 256-Gbytes/second bandwidth between the graphics pipelines and memory.”
— Bob Peterson
This basically means that eDRAM allowed the 360’s GPU to blow past the 32- or 64-bit interface of conventional external memory, widening the path to more than 1000 bits and enabling a throughput (how much data is pushed through per second) of 256GB/second.
The Wii U has as much eDRAM as the Xbox One has eSRAM. This means Nintendo and Microsoft’s current consoles each have 32MB of embedded memory, allowing 4MB per macro. (A “macro” is a single block of embedded memory with its own path for data to flow through. The GameCube had 32 macros with a “width” of 16 bits each, meaning it could move 512 bits per cycle.)
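The macro arithmetic above is easy to sanity-check. A quick sketch in Python, using only the figures the article itself gives (the GameCube macro count and width, and the 32MB-at-4MB-per-macro claim for the newer consoles):

```python
# GameCube: 32 eDRAM macros, each with a 16-bit interface.
gamecube_bits_per_cycle = 32 * 16
assert gamecube_bits_per_cycle == 512  # matches the article's 512 bits/cycle

# Wii U / Xbox One: 32 MB of embedded memory at 4 MB per macro
# implies 8 macros (the per-macro figure is the article's claim,
# not a confirmed spec).
wiiu_macros = 32 // 4
print(gamecube_bits_per_cycle, wiiu_macros)  # 512 8
```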
We do not know the exact specs of the Wii U, as Nintendo keeps a lot of what makes it tick close to the chest, but knowing what we do about the Xbox One and the GameCube, it is theorized that the Wii U clocks in at 563.2GB/second, more than three times the Xbox One’s 170GB/second. That is a pretty big jump. It won’t mean much to many readers, but here’s a diagram of the Wii U showing off its eDRAM:

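For the curious, here is one way the 563.2GB/second figure could arise. If the Wii U’s eDRAM exposed an 8192-bit (1024-byte) interface clocked at 550MHz, the commonly reported Wii U GPU clock, the numbers line up exactly. Both the bus width and the clock are assumptions here, not confirmed specs:

```python
# Hypothetical Wii U eDRAM interface: 8192 bits wide at 550 MHz.
bus_bits = 8192
clock_hz = 550e6

# Bandwidth in GB/s = (bits / 8 bits-per-byte) * clock / 1e9
bandwidth_gb_s = (bus_bits / 8) * clock_hz / 1e9
print(bandwidth_gb_s)  # 563.2

# Versus the article's 170 GB/s figure for the Xbox One eSRAM:
print(bandwidth_gb_s / 170)  # ≈ 3.3, i.e. "more than three times"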
Regarding the Wii U’s processing capabilities, Nintendo’s console does have a relatively weak CPU, but it makes up for that with a hefty GPGPU, short for “general-purpose graphics processing unit.” What this means is that the console can offload much of its non-graphics work onto the GPU alongside the graphical tasks, since GPUs are well suited to the sorts of calculations needed in graphics and general physics processing. While the Wii U does not have full DirectX 11 capabilities, it is strong enough to tap into equivalent features, which Teku Studios has acknowledged working with via the Unity engine.
The Wii U’s unique processing system makes it especially suited to multithreading, which is a fancy way of saying it can do a lot of calculations at once, and it is even capable of tessellation, which is a pretty big deal for graphics hardware. Even its power supply, which looks relatively light at a glance, earns it some points. The Wii U has a Class A supply, meaning low consumption and high efficiency. Eurogamer offered a breakdown:
“…if the [Wii U] has a power supply with efficiency of more or equal to 85% then we are talking about 62 watts or more for the [Wii U] out of those 73 watts labeled at [the] power supply and the rest is dissipated as heat.”

“…since [the Wii U] can load as much as 62 watts or more…accounting [for] the [minimum], meaning 85%, but the standard says it can go beyond that[,] that means there are still about 18.33 watts or more power left for games.”
— Eurogamer
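The quoted figures reduce to simple arithmetic. The 73-watt rating and the 85% minimum efficiency come straight from the quote; the implied baseline draw at the end is my own back-calculation, not something Eurogamer states directly:

```python
# 73 W label on the supply, 85% minimum efficiency (quoted figures).
rated_watts = 73
efficiency = 0.85
deliverable = rated_watts * efficiency
print(round(deliverable, 2))  # 62.05, i.e. "62 watts or more"

# The quoted "about 18.33 watts left for games" implies a baseline
# system draw of roughly 62.05 - 18.33 W (a derived figure, not one
# the quote gives directly).
headroom = 18.33
implied_baseline = deliverable - headroom
print(round(implied_baseline, 2))  # 43.72
```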
So what does this mean, exactly? Well, Eurogamer has the answer to that, too:
“I am sure you have heard the rumors about the e6760 being the base of the [Wii U]. Just to tell you, the e6760 has a performance of 16.5 gigaflops per watt, so if the [Wii U’s GPGPU] has a similar efficiency, then if at least we consider 10 watts out of those 18.33 watts (or maybe more) we would be speaking about 165 gigaflops that many game[s] waste due to being quick ports.”
— Eurogamer
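That 165-gigaflops figure is just the quoted efficiency multiplied by Eurogamer’s conservative 10-watt slice of the estimated headroom:

```python
# e6760 efficiency and the 10 W slice are both from the quote; the
# result only holds if the Wii U's GPGPU really matches the e6760.
gflops_per_watt = 16.5
spare_watts = 10
print(gflops_per_watt * spare_watts)  # 165.0 gigaflops
```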
It seems there have been some miscalculations about the strength of Nintendo’s little console that could. It certainly explains why all of Nintendo’s first-party titles play so smoothly at max resolution, and above all it raises the question of what could be in store should more developers take advantage of all this potential.
Source: Gaming Blend