GameCube Architecture

A practical analysis by Rodrigo Copetti





Supporting imagery

Model

Image
The one and only GameCube.
Released on 14/09/2001 in Japan, 18/11/2001 in America and 03/05/2002 in Europe.

Motherboard

Image
Motherboard
Taken from my 'DOL-CPU-10' model; later revisions removed Serial Port 2 and the Digital A/V output. The encoder chip, expansion, controller and PSU slots are found on the other side.
Image
Motherboard with important parts labelled

Diagram

Image
Main architecture diagram
Each data bus is labelled with its width.

A quick introduction

Gone are the days of ‘3D attempts’: Nintendo’s new offering is a clean and powerful break from its predecessor, one that will open the door to new, original and previously unseen content.

It’s worth pointing out that this architecture resulted in one of the most compact console designs of its generation, something emphasised by the fact that no ‘slim’ or ‘lite’ revision was ever needed.


CPU

After SGI’s dominance of the graphics market faded, Nintendo needed new players to partner with.

Image
Construction of Gekko.

A promising candidate was IBM: apart from its famous work on mainframes, it had recently allied with Motorola and Apple to create a CPU powerful enough to compete with Intel’s rule over the PC market. The resulting product was a series of processors carrying the PowerPC name, which went on to power Apple’s Macintosh line along with a number of embedded systems.

Fast forward to the GameCube: Nintendo required something powerful but cheap, so to meet those red lines, IBM took one of its existing designs, the PowerPC 750CXe (found in the late iMac G3 revisions, known as ‘Early 2001’ and ‘Summer 2001’), and beefed it up with capabilities that would please game developers. The result was the PowerPC Gekko, running at 486 MHz.

Features

Let’s find out what makes Gekko so special, and to do that we need to first look at the offerings of the 750CXe [1]:

Additionally, this CPU also includes dedicated units to speed up specific computations [2]:

And, of course, some cache is also included to speed up memory bandwidth:

IBM’s enhancements

While the previous lists of features are very welcome (compared to previous generations), this CPU still falls behind others in gaming performance (let’s not forget that this is still a general-purpose CPU, good at spreadsheets but average at physics). To compensate, IBM added the following tweaks, which together constitute Gekko [3]:

Apart from handling the game logic (physics, collisions, etc.), these enhancements allow the CPU to implement parts of the graphics pipeline (geometry transformations, lighting, etc.) with acceptable performance. This is very important, since the GPU can only accelerate a limited set of operations, so the end result is not constrained by the GPU’s limitations.
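To put that workload into perspective, here’s a tiny C sketch of a vertex-transform loop, the kind of routine Gekko’s floating-point enhancements are meant to accelerate. Everything here (types and function alike) is made up for illustration; it is not SDK code:

```c
#include <stddef.h>

/* Hypothetical types for illustration only (not SDK definitions). */
typedef struct { float x, y, z; } Vec3;
typedef float Mtx34[3][4]; /* 3x4 row-major model-view matrix */

/* Transform 'count' vertices by the matrix 'm'. On Gekko, pairs of these
   float multiplies and adds can be packed into single instructions, which
   is why loops of this shape run noticeably faster than on a stock 750CXe. */
static void transform_vertices(const Mtx34 m, const Vec3 *in, Vec3 *out, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        const Vec3 v = in[i];
        out[i].x = m[0][0] * v.x + m[0][1] * v.y + m[0][2] * v.z + m[0][3];
        out[i].y = m[1][0] * v.x + m[1][1] * v.y + m[1][2] * v.z + m[1][3];
        out[i].z = m[2][0] * v.x + m[2][1] * v.y + m[2][2] * v.z + m[2][3];
    }
}
```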

A step forward or a step backwards?

In your Nintendo 64 article, you explained that the system has a 64-bit CPU, but the GameCube’s is 32-bit. Did Nintendo downgrade their console?

Indeed Gekko implements a 32-bit PowerPC specification, while the MIPS R4300i can switch between 32-bit and 64-bit modes. To answer whether this is an improvement or not, you have to ask yourself: Why would you need ‘64-bitness’?

As you can see, the GameCube already enjoys the advantages of 64-bit systems without being called a ‘64-bit console’. This is why you and I can’t summarise two complex machines by their ‘number of bits’.

Clever memory system

During the design of the next-gen architecture, Nintendo’s architects performed a post-mortem analysis of their previous design and discovered that using a Unified Memory architecture together with some high-latency components (RDRAM) resulted in one of the biggest causes of bottlenecks (almost 50% of CPU cycles were wasted while idling) [4]. Moreover, the inclusion of multiple independent units contributed to a concerning competition for the memory bus.

For that reason, the GameCube’s architects came up with a new memory system built strictly around dedicated memory spaces and low-latency chips. With the new design, the GPU and CPU no longer compete for the same RAM (a cause of fill-rate issues), since the GPU now has its own internal and amazingly fast memory. Furthermore, the GPU remains in charge of arbitrating access to I/O as well.

Image
Memory layout of this system.

The result was a system organised with two main buses:

Additionally, this design contains an additional (yet unusual) bus where more memory can be found:

Overall, this means that while ARAM provides a considerable amount of RAM, it will be limited to less critical tasks, like acting as an audio buffer or being used by certain accessories (explained in the I/O section).

Organising memory and sorting out ARAM

So far, we’ve seen that, on paper, the memory capabilities are undoubtedly superior to its predecessor, but there’s still room for improvement. For instance, Nintendo could’ve fitted more hardware to incorporate ARAM into the CPU’s memory map.

On a related note, let’s revisit the MMU used in Gekko. The CPU, with its 32-bit address bus, can access up to 4 GB of memory, but the system houses nowhere near that quantity. So, to prevent exposing unpopulated (and unpredictable) memory addresses, ‘virtual memory’ addressing is activated by default to mask physical addresses behind a safer, easily-cacheable and contiguous ‘virtual’ address map [6].

To make this work, Gekko (like other PowerPC implementations) translates virtual addresses to physical ones with the following process:

  1. Perform Block Address Translation (BAT): There are eight pairs of programmable registers (four for data and four for instructions), where each pair maps a range of virtual addresses to a contiguous range of physical addresses. The MMU first attempts to resolve the physical address using those ranges.
  2. If BAT didn’t work, read the Page Table: The MMU also keeps a table that catalogues the physical location of pages (blocks of virtual addresses).
    • The MMU can take time to read a page table, so a Translation look-aside buffer (TLB) is included to cache recent reads.
    • Other architectures such as x86 or MIPS provide paging as well, though not all of them will offer a TLB.
  3. Finally, if the requested virtual address still can’t be translated, the MMU triggers a ‘page fault’ exception in the CPU and lets the operating system decide what to do next.
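If it helps, the order of lookups can be pictured with this simplified C model. The helper functions are hypothetical stand-ins for what the silicon does internally:

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical helpers standing in for the hardware's internal logic. */
bool bat_lookup(uint32_t vaddr, uint32_t *paddr);        /* 1. BAT registers       */
bool page_table_lookup(uint32_t vaddr, uint32_t *paddr); /* 2. page table (+ TLB)  */
void raise_page_fault(uint32_t vaddr);                   /* 3. exception to the OS */

uint32_t translate(uint32_t vaddr)
{
    uint32_t paddr;
    if (bat_lookup(vaddr, &paddr))        /* block translation first        */
        return paddr;
    if (page_table_lookup(vaddr, &paddr)) /* then paging, cached by the TLB */
        return paddr;
    raise_page_fault(vaddr);              /* otherwise, the OS takes over   */
    return 0;                             /* not reached if the fault jumps */
}
```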

So, what use does this have for developers? Well, it turns out Nintendo released libraries that extend main RAM using ARAM with the help of paging. To recap, ARAM is not directly addressable, but the CPU may use DMA to fetch and store data there. Thus, the CPU can move pages out of main RAM to make room for other resources and temporarily store them in ARAM. Afterwards, whenever a page fault occurs, the OS contains routines that look for the missing pages in ARAM and restore them to their original location in main RAM.
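Condensed into code, the trick could be pictured as the following hypothetical page-fault handler. Every name is invented (Nintendo’s library isn’t public), so treat this strictly as a sketch of the concept:

```c
#include <stdint.h>

#define PAGE_SIZE 4096u

/* Hypothetical helpers: ARAM is only reachable through DMA transfers. */
void    *evict_page_from_main_ram(uint32_t *aram_slot_out); /* stash a victim page in ARAM  */
uint32_t aram_slot_of(uint32_t vaddr);                      /* where the missing page lives */
void     aram_dma_read(void *dst, uint32_t aram_addr, uint32_t len);
void     map_page(uint32_t vaddr, void *physical_frame);

void on_page_fault(uint32_t faulting_vaddr)
{
    uint32_t victim_slot;
    void *frame = evict_page_from_main_ram(&victim_slot);  /* make room in main RAM */

    /* Pull the missing page back from ARAM into the freed frame. */
    aram_dma_read(frame, aram_slot_of(faulting_vaddr), PAGE_SIZE);

    /* Update the page table so the faulting access now succeeds. */
    map_page(faulting_vaddr, frame);
}
```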

In conclusion, with some clever tricks, these general-purpose capabilities enabled GameCube games to enjoy more memory than technically allowed, thereby reaching higher levels of quality. However, it’s important to bear in mind that such tricks may come with some performance penalties (especially if they’re taken for granted).


Graphics

This is one of the most critical sections of this console: it’s basically what makes the GameCube, well, a GameCube.

The history of this console’s GPU has some interesting connections: Wei Yen, the director of N64’s SoC (the RCP), later founded ArtX and landed a contract with Nintendo to develop their next-gen chip: Flipper.

Image
Super Mario Sunshine (2002).

There were lots of advancements over the previous iteration; for instance, the graphics subsystem was greatly simplified down to a single (but powerful) core.

During the development process, ArtX was acquired by ATI, which in turn was sold to AMD six years later. This is why you see an ATI sticker stamped on the front of the case.

Architecture and design

Flipper is a complex chip that provides multiple services [7], so for now let’s focus on its graphics component (the one responsible for bringing our geometry to life). We’ll call this area the GPU or Graphics Engine and, if you’ve been reading the N64 article, just letting you know that this core is now functional out of the box, so programmers no longer need to inject code to make it work. Nevertheless, some interesting parts remain customisable.

Image
Pipeline design of Flipper’s GPU.

As always, in order to draw a frame on the screen, our data will be pumped through the GPU’s pipeline. Data goes through lots of different components which we can group into four stages:

Database

Image
Database stage diagrams.

The CPU and GPU communicate with each other through a fixed-length FIFO buffer in main RAM: a reserved portion of memory where the CPU writes drawing commands that the GPU reads (and eventually displays). This functionality is natively supported by both the CPU and the GPU.

Furthermore, the CPU and GPU don’t have to be pointing at the same FIFO at the same time, so the CPU can fill a separate one while the GPU is reading the first one [8]. This prevents idling.

Issuing individual commands to construct our geometry can get very tedious with complex scenes, so the official libraries included tools that generate the required Display Lists (pre-compiled sets of FIFO commands) from the game’s assets; this chunk only needs to be copied to RAM for the GPU to effectively display it.

The GPU contains a command processor which is in charge of fetching commands from the FIFO.
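As a mental model of the double-FIFO arrangement, here’s a hypothetical sketch in C. The names are invented; the real libraries manage these descriptors for the programmer:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical FIFO descriptor, for illustration only. */
typedef struct {
    uint8_t *base;
    size_t   size;
    size_t   write_cursor;
} CommandFifo;

void gpu_attach_read_fifo(CommandFifo *fifo);  /* hypothetical: GPU starts consuming it */
void cpu_attach_write_fifo(CommandFifo *fifo); /* hypothetical: CPU starts filling it   */

/* Classic double-buffering: while the GPU drains one FIFO,
   the CPU fills the other, so neither side sits idle. */
void swap_fifos(CommandFifo *a, CommandFifo *b, int frame)
{
    CommandFifo *cpu_side = (frame & 1) ? a : b;
    CommandFifo *gpu_side = (frame & 1) ? b : a;

    cpu_attach_write_fifo(cpu_side);
    gpu_attach_read_fifo(gpu_side);
}
```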

Geometry

Image
Vertex stage diagram using indirect mode.

Here, primitives are transformed (shaped according to the desired scenery) and prepared for rasterising. The engine uses a dedicated Vertex unit or ‘VU’ to accomplish this.

There are two vertex modes available to handle primitives issued through FIFO:

  • Direct mode: The CPU issues each FIFO entry with all required attributes (position, normal, colour, texture coordinate, or matrix index). Useful when the data is already cached.
  • Indirect mode: The FIFO entry contains an index value that specifies where the attribute information is located in RAM, so the vertex unit needs to look it up by itself. This data is structured as an array, so for the VU to traverse it, each vertex entry must specify where the array starts (base pointer), how long each entry is (stride) and at which position the vertex sits (index); see the sketch after this list.
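Here’s what that indirect lookup boils down to, as a small C illustration with invented structures:

```c
#include <stdint.h>

/* Hypothetical vertex attribute array descriptor. */
typedef struct {
    const uint8_t *base;   /* where the array starts in RAM */
    uint32_t       stride; /* size of each entry, in bytes  */
} AttributeArray;

/* Indirect mode: the FIFO only carries 'index'; the VU resolves the
   actual attribute data by combining base pointer, stride and index. */
static const void *fetch_attribute(const AttributeArray *arr, uint16_t index)
{
    return arr->base + (uint32_t)index * arr->stride;
}
```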

Once loaded, the primitives can be transformed, clipped, lit (each vertex is given an RGB value that can also be interpolated for Gouraud shading purposes) and, finally, projected.

Texture

Image
Texture stage diagram using a default setup.

Now it’s time to apply textures and effects to our models, and for that the GPU includes multiple units which will process our pixels. This is a very sophisticated (and quite complex) procedure, so if you find it difficult to follow, just think of it as a big assembly line that processes pixels. Having said that, there are three groups of units available:

  • Four parallel Pixel units (also called ‘pixel pipelines’): Rasterise our primitives (convert them to pixels). Having four units available allows the GPU to deliver a 2x2 block of pixels on each cycle.
  • One Texture mapping unit at the end of each Pixel unit (giving four in total): Together they process up to eight textures for our primitives (now mere pixels) at each cycle.
    • It can also loop itself to merge multiple texture layers over the same primitive; this feature is called Multi-Texturing and can be used to achieve detailed textures, environment mapping (reflections) and bump mapping [9], for instance.
    • Finally, the unit also provides an early z-buffer, mipmapping (processing a downsized texture instead, based on the level of detail required) and anisotropic filtering (a welcome improvement over the previous filters that provides greater detail with sloped textures).
  • Texture Environment unit or ‘TEV’: A very powerful and programmable 16-stage colour blender. It basically combines multiple texels (lighting, textures and constants) to achieve an immense amount of texture effects that will be applied over our polygons.
    • The unit works by receiving four texels which are then processed based on the operation requested. Afterwards, it can feed itself the resulting texels as new input, so at the next stage/cycle, the unit can perform a different type of operation over the previous result. This ‘loop’ can last up to 15 iterations.
    • Each stage has 2^4 operations to choose from [10] and, considering the result can be re-processed at the next stage, there are ~5.64 × 10^511 possible permutations!
    • Programmers set up the TEV at runtime (meaning it can change any time) and this is crucial since it opens the door to lots of original materials and effects.

All of this is assisted by 1 MB of Texture memory (of the 1T-SRAM type), which can be split into cache and Scratchpad memory (fast RAM). Real-time hardware decompression for S3TC (S3 Texture Compression) textures is also available to fit more textures into that single megabyte of memory.
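To give a taste of how the TEV is driven in practice, here is a two-stage setup written with the naming used by libogc (the homebrew library that mirrors the official GX API). Take the exact calls as an approximation rather than a verbatim copy of Nintendo’s SDK:

```c
#include <gccore.h>  /* libogc (homebrew); the official SDK uses similar but distinct names */

/* Two-stage TEV setup: stage 0 modulates the base texture with the
   vertex (lighting) colour, and stage 1 blends a second texture layer
   over stage 0's output, a simple multi-texturing arrangement. */
void setup_tev_two_layers(void)
{
    GX_SetNumTevStages(2);

    /* Stage 0: texture 0 multiplied by the rasterised (lighting) colour. */
    GX_SetTevOrder(GX_TEVSTAGE0, GX_TEXCOORD0, GX_TEXMAP0, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE0, GX_MODULATE);

    /* Stage 1: blend texture 1 over the result of the previous stage. */
    GX_SetTevOrder(GX_TEVSTAGE1, GX_TEXCOORD1, GX_TEXMAP1, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE1, GX_BLEND);
}
```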

Render

Image
Render stage diagram.

The final stage of the rendering process includes applying some optional but useful touches to our scene:

  • Fog: Blends the last colour from the TEV with a fog constant colour to effectively simulate a foggy environment.
  • Z-compare: A late-stage Z-buffer. The engine will use 2 MB of embedded 1T-SRAM for Z-buffering calculations.
  • Blending: Combines the colours of the current frame with the previous frame buffer.
  • Dithering: As the name indicates, applies dithering over our frame.

The resulting frame is finally written to the frame buffer in the embedded 1T-SRAM, but this is still locked inside Flipper (the area is called the ‘Embedded Frame Buffer’ or ‘EFB’, though it also includes the z-buffer). So, to display it on our TV, it has to be copied to the External Frame Buffer or ‘XFB’ [11], which can then be picked up by the Video Interface or ‘VI’. Moreover, the copy process can apply effects like anti-aliasing (reducing blocky edges), deflickering (smoothing sudden changes in brightness), RGB-to-YUV conversion (a similar format that occupies less space in memory) and Y-scaling (vertically scaling the frame).

It’s worth mentioning that the XFB area can also be manipulated by the CPU. This enables it to combine previously-rendered bitmaps with our recently-rendered frame; or, when certain games need to render very colour-rich frames which can’t fit in the EFB, they are rendered in parts and merged by the CPU afterwards (always keeping in sync with the VI).
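For reference, this is roughly what the end-of-frame EFB-to-XFB copy looks like in homebrew code (libogc naming again, so consider the details an approximation):

```c
#include <gccore.h>

/* Copy the finished frame out of Flipper's embedded buffer (EFB) into the
   external framebuffer (XFB) in main RAM, then hand it to the Video Interface. */
void finish_frame(void *xfb, GXRModeObj *rmode)
{
    GX_SetDispCopySrc(0, 0, rmode->fbWidth, rmode->efbHeight); /* region of the EFB */
    GX_SetDispCopyDst(rmode->fbWidth, rmode->xfbHeight);       /* size of the XFB   */
    GX_CopyDisp(xfb, GX_TRUE);     /* copy (and clear the EFB for the next frame)    */

    VIDEO_SetNextFramebuffer(xfb); /* tell the VI where to scan out from             */
    VIDEO_Flush();
    VIDEO_WaitVSync();
}
```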

Interactive comparison

Time to put all of this into perspective, check out how programmers evolved the designs of their previous games to take advantage of the new graphics capabilities of this console. Don’t forget the examples are interactive!

The upgrade

The famous Mario model, which had to be stripped down due to the polygon budget of the previous generation, was completely redesigned for this one. Take a closer look at how the model evolved from plain faces to wrinkled sleeves.

3D model
Super Smash Bros (1999) for the N64.
320 triangles.
3D model
Super Smash Bros. Melee (2001) for the GC.
4,718 triangles.

It’s really impressive how much detail has been gained in just two years, eh?

The update

In this case, Sonic Team had already designed a Sonic model for the Dreamcast, but after porting their game to the GameCube they found themselves able to add more polygons to the model, achieving greater detail.

3D model
Sonic Adventure (1999) for the DC.
1,001 triangles.
3D model
Sonic Adventure DX (2003) for the GC.
1,993 triangles.

Creativity

As you can see from the inner workings of this pipeline, graphics technology has evolved to the point that programmers can now take control of certain functions of the rendering process.

Image
The Legend of Zelda: The Wind Waker (2003).

Around the same time, PC graphics cards were starting to discard fixed-function pipelines in favour of shader cores (units that run small programs which define how pixels are operated on). Flipper still contains a fixed-function GPU; however, by including components such as the TEV unit, one could argue that Nintendo provided their own shader-like solution.

I guess one of the best examples of a game that exploited this new capability is The Legend of Zelda: The Wind Waker, which implements a unique colour/lighting technique known as cel shading to make its visuals look cartoonish.

Video output system

The video signal outputs a resolution of up to 640x480 pixels (or 768x576 pixels in PAL) with up to 16.7 million colours (24-bit depth). Additionally, the system can broadcast its signal in progressive mode (which produces a clearer image, although not every TV supported it at the time).

The XFB can have multiple dimensions, so for compatibility reasons, the Video interface will try its best to display the frame by re-sampling the XFB to fit the TV screen based on the region.

Connections

The console included not one, but two video output connectors:

Image
A/V Connections on the back.
  • One named Analog A/V which is actually the good-old Multi Out. This is the most popular one.
    • The PAL version of this console doesn’t carry S-Video and the NTSC one doesn’t provide RGB (bummer!).
  • Another one called Digital A/V which sends audio and video in digital form (similarly to HDMI nowadays but using a completely different protocol!).
    • Nintendo released a component cable set that connected to this socket. The same plug incorporated a video DAC and encoder to convert the digital signal into YPbPr (optimal quality).
    • The cable was sold as an extra accessory and now is considered some sort of a relic!

Audio

Nintendo finally delivered some dedicated audio circuitry to offload this huge task from the CPU and GPU and provide richer sound. Their new solution is an independent Digital Signal Processor or ‘DSP’, manufactured by Macronix and running inside Flipper.

The DSP’s job consists of performing different operations on our raw audio data (e.g. volume changes, sample rate conversion, 3D sound effects, filtering, echo, reverb, etc.) and then outputting a two-channel PCM signal. It doesn’t work alone, however: the DSP delivers audio with the help of other components.

Its first companion is the Audio Interface (AI), a 16-bit stereo digital-to-analogue converter responsible for sending the final samples through the audio signal that ends up at the TV. The AI can only process 32 bytes of audio data every 0.25 ms, so if we take into account that each sound sample weighs 2 bytes and we need two to create stereo sound, the AI will be able to transfer up to eight stereo samples every 0.25 ms, giving a sampling rate of up to 32 kHz. Sound!
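To spell out the arithmetic: 32 bytes every 0.25 ms amounts to 128,000 bytes per second, and one stereo sample pair occupies 4 bytes (2 bytes × 2 channels), so 128,000 ÷ 4 = 32,000 sample pairs per second, hence the 32 kHz ceiling.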

Finally, we have the Audio RAM (ARAM) block, which is a large (16 MB) but very slow spare memory that can be used to store raw sound data. There’s quite a lot of space, so the GPU can also use it to store additional material (like textures). The CPU doesn’t have direct access to this memory so it will resort to DMA to move content around.

For better or worse, the DSP is programmable through microcode (yikes), but fear not: the official SDK already bundles a general-purpose microcode that almost every game used. The exceptions are the console’s boot sequence and some Nintendo games (how convenient, since Nintendo didn’t document the DSP, so only they know how to program it).

That being said, the process of generating sound works as follows [12]:

  1. CPU commands DMA to move raw samples to ARAM.
  2. CPU sends a list of commands that instruct how the DSP should operate these samples. In other words, it uploads the microcode program (only one is officially available for developers).
  3. DSP fetches samples from ARAM, applies the required operations and mixes them into two channels. Finally, it stores the resulting data on RAM.
  4. AI fetches processed samples from RAM and outputs them through the audio signal.
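Condensed into (entirely hypothetical) code, one iteration of that loop could look like this. Every function name below is made up, as Nintendo’s audio libraries aren’t public:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helpers mirroring the four steps above (not real SDK calls). */
void     aram_dma_store(uint32_t aram_addr, const void *src, size_t len); /* step 1 */
void     dsp_send_command_list(const uint32_t *cmds, size_t count);       /* step 2 */
int16_t *dsp_wait_for_mixed_buffer(void);                                 /* step 3 */
void     ai_submit(const int16_t *stereo_pcm, size_t sample_pairs);       /* step 4 */

#define ARAM_SAMPLE_BASE 0x00000000u  /* placeholder ARAM address */

void play_one_audio_frame(const void *samples, size_t len,
                          const uint32_t *cmds, size_t cmd_count)
{
    aram_dma_store(ARAM_SAMPLE_BASE, samples, len); /* park raw samples in ARAM          */
    dsp_send_command_list(cmds, cmd_count);         /* tell the DSP how to mix them      */
    int16_t *mixed = dsp_wait_for_mixed_buffer();   /* DSP writes the result to RAM      */
    ai_submit(mixed, 32000 / 60);                   /* one video frame's worth at 32 kHz */
}
```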

Compression and freedom

We already reached the sampling age a while back, so we are no longer locked to specific waveforms; even so, the new sound system is still a huge improvement. For starters, the era of forced music sequencing is gone for good. The system can now stream pre-produced music to the audio endpoint without problems, much like what the Saturn and PS1 accomplished years before.

Let me show you an example using two games, one released for the Nintendo 64 and its sequel released for the GameCube. Both have different music scores but the context (enemy battle) is the same. Take a look at how both tracks differ in sound quality, taking into account the design of each system (shared vs dedicated).

Paper Mario (2000) for the N64.
Sequenced on the fly by the RSP.
Paper Mario: The Thousand-Year Door (2004) for the GC.
Streamed to the DSP.

As you can hear, the DSP finally gave music composers the flexibility and richness they always asked for.

Additional material

For a more direct side-by-side comparison, I’ve prepared this interactive widget that shows how composers ended up adapting their arrangements for the GameCube and its predecessor. Here, the same upbeat score is utilised for a Nintendo 64 title and a GameCube one, and the resulting comparison allows me to demonstrate (once again) the technical advantages of the GameCube’s DSP.

Nintendo 64GameCube
Nintendo 64: Kirby 64: The Crystal Shards (2000).
GameCube: Kirby Air Ride (2003).

Now, to visualise what’s happening behind each track, here are the two respective spectrograms. Before I start, if you are not familiar with these kinds of charts, I recommend reading my previous NES article, in particular the audio section (where I introduced them).

Image
Spectrogram of the PCM channel in Kirby 64: The Crystal Shards (2000).
Image
Spectrogram of the PCM channel in Kirby Air Ride (2003).

To be fair, mixed tracks are difficult to decompose in a spectrogram, but I believe I can attempt to deduce some patterns from it.

To begin with, almost all the frequency spectrum is being evenly utilised in the GameCube track, which may be attributed to the additional instruments used for accompaniment (which add harmonics and therefore fill more areas on the chart).

Finally, the amplitudes on the GameCube spectrogram look more uniformly distributed. In other words, the volume of each instrument is balanced differently and includes effects like reverb. I’m guessing the latter is what the composer originally intended while producing this score, and these kinds of controls are made possible by the fact that the GameCube supports audio streaming. Thus, composers can use any tool of their choice to sequence and mix their music, as opposed to strictly depending on the console (and its limitations) to sequence and mix at runtime.

I wouldn’t say that the Nintendo 64 is completely incapable of producing the same result. However, one thing for sure is that, in the world of the Nintendo 64, every single audio function costs extra cycles and/or memory, and this can have an impact on other areas of the game. Hence the need to ration resources. On the other hand, with the GameCube’s support for large samples, one can just stream the full produced score altogether.


I/O

It seems that this generation put a lot of work into expandability and accessories. The GameCube included a couple of new and interesting ports, although some of them remained unused.

Internal I/O

Image
Main diagram of the GameCube’s architecture. In there, we find the ‘Northbridge’ which controls most of the I/O.

Flipper is in charge of interfacing the CPU with the rest of the components so, apart from including sound and graphics circuitry, it also provides a collection of hardware named Northbridge that is composed of [13]:

Each interface exposes a set of registers to alter some of its behaviour.
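In practice, ‘exposing a set of registers’ means these interfaces are memory-mapped: the CPU reads and writes specific physical addresses to change their behaviour. A generic illustration follows; the address and bit below are placeholders, not documented registers:

```c
#include <stdint.h>

/* Hypothetical memory-mapped register, for illustration only. */
#define SOME_INTERFACE_CONTROL ((volatile uint16_t *)0xCC000000)

static void enable_some_feature(void)
{
    /* 'volatile' forces a real bus access instead of a cached/optimised one. */
    *SOME_INTERFACE_CONTROL |= 0x0001; /* set a hypothetical 'enable' bit */
}
```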

Optional I/O

On the bottom of the GameCube’s case, you’ll find two external sockets to connect some widgets.

Image
Covered accessory slots on the bottom of the case.
Image
Uncovered accessory slots on the bottom of the case.

Both are technically identical (serial bus running at 32 MHz), yet they are presented with a distinct external shape to accommodate different accessories:

  • Serial Port 1: Nintendo shipped Modem and Broadband adapters for this slot.
  • Serial Port 2: No public accessory shipped using this port, but you may find some third-party accessories that provided debugging tools.

These ports are operated through the EXI bus.

Unusual I/O

You’ll notice I still haven’t mentioned another available socket found next to the serial ports: the Parallel Port. This port happens to be much faster (8-bit at 80 MHz vs 1-bit at 32 MHz), which may be the reason Nintendo called it the Hi-Speed Port. But the most unusual part is that this port is not interfaced through EXI, but through ARAM!

The only official accessory known to date is the famous Game Boy Player, which plugged in as an extra floor under the GameCube and contained the necessary hardware to natively play Game Boy and Game Boy Advance games. The Player works by doing all the heavy lifting itself and then sending the results (frames and audio data) to ARAM, which the GameCube forwards to the respective components for display and sound.

Consistent design

I find it worth pointing out that no matter how many accessories you connect, the console will always keep its cubic shape (or at least attempt to).


Operating System

Upon turning on the console, the CPU starts loading an operating system called Dolphin OS from the BIOS/IPL chip. This is a very minimal OS that takes care of initialising the hardware and provides some convenient system calls and global variables for games to use. Games developed with the official SDK implicitly execute these calls during low-level operations.

Image
The official logo, shown after the boot animation finishes.

Splash and shell

After finishing the boot process, the OS will load a small program unofficially called Main Menu.

Image
Main Menu with multiple settings available.
Image
Clock settings.
Image
Sound settings.

This program is responsible for displaying the famous splash animation (the wee cube drawing a GameCube logo) and loading the game if one is inserted. In the absence of a valid game, it instead provides a simple cube-shaped menu with various options to choose from:


Games

Nintendo provided developers with lots of tools to assist (and encourage) the development of games for their console [14]:

Specialised hardware

Apart from the software, the company supplied different hardware kits (which range in price) before and after the console was publicly released.

Probably the most popular one worth mentioning is the Dolphin Development Hardware or ‘DDH’, which consisted of a PC-like tower containing some of the GameCube’s I/O and lots of dev-assisting hardware [15]. It was mainly used as a debugging station while the game was being developed on a Windows PC.

Medium

Games are loaded from a proprietary disc called the miniDVD; it’s almost half the size of a traditional DVD and can hold up to 1.4 GB of data.

As an interesting fact, the disc reader operates at Constant Angular Velocity or ‘CAV’, meaning that data is read at a faster rate if it’s found in the outer area of the disc (3.125 MB/s) and slower if it’s found in the inner area (2 MB/s). This differs from Constant Linear Velocity or ‘CLV’ systems (used by traditional CD/DVD readers), where the transfer rate stays constant across the disc.

Game saves are stored on a proprietary external accessory called the Memory Card, and the console provides two slots for them.

Unusual controllers

Nintendo shipped an accessory known as the Game Boy Advance Link Cable, which connected a Game Boy Advance to the GC controller port, so games could upload small programs to the GBA and treat it as a special controller. This interesting feature enabled unique interactions and content in some games.

Online Platform

Well, unlike the competition, not only did Nintendo require users to buy extra accessories to access online content, but they also didn’t deploy any internet service that publishers could rely on [16], making developers solely responsible for providing the necessary internet infrastructure.

As a result, while online gaming was a possible feature, it wasn’t widely adopted and only a tiny number of games made use of it.


Anti-Piracy & Homebrew

Nintendo has been in this game for quite some time, so it’s no news that they included security mechanisms to prevent running unauthorised software or games from a different region. Furthermore, due to the new range of I/O that the GameCube provides, the attack surface got significantly larger.

Security mechanisms

We can organise them into these areas:

DVD controller

Even though this is the first Nintendo console to use a disc medium, playing pirated copies of games wasn’t going to be easy. The miniDVD is protected with proprietary barcodes placed on the inner side of the disc, in addition to having its data encrypted. The validation and decryption process works seamlessly: the miniDVD controller takes care of it, while the system is limited to requesting data.

The hardware comprising the DVD reader can be imagined as a fortress wall which is only crossed using a series of commands: the miniDVD controller features a proprietary CPU that decides whether the inserted disc is genuine, and if it isn’t, no command issued by the main CPU will convince it to read the disc otherwise.

Defeat: As with any other cat-and-mouse game, it was just a matter of time before third-party companies successfully reverse-engineered the controller to build mod-chips that could trick the reader. But bear in mind that no mod-chip will make this console magically fit a conventional CD/DVD without physically altering the case!

IPL and EXI

Another possible path of exploitation consists of using the external I/O to load homebrew programs. However, without cracking the DVD reader first, the only other option available is to take control of the first program the GameCube loads, and that is… the IPL.

That means that by reverse-engineering the BIOS and replacing the chip with a modified one, one would be able to run, let’s say, a file reader, and from there execute programs received through the accessory ports (assuming the additional hardware is plugged in).

Be that as it may, the contents of the IPL chip are scrambled with an XOR cipher [17], making them ‘impossible’ to reverse engineer.

Defeat: Hackers eventually discovered that the hardware handling the decryption of the IPL contained a bug that enabled them to capture the cipher used [18]. With this, another ROM could be constructed and encrypted with the same cipher, so the GameCube boots it as if it were the original!

As if that wasn’t enough, hackers also found new methods to trick the miniDVD reader into loading conventional DVDs.

Honourable Mention

Before those two mechanisms were discovered, there was actually a much simpler way of loading arbitrary code without any modification whatsoever. This method consisted of hijacking the online protocol.

Some games, like Phantasy Star Online, implemented their own online functionality, which could be updated by downloading a new executable (a DOL file) from the company’s servers, and this protocol didn’t implement any security. So, as you can see, this was a man-in-the-middle attack waiting to happen…

Long story short, by spoofing a fake server, the GameCube would just download (and execute) whatever DOL file you provide. That means hackers only needed the original game and the Broadband Adapter. This technique is known as PSOload [19].


That’s all folks

Image
My old GameCube recently rescued from the attic. I only needed the controller for the Wii (back then it was cheaper to buy the whole second-hand lot!).

Well, this is it, the 10th article!

I really tried to set a rough limit on the length of this article, but you have to understand: technology has become so complex that if I accidentally skip anything important, the whole topic becomes impossible to follow.

Anyway, I’d like to thank the #dolphin-dev IRC community for helping me understand the complicated pipeline of Flipper’s GPU. These folks have been developing the GameCube emulator for quite a few years now, and it’s really impressive how much they have had to put up with.

And finally, please consider contributing if you found this an interesting read. I strive to make each article as complete as I can, and in the process I tend to forget how much time it’s costing me; I find it a good investment nonetheless.

Until next time!
Rodrigo.


Contributing

This article is part of the Architecture of Consoles series. If you found it interesting then please consider donating. Your contribution will be used to fund the purchase of tools and resources that will help me to improve the quality of existing articles and upcoming ones.

Donate with PayPal
Become a Patreon

You can also buy the eBook edition in English. I treat profits as donations.

Image

A list of desirable tools and the latest acquisitions for this article is tracked here:

### Interesting hardware to get (ordered by priority)

- Any IPL mod (?)

### Acquired tools used

- (A while ago) GameCube with controller (£20) and some games (if I remember correctly, no more than £20...)

Alternatively, you can help out by suggesting changes and/or adding translations.


Copyright and permissions

This work is licensed under a Creative Commons Attribution 4.0 International License. You may use it for your work at no cost, even for commercial purposes. But you have to respect the license and reference the article properly. Please take a look at the following guidelines and permissions:

Article information and referencing

For any referencing style, you can use the following information:

For instance, to use with BibTeX:

@misc{copetti-gamecube,
    url = {https://www.copetti.org/writings/consoles/gamecube/},
    title = {GameCube Architecture - A Practical Analysis},
    author = {Rodrigo Copetti},
    year = {2019}
}

or an IEEE-style citation:

[1]R. Copetti, "GameCube Architecture - A Practical Analysis", Copetti.org, 2019. [Online]. Available: https://www.copetti.org/writings/consoles/gamecube/. [Accessed: day- month- year].
Special use in multimedia (Youtube, Twitch, etc)

I only ask that you at least state the author’s name, the title of the article and the URL of the article, using any style of choice.

You don’t have to include all the information in the same place if it’s not feasible. For instance, if you use the article’s imagery in a Youtube video, you may state either the author’s name or URL of the article at the bottom of the image, and then include the complete reference in the video description. In other words, for any resource used from this website, let your viewers know where it originates from.

This is a very nice example because the channel shows this website directly and their viewers know where to find it. In fact, I was so impressed with their content and commentary that I gave them an interview 🙂.

Appreciated additions

If this article has significantly contributed to your work, I would appreciate it if you could dedicate an acknowledgement section, just like I do with the people and communities that helped me.

This is of course optional and beyond the requirements of the CC license, but I think it’s a nice detail that makes us, the random authors on the net, feel part of something bigger.

Third-party publishing

If you are interested in publishing this article on a third-party website, please get in touch.

If you have translated an article and wish to publish it on a third-party website, I tend to be open about it, but please contact me first.


Sources / Keep Reading

Anti-Piracy

Audio

CPU

Games

Graphics

Photography


Changelog

It’s always nice to keep a record of changes. For a complete report, you can check the commit log. Alternatively, here’s a simplified list:

### 2023-02-11

- Added additional material to the audio section

### 2022-12-04

- Corrected ambiguity between Flipper (the SoC) and its internal GPU. See https://github.com/flipacholas/Architecture-of-consoles/issues/150 and https://github.com/flipacholas/Architecture-of-consoles/issues/151 (thanks @phire, @Pokechu22, @Masamune3210 and @aboood40091)

### 2021-06-20

- Added 32 vs 64 bits paragraph
- Expanded audio section

### 2021-05-15

- Added MMU+ARAM section
- Added memory layout diagram
- Improved 'Sources' structure

### 2020-02-26

- Added mention on anisotropic filtering

### 2020-01-09

- Improved the write gather pipe explanation

### 2019-11-22

- Corrections and more info added, thanks @phire from #dolphin-dev and /r/gamecube !

### 2019-11-20

- Corrections here and there

### 2019-11-19

- Public release!

Rodrigo Copetti

I hope you have enjoyed this article! If you want to know more about the author tap here and if you would like to support him tap here instead
