Talk:Hollywood (graphics chip)
Known Specifications
The new edits were made by Acrevolution. We need exact sources for this information. Can anyone post these sources? ~ Eevee04
The details of what exactly the Hollywood is haven't been made public yet. There are conflicting rumors stating that it's just an R300 like the Flipper (which is what was previously mentioned in the article), and rumors stating that it is based on a more current ATI laptop graphics chip, either a customized R530 or an early R630-based custom chip. About the only evidence we have at all for the identity of the chip is the ID numbers on it, and 632 does appear on the chip, which would seem to be some evidence for the last hypothesis. 64.6.0.7 20:21, 6 December 2006 (UTC)
- I think we've come to the conclusion that no one quite knows what the specs are, so, I'm removing the information until it's properly sourced. The Captain Returns 05:08, 7 December 2006 (UTC)
- I've been trying to find information as well, both about this and the central processing unit "Broadway". I'm thinking that the chips must be modified versions of GameCube chips, as the console is backwards compatible but does not seem to be so much more powerful than the GameCube as to be able to emulate it in software. Thoughts? If anyone digs up any technical documentation, I'd love to take a look at it. 65.96.163.132 00:56, 11 December 2006 (UTC)
- The processor does seem to be a PPC750CL-derived chip. It seems like it's simply a faster Gekko processor, although it may contain a few extra extensions, which would allow for complete backwards compatibility in that regard. The video chip, however, can be very different and still allow for backwards compatibility as long as it supports OpenGL 1.4, which is pretty much a given.
- I would agree that the Broadway is derived from the same PPC750 design that was used for the Gekko; it's clear that the Wii doesn't have a separate set of hardware for GameCube games, and anything else would entail emulation software, like the Wii uses for the Virtual Console games. Once adjusted for the difference in fabrication process, the chip would have perhaps no more than a 75% increase in transistor count over the Gekko, which is small for a change between console generations. Evidence does suggest that it's just an evolution, akin to how a Core 2 Duo is fully backwards-compatible with a Pentium 4.
- However, the GPU is a different story entirely. I wish I could get my hands on a thrown-away Hollywood chip (namely the Vegas die) so I could tear it open and put it under a microscope; then I'd be able to figure out what it is. As it stands, all that I can say is that it very clearly is *not* the same as a Flipper; once you account for the shrinkage due to changes in fabrication process, it would appear that the Vegas die would have 2.5-3x the number of transistors of the Flipper. Given its shape, size, and fabrication process, it does seem similar to an RV530 (not 630) though... Nottheking (talk) 00:02, 3 January 2008 (UTC)
- False. If it was in fact similar to the RV530, the hardware would require some sort of emulation to play GameCube games. 70.137.151.202 (talk) 00:57, 25 April 2008 (UTC)
- Well, I merely specified that it was similar; the die is made by the same company, on the same fabrication process, and has pretty close to an identical surface area. I meant that the similarities implied would be chiefly limited to the transistor count (157 million for the RV530), and perhaps shed a clue on what sort of hardware was included. I did not mean to suggest that it has an actual RV530 in there... that GPU was designed to operate with better cooling than the Wii has, and at any rate, was designed to control 4 chips of VRAM, while the Wii only has one.
- The obvious thing here is that it'd be vastly different from the Flipper in the GameCube, which had only 51 million transistors. Given that most of the other hardware that could be included (like the DSP, encryption, etc.) is on the Vegas "daughter die," and just the memory interface, I/O, and embedded cache are on the main "Napa" die, that would imply that the extra transistor count goes largely toward increased graphics processing hardware... with the bulk of a GPU's space normally taken by a combination of the memory interface, I/O, texturing units, arithmetic units (shaders), raster units, some hardwired features (like AF and tessellation, for example), and a small portion for control of the chip. In other words, this would imply that Napa has a substantial amount added onto it that Flipper didn't have. Nottheking (talk) 00:38, 8 September 2008 (UTC)
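As a rough sanity check on the figures quoted in this thread (157 million transistors for the RV530, 51 million for Flipper), the ratio works out close to the 2.5-3x ballpark mentioned above; a minimal sketch, using only the numbers cited in these comments:

```python
# Transistor counts as quoted in the discussion above (not independently
# verified here): 157 million for the RV530, 51 million for Flipper.
rv530_transistors = 157e6
flipper_transistors = 51e6

ratio = rv530_transistors / flipper_transistors
print(f"RV530 vs Flipper transistor ratio: {ratio:.1f}x")  # → 3.1x
```

A ratio of roughly 3.1x sits just above the 2.5-3x estimate given earlier for the die, which is consistent with "similar in budget" rather than "the same chip."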
- If you are going to mention the graphics API, it got a custom one named GX.--Henke37 (talk) 21:14, 22 October 2008 (UTC)
20% Faster? Or 20% more efficient?
I just removed an unsourced paragraph from the article claiming the chip was 20% faster than the GameCube's. I find that to be a dubious claim, since it has a clock speed at least 50% faster than the GameCube's, and it has faster access to more RAM as well, which would lead me to expect a larger boost in power. However, if the claim was a misinterpretation of a piece of information claiming 20% more efficiency, that is more plausible, as a similar reduction in power draw has been reported by IBM for the Broadway. Zeebo-010 22:38, 20 December 2006 (UTC)
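For what it's worth, the "at least 50% faster" clock claim lines up with the commonly reported (but never officially confirmed by Nintendo) clock figures of 243 MHz for Hollywood and 162 MHz for Flipper; a quick check, treating both figures as assumptions:

```python
# Commonly reported, unofficial core clocks -- treat these as assumptions,
# since Nintendo never published official specifications.
HOLLYWOOD_MHZ = 243  # widely reported Wii GPU clock
FLIPPER_MHZ = 162    # widely reported GameCube GPU clock

speedup = HOLLYWOOD_MHZ / FLIPPER_MHZ
print(f"Clock ratio: {speedup:.2f}x")  # → 1.50x, i.e. exactly 50% faster
```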
- If memory serves, there was a statement from someone at Nintendo or one of the technology partners claiming that one of the chips (it might've actually been Broadway) consumed 20% less power than its equivalent predecessor in the GameCube did, so I think that might've been the source of the paragraph. I'd re-add it to the article, but right now the source escapes me. Nottheking (talk) 00:41, 8 September 2008 (UTC)
No Such Thing as 1T-SRAM
1T-SRAM is a brand of 1T-DRAM. This is a blatant advertisement and should be removed.
- It would appear to be policy to use registered trademarks and names when referring to a specific company's product. Because the Wii uses a brand of memory chips made by MoSys that the company names "1T-SRAM," technically that would be the correct name for it, even if it's DRAM, not SRAM. Nottheking (talk) 00:04, 3 January 2008 (UTC)
Good source for the article
http://blog.newsweek.com/blogs/levelup/archive/2007/09/24/is-wii-really-gamecube-one-point-five-yes-says-beyond3d.aspx JACOPLANE • 2007-09-26 20:22
- Can't use that, since, frustratingly enough, it goes back to the same IGN article source as everything else. IGN is the entire crutch for their analysis, and that's been repeatedly noted as a highly dubious source, given that absolutely zero has been provided to back it up, such as any comment from any actual developer or someone at one of the hardware companies. Especially since it largely ignores the considerably higher transistor count in Napa compared to Flipper, which is nearly a comparable increase in complexity to the transition between the NV2A in the original Xbox and the R500 Xenos in the Xbox 360. In other words, going on clock rate alone is rather fallacious, and you'd recognize it quite well if someone made the claim, on core clock alone, that the Xbox 360 had the graphics power of 2.14 Xboxes taped together (since you'd just be comparing 500 MHz to 233 MHz).
- Likewise, they make a mistake when comparing chip sizes... Apparently, they judge that the sizes are too similar, believing that in switching from the 180nm node used for the GameCube's hardware (as well as that of other sixth-generation consoles) to the 90nm of the current seventh-generation consoles, you'd have transistors taking up half as much space. That's an error when you remember that chips are two-dimensional, not one-dimensional; the surface becomes twice as dense with EACH shift of a full node, and going from 180nm to 90nm is two node jumps, skipping the 130nm node. Hence, the density would be quadrupled, meaning that if Flipper had a die area of 110 mm², then compared to the 72 mm² area of Napa, that would imply a transistor count increase of 161.8%, which is rather significant. Nottheking (talk) 00:56, 8 September 2008 (UTC)
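The die-area argument above can be sketched numerically; the area figures (110 mm² for Flipper at 180nm, 72 mm² for Napa at 90nm) are the ones quoted in the comment, and the 4x density factor is the idealized two-node-shrink scaling it assumes:

```python
# Figures quoted in the comment above; the 4x factor is an idealized
# scaling assumption (density ~doubles per full node, 180 -> 130 -> 90 nm).
flipper_area_mm2 = 110.0  # Flipper die, 180 nm process
napa_area_mm2 = 72.0      # Napa die, 90 nm process
density_factor = 2 ** 2   # two full node shrinks -> ~4x transistor density

implied_ratio = (napa_area_mm2 * density_factor) / flipper_area_mm2
increase_pct = (implied_ratio - 1) * 100
print(f"Implied transistor-count increase: {increase_pct:.1f}%")  # → 161.8%
```

This reproduces the 161.8% figure in the comment, though as noted in the reply below, logic and eDRAM don't necessarily shrink at the same rate, so the real number could differ.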
I'd say they were going off trends with other recent chips, plus judgments of graphics fidelity and random developer comments. Cell at 45nm is half the area of Cell at 90nm, for example, while remaining the same chip. http://pc.watch.impress.co.jp/docs/2008/0929/kaigai_2l.gif It'd be hard to judge, and I wouldn't go as far as to attempt to derive a figure, considering that the area of transistors devoted to logic and eDRAM might scale differently at the same process size, and half the GameCube's GPU was eDRAM, as well as other random things to possibly consider. But there's honestly nothing to suggest a significant boost in anything outside of clock rate and RAM improvements; certainly not a 160% increase in processing logic. (Processing logic, because Wii games demonstrate the same limitations in AA, color formatting, dithering, etc. as the GameCube had, which would suggest little to no additional framebuffer eDRAM or its related transistors.) So most additional transistors would likely be processing of some sort, which I doubt there's much of an increase in, having played most of the better-looking games. (Note: I notice that this discussion is kinda dated.) Swapnil 404 (talk) 22:02, 26 July 2009 (UTC)
"AMD's ATi Technologies division"
Should that really say AMD? This chip was developed before AMD bought ATI, so AMD had absolutely nothing to do with the development of this chip. 66.183.175.7 (talk) 07:31, 31 August 2010 (UTC)
External links modified
Hello fellow Wikipedians,
I have just modified one external link on Hollywood (graphics chip). Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
- Added archive https://web.archive.org/web/20060615131048/http://www.nintendo.co.jp/n10/e3_2006/wii/index.html to https://www.nintendo.co.jp/n10/e3_2006/wii/index.html
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}}
(last update: 5 June 2024).
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—InternetArchiveBot (Report bug) 04:32, 4 April 2017 (UTC)
Split to “Hollywood (chip)” and “GX (graphics chip)”
While the Hollywood was marketed as a graphics chip, it contains other hardware such as OTP memory and the Starlet. Generally, Hollywood refers to the entire board responsible for both these security functions and graphics, while GX refers solely to the portion responsible for graphics. Hallowizer (talk) 01:06, 12 August 2021 (UTC)
- Just started a draft for GX, in the process I copied some page contents to Draft:GX (graphics chip). Hallowizer (talk) 03:18, 9 September 2021 (UTC)