Wikipedia:Reference desk/Archives/Computing/2017 July 6

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


July 6

RGB vs. 4:4:4 YUV in television

Basically, I'm seeking some debate and input on this issue, not only in relation to the article. --79.242.203.134 (talk) 02:26, 6 July 2017 (UTC)[reply]

SCART, which has been around forever in consumer electronics, has RGB inputs. They need not be wired up to anything internally, but in the last TV I had they were, and consequently it could process RGB. I also had a DVD player which could be made to output RGB (which is how I know the TV could process it). I think you're overstating the uncommonness of RGB in consumer devices. 78.50.124.48 (talk) 09:24, 6 July 2017 (UTC)[reply]
The SCART connector was designed by CENELEC and first appeared on TVs in 1977. Blooteuth (talk) 20:15, 7 July 2017 (UTC)[reply]
RGB and YUV are both mathematical Color models in which the shade and intensity of a Color can be represented by 3 numbers. The linked articles give equations for converting a given color from one model to the other.
RGB values correspond to the individual Primary color gun currents in a TV Cathode ray tube; RGB is the basis of the trichromatic theory of human color vision and is used in common image file formats such as .BMP. When 8 bits are allocated to each of R, G and B, the number of colors that can be represented is 2^(3×8) = 2^24 = 16 777 216 (including black), called the True color format. This is convenient for computer storage, processing and graphic media but requires an excessive bit rate for TV broadcasting.
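The bit-depth arithmetic above can be checked directly (a trivial sketch, not part of the original post):

```python
# 8 bits for each of the R, G and B channels gives 24 bits per pixel.
bits_per_channel = 8
channels = 3
total_bits = channels * bits_per_channel  # 24 bits: "True color"
num_colors = 2 ** total_bits              # every distinct (R, G, B) triple

print(total_bits, num_colors)  # 24 16777216
```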
YUV is an abstract model motivated by the need for compatible Color television broadcasting within the frequency spectrum previously occupied by a monochrome TV broadcast. Compatibility with monochrome receivers was achieved in the analog TV broadcast formats NTSC, PAL and SECAM, from which a standard monochrome receiver extracts only the Y (luma) component. U and V are color-difference signals that a color TV receiver can decode to recover RGB values for display. However, the broadcasting format saves bandwidth by subsampling U and V, i.e. these two chroma signals are sent with lower resolution than luma, taking advantage of the human visual system, which perceives detail more from brightness contrast than from color differences. It was this bandwidth economy that made compatible color analog TV possible. Various degrees of Chroma subsampling are used: YUV 4:2:0 is common in TV broadcasting and video encoding, while YUV 4:2:2 is used in high-end video formats and studio interfaces such as CCIR 601. YUV 4:4:4, where it is found, is the YUV format without chroma subsampling. ("4:4:4" may instead refer to RGB color space, which has no chroma subsampling.) Blooteuth (talk) 20:10, 7 July 2017 (UTC)[reply]
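The luma/chroma relationship and the 4:2:0 subsampling described above can be sketched in a few lines. This uses the classic analog-PAL YUV coefficients (Y = 0.299R + 0.587G + 0.114B, U = 0.492(B−Y), V = 0.877(R−Y)); the exact constants and scalings differ between standards, so treat this as an illustration rather than any particular broadcast spec:

```python
# RGB <-> YUV conversion with the classic BT.601/PAL coefficients.
# R, G, B are assumed normalised to the range 0.0-1.0.

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: what a monochrome set shows
    u = 0.492 * (b - y)                     # blue colour-difference signal
    v = 0.877 * (r - y)                     # red colour-difference signal
    return y, u, v

def yuv_to_rgb(y, u, v):
    # Algebraic inverse of rgb_to_yuv.
    r = y + v / 0.877
    b = y + u / 0.492
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

def subsample_420(chroma):
    """Keep one chroma value per 2x2 block (the average), as in YUV 4:2:0.
    `chroma` is a 2D list (rows x columns) with even dimensions."""
    h, w = len(chroma), len(chroma[0])
    return [[(chroma[i][j] + chroma[i][j + 1]
              + chroma[i + 1][j] + chroma[i + 1][j + 1]) / 4
             for j in range(0, w, 2)]
            for i in range(0, h, 2)]

# Greys carry no chroma: for pure white both colour-difference signals vanish,
# so a monochrome receiver loses nothing by ignoring U and V.
y, u, v = rgb_to_yuv(1.0, 1.0, 1.0)
```

Note that `subsample_420` quarters the number of chroma samples while leaving luma at full resolution, which is where the bandwidth saving comes from.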

Criticism of RERO

Hi,

Release early, release often says that "Advocates argue" that RERO allows many things, but are there critics of RERO, and if so, what do they say?

Thanks.

Apokrif (talk) 14:56, 6 July 2017 (UTC)[reply]

Have you tried to use many Linux distros? Redhat-based distros are not RERO. It is very rare that I've had to troubleshoot problems on the user end. Ubuntu is in the middle. Every release includes a laundry list of end-user fixes for problems caused by unexpected problems (well, they say they are unexpected, but they should be expecting them by now). Arch is very RERO and, if you don't know what you're doing, you will quickly hit a brick wall as one package conflicts with another, which conflicts with another, which conflicts with another... The alternative is to have alpha, which is known to not work, beta, which you hope works, and a polished release. Let the end-user decide if he or she wants to deal with problems. 209.149.113.5 (talk) 15:07, 6 July 2017 (UTC)[reply]
  • There are many criticisms of this. I have yet to read one that is competent and informed.
RERO is a core part of most Agile practices, and contemporary software development practices are an essential part of RERO. RERO can't be done (and product quality maintained) without the support of continuous integration and continuous testing, even continuous delivery. Of course it can be done without them (and in ESR's day it was), but only by a descent into "cowboy coding" and random quality, with a nightmare of recurrent bugs that wouldn't go away, even after being fixed once.
To do it today, i.e. rapid turn-around of product releases with growing features and stable, improving quality, requires an Agile approach and also something like (but not necessarily any single one) a Scrum methodology. One of Scrum's first principles (when taught right) is the "squirrelburger" concept: you can trade away many things, but you don't cut corners on product quality.
OTOH, if you're a multinational consultancy, wedded to pushing an obsolete approach like PRINCE or ITIL, then you're going to fight against this. You will produce verbose reports for your paper-centric customers on how only a careful and plodding Waterfall (et al.) approach can match customer needs, despite the lessons of history that such approaches achieve no such thing. So you slate Agile, you claim to "do Agile" ("Agile Waterfall" is claimed to be a real thing), and you slate approaches like RERO. As the audience you're pitching this to are middle-aged managers who have been disconnected from real development for a couple of decades, they're convinced by this. Andy Dingley (talk) 15:15, 6 July 2017 (UTC)[reply]
Note that RERO is valuable in two scenarios, and one much more than the other. It can be useful when developing a new product, and the development timescale is simply large. It's also useful, and much more useful, when the customer requirements are unstable or ill-defined at the outset.
If you're working on a project that is neither, then RERO may have less to offer. Which is still not a criticism of it, nor a reason to avoid it. Andy Dingley (talk) 15:17, 6 July 2017 (UTC)[reply]
The Cathedral and the Bazaar is outdated. It's still a good read, but it's not a guide for how to work today.
It pre-dates the dot-com boom. Software dev became a lot better after the dot-com boom, using the illumination of all those piles of burning money. After the chaos and the collapses came the unemployment and the quiet. In those quiet times, retrospective analysis of how horrible it had been (and improvements to cheap OSes and virtualisation, or at least machine imaging) meant that the practices of CI/CT developed enormously, particularly for vast amounts of fairly dumb (thus achievable) automated regression testing on large, cheap (thus available) server farms. Around 2005, software dev became respectable. ESR's reliance on human beta testers as a way of finding actual bugs went away. Since then, you just don't release code with the faults in it that ESR was working so hard to eliminate and keep eliminated. Andy Dingley (talk) 15:38, 6 July 2017 (UTC)[reply]