Core 2 Duo and Core 2 Extreme: What is Intel Thinking?

by Chris Seibold May 09, 2006

In the age of Google Maps and online driving directions it is easy to forget that less than one hundred years ago navigation was a tricky thing. The first trans-American highway wasn’t started until 1912 and the first Rand McNally Road Atlas didn’t appear until 1924. Before those relatively recent achievements, piloting your car (or horse) from one destination to another involved looking for physical landmarks and asking for a lot of directions.

Once you start asking for directions on an unmarked road, the distances involved become wildly personal. You might hear measurements like “a ways,” as in “go past the burnt-down barn for a good ways and turn left on the road where Farmer Ogden puts up his scarecrow every spring.” Another metric used in the pre-road-map US, popular in Missouri, was the “see.” If you were interested in going from, say, Jefferson City to Lohman and asked someone for the route, there was a good chance they’d stick you on a rutted lane and tell you to travel straight for about 40 “sees.” A “see” is, of course, how far you could see. Say you were at point A: you would look in the direction you were headed, and the farthest thing you could make out, an old oak tree perhaps, marked one “see.” Once you got to the oak you’d scan ahead, and the farthest thing you saw would mark the second “see.” Thus the process would continue.

There are obvious drawbacks to the “see” system of distance measurement. A “see” in the hilly terrain of East Tennessee might only be 40 yards, while in topographically challenged states like Nebraska a “see” would run from Omaha to Lincoln. However ineffectual the “see” is as a measurement tool, it did have the advantage of giving at least a rough idea of distance. Measuring computer performance with megahertz is a lot like measuring distances with “sees”: it isn’t very accurate, but at least you get an idea of something.

If the computer chip followed the roads-to-paper-maps-to-Google-directions model, consumers would know a little more about the chip powering their machines with every new generation. The trouble is that the industry is headed in the opposite direction: instead of clarifying, it is obfuscating. Put differently, megahertz (as flawed as the statistic may be) is looking like a pretty solid way of judging computer chips right now. It is as if people abandoned the “see” not in favor of maps but in favor of falling on the ground and emitting unintelligible guttural noises while still expecting the person asking directions to get the general idea.

The problem is Intel’s chip naming scheme and Apple’s apparent adherence to it. At the moment, the best laptop chip Intel offers is the Core Duo. On Monday, Intel announced the names of the next processor revision.

What are the clever names? Core 2 Duo and Core 2 Extreme. The chip names are less telling than the “see” mentioned earlier. Why is the Core Duo 2 any better than a Core Duo? Is it really a core^4? What about the Core Solo 2 (presumably one of these will roll out)? Is that chip one core or two? The Core 2 Extreme? Just how does that chip earn the extreme moniker: is it extremely low power? Extremely high performance? And, finally, how in the world can the Core 2 Extreme hope to outduel the Core Duo 2? Sure, Core 2 may be extreme, but it is obviously inferior to the Core Duo 2 because the name omits the “Duo”; a superior chip would obviously be the Core Duo 2 Extreme (with a Hyper-Threading Velocity Engine).

It wasn’t always this way. At one time Apple decided which chip revisions were important and which ones weren’t. The G series of chips, for example, wasn’t called the G series by the manufacturers; the chips had boring names like the 750GX. There were less than trivial differences between some models, but instead of Core Duo 2 Extreming users to death by telling folks they were buying a computer with the slightly enhanced 750GX instead of the earlier 750FX, Apple cleverly waited to spring a new round of chips on people until the difference was substantial. With the G4 it was the addition of the Velocity Engine. The G5, as we all remember but Steve might like us to forget, was heavily hyped because it was a 64-bit chip.

The naming convention generated a lot of excitement. Once Apple fully adopted the G family moniker, users could get excited about the next number up. If you had a G3 notebook you could hardly wait until Apple managed to shove a G4 chip into a fresh new lappy. If you used a G4 desktop machine, the prospect of the G5 was awe inspiring. It wasn’t so much that any G(n+1) would make an incredible amount of difference in your day-to-day life (rarely did that occur); it was more about the excitement of being on the cutting edge. The fact that Steve Jobs could talk about a Pringles chip and make it sound as though your current machine was archaic if it didn’t ship with at least a dual Pringle didn’t hurt.

The G naming convention might not have actually meant anything intrinsically, but it was a sign to consumers that their machines were seriously outdated. Sure the difference in performance between a 400 MHz G4 and a 1.62 GHz G4 was probably greater than the difference between the late G4 and early G5 chips, but let us not quibble here—the G5 was a full digit better than the G4 and that was enough to get folks salivating.

With the names Intel is throwing around, the days of breathlessly anticipating the next model may be gone. If Apple had stuck with IBM, the Mac sphere would be buzzing with rumors about the G6 or the Cell processor, and the day when Apple might bust through the 3 GHz barrier. Now we are stuck in a weird place, with no major increments spoonfed to us by the Gx-naming-convention. We suddenly find ourselves face to face with reality: steadily faster machines, but no quantum leaps. Sure, the current state is more reflective of computing reality, but it is also much, much more boring.


  • Be the name good or bad, after doing some testing it looks like it will be a ho-hum upgrade for those running single-threaded applications, but two thumbs up for any application that runs with multiple threads!

    -Lorin Thwaits

    Lorin Thwaits had this to say on May 09, 2006 Posts: 1
  • Some people believe that Intel dropped the famous “Pentium” name because of Apple. I don’t think it’s the only reason, but still “Pentium” sounds better than “Core” (at least for a Russian). But what makes it sound even worse is all those stupid “Extreme” and “Duo” words. It’s actually the same problem that “MacBook Pro” has. “MacBook” alone sounds quite nice. But that “Pro” makes “MacBook Pro” sound like “Timbuktu”. ;-]

    Anyway, when Apple releases the Mac Pro (?) it will probably have a “Dual Core 2 Duo” !!! ;-]

    Frosty Grin had this to say on May 09, 2006 Posts: 33
  • So instead of G4 to G5, we go from Core to Core 2 (Core n+1).  While the solos and duos and extremes add some confusion, it’s actually MORE telling about what’s inside IMO than the comparatively meaningless G moniker.

    Beeblebrox had this to say on May 09, 2006 Posts: 2220
  • Did you just spend the first half of the article talking about how meaningless the naming scheme was, and then the second half complaining that Apple wasn’t using its old meaningless numbering scheme and hence the excitement was gone?

    I’m a bit confused at that.

    Greg Alexander had this to say on May 09, 2006 Posts: 228
  • I know names can be confusing and I had to do some research too. Core 2 Duo stands for the second generation of dual core. The number before “Duo” will mean the generation of the chip itself. So why did Intel do this? They are trying to separate themselves from the marketing confusion coming from the AMD camp. A friend of mine from Sun told me that the 1st batch of AMD Athlon dual cores were 2x32 and AMD marketed that as AMD64 without a clear explanation of the number 64. People bought that thinking it was a new platform, and little did they know that it was just a marketing trick.
    My friend from Sun told me about AMD’s marketing bashing and the ugly dispute with Intel. I won’t go into details (because it’s not relevant to Apple). I was shocked at how shallow some companies can be, and said to him, “If AMD had picked on Apple the way they are doing it to Intel, Steve Jobs would liquidate AMD in 5 working days so he could play golf on Saturday and Sunday.”
    Anyway, the article also mentioned IBM. Do any of you know how ugly things got between IBM and Apple? IBM’s quality control in chip manufacturing is very poor. Bringing Sun back into the picture: 20% of the IBM chips supplied to Sun (in that period) were faulty and had to be sent back for replacement. Now it’s 10%. In the semiconductor industry those are really bad figures. IBM is having a hard time making the NY fab work properly.
    On top of that, IBM and Motorola were having a feud about patents and quantities of PowerPC production. In the middle was Apple, wondering when (?) the new chips would be released and whether they would get them on time (?). I felt very sorry for Steve Jobs because most likely he took Excedrin every day.
    From this sour experience that Steve Jobs had to go through, I wouldn’t be surprised if he demanded a legal written agreement/contract from Intel that supply and innovation will not be compromised for Apple.
    It looks like the author of this article is a PowerPC fan and is having a hard time relinquishing the good old days for the “future.” A way for Steve Jobs to erase the sour experience with IBM would be to release a new version of the Power Mac G5. If he decides to call it the G6, it’s fine with me. But I don’t think it would be wise given the current innovation at Apple. You know the Apple motto, “Think different.” As a 3D animator I do that every day. So how about we use a new name? How about another bold idea – HAL?

    domino360 had this to say on May 09, 2006 Posts: 2
  • Why is the Core Duo 2 any better than a Core Duo?

    As mentioned above in the comments it would be “Core 2 Solo” and “Core 2 Duo”, not “Core Solo 2” or “Core Duo 2”. The name can be found in the press release, if you can’t get that right, why even bother posting an article?

    Core 2 is the 2nd generation, G2 if you will, of the Intel Core architecture. It’s a 64-bit family of chips, with more cache, a faster front-side bus, etc. If you understand the Gx moniker formerly used by Apple, it should be crystal clear.

    With the advent of multicore processors, Intel will distinguish between single-core and dual-core with Solo/Duo. This is not rocket science, is it?

    The Core 2 Extreme is only intended for a niche market: hardcore gamers who can afford a 1,000 USD processor to go with their high-end chipset, graphics card and neon case mod. You can forget about it.

    Intel’s naming scheme can be confusing nonetheless. The notebook parts will use either the 5000 series or 7000 series; the desktop parts will use numbers in the 4000 and 6000 range. Power consumption will be designated by a letter prefix, from U for ultra low voltage to X for extreme.

    As an example: Core 2 Duo T5400 would be a dual-core part, consume between 15 and 24 watts and would be intended for notebooks. Core 2 Duo E6800: dual-core, 55 to 75 watts, desktop.
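    The decoding rule described above is mechanical enough to sketch in a few lines of Python. This is an editor’s toy illustration only: the series-to-segment and prefix-to-power tables merely restate the mappings in this comment (they are not an official Intel reference), and `decode` is a made-up helper name.

```python
# Toy decoder for the Core 2 model-number scheme described above.
# The tables only encode the mappings this comment gives; they are
# illustrative, not an official Intel reference.

SERIES_TO_SEGMENT = {
    "4": "desktop",
    "5": "notebook",
    "6": "desktop",
    "7": "notebook",
}

PREFIX_TO_POWER = {
    "U": "ultra low voltage",
    "T": "15-24 W",
    "E": "55-75 W",
    "X": "extreme",
}

def decode(model: str) -> str:
    """Split a model number like 'T5400' into a power class and segment."""
    prefix, series = model[0], model[1]
    segment = SERIES_TO_SEGMENT.get(series, "unknown segment")
    power = PREFIX_TO_POWER.get(prefix, "unknown power class")
    return f"{model}: {segment} part, {power}"

print(decode("T5400"))  # T5400: notebook part, 15-24 W
print(decode("E6800"))  # E6800: desktop part, 55-75 W
```

    Reading off the first letter and the first digit recovers both of the example classifications in the preceding paragraph.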

    Aha, peace of cake!

    MadMatters had this to say on May 10, 2006 Posts: 2
  • Intel phased out the “Pentium” moniker and its ilk supposedly to un-confuse the already befuddled consumer. In its place we have the Core Solo, Core Duo, and Extreme editions.

    The Xeon and Celeron brands will be kept for their marketplaces - server and entry-level, respectively. So the “Core” brands will focus on desktops and notebooks, exclusively.

    If you are Intel, with a dizzying array of product layers to cover with specialized versions of an architecture, you have to be a wizard just coming up with a name - let alone variants. The “Pentium” brand was A-OK as far as marketing goes. In fact some of you might admit it was genius naming the fifth generation in the x86 line a la Greek mythology. The brand name was so successful it was followed up by AMD with their Conanesque brand - Athlon.

    Clarity aside, the question here is: did Intel dump the Pentium brand (mucho confuso to many) to replace it with an ever-more-confusing Core brand?

    I really feel it wasn’t so. Every so often a company and its brands must be freshened to reflect the times. Remember Coke thought they were doing the right thing by diversifying their bread-and-butter Coca-Cola brand? We all know what happened.

    Now that a tectonic shift is happening in CPU architecture technology, with the ongoing chorus of companies promoting n-core designs vice the stratospheric GHz clock speeds of the past decade or more, it is apparent that a new naming - call it a convention, a theme, branding - is needed to convey such a massively different architecture. Where “Pentium” and “Athlon” conveyed super-GHz CPUs to all, “Core” is a much more demure delineation for a very capable architecture design. Duo, Solo, (Quatro?), (Hexo?) is just to identify the number of cores. Just as MadMatters said, “peace of cake!” - no need to correct that!

    So, yes. It can be simpler, but how? Go to Intel’s product line and its road map. Its road map alone looks like the U.S. interstate highway system. If you were me, you’d take Apple’s engineering prowess…These guys know what is good for you and me. Believe me, engineering is a delicate balance (not just compromises) of price, capability (power), supply, and design feasibility. You have to repeat that ten times or more to know you are using the right technology at the right time.

    I will buy what fits my budget and that’s all that matters. ‘Nuff said.

    Robomac had this to say on May 10, 2006 Posts: 846
  • ^ Uh…

    yeah, anyway… Nice point you’ve made, Chris. I never considered the fact that the switch to Intel could spell the end of ‘major’ improvements to the line. Are we just doomed to be introduced to future product iterations simply as “the new iMac,” “the updated MacBook,” “the slightly-changed-design’ed MacBook Pro”? Meh, I guess they’ve been managing okay doing that with iPods. It will be a difference, though.

    Luke Mildenhall-Ward had this to say on May 10, 2006 Posts: 299
  • Having scrubbed out several replies to that, I think the point you raise is so interesting I’m considering starting a blog just because of it.

    Benji had this to say on May 14, 2006 Posts: 927
  • Ben, are you bored?

    I have just the thing!:


    Luke Mildenhall-Ward had this to say on May 14, 2006 Posts: 299
  • Ben, are you bored?
    I’m an unemployed ex-student. You have no idea.

    Benji had this to say on May 16, 2006 Posts: 927
  • That’s a question best answered by in house Intel experts. Ask them during the Intel Live Chat event on the 26th of Feb.

    Ajay had this to say on Feb 18, 2010 Posts: 1