
  • Patflute - Friday, September 7, 2012 - link

    I don't get why the T version, which performs the weakest, is priced higher. You'd think having it clocked lower would decrease the price...
  • ananduser - Friday, September 7, 2012 - link

    The T consumes less; it's greener at 35W.
  • Ammaross - Friday, September 7, 2012 - link

    Anand did a bench comparing "T" models to their regular counterparts and found that even though they're running at a lower TDP, they take longer to, say, transcode (on CPU) a video, or zip a bunch of files, and thus end up expending MORE power over the long run, as both chips will go to the same idle C-states when work is done. The advantage to running a "T" CPU is the TDP your chassis/heatsink is rated for. Nothing else.
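    A back-of-the-envelope sketch of the race-to-idle arithmetic Ammaross describes, in Python. Every number below is an illustrative assumption, not one of Anand's measurements; the point is only that energy is power times time, so a slower chip at a lower draw can still burn more joules per job.

        # Compare two hypothetical chips over the same fixed window:
        # run the task, then idle in the same C-states until the window ends.
        IDLE_W = 5.0       # assumed shared idle power (watts)
        WINDOW_S = 600.0   # ten-minute comparison window (seconds)

        def window_energy(active_w, task_s):
            """Joules consumed running the task, then idling out the window."""
            return active_w * task_s + IDLE_W * (WINDOW_S - task_s)

        regular_j = window_energy(55.0, 250.0)  # assumed 65W-TDP part: 15500 J
        t_model_j = window_energy(42.0, 375.0)  # assumed 35W "T" part:  16875 J
        print(f"regular: {regular_j:.0f} J, T model: {t_model_j:.0f} J")

    With these assumed figures the T part draws 13W less while working but takes 50% longer, so it ends up roughly 9% worse on total energy for the same job.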
  • bill4 - Friday, September 7, 2012 - link

    " and thus end up expending MORE power over the long run, "

    Well, that type of nonsense is typical of "green" products in all walks of life. They're forced on us by laws passed by liberals, but more often than not they're actually destructive to the environment. They're for stupid people who vote Democrat (especially, let's say, the demographic of young females, who are particularly gullible and unintelligent) to "feel good", not any actual help to the environment.
  • andyo - Saturday, September 8, 2012 - link

    How did you end up writing all that?
  • pablo906 - Saturday, September 8, 2012 - link

    Anyone ever tell you you're an ignorant misogynist?
  • KorruptioN - Saturday, September 8, 2012 - link

    We're talking about CPU TDP yet you had to chime in with your hurr-durr ignorant gubmint tripe....
  • Azethoth - Sunday, September 9, 2012 - link

    It's not even true on the face of it. Data centers use low-TDP chips not because of young girls but because of sheer naked economic factors. Chilling chips costs money.

    Google and Apple use them because they have actual smart people in charge, not dumb hicks angry at the world frothing Cheetos-laden curses from their moms' basements.
  • GotThumbs - Monday, September 10, 2012 - link

    Unfortunately your use of a slur also displays your level of ignorance.

    Have you really put Bill4 in his place by responding in a like manner?

    The answer is NO. There will always be ignorant people on BOTH sides. It's important to remember that these individuals are NOT a reflection of all people.

    You may not agree with someone's opinion... but as an American, you should respect their right to have one.

    Best wishes,
  • madmilk - Saturday, September 8, 2012 - link

    The same is true for server CPUs, and it's why an E5-2690 costs more than the 2687W despite the latter being faster. In a server's case, you're limited by how much electricity is available for you to use at any instant in time. Break that, and the datacenter will get pissed at you.
  • MadMan007 - Friday, September 7, 2012 - link

    Validating for lower TDP costs money.
  • SlyNine - Friday, September 7, 2012 - link

    It all comes down to yields: you get fewer chips that can stay stable at a lower voltage.

    But I've never heard that the validating process was more expensive.
  • A5 - Friday, September 7, 2012 - link

    Every extra round of testing costs money. Also, the users who have to have it are willing to pay a premium for something that is validated for low-TDP applications (think OEM SFF systems).
  • Hector2 - Saturday, September 8, 2012 - link

    Validation is done for new designs and takes months. You're referring to "testing", which takes seconds per part in the factory, after wafer sort and after packaging. While testing costs money, that cost is spread over all CPUs tested. What causes the price to vary (as others point out) is the bin split yield: the % of parts that pass testing at the lower voltage while consuming less power. For example, if only 10% of the CPUs pass at the ultra-low voltage, you can expect to pay a price premium.
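    To put rough numbers on that bin-split point, here is a minimal Python sketch. The die cost and split percentages are invented for illustration, and real binning is more forgiving (dice that miss the low-voltage bin still sell as ordinary parts), but it shows why a rare bin commands a premium.

        def cost_per_binned_part(die_cost, pass_fraction):
            """Cost attributable to each die that passes the low-voltage screen."""
            return die_cost / pass_fraction

        DIE_COST = 40.0  # assumed cost per tested die (dollars)
        for split in (0.90, 0.50, 0.10):
            print(f"bin split {split:.0%}: ${cost_per_binned_part(DIE_COST, split):.2f}")
        # 90%: $44.44, 50%: $80.00, 10%: $400.00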
  • fic2 - Monday, September 10, 2012 - link

    I would think about getting a T version to put into my Mac Mini. I'll have to wait for benchmarks, but it seems like it would be a pretty good jump. AFAIK, the current Mac Mini can only handle a 35W max CPU.
  • Pirks - Friday, September 7, 2012 - link

    Who would buy a $100 two-core Intel POS when you can get a 3.4 GHz quad-core AMD Phenom II X4 for less? Only Intel cock suckers and brain-dead idiots will buy this Pentium shit.
  • fr500 - Friday, September 7, 2012 - link

    why are you allowed to post?
  • Pirks - Friday, September 7, 2012 - link

    your brain is too small to understand why
  • JarredWalton - Friday, September 7, 2012 - link

    Here's a G620 vs. A8-3850:
    http://www.anandtech.com/bench/Product/406?vs=399

    Here's the G620 vs. A6-3560:
    http://www.anandtech.com/bench/Product/406?vs=403

    Add roughly 20% more performance to the Intel to get the G2120 and you can see why it might be interesting for some users. Plus it would still use less power (probably even less than the G620), which means less heat and noise. Would I recommend the G2120? Nope, but the G530 is still an awesome deal.

    As for quad-core Phenom X4, sure it would be faster; it would also consume 65% to 130% more power than a G620 (depending on how you're using it), which means probably two to three times the power draw of the G2120.
    http://www.anandtech.com/bench/Product/406?vs=102

    If you're not running highly multi-threaded workloads (and many users aren't), these CPUs are quite good and you can find reasonably priced motherboards to go with them.
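    For what it's worth, Jarred's comparison collapses into a crude perf-per-watt table. This Python sketch just applies the rough multipliers from his comment to an assumed G620 baseline; the Phenom's threaded-performance figure is a placeholder, not a Bench number.

        chips = {
            "G620":         (1.00, 1.00),  # normalized baseline (perf, power)
            "G2120":        (1.20, 0.95),  # "roughly 20% more performance"; assumed slightly lower draw
            "Phenom II X4": (1.40, 2.00),  # assumed threaded win; midpoint of "65% to 130% more power"
        }
        for name, (perf, power) in chips.items():
            print(f"{name:13s} perf/watt = {perf / power:.2f}")
        # G620 1.00, G2120 1.26, Phenom II X4 0.70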
  • Pirks - Friday, September 7, 2012 - link

    Yeah, I meant it's faster because it has more cores and can do more things simultaneously. And if I need a dual-core CPU I can get some cheapo 3.2GHz Athlon II for $60. Wake me up when Intel lowers its Pentium price to $70 or so; then I'd consider it competition for AMD. For now all the people looking for the best bang for the buck stay with AMD. Intel is for the lame fanboys and cash-rich bench-boasting "enthusiasts" IMHO.
  • SlyNine - Friday, September 7, 2012 - link

    How about performance per watt? Why are you so utterly incapable of considering that?

    That was his argument; are you so f'ing dense that you can't see that? You just hear what you want to hear and see what you want to see and stick your fingers in your ears and yell la la la la la apple apple apple.
  • silverblue - Friday, September 7, 2012 - link

    Pirks is hardly an Apple fanboy, if that's what you were implying.
  • SlyNine - Friday, September 7, 2012 - link

    LOL, have you been to DailyTech??

    If he changed his tune, it's a VERY RECENT change.
  • silverblue - Monday, September 10, 2012 - link

    I thought that was Tony Swash?
  • Belard - Friday, September 7, 2012 - link

    Are you some sort of nutty troll?

    Intel does actually sell their CPUs at competitive, fair prices.

    The AMD FX series is blowing chunks because the price doesn't match the performance.

    The new lower-cost i3s are going to give the AMD A-series a run for their money... other than in graphics, of course.

    That G620 ($60 at Newegg) competes pretty well against the AMD quad ($100). AMD's main market nowadays is the sub-$150 segment.

    AMD is all over the place with its FM1, FM2 and AM3+ motherboards, and none of them support PCIe 3.0.
  • madmilk - Friday, September 7, 2012 - link

    The Celeron G530 is $45 on Amazon, and beats a $60 Athlon II X2 250 in most benchmarks and on power consumption.
  • JarredWalton - Friday, September 7, 2012 - link

    I linked it in the article, but right now you can get the G530 at MicroCenter for $30 + $8 shipping.
  • MonkeyPaw - Friday, September 7, 2012 - link

    Microcenter = awesome. I am fortunate enough to have one in my city, so I can do in-store pickup. They have successfully taken most of my business away from Newegg.
  • Taft12 - Sunday, September 9, 2012 - link

    Jarred posting a serious reply to Pirks. I think the apocalypse is upon us.
  • JKflipflop98 - Sunday, September 9, 2012 - link

    LOL bro, AMD sucks on every front. Anything AMD does, Intel does twice as fast.
  • Patflute - Friday, September 7, 2012 - link

    It performs better than the Phenom II X4 and consumes less power...

    Go away, AMD fanboy; just realize that AMD processors suck...

    The stock cooler is fine... I'm getting 35 degrees Celsius idle with an i5-3450... The i3 runs cooler, so...
  • Pirks - Friday, September 7, 2012 - link

    yeah sure, two cores perform better than four, LOL
    Intel fanboys are such a funny bunch
  • yuchai - Friday, September 7, 2012 - link

    Two cores do perform better than four if your workload doesn't use more than two cores at once, which is probably the majority of desktop workloads. IMO single-core performance is still king in a desktop environment. You do still need a dual core, but not really any more than that.
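    yuchai's point reduces to a one-line model: effective throughput is per-core speed times however many cores the workload can actually keep busy. A minimal Python sketch, with invented speed figures:

        def throughput(per_core_speed, cores, workload_threads):
            """Workload scales to at most workload_threads threads."""
            return per_core_speed * min(cores, workload_threads)

        for threads in (1, 2, 4):
            fast_dual = throughput(1.5, 2, threads)  # assumed higher IPC/clock per core
            slow_quad = throughput(1.0, 4, threads)
            print(f"{threads} threads: dual={fast_dual:.1f} quad={slow_quad:.1f}")
        # 1 thread: dual wins; 2 threads: dual wins; 4 threads: quad wins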
  • pablo906 - Saturday, September 8, 2012 - link

    Yeah, his AMD CPU goes to 11, and that's louder than 10 because, you know, it's 11...
  • SlyNine - Friday, September 7, 2012 - link

    Don't reply to trolls.
  • Hector2 - Saturday, September 8, 2012 - link

    Not all cores are created equal. I forget which review it was, but Anand has published reviews with benchmarks showing various Intel dual cores outperforming AMD quad cores. AMD just doesn't have the process technology that Intel does anymore, and they're struggling to catch up. Similarly, do you expect an ARM quad core to outperform either an AMD or Intel dual core? I hardly think so.
  • tammlam - Monday, September 10, 2012 - link

    Pirks...I've seen your posts on another website. I think it's Fudzilla. You're better off staying over there...this is not a forum for you.
  • silverblue - Friday, September 7, 2012 - link

    Of course IB is going to use less power: you get four times the number of transistors per mm². Performance varies with what you're trying to do; in some cases, the 965 BE's extra cores will completely obliterate the new Pentium... but who expected any different? In gaming, the G2120 will win far more than it loses, at least until games really become threaded.

    The comparison to the top Llano would be interesting, but for most uses, Llano's extra cores go to waste. Desktop Trinity is the thing we need to compare it to. Still, looking at the comparison between the G620 and the 965 BE, using approx. 2.5x the power whilst performing at 1.6x the speed isn't too shabby, given that it's an entire process behind (Llano is on a less mature 32nm process and as such the gap closes a bit). That's just the one usage case, though, and that's only against the G620.

    That i3-3225 is looking like it'll be in a fair few HTPCs.
  • SlyNine - Friday, September 7, 2012 - link

    Whether it's a node behind isn't the issue. Why should consumers worry or care about that?

    It's all about performance, power usage, or price, or any combination of the three that matters to you. Even if performance is your most important factor, the Intel can still win many tests, and not just games.

    I use AMDs for many of my builds, but it seems to be less and less often.
  • StevoLincolnite - Friday, September 7, 2012 - link

    I've had a Phenom II X4, an Athlon II X4 that unlocked into a Phenom II, a Phenom II X6 1090T, an AMD FX-8120, and now a Core i7-3930K.

    One thing all these processors have in common is that they can ALL be undervolted. AMD generally runs very high voltages, more than the chips need, so it doesn't have to be so aggressive with validation and hence saves money.

    The end result is that you can get some pretty significant energy savings by undervolting, to the point where it can close the TDP gap between Intel and AMD.

    When I had the Phenom II X6 1090T, for instance, the stock voltage was 1.35V; I dropped that down to 1.15V and kept the same clock speed, which made a massive difference (see the quick math below). With the FX you can get even more aggressive with the voltages.

    But in the end, if you compare Intel's and AMD's TDPs directly, then you are doing it wrong. The two companies' TDP values are not comparable because they measure them differently; a 100W-TDP Intel chip will not use the same amount of power as a 100W-TDP AMD chip.
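    The quick math behind that 1090T undervolt, assuming the standard first-order CMOS approximation that dynamic power scales with voltage squared times frequency (leakage is ignored, so treat this as a rough estimate):

        def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
            """Relative dynamic power after a voltage/frequency change."""
            return (v_new / v_old) ** 2 * (f_new / f_old)

        # 1.35V -> 1.15V at the same clock, per the comment above:
        print(f"{dynamic_power_ratio(1.15, 1.35):.0%} of stock dynamic power")  # ~73%

    So a roughly 15% voltage drop alone takes about a quarter off the chip's dynamic power, which is why undervolting headroom matters so much.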
  • pablo906 - Saturday, September 8, 2012 - link

    someone should take away your keyboard
  • Hector2 - Saturday, September 8, 2012 - link

    Usually, tech posters have some class --- something you definitely lack
  • Articuno - Friday, September 7, 2012 - link

    $30 for 0.1 GHz?
  • madmilk - Friday, September 7, 2012 - link

    The Xeon E5-2690 costs $300 more than the 2680 for an extra 0.1GHz. Processor pricing increases exponentially.
  • Ananke - Friday, September 7, 2012 - link

    I have to agree with the AMD comparison here. I have always had Intel CPUs and only one AMD Athlon, yet even I think the price is too high for these. I don't know what performance AMD's Trinity will bring, but considering the good integrated graphics, AMD's low and mid range is still the way better choice than Intel on price/performance.
  • Patflute - Friday, September 7, 2012 - link

    The desktop i3s are meant to accompany a dedicated GPU, and they perform much better than an A8 on the CPU side.
  • Taft12 - Sunday, September 9, 2012 - link

    You'll find many millions of i3 CPUs in Dell and HP desktops and almost none of them will be accompanied by a dedicated GPU. Why would they put a GPU on the die if you aren't supposed to use it???
  • randinspace - Saturday, September 8, 2012 - link

    Anybody heard whether the 3550P will have the same overclocking potential the model it's replacing had? The last thing I heard was that rumors were pointing to "no," but one can dream, right?
  • randinspace - Saturday, September 8, 2012 - link

    argh. 3550P=3350P.

    While I'm double posting anyway, for those that are curious the SB variants were the Core i5 2380P (same clocks as 3350P) and 2450P.
  • Taft12 - Sunday, September 9, 2012 - link

    Don't you need a K-series CPU to overclock at all? Boy do I miss the good old days!
  • extide - Monday, September 10, 2012 - link

    It won't be unlocked (it's not a K series, as other comments have mentioned), so you get at best the 'free' +4 multipliers that Intel allows. The benefits are TDP and price only.
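    As a sketch of what those +4 bins would be worth, assuming 100 MHz multiplier bins and the i5-3350P's published 3.1 GHz base / 3.3 GHz max turbo (whether a given board and BIOS actually honors the extra bins on a non-K part is not something this thread settles):

        BIN_MHZ = 100
        base_mult, max_turbo_mult = 31, 33   # i5-3350P: 3.1 GHz base, 3.3 GHz turbo
        best_case_mhz = (max_turbo_mult + 4) * BIN_MHZ
        print(f"best-case limited OC: {best_case_mhz / 1000:.1f} GHz")  # 3.7 GHz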
  • overseer - Saturday, September 8, 2012 - link

    I found reading the comments much more hilarious than the article. While it happens all the time on AT, this is the funniest occasion in months.
  • stuhad - Saturday, September 8, 2012 - link

    So is the NDA lifted? Is there going to be a review, or didn't Intel send out any review samples? Anand mentioned he was going to review some of the new lower-power T- and S-series Ivy Bridge CPUs; is that still happening?
  • stimudent - Saturday, September 8, 2012 - link

    I'm the only one I know who still has a desktop computer.
  • JarredWalton - Saturday, September 8, 2012 - link

    I have five! LOL. And about twice that many laptops probably.
  • HisDivineOrder - Saturday, September 8, 2012 - link

    AMD, wherefore art thou, AMD? Where is your low-priced, low-power competition? Hm? If AMD ever wonders why they're getting soundly thrashed all across the x86 marketplace, it's because they aren't showing up for most of the battles that actually matter.
  • Taft12 - Sunday, September 9, 2012 - link

    Pay closer attention. Trinity is already on the market in Acer and HP desktops and will be on Newegg any day now.
  • Bekenn - Saturday, September 8, 2012 - link

    I'm looking to put together an HTPC very soon, and was hoping the i5-3475S (with HD4000 graphics) would finally make an appearance. Any word on availability?

    I'm considering the 3225 as an alternative, but I'd be more comfortable with the extra CPU headroom that the 3475S should provide.
  • jwcalla - Saturday, September 8, 2012 - link

    Honestly you're probably better off getting a low-powered CPU and a $10 GeForce card. At this point NVIDIA is about the only one I trust for reliable video playback.
  • extide - Monday, September 10, 2012 - link

    I just picked up a 3225 for my HTPC. It will be an upgrade from a Core 2 Duo E6600 + HD 4870, and should drop my power consumption a decent amount. I also wanted the HD 4000, as I planned to use the IGP and wanted as much performance as possible from it.
  • VTArbyP - Saturday, September 8, 2012 - link

    I haven't seen, in the article or the comments, any mention of the new low-end Ivy Bridge processors supporting PCI-X 3.0 with the X77 chipset (and others?).
    I would think that they do, but I wouldn't put it past Intel to turn that off on the low-end procs. If they can run PCI-X 3.0, I expect that would enable one to go to 2- or 3-way SLI/Crossfire, because you only need/use PCI-X 3.0 x8 bandwidth per card. That would change the value of these babies relative to their Sandy Bridge predecessors.
  • C'DaleRider - Sunday, September 9, 2012 - link

    You're correct: PCIe 3.0 is not available on the i3 IVB CPUs. It takes going up to the i5 versions to get PCIe 3.0 support.
  • extide - Monday, September 10, 2012 - link

    Careful about your nomenclature, PCI-X is an entirely different thing than PCI-Express. You are talking about PCIe not PCI-X. And it's the Z77 chipset, not X77.
  • Balzy - Monday, September 10, 2012 - link

    Not gonna lie, I have a 3570K and MAN O MAN am I loving this processor. But honestly these CPUs are a total letdown; let me tell you why I believe this. Intel said that when you put a 22nm IVY BRIDGE CPU into a Z77 board, you could essentially unlock PCIe 3.0, amongst other things that I didn't really buy my board for. These CPUs aren't even part of the new revision, meaning they're still on 2.0. I've been waiting and waiting for these very processors so I can upgrade my wife's PC; I even bought a 7770 2GB so she can play ALL the games she wants, because after researching we figured out that's literally all she needed. Look, it's not THAT bad, but maybe that's why they've been holding out on them for so long. Sittin' there in the giant towers... like Sauron, man. Waitin' to see what's happening with the 2nd-gen i3s. Probably saying something along the lines of "OH WAIT, they are selling fine; well, I guess we gotta make sure this is the most worthless upgrade ever." Not gonna lie, I have thought about the IGP, but the more I think about it, if the 3570K's IGP wasn't as good as the 3770K's because of a 2MB difference in cache, it just doesn't sound like I have any worthwhile reason to even look at this. Intel still makes the banginest processors in the whole world, so I'll stick with them.

    Doesn't mean I gotta like all of their methods, though.
  • DanNeely - Monday, September 10, 2012 - link

    Unless you're running 3+ GPUs, PCIe 3.0 won't let you run higher image-quality settings when gaming (and on an i3 your CPU would probably bottleneck first in a 3/4-GPU setup). It's only high-end RAID controllers and some scientific computing workloads (the ones that need frequent CPU/GPU coordination) that benefit from PCIe 3.0 in single/dual-card configurations.
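    The bandwidth arithmetic behind this PCIe discussion, from the published line rates and encodings (PCIe 2.0: 5 GT/s with 8b/10b; PCIe 3.0: 8 GT/s with 128b/130b), per direction:

        def lane_mb_s(gt_s, payload_bits, total_bits):
            """Usable MB/s per lane after encoding overhead."""
            return gt_s * 1000 * payload_bits / total_bits / 8

        pcie2 = lane_mb_s(5.0, 8, 10)      # 500 MB/s per lane
        pcie3 = lane_mb_s(8.0, 128, 130)   # ~985 MB/s per lane
        print(f"x16 PCIe 2.0: {16 * pcie2:.0f} MB/s")  # 8000 MB/s
        print(f"x8  PCIe 3.0: {8 * pcie3:.0f} MB/s")   # ~7877 MB/s

    An x8 3.0 link nearly matches an x16 2.0 link, which is why dual-card setups see so little benefit from the newer standard.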
