Back in the day I used to recommend dual-core processors to my friends for precisely one reason:
Since there are not that many applications at the moment utilizing the power of two cores, the best use case for them is this: when an application hangs, it sometimes eats up to 100% of your CPU, and you cannot even move your mouse or click anywhere, simply because there is not enough processing power left. Now with two cores, when an application hangs like that, you still have that second core sitting idle in the background, and you can quickly bring up Task Manager and kill the misbehaving app.
Nowadays, when by far the most used and abused application is the web browser, and each tab is its own process, from time to time I see all of my cores – two, to be exact – fully utilized to the point where typing or switching to another application becomes almost impossible. Even mouse movement becomes jerky (and I’m on a Core 2 Duo 2.53GHz). I feel like I’m back in the single-core times. Four cores might behave a little better, but maybe eight is the target to reach? I’m worried that the browser will just eat them up, no matter how many you have.
Applications utilizing multiple cores are actually very welcome; it’s just too bad that this model has some drawbacks.
It’s just plain old Arial at 14px. Apparently ClearType does not like this font very much. There are similar artifacts at other sizes too. And, of course, it’s nothing new; it’s just that today I stumbled upon such a clearly visible example.
And one more thing – I *love* ClearType. It just hurts to see it go bad.
Moore’s original statement that transistor counts had doubled every year can be found in his publication “Cramming more components onto integrated circuits”, Electronics Magazine, 19 April 1965. Moore slightly altered the formulation of the law over time, bolstering the perceived accuracy of Moore’s Law in retrospect. Most notably, in 1975, Moore altered his projection to a doubling every two years. Despite popular misconception, he is adamant that he did not predict a doubling “every 18 months”. However, an Intel colleague had factored in the increasing performance of transistors to conclude that integrated circuits would double in performance every 18 months.
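The three formulations above diverge quickly, which is why the distinction matters. A minimal sketch (the baseline of 1 and the time horizons are illustrative, not from any of Moore’s papers):

```python
# Sketch: how the competing "Moore's Law" doubling periods diverge over time.
# Starting point is an arbitrary baseline of 1; only the ratios matter.

def growth(doubling_period_years: float, years: float) -> float:
    """Multiplicative growth after `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

for years in (5, 10):
    print(f"after {years} years:")
    print(f"  doubling yearly (1965 statement):    x{growth(1, years):.0f}")
    print(f"  doubling every 2 years (1975):       x{growth(2, years):.0f}")
    print(f"  doubling every 18 months (popular):  x{growth(1.5, years):.1f}")
```

After a decade the yearly formulation predicts a 1024x increase, while the two-year formulation predicts only 32x – a 32-fold disagreement from what sounds like a minor rewording.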
December 15th, 2007: Seagate Barracuda 7200.11 500GB – 427 PLN
Pitt plays the character at almost every age, but it’s almost impossible to tell when the CGI is being used on him. You know it’s there, obviously, but you can’t tell it’s being used. When the transition is just smooth enough for the Visual Effects to be retired, but just rough enough to use makeup, it’s absolutely perfect. If you’ve ever wanted to see Brad Pitt look 20 again, look no further, as the effects that make our actors young again (the same goes for Blanchett) are just as stunning as those that make them older.
The thing is, with a screen big enough and a 1080p source, you can tell the difference between Brad Pitt and CGI. And you can do it quite easily. Both old and young Benjamin look very much CGI-like. And it’s not that the effects are bad, because they’re not; it’s just that at this level of detail (and face close-ups) your effects need to be close to perfect. It seems we’re not quite there yet. The skin, the wrinkles, the small hairs covering the face, freckles, moles, every facial feature – those are the things you see and appreciate the most when viewing good HD content. And they make the biggest impression. I know they did on me. They say animating a face is the hardest thing to do. At HD it’s ten times harder.
In particular, I’ve concluded that the free is deeply entwined into the very foundation of technology. I was sharing some of those emerging half-baked thoughts with Chris in the lobby of TED. Since that conversation I’ve discovered that the tie between technology and the free goes even further than I thought. My current conclusion can be summarized simply: Technology wants to be free.
Let me state it more precisely: Over time the cost per fixed technological function will decrease. If that function persists long enough, its cost begins to approach (but never reach) zero. In the fullness of time any particular technological function will exist as if it were free.
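This “approaching but never reaching zero” is just exponential decay. A minimal numeric sketch, assuming a constant fractional cost drop per year (the starting cost of 100 and the 20% annual rate are illustrative assumptions, not measured figures):

```python
# Sketch: cost per fixed function falling by a constant fraction each year.
# The 20% annual decline and the 100-unit starting cost are assumptions.

def cost(initial: float, annual_decline: float, years: int) -> float:
    """Cost after `years`, declining by fraction `annual_decline` per year."""
    return initial * (1 - annual_decline) ** years

for years in (0, 10, 25, 50):
    print(f"year {years:2d}: {cost(100.0, 0.20, years):.4f}")
```

After 50 years the cost is a tiny fraction of a unit, yet still strictly positive – the function behaves “as if it were free” without its cost ever actually being zero.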
This seems to be true for almost anything we make: basic things like foodstuffs and materials (often called commodities), complicated stuff like appliances, as well as services and intangibles. The costs of all these (per fixed unit) have been dropping over time, particularly since the industrial revolution.
During the recent move of my development machine to the basement I conducted a test to find out the actual power consumption of my new Core 2 Duo powered server. Basically, it is a normal PC: Core 2 Duo E6300 1.86GHz, 2 x 512MB DDR2, 2 x 250GB SATA 7200rpm (RAID 1), an old PCI graphics card and a 350W power supply, all running the latest Ubuntu (currently 7.04 Feisty Fawn, server edition). Since it is a development machine, it’s idle most of the time (98% or even more), and that is the state I made my measurements in. So what are the results? Well, I was quite surprised how low the power consumption actually is. I took three measurements, all indicating basically the same thing: about 77 watts. Even taking into account temporary power usage spikes (when I’m actually using the machine…) it shouldn’t cost me more than $4 per month to keep it running 24/7. Isn’t that sweet? ;)
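The back-of-the-envelope math behind that $4 figure can be sketched like this. The electricity rate below is an assumption on my part, picked so that it roughly reproduces the quoted monthly cost; actual rates vary by provider and country:

```python
# Sketch of the monthly electricity cost estimate for the 77 W idle server.
# RATE_PER_KWH is an assumed price chosen to match the quoted ~$4/month.

WATTS = 77.0                 # measured idle power draw
HOURS_PER_MONTH = 24 * 30    # running 24/7, ~30-day month
RATE_PER_KWH = 0.07          # assumed electricity price in $/kWh

kwh_per_month = WATTS * HOURS_PER_MONTH / 1000.0
monthly_cost = kwh_per_month * RATE_PER_KWH
print(f"{kwh_per_month:.1f} kWh/month -> ${monthly_cost:.2f}/month")
```

At 77 W around the clock the machine draws about 55 kWh a month, so anywhere near $0.07/kWh lands comfortably under the $4 mark.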
There seems to be some hidden truth behind Apple’s decision to switch to Intel chips. Robert X. Cringely suggests that Intel might be planning to buy out Apple (just another word for a merger). I must say he uses some really convincing arguments. On the other hand, the rumor that Intel might be producing PowerPC chips for Apple was very sensible and convincing too.
Here are some excerpts:
If Apple is willing to embrace the Intel architecture because of its performance and low power consumption, then why not go with AMD, which equals Intel’s power specs, EXCEEDS Intel’s performance specs AND does so at a lower price point across the board? Apple and AMD makes far more sense than Apple and Intel any day.
The vaunted Intel roadmap is nice, but no nicer than the AMD roadmap, and nothing that IBM couldn’t have matched. If Apple was willing to consider a processor switch, moving to the Cell Processor would have made much more sense than going to Intel or AMD, so I simply have to conclude that technology has nothing at all to do with this decision. This is simply about business – BIG business.