These forums have been archived and are now read-only.

The new forums are live and can be found at https://forums.eveonline.com/

Out of Pod Experience

 
  • Topic is locked indefinitely.
 

Intel’s Haswell Could Be Last Interchangeable Desktop Microprocessor

Author
Commissar Kate
Kesukka
#1 - 2012-11-26 19:10:46 UTC
Looks like future mainstream Intel CPUs will not be interchangeable and will instead be soldered onto the motherboard, although the article says it is likely that high-end desktop platforms will remain in the LGA (interchangeable) package.

Source
Suzu Fujibayashi
Happy Dudes
#2 - 2012-11-26 21:13:23 UTC
I don't know when I last upgraded just the CPU and therefore had to remove the old one.
MotherMoon
Tribal Liberation Force
Minmatar Republic
#3 - 2012-11-26 21:19:02 UTC  |  Edited by: MotherMoon
yay

Good riddance. The new phones coming out with the new Intel mobile and Nvidia chipsets are quad-core 2.5 GHz phones.

We're moving towards a future where even a cheap phone can run 3ds Max and After Effects. How exciting.

Plus they have HDMI out, and you can hook up a mouse and keyboard with Bluetooth.

I guess what I'm saying is, once a motherboard comes with all of the pieces built in, because it's faster/cheaper and just makes more sense, the parts will become more and more affordable anyway.

http://dl.eve-files.com/media/1206/scimi.jpg

Commissar Kate
Kesukka
#4 - 2012-11-26 22:19:11 UTC
Suzu Fujibayashi wrote:
I don't know when I last upgraded just the CPU and therefore had to remove the old one.


Yeah, most of the time when you upgrade you get a new motherboard and CPU at the same time. The only problem I have is that I like choosing a specific motherboard and a specific CPU that fit my needs.

MotherMoon wrote:
yay

Good riddance. The new phones coming out with the new Intel mobile and Nvidia chipsets are quad-core 2.5 GHz phones.

We're moving towards a future where even a cheap phone can run 3ds Max and After Effects. How exciting.

Plus they have HDMI out, and you can hook up a mouse and keyboard with Bluetooth.

I guess what I'm saying is, once a motherboard comes with all of the pieces built in, because it's faster/cheaper and just makes more sense, the parts will become more and more affordable anyway.


As long as I get to keep my keyboard/mouse and have enough horsepower to drive a large display, I don't really care what they do.
Shalia Ripper
#5 - 2012-11-26 22:31:29 UTC
The sky is falling for the home build hobbyist.

Sig blah blah blah blah

AlleyKat
The Unwanted.
#6 - 2012-11-27 11:49:18 UTC
It won't stop home builders, although I would question the financial viability of home building a rig in ten years from now.

Computing power will one day be in the cloud where it should be - the only thing I should need on my desk is a monitor and k-board/mouse.

AK

This space for rent.

Sturmwolke
#7 - 2012-11-27 13:29:26 UTC
Ever wondered why you keep throwing away your video card? Be prepared to throw away whole motherboards with integrated CPUs when a component goes faulty. Applied to our current desktop ATX form factor, serviceability goes down and waste goes up. It's a regression. Remember the abandoned "Slot" series CPUs (limited cooling options and iffy slot connections) and the ATX PSU reverse-exhaust-fan standard (which actually made internal case temps hotter)?

In a clever way, it might increase demand for their CPUs when you have to practically marry the CPU directly to the motherboard with BGAs (which are almost impossible to de-solder/re-solder without proper equipment). The ability to retain a working CPU and just swap out a faulty mobo will cease to be an option. Now all they need to do is sell the idea to mobo manufacturers under the guise of a win-win scenario.

Personally, I couldn't care less if they apply this to mobile devices... but when it comes to desktops/servers, it's a HUGE turn-off.
Kurfin
Kippers and Jam Developments
#8 - 2012-11-27 21:21:45 UTC
Not a radical departure from the status quo. Most home PCs are laptops, which don't have interchangeable CPUs. Desktops are either high-end, for gaming, CAD or whatever, or they are the super-cheap boxes that live in offices all across the world. It could make the super-cheap ones cheaper, or at least secure higher margins for Intel, and the high-end ones still get interchangeable CPUs. And AMD is, you know, still around...

Also, as pointed out, how many of us swap the CPU without changing the motherboard too? By the time you feel the need for an upgrade, odds are your motherboard has an obsolete socket.
Iosue
League of Gentlemen
The Initiative.
#9 - 2012-11-27 21:29:35 UTC
Suzu Fujibayashi wrote:
I don't know when I last time upgraded just the CPU and therefor had to remove the old one.


Pretty much this, not to mention RAM.
digitalwanderer
DW inc
#10 - 2012-11-27 23:17:50 UTC
I, however, am pretty much an enthusiast and have been for two decades, so I'll make sure I go for the nuttiest upgrade possible next year: an overclockable dual-socket motherboard fitted with the fastest 12-core Xeon processors, all water-cooled of course.


So it'll be a 24-core/48-thread monster that can handle anything for years, and cloud computing with just a keyboard, mouse and displays at home can go screw itself as far as I'm concerned.

Tarvos Telesto
Blood Fanatics
#11 - 2012-11-28 01:55:53 UTC  |  Edited by: Tarvos Telesto
No customisation in the future means buying a new computer for every new game after release. Ugh. On the other hand, processors and software have become solid and stable things; with a strong new CPU you are safe for years.

It's like my few-years-old AMD X2 4800+ processor: it can handle most modern games designed for DX9, can at least carry two EVE clients multi-boxed, plus games like the original Crysis and some newer titles, and still has plenty in reserve...

EvE isn't game, its style of living.

Caleidascope
Republic Military School
Minmatar Republic
#12 - 2012-11-28 03:19:21 UTC
An interesting reversal of the 1990s practices.

Back then, Apple computers came with soldered-on CPUs.

PCs had socketed CPUs, so you could pick and choose. If you could not afford a flagship Pentium CPU, you bought a cheaper version, then upgraded to the flagship once the price came down.

Life is short and dinner time is chancy

Eat dessert first!

Reiisha
#13 - 2012-11-28 08:47:20 UTC  |  Edited by: Reiisha
AlleyKat wrote:
It won't stop home builders, although I would question the financial viability of home building a rig in ten years from now.

Computing power will one day be in the cloud where it should be - the only thing I should need on my desk is a monitor and k-board/mouse.

AK


Internet goes out, no access to your data. Router goes out, no access to your data. Network cable goes out, no access to your data. A particularly bad thunderstorm and your Wi-Fi may be interrupted. A cable breaks anywhere along the 10 to 10,000 miles between you and wherever the cloud data is stored, no access to your data. Sandy hits, even though you don't live anywhere near where it happened, no access to your data.

It baffles me how cloud proponents can't see in how many different places this entire thing can go wrong and block access to your data.

Local storage and local computing will never, ever go away, since they give a level of control that at least a small part of the market will never want to give up (including, I bet, a large part of the EVE populace). Cloud storage is, in my opinion, good as a supplement but never as a replacement. It's like storing all your stuff 1,000 miles away and being dependent on the weather, truck drivers, general traffic and even political situations if you want to retrieve a critical bank document or even a cup of coffee. IMHO it's ridiculous. Not to mention the obvious security issues, where strangers will always have access to your data beyond your control, whereas storing it locally gives you some peace of mind that at least you know where it is and who has access to it.

Of course, if people want to blindly trust cloud storage, go ahead; it'll only make the moment when I say "told you so" a hell of a lot sweeter.

As for the actual topic: there are valid business reasons for Intel to do this, but ultimately they're taking away control from a not-insignificant part of the market (home builders and the server market, as well as some major OEMs). This move will drive a lot of people to AMD or, if AMD doesn't manage to survive that long, towards a monopoly lawsuit. It will also most likely help ARM grow a hell of a lot.

This move can only hurt Intel financially in the long run. It seems like they decided to push their current advantage just one step too far, thinking they can get away with it. I'm fairly certain that Microsoft and Nvidia will also take issue with this, since it means they lose a lot of influence over the hardware market, meaning Intel will lose even more support.

If AMD can survive the next two years, I'm fairly sure they will slowly gain the upper hand in the CPU market, or possibly share it with ARM, while Intel slowly slides into the position AMD is currently in.

If you do things right, people won't be sure you've done anything at all...

Sidus Isaacs
Center for Advanced Studies
Gallente Federation
#14 - 2012-11-28 09:23:32 UTC
About time. Miniaturization is the future, and this is a step in that direction.
Perihelion Olenard
#15 - 2012-11-29 01:41:04 UTC
This will be quite unfortunate for people like me who like building computers. Also, if a CPU ever fails, not only would I have to get a new CPU, I'd also have to get a new motherboard. The same would be true if something fails on the motherboard: I'd have to pay for a new CPU as well. What a hassle this is going to be.
Akita T
Caldari Navy Volunteer Task Force
#16 - 2012-11-29 07:50:20 UTC  |  Edited by: Akita T
So, the cheap CPUs (which usually also go on cheap mobos) will be solder-only, while the expensive ones stay the same?
Meh.

The only people really negatively affected are motherboard manufacturers, and even for them it's not such a big deal.
For the average consumer, it won't even register. It might even be a blessing for novice system builders, since they won't have to worry about which CPU fits in which mobo.
And nothing changes for the enthusiasts anyway.

Heck, honestly, I would actually prefer my higher-end CPU soldered to the mobo too.
Out of all possible components, the mobo is the one I pay attention to THE LEAST (except maybe the optical drive, which is really a "whatever's there is good" kind of component).
Honestly, when was the last time any of you actually REplaced a CPU on a mobo? Can you even remember? I sure as heck can't. More than a decade ago, anyway. Probably longer.
With solder, there are no more thermal-paste, wobbly-or-sweaty-hand, or static-shock worries when putting the CPU in a mobo for the first and last time... it might as well be soldered on from the beginning.

Reiisha wrote:
It baffles me how cloud proponents can't see on how many different places this entire thing can go wrong and block access to your data.


Important and/or critical data that's not "a trade secret" (or similar) gets a local copy on a low-power, low-capability device. Heck, there are cellphones today with more processing power than desktop PCs of a decade ago, and that's quite enough for most work. Getting "cut off" from the cloud just means reduced capabilities, not a total blackout.
Sensitive and/or private data stays off the cloud either way, and you keep a machine that can work with it just fine locally. It might get a heavily encrypted cloud backup, just in case.
It's just common sense, IMO.
Kitty Bear
Deep Core Mining Inc.
Caldari State
#17 - 2012-11-29 16:38:34 UTC
Caleidascope wrote:
An interesting reversal of the 1990s practices.

Back then the Apple computers came with soldered on CPU.

PC computers had socket cpu so you could pick and choose. If you could not afford a flagship Pentium cpu, you bought a cheaper version, then upgrade to the flagship once the price came down some.


Hmmm, the happy days of choosing between Intel, AMD, PowerPC and Cyrix :D
Sturmwolke
#18 - 2012-12-03 20:33:11 UTC  |  Edited by: Sturmwolke
Just some info.

Reballing With Low Cost/HomeMade Equipment (just one of many YouTube examples of re-balling BGAs):
http://www.youtube.com/watch?v=VTAU647jzzk

Things like this are becoming more commonplace nowadays due to BGA connection-joint issues. If you've got a faulty GPU board, chances are the problem is a cracked/oxidized BGA joint caused by mechanical/thermal stress or a manufacturing defect. The Nvidia defective-chip drama (way back in 2008 or so) was a BGA failure in the chip packaging.

Imagine the reliability that could be achieved if they'd just socketed the GPU with LGAs or used a socketed adapter like this (which makes repair a cinch).
Sadly, the PC industry just doesn't work that way. Pity *sigh*.

PS: Nice random pics from HP... the text is maybe too technical for the layman.