Are next generation game consoles already irrelevant?

Jada Maroo
Native Freshfood
Minmatar Republic
#1 - 2013-01-26 05:33:23 UTC
I want to preface this by saying that I'm not claiming TFLOPS is a good measure of overall graphical performance. Memory capacity, bandwidth, a low-overhead operating system and many other things contribute.

That said, let's just use it as a basic measure and consider what we're going to be seeing in the next few years.

In under six months Intel will release the Haswell processor. Haswell is going to be very popular for mobile devices because of its low power consumption. The less talked-about "feature" of Haswell is that Intel is setting the stage for a pretty major bump in its integrated graphics. Haswell's GT3 GPU variant will perform at about 600 GFLOPS. Compare that to roughly 240 GFLOPS for an Xbox 360, and an estimated 1.5 TFLOPS for the next generation Xbox.

More important is Haswell's successor, Broadwell. That chip is expected to carry a GPU with around 4 billion transistors, capable of an estimated 2.5-3 TFLOPS.

The tablet / ultrabook variant will no doubt have less horsepower, but even if it's only half that, in 2-3 years you will have next-gen quality graphics on your tablet, period. And the next generation consoles aren't even out yet.

Which means the next gen consoles will have only 18 months to two years before they are surpassed... by an iPad.
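
To put rough numbers on that claim, here's a back-of-the-envelope sketch using only the estimates above (none of these are official specs, and the tablet-at-half-power factor is my assumption):

# Back-of-the-envelope check using this thread's own estimates.
xbox360_gflops = 240          # current-gen Xbox 360 GPU (estimate)
nextgen_xbox_gflops = 1500    # rumoured next-gen Xbox (estimate)
haswell_gt3_gflops = 600      # Haswell GT3 iGPU (estimate)
broadwell_gflops = 2750       # midpoint of the 2.5-3 TFLOPS guess

tablet_factor = 0.5           # assumption: tablet variant at half power
tablet_gflops = broadwell_gflops * tablet_factor

print(f"Haswell GT3 vs Xbox 360: {haswell_gt3_gflops / xbox360_gflops:.1f}x")
print(f"Broadwell tablet guess:  {tablet_gflops:.0f} GFLOPS")
print(f"vs next-gen Xbox:        {tablet_gflops / nextgen_xbox_gflops:.2f}x")
# -> 2.5x, 1375 GFLOPS, 0.92x: roughly console parity in a tablet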

Once you get to that point, you have more than enough power to embed a very highly capable "console" into something like an Oculus Rift, which I guarantee will happen.

At that point... who is going to bother with these next gen consoles after a couple years? There's been a lot of talk about the next gen being the last, but there doesn't seem to be a realization that their life spans probably won't be nearly that of previous genera --- OH MY GOD JUDGE DREDD IS GAY
Rain6639
GoonWaffe
Goonswarm Federation
#2 - 2013-01-26 05:56:07 UTC
I think... until visors merge computers and televisions, console gaming will continue to exist. console platforms are not as powerful as personal computers, but their games are created around them.

even after visors become common, another factor is console gamers' age and supervision. there will still be adults who want to provide video game entertainment to their children, but without the burden of computer maintenance. this niche of standardized hardware, locked behind a warranty-voiding seal will keep consoles relevant.

lastly: progress in computer processing will help "across the board" in the same way semiconductor device fabrication is currently a limiter "across the board."
Surfin's PlunderBunny
Sebiestor Tribe
Minmatar Republic
#3 - 2013-01-26 05:56:25 UTC
Nope, completely out of date Straight

"Little ginger moron" ~David Hasselhoff 

Want to see what Surf is training or how little isk Surf has?  http://eveboard.com/pilot/Surfin%27s_PlunderBunny

Rain6639
GoonWaffe
Goonswarm Federation
#4 - 2013-01-26 09:35:16 UTC
this was all just a judge dredd pic joke, wasn't it.
Akita T
Caldari Navy Volunteer Task Force
#5 - 2013-01-26 10:15:56 UTC  |  Edited by: Akita T
PS2 (2000) -> CPU: 6 GFLOPS | GPU: 0 GFLOPS | Combined: 6 GFLOPS
Xbox (2001) -> CPU: 1.5 GFLOPS | GPU: 5.8 GFLOPS | Combined: 7.3 GFLOPS
Xbox360 (2005) -> CPU: 115 GFLOPS | GPU: 240 GFLOPS | Combined: 355 GFLOPS
PS3 (2006) -> CPU: 218 GFLOPS | GPU: 1800 GFLOPS | Combined: 2018 GFLOPS
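
Quick sanity math on those figures - the combined totals and the jump per generation (these are theoretical peaks, so treat the ratios loosely):

# Combined totals and generation-over-generation jumps, computed
# straight from the (theoretical-peak) numbers above.
specs = [
    ("PS2 (2000)",     6,    0),
    ("Xbox (2001)",    1.5,  5.8),
    ("Xbox360 (2005)", 115,  240),
    ("PS3 (2006)",     218,  1800),
]
prev = None
for name, cpu, gpu in specs:
    total = cpu + gpu
    jump = f"  ({total / prev:.1f}x previous)" if prev else ""
    print(f"{name}: {total:g} GFLOPS combined{jump}")
    prev = total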

EDIT: almost nobody has actually managed to utilize the full PS3 GPU potential so far, which is a damn shame - indirectly the XB360's fault, with a good helping from Sony making development not very user-friendly.

The older rumours were that the next-gen consoles (let's call them Xbox720 and PS4 for now) would feature mostly AMD/ATI hardware, built around some modified variant of an A8-3850 (2.9GHz quad core) CPU+iGPU combo, with a side order of possibly modified Radeon HD 7670 GPU(s).
One would assume they will use at least two, possibly up to four GPUs (three being the sweet spot as far as anti-micro-stuttering goes), so you can probably expect at least 2.1-2.3 TFLOPS of overall GPU performance out of the PS4. Probably only about 1.4-1.5 from the Xbox - the cheap bastards might go with only two GPUs instead of three, which means good luck, 720 gamers, with your sucky microstutters. That's pretty close to a single desktop GTX 670 (which will probably still cost more than the entire console is likely to cost at launch).
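
For what it's worth, the arithmetic behind those guesses works out if you assume the stock HD 7670's theoretical peak of roughly 768 GFLOPS per card (the per-card figure is my assumption, not from the rumours; a "modified" variant could land higher or lower):

# The rough math behind the 2.1-2.3 / 1.4-1.5 TFLOPS guesses above.
hd7670_gflops = 768   # assumed stock Radeon HD 7670 theoretical peak

for n_gpus in (2, 3, 4):
    print(f"{n_gpus} GPUs -> ~{n_gpus * hd7670_gflops / 1000:.2f} TFLOPS peak")
# 2 -> ~1.54 (the stingy Xbox guess), 3 -> ~2.30 (the PS4 guess)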

They might actually ramp those specs up a bit before the consoles launch - but frankly, that stops being a major concern at this point, and even lower specs would be OK in most cases.
There is a huge problem with handhelds, and you know it's NOT pure processing power.

It's user interaction.

The damn screens are too small for halfway serious gaming, for starters (or else they would no longer be handhelds, would they).
The input methods are problematic at best (if you add an external controller that's larger than the device itself, what's the point of having a handheld), if not outright impossible as far as serious gaming goes.
All in all, you just can't compare the gaming experience you get from a console and a handheld EVEN IF YOU USE A HYPOTHETICAL GAME THAT RUNS MORE THAN FINE ON EITHER HARDWARE. And the experience on a PC is even better, because, come on, mouse plus keyboard vs controller, there's not even a contest there.
And the consoles are not really at risk from the PCs either, since consoles stay too darn cheap for most of their useful life, so their userbase is that much broader.

The consoles will only start to risk becoming an endangered species when somebody comes up with a way to include both a large field of vision (the Oculus Rift might be the only serious contender so far) and reliable control methods (none so far have managed that) in a handheld (or a "wearable" device, whatever) THAT IS ALSO QUITE CHEAP (pretty unlikely), so that it can compete with similarly-specced consoles.
AlleyKat
The Unwanted.
#6 - 2013-01-26 18:19:58 UTC
Micro$oft

Sony


If I'm reading this right, and I think I am, these designs will not do 4K gaming - which leads me to believe this new generation of consoles is going to be a stopgap before 4K gaming can arrive.

1080p/active 3D gaming and more integration of online services/media streaming is really the goal here, I believe - as well as making the console as cheaply as possible to maximize ROI over a shorter lifespan, and to have a cool and efficient machine.

No more RRoD, or YLoD is my guess.

What are their respective target audiences?

Microsoft have got the frat-boy demo sewn up, so they'll probably capitalise on that more, but at the same time they would not wish to lose sales from those who have left college and are starting a career, looking to spend $ on stuff to put into their new babe-lair... so the aesthetics are going to be interesting.

I still maintain that a 2014 N.America release is unlikely for the PS4 - perhaps Japan in fall 2014, then N.America in spring 2015, EU/EMEA in the summer, and AU when they can be bothered.

I think Microsoft are gonna be first out of the gate, no pun intended.

AK

This space for rent.

Mars Theran
Foreign Interloper
#7 - 2013-01-26 18:55:54 UTC
I think having a viewscreen less than 2 inches from your eyes is going to cause damage to people's vision. Infrequent use of tech like the Oculus Rift might be okay, but anyone spending real time with it is certainly going to have problems.
zubzubzubzubzubzubzubzub
Eli Green
The Arrow Project
#8 - 2013-01-26 19:22:34 UTC
Mars Theran wrote:
I think having a viewscreen less than 2 inches from your eyes is going to cause damage to people's vision. Infrequent use of tech like the Oculus Rift might be okay, but anyone spending real time with it is certainly going to have problems.


also you feel reaaaaly weird after wearing them for more than 2 minutes. REAAAAALY WEIRD Shocked

wumbo

Marie Hartinez
Aries Munitions and Defense
#9 - 2013-01-26 20:41:41 UTC
I have two PS2s sitting around collecting dust: a first-gen PS2 and the updated slimmer model. Haven't used them in quite a while.

We got the kid a Wii about two years ago. She sometimes plays a game or two on it. But now, she is very much into art and drawing.

The only reason the Wii is still hooked up to the TV is to stream movies and shows from Netflix.

So, we're pretty much done with consoles. Now that tablets are getting better, I am seriously considering getting one for myself and one for the wife. But, they will never replace our computers.

Surrender is still your slightly less painful option.

Jada Maroo
Native Freshfood
Minmatar Republic
#10 - 2013-01-27 00:36:19 UTC  |  Edited by: Jada Maroo
I would only amend my prediction in one way: it's possible you might not see a headset with an embedded GPU because by that time you'll see a wireless HD video standard or much faster wifi connections that can handle HD video.

In fact, 5 years from now, I don't even see desktop computers being around in their current form. More likely, your phone will be your computer, which you will sync to a wireless monitor and Bluetooth mouse and keyboard for desktop tasks. And for gaming, a wireless Oculus type headset paired with a console style controller or motion sensing Kinect.

But I find it hard to believe we're going to continue to be tethered to these clunky systems in a few years. Watch some TV shows on Netflix that aired from around 2006 up to the current day. Look at the phones people were using 5 or 6 years ago. Project that out another few years.

What staying power can these consoles possibly have at this pace?

Turns out Judge Dredd isn't gay, BTW.
Reiisha
#11 - 2013-01-27 03:41:26 UTC
Jada Maroo wrote:
I would only amend my prediction in one way: it's possible you might not see a headset with an embedded GPU because by that time you'll see a wireless HD video standard or much faster wifi connections that can handle HD video.

In fact, 5 years from now, I don't even see desktop computers being around in their current form. More likely, your phone will be your computer, which you will sync to a wireless monitor and Bluetooth mouse and keyboard for desktop tasks. And for gaming, a wireless Oculus type headset paired with a console style controller or motion sensing Kinect.

But I find it hard to believe we're going to continue to be tethered to these clunky systems in a few years. Watch some TV shows on Netflix that aired from around 2006 up to the current day. Look at the phones people were using 5 or 6 years ago. Project that out another few years.

What staying power can these consoles possibly have at this pace?

Turns out Judge Dredd isn't gay, BTW.


Everyone always underestimates the power of a workstation...

Some people might do away with traditional desktops, yes, but there are still a few massive problems with your assumption.

Imagine your "work PC" getting pickpocketed. Hilarity ensues :)


What people always forget is that there's quite a number of people for whom working at a desk, on a proper workstation, is the best way to get certain things done. It's not "the form factor", it's the functionality and usability. You're never going to write code on a telephone. You're never going to edit video on a telephone. You're never going to do "high end gaming" on a telephone. Etc.

You have to remember, "high end" also moves with the tech. The original Quake was, in its day, a "high end" game requiring a lot of power - but even though you can play it on telephones now, I don't see people throwing away their desktops. Tweaking stuff to your own ends will also not be done on mobile devices - can you imagine trying to use the Skyrim editor on a tablet?

A mobile device can break far more easily than an "unwieldy" desktop. Drop it somewhere, water seeps in from the weather, you sit on it, or (as I said before) it gets stolen. Dumdeedum, you lose it for days or even weeks, since you'll most likely have to send it off to the manufacturer to get it fixed, and you have to hope they don't mess up your data.

Something breaks in a desktop? You replace it. Even a novice can troubleshoot a PC, with some help, in a matter of hours. No need to send anything away to get it fixed and lose productivity for an indeterminate time.


You may claim that your data is on a NAS at home that all your devices use. That doesn't change the fact that if your device breaks or is stolen, you still can't access it. And if you have other devices at home just for that, why not have a desktop anyway?


You may claim that your stuff is in the cloud. In that case there are so many things between you and your data - your network card, your network cable, your router/modem, the cable connection in the ground, the ISP network station(s), the different connections that have to be made in between (signals can go through dozens of servers, each of which is a potential point of failure)... Having your data local removes all of those potential problems. The cloud is, at best, an addition to your tools, not a replacement.

Oh, and I almost forgot - storing stuff in the cloud means there are other people who are able to look into your files without having to hack or break anything. That is a necessity and an inevitability of needing system admins, as well as a sad consequence of certain laws in certain countries... You may think you have nothing to hide, but do you really want other people to know what kind of kinky stuff you're doing in your private bedroom, or what you spend your money on and where? What your salary is? Etc.



Which actually goes for all of the things mentioned - They're additions. Gadgets of convenience. They can make certain things easier but they will never replace desktops as they are now, as the market for devices like that will never go away.

If you do things right, people won't be sure you've done anything at all...

Rain6639
GoonWaffe
Goonswarm Federation
#12 - 2013-01-27 09:06:21 UTC  |  Edited by: Rain6639
Mars Theran wrote:
I think having a viewscreen less than 2 inches from your eyes is going to cause damage to people's vision. Infrequent use of tech like the Oculus Rift might be okay, but anyone spending real time with it is certainly going to have problems.


it's a myth that vision problems result from being "too close to the screen" or being "on the computer too much"--it's not about the computer screen. it's a repetitive stress injury: the strain some people experience is related to the range/distance of the gaze and the lens of the eye, and it can happen just as easily from looking only at books, or a desk, or a newspaper.

Jada Maroo wrote:


In fact, 5 years from now, I don't even see desktop computers being around in their current form. More likely, your phone will be your computer, which you will sync to a wireless monitor and Bluetooth mouse and keyboard for desktop tasks. And for gaming, a wireless Oculus type headset paired with a console style controller or motion sensing Kinect.


phones won't kill the desktop, because of CPU scalability and parallel computation. there will always be monster code to crunch that a phone cannot handle but a "desktop"/immobile computer is built for.

AlleyKat wrote:
Micro$oft

Sony

If I'm reading this right, and I think I am, these designs will not do 4K gaming - which leads me to believe this new generation of consoles is going to be a stopgap before 4K gaming can arrive.
AK


there is a -LOT- that needs to happen before consoles break into 4k/Ultra HD, aka 3840 × 2160

(disambiguation: the terms HD, 1080p, Ultra HD, and 4k refer to resolution, and are independent of refresh rate. a rate of 30 frames per second, or fps, is the minimum rate to convey realistic motion to the brain. the brain can perceive higher frame rates, and will "fill in the gaps" as frame rates drop to 30--but anything lower is perceivable "lag." while 60 fps is more visually appealing than 30 fps, the jump from 30 to 60 fps incurs a directly proportionate doubling of computational load for video games, and storage space or broadcast bandwidth for videos. broadcast signals are identified by their resolution and framerate together as 720p30, 1080p50, etc)
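
to make that scaling concrete, a quick sketch of raw, uncompressed data rates (24 bits per pixel assumed; real broadcast and storage are compressed, so read these as relative sizes only):

# Raw (uncompressed) video rates at 24 bits/pixel. Real signals are
# compressed; the point is how the load scales with resolution and fps.
BYTES_PER_PIXEL = 3

def raw_mb_per_s(width, height, fps):
    return width * height * BYTES_PER_PIXEL * fps / 1e6

for label, w, h, fps in [
    ("720p30",  1280,  720, 30),
    ("1080p30", 1920, 1080, 30),
    ("1080p60", 1920, 1080, 60),   # double the fps, double the load
    ("2160p60", 3840, 2160, 60),   # Ultra HD: 4x the pixels of 1080p
]:
    print(f"{label}: {w * h / 1e6:.2f} Mpx/frame, "
          f"{raw_mb_per_s(w, h, fps):.0f} MB/s raw")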

native 1080p graphics would be an improvement in console gaming. most games on the PS3 and XBOX 360 are rendered at resolutions significantly lower than 1080p and then upscaled to 1080p--through onboard upscaling hardware on the XBOX 360, and software on the PS3--and get this: on the PS3, graphics upscaling support is the responsibility of the game developer
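
(and a sense of how much work sub-HD rendering saves--1280x720 here is just an illustrative render target, actual games vary:)

# Pixel budget of a sub-HD render target vs native 1080p.
native = 1920 * 1080
render = 1280 * 720   # illustrative; real render targets vary by game
print(f"sub-HD render is {render / native:.0%} of native 1080p's pixels")
# -> 44%: less than half the per-frame pixel work before upscaling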

the XBOX 360 didn't even have a 1080p-capable output until 2007

I don't expect this next generation of console platforms will try to make "4k" happen, or even develop for it - even the one after that. it would take at least one generation of 1080p -not- being sold... and some displays are still hanging on to 720p

"ultra HD" on PC is only available through an array of computer displays, and Ultra HD TVs are still being showcased as "new" at CES 2013 (hollywood reporter article, sorry.)

there are also movies that aren't "mastered" for Ultra HD, like... everyone's Blu-ray collection. sony's $25,000.00 Ultra HD tv only comes with 10 Ultra HD remastered movies.

television is still not broadcast in 1080p--at most, 1080i

internet bandwidth is getting better, and so is data storage, but 720p and 1080p are still in their blossoming stages, and nowhere near standard... HD as a standard would look like: "a true 1920 pixels in width and 1080 pixels in height, with the display not overscanning, underscanning, or reinterpreting the signal to a lower resolution."

... at 20, 30, 50, or 60 frames per second, and everyone is able to render, broadcast, store, and display it. and thennnn consoles might render in 1080p and upscale to Ultra HD for a generation.

hey, but... i thought you said you were leaving. seeing your mug reminds me that I want to buy you if you leave.
Alpheias
Tactical Farmers.
Pandemic Horde
#13 - 2013-01-27 10:21:22 UTC  |  Edited by: Alpheias
Consoles as devices in your home are becoming obsolete and irrelevant as technology evolves; the same thing can really be said about your personal computer, whether it is an Apple-branded PC or something you built.

I think it is more realistic to expect that in the future more and more of your devices will be cloud-based, and that the change is going to happen quicker in countries like Japan and most of Europe, and slower in countries that really are infrastructural nightmares, like the Americas, Africa, Asian countries and Russia.

Agent of Chaos, Sower of Discord.

Don't talk to me unless you are IQ verified and certified with three references from non-family members. Please have your certificate of authenticity on hand.

Rain6639
GoonWaffe
Goonswarm Federation
#14 - 2013-01-27 10:23:23 UTC
Alpheias wrote:
Consoles as devices in your home are becoming obsolete and irrelevant as technology evolves; the same thing can really be said about your personal computer, whether it is an Apple-branded PC or something you built.

I think it is more realistic to expect that in the future more and more of your devices will be cloud-based, and that the change is going to happen quicker in countries like Japan and most of Europe, and slower in countries that really are infrastructural nightmares, like the Americas, Africa, Asian countries and Russia.


cloud based gaming would be awesome, and would support subscription-based gaming similar to a cable channel. and yeah, bandwidth.
Brujo Loco
Brujeria Teologica
#15 - 2013-01-27 15:22:11 UTC
I still prefer the term Bicurious Big smile

Inner Sayings of BrujoLoco: http://eve-files.com/sig/brujoloco

AlleyKat
The Unwanted.
#16 - 2013-01-27 19:33:29 UTC
Rain6639 wrote:
hey, but... i thought you said you were leaving. seeing your mug reminds me that I want to buy you if you leave.


I am and you can't, because you cannot buy something that does not have a currency, but I appreciate the sentiment :)

4K will happen, as the profit margins on current LCDs have dropped to near flat lines, and its arrival will coincide with a new panel technology, like OLED. Making a 4K teevee is hard, especially if you do not manufacture the panel technology yourself - you are at the panel makers' mercy.

I do agree with most of what you say, but feel that all you need is good up-sampling or down-sampling to counter any resolution woes for non-games. If you are applying this to movies, it's even money, because digital film is 4K (moving to 6K this year), so you get a great transfer of a film print to teevee - minus the aspect ratio changes. Not all films have the same aspect ratios, but 2.35:1 is common, which is way wider than the sub-standard 1.78:1 you get on a 1080p panel.
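
A quick check on that mismatch, assuming a standard 1920x1080 panel - fitting 2.35:1 film onto 1.78:1 leaves roughly a quarter of the height as black bars:

# Letterboxing 2.35:1 film on a 16:9 (1.78:1) 1080p panel.
panel_w, panel_h = 1920, 1080
film_ar = 2.35                  # common theatrical aspect ratio
image_h = panel_w / film_ar     # height the picture actually fills
print(f"picture: {image_h:.0f}px of {panel_h}px "
      f"({image_h / panel_h:.0%}); the rest is black bars")
# -> ~817px, about 76% of the panel; roughly 24% is letterbox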

4K gaming is harder, because you gotta work with very large texture sizes for it not to look bad, and the model complexity will need to equal Hollywood production CG.

This is the direction we're going in for the home market, and I do think console gaming will always have a demand for 'spectacle' gaming destined for big teevees with big controllers. Whether it's teenagers, college boyz, or straight up income disposers living the dream - it'll be around for a long old while.

This approaching generation, though... it's really a transition period for consoles. I have no illusions about it, and I really do feel the next-next generation will be 4K, or higher - whichever "K" has been chosen as the international standard for home entertainment on the new teevee panel technologies.

Kinda weird, isn't it, that panel producers like AU Optronics are the ones who will dictate what future games consoles appear, and when.

AK

This space for rent.

Rain6639
GoonWaffe
Goonswarm Federation
#17 - 2013-01-27 22:35:20 UTC
the first time I saw 1080p was in 1993 at the world fair in Korea. 20 years later it is commonplace to see 1080p at your local electronics store. that's the timeline I expect Ultra HD to follow. that might be a little long as an estimate, since the work of developing flat panels and "true black" RGB is done, but think of all the hardware throughout the television and gaming industries that has only recently been put into use to support 1080p.
Ishtanchuk Fazmarai
#18 - 2013-01-28 14:58:12 UTC
As time goes by, I am becoming a kind of hi-tech luddite, and can't help but wonder what's wrong with the PC that everybody wants to hit a hundred billion in sales with something else. Portable devices are a solution to a non-existent problem.

Why would you want to do the same things you do with a desktop PC at your home/office, just elsewhere, anywhere, anytime, for 80% of the price tag, with 50% of the performance and 1% of the chance of fixing it yourself when **** happens?

I can understand laptops; they're a professional tool for when you need almost-a-PC away from the office. But pads and smartphones and such? They can do some things, and maybe do them better than a PC, but the greatest virtue of the PC is that it can do anything. Budget and software are the limit, not functionality (pads with a 21-inch screen?) or hardware (my cellphone is 6 years old and I recharge its battery every 8 days...).

Roses are red / Violets are blue / I am an Alpha / And so it's you

Eternal Error
Doomheim
#19 - 2013-01-28 15:23:33 UTC
Irrelevant? No, because plenty of people are still going to buy them.

Obsolete? Yes.
Riedle
Brutor Tribe
Minmatar Republic
#20 - 2013-01-28 15:45:19 UTC  |  Edited by: Riedle
The biggest impediment to Ultra HD is the delivery of the signal. Even pure fibre-to-the-home will have difficulty transmitting it efficiently.

All said, if HD took about 20 years to reach product maturity, then I am predicting 10 years for Ultra HD.

You can call it the Riedle Paradox.

:)