New Xbox Revealed

Kantham

Fool.
Sep 17, 2004
18,034
2
38
+1 to the M$ cash cows' tardness.

[m]http://www.youtube.com/watch?v=Ot9SYHVdXTk[/m]
 

Nemephosis

Earning my Infrequent Flier miles
Aug 10, 2000
7,711
3
38
It's a lot easier to avoid a shitstorm than to clean one up. I guess Microsoft just felt they could walk all over people and we'd just bend over and take it, and they were just told no.
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
PS4 is going to be $399. It's the same scenario as last generation just reversed.
 

Benfica

European Redneck
Feb 6, 2006
2,004
0
0
I just hope that the AMD chips are released for the PC market and we can get CPU & GPU on the same package (hence using the same memory pointers) where neither sucks.
 

Carbon

Altiloquent bloviator.
Mar 23, 2013
557
10
18
Benfica said:
I just hope that the AMD chips are released for the PC market and we can get CPU & GPU on the same package (hence using the same memory pointers) where neither sucks.

This isn't going to happen for a while yet, and we should be careful what we wish for. I would rather be able to upgrade my graphics card when I run two 4K monitors than have to buy a whole new CPU package as well.

As you said, unified memory architecture needs to come first - get rid of the PCI-E bus completely, which means getting rid of the graphics 'card' as we know it. Until UMA comes, there will be no competitive integrated graphics; at least nothing that can compete with an add-in card. Integrated graphics are still in their terrible infancy, where all they say is 'no'.

The SKUs AMD are using are not for public consumption, and that is fine; when the consoles do finally arrive, they will still be less powerful than a good PC and will be completely buried within one cycle. Then again, for the floundering AMD, the chips will probably be a huge leap forward. Ho, ho.

We won't be seeing integrated graphics dominate the PC for a while, that is for sure. It will be fine for the consoles, as they will be hemmed in by all manner of protocols by which the PC isn't restrained. The PC is, and always will be, more powerful than a console.

Leave integrated graphics to the mobile space and just figure out UMA with a swappable graphics chip. That would be sweet. I bet Nvidia thinks so as well. ;)
 

Benfica

European Redneck
Feb 6, 2006
2,004
0
0
Carbon said:
This isn't going to happen for a while yet, and we should be careful what we wish for. I would rather be able to upgrade my graphics card when I run two 4K monitors than have to buy a whole new CPU package as well.

As you said, unified memory architecture needs to come first - get rid of the PCI-E bus completely, which means getting rid of the graphics 'card' as we know it. Until UMA comes, there will be no competitive integrated graphics; at least nothing that can compete with an add-in card. Integrated graphics are still in their terrible infancy, where all they say is 'no'.

The SKUs AMD are using are not for public consumption, and that is fine; when the consoles do finally arrive, they will still be less powerful than a good PC and will be completely buried within one cycle. Then again, for the floundering AMD, the chips will probably be a huge leap forward. Ho, ho.

We won't be seeing integrated graphics dominate the PC for a while, that is for sure. It will be fine for the consoles, as they will be hemmed in by all manner of protocols by which the PC isn't restrained. The PC is, and always will be, more powerful than a console.

Leave integrated graphics to the mobile space and just figure out UMA with a swappable graphics chip. That would be sweet. I bet Nvidia thinks so as well. ;)
I don't see the industry getting rid of PCI-e x16 slots for dedicated graphics cards and gamers.

I'm just interested in running OpenCL-accelerated software where I don't have to copy memory buffers, or even access remote memory, across the PCI-e bus. The gains are substantial for this kind of application. I swear I saw a benchmark where an integrated combo was beating a much more powerful setup thanks to much lower latency. Of course, if the new AMD Jaguar devices come with a shitty CPU or GPU compared with a Core i7 + ATI 7970, that's a moot point.
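Roughly what I mean, as a minimal host-side sketch in C (assumes an OpenCL 1.2 SDK is installed; error handling omitted for brevity): on a chip where CPU and GPU share physical memory, a buffer created with CL_MEM_ALLOC_HOST_PTR can be mapped and filled in place, instead of being pushed over the PCI-e bus with clEnqueueWriteBuffer.

[code]
/* Sketch: zero-copy buffer setup in OpenCL (no error checking). */
#include <CL/cl.h>
#include <stdio.h>

int main(void) {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    size_t n = 1 << 20;
    /* The runtime allocates host-visible memory; on an integrated
     * CPU+GPU this is the same physical RAM the GPU reads. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_ALLOC_HOST_PTR,
                                n * sizeof(float), NULL, NULL);

    /* Map instead of copy: the returned pointer aliases the buffer. */
    float *p = (float *)clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE,
                                           0, n * sizeof(float),
                                           0, NULL, NULL, NULL);
    for (size_t i = 0; i < n; ++i) p[i] = (float)i;
    clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);
    /* ... enqueue kernels on buf here; no PCI-e transfer was issued ... */

    clReleaseMemObject(buf);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}
[/code]

On a discrete card the same flags usually just hide a DMA transfer; only true shared memory makes the copy actually disappear.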
 
Last edited:

Carbon

Altiloquent bloviator.
Mar 23, 2013
557
10
18
Benfica said:
I don't see the industry getting rid of PCI-e x16 slots for dedicated graphics cards and gamers.

Indeed, which is essentially what I said.

I am interested in getting rid of the PCI-E bus but keeping the ability to swap out graphics chips, all utilizing UMA.

This would simply mean getting everyone in the same room, as it were, to make decisions. It would be cheaper for graphics card makers not to have to make full cards; the GPU would be treated as just another type of CPU (which is where it is already moving, something that started quite a while ago when GPUs became programmable), and the process would involve producing just a chip, not an entire card. It could then be swapped out exactly like a CPU.
 

Benfica

European Redneck
Feb 6, 2006
2,004
0
0
Carbon said:
This would simply mean getting everyone in the same room, as it were, to make decisions. It would be cheaper for graphics card makers not to have to make full cards; the GPU would be treated as just another type of CPU (which is where it is already moving, something that started quite a while ago when GPUs became programmable), and the process would involve producing just a chip, not an entire card. It could then be swapped out exactly like a CPU.
Perhaps, but only within the same brand. For the high end, the industry will never cooperate to accomplish what you're suggesting, like Intel redesigning a memory controller to coordinate with an Nvidia GPU.

In fact, these articles suggest the platform is closing up, which is worrying in the long term:
http://semiaccurate.com/2012/12/17/intel-slams-the-door-on-discrete-gpus/
http://semiaccurate.com/2012/12/18/how-intel-can-slam-the-door-on-gpus/
 

Benfica

European Redneck
Feb 6, 2006
2,004
0
0
Oh, and there is nothing direct about DirectX. It's horrible. No wonder consoles can often have better graphics than a PC with more capable hardware, and often all you get are shitty console ports.
 

Arnox

UT99/2004 Mod Crazy
Mar 26, 2009
1,601
5
38
Beyond
Benfica said:
Oh, and there is nothing direct about DirectX. It's horrible. No wonder consoles can often have better graphics than a PC with more capable hardware, and often all you get are shitty console ports.

I think that's only because they don't have to build a game for a thousand different PC configurations. With a console, it's just one universal system.
 

Benfica

European Redneck
Feb 6, 2006
2,004
0
0
Arnox said:
I think that's only because they don't have to build a game for a thousand different PC configurations. With a console, it's just one universal system.
It's more about the way it's (poorly) designed. Libraries and APIs (in any kind of programming, not just graphics) can shield the programmer from those multiple PC configurations and allow writing generic, boring code where programmer productivity is what matters, yet still give them the freedom to innovate or program for speed. For example: write 95% of the code using the convenience of the libraries, but hand-write the 5% that is critical for speed, image quality, or new ideas.
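To illustrate the 95/5 split in plain C (a hypothetical sketch, nothing to do with DirectX itself): the standard library handles the boring bulk of the program, and only the hot inner loop is hand-written with SSE intrinsics.

[code]
/* Sketch: convenient portable C for the 95%, hand-tuned SIMD for the 5%. */
#include <stdio.h>
#include <stdlib.h>
#include <emmintrin.h>  /* SSE2 intrinsics */

/* The hand-written 5%: a vectorized dot product. */
static float dot_sse(const float *a, const float *b, size_t n) {
    __m128 acc = _mm_setzero_ps();
    size_t i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_mul_ps(_mm_loadu_ps(a + i),
                                         _mm_loadu_ps(b + i)));
    float tmp[4];
    _mm_storeu_ps(tmp, acc);
    float s = tmp[0] + tmp[1] + tmp[2] + tmp[3];
    for (; i < n; ++i) s += a[i] * b[i];  /* scalar tail */
    return s;
}

int main(void) {
    /* The convenient 95%: allocation, setup, and I/O via the standard library. */
    size_t n = 1000;
    float *a = malloc(n * sizeof *a), *b = malloc(n * sizeof *b);
    for (size_t i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }
    printf("dot = %f\n", dot_sse(a, b, n));
    free(a); free(b);
    return 0;
}
[/code]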

Writing DirectX code is not productive; AFAIK there is no real direct access to the GPU instruction set, it is CPU-demanding/slow, and it serves the purpose of locking developers into Windows.

Interesting articles:
http://www.tomshardware.com/news/API-DirectX-11-Shader-Richard-Huddy-PC-gaming,12418.html
http://www.tested.com/tech/gaming/2036-is-directx-holding-back-pc-game-development/
 

HugoMarques

☆☆☆☆☆
Dec 14, 2010
612
0
16
Portugal
Remember Microsoft's FUD campaign against OpenGL back in 2003, and how everyone stupidly fell for it? And how, since then, M$ has been able to make DirectX slow as fuck so their piss-poor console can compete with PCs several times more powerful?

I know that an API has to be compatible with any number of different configurations, but that's not the primary reason why PC GPUs currently struggle so much with games that run equally well on ten-times-less-powerful consoles.
 

Renegade Retard

Defender of the newbie
Dec 18, 2002
6,911
0
36
TX
Visit site
No More Kinect Requirement

http://www.tomsguide.com/us/kinect-microsoft-console-xbox-one-gaming,news-17348.html

Eventually Microsoft caved in to consumer backlash about having a sensor watching their every move even when the console is supposedly sleeping, so the company said okay, the motion-detecting sensor can be turned off in the system settings.
Now Microsoft's Marc Whitten has come forward to explain that if Xbox One owners unplug Kinect, the world won't stop spinning. "Like online, the console will still function if Kinect isn’t plugged in, although you won’t be able to use any feature or experience that explicitly uses the sensor," he said.

He also said that Kinect will be totally off when the user switches it off in the settings – there's (supposedly) no secret sleeping mode. "You have the ability to completely turn the sensor off in your settings," he added. "When in this mode, the sensor is not collecting any information. Any functionality that relies on voice, video, gesture or more won’t work."