Now that I'm awake with caffeine in my blood, let me share what I think is happening. Long post incoming, TL;DR folks...
Back in the day, before all these fancy consoles on the market and the general notion of console gaming, we had PC games and simple consoles like the Nintendo and Sega systems. In that era almost everything was exclusive: PC had its own games, Sega had its own, and Nintendo had its own. Little to no cross-platform gaming.
With conditions like that, developers of PC, Sega, and Nintendo games focused on a single product for a single system. And if you all recall, games back then were solid for whatever system you got them for; rarely did you encounter bugs.
Also in that time period (or a bit after it) PCs were pretty high tech but in a way almost standard, meaning most people had about the same RAM, processor, and video card hardware. (I could be totally generalizing here, just speaking from example: my friends and I growing up all had similar rigs, and now we are all over the place in terms of parts and specs.)
With similar specs between PCs (some a little lower and higher end than the standard), PC devs could easily make and test a game against a given standard and be sure that it worked for, let's say, 80-90% of consumers.
On the flip side, with the consoles around at that time (I think the Genesis had died off by the late 90's), developers had it easy: they could make a game and be sure it worked perhaps 95-98% of the time, because they knew the consumers all had the same machine with the exact same specs in their living rooms.
Now let's move forward in time. PC modding is pretty much rampant: so many video card options, mobo options, various amounts of RAM (2GB, 4GB, 6GB...). And in console land we get the Xbox 360 and PS3.
At this point developers can place a sure bet that any game they make will work to a higher degree on console than on someone's modded rig, because like before, all the consoles have the exact same hardware. And since they know exactly what that hardware can and cannot do, they build their games around it.
However, something has emerged over the years that barely existed back in the day: consumer demand (or publisher demand) for cross-platform releases. Let's say a game comes out on console A. It's good, really good. Console B owners say, we want it too. So it gets ported. Then down the road the PC guys that don't own console A or B say, wait, we want this awesome game too, so it gets ported a third time.
Now this is where I think the breakdown occurs in modern game development. When a game is made on the 360 first (for example), the devs know what they have to work with in terms of memory, graphics optimization, controls, etc. It's standard. Now if that game needs to go to the PS3, it can be done, but some things may need to be tweaked to get it running on the PS3 architecture. Still, for the most part that port will follow some standard rules to make it work on the PS3. But wait... PC gamers want the game too!
In this day and age there are hundreds (okay, that may be a stretch) of possible PC configs. Now a dev house has to go through the process of porting their game to all sorts of setups: low-end machines, high-end machines, and everything in between. Doing this leaves the door open for bugs and performance issues to be more prominent than on consoles.
So what does this all mean? You may ask, why not make a PC game from the get-go and forget console gamers? Well, in today's market, the PC gamer is a priority for very few developers (perhaps only Blizzard and Valve). Yes, there are people like me that would love to make the game as awesome for the PC gamer as possible, but in any dev house the powers that be, the shareholders (if you are an Activision Blizzard or EA), want to get the most bank off their games. And how do you do that? Make the game for a console, a standardized gaming market, where all the consumers have the same setup.
Now some studios still pull the exclusive card, but again mostly on consoles and a little bit on PC (like StarCraft 2). And guess what, that gives the devs even more power to focus their time and effort on making the game as awesome as possible for one system.
However, when games go multi-platform, issues can arise due to the different architectures. And more often than not, with the games I have played on PC, I can clearly tell when it is a console port. Why? Well, devs sometimes cut corners to get the game on PC just for the sake of being on PC. Perfect example: recently I played Transformers: War for Cybertron on PC. In that game the clear sign of a console port was the lack of any way to change controls and keybindings! Seriously, that's ****ed up! But I see why it happened. Transformers on console doesn't need key bindings or control modifiers; it's all standard, everyone has the same controller. So be it laziness, or just cutting corners to put the game on PC just to have a PC copy, stupid things happen.
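And the frustrating part is that the missing feature is conceptually tiny. A minimal sketch of what a rebindable-controls layer looks like (purely hypothetical names and actions, not from any actual game's code):

```python
# Hypothetical sketch: actions map to rebindable keys instead of
# being hard-coded, which is all a PC port really needs to add.

DEFAULT_BINDINGS = {
    "jump": "SPACE",
    "transform": "F",
    "fire": "MOUSE1",
}

class KeyBindings:
    def __init__(self, defaults):
        # copy so rebinding never mutates the shipped defaults
        self.bindings = dict(defaults)

    def rebind(self, action, key):
        if action not in self.bindings:
            raise KeyError(f"unknown action: {action}")
        self.bindings[action] = key

    def key_for(self, action):
        return self.bindings[action]

kb = KeyBindings(DEFAULT_BINDINGS)
kb.rebind("transform", "T")  # player remaps transform from F to T
print(kb.key_for("transform"))
```

On console this whole layer collapses to one fixed table, which is exactly why it's the first thing to get dropped in a rushed port.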
Now look at Black Ops: on the 360 that thing has a great frame rate, and minimal lag with matchmaking MP. Again, the devs know everyone has the same 360. But on PC there are so many configs that it's hard to account for all the possibilities. Sure, a minimum spec chart is released for most PC games, but sometimes **** happens under the hood of the game which makes it not work with the PC you have. And taking the time to set up multiple different rigs costs money, compared to getting the game to run on a standardized console.
Which leads me to another point, perhaps for another discussion: when games go console exclusive, like Gears, Uncharted, or God of War, people complain that they should also be on the console they own, or even on PC. But from a business view, going exclusive, like games back in the day that were only on Sega or Nintendo, gives the devs a better chance to make a better game for a standardized system without sacrificing time and money to make it work for other systems.
So does the game industry treat us like beta testers? I say no. What I do think is that the advent of multi-platform gaming has caused more issues for consumers and developers. Now it's up to the developer to take the time to consider all the possible edge cases of multi-platform development, but at what cost? That's a question dev houses need to answer. Do they want to take the time to make sure the game runs perfectly on all systems, do they cut corners to make it work, or do they just take the route of old-school gaming and keep things exclusive?
However, I will concede one point. If you are going to make a multi-platform game, or even an exclusive, please make it work to a degree where at least 85-90% of your consumer base is happy with the end product the dev studio put time and money into making!
If you got this far, I salute you. Hopefully this makes some sense. The lunch break in the middle may have derailed or extended this rant beyond its normal means of getting a point across.