Would be quite curious to see the rationale, honestly. What I've stated is not really opinion, but is largely accepted by the gaming community as a whole. Consoles have the mystical "eight-year lifespan", and it is not until a third to halfway through that lifespan that you start seeing the technological limitations pushed. That extends to first-party games, too.
From a development perspective, multi-platform games are generally written to maximize "core commodity", which is to say that as much code as can be reused, is reused. That is a direct attempt to minimize defects and increase delivery rates. It comes at the cost of having a minimum and maximum configuration that you can support. In the PC world, the PC generally exceeds the maximum, which is why it is a moot point - but not in the console world. You can have platform-specific code that enables some level of optimization and is tuned towards the console, but effectively, you are trying to port the smallest amount of code between consoles as is physically possible.
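A minimal sketch of that "core commodity" idea, with invented target names and capability numbers purely for illustration: the shared engine code calls one thin platform surface, and only the small implementation behind it differs per target.

```cpp
#include <cassert>
#include <cstdint>

// Shared contract: every target must provide exactly this surface.
struct PlatformCaps {
    std::uint32_t max_width;
    std::uint32_t max_height;
    bool supports_async_compute;
};

// Per-target implementations. In a real build these would live in
// separate translation units selected by the build system; the macro
// names here are hypothetical.
#if defined(TARGET_CONSOLE_A)
PlatformCaps query_caps() { return {1920, 1080, true}; }
#elif defined(TARGET_CONSOLE_B)
PlatformCaps query_caps() { return {1600, 900, false}; }
#else // PC and anything else: assumed to exceed the console maximums
PlatformCaps query_caps() { return {3840, 2160, true}; }
#endif

// Core code, reused everywhere: clamps the requested resolution to
// what the current target can do. It never branches on the platform.
std::uint32_t clamp_width(std::uint32_t requested) {
    const PlatformCaps caps = query_caps();
    return requested < caps.max_width ? requested : caps.max_width;
}
```

The point of the shape is that the maximum and minimum configurations live in one small, swappable spot, while everything built on `clamp_width` ports for free.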
We already know that games in this generation can run at an equal or higher framerate, with equal or higher resolution, and better visuals. It is the exact same reason you are seeing developers announce cancellations for the previous generation, even now. Dying Light is just one recent example of that in the news.
-
Basically, you have an understanding of how this works. You have already worked out how your typical PC game is designed for current high-end component specs (graphics cards, processors, etc.), but is clearly able to run on mid- to low-end components at release with reduced graphic quality. This is employed in almost every new and old game you have ever played on PC; it is called specification customization. It is currently being employed in Destiny, BF, and every other quad-console game, and it runs unannounced to the player. Honestly, if they say the old consoles held them back, the truth would lie more along the lines of them being lazy and cutting corners - what you have is a game designed for next gen, with graphic quality reduced to suit the last-gen consoles.
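The scaling described above can be sketched in a few lines. This is a hypothetical example, not any real engine's settings system; the VRAM thresholds and texture sizes are invented for illustration. One game ships, and a quality tier is picked from the detected hardware rather than building separately per spec.

```cpp
#include <cassert>
#include <cstdint>

enum class Quality { Low, Medium, High };

// Pick a tier from available video memory. The thresholds are
// made up for this sketch; a real engine would probe far more.
Quality pick_quality(std::uint32_t vram_mb) {
    if (vram_mb >= 4096) return Quality::High;
    if (vram_mb >= 2048) return Quality::Medium;
    return Quality::Low;
}

// The renderer consumes the tier; the game code stays identical
// across high-end PCs, low-end PCs, and consoles.
std::uint32_t shadow_map_size(Quality q) {
    switch (q) {
        case Quality::High:   return 4096;
        case Quality::Medium: return 2048;
        default:              return 1024;
    }
}
```

On a console the tier is simply pinned at build time instead of probed, which is exactly the "reduced to suit the last-gen consoles" case the post describes.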
-
Correct - but I still do not see how Bungie lied. As a game developer, you have to make concessions closest to the core, and the core is the game engine. There is no advantage in writing an engine for each console when you can instead code to your lowest common denominator, with the intent of improving it in subsequent versions. Ultimately, you take code that is target-specific and remove it from the core aspects - compartmentalizing code so that it has no impact on other systems, negative or otherwise. This is not just a concept in game development, but a general concept in multi-platform development as a whole. Write as much code for reuse as possible, even if that limits what you can achieve, and focus on the overall experience. As platforms become irrelevant (or in the case of the real world, where the platforms are generally the OS and they evolve), you adjust what the minimum acceptable target is, and optimize towards that.

Core concepts, though, are too low-level to generally abstract on a platform-by-platform basis. The cost of manpower to develop and maintain that would break most shops - and you would end up with differing implementations with differing bugs, differing release paths, and an overall difficulty in coordinating longer-term milestones.

I write software for a living, and have for nearly two decades. Part of that time was spent tinkering with game development (although I specifically chose not to make a career of it), and I do have friends that still work in the gaming industry. Developing for the highest potential target and dumbing down code is not what most shops would do. It is a bad practice, and it only leads to more issues than it would ever solve. In fact, that alone would be enough of a warning sign that the shop has absolutely no clue how to develop a multi-platform product, and I would run (not walk) to another shop, personally.
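The compartmentalization argument above can be shown concretely. This is a sketch with invented class names and draw-call limits, not any shipping engine's API: the core depends only on an abstract interface, so target-specific tuning lives at the edges and cannot leak bugs into the shared code.

```cpp
#include <cassert>
#include <cstddef>

// The lowest-common-denominator contract the core is written against.
class GpuBackend {
public:
    virtual ~GpuBackend() = default;
    virtual std::size_t max_draw_calls() const = 0;
};

// Edge code: one small class per target, each tuned independently.
// These would sit in platform-specific modules, outside the core.
class LastGenBackend : public GpuBackend {
public:
    std::size_t max_draw_calls() const override { return 2000; }
};

class CurrentGenBackend : public GpuBackend {
public:
    std::size_t max_draw_calls() const override { return 10000; }
};

// Core code: identical on every platform, written to the minimum
// target. It splits a frame's draws into batches the GPU can take.
std::size_t batches_needed(const GpuBackend& gpu, std::size_t draws) {
    const std::size_t cap = gpu.max_draw_calls();
    return (draws + cap - 1) / cap; // ceiling division
}
```

Notice the direction of dependency: the core never names a console, so adding or dropping a target touches one leaf class rather than the engine - the opposite of writing for the highest target and dumbing down.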