Wednesday, 25 May 2011

Tech Analysis: LA Noire (360 vs PS3)

Like with Quantic Dream's Heavy Rain, Rockstar Games and Team Bondi's LA Noire is more of an experience than a raw videogame. But unlike the former, it actually manages to become a far more involving affair, with a decent script and excellent performances breaking through the many mundane – or simply slower, more realistic if you will – parts of being a rising detective.

The key component in making this happen is the game's use of a costly but incredibly advanced method of capturing facial animation. Team Bondi uses a digital scanning solution called MotionScan, which in effect records the actor's performance and then creates a 3D animated model for use in-game. Of course, there's more to it than that: the model has to be compressed and scaled down in complexity to fit within the memory and processing budgets of both consoles. But the end result is still the same: the most believable facial animation work seen in any videogame to date.

LA Noire was originally a PS3-exclusive release, only switching platforms after Rockstar decided that a multi-platform future was the way forward. As such, the core game engine has been optimised for the Sony platform, with some effects looking better on that system, though in a few cases we actually find the 360 code to be mildly superior in this regard. On the whole Team Bondi have delivered an excellent example of solid multi-platform parity, delivering an open-world experience which is accomplished on both formats, but which does favour the PS3.



In past Rockstar titles – namely GTA4 and Red Dead Redemption – we found that memory bandwidth, along with vertex shading performance, was a key concern in getting the engine up and running on both systems without compromising the graphical look of the game. While 360 owners received a native 720p framebuffer with 2x multi-sampling anti-aliasing, PS3 gamers were treated to a sub-HD resolution and quincunx AA instead. The result was a blurrier overall look with more in the way of aliasing issues and less visible fine detail.

For LA Noire things are very different. Sure enough, on the 360 we find the very same 720p plus 2xMSAA framebuffer we've come to expect, but the difference here is that both versions now operate at 720p with the same amount of edge-smoothing. The level of sharpness and edge cleanliness is now equivalent across the board, with crisp and clear image quality readily apparent, save for some deliberate blurring caused by purposely placed depth of field effects.
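For a rough idea of why a 720p framebuffer with 2xMSAA is no small ask on these consoles, here's a back-of-envelope calculation. The figures are our own illustrative assumptions (4 bytes per pixel each for colour and depth), not developer-confirmed numbers:

```python
# Illustrative framebuffer maths: colour (4B) + depth/stencil (4B) per sample.
def framebuffer_bytes(width, height, msaa_samples, bytes_per_sample=8):
    """Total render target size for a given resolution and MSAA level."""
    return width * height * msaa_samples * bytes_per_sample

def mb(num_bytes):
    return num_bytes / (1024 * 1024)

full_720p_2x = framebuffer_bytes(1280, 720, 2)
print(round(mb(full_720p_2x), 1))   # 14.1
```

At over 14MB, a 720p 2xMSAA target doesn't fit in the 360's 10MB of eDRAM in one pass, which is why titles using this set-up render the scene in tiles.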



That's not to say there aren't still issues with LA Noire's presentation. Firstly, while some smaller details – such as the larger telegraph wires – are thankfully devoid of any shimmering or jaggies, the same cannot be said for thinner wires or small elements of scenery dotted throughout the environment.

Secondly, the game is filled with various shimmering shadows which flicker and clip out of view on many an occasion, meaning that although image quality is fairly decent, it's far from perfect overall. Given the open world nature of the game and the high levels of detail on offer, this of course was only to be expected. And there are times when this level of detail does benefit the presentation – it helps create a believable world in which to immerse the player.

LA Noire's world is littered with little touches just waiting to be noticed: detailed geometry on the fronts of buildings, the intricate nature of the power lines roving across the city of LA, various smoke and particle effects, the heat haze present on a muggy morning sunrise, and not forgetting the interactive plants and foliage. Not everything always looks polished or impressive, but these small touches are very welcome nonetheless.






So with that said, it's a pretty accomplished game engine overall, with plenty of detail apparent in both versions. By the looks of things Team Bondi have optimised their engine around the PS3's architecture but without neglecting the 360, thus enabling both versions to look identical from an art standpoint, with only a few technical differences between them.

Occasionally we see what appears to be lower resolution, or less detailed, textures on either format at different points. However, it seems that this is a simple LOD/streaming issue with regards to mip-maps and nothing more. Sometimes we see higher quality assets loading in on the PS3 but not the 360, and vice-versa.






In terms of overall LOD set-up however, the PS3 version commands a small lead over the 360 game. Here we see environmental objects – such as buildings, trees etc – popping in later on the 360, along with texture detail and shadows. By contrast, on the PS3 they usually load in a little quicker, with far away details often being visible from longer distances.

Additionally, the use of a depth of field effect also helps in masking LOD issues on both. In some scenes the DOF is stronger on the 360, hiding object pop, while in others the two versions appear identical. We also see some foliage and trees being slightly more detailed on the PS3 too, regardless of LOD.



Contrary to some early reports, SSAO (screen-space ambient occlusion) is indeed present in both versions of LA Noire; however, the overall impact of the effect throughout is quite different. SSAO adds noticeable depth to scenes in both versions of the game, although the effect is stronger on the PS3. On the 360 the intensity seems to have been reduced, along with the radius of the shadowing produced, making it stand out far less than it does on the PS3.



In scenes which would otherwise appear quite flat and lacking in depth, LA Noire's SSAO generally favours the PS3's implementation – especially on the characters. Elsewhere, on the environments, it can be far too obvious, appearing less realistic as a result. It's also quite buggy in both versions, prone to a few artefacts in motion. For example, the shadowing that SSAO produces often appears as a floating halo around objects, creating an unnatural glow where ambient light occlusion is simulated, in addition to occasionally flickering or popping in and out of view. Both versions suffer from this, but it is more noticeable on the PS3 due to the stronger use of the effect.
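The halo artefact falls naturally out of how a naive screen-space AO pass works. A quick Python sketch – entirely our own toy example, with made-up depth values – shows a background pixel next to a foreground object picking up occlusion it shouldn't:

```python
# Toy 1D "depth buffer": a near object (depth 1.0) sits in front of a far
# background (depth 10.0). Values and sample radius are illustrative only.
depth = [10.0] * 4 + [1.0] * 4 + [10.0] * 4

def naive_ssao(depth, i, radius=2):
    """Fraction of neighbours in front of pixel i -- with no range check."""
    neighbours = [depth[j]
                  for j in range(max(0, i - radius), min(len(depth), i + radius + 1))
                  if j != i]
    occluders = sum(1 for d in neighbours if d < depth[i])
    return occluders / len(neighbours)

# A background pixel right next to the object picks up spurious occlusion,
# producing the dark halo around the silhouette:
print(naive_ssao(depth, 3))   # 0.5
# A background pixel far from the object stays clean:
print(naive_ssao(depth, 0))   # 0.0
```

Real implementations add a range check that rejects samples too distant in depth, which is one way the artefact can be reduced.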



Shadowing, on the other hand, looks nicer on the PS3 (although this is wholly subjective – some may prefer the 360's dithered look). While actual shadow resolution is the same across both formats, constantly changing depending on the distance from the player, the way in which the shadows are filtered is not.

Essentially, we're looking at PCF filtering on both versions. However, edges are plainly filtered on the PS3, preserving sharpness at the expense of some jittering, while on the 360 the penumbras are dithered in order to create a smoother appearance. This means 360 edges suffer from an apparent graininess, albeit with fewer jittering side effects. Ambient shadowing on the characters can also appear a touch grainy at times too.
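To illustrate the difference between the two filtering approaches, here's a toy Python sketch with our own invented numbers, not the game's actual shadow code. PCF averages several shadow-map tests to give a smooth gradient at the penumbra, while a dithered approach takes a single jittered tap per pixel, trading smoothness for grain:

```python
import random

# Toy 1D shadow map: an occluder at depth 5.0 covers the left half, the
# receiver surface sits at depth 8.0, so the left half is in shadow.
shadow_map = [5.0] * 8 + [99.0] * 8
RECEIVER_DEPTH = 8.0

def pcf(x, radius=2):
    """Percentage-closer filtering: average several binary shadow tests."""
    taps = range(max(0, x - radius), min(len(shadow_map), x + radius + 1))
    lit = [1.0 if shadow_map[t] > RECEIVER_DEPTH else 0.0 for t in taps]
    return sum(lit) / len(lit)        # smooth 0..1 penumbra

def dithered(x, rng):
    """One jittered tap per pixel: cheaper, but grainy instead of smooth."""
    t = min(len(shadow_map) - 1, max(0, x + rng.choice([-2, -1, 0, 1, 2])))
    return 1.0 if shadow_map[t] > RECEIVER_DEPTH else 0.0

rng = random.Random(0)
print([round(pcf(x), 2) for x in range(6, 10)])   # [0.2, 0.4, 0.6, 0.8]
print([dithered(x, rng) for x in range(6, 10)])   # binary noise at the edge
```

The gradual 0.2-to-0.8 ramp is the smooth penumbra; the dithered version only ever returns fully lit or fully shadowed per pixel, which is the grain we describe on the 360.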

From a distance the shadows in general do look a tad nicer on the 360, but up close the opposite is true. Here, you'll find the graininess of the 360's shadows looks more than a touch unsightly at times.






There are a few other differences between both versions of LA Noire, but nothing that you'd be able to notice whilst playing. Lighting, for example, isn't a complete match in all like-for-like situations.

In some night-time scenes it looks like some light sources have been positioned a little differently between versions, though not all the time, whereas indoors the PS3 seems to occasionally benefit from stronger lighting. The stronger use of SSAO also creates a darkening effect on lighting in any given scene, which accounts for some – but not all – of the oddities we've been finding.



We also find that reflections on some shiny surfaces are sometimes stronger on the PS3 – the intensity of the effect appears to be lower on the 360. As a result it looked like reflections were absent in a few places, when really this wasn't the case at all. That said, none of the above lighting differences are present in every scene, and as such they are more technical curiosities than anything else.

On the whole, there's a real sense that LA Noire is very, very close on both platforms and that the differences aren't exactly going to be a deal breaker for owners of only one system. Sure enough, some parts of the game's graphical make-up can be pretty inconsistent at times. But Team Bondi have created a nice looking title that, while far from perfectly polished, has a lot of soul and character whilst also holding up rather well technically.



Moving on, and LA Noire targets a 30fps update, with the frame-rate being capped at that level. Compared to previous Rockstar titles, performance is much improved on both consoles, but especially so on the PS3.

There are still some similarities with GTAIV and Red Dead Redemption. Like those titles, the PS3 version is v-synced at all times, while the 360 game isn't. As such we see some subtle screen tearing appearing at the very top of the display, although this goes by virtually unnoticed. Tearing only really occurs during gunfire and hectic scenes, or in scenes whereby gunfire results in some environmental destruction, so it isn't an issue at all.

Moving on to frame-rate concerns, we find that performance is better overall in the PS3 game. Both start out running at the targeted 30fps, and both drop frames heavily in stressful scenes with lots going on. However, for the most part it is the PS3 version which copes better in these situations, dropping fewer frames and for shorter periods. In some shoot-outs – and in the on-foot chase sequence near the beginning of the game – we see smoothness take a dip, with frame-rates in the 20s on both formats. But the PS3 version in the same scenes operates between four to six frames higher, looking and feeling more fluid as a result.

Of course, in similar scenes elsewhere the opposite can also be true on occasion – the 360 game can see higher frame-rates than the PS3, though the differences aren't always as pronounced. Here it seems that environmental factors (crowds, cars, scenery etc) are the main cause. Like with most open-world titles, load strongly dictates the level of performance at any given time. Cut-scenes, for example, set in more detailed areas of the game can struggle to reach the desired 30fps, whilst indoors things are far more stable. The same could be said with regards to driving and on-foot sections, where more pedestrians, cars, and scenery on screen cause frame-rates to fluctuate.
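For those curious how figures like these are arrived at, frame-rate analysis of this sort boils down to simple arithmetic on captured frame times. A minimal sketch, using invented sample data rather than real captures:

```python
# Derive average fps and time spent over budget from a list of frame
# times in milliseconds, against the game's 30fps cap.
TARGET_MS = 1000.0 / 30.0   # 33.3ms per frame at 30fps

def summarise(frame_times_ms):
    total = sum(frame_times_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / total
    slow = sum(t for t in frame_times_ms if t > TARGET_MS + 0.5)
    return round(avg_fps, 1), round(100.0 * slow / total, 1)

steady   = [TARGET_MS] * 30                  # a clean second at 30fps
shootout = [TARGET_MS] * 15 + [50.0] * 15    # half the frames slip to 20fps

print(summarise(steady))     # (30.0, 0.0)
print(summarise(shootout))   # (24.0, 60.0) -- mid-20s average under load
```

A scene where half the frames take 50ms averages out at 24fps, which is exactly the sort of mid-20s dip we describe in the shoot-outs above.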

In any case, performance in LA Noire is far more solid on the whole compared to both GTAIV and Red Dead Redemption on both formats – especially on the PS3, where Team Bondi's optimisations for the platform result in a smoother overall experience, but without paring back on any details. The PS3 version features slightly longer draw distances and LODs, but also manages to perform more consistently across the board. Although, as good as that sounds, both versions are equally playable, and we often find a pretty consistent 30fps update during play, which is the main thing really.



Beyond the close multi-platform conversion between PS3 and 360, the developer's biggest achievement comes with the inclusion of a stunning facial animation system, which provides perhaps the most uncannily realistic, and downright believable looking characters we've ever seen in a videogame.

Actors' performances are digitally scanned in 3D, and detailed, animated models are created automatically from this data. These then have to be pared back significantly to work within the technical budgets of the home consoles; geometry complexity is hugely reduced, and in its place large amounts of multi-layered normal maps are used, in combination with various textures, which are then blended to deliver the final in-game model.



As you can see above, the end results are stunning to behold: intricate muscular movements are handled with ease, subtle changes in wrinkles and skin distortion are present as facial expressions change, and eyes and lips move perfectly in sync with dialogue. Essentially, the actor's original performance is recreated extremely accurately in 3D, and it works very well in bringing a real sense of immersion to the proceedings.

However, the downside of this technique is that facial movements cannot be tweaked by hand (there's no animation rig to modify), and additional skin texture details and shaders are troublesome to add without breaking the seamless look created by the original scan. As a result there are no advanced shaders present at all – facial models simply consist of various normal maps and textures, a colour map, and a Phong specular map – which leaves them looking a touch flat compared to hand-crafted creations.

When compared to the limitations of traditional motion capture, the benefits of digitally scanned 3D performances outweigh any negatives that go along with the technique. The lack of any additional shaders, or even environmental reflections on the faces, hardly takes away from the overall look of the game. Instead, the slightly plain, washed-out appearance of the characters fits in perfectly with the hazy 1940s art style the developers are going for. By far the most important element here is the believable performance being captured and displayed correctly; it is central to the core experience.

On the other hand, the rest of the character animation has been motion captured the traditional way, before being touched up by artists for use in the game. As such, we see some odd mismatches and errors in the way the facial performance and bodily animations blend together. For the most part the two are pretty seamless, but sometimes it can be a little noticeable that there are two different systems at work, especially when the end result isn't quite as polished across the entire range of animations on the characters.

This is of course very minor in the grand scheme of things, and has no impact on your enjoyment of the game itself, nor does it spoil any of the superb acting and capturing work done throughout. Like with the rest of LA Noire, Team Bondi have done a great job.



Looking back at past Rockstar releases, it's pretty obvious that the 360 has been the main focus throughout development, with both Red Dead and GTAIV heavily built around specifics which favour the 360 as a platform – long draw distances, plenty of intricately detailed scenery, and loads of alpha-based effects, all of which are dependent on having huge amounts of memory bandwidth available, along with a hefty amount of vertex shading capability.

Using a custom engine set-up, Team Bondi have delivered a title which does the opposite. It has been carefully optimised around the PS3's strengths and weaknesses, not only resulting in far better overall performance, but also in a superb multi-platform conversion that is incredibly close on both formats while favouring Sony's system.

There really isn't much in it at all. There's a real sense that SSAO is perhaps more balanced and realistic on the 360, but other areas of the game – LOD, shadowing, performance under load – slide tangibly towards the PS3, albeit subtly in most cases. As such, both versions of LA Noire come highly recommended, but for owners of both consoles the PS3 version provides a slightly more polished experience overall, and thus is the one to get given the choice.

In the end, LA Noire's long-term success will largely depend on how players react to its unique blend of streamlined sandbox-style gameplay and the more tightly controlled, scripted story progression that is paramount to the experience as a whole – just as much as on the lavish trimmings of 1940s LA and the key graphical talking points: the actual performances themselves, and their superb recreation in computer-rendered form.

Either way you can't deny that Rockstar and Team Bondi have offered up a compelling ride that blends the core elements of film with the interactive parts of a traditional videogame, and does so without any overly self-aware, arty undertones.

Saturday, 7 May 2011

Tech Analysis: Gears Of War 3: Multiplayer Beta

Gears Of War 2's multiplayer component, while highly entertaining and incredibly well put together, featured some obvious latency issues compared to other high-profile online titles. Some matches would go by without issue, while in others you would struggle to maintain aiming accuracy as controller lag impacted the experience. Since then Epic has listened to these concerns, and for Gears 3 has provided the community with dedicated servers, helping to reduce latency to negligible levels.

The gameplay has also been completely overhauled, now catering for different play-styles via what appears to be a balanced weapon selection (at this stage at least). Close and long-ranged weapons play a huge part – often in providing a counterbalance – with players able to switch between either option at any time. Here, you'll find someone with a Retro Lancer to be just as deadly as someone with a Sawed-Off; it depends entirely on their own ability and approach in any given situation. The game modes have also been reworked and made a little more accessible to those unfamiliar with GOW's own brand of multiplayer action.

Of course, gameplay and graphics often go hand in hand. The Unreal Engine 3 and Epic's Gears Of War series have been at the very heart of pushing high-end visuals on Microsoft's Xbox 360 console. Not only has the engine been behind some of the most graphically superb releases on the platform, it is also constantly undergoing a range of subtle tweaks and major enhancements.

Gears Of War 2 saw the engine upgraded to handle bigger environments with longer draw distances – texture and object streaming was refined to prevent noticeable object pop in the campaign mode. Epic also expanded on the lighting component present in the first game, and SSAO brought an increased depth to the title's already decent use of shadowing.

Even in its current state as a multiplayer beta, Gears 3 demonstrates similar leaps in visual quality, with upgrades in a variety of different areas. Perhaps the framebuffer is the only thing exempt from this, but in any case image quality was never really an issue to begin with.



For the record, you're looking at a native 720p rendering resolution forgoing the use of anti-aliasing. In past titles, AA was performed pretty early on in the rendering cycle, meaning that it had practically no impact on the final scene being rendered. Sampling was only done on static objects and lighting, so additional post-processing elements – such as depth of field, bloom etc – and dynamic lighting weren't covered by the limited use of MSAA at all. As a result, Epic has disabled the use of edge-smoothing completely, perhaps in order to maintain more consistent performance while the game benefits from a range of graphical improvements.
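To make the ordering problem concrete, here's a schematic sketch – our own reconstruction of the general principle, not Epic's actual pipeline – showing that only work submitted before the MSAA resolve benefits from the extra samples:

```python
# Schematic render order: stages before the resolve are anti-aliased,
# stages after it operate on a single-sample image. Stage names are
# illustrative, taken from the effects discussed in the article.
PIPELINE = [
    "static geometry",      # before resolve -> anti-aliased
    "static lighting",      # before resolve -> anti-aliased
    "msaa_resolve",
    "dynamic lighting",     # after resolve -> aliased
    "depth of field",       # after resolve -> aliased
    "bloom",                # after resolve -> aliased
]

def covered_by_msaa(stage):
    return PIPELINE.index(stage) < PIPELINE.index("msaa_resolve")

print(covered_by_msaa("static lighting"))   # True
print(covered_by_msaa("depth of field"))    # False
```

With most of the eye-catching work happening after the resolve, the MSAA samples were doing very little for the final image, which makes dropping edge-smoothing a reasonable trade.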

The lack of AA means that we find a variable level of jaggies and shimmering edges. Stages featuring copious amounts of sub-pixel elements suffer the most, whilst areas with mainly large geometric structures fare much better. This is of course in line with most UE3 titles, and in the case of Gears 3 can be considered a worthy trade-off considering the number of high-quality components in the game's rendering tech. It's also worth pointing out that aliasing is less noticeable overall than it was in Gears 2.


Looking back at past titles, it was the single player campaign that trumped any of the multiplayer modes graphically. And it's very likely that we'll see a similar kind of visual leap again with the campaign in Gears 3. The current beta exceeds past instalments in most areas, which leads us to believe that Epic have even more visual mastery packed away under the hood.

Compared to Gears 2, the various upgrades come thick and fast. At the base level we see more detailed character and environment modelling, both in texture work and in geometric complexity. Little cracks and other small details that feature on the surrounding environmental stonework are more distinct than before. In particular, the depth of the parallax-mapped floors has even more of an impact, even if it does look just a little too OTT as a result.

There are a few obvious low-resolution and repeated textures dotted about, along with some rather flat looking surfaces. But these do very little to spoil the overall look of the game considering how detailed the environments actually are. The leap in quality is huge compared to Gears 2's multiplayer mode, and even exceeds the single-player campaign portion of it with ease.

Additionally, there is little in the way of harsh LOD transitions at the beginning of each match. In Gears 2 there was a tendency for the engine to still be streaming in final-quality assets for at least a minute or two after matches had begun. But thankfully, this issue is pretty much non-existent in Gears 3.


The biggest change, however, comes in the form of the game's lighting system. Like Crysis 2 and Battlefield 3, Gears 3 features its own implementation of global illumination via Epic's Lightmass solution, which provides a cheap alternative to the few real-time GI systems doing the rounds right now (Geomerics' Enlighten, for example). Ambient lighting, along with the main light bounces, is rendered offline before being pre-baked onto the environments, providing a similarly realistic look but without the raw processing costs usually associated with computing this in real time.

Dynamic lightmaps ensure that the lighting data actually affects moving objects in real-time, rather than having no impact whatsoever. Given that the sun's position never changes, by far the most important factor is believably lighting up the various dynamic objects present in any given scene – lighting that never moves can be convincingly baked, so long as it still affects environmental objects in real-time. And the implementation found in Gears 3 does an excellent job in this regard.
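The principle is easy to sketch: the expensive bounce lighting is computed once, offline, into some form of light grid or lightmap, and moving objects simply sample it at runtime. A toy Python example with invented values, not Lightmass itself:

```python
# Irradiance "baked" offline into a tiny 1D grid running from a dark
# corner (0.1) to a bright doorway (1.0). Values are illustrative.
baked_irradiance = [0.1, 0.3, 0.8, 1.0, 0.6]

def sample_baked(x):
    """Linearly interpolate the baked grid at continuous position x."""
    i = min(int(x), len(baked_irradiance) - 2)
    t = x - i
    return baked_irradiance[i] * (1 - t) + baked_irradiance[i + 1] * t

# A character walking through the scene picks up the baked bounce light
# frame by frame, at the cost of a single interpolation per sample:
for pos in (0.5, 1.5, 2.5):
    print(round(sample_baked(pos), 2))   # 0.2, then 0.55, then 0.9
```

The runtime cost is a cheap lookup, which is exactly why this approach suits a fixed sun position: all the expensive light transport happened before the disc was pressed.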


We also see that sunlight now casts dynamic shadows off select objects, further enhanced when combined with the game's pre-baked shadowing components. Other elements that give off light - such as gunfire and explosions, burning parts of the scenery etc - react with both the environment and the objects contained within. Although they did this in Gears 2 as well, the effect is more refined in this sequel.

Even if the baseline environmental lighting is static, the combination of dynamic lightmaps and additional light sources help breathe life into the way in which scenes are lit and shaded. The difference is dramatic to say the least; there's a whole lot more depth added to the scene over the previous games.



Outside of Lightmass, there are a few other nice touches that Epic have made to the lighting engine in Gears 3. A probable FP10 buffer and a controlled use of bloom substitute for true HDR lighting without over-emphasising the top end of the spectrum, while the game now features stronger use of simulated godrays (they are simply a post-process-based effect). As you can see above, in some stages sunshafts dynamically react with the environment, while in other areas they are more static in nature.

Additionally, the way SSAO is handled has also been tweaked slightly. SSAO is done as a post-process in both GOW 2 and 3, with the effect being reprojected and accumulated over multiple frames. The end result is that it could appear more noticeable in still scenes in GOW 2 than when moving. In Gears 3, however, the temporal side effects have been reduced, and there is now greater consistency in deploying the effect throughout the scene.
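The trade-off behind this kind of temporal accumulation can be shown in a few lines of Python (an illustrative model, not Epic's implementation): each frame blends a new noisy AO sample with the reprojected history, and the history weight dictates how quickly the result reacts when the scene changes:

```python
# Exponential accumulation of a noisy per-frame AO value against history.
def accumulate(history, current, history_weight):
    return history_weight * history + (1 - history_weight) * current

def converge(samples, history_weight):
    value = samples[0]
    for s in samples[1:]:
        value = accumulate(value, s, history_weight)
    return round(value, 3)

# The true AO suddenly jumps from 0.0 to 1.0 (say, the camera moved).
# A heavy history weight lags behind, i.e. visible ghosting; a lighter
# weight catches up faster at the cost of more frame-to-frame noise:
print(converge([0.0] + [1.0] * 4, 0.9))   # 0.344 -- still far from 1.0
print(converge([0.0] + [1.0] * 4, 0.5))   # much closer to the true value
```

Reducing the reliance on history is one plausible reading of why the effect now behaves more consistently in motion than it did in Gears 2.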


In terms of alpha-based effects, smoke in particular features convincing depth, being heavily multi-layered while also appearing volumetric. Particles, fire and smoke look much fuller than they did in Gears 2. The way in which they interact with the surrounding geometry is almost bug-free – you can see clearly in the shots below how the smoke collides with the nearby walls, rather than simply clipping through them. Alpha buffers are also rendered at full resolution, taking full advantage of the high-bandwidth eDRAM present in the 360's Xenos GPU.



Moving on, when looking at the online element in any game, the importance of having a consistent frame-rate is paramount. It's not so much about ensuring you have a visually smooth experience (though that certainly helps), but instead one with as little controller latency as possible. Essentially the two do go hand in hand, but with previous Gears games this was limited by the lack of dedicated servers, introducing additional latency beyond that caused by the game slowing down.

Although it's likely that we'll see greater variation in performance during the single-player campaign, due to the engine pushing much larger environments and more strenuous action, in the beta Epic has done well to keep things under control in order to deliver as little input lag as possible. And this is something that has been further addressed by the use of dedicated servers. In the beta not everyone will be on dedicated servers at all times, although performance on these alternate hosts is improved compared to the ones powering GOW2's online component.



Performance-wise, Gears Of War 3 operates at 30fps and employs v-sync in order to maintain image consistency. However, we also see that v-sync is disengaged when the frame-rate drops below 30fps in order to maintain a steady level of performance in scenes which stress the engine. To that effect there is some barely visible tearing that manifests by crawling up and down the screen, which can frequently occur during gunfire, but its impact goes relatively unnoticed for the most part.
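This "soft" v-sync policy is simple to express. The following is a sketch of the general behaviour as we understand it, not Epic's actual code:

```python
# Adaptive v-sync at a 30fps cap: hold for the sync interval when the
# frame finished in time, otherwise present immediately and accept a tear.
VSYNC_INTERVAL_MS = 1000.0 / 30.0

def present(frame_time_ms):
    if frame_time_ms <= VSYNC_INTERVAL_MS:
        return "wait for v-blank"    # no tearing, steady 30fps
    return "flip immediately"        # tear, but no extra latency added

print(present(28.0))   # wait for v-blank
print(present(41.0))   # flip immediately
```

The alternative, waiting for the next v-blank on a late frame, would lock the game to 20fps or worse in heavy scenes, so the occasional tear is the lesser evil for a shooter.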

While the game manages a stable frame-rate for extended periods of time, there are a few drops in smoothness to speak of when the engine is put under stress; mainly in scenes whereby multiple players are present and lots of gunfire and explosions create additional load. But outside of these scenarios there is little in the way of slowdown to impact on the experience, which also ensures a crisp controller response is maintained as often as possible.

Also, like many games this generation (Alan Wake and Vanquish both spring to mind), GOW3's smooth refresh rate is further enhanced by the inclusion of motion blur. Going back to the first game, the Gears series has always used a vector-based motion blur implementation, but this is now handled on a per-object basis. As you can see below, the effect varies in strength – ranging from rather subtle to in-your-face, producing a nice screen distortion effect in the process – but also serves to seamlessly blend individual frames together to create a more fluid screen refresh.
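The basics of a per-object vector approach are straightforward to sketch in illustrative Python, with made-up positions rather than anything from the game itself: each object's screen-space velocity (current position minus last frame's) sets the direction and length of its blur:

```python
# Per-object vector motion blur: derive a velocity from two frames'
# screen positions, then stretch blur samples back along that vector.
def velocity(prev_pos, curr_pos):
    return (curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1])

def blur_taps(curr_pos, vel, num_taps=4):
    """Sample positions trailing behind the motion vector."""
    return [(curr_pos[0] - vel[0] * i / num_taps,
             curr_pos[1] - vel[1] * i / num_taps)
            for i in range(num_taps)]

v = velocity((100.0, 50.0), (108.0, 50.0))   # object moved 8px right
print(v)                                      # (8.0, 0.0)
print(blur_taps((108.0, 50.0), v))            # taps trail the motion
```

Because each object carries its own vector, a fast-moving Locust blurs along its own path while the static scenery behind it stays sharp, which full-screen camera blur alone cannot do.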



Overall, even in its unfinished state as a multiplayer beta, Gears Of War 3 is clearly shaping up to be another noteworthy graphical showcase for the 360. The real test, of course, will come in the form of the single-player campaign, in which larger environments and distinct set-pieces are likely to push the engine a lot harder than anything we've witnessed thus far. But for now things are looking rather promising, with Epic successfully using its underlying tech to craft a suitably lavish visual experience. Going back to Gears 2 and then to the beta, there's a real tangible difference to be felt in the overall quality of what we are seeing.

Coming away from the look of the game, there is also a sense that Epic are trying to further hone Gears Of War 3 as a distinct online experience, separating it from past titles and from other online shooters in order to keep things fresh. Not only is the gameplay faster, but the close/ranged set-up, for example, seems balanced enough for players to compete on an even keel regardless of their play-style, while the skilfully deployed weapon placements ensure that mastery of the maps themselves, and the calculated strategy that goes along with them, isn't lost on those more inclined to simply run and gun their way to the top.

Saturday, 16 April 2011

Wii 2 Details Revealed - Round Up

The idea behind remote gaming isn't a new one. While OnLive attempts to bring the concept to the forefront on a grand scale (cloud computing), companies like Sony have dabbled in similar ventures on a much smaller level. With the PS3, for example, you could directly stream videogame content from the unit to the PSP, using the handheld as a portable PS3 controller while also being able to play select games on it. This was a headline feature touted by Factor 5's Lair, although it goes unused by most developers.

Now a similar concept is said to be at the very heart of the Wii 2. Nintendo's latest machine will apparently feature a 6" LCD screen directly on the controller – much like the Sega Dreamcast and its VMU display – allowing users of Nintendo's new console to wirelessly stream their games directly from the Wii 2 to the system's controller for portable play.

So far, from what we gather, this appears to be just for local wireless play at present, although there's nothing stopping Nintendo from expanding this concept online via a broadband connection. At this point, it should be noted that the PS3 (with Lair) managed to achieve game streaming over wireless broadband regardless of your location. The quality and speed of both your connection and the remote wireless network dictated how laggy an experience you got, with fast connections obviously allowing for far less input latency.

Additionally, Nintendo's controller for their next machine will reportedly feature the standard dual analogue sticks, four buttons and a d-pad missing from the Wii's Remote and Nunchuck set-up. Other interesting tidbits include backwards compatibility with the Wii (emulation in HD, perhaps) and that the new controller's LCD display actually has touch-screen functionality. This LCD screen is also said to be 'HD'. However, that doesn't necessarily imply a native 720p resolution, as the NGP and iPhone 4 have shown.

The company is also said to be targeting the hardcore gaming audience with the Wii 2 after languishing behind in this area with the current Wii and NDS consoles, thus the return to traditional controllers.

As a result, sources have been told to expect HD graphics from the console (a given, really), and a system that exceeds the capabilities of the PS3. It has been indicated that Nintendo could be using a custom three-core IBM PowerPC CPU, and an ATI R700-family GPU with shader model 4.1 extensions. Curiously, the inclusion of a mere 512MB of RAM seems a little low; we expect that 1GB is far more reasonable for a machine releasing in 2012. But this is all hearsay and conjecture at the moment.

One thing we do know is that Nintendo is planning to fully unveil the machine, along with both first and third-party software for the system, at this year's E3, with an announcement forthcoming ahead of the show.

Sources cited: GameInformer, Develop, and CVG.

Friday, 1 April 2011

Crysis 2 Gets Custom Graphics Settings

Crytek raised a few eyebrows when they revealed that Crysis 2 would feature 'fixed' graphics modes rather than the option to individually customise certain visual elements of the title. Instead of users having the ability to set levels of anti-aliasing and shadow quality, along with enabling and disabling various effects (SSAO, colour grading, etc), there are simply three distinct options in their place – modes outlined as 'high', 'very high', and 'extreme', each featuring a few additional upgrades in certain areas.

However, it is possible to tailor the various graphical components of the game via the internal config file, which also meant that a patch could be developed to allow easy access to all those settings. And that is exactly what has happened.


Wasdie, a member of the MyCrysis forums, has developed a GUI that enables the end user to set a wide range of parameters governing the game's graphical output. These cover everything from shader quality to the choice of what kind of anti-aliasing you'd like to use.

This GUI is installed and then run before the user loads up Crysis 2, and helpfully labels each of the various settings with the same naming conventions given to the title's three standard graphical modes. In addition, it is possible to downgrade the game further still, with access to 'Low' and 'Mid-spec' settings only present in the leaked build of the game.
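Under the hood, tools like this simply write console variables (CVars) into a config file that the game reads at start-up. A minimal sketch of what such a file might contain is below - the sys_spec_* naming follows CryEngine's convention, but the exact CVar names and value mappings shown here are assumptions and may differ between builds:

```
sys_spec_Shading = 3
sys_spec_Shadows = 2
sys_spec_PostProcessing = 1
r_MotionBlur = 0
```

The first three lines would set per-category quality levels (roughly 1 = High up to 3 = Extreme in this scheme), while the last toggles an individual effect off directly - exactly the kind of per-component control the fixed graphics modes hide away.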

The most compelling part is that the ability to use MSAA - with the option checked in the GUI and edge-AA set to off - further increases the gap between consoles and the PC, with just 4xMSAA delivering a noticeable boost in image quality over Crytek's own temporal anti-aliasing solution. While we're not sure how much this affects overall performance, we do know that Crysis 2's temporal AA solution was relatively light on system resources in comparison. This is definitely something we'll be aiming to test as soon as we get the time.

Seeing as the range of configurations has now dramatically increased, and visibly higher IQ is now available for those wanting to 'max out' the game as usually intended on high-end gaming rigs, we shall hopefully be taking a closer look at this recent mod in a future update here at IQGamer. Until then, updates on this site might be a little quiet as I'm currently working on the next Face-Off for Digital Foundry. But please, do keep an eye out.

Sunday, 27 March 2011

Tech Analysis: Crysis 2 (360 vs PS3 vs PC)

So here we are, with one of the biggest releases of this year. Crysis 2 finally comes storming out through the gates after a myriad of technical demonstrations and effects showcases designed to big up the CryEngine 3 to the gaming faithful. Crytek are masters of producing high-end visuals that require high-end hardware to run. But what about designing the same cutting-edge content to run on what can only be considered five-year-old, low-end tech?

Well, that is exactly what we're here to find out as we lay out a triple-platform tech analysis of the developer's latest visual spectacle, Crysis 2. First we begin with the consoles, before moving on to our more direct PC comparison.

Crytek have made no secret that their CryEngine 3 technology has been built to scale between different platforms, each with varying specs, while keeping the core components (GI lighting, advanced shader effects, real-time shadows etc) intact. Instead, compromises have been made in other areas, from shadow quality, resolution and LOD, right down to how the parameters for each of these components operate.

While PC owners will eventually get the full, untapped potential of the engine (the game only supports DirectX 9 at launch), console users get a scaled-back revision that impressively implements some of the high-end features found only in the upper settings of the computer version. In that respect they get a nicely balanced trade-off: compromised image quality in exchange for some lovely GI-based lighting, god rays, and other cool touches.

But how does each one fare? Let's get on with it...



As always we start off by taking a look at the framebuffer of both console versions to see how they hold up. While PC owners obviously get a choice of native rendering resolution - regardless of actual hardware specs - on consoles it isn't quite so simple. The framebuffer is restricted by both available processing power and memory bandwidth, both of which are a limited commodity on consoles compared to the constantly shifting nature of PC hardware.

Crysis 2 renders in 1152x720 on the Xbox 360 and 1024x720 on the PS3, with both versions getting the same use of temporal 2xMSAA (multi-sampling anti-aliasing). And as you can see in the opening screenshots above, the two games aren't far off each other with regards to image quality. The 360 game is a tad sharper owing to less horizontal upscaling taking place, but at the same time the difference can often be barely noticeable in motion, and neither build features the clearer IQ of a true 720p game.
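To put those framebuffer figures in perspective, here's a quick back-of-the-envelope sketch in Python, using the resolutions quoted above, showing how much of a full 720p image each console actually renders and how much horizontal stretching the scaler has to apply:

```python
# Rough arithmetic behind the framebuffer figures quoted above: both
# consoles render below the full 1280x720 and rely on hardware scaling
# to fill the screen horizontally.

def upscale_stats(native_w, native_h, target=(1280, 720)):
    """Return native pixel count, its share of a full 720p image,
    and the horizontal scale factor the scaler must apply."""
    native_pixels = native_w * native_h
    full_pixels = target[0] * target[1]
    return native_pixels, native_pixels / full_pixels, target[0] / native_w

for label, (w, h) in {"360": (1152, 720), "PS3": (1024, 720)}.items():
    pixels, share, scale = upscale_stats(w, h)
    print(f"{label}: {pixels} px ({share:.0%} of 720p), {scale:.2f}x horizontal scale")
```

The 360 renders 90% of a full 720p image against the PS3's 80%, which lines up with the mild sharpness edge described above.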

Interestingly, the HUD elements in the PS3 game appear to be upscaled rather than rendered over the final framebuffer. Quite why this is the case isn't known for sure. But what we do know is that the RSX GPU provides extremely low-cost horizontal scaling, and in order to render the HUD after the framebuffer they'd need more memory to do so.



Regardless, both versions still look good, and it seems that edge post-process effects, along with the game's use of anti-aliasing, contribute somewhat to its soft look. More so on the PS3 by the looks of things, due to the additional upscale taking place, but in practice the difference appears less pronounced than in still shots, and thus less impactful in general.

Crytek's anti-aliasing solution - a temporal form of 2xMSAA which appears to be selective in which surfaces it covers throughout the scene - provides little in the way of high-level edge smoothing in highly detailed outdoor scenes. But its effects seem to be variable, with some areas - particularly indoors - faring better than others. Crytek's AA solution also works on various parts of the scene a little differently, using depth buffer info and edge detection on close objects.

Also, as a result of a frame-blending technique used to create the AA samples from two separate frames, a ghosting effect is present during movement, just like in the demo. Additionally, it appears that the use of frame blending, along with edge post-process effects, tends to blur the image somewhat on all versions of the game.
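The ghosting follows directly from the frame-blending idea. A minimal sketch (illustrative only - Crytek's actual sample selection and edge detection are considerably more involved):

```python
# A minimal sketch of why frame-blended temporal AA ghosts. The output
# pixel is an average of the current and previous frames, so a bright
# object that moves leaves a half-intensity trail at its old position.

def blend(prev_frame, cur_frame, weight=0.5):
    """Average two frames; 'weight' is the share given to the previous frame."""
    return [p * weight + c * (1 - weight) for p, c in zip(prev_frame, cur_frame)]

# A bright pixel (1.0) on a dark background moves one position to the right.
frame_a = [0.0, 1.0, 0.0, 0.0]
frame_b = [0.0, 0.0, 1.0, 0.0]

print(blend(frame_a, frame_b))  # [0.0, 0.5, 0.5, 0.0] - the ghost trail
```

On static scenes the two frames match and the blend simply smooths edges, which is why the artefact only shows up during movement.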



The killer point about Crysis 2, on both consoles and the PC, is the inclusion of Crytek's much talked-about single-bounce global illumination (GI) lighting system, whereby sun lighting features a singular, real-time bounce that for the most part accurately resembles real-world light occlusion.

This also means that all shadows and light sources are rendered in real-time throughout the game. The effects of this are outstanding as a whole. Ambient light and shadows are cast, while the main light bounce lends a deep, atmospheric look to the proceedings. Along with this we get the usual lens flare and bloom effects, plus the addition of real-time, 'proper' sun-shafts too. All of which are equally represented on both PS3 and 360, except for some mild additional light occlusion in places on both the PC and PS3 codes, which usually darkens the scene but seems to add a mildly stronger light bounce in places.



Crysis 2 uses deferred rendering in the form of deferred lighting passes in order to deliver many dynamic lights on screen at a lower overall cost than that incurred by traditional forward rendering techniques. It is also an easier and more convenient way for artists to light every scene - they don't have to wait for hours of pre-computation in order to see the end results.
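The cost argument can be sketched with some rough numbers. The figures below are illustrative assumptions (worst-case forward shading, an assumed 10% average screen coverage per light), not measurements from CryEngine 3:

```python
# Back-of-the-envelope figures for why deferred lighting copes better
# with many dynamic lights than classic forward rendering.

PIXELS = 1152 * 720  # the 360 framebuffer from earlier

def forward_cost(pixels, lights):
    # Forward shading: in the worst case every shaded pixel
    # evaluates every light in the scene.
    return pixels * lights

def deferred_cost(pixels, lights, avg_coverage=0.1):
    # Deferred lighting: one pass fills the G-buffer, then each light
    # only shades the pixels it actually covers (assumed 10% here).
    return pixels + lights * int(pixels * avg_coverage)

print(forward_cost(PIXELS, 50))   # worst-case forward shading work
print(deferred_cost(PIXELS, 50))  # same scene with deferred lighting
```

With 50 lights the deferred figure comes out roughly an order of magnitude lower, which is exactly the head-room a lighting-heavy renderer like this needs.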

However, use of GI also comes with additional costs - namely memory bandwidth and computational power. Real-time shadows and occlusion mean more alpha on screen, while having to calculate the lighting bounce on the fly means more processing power needed per pixel. As a result we have already seen a reduction in the resolution of the framebuffer on each version, but there are many other parts of the game - visually - that have seen some compromises in order to accommodate what is arguably the most impressive use of lighting in any console game to date.

By far the most obvious of these is Crysis 2's use of LOD and texture/object streaming. While there is a slight sense that LOD has seen an improvement over the 360 multiplayer demo, we still see many objects that pop in noticeably as you approach them. For example, foliage transitions between low and high quality assets fairly close to the player, and buildings and other geometric objects are also affected. Often this can look more than a tad unsightly, but sometimes it can also go by without much notice - it all depends on just what the engine is rendering at any given time.

In our demo analysis we also mentioned in closing that a pre-release config file listed LOD parameters that were similar, or perhaps even identical, across both PS3 and 360. Usually it is the PS3 code that comes off worse in this area, with less memory and bandwidth to accommodate the same levels of draw distance and LOD update as the 360 in most cases. But in the finished retail game we find that not to be the case - bar one or two oddities that only occasionally stand out in certain circumstances.




'High' settings

The most noticeable difference is how shadows are handled on both formats. Shadow LOD appears to be a tad weaker on the PS3, with some elements either being rendered in much later, or not at all. Self-shadows also suffer from the same problem.

Additionally, SSAO initially seems to be cut back in the PS3 game. Notice in the above shot, by the concrete blocks on the ground, that the effect is present on the 360 but appears to be missing entirely on the PS3. However, this isn't actually the case at all - SSAO in the PS3 game is in fact a closer match to the PC game.

The PS3 code also benefits from some advantages in other areas. First up, and as you can see below, shadows are filtered using a higher quality implementation compared to on the 360.




'High' settings

And secondly, texture filtering has also been given a significant boost, with bias towards certain surfaces giving ground and environmental textures a cleaner look compared to the 360.

According to Crytek, both games use dynamic AF (anisotropic filtering), but in the PS3 version we can see what looks like between 2x-8x filtering, compared with much lower amounts in many places on the 360 - about 4x at most, from what we can see. Officially, Crytek say that the PS3 game can switch between 2x and up to 16x levels of AF.



While these differences are perhaps minor in nature, there's no question that the PS3 game's use of AF and higher quality shadow filtering makes a small but tangible difference. Of course, the lower framebuffer resolution partially cancels out the AF - owing to a blurrier image in general - but there's also a real sense that the two versions are remarkably close to each other given the immense task of rendering a hugely detailed environment, and then lighting it all up in real time.

Far more important, however, is how they both perform whilst delivering such intricate visual complexity. And in this respect neither is particularly impressive, with plenty of impactful frame-rate drops reducing controller response to unacceptable levels. But this is only half of the story. Performance in Crysis 2 is heavily bound by load, with the 360 version leading in a general sense, but with the PS3 occasionally taking the lead in some chaotic scenarios.



Crytek are targeting a 30fps update for the console releases of Crysis 2, and going by the 360 multiplayer demo code at least, they were doing a rather good job of maintaining it, with very little in the way of frame-rate drops and no screen tearing. However, the single-player mode is vastly different; there's much, much more going on, and all of this has a mighty impact on how well the engine can cope as a result.

While the results aren't pretty, both versions are at times very close. It's not always the case that one has a distinct advantage over the other, and both have various ups and downs with regards to keeping to that 30fps, just in different scenarios. It's often the PS3 version that falls short more than the 360, but not always.

On both platforms Crysis 2 runs with an uncapped frame-rate and with v-sync enabled. However, neither version hits the targeted 30fps mark for very long, and barely goes above it. In fact, they both constantly fail to reach it in many situations, with or without heavy load. In addition, there is a small difference in how the two versions deal with holding v-sync: the PS3 maintains it throughout, while the 360 game seems to drop it very briefly, perhaps in order to maintain a slightly higher refresh. Tearing, however, only occupies the very top of the screen and isn't noticeable during play.
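It helps to think of those frame-rates as per-frame time budgets. A small Python sketch of the arithmetic:

```python
# A 30fps target gives the engine roughly 33.3ms to build each frame;
# the drops into the low 20s described here mean each frame is running
# well over that budget.

def frame_time_ms(fps):
    """Convert a frame-rate into the milliseconds available per frame."""
    return 1000.0 / fps

budget = frame_time_ms(30)
print(f"30fps budget: {budget:.1f}ms per frame")
for fps in (25, 22, 20):
    over = frame_time_ms(fps) - budget
    print(f"{fps}fps means {frame_time_ms(fps):.1f}ms per frame ({over:+.1f}ms over budget)")
```

With v-sync enabled a frame that misses its budget has to wait for the next display refresh, which is why missed targets show up as sustained drops rather than a smooth slide.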

As the video suggests, performance in Crysis 2 can be hugely variable; one minute the game is ploughing along smoothly at the desired screen refresh, and then, in the blink of an eye, it will drop right down to the mid-to-low 20s for extended periods of time. The main cause seems to be combat situations - and not necessarily heavy combat either. Slowdown occurs when small, one or two man firefights break out, and increases dramatically when more starts going on.

Thankfully, Crytek have still managed to keep the game playable regardless - in practice down more to having to work your way around situations rather than treating the whole game as one giant shooting gallery. And seeing as playing any other way than with ample thought - and especially a good bit of sneaking and enemy avoidance - makes the challenge a touch too much to bear, this negates the issue somewhat.

In any case, the fact that the game still throws everything but the kitchen sink into the console code means that having it run at all in a playable state - even with the above shortcomings - is a pretty mean feat regardless. That said, one could easily argue that performance should have been better, with more careful optimisation, and maybe a few more small cutbacks here and there, in order to better achieve that 30fps target.



Despite a few shortcomings, including a loss in IQ, some obvious LOD and streaming issues, plus a decidedly unwieldy frame-rate, Crysis 2 is a technical marvel on consoles. Say what you will about those compromises, but the fact that the game provides both scale and detail whilst delivering a fully real-time shadowing and GI lighting system is itself an incredible feat. The game looks absolutely stunning!

Of course, there are smoother games out there, ones which perhaps excel in other particular areas. But on the whole, Crytek's latest shows that in many ways these five-year-old consoles can still handle - albeit at a significant cost - similarly high-end visuals to those that grace today's top-of-the-range PCs.

More impressive - although not really much of a surprise considering that an earlier config file hinted as much - is how well the PS3 code actually matches up to the 360 and PC versions. It's basically as good in many ways, with some very small improvements over the 360 game, and one or two slight cutbacks, none of which impact on the game as a whole in any meaningful way.

As to which one you should pick up? Well, it's fair to say that both versions come recommended, but perhaps the 360 build edges it ever so slightly with mildly better image quality and a smoother frame-rate. But really, it makes very little difference, and I personally found aiming a little easier when using the PS3's analogue sticks, despite the sometimes heavier drops in frame-rate.

So, Crysis 2 has successfully made the translation to console, but how well does it hold up compared to the cutting-edge PC game, and is there any evidence to suggest that it is this version that has in fact been downgraded in order to support the consoles? Moving on to our triple format comparison, let's take a closer look.




'High' settings

The first thing to note is that, unlike the console versions of the game, Crysis 2 on PC can be made to run natively in a range of different resolutions, and as a result, even at 720p the game looks noticeably sharper than its Xbox 360 and PlayStation 3 counterparts. In addition, there are three distinct graphical settings designed to scale various features in order to gain better performance on lower-spec systems, or to deliver extra visual impact on more powerful, higher-end machines.

Crytek state that the PC game on its lowest setting, 'high', is a good approximation of what the actual console versions look like, but with further tailoring of effects specifically for the computer platform. That said, both console versions actually shape up really, really well in comparison - minus the obvious drop in resolution.

As you can see above, the PC game's true 720p output delivers a noticeable sharpness advantage, but we can also see that the lighting has been given an upgrade. More light sources are visible, with the lighting bounce having more of an overall impact, casting more shadows as a result.

The other big difference comes in the form of how the PC version handles LOD - it's far less intrusive, with details being loaded in much further away from the player. This is down to the fact that on the PC the game doesn't stream in any assets at all. Instead, it takes advantage of the much larger amounts of memory available on the format in order to avoid doing this, preserving object quality across the entire game.


'High' settings


'Very High' settings


'Extreme' settings

The difference though isn't alarmingly huge, and there are many aspects of the PC code which look virtually like-for-like across all three formats, with texture detail being the prime example. In order to really see how far Crytek have tailored their engine to run on mid-to-high-end hardware, we have to look at the game's highest graphical setting, 'extreme'.

Here we start to see the PC version push ahead with a small wealth of graphical upgrades, although some are pretty subtle in nature and could easily be overlooked given the quality of both console conversions. The most obvious change is that both lighting and shadowing have been upgraded. Not only does practically every light cast a shadow, but lighting in general seems to be more refined and accurate in nature. SSAO too has been given a higher precision implementation, looking a touch cleaner as a result.


'High' Settings


'Very High' settings


'Extreme' settings

Additionally, Crytek's custom temporal MSAA solution offers better coverage across the entire scene, albeit with an increase in image blur. There's less in the way of visible aliasing, giving the game a cleaner look on the whole. However, this upgrade is only present when running the PC game on the 'extreme' setting. When playing in either 'high' or 'very high' the game appears to use fewer samples to generate the AA, with similar levels of edge smoothing to those found on the PS3 and 360 versions - but the game looks sharper as a result.


'High' settings


'Extreme' settings

In other areas we see a few more subtle upgrades. The game's representation of water, for example, appears more complex in nature. As you can see above, there are greater amounts of ripples and waves on show, while the effect is animated with more accuracy than on consoles; a direct result of both an increase in geometry and better normal map blending being used to recreate the effect.

Furthermore, other elements such as motion blur and depth of field benefit from higher quality implementations. In particular, DOF features an additional layer in the distance not present in the console versions, whilst its resolution is also higher. Motion blur is more pronounced, but the results appear cleaner than in either console game, even as the amount of screen distortion increases.



'Extreme' settings

Moving on, one of the most important factors of PC gaming is being able to bump up the resolution, plaster on all the top-end effects and still run the game smoothly. And while performance across both 720p and 1080p modes (hardware dependent of course) is noticeably better than on the 360 and PS3, there's also a real sense that bumping up the resolution does very little to improve the visuals compared to selecting one of the higher graphics modes.

Below we've listed some shots to show our findings. The two are grabs from the game running in 'extreme' mode, showcasing how 1080p fares in comparison. And as you can see, the difference isn't particularly spellbinding. Granted, on a native 1080p HDTV the jump in sharpness is easily apparent, but at the same time the additional clarity is somewhat reduced by the game's seemingly restrictive art assets. It seems like the baseline visuals were optimised to run across a really broad range of specifications, and thus we see no real leap in detail from moving up the resolution chain.


'Extreme' settings - 720p


'Extreme' settings - 1080p
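The raw numbers behind the two grabs above are worth spelling out. A one-liner's worth of Python:

```python
# 1080p renders 2.25x the pixels of 720p, which is why the sharpness
# gain is easy to spot on a native 1080p display even though the art
# assets themselves don't change between the two resolutions.

def pixel_ratio(res_a, res_b):
    """How many times more pixels res_a renders than res_b."""
    return (res_a[0] * res_a[1]) / (res_b[0] * res_b[1])

print(pixel_ratio((1920, 1080), (1280, 720)))  # 2.25
```

More than doubling the pixel load for no change in asset detail also goes some way to explaining why the higher graphics modes are the better use of spare GPU headroom.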

However, without a doubt, Crysis 2 running in 1080p on mid-range or higher-end hardware is a class act indeed. Graphically, although the improvements are sometimes subtle at best, there's still a sense that you are seeing the game in its most polished form; a form that is unobtainable on any hardware outside that of a decent gaming PC.

Case in point: running on an i5 CPU and an NVIDIA GTX 460 GPU at 720p (1280x720) with v-sync enabled, we get a near-constant 60fps update when using the 'extreme' setting. While running in 1080p, v-synced and on 'high', you get a solid 60fps update. All of which amounts to an experience way beyond the home gaming systems.

But that said, we can't help but feel that the PC game is in fact being held back slightly by both console versions. Clearly, Crytek have focused on designing an engine around running on age-old hardware with most of its effects present and correct, rather than on producing high-end tech that requires the most obscene PCs in order to run. What substantiates this is that on day one there is no DirectX 11 support at all. By contrast, the original Crysis featured a beefed-up DirectX 10 mode on launch, with further upgrades separating it even further from the older DX9-based configurations contained within.

My best guess is that due to spending so much time on delivering two suitably impressive - and very comparable - console experiences, a bleeding-edge, highest-end graphical mode simply wasn't ready in time for launch. Crytek have already confirmed that an 'enthusiast mode' is on its way, along with DirectX 11 support, but whether or not it shall yield the same level of technical superiority that the original Crysis had over its peers is unknown at this point.

Either way, there's still no question that Crysis 2 is a remarkably stunning release on the PC, and what it lacks in pure graphical advantage, it more than makes up for with vastly superior performance. Running the game at 720p, v-synced, and at the console target of 30fps should be no problem for most decent, older PCs, while people with mid-range and cutting-edge hardware will easily be able to blow away both console versions entirely.

Overall, it's pretty obvious that the PC version of Crysis 2 is the one to go for given the choice - and you don't even need the very best hardware to do a good job of running it - but you shouldn't discount the PS3 and 360 builds entirely. What Crytek have managed to achieve on those consoles is nothing short of legendary, and although there are still a fair share of problems which impact on the experience as a whole, they're both beautiful-looking games regardless.

Thanks go out to AlStrong for the pixel counting and Richard Leadbetter for use of Digital Foundry's analysis tools.