Saturday, 22 May 2010

Tech Analysis: Lost Planet 2 (PS3 vs 360)

The original Lost Planet represented exactly how not to do a PS3 conversion. Sticking to the basic approach of porting the overall engine in a like-for-like manner, despite clear architectural differences between the two machines, resulted in one of the worst multi-platform PS3 conversions to come out of any studio at the time.

Missing a large amount of geometry and texture detail found in its 360 counterpart, in addition to featuring low resolution effects and only temporal 2xMSAA, the port lost a large chunk of image quality in the process. It also struggled to maintain a smooth framerate, accentuating the game’s poor use of anti-aliasing and lack of fine detail.

Lost Planet 2 on the other hand is nothing like that dreadful port of the first game. Instead Capcom have built upon the finely tuned refinements they made with the first MT Framework engine on Resident Evil 5, carrying over the optimisations to the new 2.0 version used here in Lost Planet 2. Many of the improvements that were to be found in the PS3 version of that game are also found here. However, whereas Resi 5 demonstrated some significant differences in anti-aliasing, texture filtering and framerate, LP2 is a far closer affair, for the most part achieving platform parity throughout the game, minus a few issues here and there.

Like with Resi 5, Lost Planet 2 is rendered in 720p (1280x720) on both formats, with the 360 getting the standard application of 2xMSAA and the PS3 game getting no AA of any kind. The result is that both versions appear clean and very sharp, with jagged edges surprisingly manifesting themselves in almost equal amounts in certain scenes.

The differences are easily spotted in the shots below, where we can see that both versions look almost like for like, with only very subtle differences that are mainly caused by the two machines’ different internal gamma levels, and by the PS3 version missing a few effects in places.

With regards to the 360 version displaying almost equal amounts of aliasing to the PS3 one, this can be explained by how the game renders its lighting. LP2’s heavy use of HDR and strongly defined light sources creates high-contrast edges, so when the MSAA takes its edge samples the results are so similar to the un-anti-aliased pixels that some parts of the scene effectively receive no AA at all. This means that the screen can crawl with jaggies on both versions, though it is more apparent on the PS3 version as it has no AA to help control the problem.
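To make the sampling maths concrete, here is a toy illustration (my own sketch, not anything from MT Framework) of why a 2-sample resolve offers so little help on these high-contrast edges:

```python
# Toy MSAA resolve: a pixel's final colour is the average of its
# coverage samples, so with only 2 samples an edge pixel can take
# just one intermediate shade between its neighbours.
def resolve_msaa(samples):
    """Average the per-pixel colour samples, as an MSAA resolve does."""
    return sum(samples) / len(samples)

# An edge between a bright HDR light source (1.0) and a dark surface (0.0):
no_aa   = resolve_msaa([1.0])       # centre sample only -> 1.0
msaa_2x = resolve_msaa([1.0, 0.0])  # edge pixel straddling the boundary

print(no_aa, msaa_2x)  # 1.0 0.5
# The lone in-between value of 0.5 still contrasts sharply with both
# neighbours on a high-intensity edge, so the edge continues to
# "crawl" in motion despite the AA technically being applied.
```

On a low-contrast edge that same 0.5 shade sits comfortably between its neighbours, which is why 2xMSAA normally looks fine and only falls apart under LP2’s extreme lighting.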



In terms of texture detail and filtering both versions seem to be pretty much equal in most scenarios, which is particularly impressive given the scale of the environments and the amount of bandwidth stealing particle effects on screen at any given time.

Some subtle differences in texture quality are apparent between both platforms, but they aren’t really all that visible during actual gameplay. In some scenes textures appear more detailed on the 360 than on the PS3. You can also just about see that the 360 version edges it ever so slightly when it comes to fine detail, though you can only see this when scrutinising still screens, and not when the game is in action.


At some points however, there are noticeable cut backs in overall texture quality on PS3. Although this isn't apparent in all areas of the game, when it does happen it definitely takes away from the experience.

Some stages seem to be more affected than others, and below is a clear example.


What is surprising is that both PS3 and 360 versions of the game feature anisotropic filtering (AF). Previously it was pretty much a given that a game on the PS3 would benefit from the use of AF while the same game on 360 would be using only a trilinear or bilinear solution.

Because the PS3 has more texture units in its RSX GPU than the 360’s Xenos, AF basically comes for free on Sony’s machine, whereas on Microsoft’s system there is normally some sort of memory or performance hit for using it, much like the way 2xMSAA is usually commonplace on the 360 but not on PS3.

Either way, both versions benefit from having clean and clear texturing that is visible for several feet into the distance. This was also apparent in Super Street Fighter IV, which first showed Capcom’s improved multiplatform use of AF.



Shadowing looks to be identical between both versions, with any differences being down to the gamma levels of each system. What is noticeable is that in really dark areas of the screen some shadow detail is mildly crushed in the 360 game, with the darkest parts appearing almost completely black instead of clearly showing the faintest of details. The PS3 game, with the console’s higher gamma, manages to retain greater amounts of shadow detail, which shows up far more clearly in dark sections and in character and object shadows.

There are of course downsides to the lighter shadows on the PS3 version, despite the welcome increase in noticeable detail. The sense of depth is slightly lessened, leaving an overall image with less three-dimensionality compared to the 360 game, although the like-for-like quality of the actual shadows means this is more of an observation than a complaint.



Visual effects in general have also seen major improvements in Lost Planet 2, with the vast majority of effects looking the same on both platforms. Again, like with the texturing, certain scenes do take a noticeable hit, while others are practically identical. Smoke and particles are once again slightly lower res in the PS3 game - although not to the extent of the first Lost Planet - and are less noticeable here than they were in Resi 5, particularly with the larger effects, which I believe are the same in both versions.

The shot below shows off the worst-case scenario of the PS3 game missing various effects found in the 360 build. Water and some shiny surfaces seem to be the main areas in which certain effects have been cut back.


Despite these differences in some scenes, it’s pretty impressive how closely Capcom have managed to match the two versions visually, for the most part at least. In motion it’s only the PS3’s lack of AA which consistently shows up crawling jagged edges and a very slight drop in IQ in these areas.

Sadly, there are times when the game looks noticeably worse, though thankfully this doesn't happen all that often, and nowhere near to the level of the first game on PS3. When it does happen, however, it manages to undermine some of the hard work Capcom have done on the conversion, which is a shame, because at times the two versions really do look identical.



So, you could say that it’s mostly par for the course for parity then? Well, not quite: whilst both versions maintain similar levels of graphical fidelity, with some exceptions in certain areas, the same cannot be said for how they run in motion.

Like with Resident Evil 5, the PS3 and 360 versions of LP2 deal with framerate and screen tear differently. The PS3 game holds v-sync in order to prevent any untoward screen tearing, along with what looks like the return of double-buffering – rendering each new frame to a spare back buffer and only flipping it to the screen once it is complete – but at the expense of a stable framerate.

This means that screen tear is pretty much non-existent in the Sony game, but the framerate constantly takes a dive from the targeted 30fps update in busy scenes. In large boss battles and in parts of the game filled with large enemies the framerate drops to between 10 and 20fps, creating what can only be described as a brief slideshow of movement.
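A rough back-of-the-envelope model (my own numbers, not Capcom's) shows why a v-synced, double-buffered game falls to these specific rates rather than degrading smoothly: a finished frame has to wait for the next 60Hz vertical blank, so presentation times are quantised to multiples of roughly 16.7ms.

```python
import math

REFRESH = 60.0  # a standard 60Hz display

def vsynced_fps(render_ms):
    """Effective framerate once a frame must wait for the next vblank."""
    interval = 1000.0 / REFRESH              # ~16.7ms per refresh
    ticks = math.ceil(render_ms / interval)  # refreshes the frame occupies
    return REFRESH / ticks

print(vsynced_fps(33.0))  # 30.0 - frame fits inside two refreshes
print(vsynced_fps(40.0))  # 20.0 - just misses, waits out a third refresh
print(vsynced_fps(70.0))  # 12.0 - a heavy boss-battle scene
```

Under this model the only rates available below 30fps are 20, 15, 12 and 10, which fits the stepped 10-20fps slowdown described above far better than a gradual decline would.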

I also noticed that the controls seemed to be a little more laggy on the PS3, which aroused my suspicions about the inclusion of double-buffering. This isn’t 100% confirmation, but it is a solid assumption based on both the controller lag and Capcom’s previous use of the technique.

The 360 game, on the other hand, allows the screen to tear more frequently but consequently maintains that 30fps update far more often. Interestingly, LP2 actually seems to be at least partially v-synced on 360 – something completely absent from Resi 5 – and this can on occasion lead to terrible drops in fluidity which are pretty unsightly to say the least.

However, this only really happens during certain boss battles, and usually manifests itself in the cut-scenes rather than in actual gameplay, so although its impact is limited, you can’t help but notice it.

This is perhaps the most substantial difference between the two versions of the game, which is a shame, as Capcom have really excelled at making Lost Planet 2, at times, a near-identical experience regardless of which version you own. Everything but the use of AA and the lower-res, pared-down effects is basically the same between both versions - occasional texture issues aside - and it’s only the frequent drops in framerate that really set them apart the majority of the time.


In the end Capcom have pulled off a pretty successful multiplatform title in Lost Planet 2. It may be too much to expect a completely identical release graphically, but most of the glaring flaws and visual differences have been addressed to some degree. Sadly the same cannot be said for the game itself, to which we awarded a rather disappointing 6/10 in our review.

For all the technical achievements the developers have managed to weave together, the underlying gameplay issues and fundamentals almost break the game at times. So much so that for the most part Lost Planet 2 is a partially polished but unsatisfactory experience.

Tech analysis updated: extra screens and further details representing the more severe differences.

Thursday, 20 May 2010

Tech Analysis: Alan Wake

I will warn you now that this tech analysis for Alan Wake is an incredibly long and in-depth affair. With Remedy’s latest there is so much going on - so many individual parts making up the overall look and feel of the game - that after you’ve covered one thing, you discover another.

So, rather than skim over the little details, which might actually have a pretty large impact in the overall scheme of things, what we’ve done is try to deliver the most comprehensive look at the tech making up the world of Bright Falls, one of this year’s most defining visual achievements.

Usually it’s Sony’s PS3 exclusives that garner such attention and technical praise, often delving deep into the hardware to deliver a visual experience that on so many technical levels is almost unmatched by most other titles on competing platforms – PC of course excluded, but that’s a given really. Microsoft on the other hand have a machine with a comprehensive set of development tools that are so much more traditional and easier to get to grips with than what Sony have provided for the PS3, that in most cases developers are quite happy to think inside the established box when it comes to crafting that show-stopping visual showcase expected of current-generation games.

However, some developers are beginning to break out of that cycle, creating graphical experiences which rival those of Uncharted 2, Killzone 2 and God Of War 3. Both Bungie with Halo Reach, and Remedy with Alan Wake, are pushing the envelope of what is possible on MS’s machine, showing gamers and developers that what is possible in a custom-made PS3 title is also possible on the 360.

Last week we took an in-depth look at Halo Reach, casting our critical eye over the overall workings of the tech. Today IQGamer looks closely at Remedy’s Alan Wake, a game no stranger to controversy or our very own tech analysis. For this feature we will finally be evaluating the title, whilst also attempting to fully uncover the truth behind that 540p rendering resolution.

Alan Wake, to cut away any lingering doubts or speculation, does indeed render at 960x540 resolution, which is then upscaled by the 360 to form a final 720p image. Not all aspects of the game are rendered in this sub-HD resolution – some effects actually come closer to 720p – and the game does use an impressive 4x multi-sampling anti-aliasing at all times, a side effect of which is not only a clear reduction in jagged lines but also a much cleaner upscale.

The result of rendering at 540p isn’t quite as bad as you might expect, with fine detail still present, and the soft look of the game actually appearing quite clean and clear compared to some other sub-HD games. No doubt the use of 4xMSAA helps in reducing upscaling artefacts, whilst keeping the image relatively clean in the process. Softness, as obvious as it is, is the by-product, but it actually helps in creating the creepy atmosphere found in the game, with the shadowy and foggy night-time visuals all feeding a real sense of immersion.
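For a sense of scale, the raw pixel counts (simple arithmetic, nothing insider) show just how much work the 540p framebuffer saves:

```python
# Pixel counts for native 720p versus Alan Wake's internal resolution.
native_720p = 1280 * 720  # 921,600 pixels
wake_540p   = 960 * 540   # 518,400 pixels

print(wake_540p / native_720p)  # 0.5625
# Remedy only shade 56.25% of a full 720p frame, leaving the 360's
# hardware scaler to stretch the result - fillrate and bandwidth that
# can instead go on 4xMSAA and the game's many transparent effects.
```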


It’s pretty clear from the screenshot above just how good Alan Wake looks for a game that is decidedly more sub-HD than many others on the market. But why did Remedy opt for such a low resolution so late in the development cycle? And what happened to the claims of rendering in 720p with 4xMSAA?

Well, you only have to look at last year’s gameplay footage to find out, which does indeed render in full 720p whilst sporting 4x multi-sampling anti-aliasing. Before this, Alan Wake was rendering at the standard 720p with 2xMSAA, in addition to having full resolution particle and transparency (framebuffer) effects. However the game suffered from constant screen tearing and drops in framerate, no doubt as the engine struggled to cope with the game’s intense bandwidth requirements.

It is pretty obvious that Alan Wake is an extremely bandwidth-heavy game, rendering all those fog, mist, smoke, and transparent particle effects. Transparencies are littered all over the game’s fictional town of Bright Falls, and the engine was unsurprisingly struggling to cope.

Initially Remedy decided to change from rendering full resolution effects to using half-res alpha-to-coverage (A2C) effects, thus saving some of the bandwidth needed to keep performance up. However, A2C has the unfortunate side effect of giving all transparencies that use it an unsightly screen-door effect. The only solution is to effectively up the level of anti-aliasing to 4xMSAA in order to blend the A2C away into the rest of the scene.
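The mechanism is simple enough to sketch (a generic illustration of the technique, not Remedy's shader code): rather than blending, A2C converts a surface's alpha value into a pattern of covered MSAA samples, and that on/off pattern is where the dithered screen-door look comes from.

```python
# Generic alpha-to-coverage: the alpha value is quantised into a
# per-sample coverage mask. With 4 MSAA samples only five opacity
# levels exist (0/4 .. 4/4), and the dithered sample pattern is what
# reads as a "screen door" until the MSAA resolve blurs it together.
def alpha_to_coverage(alpha, samples=4):
    """Return a coverage mask approximating `alpha` over MSAA samples."""
    covered = round(alpha * samples)
    return [i < covered for i in range(samples)]

print(alpha_to_coverage(0.6))     # [True, True, False, False] -> reads as 50%
print(alpha_to_coverage(0.6, 2))  # [True, False] -> same 50%, coarser dither
```

Doubling the sample count also doubles the number of opacity steps available, which is why stepping up to 4xMSAA was the natural fix for the dithering.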

Despite these changes it’s clear that the game still suffered from severe performance issues, with tearing once again at the forefront. In an interview the developers stated that screen tear would be part and parcel of the experience, a sign that maybe they were having trouble keeping things running smoothly. In light of all this, it seems that in order to claw back performance they finally opted to render at a sub-HD resolution, giving them a smooth 30fps update and very little screen tear, whilst still having all the benefits of 4xMSAA improving the overall image quality (IQ).

In the end the use of A2C does very little to damage the overall look of the game. Alan’s hair, for example (see below screenshot), uses it with very little in the way of noticeable side effects, though if you look closely you can indeed see some of the dithered nature of the A2C at work. The 4xMSAA manages to prevent any noticeable upscaling artefacts from being visible, and the soft look provided by the lower resolution isn’t particularly intrusive.


There’s no doubt that the game would look much sharper rendering at 720p over the current 540p framebuffer; however it is likely that many of the outstanding graphical effects and small visual touches would have had to be sacrificed in order to keep performance levels up. With this in mind it’s much better to have a smoother, graphically more impressive game as a whole, than a clearer albeit simpler one.

With the resolution and framebuffer issues out of the way, the rest of Alan Wake’s engine is just as interesting, and serves as a clear technical benchmark for many 360 developers to follow.

The stable framerate for one is a pretty exceptional achievement. Alan Wake maintains a rock solid 30fps ninety-nine percent of the time, with the game opting for screen tearing in situations where the engine is struggling to keep a steady hold on things. As a result the game almost never drops frames, and when it does, the drop is so insignificant as to be barely noticeable. The downside is that the tearing can get pretty messy at times, covering the image right in the centre of the screen where it is most noticeable.

Thankfully, these situations aren’t too commonplace, with most of the tearing appearing for a split second or so before vanishing as quickly as it came. It should be pointed out, though, that the game does tear regularly, although it isn’t at all intrusive in these smaller amounts.

Adding to the stable framerate is the game’s use of camera-based motion blur, which helps to create the smooth look and feel that the game has throughout. The motion blur is very subtle, never at all intrusive, instead being an organic part of the camera movement. Its implementation is just another part of the artistic flair running through every aspect of the game’s visual make-up. The screenshot below shows the effect in action, though it seldom has that much of a dramatic impact on the game in motion.


What is surprising is that the engine in Alan Wake runs at an almost constantly solid 30fps with a high level of dynamic visual effects - fog, smoke, and particles - whilst also delivering incredible draw distances without dramatically paring back the visuals through the use of an aggressive LOD system. The game is full of sprawling vistas, from dense forests to towering mountains, and all of it is largely reachable, with the player travelling between these iconic places many times.

Perhaps it is for this reason that the engine has had to make sacrifices in other areas. The framebuffer, alpha effects, and textures have all been downgraded to lower resolutions, as have parts of the game’s lighting and shadow system.

Originally Alan Wake was going to be a fully-fledged open world action horror, in which the player would investigate various paranormal and supernatural occurrences whilst trying to find his wife Alice and fend off scores of ‘The Taken’. In much of the final game you can clearly see the original open world nature of its design, with large organic environments, multiple paths to take, long draw distances, and the ability to backtrack and go off the beaten path before heading to your next destination.

The engine used in the final game is still highly optimised for such an experience, so despite the change to a more linear and controlled affair, it still has to draw vast distances in high quality, whilst also rendering all those alpha effects, shaders and textures, and keeping the framerate up at the same time. It’s like having a version of GTA or Just Cause with Uncharted 2 levels of graphical polish, something which is beyond any of the current-gen consoles.

A good example can be seen below, in which the game renders detailed scenery for many miles away from the player, with no additional visual effects hiding the incredible draw distances that the engine is capable of.


Texture detail is reasonably high, although the actual textures themselves appear to be of very low resolution. Up close any detail begins to break up, and from a distance they look clean but at the same time a little blurry. The quality overall is very good given the game’s 540p framebuffer and the various effects the engine is pushing around on screen; it’s just a shame that much of the intricate detail gets so broken up at close range, or blended away at a distance. Nevertheless there are times in which the combination of artistic flair and attention to detail really shows off the texture work.

In terms of filtering it’s not entirely clear what is going on in Alan Wake. Assessing levels of texture filtering by eye is always a difficult proposition, however it is possible to make some well-placed judgements as to what is happening.

At times texture detail is visible for a good 16 feet or so into the distance before becoming blurry, whilst in other scenarios texture fidelity is lost just a few feet away from Alan himself. In these latter scenes, it would appear that the game is perhaps using bilinear filtering (BF) at best, although that would fail to explain the clarity in other scenes. Instead, my best guess is that Remedy are actually using a combination of anisotropic filtering (AF) and bilinear filtering for the textures, alternating between large amounts of AF and BF combined, and very little AF with no BF at all.

You could call this a ‘filtering bias’, with some scenes getting more filtering than others. But at the same time, with all the fog, mist and other effects going on at night, it is hard to make a solid judgement call on this. At the very least trilinear filtering is definitely present, with small levels of AF in parts.
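As context for why filtering might be rationed per scene, the textbook sample costs (standard GPU arithmetic, not figures from Remedy) look like this:

```python
# Approximate texel fetches per screen pixel for common filtering modes.
# Bilinear reads 4 texels from one mip level; trilinear reads and blends
# two mip levels; each anisotropic "tap" is roughly another trilinear probe.
def texel_fetches(aniso=1):
    """Approximate texel reads per pixel for trilinear with N:1 aniso."""
    bilinear = 4
    trilinear = 2 * bilinear  # two mip levels blended together
    return trilinear * aniso

print(texel_fetches())    # 8   - plain trilinear
print(texel_fetches(4))   # 32  - 4x AF
print(texel_fetches(16))  # 128 - 16x AF, hence its selective use
```

Real GPUs cache texels and vary the tap count per pixel, so these are upper bounds, but they illustrate why a bandwidth-starved engine would spend AF only where it's most visible.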


Unmistakably though, it’s the use of high levels of visual effects such as volumetric fog, smoke and particles, plus the impressively accurate dynamic lighting and shadowing system, that puts Alan Wake above so many other titles available on either the 360 or the PS3. These effects work so well with the A2C and 4xMSAA that the hit in pure sharpness caused by the 540p resolution often isn’t all that apparent, especially in night-time scenes in which all the visual effects come together, with loads of moving elements on screen pretty much all of the time.

The fog, foliage and physics-based objects all cast their own shadows, some simply being pre-baked shadow maps, others being fully dynamic, reacting to a multitude of light sources and environmental objects. The fog, for example, interacts with the game’s lighting, with light shining through it, moving over the trees and the foliage, and casting shadows from most objects in its path.

A hazy mist is also present in the environments at dusk and at night, which reacts with light and blends into the black fog that appears as ‘The Taken’ arrive, creating an ambience that adds to the tension felt throughout the experience.

Some of these effects do appear to be rendering at a lower resolution than the rest of the game. The fog in particular is pretty low res, which tends to blur everything in its path, obscuring detail and almost warping the game world. The blur doesn’t impact too much on the overall graphical feel that Remedy is going for, and at times actually benefits it. Sadly when the fog makes its way to cleaner areas, such as town buildings or remote gas stations, the blur effect is more noticeable, and less impressive.


For shadowing the game uses a combination of high and low resolution shadows, both static (pre-baked) and dynamic. The ones used indoors are most noticeably low res, as are some of the dynamic shadows cast by the player’s torch as they explore Bright Falls. However in outside areas, where there are so many other effects going on, it’s incredibly hard to notice the odd poor quality shadow.

The game’s lighting also helps to mask the mixture of shadow quality amongst the various other visual effects. Light given off by the player’s torch casts shadows from objects all around the environment, as do flares and flashbangs, which dynamically change the surrounding shadows. This is perhaps the most impressive thing about Alan Wake’s graphics engine: the uniformity between light and shadow, the dynamic interaction between the two, and how this helps create a beautifully organic look to the visuals on offer.

It is safe to say that the quality of the effects on offer in Alan Wake is perhaps among the best we’ve seen in any console or PC title to date. Resolution issues aside, the consistency and quality seen here is a pretty impressive feat, given the constraints the developers have had to work with and the many issues faced along the way.


Overall the tech behind Alan Wake is extremely impressive. The game at times combines several different transparency-heavy effects, along with a fully dynamic lighting and shadowing system, whilst maintaining incredibly high draw distances at a near perfect 30fps.

Certainly, things have been sacrificed in order to achieve this level of visual performance, but those sacrifices haven’t damaged the game in any significant way. In fact, most of the issues caused by the low resolution effects and the 540p rendering resolution are barely noticeable during most of your time in Bright Falls, much of which is spent in a surreal world of darkness. This darkness helps hide much of the game’s graphical shortcomings, blending them in, and actually increasing the level of immersion to be found all through Mr Wake’s adventure.

Remedy, like all the most highly talented developers, have shown just how to work in and around the limitations of the platform they are developing for, making concessions in certain areas whilst scaling back in others to ensure that the whole visual make-up is as polished as it is balanced.

Alan Wake, in this regard, represents exactly the right design choices for the game at hand. Generating atmosphere to completely envelop the player is paramount to the experience, and is something that the developers have achieved with the underlying engine. It isn’t always about getting all the elements together in the most technically advanced way possible, but instead about making sure that each of the individual pieces fits together succinctly, rather than standing as separate visual showpieces to admire.

In conclusion, there’s no doubt that Remedy have achieved exactly what they set out to do with Alan Wake, creating a game which is as gripping as it is visually alluring. For all the high-end tech powering the game, it is the carefully and often cleverly crafted nature of the art design which makes the package such a success. And whilst it may not have all the high-definition goodness of Sony’s Uncharted 2, it more than matches it in sheer technical brilliance and pure artistic direction.

Tuesday, 18 May 2010

3DS Development / Test Hardware Revealed?

It seems that Nintendo have been testing out parts of their new 3DS handheld with the US Federal Communications Commission, reportedly aiming to get the system’s new Wi-Fi card approved for use in the United States.

Some images from the testing appeared on the FCC Filing website late last month before being quickly removed. However a single image was posted at WirelessGoodness before the entire batch disappeared, showing something being referred to as a "Nintendo CTR Target Board". Most likely this image is that of Nintendo’s NDS Successor, the tentatively titled 3DS.

The image below shows what looks like a 3DS test kit or early development system, with the Wi-Fi card and related parts being highlighted due to being involved in the FCC’s testing process. WirelessGoodness initially assumed that it could be a custom board to test out a new built-in Wi-Fi card for either the existing DS or an upgraded model. However this is very unlikely as the board on display has some very different defining characteristics compared to any of the existing NDS designs.


The test board clearly has two screens situated one above the other like the current DS, although the top screen is clearly in a widescreen format and the bottom in the standard 4:3, whereas the existing DS models feature a dual 4:3 setup. The new 3DS, if that is indeed what this board is based on, shows that only one of the two screens will be 3D compatible. It’s clear from the image that the top screen features Sharp's new auto-stereoscopic 3D tech, whilst the bottom screen is just a standard LCD display.

From the image it is also possible to point out the inclusion of both a DS cartridge slot and an SD card slot. Both of these are currently standard on the DSi and DSi XL systems, and are also likely to be included in the new 3DS when it launches, as Nintendo have previously confirmed full compatibility with all DS and DSi software. There is no sign of any new type of cartridge slot or card port, meaning that games for the new 3DS are likely to appear on the same carts as current DS software, leaving the SD card support for something along the lines of multimedia applications.

More interestingly, the board on show is listed with a three-letter codename, something that Nintendo has used for all versions of its NDS hardware. The DSi was referred to as TWL, and the DSi XL as UTL, much in the same way as the GameCube, which was internally referred to as the GCN (GameCube Nintendo) rather than the publicly abbreviated NGC. The codename CTR hasn’t been used by Nintendo before, so most likely refers to a brand new generation of handheld, which is also backed up by the images showing a completely different design to the one found in the current range of DS systems.

All evidence clearly points to this being a form of 3DS development hardware, or at least a testing kit for the new machine. With the system likely to be shown at E3 in some capacity, or at some point later in the year, it isn’t a stretch to assume that development kits are in first-party hands at least. This test/dev kit shows that the design is being finalised, with some features being tested out for final inclusion, and others yet to be put in or decided upon.

With only a few months to go until this new handheld makes its debut, it’s likely that more information and leaks will begin to surface. Maybe some prototype images of the new hardware will appear a little later down the line, or Nintendo will have its hand forced once again into an early reveal. Either way, today’s images clearly show that the DS’s successor is almost in sight, and that we probably shouldn’t have long to wait for more concrete information to arrive.

Monday, 17 May 2010

Review: Lost Planet 2 (360)

A giant insectoid-like beast bursts up from the snowy ground with an almighty roar. Immediately it catches me in its sights and begins to charge. Armed with only a simple machine gun and a few paltry grenades, I engage the enemy, dodging its first attack before turning around and plugging it full of lead. Some of my shots bounce off its hardened shell but others directly hit its yellowy, fleshy tail instead, drawing another hollow roar from the creature. At this point I make a hesitant dash for a nearby semi-destroyed building, hoping to gain at least momentary cover.

Inside awaits more of the vile Akrid parasites. It turns out that I’ve just entered this creature’s makeshift nest. Immediately, without fail, I begin to blast my way through swarms of smaller spider-like Akrid and into the pulsating egg sacks, safe for a short while from the chaos outside – my team are busy getting slaughtered by the huge beast. After clearing out the half-standing structure of all its living inhabitants, I take my beef back outside with me. Guns in full blaze, I throw everything at my disposal against the giant beast whilst it’s intently distracted – shooting mercilessly at its now red little tail – and after another roar, plus the obligatory pool of blood and pus, it finally comes crashing to the ground.

Looking around, there are wide-open spaces for miles, the view of snowy particles being blown through the air, and the sheen of the glistening white environment reflecting back the light given off by an obscured sun. The Akrid beast is dead, oozing pus and drenched in its own blood, before shattering into a thousand frozen pieces. It was remarkably beautiful and ugly at the same time. The hard exterior shell revealed its intricate markings, while its fleshy body is both soft and solid at the same time, covered in sheen and detail. Everywhere you look there are wondrous sights full of character, all contained in and around some lovely white vistas.


Welcome to Lost Planet 2! An experience that starts off unsurprisingly like the first game. The stunning visuals, quite possibly some of the best seen on any console to date, along with the tried and tested third-person gameplay mechanics, are every bit as polished as they were last time around, although now feeling a tad dated. In fact, for the first few minutes or so, Lost Planet 2 is nothing but a solidly made and pretty entertaining action game. Insanely large creatures, huge guns, and lovely environments combine into a familiar but fun brand of shooting action. Much of what was so good about LP1 is still reasonably good here, and while many of the little niggles are still present too, there are larger issues that you’ll be complaining about.

Shortly after, however, things take a turn for the worse, as parts of this sequel’s poor design begin to break through the solid foundations built up by the original game. It’s pretty clear that Capcom were keen to have a different gimmick driving how LP2 works, and to this end two distinct elements have been shoehorned into the experience.

One is the multiplayer-focused single-player campaign, in which you are merely part of a four-man team. The other is a revised continue and checkpoint system which bears more than a passing resemblance to the hardcore games of old, and is ill-suited to the gameplay on offer here. These two elements are inextricably linked in a way that, on their own, wouldn’t pose much of a problem, but together they conspire to break the game on so many occasions, leading to numerous bouts of frustration and fist-clenching anger.


Keeping things together is the return of the thermal energy meter from the first game. Unlike in LP1, your thermal energy (TE) gauge isn’t constantly depleting. Instead it continuously accumulates more TE as you kill foes and collect it from their remains and from various data points scattered around the environment. When you take damage, and as your health bar begins to run dangerously low, you have the option of using this TE to restore lost health, preventing you from losing a precious life. You’ll certainly need this boost, as in LP2 most large enemies have near ‘instant kill’ attacks which leave you very little time to escape to cover or, in many cases, simply regenerate your health.
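The TE mechanic described above boils down to a simple resource loop, which might be sketched like so (all the numeric values here are placeholders of my own; the game never exposes exact figures):

```python
# Minimal sketch of the TE mechanic: thermal energy accumulates
# from kills and data points, and can be spent to top up health.
# TE_PER_HEAL and the other numbers are hypothetical values.

class Pilot:
    MAX_HEALTH = 100
    TE_PER_HEAL = 50   # assumed cost per health top-up

    def __init__(self):
        self.health = self.MAX_HEALTH
        self.te = 0

    def collect_te(self, amount):
        """Gained from fallen foes and data points."""
        self.te += amount

    def take_damage(self, amount):
        self.health = max(self.health - amount, 0)

    def heal(self):
        """Spend banked TE to restore health, if enough is held."""
        if self.te >= self.TE_PER_HEAL and self.health < self.MAX_HEALTH:
            self.te -= self.TE_PER_HEAL
            self.health = self.MAX_HEALTH
            return True
        return False
```

The pressure the game creates comes from the `collect_te` side of this loop: the only reliable way to refill the pool is to stay on the offensive.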

To make matters worse, the game is always pushing you towards an offensive solution, mostly in order to recoup lost TE as you battle it out amongst the native wildlife and nomadic Snow Pirates, putting yourself in harm’s way at times when a more carefully thought-out approach would be preferable. TE, however, is the least of your concerns later on in the game, with the lack of save points and temporary checkpoints making this part of the experience a frustrating and sometimes unplayable one.

The checkpoint, life, progression system – whatever you want to call it – centres on something called the Battle Gauge. You start off with 500 battle points, and receive 500 more for every checkpoint you reach (data points that you activate), or 1000 if you happen to be piloting a VS suit. Every time you die, you lose a certain amount of battle points and are respawned at the nearest data point. Losing all of your battle points, however, means that you lose all of your checkpoints and have to replay the entire chapter all over again.
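The Battle Gauge arithmetic can be sketched as follows; note that the per-death cost is a placeholder of my own, since the game only ever says you lose "a certain amount" per death:

```python
# Rough sketch of the Battle Gauge rules as described above.
# DEATH_COST is invented for illustration; the real game varies it.

class BattleGauge:
    START_POINTS = 500
    CHECKPOINT_BONUS = 500
    VS_CHECKPOINT_BONUS = 1000
    DEATH_COST = 300   # hypothetical value

    def __init__(self):
        self.points = self.START_POINTS

    def reach_checkpoint(self, in_vs_suit=False):
        bonus = self.VS_CHECKPOINT_BONUS if in_vs_suit else self.CHECKPOINT_BONUS
        self.points += bonus

    def die(self):
        """Respawn at a data point while points remain, else lose
        all checkpoints and restart the chapter from scratch."""
        self.points -= self.DEATH_COST
        if self.points <= 0:
            self.points = self.START_POINTS
            return "restart_chapter"
        return "respawn"
```

The sting is in the `restart_chapter` branch: once the shared pool runs dry, every checkpoint reached so far is thrown away with it.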

Chapters can take anything from 20 to 50 minutes to complete, depending on both their length and your skill level. Either way, it means that if you get stuck on a particular boss or mission and end up constantly dying, you are going to be replaying a vast amount of content again and again before you get it right.


It doesn’t help that the game isn’t exactly well signposted when it comes to telling you what to do. Especially, I have to say, with regard to some of the boss battles, which not only require you to work out the correct way of dealing with them, but also to work as a team to bring them down. Unfortunately, the complete lack of competent CPU-controlled AI makes this task an infuriating one. It’s all too common in LP2 for certain parts of the game to demand teamwork in order to score a solid victory without the frustration and hopelessness that occurs during solo play.

The boss battle at the end of chapter 3 is a good example of this. Set upon a speeding train, you are tasked with battling a giant sandworm whilst attempting to prevent the train from being obliterated. Right at the front of the carriage you are presented with a handful of giant ammunition shells lying around, and a huge gun turret to load them into. Aiding you in this task is a small, rather illegible diagram showing you where on the train to load these shells, and also the position of the engineering room, required for fixing up the train as it sustains damage.

It is clear that the game wants one person to load in the shells, another to distract the boss, another to look after the engine room, and someone to take control of the gun turret. This is great if you’re playing with a full team, but by yourself it’s a hopeless mess. The key here is teamwork, something that your AI buddies don’t have a clue about. They’ll simply stand around getting killed, leaving you to do all the work and making the challenge so much harder. And as you run back and forth trying to load the shells you’ll be frequently attacked, thrown off the train, and forced to restart the whole encounter all over again.


With two or more people playing this isn’t so much of an issue: solutions for dealing with a boss are easier to find, and coordination becomes almost second nature. Of course, playing with friends is likely to yield better results. But either way, multiplayer sessions make the game far more playable than going it alone. It’s just a shame that the single-player campaign seems completely tacked on, as if LP2 was designed to be an online-only experience.

Even during online play the battle points system still gets in the way, which is unfortunate. With all four players sharing the same battle gauge, each player can only afford to die two or three times at most before the gauge runs out and everyone starts the entire chapter again. On your own you can afford to take a few more chances and die a handful of times – or two – before exhausting your battle gauge, even though the overall fight is much harder without a coordinated team behind you.

That said, Lost Planet 2 isn’t a bad game by any means; it’s actually pretty good at times. It is a potentially great experience, let down massively by Capcom’s insistence on shoehorning new and unwanted gameplay mechanics into a system which didn’t require radical change. The return of the TE meter works in the game’s favour, and the core gameplay on offer is almost as fun as it was in the first game. It’s just that the new elements really threaten to break the game apart, and on many occasions they do so almost effortlessly. When this happens, all of the hard work and solid gameplay mechanics built up by the original LP are completely overshadowed, leaving an experience which is an exercise in frustration.

Visually, the game can’t be faulted. It looks stunning! LP2 delivers some of the most detailed texture work seen in a videogame so far, along with splashes of intensely delivered particle and smoke effects – again, some of the best we’ve seen. The Akrid creatures are all incredibly detailed, with lots of impressive shader effects, bump mapping, sheen and reflections. The environments, as with the first game, are filled with wide-open sprawling vistas, packed full of personality and intricately crafted characteristics. Most of all, the entire game looks and feels distinctly organic, never like a fake, plasticky resemblance of reality.


It’s rather unfortunate, then, that the gameplay fails to live up to the technical heritage on offer, with the solid core experience broken down and compromised by the developer’s insistence on including new and gimmicky features. All Capcom had to do was take what worked in the first Lost Planet and up the ante with this sequel. Being bigger and more bad-ass doesn’t mean better, and although it is clear that Capcom wanted a scale so much wider than in the first game, they have failed to provide a suitable gameplay and progression system to really back this up.

In the end, Lost Planet 2 is one of 2010’s biggest disappointments, failing to live up to the standards set by the original game and placing too much emphasis on the multiplayer aspect. Not enough thought has been put into solo play, and it shows. That, along with the ridiculously outdated (for this style of game) save and continue system, makes this sequel a rather substandard experience for all concerned.

The original Lost Planet, as it stands, is a much better choice if you want to experience some of the delights of E.D.N. III, and although this sequel does still deliver some (very brief) flashes of brilliance, it completely misses the mark most of the time.

VERDICT: 6/10

Saturday, 15 May 2010

OnLive Gaming Service Comes To The UK

Last September OnLive began beta testing its brand new streaming videogame service in the US, called, unsurprisingly, OnLive. The technology represented a whole new way of bringing videogames into the home. Rather than requiring a hardware box with specific processing components for rendering graphics, running gameplay code and so on, it acts more like a router or Internet connection device, delivering its content through real-time streaming video.

Games aren’t downloaded onto a hard drive for later play; instead they are streamed on the fly, ready for the user as soon as the OnLive MicroConsole has started the stream. During the beta, controls were found to be fairly responsive, with some lag – much like the delayed responses found in many motion-control titles on the Wii. The video stream was, at best, relatively serviceable, with evidence of macroblocking and a fair share of image break-up. Detail had also been compromised by the video compression scheme, lacking the kind of intricacy found in a direct uncompressed HD source.

However, it did seem like a perfect compromise for the quality-oblivious masses, of whom around 50% still play their HD consoles in SD – some on an HDTV no less. I can see OnLive serving as almost a games-on-demand rental service for most consumers, with the more serious gamer types opting to buy a traditional console for maximum quality. Either way, the technology and the idea represent a very different way of thinking when it comes to delivering a definitive gaming experience. There are upsides to the use of streaming video, mainly that users don’t need powerful hardware at their end in a separate box powering the games. Instead, all the processing is done on high-end PCs at OnLive’s end and then streamed out through their array of data servers.
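The control lag described above follows directly from the architecture: every button press must travel to the server, be rendered and encoded, and come back as video. A back-of-the-envelope budget makes the point; every figure below is an illustrative assumption, not a measured OnLive number:

```python
# Hypothetical round-trip latency budget for one cloud-streamed
# frame. All figures are illustrative assumptions.

budget_ms = {
    "input upstream to server": 25,
    "server render + capture": 17,
    "video encode": 10,
    "stream downstream": 25,
    "client decode + display": 15,
}

total = sum(budget_ms.values())
frame_ms = 1000 / 60   # one frame at 60 fps
print(f"Round trip: {total} ms (~{total / frame_ms:.1f} frames at 60 fps)")
```

Even with these fairly generous figures, the input-to-photon delay spans several frames, which is why the beta lag was compared to Wii motion controls rather than to a local console.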

In ten years’ time, with broadband rapidly increasing in speed and reliability, we could see a service such as this become the new face of traditional gaming. OnLive in its current state is, in effect, a big trial – a test to see not just how well the technology works, but also whether people are ready for such a business model so soon after the initial breakthrough.


For the United Kingdom, OnLive are partnering with BT to roll out a beta trial sometime this summer, just after the final service goes live in the United States. Whilst no date has been set for the trial as of yet, we do know that it will be available on both PC and Mac alongside the OnLive MicroConsole. Helping OnLive pick up much-needed market share in the territory is BT, who have purchased a 2.6 percent share in the company. This basically gives BT the rights to bundle OnLive in with its broadband packages, increasing overall uptake compared to individual subscriptions.

Anyone not using BT will still be able to order the service directly through OnLive for use with their existing ISP. Users from all around Europe and the UK will be able to play against each other in online matches, along with community-based features such as chat options, brag clips and profiles. Online play will be restricted to European and UK users only, although the community features will work regardless of region, so gamers can at least talk and compare profiles with their US counterparts.

So far various publishers, including the likes of Electronic Arts, Ubisoft, Take-Two, Warner Bros. Interactive, THQ, Epic Games, Eidos, Atari Interactive, and Codemasters, have already signed up to have their games available on the service. That’s a pretty large show of support, and should allow most high-profile AAA PC releases to make an appearance on the service, with other titles and publishers joining later down the line if it proves successful.


OnLive certainly looks promising, on paper at least. There are still some issues that need to be sorted out with control lag – which is pretty bad at this point – and with the amount of compression used in the video stream, which at present is said to be somewhat blocky and unsuitable for fast motion. However, the service looks to provide the next step in videogame rentals at the very least, with a solid replacement for traditional retail games likely to come further in the future, once high-end broadband speeds are available to most of the general public.

Either way, it will be interesting to see how well OnLive performs in the UK, both technically and from a consumer’s point of view, with the financials and the quality of the tech being paramount to its success. The public must also be happy with how the service performs, and there must be enough new content released each week to justify the entry fee each month.

Nevertheless, what OnLive hopes to achieve could well revolutionise how the games playing public actually takes delivery of their gaming experiences. If done right, it could eventually replace the traditional consoles as the main source of gaming in the next ten to twenty years.

Thursday, 13 May 2010

Review: Sin & Punishment 2 - SotS (Wii)

Ten years ago Treasure unleashed their little-known but highly praised N64 classic, Sin & Punishment. It was a game that brought high-octane on-rails shooting to Nintendo’s failing 64-bit system in a way not seen since the likes of Alien Soldier on the Megadrive. Giant battles, continuous action, and challenging gameplay were all part and parcel of an experience that, though very well received, never made it anywhere outside of its native Japan.

Since the release of S&P, Treasure have had only a few hits to their name. Outside the GameCube smash Ikaruga and the commendable Astro Boy for GBA, there hasn’t been anything as iconic or as sublimely brilliant as Radiant Silvergun, or even the original Sin & Punishment. Perhaps that’s because Treasure work best when investing in fresh new ideas, not pandering to their own rabid fanbase. It’s the main reason why, as a studio, they tend not to create sequels and focus instead on new IP.

With this Wii sequel to the original S&P, however, Treasure have delivered an experience that is in every way superior to the N64 original, featuring some of the most intensely fast-paced hardcore shooting action to be found on any console to date. If you like games that send wave upon wave of beautifully choreographed enemies your way, with absolutely huge boss battles continuously emerging from the chaos, then you’ll love Sin & Punishment: Successor of the Skies (S&P2).


Like its predecessor, S&P2 is an on-rails shooter. Guiding you along a fixed path, the game has you aiming and blasting your way through anything that stands in your way. Frequently your progress will be hampered by some show-stopping gigantic creatures, at which point the on-rails mechanics briefly expand into something more free-roaming, though still tightly restrictive. In fact, right from the beginning it is clear that you have more control over your character than in the original game, if only within the confines of the view on screen. At most, you can occasionally move inwards and outwards in addition to the standard left and right, giving you a brief moment of extra manoeuvrability.

The Nunchuk controls character movement, whilst the Wii Remote points and shoots at enemies across the screen. Camera movement is fixed and your path is largely pre-determined, though you can move around the limited space given at any time. Two different characters are playable throughout the game, each with subtle differences that add some extra strategy, and another excuse to play through the whole thing again once it’s finished.


Isa, the lead of the two and the usual male protagonist, commands a jetpack and can unleash a charge shot of sorts, which explodes in grenade-like fashion when it connects with enemies or the environment. Kachi, on the other hand, is a little different. The female of the pair, she uses a hoverboard instead of a jetpack – yes, proper Back to the Future style – and features a lock-on charge shot that can target multiple enemies at once. Both characters can dodge, and can also fight back with a standard melee attack, which repels projectile attacks while still being the first choice for close-range combat.

Throughout most of S&P2 you will definitely need to use your entire arsenal, dodging and shooting your way past a multitude of foes whilst making sure to keep that chain gauge going up. Later on you’ll be faced with dodging through laser beams and constantly batting back projectile attacks with melee strikes, whilst at the same time trying to counter a gigantic boss creature’s main method of attack by firing off a well-timed charge shot, disabling it for a few seconds before repeating the whole process again. It’s pretty intense and utterly exhilarating at the same time – the most fun I’ve had with an arcade shooter in a long time.


It’s also a pretty tough ride all round. Though never unfair, the game requires you to learn enemy attack patterns and counter them effectively with the right set of moves. Most of the time, a well-planned dodge or some accurately positioned charge shots are all that is required, whereas later on you will need to mix it up with melee strikes and rapid gunfire in order to survive. That said, there are a plentiful number of checkpoints on offer, not least before every gigantic boss creature and end-of-level encounter, so you’re never far from where you last died. Unlike Lost Planet 2, Sin & Punishment 2 absolutely nails how old-school progression should work, keeping things fair but challenging at all times.

The best part is that the entire game is filled with imaginative ideas, from the huge bosses and the smaller cannon fodder, to the level design and overall aesthetics. Sin & Punishment 2 is overflowing with an art style that is as original as it is bizarre, packed with a level of stylised beauty that could only have come from the minds at Treasure. Much of the game bears more than just a passing resemblance to Ikaruga, and at times it feels like this could almost be a spiritual successor to both Treasure’s much-loved GCN shmup and the Saturn classic Radiant Silvergun, though it is obviously unrelated to either.


Along with the unique art style and imaginative designs, the game also looks very impressive from a graphical point of view. Visually, S&P2 is one of the best-looking Wii games, and wastes no time in showcasing its abilities. For one, the game runs at a buttery-smooth sixty frames per second almost constantly, with only minor drops in framerate. Bosses and the larger enemies are packed with detailed textures and bump mapping, and feature a liberal use of that next-generation sheen lacking in so many Wii games.

If there is one downside, however, it is that the game tends to look a little blocky, lacking consistently high polygon counts – a result of having so much going on at any given time while keeping a smooth framerate. Also, despite featuring one of the sharpest, cleanest images on the Wii, S&P2 suffers from plenty of jagged edges, which means that playing it upscaled on a good flat-panel HDTV is a painfully ugly process. Thankfully, any CRT-owning folks out there can experience it in all its clear 480i/p glory, which really is the best way to be playing S&P2.


With Sin & Punishment: Successor of the Skies there’s very little to complain about. Treasure have easily delivered one of their finest games of recent years, and one of the best arcade shooters to come out of any Japanese development studio in a long time. With its unique blend of imaginative ideas and art design, filled with unmistakably addictive on-rails action, S&P2 is not only an essential purchase, but also the best thing to come from the minds at Treasure since 2002’s Ikaruga. In many ways it deserves to be remembered as fondly as Radiant Silvergun, and maybe even some of their older 16-bit hits as well.

What we have here is quite simply a modern classic, and perfect for anyone out there wanting some old-school action – something you can dive into for a couple of minutes before getting lost in for several hours. Despite being a little short, it perfectly demonstrates what is missing in so many of today’s high-profile titles, and shows that a tried and tested formula can be just as refreshing as anything that attempts to push forward the boundaries of gaming.

VERDICT: 9/10

Tuesday, 11 May 2010

EA Locks Out Features From Used Games...

The battle against preowned games may have well and truly begun, as today EA announced the first title that would require an activation code to enable online play.

Maybe retailers should have thought about handing over some of those profits from used game sales back when they had the chance to make amends? Instead, they are now faced with a potential reduction in preowned sales and a fall in trade-in prices on certain titles due to the removal of multiplayer features.


Beginning with the release of Tiger Woods PGA Tour 11 on both PS3 and 360, EA will introduce a new ‘feature’ known as the Online Pass. This is a code which grants the user access to all of the game’s online functionality, along with any of the included bonus features. It is a one-time registration which makes the unlocked modes and extras available to just a single user, most likely tied to their PSN or XBL account.

People who purchase a used copy of the game will have to plunk down $10 for the Online Pass, or sign up for a 7-day free trial. Currently, there are plans to include the Online Pass in the company’s future sports line-up, which so far consists of NHL 11, Madden NFL 11, NCAA Football 11, NBA 11, FIFA 11, and EA Sports MMA. Each title will have different features unlocked by registering the Online Pass, although all of them will require the Pass to unlock any online functionality.
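The one-time, single-account nature of the Pass is the whole trick, and can be sketched in a few lines (the data structures and names below are illustrative, not EA’s actual implementation):

```python
# Sketch of a one-time unlock code bound to a single account,
# as the Online Pass is described above. Purely illustrative.

redeemed = {}   # code -> account that first claimed it

def redeem(code: str, account: str) -> bool:
    """Bind a code to the first account that redeems it;
    later attempts by other accounts are rejected."""
    if code in redeemed:
        return redeemed[code] == account
    redeemed[code] = account
    return True

def has_online_access(code: str, account: str) -> bool:
    """Only the claiming account keeps the unlocked features."""
    return redeemed.get(code) == account
```

Because the binding survives the disc changing hands, the second owner’s only options are the $10 repurchase or the trial.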

"This is an important inflection point in our business because it allows us to accelerate our commitment to enhance premium online services to the entire robust EA SPORTS online community," stated Peter Moore, President of EA SPORTS.

Though he failed to mention any link to retailers profiteering on used game sales, or the fact that the publisher makes nothing on each used game sold, it is clear that this introduction of a registration code to unlock ‘standard game features’ is a direct reaction to that particular problem. The Online Pass appears to be another main component in the company’s Project Ten Dollar plan, aiming to give gamers another reason to buy new.

US retailer GameStop looks to have welcomed the change, highlighting that it is in line with their newly directed focus on expanding their operations in digital game sales and downloadable content.

"GameStop is excited to partner with such a forward-thinking publisher as Electronic Arts," said Dan DeMatteo, Chief Executive Officer of GameStop Corp. "This relationship allows us to capitalize on our investments to market and sell downloadable content online, as well as through our network of stores worldwide."

It is likely that EA’s Online Pass will be sold on the retailer’s website, and that the user will receive the code via email, much like how Amazon’s PSN downloads work. Either way, not all retailers see this as a doom and gloom situation; some instead see it opening up new opportunities for future profitability.

You can read our report about EA’s Project Ten Dollar here, and about the new face of videogame trade-ins here.

Monday, 10 May 2010

Feature: The Future Of Videogame Trade-Ins?

The notion of trading in your old games for new ones, or simply buying the latest releases second-hand, seems to be a thorn in the side of videogame publishers. Or so it may seem, especially when reading reports on how companies like EA and Sony are gearing up for a battle to salvage sales of brand new ‘mint’ games whilst putting a dent in preowned, both in terms of sales and of customers trading in. Many of these companies are tired of sitting back and watching whilst the retailer makes money over and over again on titles which the publishers can only sell once.

However, what if retailers gave back a small percentage of the profits created by used game sales – what then? Would publishers be willing to ‘play ball’ with retailers on the current situation, or would they still be gunning to drastically cut down all preowned transactions? Well, an answer may be here sooner than you think, as GreenManGaming.com attempts to bring all the benefits of used game trade-ins and sales to customers, whilst at the same time giving publishers and developers the support they need.


I’ve been saying for years that retailers should give back a percentage of their preowned profits to the publishers, and that if they did, the development community wouldn’t have so much of a problem with people wanting to trade in and save money whilst still obtaining the latest releases. That idea, it seems, is also very favourable to the development community, who, with the service offered at GreenManGaming’s new online portal, seem to be strongly in favour of the notion of trading in, and of seeing cheaper versions of their latest products available, if only because they finally see some return on these sales.

Online, it seems, is the perfect testing ground for this idea, and the ailing PC market also lends itself nicely to such an experiment, with users constantly expecting lower prices and struggling against some particularly aggressive DRM measures. This is where GMG and their website come in. At first glance it is like any other website selling downloadable PC games: create an account, add your credit/debit card details, download your selected game, and away you go. However, unlike any other site on the market, it offers its users the option of trading their digitally downloaded games back in when they have finished with them.

So how does this work? How can someone give back an existing download on their computer at home in exchange for a new download of another game? Well, you’re not quite giving back the download itself.

When you purchase any software from GMG’s website you are given an activation code, just like with boxed PC games, and it’s this that you effectively trade back in. All you have to do when you want to trade a game back in is click the ‘trade in’ option below the box art on the game’s page, and that’s it – your game gets traded. Of course, you are given a trade-in value for your title beforehand, and if you choose to accept, you receive credit to purchase further games from the GMG website. Your original code is re-generated into a new one, which is then sold off at a cheaper price, depending of course on its market value.

This means that it is not only possible to trade in your old GMG purchases for new ones, but also to buy cheaper versions of other games which have been traded in. All of the games are new; there is nothing except the price that could be considered preowned. In terms of pricing, everything is determined by market value, just like in actual bricks-and-mortar shops. So the more people trying to buy one particular title, the higher both its trade-in and purchase prices climb, whereas if a certain title is being constantly traded in, its purchase price drops accordingly, as does its trade-in price, just as you’d expect.
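The supply-and-demand pricing described above might be modelled something like this; the adjustment step, price floor, and function shape are entirely my own assumptions, since GMG has not published its algorithm:

```python
# Toy model of market-driven pricing: purchases push a title's
# price up, trade-ins push it down. The step size and floor are
# invented for illustration -- GMG's real algorithm is not public.

def adjust_price(price, purchases, trade_ins, step=0.02, floor=2.99):
    """Nudge a title's price by net demand over some period."""
    net_demand = purchases - trade_ins
    new_price = price * (1 + step * net_demand)
    return round(max(new_price, floor), 2)
```

So a title bought 30 times but traded in only 5 times over the period comes out more expensive than before, while one traded in far more often than it is bought drifts down towards the floor.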

At the same time, highly popular or rarer titles will maintain their market value over longer periods of time, unlike in some regular retail stores, where popular titles can see both their trade-in and purchase prices reduced massively over time. GMG’s system should be fairer, with customers dictating the overall price of items through their own buying and selling habits. New releases, however, are likely to be price-protected for a short period, as you would expect.

Of course, for such a system to work securely, away from the hands of pirates whilst satisfying the publishers, there has to be some form of DRM involved – in this case, SecuROM. However, GMG’s implementation of this somewhat hated form of DRM isn’t as intrusive as those used in previous boxed retail copies of high-profile titles. After installing the newly downloaded game onto your computer, it registers itself with GMG’s online servers, verifying its authenticity and thus allowing you to play. This authentication needs to be repeated via an Internet connection every three days. If you are away for a long period the game simply won’t run once the three days are up, but it can always be re-activated the next time you are online.
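The three-day rule described above amounts to a simple time-window check, sketched here under my own assumptions about how the client might behave (GMG has not documented the actual logic):

```python
# Sketch of the three-day re-activation rule: the game runs only
# if it has successfully phoned home within the last three days;
# otherwise it must authenticate again before launching.

from datetime import datetime, timedelta

ACTIVATION_WINDOW = timedelta(days=3)

def can_play(last_activation: datetime, now: datetime) -> bool:
    """True if the last successful online check is recent enough."""
    return now - last_activation <= ACTIVATION_WINDOW

def launch(last_activation, now, online):
    if can_play(last_activation, now):
        return "play"
    if online:
        return "re-activate, then play"   # a later check always recovers
    return "locked until next online check"
```

Note that the lockout is never permanent in this model: being offline past the window only defers play until the next successful check, which matches the behaviour described above.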

The system may sound harsh, but given the increasing number of titles which require a continuous Internet connection, it is a pretty fair compromise, especially if gamers get all the benefits of cheaper titles and the ongoing option to trade in old ones. For the PC market this would actually be a first, as previously hardly any shops would take in PC games, given their reliance on activation codes and online registration.

So far a few companies, including PlayLogic, JoWood Productions, Midas and Namco Bandai, have signed up to have their games available on the site, and apparently GMG is in talks with the likes of Rockstar, THQ and Sega to see if they are interested in at least trying out the service.

If the security measures are good and the overall service is popular enough, then I suspect many more will come on board, as there is very strong evidence to suggest that people who trade in more games also buy a lot more games as a result. This seems to be the view held at GMG as well, and they are very positive that their service will offer gamers a new way to empower themselves by trading and buying titles online. Certainly, the aim is to make things fairer for both the publishers and the gamers.

GMG’s service is launching here in the UK first, with a planned roll-out into many other territories, starting with the United States in a few months’ time. Currently gamers anywhere in the world can use the site, although the prices and currency are all localised for users in the United Kingdom. Later on, when you visit the site from a territory other than the UK, a specifically localised version will appear instead, with the correct pricing and currency for that particular territory.

Other than having an eventual worldwide presence and new release game sales, GMG are also hopeful that their service will attract titles that have failed to garner a publishing deal, and that might otherwise have been left upon the scrap heap. The hope is that developers will release their games independently on the service, free of the need for any kind of publishing deal. If so, there is a good chance that titles featuring original ideas, or simply independent IP, will eventually appear on the site, giving gamers both choice and variety.

Overall, GMG’s revolutionary service could well be the way forward for traditional retail outlets to maximise trade-ins and preowned software sales, whilst at the same time satisfying large publishers and developers, in addition to the smaller ones who struggle to break even, let alone make a solid profit these days. I imagine that traditional retail will be looking at how successful GMG’s service is, both in terms of profits and market penetration, before perhaps adopting a similar system further on down the line.

If the service is successful, then there is no reason why bricks and mortar retailers also couldn’t start giving back a percentage of profits made from their sales of preowned games. After all, in the long run it would benefit the entire industry, from the developers and publishers, to the gamers, and even the retailers themselves.

Of course, it has to be done at the right price, and it has to be fair on the consumer, fairer than the current retail system in which you pay near £40 for a preowned title, only to be given around half that when you trade it in days later. Personally, I think it’s pretty obvious that this change isn’t going to happen overnight, but a change is necessary, especially for the industry to continue to thrive and push forward the boundaries of interactive entertainment.

GreenManGaming’s site (greenmangaming.com) opened to the public earlier this week. We definitely suggest that you pop along and check it out, as it could well be the future in the making.

Saturday, 8 May 2010

Tech Analysis: Halo Reach Beta

You may remember that we did an initial tech analysis on some of the first in-game screenshots of Halo Reach way back in February, in which we discovered that the underlying engine behind the game had been completely reworked and overhauled, to the point that there was a large, noticeable jump in quality over both Halo 3 and ODST.

Certain things still eluded us however, such as the game’s final rendering resolution, or whether or not Bungie could still afford to keep their trademark high-end HDR lighting system firmly stamped in the final build. The beta, we said, would finally be the place where we could get a tangible look at the tech behind the game. And so today at IQGamer that’s exactly what we’ll be doing: ripping apart the engine behind Halo Reach and revealing just how far it’s come from its early Halo 3 and original Xbox beginnings.

The first thing to say is that the engine powering Halo Reach is a giant evolutionary step forward rather than a brand new revolutionary driving force. That said, it is a vastly superior beast in every way, shape and form compared to the engine used in the previous two games. From rendering resolution, texture work, lighting and shader effects through to character modelling, everything has seen an overhaul. Some areas have only been subtly enhanced, while others have been completely changed, making for not only a large boost in image quality, but also a smoother looking game as a result.


One of the main complaints with Halo 3 and ODST, besides the lack of any anti-aliasing, was the games’ sub-HD rendering resolution. Both titles rendered at 1152x640 into a dual framebuffer, which came together to form the final 640p image. For Reach, Bungie have upped the game’s resolution, albeit ever so slightly, just enough it seems to be loosely qualified as 720p. Reach basically renders at 1152x720, keeping the horizontal resolution the same as Halo 3 and ODST whilst upping the vertical res, arguably the axis the eye is more sensitive to, and thus the more important one to increase.

It is also likely that the developers opted for this 1152x720 resolution in order to keep the framebuffer fitting firmly into the 360’s 10MB of EDRAM, which is something that seems to be a priority for Reach. Even with all the enhancements and additions made to the game engine, they still want to avoid tiling.
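
As a rough back-of-the-envelope check, assume Bungie carried over the Halo 3-style setup of two 32-bit colour buffers plus a 32-bit depth buffer, i.e. 12 bytes per pixel. That buffer layout is our assumption rather than anything Bungie have confirmed for Reach, but the numbers do line up neatly:

```python
# Back-of-the-envelope EDRAM maths, assuming the dual-colour-buffer
# HDR setup described above for Halo 3: two 32-bit colour buffers
# plus a 32-bit depth buffer = 12 bytes per pixel. (The exact buffer
# formats in Reach are an assumption, not a confirmed detail.)
EDRAM = 10 * 1024 * 1024          # 10MB of EDRAM on the 360
BYTES_PER_PIXEL = 4 + 4 + 4       # colour + colour + depth

def framebuffer_bytes(width, height):
    return width * height * BYTES_PER_PIXEL

print(framebuffer_bytes(1152, 720))           # -> 9953280, just squeezes in
print(framebuffer_bytes(1280, 720))           # -> 11059200, over budget
print(framebuffer_bytes(1152, 720) <= EDRAM)  # -> True
print(framebuffer_bytes(1280, 720) <= EDRAM)  # -> False, would force tiling
```

Under these assumptions 1152x720 comes in at roughly 9.95MB, a whisker under the 10MB limit, while full 1280x720 would overflow it and force tiling.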

In addition to this increase in resolution, Halo Reach also retains the unique HDR lighting implementation from the last two games. The effect has been reduced somewhat, appearing to cover a slightly shorter range compared to the ultra wide range lighting on offer in the last game. However, it has been bolstered by the use of far more local lights, and a brand new deferred dynamic lighting system featuring dozens of individual lights on screen at once.


This new lighting system means that there can be upwards of thirty or more light sources on screen at once, given off by weapons fire, explosions, and environmental lighting, such as the glow from lights inside buildings. All of these light sources are real-time, and interact with their surroundings. So a gunshot, or rounds from a Needler, will light up surrounding areas and change the shadows created by moving objects. Each individual projectile from the Needler also has its own light source, as do many other projectiles in the game, which is a first for the series and exactly what you’d expect from next-generation lighting techniques.
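
The appeal of deferred lighting is that each extra light is just another cheap additive pass over the pixels it actually reaches. Conceptually, the accumulation for a single pixel boils down to something like the toy sketch below; the data layout, names and simple inverse-square falloff are illustrative assumptions on our part, not Bungie’s shader code:

```python
# Toy sketch of a deferred light-accumulation pass for one pixel.
# The light structure, falloff model and names are illustrative
# assumptions, not anything from Bungie's actual renderer.
def accumulate_lights(pixel_pos, base_colour, lights):
    r, g, b = base_colour
    out = [0.0, 0.0, 0.0]
    for light in lights:
        lx, ly, lz = light["pos"]
        px, py, pz = pixel_pos
        dist2 = (lx - px) ** 2 + (ly - py) ** 2 + (lz - pz) ** 2
        # Inverse-square falloff, clamped so a light sitting right on
        # the pixel doesn't blow up to infinity.
        atten = light["intensity"] / max(dist2, 1.0)
        out[0] += r * light["colour"][0] * atten
        out[1] += g * light["colour"][1] * atten
        out[2] += b * light["colour"][2] * atten
    return tuple(out)

# Dozens of lights -- say, a volley of glowing Needler rounds -- just
# mean more iterations of the same cheap loop per affected pixel.
needles = [{"pos": (float(i), 2.0, 0.0), "colour": (1.0, 0.4, 1.0),
            "intensity": 2.0} for i in range(30)]
lit = accumulate_lights((0.0, 0.0, 0.0), (0.5, 0.5, 0.5), needles)
```

The key design point is that cost scales with lights-times-covered-pixels rather than lights-times-scene-geometry, which is what makes thirty-plus simultaneous sources affordable.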

Shadowing is a mix of pre-baked and dynamic. All the environmental shadows in the game are baked shadow maps: stationary and unreactive. Moving objects, however, are given the proper real-time treatment, with full dynamic shadows to complement the use of multiple light sources in the game. These shadows react to both other objects and the environment, with neighbouring light sources affecting how they are displayed.


SSAO (screen-space ambient occlusion) is also present in the beta, though it is only visible in indoor areas, and isn’t used anywhere else. Its implementation is pretty much artifact free, and blends almost perfectly with the baked shadow maps in the dark areas which use it. Bungie had originally stated that it wouldn’t feature in the beta, but clearly it’s here for all to see, if very subtle at this point. We expect that the use of SSAO will extend to the outdoor areas in the final game, if only for the single player campaign.

In terms of texturing, detail, and filtering, Reach has seen a massive improvement over Halo 3 and ODST. Texture detail has been significantly increased, with better use of normal and environmental bump mapping creating a depth and detail that simply wasn’t there before. Texture filtering, one of the main complaints with the last two games, has seen a huge boost. Reach uses what looks like a combination of anisotropic (AF) and trilinear (TF) filtering for all of its textures, meaning that detail is now visible over longer distances than before. You can see this at work in the screenshot below.


The other main complaint from the last two games, the lack of any anti-aliasing, has also been addressed, though not completely dealt with. Reach uses a form of AA known as ‘temporal anti-aliasing’, which works by blending two consecutive frames together with a slight time offset, creating a 2xMSAA look on certain objects and geometry when the game isn’t moving. The downside is that whenever there is any movement, this form of AA causes a distinct blur effect, not unlike the motion blur encountered on an old LCD TV, and one which is highlighted by the game’s use of a post-process motion blur effect.
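
In principle the blend itself is trivial, just an average of the current frame with the previous one, and that is also exactly why movement smears. The simplified sketch below (greyscale pixel rows standing in for frames; real implementations jitter the camera sub-pixel between frames, and none of this is Bungie’s code) shows both the free smoothing on a static edge and the ghosting on a moving one:

```python
# Minimal sketch of temporal AA as a 50/50 blend of the current and
# previous frames, with flat lists of greyscale values standing in
# for frames. Real implementations jitter the view sub-pixel between
# frames; this toy version just shows why motion causes ghosting.
def temporal_blend(prev_frame, curr_frame):
    return [(p + c) / 2 for p, c in zip(prev_frame, curr_frame)]

# A static edge: both frames agree, so the blend is unchanged.
static = temporal_blend([0, 0, 255, 255], [0, 0, 255, 255])

# The same edge after moving one pixel: the frames disagree, and the
# blend produces a half-intensity ghost pixel where they differ.
moving = temporal_blend([0, 0, 255, 255], [0, 0, 0, 255])
```

Here `static` comes back as the original edge, while `moving` contains a 127.5 ghost value, which is the smearing described above in miniature.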

Also, another downside is that certain objects, such as the 2D foliage, aren’t affected by this form of AA, leaving them with noticeably jagged edges. This doesn’t blend in too well with parts of the game that do benefit from the temporal AA, and just showcases another problem with using this technique. A proper MSAA solution would have been far more beneficial, though Bungie would then have had to use tiling to fit the framebuffer into the 10MB of EDRAM.


Despite these issues, Reach in beta form is still a great looking game, and features some impressive high resolution particle effects, debris at lower resolution, good use of transparency effects, tessellated water, and a nice bit of bloom lighting to top it all off. The whole visual range feels a lot more organic than before, even with the Halo series’ typically clean lines and smooth industrial look.

All this is backed up with an accurate post-process motion blur effect, one that is even more impressive than the one created by Namco for use in the PS3 and 360 versions of Tekken 6. Reach’s motion blur technique, like in Tekken 6, works on an individual object basis, and is incredibly accurate. Unfortunately, it so obviously interferes with the temporal AA used in the game, creating some unwanted ghosting and being pretty intrusive when you least want it to be.


Like Halo 3 and ODST, Reach aims to maintain a constant 30 frames per second at all times, without breaking the v-sync that’s in place. Occasionally the v-sync does break, creating some mild screen tearing, but this is usually relegated to one or two torn frames appearing at the top of the screen. The game does slow down, however, mainly in busy scenarios, but that scarcely increases the amount of tearing to any great extent, meaning that the v-sync is working as it should.

In many ways Halo: Reach is simply using the backbone of the previous game engine, reworking and enhancing it along the way, and using it to blend new graphical improvements with tried and tested old ones. At the same time it still manages to work within the tight constraints of the 360’s EDRAM. Not so surprisingly, we don’t get a proper 720p (1280x720) rendering resolution, or multisampling AA. However, the game’s clever new LOD system allows the screen to be filled with dozens of detailed objects and light sources, whilst retaining most of the HDR lighting from the last two games and still including some excellent texture filtering.

So far the multiplayer beta has certainly impressed, especially with its use of effects that we thought would probably be reserved for the single player campaign. Instead, Bungie have seen fit to include all of the technological improvements the revised engine has to offer in both single and multiplayer modes. The game is clearly visually superior to its predecessors in nearly every way, minus the blur caused by the AA, and still has a good couple of months to go before it’s done and out the door.

It should be interesting to see just how far the main campaign has come along, and whether they have managed to further improve on the foundations laid down in the beta. Certainly, what we’ve seen today looks better than the early screenshots of the single player gameplay, and no doubt that the final code will look even better. How much better though, will largely depend on how much they insist on pushing the engine for the multiplayer side of things.

All things considered, Halo Reach looks every bit the next-generation Halo game that its predecessors should have been. Of course, the sparse, slightly bland look that comes with the Halo universe isn’t going to go away. After all, that IS the look and feel of the series. But at least, for the first time, the franchise has actually transcended its old Xbox roots into something that feels like it belongs, from a visual perspective anyway, on Microsoft’s 360.

Thursday, 6 May 2010

Tech Analysis: Super Street Fighter IV (PS3 vs 360)

With nearly every big release here at IQGamer, it is almost a given for us to have our trademark technical analysis to go along with our in-depth review. But with Super SFIV we considered skipping the whole tech thing, seeing as the differences between the two versions are so small that, whilst the game is running (at the preferred 60 frames per second), it is almost impossible to tell them apart.

That would, however, in our humble opinion, be doing our loyal readers a disservice. So instead of simply glossing over the technical aspect with our enthusiastic review, we are going to put Super SFIV through its paces as per usual for the full tech treatment.

Okay, I’ll start by saying that the same things which applied to last year’s Street Fighter IV, on both PS3 and 360, apply to this Super edition too. Everything from texture work right down to the shader effects is handled in exactly the same way, although rendering resolution is now the same on both platforms this time. This means that if you know how the last game performed on both systems, then you know for the most part how Super SFIV performs as well.


Super Street Fighter IV is rendered in 720p (1280x720) on both PS3 and 360, with the 360 getting the usual 2xMSAA (multisampling anti-aliasing), whilst the PS3 version once again features no AA solution of any kind. This lack of AA only really manifests itself in scenes with high levels of brightness, where high contrast areas create a slightly jagged look along the edges of polygons, together with a small amount of edge shimmering. Most of the time it is barely noticeable at all; the only real benefit is that the 360 game looks slightly cleaner at all times.

During Super and Ultra moves, along with the real-time pre and post fight intro and ending sequences, the PS3 game no longer drops resolution from 720p to 1120x630 as it did in SFIV. It seems that through optimisation Capcom have managed to solve some of the bandwidth issues arising from the fact that the PS3’s RSX GPU has access to less overall bandwidth than either the 360, or the Taito Type X-2 board the original SFIV runs upon. Essentially, all the transparency effects displayed onscreen during a Super or Ultra move vastly eat into each system’s bandwidth. This time around, however, Capcom have found a way of maintaining full 720p resolution on both platforms at all times.
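
For what it’s worth, the fill-rate saving that old resolution drop bought is easy to quantify from the figures above:

```python
# How much per-frame pixel work SFIV's old resolution drop saved.
full = 1280 * 720    # full 720p: 921,600 pixels
drop = 1120 * 630    # old PS3 drop: 705,600 pixels
saving = 1 - drop / full
print(f"{saving:.0%}")  # -> 23%
```

In other words, the old 1120x630 buffer shaved nearly a quarter of the pixels off every frame during Super and Ultra moves, which is the headroom Capcom have now clawed back through optimisation instead.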


In addition, none of the normal transparencies or special effects are rendered at a lower resolution either, instead solidly maintaining 720p throughout. Quite clearly this increase in resolution isn’t a hugely noticeable change when comparing the two versions side by side, especially whilst in motion at a constant 60fps, where they both look identical.

Perhaps the most noticeable difference comes in the form of texture detail, or more specifically, from the observation that the 360 version has slightly more detailed textures, which are used in some of the background scenery found in the game. These, along with some of the background objects, are indeed rendered at 1120x630 instead of 720p in the PS3 build. You can see this clearly in the screenshot below; just look at the trees in the top right hand corner.


At worst, these lower resolution textures and objects make some of the background details appear a little fuzzy when comparing the two in real-time at 60fps, though nothing particularly intrusive. At best, it is barely noticeable at all, unless of course you switch between the two versions on the fly. But this isn’t something people usually do when playing games, so it really isn’t an issue, just another observation.

Texture filtering on the other hand looks to be identical on both versions of the game, which is somewhat surprising, considering the PS3 usually gets the exclusive advantage of having almost free use of anisotropic filtering. This time around, both PS3 and 360 versions feature equal amounts of AF, with detail remaining visible far off into the distance. Yet another sign, perhaps, that the game isn’t pushing the 360 as much as it is the PS3, with all its alpha transparency effects sucking away potential performance.


Last time with Street Fighter IV, we noticed that in terms of shadowing on both systems, it was the 360 game that had the obvious advantage. Microsoft’s version featured not only softer shadows than the PS3 game, but also had exclusive use of self-shadowing not found in the Sony build at all.

For Super SFIV this has changed. Now both versions feature self-shadowing (where a character casts their own shadow over themselves), as is evident in the screenshot below, while the 360 version also features the use of more natural soft shadows. The PS3 game, on the other hand, uses a sharper, more conventional shadowing method, although this isn’t visible during fast 60fps gameplay, and is barely visible when the characters are in their ‘standing’ positions.


When it comes down to it, Super SFIV is pretty much equal on both platforms, with the PS3 game coming even closer to the 360 one than last year’s SFIV did. Some differences remain, like the lack of any anti-aliasing on the PS3 game, along with one or two missing effects and the occasional lower resolution texture. But the addition of self-shadowing on the PS3, and equal amounts of texture filtering, balance out the differences to the point that, when seeing the game in motion, they don’t really matter at all.

You have to remember as well, that in screenshots the differences are more pronounced, as they also are when you pause both games and view them one after another on the same telly. Of course there is still a small image quality advantage given to the 360 game, but really, this is only visible at certain points throughout the game and not all the time, making it a factual, but somewhat moot point.

In terms of recommendations, both come equally recommended, with your choice most likely to be dictated by what controller options you have available, and not by the very minor graphical differences on offer here. People without a separate arcade stick or specific fighting game control pad would be better suited with the PS3 game, as the Dual Shock or Sixaxis controllers both perform better than the 360 one. On the other hand, 360 owners can still get the same polished experience with the aid of a separate pad or stick.

Either way, both versions are visually superb, and the overall game itself is perhaps the best beat’em up available on current-gen systems. Whichever console you happen to own, Super SFIV is well worth the asking price, especially for fans of the series and people who missed out on the original game. All I’d say is that to get the most out of the experience, then you really need either an arcade stick or USB Sega Saturn pad, and that goes for anyone regardless of the version you happen to end up buying.