They say Christmas comes but once a year. Well, life-engulfing single-player BioWare RPGs are even rarer than that - Dragon Age: Inquisition comes with approximately 8000% more beard than Santa Claus could ever dream of having, and in a 1 v 1 faceoff Rudolph would have more than a red nose to worry about in fisticuffs with a Fereldan Frostback.
In value-for-money terms, single-player BioWare RPGs are unparalleled, and Dragon Age: Inquisition is no different, offering up hundreds upon hundreds of hours of gameplay as you and your party of intrepid adventurers ramble across its impressive world. It might be the ultimate worthwhile purchase, but before you commit to that 25GB download and fork over $50 to the folks at EA, you might want to check out how Dragon Age: Inquisition runs on the GD Machine 2014, armed and ready with a GeForce GTX 750 Ti, 8GB RAM and an Intel i5-4670K processor.
The new Dragon Age: Inquisition is receiving some great reviews (ours will be along shortly), and while it looks good for an RPG, it's not destroying our modern-day PC setups. Dragon Age: Inquisition is built on a modified version of the Frostbite 3 engine, the same one powering Battlefield 4, so we knew there would be scope for an interesting graphical experience.
To benchmark, we decided to start Dragon Age: Inquisition off on one of our lesser graphics card setups, the Nvidia GTX 650. Let's take a look at how well Dragon Age: Inquisition runs on an ageing low-to-mid-range GPU.
Samsung has filed a petition with the International Trade Commission to ban Nvidia GPU sales in the US, following an ongoing dispute between the two popular hardware manufacturers. Nvidia first took both Samsung and Qualcomm to task back in September, claiming that several patents used within Samsung's range of mobile and tablet devices are the property of Nvidia. The lawsuit primarily focuses on advanced technologies used in the GPUs, ranging from multithreaded parallel processing to the likes of unified shaders and programmable shading, and demands that Samsung's and Qualcomm's GPUs be removed from sale.
In retaliation, Samsung has argued that Nvidia's claims are false, countering that Nvidia is in fact violating a number of Samsung's patents, and it is seeking to ban Nvidia's graphics chips from the US market. The issue stems from alleged patent infringement in how the semiconductors process data, and Nvidia isn't the only one in the line of fire, with the Biostar Manufacturing group, EVGA, Zotac and Elitegroup Computer Systems also under attack.
AMD is taking aggressive measures to boost its market presence. A while back it announced the Carrizo APUs, detailing the energy-efficiency and performance boosts they will bring to laptops. Today it has slashed the price of its monster dual-GPU graphics solution, the R9 295X2, to a still eye-watering but certainly much more affordable 800 bucks.
The card is a dual-GPU solution based on the GCN 2.0 architecture, packing two full Hawaii XT cores for a combined 5632 shader processing units, 352 TMUs and 128 ROPs on a combined 1024-bit memory interface of fast GDDR5. Its core clock operates at 1018 MHz and it has a massive power consumption of 500 watts. The card was originally priced around $1,499 when it was released, and AMD had already lowered its price, along with several others, when the GTX 980 was announced.
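For the curious, a quick back-of-the-envelope calculation shows what that wide 1024-bit bus means for memory bandwidth. The 5 GT/s effective GDDR5 data rate below is an assumed figure for illustration, not a number stated above:

```python
# Rough memory-bandwidth estimate for a dual-GPU card like the R9 295X2.
# The 1024-bit figure is the combined bus width across both GPUs
# (512 bits each); 5 GT/s effective GDDR5 rate is an assumption.
bus_width_bits = 1024
effective_rate_gtps = 5.0  # assumed effective transfers/sec (x10^9)

# Bytes moved per transfer across the full bus, times transfers per second.
bandwidth_gbs = (bus_width_bits / 8) * effective_rate_gtps
print(bandwidth_gbs)  # → 640.0 GB/s combined, i.e. 320 GB/s per GPU
```

Halve both the bus width and the result if you want the per-GPU figure, since each Hawaii XT core only sees its own 512-bit interface.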
Next up in our Up For Debate feature is a discussion around DDR4. Making the leap to the next generation of memory can be a tricky little blighter. It’s one of those things that rarely has to happen, but when it does it generally involves ripping out your motherboard and everything that's on it, and starting your PC build again with DDR4 in mind from the start. We’ve been cruising along with DDR3 for a while now, happy as Larry, but along comes something new and shiny to tempt us.
DDR4 then? It’s new. It’s faster. It’s bloody expensive. Traditionally, when I look at a game's system requirements, system memory is way, way, way down at the bottom of the priority list. In comparison to basically everything else it’s pretty much a bargain, and there are few things simpler in the world than throwing in a new stick of RAM.
Telltale Games, the guys behind the highly successful episodic adaptation of The Walking Dead, have got their hands on another massive franchise, Game of Thrones. The first episode will be along even sooner than we thought: December 2014. A few weeks is all we have to wait before we can march beyond the Wall or, if you're like me, get drunk on wine and eat wild boar while upsetting the status quo throughout the kingdom with lies and naughtiness.
As this is the next episodic series from Telltale Games, we can expect pretty reasonable PC system requirements for Game of Thrones Episode 1: Iron From Ice, as it will be built on the studio's usual engine. Let's take a look, then, at what the Game of Thrones PC system requirements will look like and what it will take to run the game on your computer.
Welcome to our new feature, Up For Debate. We will be introducing topics to the community where we start the "discussion" off and then you expert hardware gurus get in and throw your knowledge about on the subject, providing links to relevant, on-topic things you have seen on the net to help others. On the flip side, fellow GD members who are not too sure about the details of a particular Up For Debate subject can ask questions and provoke interesting conversation. This way everyone learns and shares a tonne. Cool, huh? Send Felix or me suggestions for future Up For Debates.
This first feature is on a particular screen aspect ratio. In the huge scramble for 4K gaming that’s been going on in recent months, it’s easy to forget there’s an oft-overlooked cousin lurking behind the scenes. Yep, I’m talking about 21:9 gaming, the full glory of widescreen CinemaScope that takes up enough of your precious desktop space to make your eyes pop out.
During a recent webcast to investors, Intel executives revealed plans to incorporate three-dimensional (3D) NAND flash memory into future solid-state drives. The SSDs would be the largest commercial products of their kind, and the breakthrough in flash storage means that 10-terabyte SSDs could become a reality within the next two years.
The massive high-speed storage is possible because of 3D NAND memory, which allows for the stacking of multiple levels of memory on top of one another. The memory, developed in partnership with Micron, allows for up to 32 layers of NAND flash, theoretically providing 48GB of storage on a single die, up from the equivalent 16GB achieved by Samsung with its V-NAND.
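As a rough sketch of what that per-die capacity means for a 10TB drive, here's a quick calculation. The 48GB-per-die figure comes from the announcement above; the decimal-terabyte convention and the resulting die count are illustrative assumptions, since real drives also set aside capacity for over-provisioning:

```python
import math

# Per-die capacity from the 32-layer 3D NAND figure above.
die_capacity_gb = 48

# Target: a 10TB drive, using the decimal convention (1TB = 1000GB).
target_capacity_gb = 10 * 1000

# Minimum number of dies needed to reach that capacity.
dies_needed = math.ceil(target_capacity_gb / die_capacity_gb)
print(dies_needed)  # → 209
```

Roughly 209 dies spread across the drive's flash packages, which is why the tripling of per-die density over the older 16GB parts is what makes drives this size commercially plausible.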
Yesterday we put up our official benchmark results for the GD Machine 2014 on Far Cry 4, but to give a feel for how it plays we've grabbed some footage of it running at both 1080p and 900p resolutions. In yesterday's benchmarks the 750 Ti coped pretty well, much better than in Assassin's Creed Unity at least, but if you want to get some 1080p gaming going you might need to dial back a few options.
For those unfamiliar with the GD Machine 2014, the 750 Ti is ably backed up by an i5-4670K and 8GB of RAM. Basically it’s got the recommended requirements for Far Cry 4 covered when it comes to memory and CPU, but the GeForce GTX 750 Ti is a fair chunk weaker than the recommended GeForce GTX 680.
Roll up, roll up! It's the next in the seemingly never-ending line-up of Ubisoft releases hitting over this three-week period. Far Cry 4's out and it's caused quite a stir, crash-landing amid the PC gaming community before razing everything in sight with a handily placed flamethrower.
It's safe to say Far Cry fans are loving it, and what's not to love about riding an elephant while armed with an RPG? You armed, that is, not the elephant; they probably haven't got the dexterity for it, even if some of them do paint. We've already put Far Cry 4 through its paces on the GeForce GTX 670, but now it's the turn of the GD Machine 2014. Locked and loaded with a GeForce GTX 750 Ti 2GB, 8GB RAM and an Intel i5-4670K, it's primed and ready to (hopefully) give Far Cry 4 a roasting in some benchmarks.