Latest from PCGamer UK in Reviews
https://www.pcgamer.com
Sun, 29 Dec 2024 13:11:42 +0000

Ikea Utespelare desk review

The day gaming desks were renamed to battle stations, I knew marketing targeted at my demographic was always going to be just a touch batshit. Now you can get desks complete with RGB lighting, standing desks, desks that are actually a bed, and much more. While undeniably cool, that’s not always what a fair chunk of gamers are looking for in a desk. In fact, I’d argue a lot of us just want something that does the job of holding up our stuff while looking smart and being comfortable to sit at, all at a reasonable price.

IKEA’s Utespelare gaming desk absolutely nails that brief, but adds a few extra features that I’d argue are more battle worthy than an inbuilt rave experience.

The Utespelare comes in two colour options: an all-black version which feels very gamer chic, or a breezier grey with a light wood-look top. Both are otherwise identical, so I grabbed the lighter colourway to match my setup. Unsurprisingly it arrived as a flat-packed box ready to end relationships in its assembly.

Because the table top is largely one piece, it’s quite a large, unwieldy box and you need a bit of room to set it up. This includes flipping the desk after installing the table top, which was a mighty effort, but that’s to be expected for a desk this size.

Utespelare desk specs

IKEA Utespelare desk lit up at night

(Image credit: Future)

Max load: 50 kg
Width: 160 cm
Depth: 80 cm
Height: 68-78 cm
Features: Cable management hammock, metal mesh portion
Price: $270 | £129 | AUD$199

That’s the first thing to hit me about the Utespelare gaming desk: it’s really pretty big. Coming from standing desks, having a 160 cm by 80 cm workspace is actually huge.

You can easily fit a full-sized PC, a couple of monitors, routers, your mug-printing gear, and that old iMac you rescued out of the rubbish and have been working on getting up and running for fun, all without any trouble. Plus there’s a decent amount of space under it for more stuff, thanks to the relatively small footprint of the legs.

The official site says it has a 50 kg max tolerance, but I’m a mystery number over that and have sat on it for science while feeling pretty safe. I can even lie on it if I’m a bit curled up. I could probably put curtains around the bottom and rent it out in this dystopia.

There is just so much room for activities on this flat surface of opportunity.

IKEA Utespelare desk three quarter view

(Image credit: Future)

While putting the desk together there are a couple of choices to make. One is how tall you’re looking to have your desk, which is decided by the leg height. This can be adjusted from 68 cm to 78 cm, which is an important feature for ergonomics.

You’ll want to choose your height early though, because it’s not so easy to adjust once everything is all set up. It’s a little weird if you’ve come from the flexibility of a sit-stand setup, but with all the space on this desk it would be perfect for one of those desktop risers added later down the line.

IKEA Utespelare desk front hammock

(Image credit: Future)

The other choice is where to place the metal mesh portion of the desktop and how that relates to the cut-out in the wooden piece. After consulting the ancient Swedish hieroglyphics, I went with having the cut-out backing onto the metal, giving me a really tidy way to hide my cables when paired with the included hammock solution.

It’s one of the easier built-in cable management options I’ve seen to actually use: I can keep a power board there that’s easily accessible for swapping things around, while still hiding all of my mess.

Buy if...

You’re on a budget: The favourable price tag makes the sturdiness and practicality even more impressive.

You want a good, big desk: With a tabletop space of 160x80 cm, dedicated cable management, and plenty of space underneath, this desk has a lot of room.

You’ve got lights: I know a cool RGB-lit battle station is where it’s at, but you might be better off attaching your own aurora to this one.

Don't buy if...

You’re short on space: This desk took me by surprise with its dimensions once placed in reality, so do your measurements carefully before deciding on this one.

While being a helpful cable tidy, the metal portion of the desktop is also perfect for all my hot-running gamer stuff. It’s a great place to put your PC, knowing it’ll have all the airflow it could possibly want. I’ve also used it as a place to stick things like routers and other hot-running electronics. It’s great because they’re out of the way at the back of my deep desk, but I don’t worry about them overheating or being covered in dust.

While the Utespelare is billed as a gaming desk, it feels like a much more sensible product made with gamers in mind than the usual fare. It’s simple, smart, sturdy, and clean. When I walk past it sometimes I just look at it and smile, thinking to myself what a decent desk that is for AUD$199 and how useful it is while still looking quite good in my home.

I know that means I’m getting old, but it’s also a pretty clear sign that IKEA has delivered with this stealth battle station.

https://www.pcgamer.com/hardware/gaming-desks/ikea-utespelare-desk-review/
Sat, 28 Dec 2024 15:00:00 +0000

Asus ROG Harpe Ace Mini wireless mouse review

The ROG Harpe Ace Mini is the cousin of the carbon fiber ROG Harpe Ace Extreme, only far less extra. It's just 2 grams heavier than the 47 gram Ace Extreme, but rather than carbon fiber it relies on its minimal size to stay light and easily flingable.

Marketed toward pro FPS players, and anyone who needs absolute pinpoint accuracy in the games they play, the Harpe Ace Mini uses the same immense 42,000 DPI ROG AimPoint Pro sensor to prove without any question that your mouse isn't the thing making you miss all those headshots.

In hand, the Harpe Ace Mini feels almost nonexistent compared to the more MMO-focussed mice I'm used to. Next to the Razer Naga Pro this thing feels like it's filled with helium. That's partially thanks to the minuscule size, but also the lack of buttons. Where an MMO mouse might have buttons in the double digits, the Harpe Ace Mini packs just seven.

That's your standard left and right clickers, a clickable scroll wheel, plus two buttons on the left side and two on the underside. All have a satisfying click and most don't sound too hollow, bar the one on the back left side, which resonates a bit within the body. It's to be expected on a mouse this light, since there's next to nothing absorbing sound on the inside. And while that can make a mouse feel cheap, the Harpe Ace Mini manages to come off as sturdy. Pair it with the lovely finish and it's certainly a quality-built little rodent.

Harpe Ace Mini specs

The ROG Harpe Ace Mini side buttons

(Image credit: Future)

Sensor: 42,000 DPI
Polling rate: 8,000 Hz
Weight: 49 g | 1.72 oz
Size: 6.4 x 11.7 x 3.7 cm | 2.5 x 4.6 x 1.45 inch
Buttons: 7
Shape: Ambidextrous but with left-side buttons
Price: $130 | £130

The Harpe Ace Mini is mostly ambidextrous, in so much as it's semi-symmetrical. The side buttons still live on the left, so lefties will have to use their pinkie or whatever wizardry they perform to deal with our righty-dominated world.

There's definitely some thought that's gone into keeping the shape less curved to one side, at least. That may be better for lefties, but it means there's less ergonomic specificity for either. Pit it against something like the Logitech G502 X and its superior ergonomics and it comes up a little short, but then some will prefer a less curvy mouse design.

Image 1 of 2

The ROG Harpe Ace Mini top down

(Image credit: Future)
Image 2 of 2

The ROG Harpe Ace Mini side view

(Image credit: Future)

This thing is inoffensively small, which makes it great for my little hands, though I imagine players with bigger grippers would struggle a little with it. Not greatly, but if you're a palm grip user with large hands you might want to think twice before buying something so itty-bitty. That said, the size does go toward making this mouse all the more portable.

Speaking of portability, I've been really impressed with the Harpe Ace Mini's battery life. After using it for 12 hours over the 2.4 GHz wireless connection without charging or letting it go to sleep, the charge indicator still showed green.

Image 1 of 3

Charts showing ROG Harpe Ace Mini testing

(Image credit: Future)
Image 2 of 3

Charts showing ROG Harpe Ace Mini testing

(Image credit: Future)
Image 3 of 3

Charts showing ROG Harpe Ace Mini testing

(Image credit: Future)

Above: Tested at 1,000 Hz — The more erratic the dots are, the worse the tracking on the mouse.

A week of 3-4 hour gaming sessions a day later and it was still at more than half charge, even with the RGB lighting on. I haven't had enough time to drain the battery but, by my calculations, it would take me four weeks at this pace to fully drain it, minimum.
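For anyone wanting to sanity-check that back-of-the-envelope figure, here's a quick sketch of the sums. The session length and the remaining-charge reading are assumptions plugged in for illustration (the mouse only gives a coarse charge indicator), not measured values.

```python
# Rough battery-life extrapolation, assuming a roughly linear drain rate.
# The inputs below are illustrative assumptions, not measured figures.
hours_per_day = 3.5          # middle of the 3-4 hour daily sessions
days_used = 7                # one week of testing
charge_remaining = 0.75      # assumed fraction left on the indicator

hours_used = hours_per_day * days_used
drain_per_hour = (1 - charge_remaining) / hours_used
total_hours = 1 / drain_per_hour
weeks = total_hours / (hours_per_day * 7)

print(f"~{total_hours:.0f} hours total, or about {weeks:.1f} weeks at this pace")
# With these assumptions: ~98 hours, roughly four weeks
```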

As for the software, you'll need Armoury Crate Gear to play with button assignments, performance and power settings, but if you want to get going with scenario profiles and design your own macros you'll have to download the full Armoury Crate & Aura Sync.

Buy if...

You're in need of supreme accuracy: The ROG Harpe Ace Mini really is ace when it comes to accuracy. Its 8K polling rate and consistent 42,000 DPI sensor give it an impressive edge.

You don't want to have to charge up a lot: The battery life on the ROG Harpe Ace Mini is impressive for something so light. It'll last weeks over the 2.4 GHz wireless connection, and even longer if you opt for Bluetooth most of the time.

Don't buy if...

You're looking to save money: There are plenty of accurate mice out there that come in well under the Harpe Ace Mini's $130/£130 price tag. Wireless ones, too.

You prefer heavier mice: The Harpe Ace Mini is incredibly light, so light it's barely noticeable in your hand, which means it's not for those with a preference for heavier mice.

Frankly, the latter is a little much just to access macro options. Even the lite Gear version is quite a large file to download and install, which is a shame, and I've had a couple of issues with connection and setting changes being met with errors. Having to restart my machine at several junctures just to get the software to work isn't a good look, though when it does work it does all the basics well and the interface is nice and clean.

Performance-wise, the ROG Harpe Ace Mini is a stellar mouse. Hardly a waver when it comes to accuracy, and consistent tracking points all across the board show that this is one accurate rodent. Does it need to be this accurate for everyday use? Probably not, but with no tradeoff in terms of weight for a more impressive sensor, it's easy to appreciate the effort in this department.

The Harpe Ace Mini goes hard where accuracy is concerned, and while its attempts to stay ambidextrous cost it the more specific ergonomics of a right-handed mouse, I can really see it taking off for pro gamers and claw grippers who prefer light mice, as long as they don't mind downloading the huge Armoury Crate software just to mess with macros. With impressive battery life, anyone wanting to take the Harpe Ace Mini on the go will be pleased they won't have to charge it for ages. Sure, there are cheaper wireless gaming mice out there with similar battery lives, but not many that also match the Harpe Ace Mini's impressive accuracy and weightlessness.

https://www.pcgamer.com/hardware/gaming-mice/asus-rog-harpe-ace-mini-wireless-mouse-review/
Fri, 27 Dec 2024 15:00:00 +0000

Corsair K70 Core review

Switches make a mechanical keyboard. Your chosen plank can have all the extra features and RGB in the world, 15 USB ports at the back and an FM radio built in, but if pushing those keys down feels like stroking the back of a porcupine, it’s not going to become a favoured gaming companion.

Lucky, then, that Corsair has been thinking a lot about keyboard switches recently. Exhibit A in this investigation is the set of Hall effect magnetic switches in the K70 Max, which went down quite well in the PC Gamer review. In the K70 Core you’ll find something slightly different: pre-lubed Corsair MLX Red linear mechanical switches, which are making their debut here, but still with a whole lot of sound-damping foam, just like the K70 Max.

Core keyboards are a step down from the top-of-the-line Pro boards in Corsair’s range. This doesn’t exactly make them budget models, as the price is still higher than that of truly cheap examples, but like a politician desperate for re-election they focus on the core values that make a keyboard great.

These are: having keys? Check. Flippy feet at the back? Rubber pads at the front? Check and check. USB connection? Check (though a wireless version is also available for the profane and corrupt). Absolute dust and fingerprint magnet? Check. Little dial in the top corner that looks neat but you’ll probably never use? Check... though we’re diverging from core values and into oppugnant morality here.

K70 Core specs

The Corsair K70 Core close up d pad

(Image credit: Future)

Size: Full with numpad
Connectivity: USB 3 wired, 6ft cable
Keycaps: Double-shot ABS
Switches: Corsair MLX Red (linear)
Hot-swappable: No
Media controls: No
Lighting: Full RGB per-key
Software: iCUE
Price: $100 | £90

Otherwise, it’s the usual mix of aluminium and plastic, RGB LEDs and a smidge of flash memory (enough for five profiles). It also uses a USB wired connection that sees a rubber-coated cable emerge from the rear-left of the chassis and make its way round to the back of your PC. A USB hub would have been nice here, but we don’t get one. The cable has a plug on the end that looks for all the world like a USB-C plugged into a USB-A adapter, with a grippy bit to allow easy pulling off. However, after a heavy session of vigorous tugging it remained stubbornly attached. It seemed a bit extreme to get the pliers out, so we’re just going to have to accept that Corsair made it look like that on purpose.

But there’s also foam, and it goes a long way to soundproofing the board. You’ll still get a clicky sound if you bottom out the keys, but there’s no case ping (probably thanks to the foam) and the lubed switches mean actuations are silent too, though each key does still make a sound as it returns to its resting position after a press. While it’s never going to be as silent as a membrane keyboard, if you really try you can switch between a glorious mechanical noise and a reasonably quiet typing action.

The Corsair K70 Core close up USB

(Image credit: Future)

Performance is excellent, with a 1,000 Hz report rate and full-key rollover with anti-ghosting. The important thing here is the feel of the keys, and they are superb. They sit fairly high, mushrooming out of the deck as if left behind by frolicking fairies in the night, their stems clearly visible if you hold the board up and peer in at the edge. Push down on them and you experience a smooth descent to the baseplate, with a 45 g actuation force and a pre-travel distance of 1.9 mm, before bottoming out at 4 mm. There’s a degree of travel to play with there too, if you’re the sort to half-press a key while lining up a shot.

There are no clever optical or adjustable actuation systems here, and you can’t hot-swap anything, but there is a magnetic wrist rest in the box with a strange rough texture that makes you glad you’re not touching it with your fingers.

By default, the dial adjusts your PC’s volume, and there’s an iCUE button next to it that maps to play/pause, as there are no dedicated media keys. Much of the functionality can be customised in the iCUE software, which, as well as remapping keys and cycling through RGB lighting schemes (there’s a keyboard shortcut for dimming, which is much faster than opening up an extra app), can also record macros and change between presets for the control dial.

The Corsair K70 Core close up dial

(Image credit: Future)

That dial is an exception to the ‘everything is customisable’ rule, but you can use it for brightness, scrolling and zooming as well as volume. It’s also an exception to the slippery lubed feel of the board, as not only does it have a textured surface and yellow highlight paint that’s not found anywhere else, but it jerks around with a tinny click that’s so different from the rest of the keys it’s almost as if it came from another keyboard entirely.

The iCUE button next to it is also the only key on the board not balanced on a high switch, so you have to dive over the surrounding keys to find it, and you don’t get satisfying feedback from pressing it.

The Corsair K70 Core underside

(Image credit: Future)
Buy if...

You’re replacing a membrane keyboard: The K70 Core doesn’t come with much customisability, but it will feel like a step up from just about anything you’ve used before.


You want something easy to type on: The K70 Core is breezy to bash out an essay on, and won’t make too much noise.

Don't buy if...

You’re looking for something more like the K70 Max: This isn’t a flashy feature board, but one designed to be a pleasure to use.

With storage for five profiles on the board itself, FN+F2 can be used to switch between them, but if you’ve got a load of iCUE devices flashing away on your desktop it won’t communicate with them, forcing you to use software profiles.

Fire up the app and you get access to Corsair’s Mosaics, which are pre-set lighting patterns that are simple to activate. Community-created Mosaics can be found on a digital store, which thankfully has a degree of curation, as even the Romans knew the levels of detail and lewdness that could be contained in a simple arrangement of square blocks.

The lighting patterns and customisability are nice to have, but the reason you’ll buy the K70 Core sits between the keycaps and the baseplate. These silky-smooth switches are a big draw for the board, and despite having a limited feature-set in a competitive market segment (and with it being possible to pick up a mechanical gaming keyboard for much less) its quietness and efficiency of operation mean the K70 Core is still able to stand out.

https://www.pcgamer.com/hardware/gaming-keyboards/corsair-k70-core-review/
Thu, 26 Dec 2024 15:00:00 +0000

Xiaomi G Pro 27i review

There’s a funny thing about monitors. As you progress from a 14-inch CRT goldfish bowl to a 17-inch screen you can barely lift, to a 19-inch 4:3 LCD, to a 21-inch 16:9 1080p VA model, to a 27-inch 4K IPS and eventually a 32-inch 4K with HDR or an OLED, the ‘normal’ setting in your mind shifts with it.

Use a 32-inch monitor every day and you’ll soon forget that it’s the sort of size that would have been considered excessive in a living room TV not so long ago. Stepping down to a 27-inch monitor after getting used to a 32-inch feels like going back to that 14-inch that buzzed and needed to be degaussed once in a while all over again.

And then you remember that this screen only costs £300/$350, and suddenly any sense of lingering disagreeableness passes. A mini-LED monitor for this kind of money is remarkable enough, with comparable models from Cooler Master or AOC going for twice the price, but the G Pro 27i also sports Quantum Dots, a fast 180 Hz refresh rate (with FreeSync), and a veritable plethora of inputs too.

For UK readers, there's also a three-pin plug on the power adapter. Previous Xiaomi screens have been sent out for review with US-style two-pin plugs that require an adapter, though work perfectly well on UK voltage, and it’s nice to see that this practice has stopped. Though as the screen requires its own power brick and the cable on it isn’t spectacularly long, you’ll need to keep it reasonably close to a power socket.

Xiaomi G Pro 27i specs

The Xiaomi G Pro 27i front on

(Image credit: Future)

Screen: 27-inch mini-LED backlit IPS
Resolution: 2560 x 1440
Refresh rate: 180 Hz
Response time: 1 ms
Brightness: HDR1000
Connectivity: 2x DisplayPort 1.4, 2x HDMI 2.0, 3.5 mm audio
Dimensions: 613 x 169.5 x 526.5 mm including base
Weight: 6.8 kg including base
Price: £300 | $350

The mini-LED backlight is split into 1,152 local dimming zones, each made up of four mini-LED beads, which means it’s mostly free of the haloes you can get from screens with larger zones. You’ll instantly realise that this means 3,200 pixels per zone, which at a pixel density of 109 ppi isn’t a large area; assuming the zones use the same 16:9 ratio as the screen itself, you’re looking at each one being around 75 x 42 px. This is still much larger than the individually lit pixels of an OLED, but much better than traditional backlights.
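If you fancy checking that zone arithmetic yourself, it’s simple division; this is just a quick sketch of the working, using the screen’s resolution and size rather than anything Xiaomi publishes about its zone layout.

```python
import math

# Back-of-envelope check on the dimming zone figures quoted above.
width_px, height_px = 2560, 1440
zones = 1152
diagonal_in = 27

pixels_per_zone = width_px * height_px / zones          # 3,200 px
ppi = math.hypot(width_px, height_px) / diagonal_in     # ~109 ppi

# Assume each zone shares the panel's 16:9 aspect ratio.
zone_h = math.sqrt(pixels_per_zone * 9 / 16)            # ~42 px
zone_w = zone_h * 16 / 9                                # ~75 px

print(f"{pixels_per_zone:.0f} px per zone, {ppi:.0f} ppi, roughly {zone_w:.0f} x {zone_h:.0f} px each")
```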

There's still a little bit of bloom, though, especially around bright objects set against a dark backdrop. It’s very bright, unnecessarily so in fact, and most of the time you’ll want to turn the brightness down, something easy to achieve thanks to Xiaomi’s sensible implementation of the OSD controls. Place it next to a standard IPS panel and the difference is clear, with increased brightness and contrast.

The Xiaomi G Pro 27i close up

(Image credit: Future)

It also has excellent colour reproduction, claiming to display 99% of the DCI-P3 colour gamut, which covers around 25% more colours than the more common sRGB. Combined with the HDR1000 certification, this should lead to an extremely vibrant result if you’re keen on playing games that have colours in them. Tests with a colorimeter bear this out, showing 98% coverage of P3 and a maximum brightness of 690 nits. That’s better than many OLEDs and a heck of a lot better than the usual figures we see at this price point.

The panel is the headline feature, but there's plenty else to like here. The G Pro 27i is a screen that’s meant to be admired from all sides. As a result, there's a lighting ring around the point at which the stand clips into the back of the screen itself (its colour controlled from the OSD), and a cover that will hide the bit where the input cables plug into their sockets (though you’ll still be able to see the cables themselves snaking away).

The Xiaomi G Pro 27i base close up

(Image credit: Future)

The foot, unusually, attaches to the vertical part of the stand with four screws (a driver is provided) instead of a single thumb-turned attachment, which feels rather old-fashioned, but it’s not the sort of thing you’re likely to do more than once or twice in the monitor’s lifetime and leads to a very stable configuration.

Having four video inputs is very nice to see on a gaming monitor, and the sort of setup that uses all of them is probably somewhere in our dreams. Desktop and laptop PCs, games console and streaming stick, perhaps? As the only screen in a dedicated gaming room or bedroom it’s good to be able to hook up multiple devices without needing to rely on switchers, though you will need to connect something to the audio socket as there are no built-in speakers on the Xiaomi screen.

The Xiaomi G Pro 27i back

(Image credit: Future)

The HDMI ports hit version 2.0, so are limited to 144 Hz at 1440p, but the DP 1.4 connectors can really let the pixels flow. The only thing that’s missing is a USB-C connection, and if you’ve become used to switching a USB hub between a couple of PCs, then it can be a wrench to go back to doing things the old-fashioned way.
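A rough bandwidth calculation shows why that 144 Hz HDMI ceiling exists. HDMI 2.0 carries 18 Gbps in total, of which around 14.4 Gbps is usable video data after encoding; the blanking overhead below is an assumed ballpark for reduced-blanking timings, so treat this as a sketch rather than the monitor’s exact timings.

```python
# Approximate bandwidth needed for 1440p at 8-bit RGB over HDMI 2.0.
HDMI20_DATA_GBPS = 14.4        # usable data rate after 8b/10b encoding
BITS_PER_PIXEL = 24            # 8 bits per channel, RGB
BLANKING_OVERHEAD = 1.07       # assumed ~7% overhead for reduced blanking

def required_gbps(width, height, refresh_hz):
    pixel_rate = width * height * refresh_hz * BLANKING_OVERHEAD
    return pixel_rate * BITS_PER_PIXEL / 1e9

for hz in (144, 180):
    need = required_gbps(2560, 1440, hz)
    verdict = "fits" if need <= HDMI20_DATA_GBPS else "exceeds"
    print(f"1440p @ {hz} Hz needs ~{need:.1f} Gbps, which {verdict} HDMI 2.0's ~{HDMI20_DATA_GBPS} Gbps")
```

With those assumptions, 144 Hz squeezes in at around 13.6 Gbps while 180 Hz needs roughly 17 Gbps, which is why the DisplayPort 1.4 inputs are the ones to use for the full refresh rate.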

Buy if...

You want a great screen for a decent price: There may be flashier monitors out there with more features, but if you just want a fast 1440p gaming display with high contrast and brightness, this is well worth a look.

Don't buy if...

You're desperate to go 4K: The only downsides of the G Pro 27i are things it doesn’t try to provide. If you want a 32-inch 4K OLED, go buy that instead.

And that’s one of the big things about the G Pro 27i. In a world of OLEDs, it uses mini-LED. Where USB-C is the up-and-coming video connector, it sticks with DisplayPort, and uses HDMI sockets that can’t feed it the max refresh rate. It doesn’t have anything like a built-in webcam or even speakers.

While other monitors have become hubs around which to organise your PCs and other devices, this is something more pure: a display, and nothing else. Happily, displaying things is something it’s good at, and as it’s available at a low price for a mini-LED panel we perhaps shouldn’t be too dismissive of its more focused approach. Stick one (or a pair!) of these on your desk, hook it up over DisplayPort, and you’ll have a fast, bright, colourful PC gaming experience. And isn’t that really all we want?

https://www.pcgamer.com/hardware/gaming-monitors/xiaomi-g-pro-27i-review/
Wed, 25 Dec 2024 15:00:00 +0000

Corsair MP700 Elite 2 TB NVMe SSD review

Categorically, what is the biggest problem with PCIe 5.0 drives? It's the heat. At launch, and even now, those early 5.0 units complete with the Phison E26 controller and Micron 232-layer TLC run seriously hot, to the point it was almost impossible to run one without some form of active cooling baked in. Corsair wasn't immune to this either, despite its street cred as a storage manufacturer first and foremost, and its MP700 line initially featured the reference Phison cooler, complete with built-in fan, and a touch of yellow branding to try and separate it from the crowd.

A lot's changed since then, and we've seen a whole host of non-fan-cooled PCIe 5.0 solutions arrive with us. However, heat has still always been a major concern.

That is what the MP700 Elite looks to rectify in its bold and brave quest to become one of the best SSDs out there today. In short, this is an exceptionally cool PCIe 5.0 drive that not only delivers relatively comfortable performance on the sequential front but does so with both a low power draw and radically lower temperatures as a result. I'm not saying you can run this without a dedicated heatsink just yet, but we're getting close. That does however come with some drawbacks.

As for the hardware, Corsair's built the MP700 Elite around Kioxia's latest 218-layer BiCS8 TLC NAND, combined with Phison's E31T controller. That does mean it comes without any DRAM cache or buffers, but honestly, that's not a huge concern given the raw throughput that Kioxia's NAND can deliver.

MP700 Elite specs

Corsair MP700 Elite SSD with its packaging on a desk.

(Image credit: Future)

Capacity: 2 TB
Interface: PCIe 5.0 x4
Memory controller: Phison E31T
Flash memory: Kioxia 218-Layer BiCS8 TLC NAND
Rated performance: 10,000 MB/s sustained read, 8,500 MB/s sustained write
Endurance: 1,200 TBW
Warranty: Five years
Price: $260 | £235

It's also worth mentioning that this is a single-sided M.2 2280 design, making it pretty ideal for laptops and consoles, although if you do get the heatsink variant, note that it's too large to fit into something like a PS5 Pro (though you can easily remove the heatsink if you need to).

On launch, it's a somewhat limited choice for capacity, sadly. You can grab one of these in either a 1 TB or a 2 TB configuration, and that's it. The specific 2 TB model I've got on test here retails at a relatively respectable $260 ($265 with the heatsink), or £235 in the UK (£240 with heatsink). Unfortunately, there's no AUD availability just yet.

Right, the big thing I need to cover first is temperatures, because boy, is this quite literally a cool drive. At least compared to other PCIe 5.0 offerings. To be clear, I almost always try to test all of the SSDs I get in for review underneath the exact same heatsink with the same thermal pads. That's done on an Asus ROG Strix X870E-E motherboard. If a drive comes with its own heatsink, I strip it off where possible and run the bare drive on the board for the full testing suite.

Corsair MP700 Elite SSD with its packaging on a desk.

(Image credit: Future)

So, for comparison, the Seagate FireCuda 540, a relatively early PCIe 5.0 drive, topped out at 83°C during its benchmark run. The Crucial T700, which did use its own integrated heatsink, landed at 87°C. The MP700 Elite? 55°C. Ambient room temperature at the time of those tests was 24°C across all three SSDs. That is just a staggering drop in overall heat, and if we're generous and compare it only to the FireCuda 540, there's still a 33.7% difference between them.
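To be clear on how that percentage is derived, it's simply the relative drop in peak temperature; here's a quick sketch of the sums using the figures above.

```python
# Peak temperatures from the benchmark runs quoted above, in degrees C.
firecuda_540 = 83
crucial_t700 = 87
mp700_elite = 55

for name, peak in (("FireCuda 540", firecuda_540), ("Crucial T700", crucial_t700)):
    drop_pct = (peak - mp700_elite) / peak * 100
    print(f"MP700 Elite peaks {peak - mp700_elite} C lower than the {name} ({drop_pct:.1f}% down)")
```

Run the same sums against the T700's 87°C instead and the gap stretches to nearly 37%.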

This is all thanks to Phison's latest E31T controller. In short, it's basically a pseudo-evolution of the E26 found in the bulk of 5.0 drives to date (Teamgroup's Z540 being a good example). Although it lacks any DRAM and features half the channels and bandwidth, it's wildly more efficient than the original controller, thanks to Phison moving the manufacturing process from a 12 nm FinFET solution to TSMC's 7 nm N7 process instead. With half the number of channels, power draw has been cut significantly as well, and that leads to a significant drop in overall temps. Certainly compared to drives like Crucial's T700 or Gigabyte's Aorus Gen5 12000.

As for the numbers game, general sequential speeds are about what we saw with the initial PCIe 5.0 launch, albeit with one exception. CrystalDiskMark manages 10,197 MB/s on the read, but 8,608 MB/s on the write (the latter quite a bit slower than the FireCuda 540 and Crucial's T700).

Where the MP700 Elite holds its head up, however, is in the random 4Ks. It dominates that field, with 88 MB/s on the read and a whopping 336 MB/s on the write, pipping both of our other PCIe 5.0 drives to the post. As for in-game performance, it basically sat in the middle of the pack, landing a load time of 7.426 seconds in the Final Fantasy XIV: Shadowbringers benchmark.

PC Gamer test bench
CPU: AMD Ryzen 9 9900X | RAM: 64 GB (2x32GB) Team Group T-Create Expert DDR5 @ 6000 C34 | GPU: Nvidia GeForce RTX 4080 Super | Motherboard: ASUS ROG Strix X870E-E Gaming WiFi | CPU Cooler: Asus ROG Ryujin III 360 ARGB Extreme | PSU: 1200W NZXT C1200 (2024) 80+ Gold | Chassis: Geometric Future Model 5

The real kicker is the price. There's a lot of tech and hardware featured here that's relatively new to the playing field. Whether that's Phison's E31T controller, which landed with us in September 2024, or Kioxia's latest 218-layer BiCS8 NAND, it all costs money up front to bring to the table, and despite this drive being pitched as a sort of "mid-range" entry-level option, it's got some tough competition from older hardware that, right now, just kind of works.

Image 1 of 2

Corsair MP700 Elite SSD with its packaging on a desk.

(Image credit: Future)
Image 2 of 2

Corsair MP700 Elite SSD with its packaging on a desk.

(Image credit: Future)
Buy if...

✅ Random 4K performance and cooling are everything: The MP700 Elite delivers impressively potent random 4K performance, along with some phenomenally low temperatures due to improved power efficiency. That should translate well in game.

Don't buy if...

❌ You're looking for the fastest sequential drive around: With 8 GB/s on the write and 10 GB/s on the read, it lacks the sequential grunt of other, older and cheaper PCIe 5.0 drives.

Crucial's T700 is a fine example of this (and it's not alone either). If you go for the non-heatsinked variant, at the time of writing you can pick up 2 TB for just $210, and it's consistently been that price for the last four months (even lower during Black Friday). Corsair's MP700 Elite, on offer right now, is still slightly more expensive, despite dropping in price to $215. And to be fair, you can only grab that deal directly from Corsair's webstore. Admittedly, you can get the MP700 Elite with a heatsink for just $5 extra versus the $50 investment needed for the heatsinked T700, but, in reality, most folk buying this are likely just going to chuck it under a motherboard M.2 heatsink anyway, negating the issue.

Then there's the performance delta between those two drives. Although the random 4K numbers are higher for the MP700 Elite, those sequential numbers, particularly on write performance, are awkwardly lower by contrast. Depending on your workloads, that could be a real deal breaker.

Similar to graphics cards and CPUs, it feels like at this point the excess heat generated by most modern, less-efficient PCIe 5.0 drives has already been accounted for and designed around. Whether that's through better motherboard heatsinks or standard ones included with the drives themselves, it's no longer an issue. Although Phison's latest controller is impressive, it's technology that really should be utilized to better improve the performance of the next generation of PCIe 6.0 SSDs instead. Combine those facts with just how limited that extra performance is for gamers, and, well, it's a real tough sell.

Still, the MP700 Elite is a solid all-round performer. If you're looking for something a little cheaper and budget is a factor, then provided you can get this thing on offer, it'll deliver on its promise, and then some, all without breaking the bank.

https://www.pcgamer.com/hardware/ssds/corsair-mp700-elite-2-tb-nvme-ssd-review/
Tue, 24 Dec 2024 15:00:00 +0000

OneXPlayer OneXFly F1 Pro review

I'm a smitten kitten. And it's nothing to do with the fetching red clothing of the version of the new OneXFly F1 Pro I've been testing, either. Which is a good job, because this Evangelion EVA-02 version isn't available outside of China, so if that was the real kicker you guys would be out of luck.

No, the real kicker is that this is the first gaming handheld PC I've used, held, or tested that sports AMD's latest APU, the Ryzen AI 9 HX 370—the chip whose name I can rarely get right on the first try. Seriously, it's a curse, and I only ever remember the first bit because I know it's AMD desperately trying to make 'Ryzen AI' happen. Stop trying to make Ryzen AI happen.

The Strix Point silicon is a bit of a game-changer for handhelds, especially when you start to factor in all the other extras AMD has crafted that really play into the literal hands of PC gamers. Radeon Anti-Lag and Fluid Motion Frames 2 really are the key ones for handheld gaming, but any game that sports FSR 3 and its own per-game frame generation implementation helps, too.

Those are what set the OneXFly F1 Pro apart from any other gaming handheld you could care to mention, because of how the HX 370 extends performance over the competition. Mind you, the $1,339 price tag will also set it apart. That's the sort of money that will get you a full RTX 4070 Super gaming PC and still leave you change enough to buy yourself a decent 1080p gaming monitor, too. So yeah, you've got to really want the form factor and performance to consider dropping that sort of cash on a handheld.

F1 Pro specs

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)

APU: AMD Ryzen AI 9 HX 370
Cores: 12
Threads: 24
GPU: Radeon 890M
Compute Units: 16
RAM: 32 GB LPDDR5X-7500
Storage: 1 TB Acer N7000
Battery: 48.5 Wh
Weight: ~599 g
Price: $1,339

But mobile gaming is expensive; high-end handheld PCs doubly so. That's where Valve came in and played a blinder with the Steam Deck; it took Nintendo's Switch smarts, picked a lower spec chip, and stuck to a price point. Asus and Lenovo, with their own manufacturing might, have managed to bring prices down for their own performance devices, but smaller scale manufacturers, such as OneXPlayer and Ayaneo, seemingly cannot compete on that front and so you get pricing that feels way beyond acceptable.

What you are getting in the OneXFly F1 Pro, however, is a stellar little gaming device with performance to match its aesthetics. Though it is worth saying these are the same aesthetics with which the original OneXFly F1 was adorned.

That is no bad thing, because the slightly rubberised texture and smooth curves of the chassis feel great in the hands. And, while it is relatively weighty—coming in around the 600 g mark—its ergonomics and balance make it feel like one of the best designed handhelds I've used. It's created for the gaming long-haul and I've not had any of the hand cramping I get with the Steam Deck or other larger devices.

Okay, I say it's created for the long-haul, but that sadly does not extend to the OneXFly's battery. I want to get it out of the way up front: this is the biggest issue with an otherwise beautiful little gaming handheld. Outside of the upgraded APU, the 48.5 Wh battery was the only thing that I really wanted to change from the original design. It's far too small to deliver a convincing long-term gaming experience with the OneXFly.

Basically, you are absolutely going to need to get yourself an external power pack to be able to reliably enjoy this thing for more than an hour and forty minutes. I wish there had been a way to fit a higher capacity battery in that chassis, but it's so tightly packed in there that ain't going to happen.

Image 1 of 5

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 2 of 5

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 3 of 5

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 4 of 5

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 5 of 5

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)

The good news is that the Ryzen AI 9 HX 370 (oof, nailed it first time) absolutely slaps inside a handheld gaming PC. From our time testing it in laptop form, that's come as no surprise, but I will say the level of gaming performance you can get out of it running at just 15 W is pretty jaw-dropping. Sure, that's only made real by the twin pillars of upscaling and frame generation, but being able to hit between 43 and 52 fps in Star Wars: Outlaws at an upscaled 1080p resolution feels great.

Especially when, running at the same 15 W level with the same settings, the original OneXFly F1—with its Ryzen 7 7840U APU—is only capable of knocking around the low 20s in the fps stakes.

That sort of performance delta looked unlikely, however, when I was first doing my comparative testing against the older Ryzen APU. Look at most of our gaming performance numbers and you'll see that in general you're only getting a handful of fps between them; the same is true even when you start throwing upscaling into the mix, too.

There's the odd outlier, such as F1 24 and Hitman, where you're looking at around a 10 fps margin in favour of the Strix Point handheld, which is definitely more significant. But otherwise the 16 compute units of the Radeon 890M inside the HX 370 APU, compared with the 12 compute units of the Radeon 780M GPU of the Ryzen 7 7840U (and 8840U), don't seem to amount to a hill of beans/frames in most games.

Where that changes is when you enable frame generation in any form. Instantly there's a bigger performance delta, and especially so when you start to pull back on the power you let the APU draw in the first place. I feel that's largely because the CPU cores in the F1 Pro are running pretty slowly in the grand scheme of things; where it's capable of 5.3 GHz boost clocks, when pushed the F1 Pro is mostly just running around the 3.3 GHz mark and below.

It's not like the games are going to be CPU-limited on the new OneXFly, but the use of frame generation helps take the load off the CPU a little and lets those extra compute units make more of a difference when it comes to gaming performance.

The main takeaway, though, is that you can be running most games at 15 W, with either per-game frame generation or Fluid Motion Frames 2, and see great gaming performance that is both smooth and responsive.

Image 1 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 2 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 3 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 4 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 5 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 6 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)
Image 7 of 7

OneXPlayer OneXFly F1 Pro handheld gaming PC

(Image credit: Future)

However, even at 15 W in Star Wars: Outlaws I was seeing the battery drain at almost exactly 1% per minute. That's going to get me more game time than the PCMark gaming benchmark delivers at 30 W—in testing that's just 68 mins—but crucially not twice as much.
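Put some numbers on that drain rate and the comparison is easy to see; the average-power figure below is a derived estimate from the 48.5 Wh capacity, not something OneXPlayer quotes.

```python
# Battery maths for the Star Wars: Outlaws run at the 15 W APU setting.
battery_wh = 48.5
drain_pct_per_min = 1.0        # observed drain rate at 15 W
pcmark_30w_minutes = 68        # PCMark gaming runtime at 30 W

runtime_min = 100 / drain_pct_per_min                  # ~100 minutes
avg_system_watts = battery_wh / (runtime_min / 60)     # whole-device average draw

print(f"~{runtime_min:.0f} min at 15 W vs {pcmark_30w_minutes} min at 30 W "
      f"({runtime_min / pcmark_30w_minutes:.2f}x, not 2x)")
print(f"Implied average system draw: ~{avg_system_watts:.0f} W, screen and all")
```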

It's worth saying that not every game is going to be as intensive as Star Wars: Outlaws or a modern 3D title such as Elden Ring. Throw something more lightweight, such as OlliOlli World or Lonely Mountains: Downhill, into the mix and you're going to see that battery life stretch much farther.

The new AMD APU is one of the main reasons I'm so smitten with the OneXFly F1 Pro, but not the only one. That 7-inch 144 Hz OLED panel has also got my attention. Running at 50% brightness it's still got plenty about it, and throwing it all the way up makes it look just stunning because of that 800 cd/m2 peak luminance. The contrast is obviously exquisite, but the colours also sing, and both the refresh rate and OLED response time make gaming feel great, too.

Buy if...

You want peak handheld performance: The extra cores and CUs of the Strix Point hardware make this the most powerful gaming handheld around.

You want a compact handheld: The diminutive design feels great in the hand and isn't going to take up too much space in your luggage either.

You want connections: With a pair of USB4 sockets and a full Type-A port it's easy to plug things into the device even while charging, and means it can become a full PC without too much docked trouble.

Don't buy if...

You're after an affordable handheld: There are options with 80% of the performance for pretty much half the price of the F1 Pro. It's a great little device, but you've got to be prepared to pay full gaming PC prices for the privilege.

You were hoping for many hours of battery life: Using the same 48.5 Wh battery as the original means you're getting a pretty short up time. But it can game happily at 15 W, which massively helps eke that out.

I've also got the 32 GB RAM / 1 TB SSD version of the F1 Pro in for testing, which makes it a very good PC, too. Combined with the fact that the Acer SSD in question is pretty rapid (7,300 MB/s and 6,600 MB/s for sequential read/write performance) and that you get two USB4 Type-C connections and a full-size USB 3.0 Type-A port on top, you could happily dock this bad boi to a monitor and have a fully functional PC without much messing around at all.

People will keep saying what a mess Windows 11 is on a handheld, and while, yeah, it's not a touchscreen OS—especially not on a small screen—set the thing up to boot directly into Steam's Big Picture mode and you're not a million miles off SteamOS functionally.

The F1 Pro has all the extra configurable physical buttons you could want, the OneXConsole application has matured a lot and, while it's still not as user-friendly as the excellent Ayaneo software, it's got all the functionality, especially now there are performance profiles you can make and switch to on the fly.

It's a size that delights, too. I liked the Ayaneo Kun for its big screen and extra touchpads, but it's a lump to lug about. With the OneXFly it's just the size of that 7-inch OLED screen and the pads either side. The bezels are slim and the device relatively diminutive, if a little chunky. But, y'know, reassuringly chunky.

So yes, you can colour me a big fan of the new OneXFly F1 Pro. For me it's one of the best handheld gaming PCs I've used, combining functionality with form and performance. I love the fact I can scale back the APU to such an extent and still get great gaming performance out of the device, and it is absolutely my favourite aesthetic of all the handhelds I've used. Though, to be fair, the gorgeous Ayaneo Flip DS has a lot of appeal, too.

But there are still two big things letting the OneXFly F1 Pro down: the sky-high price and the weak, weak battery. With an external power pack you can combat one of those, but there's nothing anyone but OneXPlayer can do about the other.

https://www.pcgamer.com/hardware/handheld-gaming-pcs/onexplayer-onexfly-f1-pro-review/
Mon, 23 Dec 2024 13:15:22 +0000

Turtle Beach Stealth Pivot review

Turtle Beach has decided it likes to put screens on its gamepads. That's fine: the LCD display on the excellent Turtle Beach Stealth Ultra was a lot more useful than I thought it would be. When it comes to the new Stealth Pivot, the screen isn't even the chief novelty. No, the big novelty here is the Pivot's unusual approach to modular design.

Let's clarify that, though. While modular, the Pivot isn't coming for the Scuf Instinct Pro's lunch, nor does it have the versatility of something like the Victrix Pro BFG, which boasts unscrewable swappable pieces. No, the Pivot is really just two things: It's a conventional gamepad out of the box, but literally hiding underneath its default configuration is a gamepad made for fighting games, or arcade games, or any other kind of controller-centric game that doesn't require analog sticks.

In other words, the analog sticks, d-pad and face buttons can be "pivoted" to reveal an alternate pad configuration hidden within the controller itself. It's a neat setup.

Once you've lightly twisted down the analog sticks and toggled a lock switch at the rear, it's just a matter of giving either module a little push, revealing its fighter-friendly cousin beneath. It's reminiscent of gamepads like the aforementioned Victrix Pro BFG Controller, but since the Pivot forgoes screws it arguably takes a less fussy approach (though it does lack the freedom of, say, adjusting the analog sticks between symmetrical and asymmetrical placement).

Turtle Beach Stealth Pivot specs

Turtle Beach Stealth Pivot with alternate layout displayed

(Image credit: Future)

Compatibility: Windows 10 and 11, Xbox (wired only), Android
Connectivity: 2.4 GHz wireless, Bluetooth
Ports: USB-C, 3.5 mm stereo headset jack
Thumbstick layout: Asymmetric
Weight: 298 grams
Dimensions: 120 x 160 x 64.2 mm
Price: $129.99 | £119.99 | AU$249.95

Perhaps it's better to think of the Pivot less as a customisable gamepad and more as a hybrid one, and this blurry identity extends to its "pro controller" chops. The analog sticks use drift-free Hall effect tech, which you should consider essential in any modern controller, and the trigger buttons have adjustable stops, meaning you can change the depth of their presses. It comes with a 2.4 GHz wireless dongle and also supports Bluetooth connectivity. If you have an Xbox, the Pivot needs to be wired with the included USB-C to A cord.

Where pro features are concerned, the P or "paddle" buttons are where that aforementioned blurriness comes in. There are four in total but only two on the rear. The other two P buttons are face buttons, contributing to the six-button layout of the right pivoting module. I've never really used all four rear P buttons on a controller at once, but if you do, and you need them all on the rear, the Pivot won't do that. Up to five profiles for these P buttons can be stored locally and changed on the fly, either with the lil' screen on the gamepad itself or using Turtle Beach's Control Center app. Each of these profiles can also store different configurations for analog stick deadzones and trigger sensitivity.

Based on this, you can probably already see that the Pivot has a fairly niche use case, but if you happen to lie within that niche it may be a godsend. Rather than forking out for a premium gamepad and a fight stick, you can buy this and effectively get both. If you happen to play a lot of arcade and retro games, but also like your analog sticks for modern blockbusters, the Pivot is perfect. If you aren't either of these people, though? You may be better off with something else.

For all its fight stick credentials the Pivot does have one drawback: it doesn't have tactile microswitches. To be clear, all P buttons here are microswitches, but compared to the eminently clicky buttons on the Stealth Ultra—which feel more like mouse clicks than gamepad presses—all buttons on the Pivot have the same slightly mushy feel of a normal Xbox controller. Forgoing tactile switches is an odd choice; in my opinion their audible precision is perfect for fighting and arcade games, and while that's a matter for debate, the tactile switches on the Stealth Ultra were one of my favourite qualities of that pad.

Image 1 of 7

Turtle Beach Stealth Pivot

(Image credit: Future)
Image 2 of 7

Turtle Beach Stealth Pivot

(Image credit: Future)
Image 3 of 7

Turtle Beach Stealth Pivot

(Image credit: Future)
Image 4 of 7

Turtle Beach Stealth Pivot detail images

(Image credit: Future)
Image 5 of 7

Turtle Beach Stealth Pivot detail images

(Image credit: Future)
Image 6 of 7

Turtle Beach Stealth Pivot detail images

(Image credit: Future)
Image 7 of 7

Turtle Beach Stealth Pivot detail images

(Image credit: Future)

Another of the Pivot's nice 'n' niche features is the ability to assign the analog stick functions to either of the D-pads, should you prefer the exactness of digital inputs. Oh, and there's a slider beneath the Xbox button, which makes sense as a volume wheel, but can also be reassigned to be, for example, a mic volume wheel. The onboard screen lacks some of the functionality seen in the Stealth Ultra, such as the ability to tweak the sensitivity of the trigger buttons and analog sticks, but that can still be adjusted in Turtle Beach's Control Center desktop app, where vibration settings and RGB lighting can also be tweaked. Another feature of marginal use is the social media notifications, which work via a separate app on your smartphone and are compatible with Discord and a bunch of other social media platforms. Like it did on the Stealth Ultra, it feels like a superfluous feature that only sounds good in theory, but you may come to love it.

Buy if...

You love blockbusters, arcade and fighting games: The Pivot basically turns from a conventional controller into a small fight stick, making it brilliant for the likes of Street Fighter.

Don't buy if...

You don't think you'll ever need those alternative face buttons: If you don't fit into the (rather large) niche Turtle Beach is targeting here, you probably don't need to fork out.

I had no trouble at all with connectivity, even when I had a handful of pads connected via Bluetooth to a Steam Deck. Battery life is a chill 20 hours, but if that's not enough for you, rest assured it charges to 100% in under half an hour. The RGB implementation is minimal—not as weirdly excessive as its predecessor—with just two colourful bars on either side of the volume slider. And overall, the Pivot feels great in the hand: I like a heavy controller, so its roughly 300 grams felt good to me.

I really liked using the Pivot, and if you're going to make use of the swappable face buttons it's a brilliant pad. It's undeniably better than the vanilla Xbox controller—hall effect sticks, rebindable P buttons, adjustable trigger stops—but it's also double the price. Keeping that in mind, the Pivot is really for the folk out there for whom it will really feel like two controllers in one, and on those terms it succeeds. For everyone else, its older sibling the Stealth Ultra can usually be had for around the same price on sale.

https://www.pcgamer.com/hardware/controllers/turtle-beach-stealth-pivot-review/
Mon, 23 Dec 2024 10:02:53 +0000

Ikea Matchspel gaming chair review

Chairs can easily be one of the most important choices you make in life. A cheap, uncomfortable gaming chair can lead to all sorts of back pain and health problems, especially if you spend long gaming sessions in it. Gaming chairs known for their ergonomics don't come cheap, to the point where they're often prohibitively expensive.

This sucks, especially given the people who need them most have probably already spent a lot of their cash on things like medicine and doctors. Reckless fools.

While it's not groundbreaking for ergonomics, Ikea's Matchspel gaming chair is a PC throne that offers a fair amount of customisation at a very friendly price.

Honouring Swedish tradition, the Matchspel arrived at my door flat-packed in a cardboard box, ready to be assembled. It's a fairly easy setup, and like most modern gas-lift chairs it relies mostly on gravity and your juicy behind to keep itself on its feet. I didn't have any trouble putting this chair together almost entirely by myself.

Ikea Matchspel chair specs

Ikea Matchspel gaming chair back of headrest

(Image credit: Future)

Seat type: Mesh
Recline: Yes, not full
Weight capacity: 125.2 kg | 276 lbs
Max seat height: 59 cm | 23.22 in
Warranty: 3 years
Available colours: Grey or black/red
Price: $290 | £129 | AUD$249

There's a choice between black and grey colourways on these. The black sports a red trim, which is very ROG gamer, but I went with the light grey to match my desk.

I also wanted to see if it gathered dirt or discoloured with sweat or just contact. Thankfully, I am happy to report both the mesh backing and leatherette seat still look as grey as the day they arrived, despite much use, including a little sweating during heated gaming moments.

For a seat this affordable, there are a fair few settings to play with to dial in your own individual comfort.

When setting it up, the instructions tell you to leave some screws a bit loose so the back of the chair can move a bit with your body, and the lumbar support from this stretched fabric is pretty decent.

The downside is it feels a little rickety and is noisy when parts move, though not obnoxiously so. The headrest also has this mesh fabric backing and can have its height and angle adjusted quite dramatically to suit different heights.

Image 1 of 4

Ikea Matchspel gaming chair seat

(Image credit: Future)
Image 2 of 4

Ikea Matchspel gaming chair side on close up

(Image credit: Future)
Image 3 of 4

Ikea Matchspel gaming chair front on

(Image credit: Future)
Image 4 of 4

Ikea Matchspel gaming chair back of headrest

(Image credit: Future)

There's one lever under the seat that lets you adjust the height via the gas lift, and also the tilt of the back portion of the chair. Adjusting vertically is fairly standard and quite granular, but the tilt locks in at different set points. The recline doesn't go all the way back either, but it does let you lean a fair way for a quick relax.

When pushed in, that under-seat lever locks the adjustments so you don't accidentally change your comfortable seat in a heated gaming moment. The armrests are pretty standard plastic but can also be pulled up and pushed forward and back for further personalised comfort.

Ikea Matchspel gaming chair being put together plus dog overseeing events.

(Image credit: Future)
Buy if...

You want a good bet for cheap: The Matchspel is a pretty cheap computer chair for how pleasant it is and how many adjustment points it has. You could do a lot worse for the RRP of this seat.

Don't buy if...

You need seat tilt: While packed with a fair few customisable settings for the price, it does forgo that pelvic tilt many might find crucial for health, comfort, or both.

You're a larger human: The chair isn't rated for over 110 kg, and I think longer legs would find the depth of the seat wanting. Not necessarily a pick for the big and tall among us.

Despite a fair amount of customisation, especially for the price point, my biggest complaint with the Matchspel is still about ergonomics: there's no tilt for the bum cushion. To be fair, being able to tilt this forward and back would have made this chair a bit of a holy grail, as it's not the most common adjustment you find in computer chairs.

If a chair this cheap had it I would be shocked, but given all the other options I was a little hopeful.

Unfortunately, for a lot of people, especially women or other folks with pelvic pain problems or similar issues, this is a big deal, and it's arguably the most important part of a chair to be able to adjust.

Being able to adjust it on the fly might also have helped make up for the seat being quite firm under the buns, especially on long sessions. As it stands, I do notice that pressure starts to seep into slight discomfort after a little while seated.

That slightly personal (50% of the population) complaint aside, the Matchspel sports a fair few customisation options and is reasonably comfy, especially for a chair that only costs $290 / £129. It's on the lower-effort end when it comes to setting up hefty computer chairs, and looks business-appropriate in an understated way, especially in the light grey colour option.

It's gamer stealth, so you can pretend you're a professional who didn't totally just alt-tab out of Steam while on Zoom calls.

]]>
https://www.pcgamer.com/hardware/gaming-chairs/ikea-matchspel-gaming-chair-review/ w6Bg4GzUh3VpLw2af9eBP Fri, 20 Dec 2024 17:37:33 +0000
<![CDATA[ ASRock DeskMini X600 review ]]> If you're looking to do a high-performance scratch-build in miniature, the ASRock DeskMini X600 seems like an intriguing foundation. Most mini-PCs come with a pre-soldered mobile APU running the show, but the DeskMini doesn't ship with a chip. Instead, you can install any desktop AM5 CPU from AMD's Ryzen 7000 and 8000-series range, provided your chosen one doesn't break the 65 W TDP mark.

You'll also need one with integrated graphics and, assuming you'll be using the X600 for some level of gaming, you'll want the best, namely the Radeon 780M. With those thoughts in mind, we decided to test-drive the DeskMini with the Ryzen 7 8700G (4.2 - 5.1 GHz, 8 cores, Radeon 780M graphics, 65 W TDP, 95 °C TJMax) and the Ryzen 5 8600G (4.3 - 5 GHz, 6 cores, Radeon 760M graphics, also 65 W TDP and 95 °C TJMax).

The chassis is a breeze to work with. Undo the four screws on the rear and the internal tray slides out to reveal the motherboard. Within minutes we had the CPU, the bundled cooler, RAM and M.2 storage secured, ready to do the Windows installation dance.

The specs are a curious mix of the cutting edge and the weirdly legacy. It has support for current AM5 CPUs, DDR5 up to 96 GB @ 6400MHz (we installed 32 GB @ 5600MHz), and it can take a fast, PCIe 5.0 M.2 drive plus a second PCIe 4.0 drive. And yet, round the back, I was bemused to find... a VGA port? Perfect for playing Leisure Suit Larry on your CRT monitor, but thankfully, you get DisplayPort and HDMI as well. There's no WLAN/BT card out of the box though; you get an E-Key M.2 socket to install one, but you'll need to provide your own or dongle up.

specs

The Asrock Deskmini X600 IO ports

(Image credit: Future)

CPU support: AMD Socket AM5
Max. TDP: 65 W
Cooler support: Max height 47 mm
Memory support: 2x SODIMM DDR5-6400, Max. 96 GB
Rear I/O: HDMI 2.1, DP 1.4, D-sub (!?), 2.5 G LAN, 2x USB 3.2 Gen 1 Type-A
Front I/O: 1x Mic-in, 1x 3.5 mm audio out, 1x USB 3.2 Gen1 Type-C, 1x USB 3.2 Gen1 Type-A
Dimensions: 155 x 155 x 80 mm
Price: $190 | £190

I/O options are spartan, with a single USB 3.2 type-C, a USB 3.2 type-A and an audio jack up front. Round the back, there are just two more USB 3.2 type-A ports, a 2.5G LAN port, plus the aforementioned AV slots. Without a USB hub, you may find yourself juggling devices in and out.

The kit also comes with cabling to snug a pair of 2.5” SSDs to the underside of the mobo tray, but frankly, two M.2 drive slots seem ample; I'd much rather ASRock had forgone the SATA option and made space for a low-profile 120 mm fan somewhere. Because as it stands, there is no provision whatsoever for active case-cooling in the X600. Weirdly there are two fan headers, but that's a moot point; there's just no spare volume, or indeed mounting points, for a case fan.

The ASRock Deskmini X600 top down with peripherals

(Image credit: Future)

The kit ships with a CPU cooler, but it's tiny. The fan is a mere 65 mm and it has a big job to do, both in cooling the heatsink on a 65 W CPU and exhausting the resultant hot air out of the case. It doesn't bode well for your ears either; when fans of this size get busy, they make like tiny, angry server-farms.

However, there's space enough to fit an aftermarket cooler up to 47 mm in height instead. We tested the bundled cooler against Thermalright's AXP90-X47, a unit capable of handling higher-TDP chips and a popular choice with small form-factor builders, though we swapped the stock fan out for a Noctua NF-A9x14 to keep things civil.

Like many aftermarket coolers, the AXP90-X47 has a custom mounting system and CPU backplate, which means removing the stock AM5 heatsink-mounts from the business side of the mobo and unbolting the board from its mounting tray in order to remove the stock CPU backplate from its underbelly.

And let me tell you, that thing is not designed to be removed.

The Asrock Deskmini X600 three quarter view

(Image credit: Future)

The backplate was glued on so hard, I simply couldn't pry it off without risking damage to the motherboard, either from my pry-tool skipping, or the amount of flexure the board was going through as I tried to lever the backplate up and loosen the glue. Admitting defeat, I found that I could bolt the Thermalright cooler onto the existing backplate, but thanks to a curved metal lip running around the backplate's outer edge, it was extremely fiddly to grip and tighten the four hex-nuts which tension the cooler onto the CPU.

I got there in the end with a nice tight fit but, I won't lie, it was a royal, finger-numbing pain in the ass. So caveat emptor: while ASRock states you can fit an aftermarket cooler—and this box absolutely demands one—it's not necessarily a straightforward job. Closing the case up afterward, I also found it a very squeaky fit, with the fan-frame pushed flush against the intake grill of the case panel, which is a perfect recipe for noisy air turbulence.

So with all our parts in place, let's get into it. Paired with the Radeon 760M and Radeon 780M respectively, the 8600G and 8700G offer comparable levels of gaming performance to AMD's mobile APUs in our 1080p, medium-settings, AAA game-tests. While the 8700G outmatched its sibling and mobile peers in most cases, neither offered significantly better performance than, say, the Ryzen 7 7840HS and Radeon 780M combo running at a full 54 W, and the 8600G often fell below that.

The Asrock Deskmini X600 coolers

(Image credit: Future)

Most of what we threw at each CPU saw 30 fps+, except for Total War: Warhammer 3's intensive campaign benchmark, which left the 8600G panting to make 25 fps, and Forza, which only saw 27 fps in contrast to the 8700G's 44 fps. With a negligible core clockspeed difference between the two chips and both having ample cores for gaming, it's clear that the Radeon 780M enjoys a tangible performance advantage over the 760M.

Both CPUs ran indies and less-demanding titles very nicely, with the 8700G predictably ahead. The 8600G made 45-55 fps in Stray, to the 8700G's 50-70 fps. In Subnautica, the difference was 45-60 fps to 55-75 fps; in Soulstone Survivors and Bioshock Remastered however, both chips performed similarly, at around 45-75 fps and 140-200+ fps respectively. While the 8700G is more costly, it's the clear choice if you're looking to build a compact 1080p gaming system in the ASRock DeskMini X600.

But should you? Because without any active cooling in the case, which equates to completely unmanaged airflow, it's hard to keep the temperature down on these chips without running the CPU fan pretty hard.

Predictably, the bundled CPU cooler performs dreadfully under load. It'll stop a CPU hitting its TJMax, but DEAR LORD does it sing. At idle it's virtually inaudible, but the second it starts spinning up, it emits a high-frequency whine that only increases in pitch with the RPM until you're at risk of defenestrating your lovely new build.

The Asrock Deskmini X600 face on

(Image credit: Future)

This is exacerbated by the CPU fan's speed-stepping. It's an absolute sluggard in responding to temperature change. Given multicore torture-test loads, both CPUs jump straight to 90 °C, but incredibly, the 65 mm fan takes 4-5 minutes to slowly accelerate to its highest, most irritating pitch-intensity. Upon killing the load, we then measured a twelve minute interval until the fan wound down to blessed silence, with the temperature inching slowly down all the way. The fan and cooler combo just cannot dissipate heat well, and with no case fans to shove it out the door, that's what you get.

The Thermalright AXP90-X47 shows a marked improvement, both in noise levels and heat displacement, but it still steps too slowly by default. We avoided the hassle of jumping in and out of the BIOS to remedy this, and installed the free and excellent Fan Control app to experiment with custom curves and reduce the speed-step interval to 1 second. This resulted in a near-instant RPM response to temperature change.
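
For anyone curious what that custom curve actually does, here's a minimal Python sketch of the general idea: a handful of temperature points mapped to fan duty percentages, linearly interpolated and re-evaluated every second so the fan tracks load changes promptly. The curve values and the read_temp/set_duty hooks are illustrative assumptions for this sketch, not Fan Control's own configuration format or API.

# Minimal sketch of a custom fan curve: map CPU temperature to a fan duty
# percentage via linear interpolation between user-defined points, and
# re-evaluate on a short interval so the fan reacts quickly to load changes.
# Curve points and the 1-second interval are illustrative, not values taken
# from the Fan Control app itself.
import time

CURVE = [(40, 20), (60, 35), (75, 55), (85, 80), (95, 100)]  # (temp in C, fan %)

def fan_duty(temp_c):
    """Linearly interpolate a fan duty percentage from the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

def control_loop(read_temp, set_duty, interval_s=1.0):
    """Poll the CPU temperature and update the fan duty every interval_s
    seconds. read_temp and set_duty are placeholders for whatever sensor
    and fan-header interface your platform exposes."""
    while True:
        set_duty(fan_duty(read_temp()))
        time.sleep(interval_s)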

Buy if...

✅ Storage is your priority: With capacity for two SATA and two M.2 drives, the DeskMini X600 would make a great media server.

✅ Gaming is secondary: The X600 is best suited to cooler, lower-TDP CPUs.

Don't buy if...

❌ You demand silence: The X600 lacks the fan provision to exhaust heat, making the CPU cooler work twice as hard.

❌ You want an off-the-peg solution: It's a motherboard in a box, and needs a CPU, RAM, storage, WiFi, an aftermarket cooler, and probably a USB hub.

With a custom curve applied to the Noctua fan, we achieved a decent noise-to-cooling balance, where intensive gaming had the CPU hovering in the 80-85 °C range with the fan capped at a palatable 55% of its max RPM, albeit with noticeable turbulence thanks to the fan's proximity to the inlet grill. We simply couldn't achieve the same results with the bundled cooler, which squeals disagreeably all throughout its rev-range.

All of which leaves the DeskMini X600 rather in limbo. You can pack in a cutting-edge CPU and RAM, but there are no active cooling capabilities to compensate. You can achieve a performant and moderately quiet (though by no means silent) build using the Ryzen 7 8700G and an aftermarket cooler, but the costs soon stack up. There's the price of the chip itself, some RAM, some WiFi provision, and we'd class an aftermarket 47 mm cooler plus a top-tier quiet fan as requirements rather than options.

Add those components to the cost of the box itself and you're blasting past the price of a quality, APU-armed mini-PC with broadly similar performance, a cooling solution to escort heat from the case, and quiet operation. With two M.2 storage slots and provision for a further two SATA drives, the X600 would make a neat media server, though we'd opt for a lower-TDP CPU, and you'd still want a superior cooler to eliminate the audio horror-show.

In short, if you're looking for a side-rig for 1080p gaming, this is not the droid you're looking for.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/asrock-deskmini-x600-review/ MQ9GuUvtuhsEfPadW96iJT Fri, 20 Dec 2024 15:49:05 +0000
<![CDATA[ Zotac Zbox Magnus EN374070C review ]]> Recent years have seen a powerful new breed of integrated GPUs populating mini-PCs. Thus armed, such machines offer an economical and delightfully dinky way to game at 1080p. If you're looking for a compact PC with something closer to desktop-level games performance however, the options are rather thinner on the ground.

Machines which fold discrete GPUs into their design are both rarer and pricier, but they enable you to crank things up in terms of the resolutions, frame rates and graphics settings they can achieve. Pairing a 13th-gen Intel i7 mobile CPU with a laptop-grade RTX 4070, the Zotac Zbox Magnus EN does just this, and brings the fight to Asus' ROG NUC 970 and the Minisforum AtomMan G7 Ti in terms of 1440p games performance.

The Magnus is barebones from the get-go, though some suppliers offer RAM and storage pre-installed for a fee. There are several CPU and GPU variant configurations to choose from, but Intel's penultimate mobile i7 chip, the Core i7 13700HX, features in our test unit. With eight efficiency cores and eight performance cores running at 3.7 GHz to 5 GHz on the turbo, it's a powerful chip capable of desktop levels of performance, and is also a product of Intel's last-gen philosophy: power at all costs, basically.

That's reflected in Intel's stated maximum draw of 157 W, though we suspect that Zotac have chosen to tune this at the hardware level, and wisely so given the limited volume for cooling in this box. With Cinebench 2024's render-test pushing all cores to the max, the chip settles in for the long haul at around the rather more sensible 60 W mark.

Zbox Magnus EN374070C specs

Zotac Zbox Magnus EN374070C mini PC on a wooden background

(Image credit: Future)

CPU: Intel Core i7 13700HX
GPU: RTX 4070 mobile 8 GB GDDR6
RAM: up to 64 GB DDR5-4800 SODIMM
Storage: 2x M.2 PCIe Gen4
Networking: WiFi 6, BT 5.2, 2x 2.5G LAN
Front panel: Headphone, Mic, SDXC Card Reader, 1x Thunderbolt 4 Type C, 1x USB 3.1
Rear I/O: 2x DisplayPort 1.4, 1x HDMI 2.1, 4x USB 3.1, 2x WiFi Antennas
Price: $1,960 | £1,700

Where the chip shows its age is in RAM compatibility, tapping out at DDR5-4800. Low-latency RAM at this speed can still deliver greater throughput than higher-latency sticks at faster-rated speeds, but you'll absolutely pay for that privilege.

Like the ROG NUC 970, the Zbox Magnus sports the mobile RTX 4070 GPU, which differs from the desktop variant. It has 8 GB of dedicated VRAM rather than 12 GB, slightly lower clockspeeds, and a narrower, 128-bit memory bus. However, it can still pull its weight admirably at 1440p, as we'll see.

Design-wise, the case is all grills, no frills: a simple black box where every surface, save for the front fascia, features some form of intake or exhaust. The top is basically one big dust-meshed intake for the CPU and GPU cooling system. And while this means plenty of fresh air for panting processors, it also means you're intimately exposed to the full range of sounds that the Zbox's cooling emits.

From idle to full load, you're treated to a constantly modulating wind-scape as fans step up and down to counter heat, and air is sucked or pushed at varying speeds through various grills and mesh panels. Tucked under your desk or behind a TV, that's kind of fine, but we wouldn't recommend having the Magnus right near you on your desk, if you're not using a headset—it's not super-loud, but the constant variable frequency-changes start to wear thin pretty quick.

While the CPU and GPU are soldered in place, the SODIMM RAM and M.2 storage (there's space enough for two drives) are interchangeable. As is the wireless card, but as the Magnus ships with a Killer WiFi 6/Bluetooth 5.3 card, there's really no need.

If you go thrifty with the barebones option and choose to install your own M.2 drive and RAM, Zotac makes it super-easy to do so. Flip the unit on its back, undo two thumbscrews, and the bottom of the case slides off, laying bare all the slots you need to work with. No fussy mechanisms, no fiddly component-stacking, it's just the most straightforward and user-friendly design possible, and that's great.

Zotac Zbox Magnus EN374070C mini PC on a wooden background

(Image credit: Future)

I/O-wise, the front panel bears a type-C Thunderbolt port, a Type-A USB 3.1, SD card reader, and separate 3.5 mm headphone and mic jacks. At the rear you get dual 2.5G Killer Lan ports, an HDMI 2.1, twin DisplayPort 1.4 ports, and a further four Type-A USB 3.1s. Comprehensive to be sure, though I'd happily trade one of those five USB 3.1 slots for another Thunderbolt, or even a bog-standard Type-C USB.

Down to business then, and as expected from a machine with zesty silicon, the Magnus runs games very nicely. At 1080p, it blazes through everything you throw at it—173 fps in Total War: Warhammer 3's battle engine is crazy-smooth, and 154 fps in Cyberpunk using the Ray Tracing Ultra preset looks and feels just fab.

Moving up to 1440p and ultra settings, things are still very rosy, with everything moving at 60 fps or better. Cyberpunk with RT ultra on still nets a healthy 79 fps, Forza with full RT glides along smooth and stutter-free at 60 fps, and Warhammer 3's more demanding campaign-engine hits the same frame rate.

We also ran a bunch of other games without built-in benchmarks to get a general feel for ultra-settings 1440p performance across the board, and found similarly cheering results. Helldivers runs at a barely-variable 68-72 fps in every environment and combat situation we tried. Motive Studios' awesome Dead Space remake is a revenant meat-treat, enjoying a range of 94-125 fps in the final boss battle at 1440p/Ultra/DLSS Balanced. And A Plague Tale: Requiem's stunning Hives level looks and feels fabulous at 90-120 fps. In short, I have no doubt that the Magnus would see you right with any title at 1440p.

The machine absolutely trades blows with Asus' ROG NUC 970, and does so at a lower price (a quick squirrel around the web reveals a number of sites supplying the Magnus at appreciably under the RRP, so shop around). One of the key differences between the two machines is the choice of processor; the NUC 970's Intel Ultra 9 185H CPU is newer and more efficient than the Magnus' 13th Gen Core i7; it can employ faster DDR5 speeds, and likely adds a premium to the build-price of the machine. But the truth is, they perform at a very similar level when paired with the RTX 4070 mobile; you just don't really feel the difference when the game's afoot.

In price/performance terms, if it came down to a choice between the two machines, the Magnus would be our recommendation. But there's another factor to consider here which we've touched on already, and that's noise.

Buy if...

✅ You want a plug-and-play PC: Whack in a hard drive and RAM and off you trot.

✅ You're a living-room gamer:
The Magnus would fit nicely and unobtrusively under a TV as a console replacement.

Don't buy if...

❌ You're a future-proofer: The CPU and GPU are soldered in and can't be upgraded.

❌ You're seeking silence: Good airflow and cooling, but you can hear every move it makes, every breath it takes.

Setting the ROG NUC 970 to Silent Mode almost eliminates fan noise, for no appreciable loss of performance. With the Magnus, you're stuck with what you get—an admittedly capable cooling solution, but one that cannot be adjusted, with a noise output that's very hard to ignore. Even light tasks can set the fans off, and you can't help but notice every change in pitch and frequency. It's possible to zone out constant frequencies, but variable ones—not so much.

I enjoy much of my gaming-audio using hi-fi speakers, reserving the headset for multiplayer and nighthawk sessions. For my use-case, the Magnus is just too chatty, but your mileage may vary; if you only ever take your headset off to sleep, eat or interface with fellow earthlings, I doubt you'll be so bothered.

Like the ROG NUC 970, this machine isn't for those who envisage future CPU or GPU upgrades. It's for someone who wants solid off-the-peg gaming performance at 1440p, and for whom footprint is a key factor. If you can live with its 'lively' sound-profile, the Zotac Zbox Magnus EN is a solid performer at 1440p.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/zotac-zbox-magnus-en374070c-review/ 3d4FbKo3LY2kean3iWmUaE Thu, 19 Dec 2024 11:23:06 +0000
<![CDATA[ Minisforum AtomMan G7 Ti review ]]> Spend any time with Minisforum's AtomMan G7 PT and it's clear you're dealing with a tidy piece of engineering. The looks I can leave behind, but the choice of performant components, well-managed by a quiet and competent cooling solution, makes for a neat package. It's a beast at 1080p, competent at 1440p with reduced settings and, in a world of off-the-peg machines, represents the kind of technical innovation we love to see.

This next offering in Minisforum's AtomMan line may be viewed as the G7 PT's bigger brother. Marrying Nvidia's laptop RTX 4070 with Intel's Core i9 14900HX, the G7 Ti is a blade-thin desktop machine which is more squarely aimed at 1440p gaming. With its clean-and-sharp aluminium panelling and understated RGB lighting, it cuts a rather more serious and mature figure than the G7 PT.

Pop the side-panel off and the reason for its slim-and-tall form factor is plain to see. It's literally built around a laptop motherboard, with four wee daughter-boards cabling off to perform external IO duties. This isn't the standard modus operandi for Minisforum, which usually opts for custom APU boards and cooling in its mini-PCs.

A copper Yakisoba of heat-pipes shrouds the lower half of the mobo, which is heartening given the choice of CPU. In mobile terms, the RTX 4070 is no slouch, but the Core i9 14900HX is downright monstrous; a 24-core, 5.8 GHz, desktop-level bulldozer. Together, they make for the highest-performing mini-PC we've tested, outpacing both the ASUS ROG NUC and the Zotac Zbox EN—both of which pack the same GPU—across a range of synthetic and gaming benchmarks.

AtomMan G7 Ti specs

Minisforum AtomMan G7 Ti mini PC on wooden background

(Image credit: Future)

CPU: Intel Core i9 14900HX
GPU:
Nvidia RTX 4070 mobile
Memory: 32 GB DDR5 5600 MHz SODIMM
Storage: 1 TB M.2 NVME SSD
Wireless: WiFi 7, Bluetooth 5.4
I/O front: 2x USB 3.2 Type-A, SD reader, Audio jack
I/O rear: 1x HDMI 2.1, 1x USB-C (Data/DP/PD), 1x USB 3.2 Type-A, 1x 2.5G LAN
Price: $1,439 | £1,349 (1 TB storage, 32 GB RAM) or $1,279 | £1,249 (barebones)

Like the G7 PT before it, the G7 Ti has two performance modes which alter the CPU's TDP from max 85W to max 115W, with different fan-RPM profiles configured for each. There's a button on the front of the chassis to toggle between the modes, or you can use the preinstalled hardware-management app, which also lets you customise the RGB lighting. Cranking the TDP up helps the machine blast through rendering workloads at a tangibly greater pace, though the performance gains you'll see in gaming are entirely title-dependent.

There's no doubt the G7 Ti makes for both a capable workstation and a solid 1440p gaming machine. Running Cyberpunk 2077 at the Ray Tracing Ultra preset, with DLSS and Frame Gen, sees a tidy 84 fps at 2560x1440 in 85 W mode, and 86 fps at 115 W. Homeworld 3, a more CPU-intensive title, sees bigger gains with a jump from 62 to 72 fps, but neither Forza (69 fps) nor Total War: Warhammer 3 (96 fps) saw any notable uplift for running the TDP up to 115 W.

And in all honesty, you won't want to. The G7 Ti's little turbo-prop blowers are silent at idle and maintain a bearable volume at 85 W under gaming loads, generating a consistent but largely unintrusive blow. Jump to 115 W though, and the noise becomes obnoxious. Neither mode produces any nasty variable coil-whine frequencies; it's very much the passage of air you're hearing, which is preferable. But with the CPU scarfing down 115 W in performance mode, the sheer volume emitted by the blowers at their max RPM is impossible to ignore, even with headphones on.

While I was benchmarking the machine, my wife (who works a couple of rooms away in our abode) poked her head round the door to say "It's so weird, I swear I can hear heavy rain but it's blue skies outside." Then her eyes alighted on the G7 Ti, puffing its cheeks and emitting a high-pressure TCHHHHHHH from its desktop perch, and the penny dropped.

CPU temps hover around the 90 °C mark, so my first assumption was "well, at least the cooling is keeping things in check," but HWinfo tells a different story. Whether at 85 W or 115 W, the CPU begins to throttle under load. Which means you're bouncing off the redline and not getting the full potential of the chip.

All of which leads one to conclude that this particular CPU in this particular form-factor just isn't the most sensible combination. The ultra-slim design doesn't allow for the volume of copper, or the larger and slower-spinning fans that the i9 14900HX so desperately needs to keep heat in check without throttling or setting off car alarms.

It's also not a sensible pairing with what is effectively a midrange laptop GPU. I suspect the G7 Ti would fare just as well at 1440p with a less power-hungry chip; an i5, Core Ultra, or modern Ryzen APU perhaps. And in fact, a variant of this machine—the AtomMan G7 Ti SE—is available at a slightly cheaper price point with the i7 14650HX. It's still a beast, but with lower turbo-boost clocks (5.2 GHz compared to the 14900HX's 5.8 GHz) and 8 fewer E-cores, it strikes me as something approaching a saner match for the RTX 4070 mobile in a slim form-factor.

Minisforum AtomMan G7 Ti mini PC on wooden background

(Image credit: Future)

While WiFi 7 and Bluetooth 5.4 are very welcome, the physical IO is not quite up to par. A single Type-C USB4 at the rear, and none up front? That's not great. Plus, the lack of a dedicated DisplayPort means that, if you want to run a DP cable to your monitor, you need to devote that single USB-C socket at the rear to doing so. Which of course means no USB-C peripherals for you, plucky reader, unless you add a hub to your shopping list. The lack of any audio jack at the rear is another annoying omission, as it means trailing a 3.5 mm jack round to the front if you want to plug in a set of desktop speakers.

There's also some awkwardness with the form-factor. The vertical alignment looks cool, but isn't comforting, as there's a lot of vertical weight balanced on that slim stand; it wouldn't take too much of a nudge to topple it. Plus, the cable from the external power-brick connects to the upper portion of the rear, which introduces an extra leverage point. This wouldn't be an issue if it connected nearer the base.

Laying the unit flat on your desk is out of the question, as the right-hand side panel (what would be the 'underside' if you were to lay it down) is all air intake. In any case, a horizontal aspect would take up similar desktop real-estate to a tower PC; you'd be losing the one benefit of its blade-like design, which is the agreeably tiny footprint.

Buy if...

✅ You're a power-monger: The Core i9 14900HX is a rendering beast.

✅ 1440p is your sweet spot: Best-in-class mini-PC performance at 2560x1440.

Don't buy if...

❌ You want to hear yourself think: In performance mode, the fans work overtime.

❌ You want a truly mini PC: The sheer vertical height pushes the definition somewhat.

It all adds up to a machine that is somewhat in limbo. It's a static PC with a small desktop footprint but a large visual one, which underperforms against a desktop machine at the same price-point. Equally, there are identically-specced laptops out there that cost around the same, but offer the added benefits of a screen, keyboard, trackpad, speakers and mobility.

There's no question that it's a great performer at 1440p, even with the CPU limited to 85 W, which keeps the fans at a sedate and palatable volume. But that touches on my biggest gripe with the machine; if you want to get the best out of its Core i9 14900HX and feed it 115 W in performance mode, you're kind of punished for doing so. It quickly starts hitting the throttle and the cooling system becomes unbearably noisy.

This particular CPU in this particular form factor is overkill. And while I think Minisforum—an outfit with good form in bespoke cooling—has done its level best to tame the beast, there are way more sensible processors out there to build a slim gaming machine around, and better tier-matches for the mobile RTX 4070 GPU.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/minisforum-atomman-g7-ti-review/ rRBi9bYD27R7MydDpUHwJX Wed, 18 Dec 2024 15:05:19 +0000
<![CDATA[ Samsung Galaxy Book4 Ultra review ]]> PC gaming is not what you might call a cheap hobby. A thousand of your local currency units doesn't go far once the old red mist descends and you start on the path toward obsessive acquisition of processor cores, TFLOPS, and always, always bigger numbers. And if you're looking at spending more than three thousand of those currency units, then you'll be expecting something extremely special.

The Samsung Galaxy Book4 Ultra costs more than three thousand units whether you're pricing it in dollars, pounds, or euros. Its pricing is up there with the MacBook Pro and Predator Helios 18, and as distasteful as it is to talk about money when I could be rhapsodising about frame rates, the simple fact is that if frames are all you care about, then skip over to the latest from MSI, Asus, or any of the other usual suspects, where you'll find things more to your taste.

The Galaxy Book4 Ultra isn't really a gaming laptop at all, it just kinda looks like one. Albeit a skinny one. The combination of the Core Ultra 9 (Meteor Lake, 16 total cores) and GeForce RTX 4070 certainly makes it useful in that area, and the 16-inch, 3K, 120 Hz, AMOLED touchscreen is a lovely thing to look at, but it comes with RTX Studio drivers installed (and there's a sticker to that effect just below the keyboard) pointing to an intended usage in the creative arts instead.

Swapping over to the Game Ready drivers is a matter of a few clicks in the GeForce Experience software, or the newer Nvidia App, however, after which it becomes broadly comparable to any other RTX 4070-toting laptop.

Galaxy Book4 Ultra specs

Samsung Galaxy Book4 Ultra laptop

(Image credit: Future)

CPU: Intel Core Ultra 9 185H
Graphics: Nvidia GeForce RTX 4070 (70W stated)
Memory: 32 GB LPDDR5X
Screen size: 16-inch AMOLED touchscreen
Resolution: 2880 x 1800
Refresh rate: 120 Hz
Storage: 1 TB SSD, MicroSD
Connectivity: Wi-Fi 6E, Bluetooth 5.3, 2x Thunderbolt 4, 1x USB 3.2, 1x HDMI 2.1
Dimensions: 355.4 x 250.4 x 16.5 mm
Weight: 1.86kg
Price: $3,000 | £2,849

There's a small amount of confusion over how much power the GPU pulls, however—the box states 70W, while a software check shows it going a little higher than that at 80W. Whichever is true, this is half (or thereabouts) what you're getting from a machine like the Razer Blade 14 or MSI Vector 17 HX, which run their RTX 4070 chips at 140W, and the test results back this up.
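
If you want to run a similar sanity check on your own machine, Nvidia's nvidia-smi utility can report the board's live power draw. The short Python sketch below simply shells out to it and prints the figure; it assumes an Nvidia GPU with the standard drivers installed and nvidia-smi on the PATH, and is an illustrative snippet rather than the method we used for our measurements.

# Quick sketch: read the GPU's current board power draw via nvidia-smi.
# Assumes an Nvidia GPU with drivers installed and nvidia-smi on the PATH.
import subprocess

def gpu_power_draw_watts():
    """Return the current power draw reported by nvidia-smi, in watts."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # nvidia-smi prints one line per GPU; take the first.
    return float(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    print(f"GPU power draw: {gpu_power_draw_watts():.1f} W")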

Take the Time Spy Extreme benchmark, in which the Galaxy Book's GPU scores 3,426 points. Sounds good, until you look at the Lenovo Legion LOQ 15APH8, which has an RTX 4050 running at 95W and scores 4,105. It also only costs $1,100.

The Razer Blade 14 manages 5,634; the MSI Vector 17 HX scores a shattering 6,102 and costs $2,299. What about a proper game? In Cyberpunk 2077 the Galaxy Book struggled to get a playable frame rate using Ultra Ray-Tracing settings (yes I know you can turn settings down but this is PC Gamer) producing 24 fps at 1080p, 10 fps slower than the RTX 4060 in the Legion Pro 5i 16 Gen 9. This happens time and again in the tests, with cheaper laptops leaving the pricey Samsung behind.

So what sets the Galaxy Book apart, and justifies the price tag? Well, you do get 32 GB of speedy LPDDR5X, and a 2 TB SSD that posts a higher average bandwidth than many other laptops. It's possible to down-spec the machine too, to save a little cash, but what you get for your money is an extremely nicely built laptop.

It's from the slim, sleek and executive machined-aluminium school of design, which means that the Samsung logo on the lid is rainbow-reflective, like someone built it out of an oily puddle on a sunny day, and the keyboard has a solid white backlight rather than unicorn vomit RGB. The pair of Thunderbolt 4 ports, alongside a full-size HDMI 2.1, USB Type-A, headset socket, and MicroSD card slot leave it as well equipped for the world outside gaming as they do for hooking up an external 4K screen and screaming into a microphone.

Samsung Galaxy Book4 Ultra laptop

(Image credit: Future)

It's sturdy, the hinge is smooth, and there's no hint of flex even if you hold it by the corner and try to use the trackpad, something which can fox some less well-made laptops. It switches on automatically when you open the lid, which is faintly annoying when you just want to check whether there's a sliding cover over the webcam (there isn't, but it does at least have a 1080p sensor) yet very handy and efficient-feeling when you need to actually use the Book.

You get a fingerprint reader to login with too, which always feels like a more sci-fi method of authentication than face recognition. One day we'll retina-scan ourselves (or the excavated eyeball of the scientist we just chloroformed) using the webcam, but until then this will suffice.

The combination of the low-power cores in the Core Ultra and the lower wattage rating of the GPU allows the Book 4 Ultra to eke out a remarkable battery life. Under test conditions (shut in a cupboard where I can't see it flickering) the laptop managed almost 13 hours of life in a test that keeps its screen on (at 50% brightness) while running a loop of video and office applications. This test isn't troubling that GeForce chip too much, and in a simulated gaming benchmark that allows it to stretch its legs a bit more it kept going for 2 hours 13 minutes.

Buy if...

You crave the brand recognition and Samsung linkup features, and want a laptop that looks expensive: Because the Book4 Ultra has all of those, and is definitely expensive.

Don't buy if...

You have no idea why Nvidia releases Studio drivers: If you are more interested in Indiana Jones than InDesign, then maybe swerve this pricey Samsung.

This is a less excellent result, even in an age of gaming laptops that can barely hold enough charge to keep them alive between power sockets. The Book 4 Ultra isn't a particularly thick machine, with less space to hide battery cells than chunkier models, so perhaps relies on component efficiency rather than a large battery for its extended lifespan. The moral of this story is that, if you're gaming, you need to plug it in.

There are also a few Samsung-specific features, such as the ability to hook up with a Galaxy smartphone in a more intimate way than the standard Windows Phone Link, like using the phone as a wireless video camera for meetings. These have absolutely no bearing on its gaming abilities, but are nice quality-of-life enhancements nonetheless, even if the constant nagging to create and sign into a Samsung account makes you fire up the Settings app to uninstall it all.

And that just about wraps up the Samsung Galaxy Book 4 Ultra. It's expensive, it's not amazing for games despite what the specs might tell you (though it's no slouch) and did I mention it's expensive? But it's an extremely easy machine to live with, and if you're envious of MacBooks' sleek design, sharp screens and long battery life, this is about as close as you're going to get without giving up Windows.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/samsung-galaxy-book4-ultra-review/ c4p7sy6YhbRbFrBnopyzRV Tue, 17 Dec 2024 21:01:28 +0000
<![CDATA[ Noctua NH-D15 G2 review ]]> Noctua needs no introduction when it comes to air coolers, with several of the best-reviewed and hardest-hitting models on the market to its name. Among them, the NH-D15 has long been regarded as one of the best premium large coolers you can buy. The NH-D15 G2 is its long-awaited successor, and while it might look similar, this cooler sports a variety of significant tweaks that Noctua claims make it perform better while producing less noise and taking up less space.

With a massive 168 mm-tall dual heatsink stack and dual 140 mm fans that top out at just 1,500 rpm, the NH-D15 G2 is designed for one thing: to bring the best that air cooling has to offer with as little noise as possible. That makes sense seeing as, at $150, it’s competing directly with large 360 mm AIO liquid coolers.

There’s also a strong case for opting for cheaper air coolers, many of which have similarly sized heatsinks and dual fans while costing less than half the price. If you want to dial down the noise even further, Noctua includes speed-reduction cables that trade a small hit in cooling for quieter running, with zero need to fiddle with your motherboard’s own fan control.

So what makes Noctua think the NH-D15 G2 is worth its asking price?

Noctua NH-D15 G2 specs

The Noctua NH-D15 G2 in situ.

(Image credit: Future)

Compatibility: LGA 1851, 1700, 1200, 1150, 1151, 1155, 1156 | AMD Socket AM5, AM4
Dimensions (with fan):
150 x 152 x 168 mm
Cold Plate: Nickel-plated copper
Fans:
2x NF-A14x25r G2 PWM 140 mm, SSO2 bearing, up to 1,500 RPM
Lighting: None
Price: $150 | £128

Well, there’s a lot of R&D that’s gone into making this the king of coolers for modern CPUs. For starters Noctua has worked out that having two fans mounted in close proximity on a cooler and spinning at the same speed can actually cause unwanted noise and vibration as the two interact. Ever wondered what that random, intermittent vibration from your PC is that comes and goes? If you were using a dual fan cooler in a push-pull configuration, this could be the reason.

To solve this problem, the NH-D15 G2’s fans spin at slightly different speeds.

The heatsink stacks have been optimised according to the performance of the fans too, which have an improved pressure-to-airflow curve. This has allowed Noctua to reduce the fin clearance from 1.9 mm to 1.6 mm and include 23 extra fins compared to the original model, with the added benefit of making the heatsink nearly a centimetre less deep while offering even better performance. The most important change, though, is that the NH-D15 G2 comes in three variants, all with slightly different shaped cold plates. This is to cater for the wide variation in convexity between modern CPUs.

For example, the high socket and cooler pressure applied to Intel LGA1700 CPUs can cause them to bend in the socket and, over time, become permanently misshapen. With a combination of washers to reduce heatsink pressure on the CPU heatspreader and machining the cold plate of each variant to a different amount of convexity, you can buy the cooler that’s best suited to your particular CPU.

The high base convexity or HBC model, for instance, caters for used Intel LGA1700 CPUs, where a shaped base allows for better contact and, in turn, better cooling. By contrast, AMD Ryzen 7000 and 9000 CPUs remain relatively flat, so the low base convexity (LBC) model will result in better performance. There’s also a standard model that Noctua claims works well with AMD CPUs, Intel LGA1851 Arrow Lake CPUs, and LGA1700 CPUs that use a contact frame.
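
To make the variant choice a little more concrete, here’s a tiny Python sketch that encodes the pairing logic described above as a simple lookup. The mapping mirrors the guidance summarised in this review rather than Noctua’s official table, and the function and key names are purely illustrative.

# Illustrative lookup of which NH-D15 G2 variant suits which platform,
# based on the pairing logic described above. Always check Noctua's own
# compatibility table before buying.
VARIANT_GUIDE = {
    "LGA1700 (used, no contact frame)": "HBC (high base convexity)",
    "LGA1700 (with contact frame)": "Standard",
    "LGA1851 (Arrow Lake)": "Standard",
    "AM5 (Ryzen 7000/9000)": "LBC (low base convexity), or Standard",
}

def recommend_variant(platform):
    """Return the suggested variant string, or a prompt to check the table."""
    return VARIANT_GUIDE.get(platform, "Check Noctua's compatibility table")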

The Noctua NH-D15 G2 radiator top.

(Image credit: Future)

The Noctua NH-D15 G2 radiator bottom.

(Image credit: Future)

The Noctua NH-D15 G2 radiator side view.

(Image credit: Future)

We’ll be looking at the results later, but Noctua has an excellent table on its website that we highly recommend checking out to see which model is best for your CPU. The downside comes if you swap CPUs in future, as it’s not recommended to use the HBC model with AMD CPUs, and performance might similarly suffer with an LGA1700 setup that has no modifications such as a contact frame. In addition to these features, the cooler has an offset mounting position for AMD Ryzen 7000 and 9000 CPUs, allowing the cold plate to sit closer to the hotspot, which is slightly off-centre on those chips due to the location of the CCDs.

Installation is fairly straightforward, but in addition to requiring a specific screwdriver, the heatsink is heavy and quite ungainly to deal with, which can make installing it a challenge. It’s something you’ll only have to do once or twice, so we can’t be too harsh here. You’ll need to pay careful attention to the components too, as some look similar apart from their colour, but the key is to make sure you get the right model for your particular CPU.

PC Gamer test rig
CPU: Intel Core i7 14700K | Motherboard: Gigabyte B660 Gaming X DDR4 | Memory: Corsair Dominator Platinum DDR4 3466 | SSD: 512GB Samsung 980 Pro | GPU: Nvidia RTX 4090 Founders Edition | PSU: MSI MEG Ai1300P PCIe5 | Case: BarrowCH Rhopilema Test Bench


In our thermal testing it was very clear that using either the NM-ISW1 shim washers to reduce socket pressure or the HBC version of the cooler definitely improved performance on our Core i7 14700K, with the HBC version usually edging ahead by a few degrees. The HBC version’s best Cinebench result of 86°C was 3°C better than the standard cooler, but still 11°C warmer than the Be Quiet! Light Loop 360mm, with both coolers hitting a peak of 52 dBA.

There was another noticeable benefit to using the HBC cooler in 3DMark’s Steel Nomad test, where the NH-D15 G2’s lowest temperature of 84°C was only 6°C higher than the Be Quiet! Light Loop 360mm. It was 10°C adrift in the X264 test, with not much difference here between configurations, but in the Metro Exodus test the HBC version again edged out a lead, coming within 4°C of matching the 360mm liquid cooler and all but equalling it in returning to idle temperatures too, massively outstripping the smaller single-fan Be Quiet! Dark Rock 5.

A Noctua fan.

(Image credit: Future)

The Noctua NH-D15 G2 in situ.

(Image credit: Future)
Buy if...

✅ You want great cooling without using liquid cooling
If you prefer to steer clear of AIO liquid coolers and custom watercooling but still want great cooling and low noise levels, this is where the Noctua NH-D15 G2 really shines.

✅ You want an air cooler that can handle high-end desktop CPUs
Intel’s LGA1700 CPUs are still very popular, but they generate a lot of heat. The Noctua NH-D15 G2 easily tamed our Core i7 14700K, offering performance that only large AIO liquid coolers can beat.

✅ You want an air cooler that offers low noise levels
Thanks to its large fans and massive cooling headroom, the Noctua NH-D15 G2 will maintain low noise levels more of the time compared to smaller heatsinks that quickly ramp up fan speed under load.

Don't buy if...

❌ You have a limited budget for a cooler
While it performs excellently, it costs three times as much as other well-regarded dual-fan air coolers, and better-performing 360 mm AIO liquid coolers can be had for a lot less too.

❌ Your case has a low CPU cooler height limit
At 168 mm tall, even some ATX cases might struggle to house this massive cooler so it’s worth checking your case’s CPU cooler height limit before you buy.

❌ You plan on switching CPU sockets in future
There are three versions of the Noctua NH-D15 G2 catering for CPUs with various heatspreader convexities. While they’re physically compatible between sockets, the differing cold plate convexity may mean worse performance if you switch CPU sockets.

In games and lightly threaded tasks the Noctua NH-D15 G2 offers real benefits over smaller air coolers and even keeps pace with large 360 mm liquid coolers. Even in its warmest results it was never more than 10°C warmer than the Be Quiet! Light Loop 360mm, and it was only a few degrees away from matching the Arctic Liquid Freezer III 360 A-RGB while running rings around smaller air coolers, completely taming the Core i7 14700K.

Its price is eyewatering for sure, but the fact it keeps up with enormous and powerful liquid coolers is impressive, as are its maintenance-free credentials and potentially longer lifespan. The only issue is that certain variants are CPU-specific and picking the right one can be tricky, especially for Intel owners. However, we can’t argue with the build quality and cooling performance, even if there are far cheaper dual-fan coolers out there.

While it’s hideously expensive, the fact that air coolers such as this one last for decades means it will likely serve you well for many years, maintenance-free, where liquid coolers won’t. However, there are some question marks over future compatibility: the differing base convexity does reduce flexibility, especially if you swap from AMD to Intel or vice versa.

The price of $150 is still a lot to spend on an air cooler with limited features, so it’s always worth checking out cheaper alternatives rather than skimping on GPU, CPU or SSD upgrades when a cheaper cooler will be adequate. We can’t argue with the cooling, build quality or packaging, though, with performance on par with some 360 mm liquid coolers.

]]>
https://www.pcgamer.com/hardware/cooling/noctua-nh-d15-g2-review/ AAoCRcvacswdQoWTEYLSp3 Tue, 17 Dec 2024 14:42:02 +0000
<![CDATA[ Gulikit KK3 Max review ]]> Once upon a time, pro gaming controllers were for the elite of the elite, costing several hundred dollars apiece. Giants like the Xbox Elite Series 2 and Razer Wolverine used to dominate the scene with unparalleled build, performance and innovative features. These days, you can get some truly impressive controllers—like the excellent PowerA OPS V3 Pro—that offer similar features for under $100. And now joining the fray is the Gulikit KK3 Max.

Retailing for an approachable $79 (£76, AU$140), the KK3 Max brings impressive features such as Hall effect joysticks and triggers, swappable rear paddles, intriguing macro support, and versatile connectivity options.

Unlike many of its competitors, it works seamlessly with the Nintendo Switch but, curiously, is not compatible with Xbox or PlayStation consoles. I've been using it exclusively on my PC for the past couple of weeks, and while some features went unused beyond testing, I can confidently say it's an impressive device.

The Gulikit KK3 Max's design, however, is still reminiscent of an Xbox controller, enhanced with extra buttons, slots, and RGB lighting. Instead of an Xbox button, there's a smaller backlit Gulikit logo (non-functional as a button), and the standard menu and start buttons are replaced with a “-” and “+”. Beneath these are four additional Gulikit buttons for customization.

KK3 Max specs

Gulikit KK3 Max controller

(Image credit: Future)

Compatibility: Windows 11, Nintendo Switch, Android and iOS devices
Connectivity: 2.4 GHz, Bluetooth, USB wired
Ports: USB-C
Thumbsticks: Hall effect
Thumbstick layout: Asymmetric (Xbox-style)
Rear paddles: 6
Weight: 247 g
Price: $79 | £76 | AU$139

The Hall effect joysticks feature RGB rings but are non-swappable and lack height adjustment. The D-pad and face buttons offer decent tactility without feeling mushy, though they don't match the snappy feedback of Razer's mecha-tactile switches. The ABXY buttons are replaceable, and the included puller makes swapping them easy. Their slightly larger size helps reduce missed inputs.

On the back, you'll find trigger locks for switching between analog and digital clicks, along with four attachment slots for the six paddles included in the box. The paddle spacing is more user-friendly, and the ability to choose specific shapes and placements makes this controller highly customizable compared to others with fixed rear buttons. Constructed from robust plastic, the KK3 Max doesn't quite exude a premium feel, but it doesn't feel cheap either. The subtle textured grips could be grippier, more like those on the PowerA and the excellent Razer Wolverine V3 Pro controllers.

The Gulikit controller comes in black or white with matching paddles and ABXY buttons. However, the black seems to pick up odd stains that I couldn't wipe off while trying to snap pics for this review. I'm not sure what's going on, but I never encountered this with other controllers. Furthermore, the plastic protective case that comes with the controller is really flimsy, and I wouldn't want to be seen carrying it anywhere. I really think Gulikit could have sprung for a hard case rather than this thing.

In terms of connectivity, the Gulikit KK3 Max supports 2.4 GHz wireless with up to a 1000 Hz polling rate on PC, Bluetooth for smart devices, and USB-C wired connections. It works with Windows PCs, laptops, Nintendo Switch, and Android devices but, as mentioned earlier, not with Xbox or PlayStation. Pairing with the 2.4 GHz dongle took some effort, as the documentation wasn't particularly intuitive—a recurring issue when learning to use a lot of its features. I suspect many a user won't fully grasp the power in their hands due to the vague and complex steps required for most of them. Even switching between connections is slightly convoluted.

Gulikit KK3 Max controller

(Image credit: Future)

That said, at least the battery life is solid, providing around 15 hours of play with RGB lighting active and nearly 28 hours with it off. I'm happy to sacrifice the lackluster RGB in favor of longevity and I suspect you will too. However, the KK3 Max doesn't do a great job of indicating low battery. Rather than a clear warning, it just starts to disconnect intermittently until it doesn't connect at all anymore.

Performance is where the KK3 Max truly excels, however. Its Hall effect joysticks deliver precision and longevity, outclassing traditional potentiometer designs. Whether lining up a sniper shot or executing tight manoeuvres, the accuracy is tangible. Sensitivity can be adjusted directly on the controller, as there's no accompanying software. Triggers and paddles are equally responsive, with the paddles proving easier to use than those on other pro controllers I've used. The ability to position them where you need and the unique squat paddle design make all four back paddles accessible—a rarity among controllers.

A Turbo Mode allows rapid button presses with a single hold, ideal for games requiring quick, repeated inputs. Auto Fire Mode enables continuous action without holding the button, perfect for sustained firing in bullet shooters. Auto Pilot Gaming lets you program action sequences for automatic execution, making repetitive tasks or complex combos a breeze.

Buy if...

✅ You want a seriously customizable pro controller on a budget: The Gulikit KK3 Max will match the Elite Series 2 but at a mere fraction of the cost.

✅ You can make use of auto features: The KK3 Max has some advanced auto-play features, and if you play the sort of bullet-hell games that might make use of them, it will find itself at the top of your controller list.

Don't buy if...

❌ You want a simple life: If easier controls, a fine software experience, and the ability to store separate profiles for each of your main games are your bag, the KK3 Max might not be.

❌ You want a controller that can work on Xbox or PlayStation as well as your PC: With compatibility strangely missing the modern Microsoft and Sony consoles, that is something you'll need to be comfortable with before you buy.

Gulikit claims the KK3 Max's improved chipset and software enhance in-game self-correction, though its full potential is hard to gauge. Competitive gamers will appreciate the near-zero lag provided by the 2.4 GHz wireless and USB wired connections, as well as the 1000 Hz polling rate.

Remapping the paddles is straightforward, allowing on-the-fly adjustments without software. This is crucial during gameplay when quick changes are needed. However, the inability to save profiles will be a major drawback for players who frequently switch between games. The lack of dedicated software also means that deeper customization options are limited compared to premium alternatives like the Xbox Elite Series 2.

The Gulikit KK3 Max stands out with its ergonomic design, durable build, and some rather unique advanced features. Hall effect joysticks, versatile connectivity, and performance-enhancing options make it a strong contender in the pro controller market. While it has some shortcomings—such as no software support, complex on-board customization shortcuts, and the absence of Xbox or PlayStation compatibility—these don't overshadow its strengths, especially since the KK3 Max costs just $80, which makes it one of the best PC gaming controllers and a fantastic buy for the budget-conscious buyer.

]]>
https://www.pcgamer.com/hardware/controllers/gulikit-kk3-max-review/ mHjdHdweVP2xatm2wfQ7LG Mon, 16 Dec 2024 15:49:05 +0000
<![CDATA[ Ballionaire review ]]>
Need to Know

What is it? A pachinko machine that's also a strategy roguelike.
Release date December 10, 2024
Expect to pay $11.10/£12.34
Developer newobject
Publisher Raw Fury
Reviewed on Gigabyte G5 (Nvidia RTX 4060, Intel Core i5 12500H, 16GB DDR4-3200)
Steam Deck Verified
Link Official site

In 2007, a game called Peggle was released. In 2008, the world suffered a global economic crisis. To this day I refuse to believe these two facts were unrelated. Peggle was as all-consumingly moreish as it was simple—it was essentially just a pachinko machine, where you fired a ball into a screen full of ‘pegs’ and tried to hit ten orange ones on the way down. The thing is, the only agency you had was the direction you fired the ball in. After that, all you could do was sit back and watch, making it almost entirely luck-based.

Once you finally accepted that, the spell would hopefully break, and you could finally uninstall the bloody thing. But now developer newobject has come up with Ballionaire, a twist on pachinko that adds roguelike deckbuilding elements. Essentially Peggle has gone to college, smartened up, and gotten itself a masters degree in compelling strategy gameplay. Oh no. Oh God no.

The goal is to make enough money to pay a tribute that has to be cleared every seven balls or it’s game over. The starter tribute is 500 dollars, which is worrying, because the board is initially a complete tightwad. Hitting these pegs pays out a pathetic 1 dollar apiece, the kind of financial reward that would embarrass even a games journalist. Luckily, after every ball you get a choice of three obstacles (also, a little confusingly, called balls) that you can add to the table. A trampoline will pay out $200 if you hit it, and bounce the ball upwards. A smiling tree will also pay out $200, but only if you hit it from the bottom up. Ah, but as an apology for that irritating caveat, the tree also gains a small multiplier bonus after every single ball played, meaning it can become a substantial payout in the late game.

Oooh, hang on! If I place the trampoline ball under the tree, I’m far more likely to get that lovely multiplier money. And naturally this kind of synchronicity is what you’re aiming for with every ball you place on the table. Some balls are ‘droppers’ which means they’ll drop another ball when you hit them. These pair well with ‘holders’ that will hold onto any balls that hit them and give some lovely bonus in return (but naturally now the ball is ‘held’, it’s out of play and can’t hit anything else). Crucially, some balls automatically activate at the start of each go, like a fantastic butterfly ball that flies up the screen - only obtainable if you keep a caterpillar ball from being hit for five rounds, naturally.

It means a game that at first feels as random as flipping a coin can be manipulated into one that actually rewards strategy. Sure, sometimes God decides they hate you and nothing tumbles down the screen in the direction you want. But the trick here is to put together a Rube Goldberg machine so sophisticated that such randomness doesn’t stand a chance. There’s a Piggy Bonk ball that pays out $800 for every coin ball it’s holding, and I’ve become obsessed with molding my entire strategy around stuffing coin balls into it, regardless of what tools the game is offering me (this, in roguelike deckbuilding parlance, is known as ‘being an idiot’). Whatever, the runs where my beloved pig stratagem has worked have given me gloriously satisfying payouts.

(Image credit: newobject, Raw Fury)

Once you’ve won a run on the initial pyramid stage, you unlock four more, each nicely varied. One sees the ball being lowered in on a fishing line that’s then reeled back to the top when it hits the bottom. Great for making you try out stuff that relies more on being hit from below. The pinball table, with its two limited-use flippers, is my personal highlight, and I assure you that the Deathwheel stage has been named with no hyperbole whatsoever.

But let's get back to the balls for a moment, in what I’m fast worrying is becoming my most innuendo-laden review ever. Balls are divided into subcategories, such as Agers, Movers, and the aforementioned Droppers and Holders. I had fun slowly working out what all these different terms meant. And that’s good news, because the game did a pretty lousy job of explaining them to me. There’s a "Ballipedia" tucked away in the pause screen that gives you details on every ball, but it’s one of those irritating tutorial screens that often only gives you half the information you want. "Adjacent triggers to this ball have a +0.1 multi for each coin ball held". OK, cool, and a "trigger" is…? It’s incredibly irritating to mess up a run because you had to just make a guess at how something worked.

It’s not the only place where Ballionaire feels a little incomplete. Win a run and you’ll be rewarded with a currency that you can use on a vending machine to get more ball options for future runs. It won’t take you long to unlock everything (in fact it’ll only take a few seconds if you choose ‘unlock all’ in the pause menu, which is a nice touch). That’s all well and good, but it’s the limited difficulty options that have left me struggling for a reason to come back.

(Image credit: newobject, Raw Fury)

There are five difficulties, which throw predictable challenge increases at you like demanding heftier tribute amounts (yawn). Far more entertaining are the malicious curveballs which force you to place horrible annoyances on your lovely table. One has to be hit 500 times before the end of the game, or you lose. Another will sap all the payout money from the surrounding balls, and can require at least $250,000 before finally pissing off. I love these. Worrying about them while also still trying to hit those required tributes is a great, tricky balancing act.

I just wish it went further. After you’ve cleared the five difficulties, that’s your lot, and it feels like the game could easily keep escalating—look at all the horrible debuffs on offer in the create your own table mode! The lack of an endless mode is disappointing too—it'd be a perfect addition.

But maybe it’s greedy to complain about longevity when I got 20 fun hours out of Ballionaire. I’d rather pick on its irritating repetitive music, or its obnoxious dancing mascot. These are the ridiculously minor moans of someone who had to constantly slap themselves to stop alt-tabbing into Steam and playing more instead of writing this review. It’s a marvelous bookend to a year that opened with the almighty Balatro, and with a few updates, Ballionaire could easily become my new podcast game of choice. For now, it’s the elevator pitch of a Peggle deckbuilder, ball-illiantly executed.

]]>
https://www.pcgamer.com/games/roguelike/ballionaire-review/ 4GbCsQfu9BdBs4fwJ2Sy8V Mon, 16 Dec 2024 14:51:54 +0000
<![CDATA[ Marvel Rivals review ]]> It seems like every moment of my spare time nowadays is spent in a hero shooter. When I'm not fighting for my life in Valorant, I'm doing the same thing in Overwatch 2. So the last thing I need is a hero shooter I actually enjoy and will therefore carve even more time out of my day to play: enter Marvel Rivals.

Need to know

What is it? A third-person hero shooter in which you play as various Marvel heroes in teams of six.

Release date December 5, 2024

Expect to pay Nothing

Developer NetEase Games

Publisher NetEase Games

Reviewed on RTX 3070, Core AMD Ryzen 5 5600G, 16GB RAM

Multiplayer Yes

Steam Deck Verified

Link Steam

I didn't expect Marvel Rivals to get its hooks into me so quickly, but I'm having a good time feeling out the kits of its 33 launch heroes, divided into three classes: vanguard, duelist, and strategist (tank, DPS, and support), every one of which also includes unique 'team-up' abilities activated by building the right team composition. I've been maining Hawkeye, which I'm ashamed to admit given how strong he is right now, and that's not even counting Black Widow's team-up ability, which gives him a 20% damage boost.

It's always fun to go through the wringer with a new hero shooter, especially in those chaotic beginnings. At first, I found it hard to recognise who was hitting me and from where. I spent a lot of my time getting shot in the back by Moon Knight's Ankh, which he can set down and then reflect projectiles off. It doesn't matter if you're behind a wall or can't even see Moon Knight—he'll find a way to get to you.

Hawkeye MVP

(Image credit: NetEase)

Eventually, I was able to pick up on the sound Moon Knight's Ankh made when it reflected darts, and it became second nature to destroy it before trying to run for cover. There are loads of these little learning curves to master, many of which will feel familiar to Overwatch vets.

There's nothing quite like reflexively sleeping an enemy Scarlet Witch with Mantis before she ults or pushing away a diving Wolverine without a second thought before he can sink his claws into you. There are a lot of intuitive abilities and counters in Marvel Rivals—all you have to do is take the time and have the patience to learn them.

Rivals' maps are one of its most distinctive features, as battlegrounds transform throughout a match.

Getting to know each of the eight launch maps is also a lot of fun. Rivals' maps are one of its most distinctive features, as battlegrounds transform throughout a match either through dynamic destruction or bespoke map events. It's a good way to shake up a match and can also be used to turn the tide in your favour. If there's a particular spot that a Black Widow player likes to perch on, for example, chances are you can blast it for a few seconds to destroy that vantage point. Unfortunately, the map will build itself back up every now and then, so it's best to be on the lookout for that pesky Black Widow.

Although I thought I was losing it when walls kept appearing in front of me on Klyntar, it turns out that the map's walls periodically shift around and literally change its blueprint, sorta like the Clockwork Mansion in Dishonored 2. This can be a blessing and a curse: it sometimes cuts off long lines of sight, which can protect attacking teams but also means you occasionally need to switch up your angles.

Too much of a good thing

Jeff throwing people off the map

(Image credit: NetEase)

Starting with 33 heroes is a bold move, considering most other hero shooters begin with a smaller roster that's easier for new players to digest before growing over time. That character select screen is a bit daunting, but the expansive roster is also great because it means there's something for everyone.

You have your straightforward heroes like Black Widow or Hawkeye, pure aim-skill picks who rely on players landing headshots; Punisher is your beginner-friendly Soldier 76 type with an assault rifle; and there are tanks like Groot and Magneto, who just place shields and gradually wear down the front line.

Then, on the slightly more complicated side, you have your rapid-dive heroes like Iron Fist, Psylocke, and Spider-Man. These heroes are a little trickier to get the hang of, but it's high-risk, high-reward, because I don't think there's anything scarier than a genuinely skilled Spider-Man player.

Jeff the Shark

(Image credit: NetEase)

Marvel Rivals manages to cater to most sections of the hero shooter playerbase with characters that have varying skill ceilings and purposes. You can hide in the back, sniping or healing, or be in the midst of battle, darting around or breaking down walls with brute strength. On the other hand, this massive roster of heroes is also Marvel Rivals' undoing: there are loads of completely busted heroes right now. Hawkeye is a one-shot headshot for duelists and strategists, while Jeff the Shark is able to gobble up full teams of players and launch himself and all of them off the map in one go. Dominant heroes and instakill gimmicks are considered funny and meme-worthy right now in Marvel Rivals' honeymoon stage, but they soon graduate to annoyances—talking points around which mobs of players will sharpen their pitchforks. Dr. Strange's teamkill teleport tactics are already beginning to wear thin—if I have to watch my team helplessly run off the map one more time, I'm going to seriously lose it.

Balancing a roster of heroes is arguably one of the hardest things a live service hero shooter has to do, and Marvel Rivals has made things unnecessarily difficult for itself by starting off with so many instead of drip-feeding players. But arguably, NetEase is more concerned with filling Marvel Rivals with loads of fun heroes rather than ones that are well balanced straight out of the gate. It's an unorthodox approach, but one that may work out in the long run if NetEase rolls up its sleeves quickly enough.

Duelists everywhere

Black Widow and Hawkeye team up

(Image credit: NetEase)

In a similar vein to the unbalanced heroes, there is also an uneven number of heroes in each class. The trouble is that in a game built around a balanced spread of roles (usually two vanguards, two duelists, and two strategists, or something close to it), if there are twice as many duelists as strategists and vanguards combined, chances are most teams will just be made up of duelists.

The balancing issues and iffy team composition will hopefully not weigh Marvel Rivals down.

The worst matches I've experienced in Marvel Rivals have been when no one else is willing to switch off duelist, so we just get completely rolled by the opposing team, which has a balanced composition of vanguard, duelist, and strategist. I can play the strategist Rocket Raccoon all I like, but if I'm stuck with a team full of duelists, we're not going to get very far.

This is something that Overwatch has struggled with ever since it merged the roles of offensive and defensive DPS. Since then, tank and support were an afterthought for years, until a recent push to even out the classes. I thought this was a headache Marvel Rivals would want to avoid from the start by giving each class a similar number of heroes, but it seems I was wrong. So unbalanced team compositions will probably stick around as a common issue until something like a role queue is added to the game, or the weak give in and assign themselves to vanguard or strategist.

The balancing issues and iffy team composition will hopefully not weigh Marvel Rivals down—my pessimistic outlook could just be a weary symptom of years playing Overwatch 2. I really hope the developers manage to even out the playing field a bit more so Marvel Rivals can carry on seeing the same kind of success that it has had in its launch week. But for now, at least, I'm going to enjoy the chaos of an unbalanced game while it's still fun and reap the benefits of running around as Hawkeye and getting more picks than anyone else, with the prize being my MVP award at the end of the match.

]]>
https://www.pcgamer.com/games/third-person-shooter/marvel-rivals-review/ fLaHsvBQZv3whtt9DRA7pE Mon, 16 Dec 2024 13:20:59 +0000
<![CDATA[ Be Quiet! Dark Rock 5 review ]]> An old hand in the CPU cooler market these days, Be Quiet! has a reputation for producing good-performing coolers with minimal fuss, moody aesthetics and great build quality, and the Dark Rock 5 continues that theme. There’s no RGB lighting here, but this single-fan, mid-size model does sport some refinements compared to previous generations and is one of the best-looking Be Quiet! heatsinks we’ve seen.

Starting with the design, Be Quiet! has introduced a magnetic cap for the Dark Rock 5 that covers the heatpipes and screwdriver hole for mounting the cooler. Its coolers, especially the more premium models, haven’t exactly been ugly in the past, often sporting a black anodized finish.

However, the cap, which can be rotated depending on the cooler’s orientation, looks even better and definitely makes the cooler stand out from the crowd. That’s just as well because you’re not looking at much change from $70 or £70 for this cooler, which is pretty steep for a single-fan heatsink arrangement.

There are benefits to its compact dimensions and single fan, though, such as clearing all of your memory slots and so allowing for memory modules of any height. At 161 mm tall, it’s also going to fit in practically any ATX case. The downside is that it’s a relatively deep heatsink, and its six heatpipes might benefit from a second fan.

Be Quiet! Dark Rock 5 specs

The Be Quiet! Dark Rock 5 in situ from the side

(Image credit: Future)

Compatibility: LGA 1851, 1700, 1200, 1150, 1151, 1155; AMD Socket AM5, AM4
Dimensions (with fan): 136 x 101 x 161 mm
Cold Plate: Nickel-plated aluminum
Fans:
1x Silent Wings 4 120 mm, Fluid Dynamic Bearing, up to 2,100 RPM
Lighting: None
Price: $65 | £65

Thankfully, Be Quiet! has included clips for a second fan in the box, but with a peak speed of 2,100 RPM, the included Silent Wings 4 120 mm PWM fan isn’t the largest or most powerful we’ve seen bundled with a heatsink, so we’ll have to see how it holds up against our toasty Core i7 14700K.

The machined base is nickel-plated rather than bare copper, which means if you want to use liquid metal paste, you won’t run the risk of the paste staining or being absorbed by the copper and drying out as quickly. You won’t need to do any liquid metal paste research, though, as standard thermal paste is included.

This is in the form of a tube, so you get a couple of applications as opposed to just one when the paste is pre-applied. The base also has two sprung mounting screws fixed in place, while owners of older Be Quiet! coolers will remember a separate fiddly plate had to be used to secure the cooler.

The mounting mechanism is identical to that on Be Quiet!’s liquid coolers with quite a few pieces to deal with. It’s straightforward, but probably not something you’d want to attempt with your motherboard still mounted in the case.

Image 1 of 5

The Be Quiet! Dark Rock 5 in situ

(Image credit: Future)
Image 2 of 5

The Be Quiet! Dark Rock 5 top

(Image credit: Future)
Image 3 of 5

The Be Quiet! Dark Rock 5 radiator

(Image credit: Future)
Image 4 of 5

The Be Quiet! Dark Rock 5 fan

(Image credit: Future)
Image 5 of 5

The Be Quiet! Dark Rock 5 bottom

(Image credit: Future)

The exception here might be on AMD motherboards, which don’t require the included backplate. Installing the Silent Wings 4 fan is far easier, though, as the clips slot into recesses in the heatsink with minimal fuss. The same definitely can’t be said of some cheaper heatsinks.

Unfortunately, the Dark Rock 5 wasn’t quite able to tame our Core i7 14700K. For the first few minutes of our various stress tests the temperatures were acceptable, but they would steadily climb, and in our torturous x264 and Cinebench stress tests the CPU ended up coming close to 100°C and throttling.

We should emphasise this only happened during extended multi-threaded stress tests and with Intel 14th Gen CPUs, but it seems that handling high loads for long periods with these CPUs is a bit beyond the Dark Rock 5.

It was much happier in our game tests, sitting at 83°C in Metro Exodus and 87°C in our 3DMark Steel Nomad test, but even these were noticeably higher than with Noctua’s larger dual-fan NH-D15 G2. Part of the reason for this is undoubtedly airflow.

The cooler was fairly quiet at full speed, hitting 50 dBA on our sound meter, which was 2 dBA quieter than the Noctua cooler, but also shoving far less air through its relatively large heatsink. It also took a long time to return our processor to idle temperatures. Adding a second fan would undoubtedly help it out in all areas.

Buy if...

✅ You want a great-looking and quiet heatsink: The black finish and magnetic cap really allow this cooler to stand out from the crowd, and what it lacks in airflow, it makes up for in low noise.

✅ You want unrestricted memory clearance: This cooler does not overhang the memory slots on most motherboards, allowing for memory modules of any height.

Don't buy if...

❌ You want to cool high-end Intel LGA1700 CPUs: While it coped in games, extended multi-threaded workloads proved a step too far as the CPU throttled after a few minutes.

❌ You have a limited budget: The build quality, magnetic cap and anodized finish contribute to the price, but not the cooling performance. There are plenty of dual-fan coolers available for similar prices.

The Be Quiet! Dark Rock 5 is an attractive cooler with excellent build quality that remains fairly quiet at full speed and doesn’t suffer from memory and motherboard compatibility issues nearly as much as larger coolers.

Even with toasty Intel 14th Gen CPUs such as our Core i7 14700K, it’s able to tame them in games, but the same can’t be said for extended multi-threaded workloads where eventually the temperature climbed high enough to see it throttle.

Less power-hungry AMD processors will be a different story, though, as will high-end Intel 13th or 14th Gen CPUs, so long as you avoid multi-threaded workloads that take more than a couple of minutes to complete.

We can’t be too critical here as a result, especially if gaming and low noise are your main focus as most scenarios are well within this cooler’s capabilities.

That said, it’s a shame a cooler costing this much can’t handle all situations with popular current CPUs at stock speeds, and while it includes clips for a second fan, actually bundling one would have made it a much sweeter deal.

Ultimately, the future looks like a chillier place as far as CPUs go, with Intel’s Core Ultra 200 processors running far cooler than their predecessors, so while it’s not ideal for handling everything a Core i7 or Core i9 Intel 14th Gen CPU can throw at it, the Dark Rock 5 will fare much better once Intel’s hot-running LGA1700 CPUs have been consigned to history.

]]>
https://www.pcgamer.com/hardware/be-quiet-dark-rock-5-review/ PKTwyFYSHrrhdmHHnnQ3JU Fri, 13 Dec 2024 17:14:01 +0000
<![CDATA[ Dough Spectrum Black 32 OLED review ]]> Original review December 13, 2024: Formerly known as Eve, Dough is the kind of company you want to like. It's an independent startup with a self-styled community angle rather than a sprawling, faceless corporate entity. And that shows in its monitors, both for better and worse. Notably, there have been some serious missteps when it comes to actually shipping its products. Arguably, that goes with the startup territory.

That startup mentality is also absolutely evident with the Dough Spectrum Black 32 OLED, the brand's very latest PC monitor. It ups Dough's ante to fully 32 inches of glorious 4K OLED action and, just like the original Spectrum, you can sense the effort and investment that's gone into elements like the design aesthetic and build quality of this new OLED panel. You just don't see that from the bigger brands.

Where models from more mainstream monitor makers tend to feel a bit cheap, plasticky and mass produced, even at this premium end of the market, the Dough 32 OLED sports a lush metal rear chassis and a gorgeous, beautifully engineered alloy stand, albeit that latter feature is an optional $100 extra.

Add in the slick Corning Gorilla Glass screen cover and minimalist design and you have easily the most physically desirable 4K OLED we've yet reviewed. It makes most of the alternatives feel like cheap, disposable toys.

Dough Spectrum Black 32 OLED specs

Dough Spectrum Black 32 OLED

(Image credit: Future)

Screen size: 32-inch
Resolution: 3,840 x 2,160
Brightness: 275 nits full screen, 1,000 nits max HDR
Response time: 0.03 ms
Refresh rate: 240 Hz
HDR: DisplayHDR 400 True Black
Features: LG WOLED panel, Adaptive Sync, 1x DisplayPort 1.4, 2x HDMI 2.1, Gorilla Glass 3
Price: $1,099 (without stand) | £1,299 (Hub model)

It's also mostly well specified. Dough has gone for the LG WOLED option as opposed to Samsung's QD-OLED tech. There are pros and cons to both panel types, but the LG panel in 4K format is the company's third generation WOLED technology and, importantly, it closes the gap to Samsung for full-screen brightness.

Actually, if anything it's a bit brighter full-screen for white tones, though QD-OLED probably still has the edge for color brightness. That's because WOLED, as the name implies, has an additional white subpixel to boost brightness that obviously does its best work with white tones.

Anyway, Dough has specced this panel up at 240Hz like most 4K OLEDs and its 275 nit full-screen and 1,000 nit peak HDR (in a 3% window) brightness ratings are largely par for the course, as is the claimed 0.03ms response performance.

(Image credit: Future)

To that you can add both HDMI 2.1 and DisplayPort 1.4 connectivity, plus a USB-C interface with 100 W of power delivery for slick single-cable connectivity, although the 100 W rating, while at least equal to that of most comparable displays, means that there's not enough power for a genuine gaming laptop. The only arguable exception to that is the new HP Omen Transcend 32 OLED with its 140 W USB-C interface, which is just about enough for some low-spec RTX 4060 laptops.

Anywho, while there is USB-C for video input and device charging, there's no USB hub for peripheral connectivity. At least, there isn't with this model which goes for $1,099, or $1,199 with the stand. A so-called "Hub" model which adds that functionality and a few other features is coming, but will be priced at $1,299 without the stand. Ouch.

(Image credit: Future)

Out of the box, the sRGB calibration in SDR mode looks great.

That's a pity, because the USB-C interface without a hub rather undermines the whole single-cable connectivity notion. If you're not going to have a hub, it probably makes sense to ditch USB-C altogether and hit a lower price point. Still, at $999 with a stand, the Dough Spectrum Black OLED is actually pretty attractively priced given the overall feature set and clear build and design quality edge it has over the competition. If it performs well, it could be a no brainer as our pick of the current 32-inch 4K bunch.

Sadly, that's not quite the case. Out of the box, the sRGB calibration in SDR mode looks great, with accurate colors and plenty of punch with the brightness cranked up. There is a little ABL-driven brightness variation depending on how much of the screen is being ignited. But it doesn't distract to anything like the extent that ABLs or automatic brightness limiters did with earlier generation LG WOLED-equipped monitors.

Image 1 of 4

Dough Spectrum Black 32 OLED

(Image credit: Future)
Image 2 of 4

Dough Spectrum Black 32 OLED

(Image credit: Future)
Image 3 of 4

Dough Spectrum Black 32 OLED

(Image credit: Future)
Image 4 of 4

Dough Spectrum Black 32 OLED

(Image credit: Future)

The Gorilla Glass cover is likewise very sweet, enabling a sense of heightened contrast but without excessive reflectivity. In fact, it makes for the best glossy implementation of an OLED we've seen. You really would have to be using this monitor in incredibly bright ambient conditions to choose the matte version Dough will also be offering starting at $899 with no stand and no USB-C port.

The problems begin when you enable HDR.

As an SDR panel, then, this thing really pops, the pixel density thanks to the 4K resolution is fab, and then you're getting all the usual OLED upsides including ridiculously fast response times, great viewing angles and perfect per-pixel lighting. However, the problems begin when you enable HDR.

For starters, despite updating to the latest firmware, which supposedly fixes an HDR brightness issue, our review sample wasn't delivering its full rated HDR brightness.

(Image credit: Future)

Meanwhile, SDR tone mapping in HDR mode is essentially broken. That means you have to switch between SDR and HDR modes depending on content type. Not the end of the world? Nope. But it's a problem that lots of HDR monitors used to have and far fewer do now, and it contributes to a broader impression of a monitor that's not quite ready for retail.

(Image credit: Future)

On that note, the USB-C interface in its current state is essentially broken and only worked intermittently with the multiple laptops we tested it with. This should be a fairly straightforward fix for Dough, just like the SDR tone mapping in HDR mode. But as we write this review, actual customers are receiving monitors, and they really shouldn't be shipping out with these fairly basic flaws.

Buy if...

You want the best looking 4K OLED around: The Spectrum Black is beautifully built and designed.

Don't buy if...

You want something polished and free from flaws: Dough needs to put a little more work into the calibration and into ironing out the bugs with the USB-C interface.

This is, frankly, a little disappointing. At $999 with the stand, this could be our favourite 4K OLED. The design and build are lovely and the LG WOLED panel is fundamentally fabulous. Fix the SDR tone mapping and the USB-C interface and this will be a killer product. As it is, the Dough Spectrum Black 32 is very much a hold rather than a buy.

If you're interested, our advice is to keep a weather eye on the Dough Reddit forum and jump in when it's clear all the issues have been sorted. They very likely will be, at which point the Dough Spectrum Black 32 OLED with Gorilla Glass will be very appealing, albeit at a price.

Ideally, we'd like to see a Gorilla Glass version with the arguably redundant USB-C ripped out and offered for about $100 less. That would be a very compelling proposition. As it is, we'd counsel caution for now. In its current state, the Dough Spectrum Black 32 OLED isn't quite ready to be bought.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/dough-spectrum-black-32-oled-review/ w34zFdgcRTvgKNNkWAM5BF Fri, 13 Dec 2024 17:01:13 +0000
<![CDATA[ Be Quiet! Light Loop 360mm review ]]> With a few generations of AIO liquid coolers under its belt, Be Quiet! has done pretty well in balancing cooling, aesthetics, noise and price, usually focussing on low noise and RGB-less designs. As RGB lighting has become more popular, though, it’s had to relent and add it to its coolers, even if it’s done so in limited fashion. That changes with the Light Loop 360mm, which is the most illuminated Be Quiet! liquid cooler yet.

Thankfully, a lot of the great features of previous Be Quiet! coolers are still here. You get a trio of Be Quiet!’s own Light Wings LX fans, and given the company is best known for its fans and air coolers, you can all but guarantee these will be epic.

Here, they have 16 RGB LEDs embedded in the fan hub that radiate light outwards through nine semi-transparent fan blades. The fans themselves use rifle bearings and are 4-pin PWM-controlled, topping out at a reasonable 2,100 rpm.

You get screws to mount the included fans to the radiator, but sadly there are no extras included for adding your own should you want to boost cooling or lower noise at a later date, as some other manufacturers allow. Our sample had all-black components, but there is a white version available if that’s more your thing. The actual illuminated sections are the pump top, which features a diagonal blade-like design, and the fan hubs and blades, and both can display multiple colours simultaneously. The lighting is very punchy and vibrant, and easily more appealing than that on Arctic’s Liquid Freezer III coolers.

The pump can spin up to 2,900 rpm, but as it has its own 4-pin PWM cable, it can be controlled separately. However, Be Quiet! has certainly been singing the praises of the pump’s new progressive motor IC when it comes to reducing noise, so hopefully running it at full speed won’t involve dealing with any hideous whining. Another noise-reducing feature we’ve seen on previous Be Quiet! liquid coolers is the ability to refill the cooler. Coolant can, over time, evaporate out of the sealed loop, meaning air can be introduced, potentially increasing pump noise.

Be Quiet! Light Loop 360mm specs

Be Quiet! Light loop AIO close up.

(Image credit: Future)

Compatibility: LGA 1851, 1700, 1200, 1150, 1151, 1155; AMD Socket AM5, AM4
Dimensions: 120 x 397 x 52 mm (cold plate: 55 x 55 x 1.5 mm)
Radiator: 397 mm, aluminium
Pump: Up to 2,900 RPM
Fans: 3x Light Wings LX 120 mm, Rifle Bearing, up to 2,100 RPM
Lighting: Full RGB on radiator fans, pump section
Price: $159 | £130

A coolant bottle is included to top the coolant up should this happen, and it also means the cooler can be installed in any orientation, even with the pump sitting higher than the radiator, which acts as a reservoir and traps any air in the loop. This is a well-known issue with long-term use of standard liquid coolers, but so long as you top up the coolant here every year or so, the Light Loop 360mm should live a long and happy life in any position. The radiator itself is a standard-thickness model, sitting at just 27 mm deep and 52 mm with the fans installed.

Unlike Arctic’s Liquid Freezer III coolers, which feature much thicker radiators, there should be no compatibility issues, even in cases with modest clearance around fan mounts. Arctic’s coolers also suffer from motherboard compatibility issues due to their large pumps and integrated VRM fans, but the Light Loop 360mm’s pump is very compact, so we doubt any standard motherboard will struggle to house it.

While its fan and pump cables lack the brilliant pre-tidying of Arctic’s alternatives, Be Quiet! has at least included a fan and lighting hub that combines all of the cooler’s cables into a single 3-pin ARGB cable and a single 4-pin PWM cable. This means you just need to route the cables behind your motherboard tray to the hub, rather than have them trail across your motherboard, but we’d still like to see something more elaborate from Be Quiet! in future.

The installation process is simple, although there’s a significant number of parts to deal with. Thankfully, you don’t need to remove the processor socket mechanism on Intel motherboards like you do with Arctic’s coolers and the pump itself attaches to mounting brackets you construct on the board using just two screws. On AMD Socket AM5 boards, the brackets allow for a slightly off-centre pump installation too, catering for Ryzen 7000 and 9000 processor hot spots that sit away from the centre point on the heatspreader.

Image 1 of 6

Be Quiet! Light loop AIO with RGB lighting.

(Image credit: Future)
Image 2 of 6

Be Quiet! Light loop AIO fans close up.

(Image credit: Future)
Image 3 of 6

Be Quiet! Light loop AIO fans close up.

(Image credit: Future)
Image 4 of 6

Be Quiet! Light loop AIO fans close up.

(Image credit: Future)
Image 5 of 6

Be Quiet! Light loop AIO close up.

(Image credit: Future)
Image 6 of 6

Be Quiet! Light loop AIO liquid.

(Image credit: Future)
Buy if...

✅ You want a very quiet pump: Pump noise can be a major drawback for AIO liquid coolers, but the Light Loop 360mm’s pump was practically silent even at full speed

✅ You want to top up coolant: Coolant can evaporate out of the loop over time, causing the pump to become noisy, but Be Quiet! includes extra coolant and a fill port to top it up

✅ You want great cooling: The powerful pump and fans were able to cool a Core i7 14700K very effectively

Don't buy if...

❌ You’re looking for the cheapest RGB AIO liquid cooler: There are certainly cheaper options both with and without RGB lighting that have large 360mm radiators

❌ You want a simple installation: While the pump uses just two screws to fit to the mounting mechanism, there’s a large number of overall components to deal with

The pump’s noise level was impressively low; it was only audible at full speed if you turned down all the other system fans and put your ear within a foot of it, so if you’re concerned about pump whine then we can highly recommend the Light Loop 360mm, and that’s before playing around with the pump speed, which cut the noise even further. The fans, on the other hand, were not so quiet, adding a few decibels to the noise we’ve seen from coolers such as Arctic’s Liquid Freezer III 360 and proving definitely less pleasant to sit next to at full speed. Thankfully, as you edge away from full speed, the noise drops significantly and the noise quality improves, so if you’re particularly noise-sensitive, we’d suggest limiting the fan speed to only hit 100% under extreme conditions.

There’s definitely scope to do this too, as the Light Loop 360mm outperformed the Arctic Liquid Freezer III 360 in all of our tests and, even once the ambient temperature was factored in, undercut the Arctic cooler by up to 5°C when pitched against our Core i7 14700K, with a temperature of 77°C in our x264 test. It even shaved a few degrees off the temperature in our game test and returned to idle temperatures far quicker as well.
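
For clarity, "factoring in the ambient temperature" just means comparing delta-T (CPU temperature minus room temperature) rather than raw readings, roughly like the sketch below; the 77°C figure is the x264 result above, while the ambient and Arctic numbers are placeholders chosen only to illustrate that roughly 5°C gap.

def delta_t(cpu_temp_c, ambient_c):
    # Delta-T: how far above room temperature the CPU sits, so results from
    # warmer or cooler test days remain comparable between coolers
    return cpu_temp_c - ambient_c

AMBIENT_C = 21.0                          # placeholder ambient reading
light_loop = delta_t(77.0, AMBIENT_C)     # 77°C x264 result quoted above
arctic_lf3 = delta_t(82.0, AMBIENT_C)     # placeholder, picked to show the ~5°C gap
print(light_loop, arctic_lf3)             # 56.0 vs 61.0 degrees over ambient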

Ultimately, the Be Quiet! Light Loop 360mm is powerful, flexible and looks fantastic, and its unique ability to have its coolant topped up potentially gives it a longer lifespan than other coolers too. The negative points you’ll want to be aware of are the relatively high noise levels from its fans at full speed and a relatively high price in some regions for what is, feature-wise, a fairly basic AIO liquid cooler: it isn’t software-controlled and lacks cable-free fans or other premium extras. However, its excellent performance, good build quality and super-quiet pump go a long way to make up for this.

]]>
https://www.pcgamer.com/hardware/be-quiet-light-loop-360mm-review/ RzBLvAcdaaNM7aHSiwxjKB Fri, 13 Dec 2024 15:30:43 +0000
<![CDATA[ Intel Arc B580 review ]]> I dread to think how bad things would have been had Intel released this Arc B580 graphics card back in the summer. All year long we've been expecting the new Battlemage GPU architecture to arrive in a new suite of discrete Intel graphics cards, but we've had to wait until the very last minute of 2024 for the company to be able to legitimately say it hit its target of a launch this year.

And where the previous generation Alchemist GPUs were notoriously unreliable in their levels of gaming performance, we were given to believe such things wouldn't be an issue with Battlemage. Indeed, from looking at the Xe2 GPU's first outing inside Lunar Lake, things did look pretty positive.

In its first discrete graphics card of this Battlemage generation—strangely the card whose Alchemist equivalent launched last—the $250 Arc B580 is offering a hefty 12 GB frame buffer with a healthy 192-bit bus, and a re-designed graphics core aimed both at being more familiar to game engines and at delivering on the twin modern demands of AI and ray tracing workloads. Good job, considering it's launching at a higher price than the $190 A580 did.

Intel is also offering up XeSS-FG. This is its own take on Frame Generation, the interpolation technique introduced by Nvidia to magic up game frames seemingly from nowhere. And in principle it's far more like Nvidia's take on it than AMD's, using a single-model AI technique to achieve its own effects.

There's more cache, more memory, improved core components, and the promise of solid, reliable drivers. All very positive. That positivity, however, has largely evaporated upon first contact with the Arc B580, its BMG-G21 GPU, and the PC Gamer graphics card test gauntlet.

It's not all bad, not by a long shot, but given the state of the review drivers right now, I don't want to imagine what this launch would have looked like just a few months ago.

Intel Arc B580 verdict

Image 1 of 2

Intel Arc B580 graphics card

(Image credit: Future)
Image 2 of 2

Intel Arc B580 graphics card

(Image credit: Future)
Buy if...

You're willing to wait: There may well come a time soon where these driver inconsistencies are toast, and down the line the Arc B580 could be the go-to budget GPU. But I'd wait to see if that happens first before spending the money.

Don't buy if...

You want solid, reliable performance: Isn't that what we all want, deep down? Sadly, in its current state of driver support, the Arc B580 is not the reliable GPU we crave.

The Intel Arc B580 was touted by its makers as a key 1440p budget GPU, and I wish I had the confidence to agree. The numbers I have managed to get do kinda bear that out for the most part; where it's behind the key RTX 4060 in terms of frame rates it's only by a relatively small margin at this relatively high resolution, and where it's ahead it's sometimes ahead by a fair margin, especially when you add in upscaling and frame generation.

But for the new Intel graphics card to be able to get any kind of recommendation from me it had to do one thing: run consistently. You had one job, Battlemage… Just run the games we throw at you, and not flake out.

Sadly, while things looked super positive from Intel's carefully picked pre-launch benchmarks, that hasn't translated into independent testing. I've heard of other reviewers having driver issues in their testing, so I know I'm not alone with the failure rate in our new GPU benchmarking suite.

Consistency was always going to be key and it's just not there yet with the Arc B580. It's going to be a problem for Intel to turn this initial impression around, even with the sorts of performance bumps we saw in Alchemist drivers over the past couple years.

You had one job, Battlemage… Just run the games we throw at you, and not flake out.

If the issues are ironed out, however, then this $250 GPU will be a hugely tempting budget graphics card in a market where it's been tough to make a positive recommendation before. The $300 RTX 4060 is the obvious alternative, but its 8 GB VRAM has always been a stumbling block for its own 1440p frame rates. On the AMD side the similarly priced RX 7600 XT, with its 16 GB VRAM, is a far less tempting proposition. That hefty frame buffer is still hobbled by a 128-bit aggregated memory bus and RDNA 3 still struggles with ray tracing.

So, it feels like there is an opportunity here if Intel can make the Arc B580's handling of PC games far more consistent. That window of opportunity might be closing rapidly if Nvidia gets anything like a budget GPU out of its Blackwell generation in the first half of 2025, though.

Right now, however, it feels like too much of a lottery when for a small amount more you can buy a boring Nvidia card that will just work.

Intel Arc B580 architecture and specs

Image 1 of 2

Intel Arc B580 graphics card

(Image credit: Future)
Image 2 of 2

Intel Arc B580 graphics card

(Image credit: Future)

The Arc B580 is the first discrete graphics card Intel has released to sport the new Battlemage, or Xe2 GPU architecture, and its BMG-G21 chip represents a necessary change from the Alchemist GPUs launched in the tail end of 2022. We've covered the architectural changes in our Lunar Lake coverage, but it does bear repeating here what Intel's switched around and why it's done so.

In raw terms, the BMG-G21 looks like a lesser chip than the A580 or the A750 of the Alchemist generation, but in performance terms the B580 is actually somewhere between the $190 A580 and the $290 Arc A750. That's because Intel has made sweeping changes to its architecture to both improve efficiency and performance. And maybe to improve compatibility along the way, too.

Arguably the biggest change is the switch from SIMD8 to native SIMD16 execution; the narrower SIMD8 approach was seen as a bit of a misstep in the original Arc GPU design. Previously there were essentially eight lanes of processing for each instruction fed to the GPU; that has been swapped out for a native 16-lane design, giving a wider execution model that helps efficiency and might even give it a little fillip in terms of compatibility.

Another mistake Intel has fessed up to is the decision to run some DirectX 12 commands entirely in software-emulated states, rather than having them supported by specific units within the GPU hardware itself. Intel states that its architects ran deep-dives on various workloads to see what its graphics acceleration hardware was doing, and this has led to the Xe2 GPU being far better optimised for DX12 and making better use of the hardware at its disposal.

The biggest jump is the Execute Indirect command, which was one of those previously emulated in software but is used widely by modern game engines such as Unreal Engine 5. Now that it has support baked into the Xe2 hardware, Intel is claiming that part of a frame's draw time is anywhere from 1.8x to a massive 66x faster than with Alchemist, though apparently it averages out to around 12.5x in general.

Image 1 of 2

Intel architectural breakdown of new Battlemage GPU designs

(Image credit: Intel)
Image 2 of 2

Intel architectural breakdown of new Battlemage GPU designs

(Image credit: Intel)

But there are a host of other small improvements in that graphics acceleration area which should all end up delivering the extra final frame rate performance that Intel is promising with Battlemage. And that is effectively a significant increase in performance per core, which is why the 20 Xe cores of the B580 are so much more effective than the 24 or 28 cores of the Arc A580 or Arc A750.

Those second-gen cores will be familiar to anyone who checked out the Alchemist GPUs in any depth, but Intel has kinda smooshed things together rather than separating them out. Instead of having 16 256-bit Vector Engines in each of the cores, there are now just eight 512-bit Vector Engines. And instead of 16 1024-bit XMX Engines, there are now eight 2048-bit XMX Engines. But that native SIMD16 compute capability means there's no loss of parallelism from the new structure, and the GPU can throw the full force of a Vector Engine's floating point units at a task if it needs to, rather than having to pull them together from separate blocks.
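
A quick sanity check of that rearrangement, using the engine counts and widths quoted above, shows the total vector and matrix width per Xe core is unchanged; only the granularity is different.

# Per-core totals from the figures above: same overall width, fewer, wider engines
alchemist_vector_bits = 16 * 256     # 16 Vector Engines at 256-bit each
battlemage_vector_bits = 8 * 512     # 8 Vector Engines at 512-bit each
alchemist_xmx_bits = 16 * 1024       # 16 XMX Engines at 1,024-bit each
battlemage_xmx_bits = 8 * 2048       # 8 XMX Engines at 2,048-bit each

assert alchemist_vector_bits == battlemage_vector_bits == 4096
assert alchemist_xmx_bits == battlemage_xmx_bits == 16384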

Another reason for the improved Xe2 performance is that Intel has also bumped up the cache levels, with the first level cache being boosted by 33% to 256 KB in total, and a total of 18 MB of L2 cache, up from 16 MB in the previous generation.

Ray tracing was one of the more positive parts of the Alchemist picture, and Intel has improved on its second-gen RT units again. It was a major headache for Intel to implement originally with Alchemist, but being able to build upon it for Battlemage means that Intel has a better ray tracing GPU than AMD at this point in time.

Image 1 of 3

Intel architectural breakdown of new Battlemage GPU designs

(Image credit: Intel)
Image 2 of 3

Intel architectural breakdown of new Battlemage GPU designs

(Image credit: Intel)
Image 3 of 3

Intel architectural breakdown of new Battlemage GPU designs

(Image credit: Intel)

Another positive has been Intel's own upscaling solution, XeSS, which is also seeing a version two with Battlemage. Though, arguably, the actual upscaling part isn't changing, just getting a slight naming change for clarity, now being known as XeSS Super Resolution in order to differentiate it from the new features of XeSS2.

The key one being XeSS Frame Generation, or XeSS-FG. Nvidia kicked off the interpolation race, and true to form Intel has followed its lead rather than matching AMD's less full-force simulacrum. That's because Intel, like Nvidia, has specific matrix engines inside its GPUs rather than doing all its extra work in shaders as AMD does.

The new XMX AI engines can hit both FP16 and INT8 operations, which makes them well situated to cope with the rigours of modern generative AI fun times. But they also have a part to play in the new frame generation feature of XeSS, as that has AI elements too.

Image 1 of 2

Intel XeSS Frame Generation breakdown

(Image credit: Intel)
Image 2 of 2

Intel architectural breakdown of new Battlemage GPU designs

(Image credit: Intel)

XeSS-FG is a single-AI-model implementation that looks at the previous frame as well as the new frame in flight, blending optical flow and motion vector reprojection to create interpolated frames.
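
To give a rough sense of the general idea (and only the general idea: this is not Intel's algorithm, and none of these names come from the XeSS SDK), a crude frame interpolator reprojects the previous frame along per-pixel motion and then blends it with the incoming frame, something like this:

import numpy as np

def interpolate_frame(prev_frame, next_frame, motion, t=0.5):
    # Toy interpolation: prev_frame/next_frame are (H, W, 3) arrays in [0, 1],
    # motion is (H, W, 2) per-pixel displacement from the previous frame to the
    # next one, and t is where the generated frame sits between the two inputs.
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]

    # Nearest-neighbour reprojection: sample where each pixel "came from"
    src_x = np.clip(np.round(xs - t * motion[..., 0]), 0, w - 1).astype(int)
    src_y = np.clip(np.round(ys - t * motion[..., 1]), 0, h - 1).astype(int)
    reprojected = prev_frame[src_y, src_x]

    # Blend towards the newest frame to paper over disocclusions
    return (1 - t) * reprojected + t * next_frame

Intel's actual model is AI-driven and far more sophisticated, but the shape of the problem is the same: two rendered frames plus motion data go in, one interpolated frame comes out.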

At the moment that's only available in one game, F1 24, but is certainly impressive from what I've seen. It will function in any Battlemage GPU, including those of Lunar Lake.

Alongside that, and often linked for what will become obvious reasons, is XeSS Low Latency, or XeSS-LL. This is akin to Nvidia's Reflex feature and is designed to cut the PC or display latency (depending on what you want to call it) of a game. That is fundamental for fast-paced competitive games, but also vital to make the most of Intel's XeSS-FG, too. Frame interpolation always adds latency as there's another step in the process before a rendered frame is displayed, but having XeSS-LL in play will cut that back down to a more standard level.
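
That latency cost is easy to ballpark with some entirely made-up numbers: the interpolated frame has to be shown before the real frame it precedes, so every rendered frame is held back by roughly one output-frame interval, plus the time taken to generate the fake frame.

# Back-of-envelope latency maths with hypothetical numbers
native_fps = 60
output_fps = native_fps * 2              # frame generation doubles presented frames
native_frame_ms = 1000 / native_fps      # ~16.7 ms between rendered frames
added_delay_ms = 1000 / output_fps       # ~8.3 ms extra before each real frame appears
print(native_frame_ms, added_delay_ms)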

Combine all the XeSS2 features and you get far greater performance, with both higher frame rates and lower latency. A win-win. So, how does it all actually perform when it comes down to the numbers?

Intel Arc B580 performance

Image 1 of 3

Intel Arc B580 graphics card

(Image credit: Future)
Image 2 of 3

Intel Arc B580 graphics card

(Image credit: Future)
Image 3 of 3

Intel Arc B580 graphics card

(Image credit: Future)

In raw native resolution terms it is the dictionary definition of 'a mixed bag'.

Intel's claims of beating out the immensely popular—but certainly due a toppling—RTX 4060 graphics card definitely do have some merit from our own testing, but it's absolutely not a cut and dried case of Intel dominance of the budget market. Though, were you to just take the 3DMark performance of the B580 as gospel you would have a markedly different view of the situation than you get once you actually look at it on a game-by-game basis.

Intel has shown, from Alchemist onwards, how well optimised its drivers and hardware are when it comes to UL's benchmark standards. The 3DMark Time Spy Extreme performance of the Arc B580 is some 45% higher than that of the competing Nvidia GPU, which looks like a stellar achievement. In Port Royal, too, there's a huge gap in the ray tracing benchmark, with the Intel card having a 31% lead over the Nvidia RTX 4060.

But as soon as you start talking about games, that's where things look different. In raw native resolution terms—testing the hardware itself rather than upscaling algorithms—it is the dictionary definition of 'a mixed bag'. In one benchmark it will sit slightly behind the Nvidia card, in another it will be slightly ahead, another still and it's well behind, and then well ahead again in further tests.

This up and down performance was a feature of the Alchemist range of cards, and I was hoping it wouldn't be the case again with Battlemage. The twist here, however, is that where the A-series cards would just perform poorly, with this first B-series card I'm coming up against games that simply will not work.

Cyberpunk 2077 is a particularly frustrating example, because without upscaling enabled it completely froze our test system while trying to load into the game world. Not just a crash-to-desktop, but a complete lock which required a hard reset. And it's particularly frustrating because if you look at the pseudo real-world performance—our 1440p testing with upscaling and frame generation enabled where possible—the Cyberpunk 2077 performance is unbelievably good. Like, almost a two-fold performance hike over the RTX 4060 with the RT Ultra preset enabled at 1440p. It's a battering of, well, more than 3DMark proportions.

Then I had Homeworld 3 refusing to run in DX12 mode, and then performing really poorly in the DX11 mode I had to enable just to get some B580 numbers for the game.

There are glimpses, however, of where the Battlemage hardware, with all its architectural improvements, is making a big difference in terms of how it performs in games. And when you take into account the difference XeSS-FG makes in F1 24 it does lead me to feel more positive about how this card could end up in the future. The fact that, even with the RTX 4060 using its own Frame Gen feature, the B580 gets over 50% higher frame rates, and looks damned good while doing it, is testament to what Intel has done with the feature.

It's also notable that even when there are performance disparities on the negative side for the Arc B580, that chonk 12 GB frame buffer really helps shrink the gap when you start looking at higher resolutions.

When it comes to the system-level performance of the card, too, it's a positive story. The card itself looks great, stays impressively cool and, even without messing around with extra power profiling in the BIOS and Windows, it's a relatively efficient GPU, too. Its performance per watt is up there with Nvidia's efficient Ada architecture, despite the B580 being a much bigger chip.


PC Gamer test rig
CPU: AMD Ryzen 7 9800X3D | Motherboard: Gigabyte X870E Aorus Master | RAM: G.Skill 32 GB DDR5-6000 CAS 30 | Cooler: Corsair H170i Elite Capellix | SSD: 2 TB Crucial T700 | PSU: Seasonic Prime TX 1600W | Case: DimasTech Mini V2

Intel Arc B580 analysis

Image 1 of 2

Intel Arc B580 graphics card

(Image credit: Future)
Image 2 of 2

Intel Arc B580 graphics card

(Image credit: Future)

There are few things as disappointing as wasted potential. And maybe it's too harsh to slap that tag onto the new Intel Arc B580 graphics card only a day out from its eventual public release, but in my time testing the new Battlemage discrete GPU that's the overriding feeling I'm left with: Disappointment.

Though, the sucker that I am, it is still tinged with hope for the future.

By the way, I get it. I shouldn't be talking about feelings as someone who purports to be a serious hardware journalist, especially not about how they pertain to a fresh lump of silicon and circuitry.

But I so wanted this new generation of Intel graphics architecture to be a tangible improvement from Alchemist, and there are glimpses of the true promise of the new Battlemage architecture here, shrouded as they are in the now-familiar veil of consistently inconsistent software drivers. The fact we're still talking in those terms about Intel's second generation of discrete graphics cards is so disappointing when we were promised its drivers suffered from "no known issues of any kind."

For what it's worth, even just looking at the release notes for the first Arc B580 driver I installed when I started my testing, it's pretty clear there absolutely are "known issues."

But it's not the total train wreck I was preparing for after my first few benchmarks in our new GPU testing suite. It was maybe unlucky that where I started my testing also just happened to be where there are serious points of failure with the new GPU. Performance picked up with later tests, and I've been impressed with XeSS Frame Generation for the little I've seen of it in the only game supporting the new feature, but the early going was rough.

I was initially interested in seeing what the 12 GB of VRAM would mean for creator applications, and it slapped in the Procyon GenAI benchmark compared with the RTX 4060, which will be its major competition at this level. But shifting on to the PugetBench for DaVinci Resolve tests, I get my first failure… not at any particular point, but every time I've tried the benchmark (even after subsequent updated driver releases just days before launch) it falls over at some point in the process.

Then I started at the beginning of the alphabet in our gaming test suite. Black Myth Wukong works, but delivers gaming performance behind the RTX 4060 it's supposed to be topping by an average 10%. Then there's Cyberpunk 2077 and the game cannot even load into the world unless you enable some form of upscaling.

Cyberpunk 2077, by the way, does deliver a significant performance boost over the RTX 4060 when you're pitting DLSS and Frame Generation against the Intel card utilising AMD's GPU-agnostic equivalent features. So, at least there's that.

A subsequent successful game bench is then followed by a total catastrophic collapse with Homeworld 3, which crashed before the first splash screen. After a back and forth with Intel's PR I could then bench the game in DX11 mode by using a command-line argument on boot, but that just delivered performance well off the budget competition.

Image 1 of 3

Intel Arc B580 graphics card

(Image credit: Future)
Image 2 of 3

Intel Arc B580 graphics card

(Image credit: Future)
Image 3 of 3

Intel Arc B580 graphics card

(Image credit: Future)

How do you recommend someone who's not a total contrarian spend their cash on an inconsistent graphics card?

You'll be relieved to discover I'm going to abandon the benchmark blow-by-blow now, but I just want to show how inconsistent the testing process has been. Maybe I've just been unlucky and have somehow devised the perfect suite of gaming benchmarks to hit the only problems the new Arc B580 GPU has, and you might also suggest that if only three out of 11 specific tests have failed it's not that bad.

But those are infinitely more problems than you'll have if you spend another $40 on the mature Nvidia GPU. And therein lies the rub: how do you recommend someone who's not a total contrarian spend their cash on an inconsistent graphics card?

If it were merely a case of the card not performing as well in some games and easily out-pacing the competition in others, that wouldn't be such an issue; I'd take the cost savings and enjoy the hell out of a great new budget GPU. But it's not: it's a case of not knowing whether the card will even boot a given game. In the short time I've been given to test the card ahead of launch and ahead of the holiday season, I've not had the chance to just chuck a ton of different games at the Arc B580 to see how widespread these failures are. But my anecdotal evidence isn't painting a particularly positive picture.

This isn't the end for the Battlemage graphics card, however, as bad as it might seem on day one. Intel has shown with Alchemist that it is capable of shoring things up down the line with subsequent driver releases, and I'm told fixes for both my Cyberpunk 2077 and Homeworld 3 issues are in-hand as I type. So Intel could be again borrowing some of AMD's classic 'fine wine' ethos, where the struggling cards at launch are slowly transformed into functioning members of gaming society by fresh software.

Certainly the outstanding performance in Cyberpunk 2077 with upscaling enabled gives me hope, as does the exceptional F1 24 results, too.

And, if that does happen across the board, my own testing figures do show a card that has the potential to be a real budget champion if it can nail that consistent performance across a wide range of titles. Right now, though, it's a struggle to make it a confident recommendation as a buy now GPU.

]]>
https://www.pcgamer.com/hardware/graphics-cards/intel-arc-b580-review/ BxjKstsiNHwNofTitW9qYR Thu, 12 Dec 2024 14:01:58 +0000
<![CDATA[ HP Omen Transcend 32 OLED review ]]> Pity poor old HP. It's late to the 32-inch OLED gaming monitor market with the new Omen Transcend 32. Call us a little jaded, but it feels like we've seen it all before, what with our reviews of 4K OLEDs like the LG UltraGear 32GS95UE, Alienware 32 AW3225QF, Asus ROG Swift OLED PG32UCDM, Samsung Odyssey G8 OLED G80SD, or MSI MPG 321URX.

Ultimately, there are only two sources you can go to for the actual 32-inch 4K OLED panel, LG and Samsung. So, every monitor including this new HP is a derivation of one of those two familiar technical themes. What chance, then, that HP has come up with something individual and compelling?

Actually, the Omen Transcend 32 is pretty nice. The caveat is that it all hinges on price. Conceptually, HP has gone for a full-feature take on the 32-inch OLED riff. It gets the full 240 Hz treatment, sports probably the best connectivity of any display in this class, and HP has included some additional capabilities claimed to make this a great display for content creation, too.

All of that would have you expecting a pretty painful price. And indeed the MSRP is a hefty $1,299. That's around $500 more than the MSI MAG 321UP, which sports essentially the same Samsung QD-OLED panel. The difference is that the MSI lacks true single-cable USB-C connectivity and also runs at 165 Hz.

HP Omen Transcend 32 OLED specs

HP Omen Transcend 32 OLED

(Image credit: Future)

Screen size: 32-inch
Resolution: 3,840 x 2,160
Brightness: 250 nits full screen, 1,000 nits max HDR
Color coverage: 99% DCI-P3
Response time: 0.03 ms
Refresh rate: 240 Hz
HDR: DisplayHDR 400 True Black
Features: Samsung QD-OLED panel, Adaptive Sync, 1x DisplayPort 2.1, 2x HDMI 2.1, USB-C with 140W PD, KVM switch
Price: $999 | £1,199

However, you can actually buy this HP Omen for $999 from Best Buy, even at launch. In fact, Omen's own website links you through to Best Buy if you click "Shop Now", though the listing for the Omen Transcend 32 on the sister HP website allows you to buy direct for $1,299. Much will hinge on that disparity, but we'll come back to the sordid matter of money later.

To drill down into the specifics, then, this Omen monitor uses Samsung's latest QD-OLED 4K panel running at the aforementioned 240Hz and offering 0.03ms response times. In other words, it's hella quick.

It's rated at 250 nits full screen just like every other Samsung-equipped 32-inch 4K OLED, along with a peak of 1,000 nits for HDR content in a 3% window. Again, much of a muchness with other Samsung-based 4K OLEDs, albeit those are great specs.

The Transcend 32 begins to carve out something of a niche with its industrial design. The white rear section of the chassis is a little redolent of some Alienware and Samsung Odyssey panels. But there are a few signature Omen flourishes, such as the square base and slim stand. Overall, it looks pretty slick, though perhaps feels a little plasticky for such a premium panel.

(Image credit: Future)

For the record, the display adjusts for height and tilt and rotates into portrait mode, but doesn't have any swivel support. There is also an invisible magnetic headphone hook on the left side of the panel for hanging headsets.

However, where the Transcend 32 really begins to separate itself from the competition is connectivity. Along with the dual HDMI 2.1 and DisplayPort 2.1 ports, plus a USB hub with both USB-A and USB-C connectivity, most notable is the USB-C interface with DisplayPort Alt mode and 140 W of power delivery.

Yup, you read that right: 140 W. That's a fair bit higher than the usual 90 W to 100 W that USB-C interfaces top out at on most monitors. HP claims that it's enough to get a "full experience" with a single cable when gaming with its own Omen Transcend 14 gaming laptop.

(Image credit: Future)

We suspect that's because the Transcend 14 can be had with a relatively low-wattage Nvidia RTX 4060 GPU. Whatever the case, what we can say for sure is that in our testing with a Lenovo laptop equipped with an RTX 4080, the 140 W USB-C power delivery certainly wasn't enough for a "full experience". Using the USB-C connection alone for power cut the frame rate in half playing Cyberpunk 2077, and even the reduction in power usage that implies wasn't enough to prevent the battery being depleted during gameplay.

Anyway, the bottom line is that the 140 W USB-C interface almost certainly won't be enough to keep most gaming laptops juiced. But what it does allow for is a reasonably powerful laptop to be used alongside a gaming desktop. To that end, HP has included full KVM switch support.
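
To put rough numbers on that, here's a quick back-of-the-envelope sketch. The laptop power draw and battery capacity figures below are illustrative assumptions rather than measurements from our testing; only the 140 W power delivery figure comes from the monitor's spec sheet.

```python
# Back-of-the-envelope check: can 140 W of USB-C power delivery sustain a gaming laptop?
# The laptop figures are illustrative assumptions, not measured values.
usb_c_pd_w = 140        # the monitor's USB-C power delivery (from the spec sheet)
laptop_load_w = 230     # assumed total system draw of a high-end gaming laptop under load
battery_wh = 80         # assumed laptop battery capacity in watt-hours

deficit_w = laptop_load_w - usb_c_pd_w
hours_to_empty = battery_wh / deficit_w

print(f"Shortfall under load: {deficit_w} W")
print(f"Battery drained in roughly {hours_to_empty:.1f} hours of gaming")
```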

It's also worth noting that, likely due to the high-spec USB-C interface, this monitor has easily the most massive power supply we've ever seen for a monitor. It's rated at 480 W and weighs a metric ton.

Connectivity aside, what of the Transcend 32's image quality? While there aren't really any surprises, we can confirm this is one of the better Samsung-based 32-inch 4K OLEDs we've seen. It's nicely calibrated in sRGB mode, but perhaps more importantly, SDR content actually looks its best and most punchy with HDR enabled.

That means you can simply select HDR and you're good to go with all kinds of content. It's also good to see that panel brightness is very consistent in all modes, with little if any evidence of brightness variability on the desktop depending on the content being shown. In other words, the panel's ABL or automatic brightness limiter is not at all intrusive.

HP Omen Transcend 32 OLED

(Image credit: Future)

That may be down to the graphene film panel heatsink and fan, with the latter operating pretty much inaudibly during our review. Anyway, this is as punchy a 4K QD-OLED as we've seen. It's also slightly less prone to the overly warm color balance that some QD-OLEDs exhibit. Nice.

Yes, you can still see the slightly grey, purplish tinge to the panel itself in very bright ambient light conditions. All QD-OLEDs suffer from that.

But it won't be an issue for most users and the general image quality, contrast and HDR performance is just fab.

The contrast and HDR visuals in particular benefit from a glossy coating that manages to really let the per-pixel OLED lighting pop while at the same time not being excessively prone to reflectivity. It's all nicely judged.

(Image credit: Future)

The net result is a pretty stellar gaming experience. Cyberpunk's ray-traced HDR visuals look utterly spectacular on the Transcend 32, while the 240Hz refresh and vanishingly short response times make for a very nippy experience in shooters like Counter-Strike 2. Of course, a 4K panel like this wouldn't be your first choice if you want the absolute last word in low latency and high frame rates. But if you're not a pretty serious esports type, there's more than enough speed on offer here.

Along with all that gaming goodness, this is a really lovely panel for more mundane desktop duties. The pixel density of the native 4K resolution makes everything much crisper and sharper than, say, a 27-inch 1440p or 34-inch ultrawide panel.
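
For a rough sense of that sharpness gap, the little sketch below works out pixels per inch for a 32-inch 4K panel versus a 27-inch 1440p one. It's standard diagonal arithmetic rather than anything specific to this monitor.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"32-inch 4K:    {ppi(3840, 2160, 32):.0f} PPI")   # ~138 PPI
print(f"27-inch 1440p: {ppi(2560, 1440, 27):.0f} PPI")   # ~109 PPI
```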

HP has also added a few features specifically for content creation pros, including configurable HDR clipping levels, manual or AV Info Frame HDR controls and factory calibration. We're likewise big fans of Omen's OSD controls. The interface is exceptionally logical, clear and easy to use and has all the features and options most users will want to see.

(Image credit: Future)

Oh, and HP has built in a quartet of downward firing speakers that perform comfortably above the norm. They're not going to displace even fairly basic dedicated speakers. But they produce decent audio in a pinch and are useful as a temporary backup.

The overall upshot, then, is that this is certainly one of the more appealing and better resolved 32-inch 4K OLEDs we've seen. Is it the best? That's hard to say. But one thing is for sure. Its appeal very much hinges on price.

Buy if...

You want 4K OLED sizzle with great connectivity: HP's take on the 4K QD-OLED thing has the best connectivity of the lot.

Don't buy if...

You want the most cost effective option: You can get basically the same QD-OLED panel for about $200 less.

At HP's $1,299 MSRP, we're not overly bothered. There are alternatives that can be had for $300 less and will deliver a similar experience. But at the $999 tag that's currently applied at Best Buy, the Transcend 32 is a different proposition.

In fact, if USB-C connectivity and sharing a monitor like this across a desktop gaming rig and a laptop is what you desire, then this Omen is probably our pick of the current bunch.

Of course, we could argue that if all you care about is exclusively gaming with your desktop and you don't mind forgoing 240Hz for 165Hz, the cheapest of the MSI options is the obvious pick. And if the $200 price differential is meaningful to you, then that absolutely makes sense.

But if you can afford the full $1,000, as an overall package in terms of its strong feature set and marginally superior image quality compared to cheaper alternatives, this HP Omen is probably worth the extra cash. It's certainly worthy of being on your 4K gaming OLED shortlist.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/hp-omen-transcend-32-oled-review/ DchLE8REyDrUkY5ArAip75 Wed, 11 Dec 2024 14:04:39 +0000
<![CDATA[ Arctic Liquid Freezer III 360 A-RGB review ]]> With an enviable reputation for making some of the best value AIO liquid coolers on the market, Arctic has avoided making hardware too outlandishly feature-rich in an effort to offer great cooling without needing a bank loan. Now on its third generation of Liquid Freezer models, the likes of the Liquid Freezer III 360 A-RGB trump a lot of the competition in terms of value. This is thanks to unique features that boost their CPU cooling prowess while also keeping other parts of your PC cool for a lot less than many similarly sized options.

They’re available in a range of sizes in both 120 mm and 140 mm fan configurations, with the model I'm reviewing being the Liquid Freezer III 360 A-RGB. If you have more modest requirements then 240 mm and 280 mm models are also available, alongside a monstrous 420 mm model with three 140 mm fans.

This review concerns the illuminated white model, with RGB lighting spanning both pump and fans, but if you want to save some cash there are standard black versions with RGB lighting too, which cost even less. The equivalent of the 360 mm model I'm looking at here, for example, can be had for just £77 here in the UK—that's good value for a 360 mm AIO liquid cooler in anyone's books.

Part of what impresses here is the extra thick radiator. At 38 mm, its depth is a good 10 mm or so above what you’d see on most other coolers and the added surface area inside means more heat can be dissipated with the same airflow. In theory it’s able to deal with more heat before its fans need to spin up too, but the added thickness can also mean more restriction.
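
To put a rough figure on what that extra thickness buys, the sketch below compares internal radiator volume against a typical 27 mm-thick 360 mm radiator. The 27 mm figure is an assumption for a "standard" AIO, not an Arctic spec, while the footprint comes from the spec box below.

```python
# Rough radiator volume comparison; the 27 mm "standard" thickness is an assumption.
length_mm, width_mm = 398, 120   # Liquid Freezer III 360 radiator footprint
arctic_thickness_mm = 38
typical_thickness_mm = 27

arctic_vol = length_mm * width_mm * arctic_thickness_mm
typical_vol = length_mm * width_mm * typical_thickness_mm

print(f"Extra internal volume: {100 * (arctic_vol / typical_vol - 1):.0f}%")   # ~41% more
```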

Liquid Freezer III 360 A-RGB specs

A cooling fan glowing blue, with the Arctic logo on top.

(Image credit: Antony Leather)

Compatibility: LGA1851/1700, AMD Socket AM5/AM4
Dimensions: 120 x 398 x 38 mm (cold plate: 40 x 40 x 1.5 mm)
Radiator: 398 mm, aluminium
Pump: Up to 2,800 RPM
Fans: 3x Arctic P12 PWM PST A-RGB 120 mm, Fluid Dynamic Bearing, up to 2,000 RPM
Lighting: Full RGB on radiator fans, pump section
Price: $110 | £77

Thankfully the fins inside the radiator are spaced a little wider apart to reduce that restriction, but it can still potentially be a limiting factor, especially at lower fan speeds. The other issue with a thicker radiator is case compatibility, but most cases that can fit a standard liquid cooler can also house a Liquid Freezer III, as we're only talking another 8-10 mm.

If in doubt, your case's manual or product page often has this information, though five minutes with a ruler will also suffice. Still, the thickness requires some fairly beefy fans, and the P12 PWM PST A-RGB fans included top out at 2,000 rpm, though there are AIO liquid coolers with fans that spin way north of this. So I'm hoping Arctic has done its research and struck a balance between fan speed, airflow and the thicker radiator.

The other eyebrow-raising feature is the removable fan that sits on top of the pump section. This drives air down onto your motherboard and helps to cool components such as the VRMs and SSD heatsinks that otherwise can get very toasty due to an AIO liquid cooler’s lack of local airflow compared to an air cooler. There are two cables included in the box that you’ll need to choose from and attach to the pump to allow for different modes of control.

One combines the pump, radiator fans and VRM fan into a single 4-pin connector, ramping all up or down depending on CPU temperature. The other cable separates those outputs giving you individual control. If you’re worried about the pump or VRM fan making noise (after all, the latter does max out at 2,500 rpm), thankfully you can control them separately using your motherboard’s additional fan headers.

In fact, many motherboards allow you to swap from the CPU to the VRMs as the input for fan control on specific fan headers, so you could ramp the fan up only when the VRMs get toasty, which is a fabulous option. The downside is that the pump section is quite large and occasionally has issues with motherboard compatibility, especially with Mini-ITX models, so make sure you check Arctic's list on the product page.

Image 1 of 3

An array of three cooling fans, all glowing pink to demonstrate the fans' RGB lighting feature.

(Image credit: Antony Leather)
Image 2 of 3

An array of three cooling fans, all glowing blue to demonstrate the fans' RGB lighting feature.

(Image credit: Antony Leather)
Image 3 of 3

A fan cooler attachment for the coldplate.

(Image credit: Antony Leather)

All these separate cables sound like a nightmare, but Arctic has at least combined the three fans' 4-pin and RGB cables into a single line and also run those through the coolant tube's sleeving. Using a few cable ties—which you'll have to source yourself as Arctic doesn't include these—you could easily combine the separate cables into one strand for neatness. The RGB lighting itself covers the pump and fans from one cable. It's vibrant and contrasts well with the white components, although it's maybe not as bright as more expensive coolers.

Arctic further impressed us with the inclusion of a contact frame for LGA1700 and LGA1851 CPUs to reduce the pressure that can otherwise cause the CPU to bend or affect its contact with the pump section. This sounds great on paper, but it was a little fiddly to install. The AMD mount is also offset slightly so that the pump's contact plate sits directly over the hottest part of the CPU, which on modern AMD chips is not quite in the middle.

Installation was otherwise pretty straightforward, but attaching the correct PWM cable for your needs before mounting the pump to the motherboard definitely requires a bit of work, as doing so later is tricky. There seems to be plenty of length to the tubing too and, with the cables on the fans pre-tidied, the installation didn’t take long at all.

Arctic cooling hardware mounted inside of a PC.

(Image credit: Antony Leather)
Buy if...

✅ You want excellent cooling without breaking the bank: 360mm AIO liquid coolers can be very expensive but, despite offering plenty of unique features, the Liquid Freezer III 360 A-RGB is one of the cheaper 360mm models around.

✅ You like the idea of active VRM cooling: The Liquid Freezer III 360 A-RGB's VRM cooling fan provides local airflow that aids the cooling of both your VRM and even nearby M.2 SSDs.

✅ You’re worried about your CPU bending on Intel sockets: The Liquid Freezer III 360 A-RGB includes a contact frame for Intel LGA1700 CPUs to prevent bending.

Don't buy if...

❌ You're looking for elaborate RGB lighting: The RGB lighting here is fairly basic and while it's vivid, it won't make your PC pop like more advanced lighting on other coolers.

❌ You want a simple installation: While its AMD installation is straightforward, needing to remove the socket mechanism on Intel boards can be fiddly and a bit daunting.

With everything at full speed, I recorded a reading of 49 dBA on my sound meter, but without any horrible whines or droning. The removable VRM fan proved its worth too, dropping the peak VRM temperature by 5°C with no discernible increase in noise either. The pump was a tad loud at full speed, but the VRM fan housing encloses it, hiding most of the noise within, plus the pump only reaches full speed when the fans run flat out with the single PWM cable.

Tuning the fan down to 40% speed saw it operate almost silently, and I only saw a 2°C rise in the peak load temperature with my Core i7 14700K. In my Metro Exodus game test, the CPU reached 69°C, but rose to 83°C and 84°C in 3DMark's Steel Nomad stress test and the x264 encoding test respectively. Cinebench was a little cooler at 80°C, while it took the cooler just under three minutes to drop back to idle temperatures from load.

If you're looking for an affordable AIO liquid cooler that can handle the hottest CPUs out there while operating very quietly under lower loads, the Arctic Liquid Freezer III 360 A-RGB is well made, easy to install (for the most part), and even includes RGB lighting. The VRM cooling fan is an added benefit, and the option to control it, the pump and the radiator fans separately for fine-tuned airflow is a welcome addition that caters to both die-hard tinkerers and those who just want to build their PC and start racking up headshots.

]]>
https://www.pcgamer.com/hardware/cooling/arctic-liquid-freezer-iii-360-a-rgb-review/ uPCGbYUNskmd5eyvyddmT4 Tue, 10 Dec 2024 17:21:22 +0000
<![CDATA[ Thermaltake Gaming Desk Pegboard (Medium) review ]]> Of all the various trinkets and doodads that find their way across my desk and into my loving arms, rarely is there one that tidies my workspace. But that's what this Thermaltake pegboard does, and it does so in a rather simple but stylish way.

Smart and sturdy: that's how I'd describe this pegboard. But that's if I were comparing pegboard to pegboard, in a world where gaming desk pegboards were commonplace. We don't live in such a world, however. In this one, the average person has nary even seen a desk pegboard in the wild, let alone owned one.

So, much better that we first clarify exactly what this thing does and whether it's better to have one (in general) than not have one (in general). Here goes.

It's a plate—a piece of your finest powder-coated steel—chock full of holes to fit various storage and organisation accessories. It's essentially a DIY-er's space-saving shelving unit. But one that looks a little trendy in this gamer space. We've seen big ones being used on gaming desks such as the Corsair Platform: 6 and Dezctop Bifrost Elite.

The Thermaltake pegboard is one that you can fit on your non-pegboardy desk to get in on some of that peggy action, letting you hook and balance all kinds of things hopefully not-so-precariously vertically up the side of your desk.

Specs (Medium version)

Thermaltake Gaming Desk Pegboard (Medium) with 2x controller and 1x headset accessories

(Image credit: Future)

Size: 42 cm (width) x 32 cm (height)
Weight capacity: < 11 lbs (5 kg)
Accessories: Hook x2, headset holder x1, controller holder x2, pin magnet x3
Price: $30
More: It's a pegboard. What more do you want? Holes?

So, how's it stack up? Well, first things first, I've had no complaints at all about sturdiness and build quality. It's only held to my desk by its clamping mechanism, but I've honestly pushed and shaken it quite hard and it doesn't move an inch. The steel is plenty strong, too.

It's only really with the single-peg accessories (the hooks) that "flimsy" comes to mind, because without two fixed points it can, of course, turn a little. But that's just the nature of single-peg pegboard accessories in general.

It was easy to set up, of course, being just a twisty-clampy ordeal. And once it's set up, it does look rather smart. I have mine on the left-hand side of my desk, which also gives my desk a little more of a boxed-in, cosy feel.

How about function? On that front, I've been quite pleased with it, but I'd have liked some different accessories to ship with it. I'm not sure whether I'm an oddball in this respect, but I have zero use for hooks or pin magnets. There's nothing small and magnetic that I want to attach, and no standalone cables I want easily available. That leaves me with a headphone holder and two controller holders for this Medium version. I only really need one controller holder, so that's just two useful accessories that come with it (for my use case, anyway).

Image 1 of 3

Thermaltake Gaming Desk Pegboard (Medium) with 2x controller and 1x headset accessories

(Image credit: Future)
Image 2 of 3

Thermaltake Gaming Desk Pegboard (Medium) with 2x controller and 1x headset accessories

(Image credit: Future)
Image 3 of 3

Thermaltake Gaming Desk Pegboard (Medium) with 2x controller, 1x headset, 1x hook, and 1x pin magnet accessories

(Image credit: Future)

What I'd have really liked is either a pen holder or a flat platform for me to display something, possibly even whatever book I happen to be reading at the moment. (I did actually rest a book on the controller holder for a while, but it was a little precarious.)

The large version of the pegboard, which is $10 extra and 52 x 42 cm, comes with just such useful accessories. In addition to some hooks and magnets (ugh), it also comes with one headset holder, two pen holders, and one storage shelf. But no controller holders in this one, for some reason. I had to go for the Medium because my desk has metal supports underneath that limited my clamping room, but I'd have probably gone Large if I could.

The main annoyance for me is that there doesn't seem to be an ideal package, whether small, medium, or large. What would be ideal? Well, IMO, that would be one controller holder, one headset holder, a pen holder, and a long platform, plus maybe a shorter display platform. And sure, throw in some hooks and pin magnets—why not? There's no version that comes with all of these things, though, and that's my only gripe.

Buy if...

✅ You want to save desk space: Saving space is the primary purpose of a desk clamp pegboard.

✅ You don't want to do any drilling: Its desk clamp mechanism is sturdy and requires no drilling or mounting.

Don't buy if...

❌ You want to attach lots of different kinds of things: Unfortunately, you don't get a full variety of accessories with this pegboard.

On the plus side, it looks like the holes are spaced about an inch apart, so you might be lucky fitting third-party accessories to it, given a lot of them have similar peghole spacing.

And that brings us to the crux of it: Thermaltake's marketing this as a "gaming desk pegboard", but don't for a second assume that it's the only desk-clamp pegboard on the market and that this is a gaming-specific niche. Just search "desk clamp pegboard" on Amazon and you'll come up with a range of options, including some with incredibly similar accessories.

Saying that, though, the price does seem about right for this Thermaltake one. In fact, it's significantly cheaper than many third-party options from brands I've never heard of.

Where does that leave us, then? Well, it's not a particularly unique product and it comes with a frustratingly thin variety of accessories, but it's reasonably priced, comes from a known gaming brand, and feels like a sturdy, quality product.

My suggestion: If you're looking for a pleasant way to save some desk space and display your gubbins, and if you're willing to look out for some cheap third-party accessories, go for it. But if you want a wide variety of accessories in the box, this pegboard unfortunately isn't it.

]]>
https://www.pcgamer.com/hardware/gaming-desks/thermaltake-gaming-desk-pegboard-medium-review/ keZ5Y7PcfYqpsGeUXieG3c Fri, 06 Dec 2024 16:27:46 +0000
<![CDATA[ Team Group MP44 4 TB review ]]> It wasn't that long ago when new gaming PCs shipped with a 256 GB SSD as the main drive and a 2 TB HDD to store all your games on. Now you can easily replace both with a single big solid-state drive. And in the case of this Team Group MP44, with 4 TB of capacity, you can do so without having to spend a small fortune.

That said, paying over $220 for an SSD is still a lot of money, but it works out at less than six cents per gigabyte. For a drive with performance claims of 7,400 and 6,900 MB/s sustained read/write, that's about as cheap as it currently gets. So you'd be forgiven for thinking that Team Group must be using poor-quality components to keep the price down.
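
That cost-per-gigabyte claim is easy to sanity check, using nothing more than the US price quoted in this review:

```python
# Cost per gigabyte for the 4 TB Team Group MP44, using the US price quoted here.
price_usd = 225
capacity_gb = 4000        # 4 TB, using the decimal (marketing) definition of a terabyte

cents_per_gb = price_usd / capacity_gb * 100
print(f"{cents_per_gb:.1f} cents per gigabyte")   # ~5.6 cents/GB
```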

The MP44 uses the same controller and NAND flash memory modules as those in the Lexar NM790, and nobody has ever accused that SSD of being cheaply made. It is simply a case that Team Group has sourced the best value parts on the market and delivered them all in a package that's free of frills and fancy features.

For example, there's no DRAM cache to help maintain sustained performance. Instead, like all such DRAM-less SSDs, it uses part of its capacity in a pseudo-SLC mode. NAND flash comes in various types (SLC, TLC, QLC, etc) and the YMTC 232-layer modules in the MP44 are TLC-based.

Team Group MP44 4 TB specs

A photo of a Team Group MP44 SSD

(Image credit: Future)

Capacity: 4 TB
Form factor: NVMe, 2280, M.2
Interface: PCIe 4.0 x4
Memory controller: MaxioTech MAP1602A
Flash memory: YMTC 232-layer TLC NAND
Rated performance: 7,400 MB/s sustained read, 6,900 MB/s sustained write
DRAM cache: None (dynamic SLC cache)
Endurance: 3,000 TBW
Warranty: Five years
Price: $225 | £268 | AU$585

These offer lower performance than SLC modules but far more capacity. However, modern TLC chips can run some sections as if they're SLC, and the MP44 uses this high-speed section as a cache—buffering data transfers to ensure the drive keeps going at full speed for as long as possible.

Something else the MP44 lacks compared to some of the competition is a heatsink, and with a maximum operating temperature of 70 °C, it's important that the drive is installed into a motherboard with a decent M.2 heatsink, as well as having good airflow inside the case.

That limit is 15 °C lower than that of the WD Black SN850X, the best gaming SSD around right now, which means the Team Group drive isn't ideal for situations where the SSD isn't going to be especially well-cooled (e.g. inside a console, laptop, or small form factor PC).

It does come with a graphene label covering all of the memory modules and controller but I found it rather easy to remove, which suggests it's perhaps not as effective at dissipating heat as one would believe, given the use of graphene.

Benchmarking SSDs is both simple and tricky. It's easy to throw one into a PC and run a series of tests to examine its performance in a variety of scenarios, but at the same time, the results one gets have the potential to be impacted by a number of factors.

For example, if you use a PCIe 4.0 M.2 slot on a motherboard that's managed by the chipset and not the CPU, then you're not likely to see the absolute full performance of the drive.

In reality, though, such differences rarely affect how well a drive works. You certainly won't notice your games loading considerably slower if you use a chipset-connected slot rather than a CPU-connected one.

As you can see in the above results, the Team Group MP44 performs better than any of the other SSDs in the charts, although some of the test results are so similar that it's fairer to say that the MP44 is just as good as any other DRAM-less SSD.

Team Group does sell 4 TB SSDs with heatsinks but they're obviously more expensive than the MP44. If your gaming PC isn't best suited for a heatsink-less SSD, then you'd be better off choosing one of them.

PC Gamer test rig

CPU: AMD Ryzen 9 9900X
Motherboard: Asus ROG Crosshair X670E Hero
RAM: 32 GB Corsair Vengeance DDR5-6000
GPU: GeForce RTX 4070
Storage: 2 TB Silicon Power XS70
PSU: MSI MAG AB50GL 850 W
OS: Windows 11 23H2
Chassis: Open platform
Monitor: Acer XB280HK

Performing a sustained and demanding sequence of data writes to the MP44 highlights the size of the pseudo-SLC cache. An average write speed of around 5,900 MB/s is maintained for around 196 seconds before dropping to 2,650 MB/s, which points to a maximum cache size of a little over 1 TB. Unless you're trying to write lots of huge 4K video files to the drive all the time, the MP44 will sustain that 5,900 figure comfortably.
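
That estimate is simply the sustained write speed multiplied by how long the drive held it, as this quick sketch of the figures above shows:

```python
# Estimating the pseudo-SLC cache size from the sustained-write test above.
write_speed_mb_s = 5_900   # average write speed while the cache holds out
duration_s = 196           # seconds before the drive drops to ~2,650 MB/s

cache_mb = write_speed_mb_s * duration_s
print(f"~{cache_mb / 1_000_000:.2f} TB of pseudo-SLC cache")   # ~1.16 TB
```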

That cache size is a lot larger than I was expecting. The WD Blue SN5000, for example, has a maximum SLC cache of 800 GB or so, and the Lexar NM790, which uses the same components, is smaller still at 600 GB. The MP44 doesn't perform quite as well as that drive in a sustained write but it's not far off, and I'd take the larger cache any day.

Do note that all dynamic SLC cache sizes depend on how much free space is available in the SSD—the fuller it gets, the lower the maximum size the cache can be.

During this test, the Team Group drive reached a reported peak temperature of 66 °C in an open-platform PC, with no active chassis cooling. That's a little close to the thermal limit for my liking but it's probably fine if one has a decent number of fans blowing air over it.

Image 1 of 5

A photo of a Team Group MP44 SSD

(Image credit: Future)
Image 2 of 5

A photo of a Team Group MP44 SSD

(Image credit: Future)
Image 3 of 5

A photo of a Team Group MP44 SSD, with its heatspreader label removed, showing the chips on the circuit board

(Image credit: Future)
Image 4 of 5

A close-up photo of a MaxioTech MAP1602A SSD controller chip, from a Team Group MP44 SSD

(Image credit: Future)
Image 5 of 5

A close-up photo of a YMTC 232-layer TLC 3D-NAND flash memory module, from a Team Group MP44 SSD

(Image credit: Future)

However, I have no concerns about the overall performance of the Team Group MP44. Even when the SLC cache is full, the write rate is more than acceptable. One note of interest, though, is that once the MP44 hits around 50% of its capacity, the write rate drops to an average of 980 MB/s, whereas the NM790 maintains a steady 2,700 MB/s.

That's possibly because of the controller limiting the rate to prevent the thermal limit from being reached, but if you need to write 2 TB files on a regular basis, then a low-cost SSD like the MP44 probably shouldn't be your number one choice.

Buy if...

✅ You want an SSD with masses of space: 4 TB is as big as it gets right now, unless you want to spend an equally huge amount of money.

Don't buy if...

❌ Your PC doesn't have great cooling: The low thermal limit means you really need to ensure that the MP44 is covered by a good heatsink with lots of air flowing over it, if you want to avoid performance throttling.

With an endurance rating of 3,000 TBW, you could write 1 TB of data to the MP44 every day for a total of eight years before reaching that figure. You can certainly use the MP44 as a primary drive and not worry about its longevity, that's for sure.
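
That endurance sum works out as follows, with the 1 TB of writes per day purely an illustrative workload:

```python
# How long a 3,000 TBW endurance rating lasts at an (illustrative) 1 TB of writes per day.
endurance_tbw = 3_000
writes_tb_per_day = 1

days = endurance_tbw / writes_tb_per_day
print(f"{days / 365:.1f} years of daily 1 TB writes")   # ~8.2 years
```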

There's an awful lot to like about the Team Group MP44. For its capacity, it's very good value for money—there are cheaper 4 TB drives available but they certainly don't have the same level of performance as this one—and it's perhaps all the SSD you'd ever need in your gaming PC. I personally prefer to have multiple drives, to separate apps, documents, games, and whatnot, but I appreciate that some people are perfectly happy to use just one big drive.

I do wish it had a higher thermal limit, though, and if I were fitting an MP44 in one of my rigs, I'd be looking to ensure there was a big fan blowing air across it at all times. That's tricky to achieve if it's mounted in the primary M.2 slot, as the graphics card will be belching out heat right next to it.

But as a secondary drive, inserted in a lower M.2 slot, I'd have no such qualms and this seems to be the ideal use scenario for the Team Group MP44. Sure it doesn't have DRAM—but when it's this good, who really needs it?

]]>
https://www.pcgamer.com/hardware/ssds/team-group-mp44-4-tb-review/ Yf9TsdVF2LkFHCvReeFyFB Fri, 06 Dec 2024 15:56:28 +0000
<![CDATA[ Indiana Jones and the Great Circle review ]]>
Need to know

What is it? A first-person, globetrotting adventure starring everyone's favorite archeologist.

Release date: December 8, 2024 (Dec 5 advanced access)

Expect to pay: $70/£60

Developer: MachineGames

Publisher: Microsoft

Reviewed on: Nvidia GeForce RTX 3070, Intel Core i5 12600K, 32GB RAM

Steam Deck: All signs point to "no"

Link: Official site

I wasn't over the moon about the prospect of a triple-A Indiana Jones game before starting The Great Circle. A new MachineGames joint, now that's something I'd hang my hat on, but Dr. Jones? How many stories could there be left to tell about the Connecticut-based archeologist named after Illinois' weird little neighbor? Well it turns out he had at least one damn good videogame left in the tank⁠—MachineGames' finest to date.

The pairing made great sense on paper: The neo-Wolfenstein games were pulpy, swashbuckling adventures with tons of heart where a lovable, all-American protagonist gave those dastardly Nazis what-for. But I didn't believe it, didn't feel it, until Indy donned a Catholic priest's vestments to go undercover in a fascist-occupied Vatican—a miniature open world that I can only compare to the best hubs of the Deus Ex series. The combat, stealth, and puzzles are all simple but fun enough (maybe not the stealth—more on that later), but paired with The Great Circle's phenomenal levels and its nailing of what makes Jones such an iconic character, the result is a Batman Arkham-tier "who knew a licensed game about this guy could be this good?"

Crouch walking

Image 1 of 9

Indiana Jones in priest vestments standing next to cardinal talking to priest in The Great Circle

(Image credit: MachineGames)
Image 2 of 9

In-game recreation of iconic Indiana Jones stealing the idol in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 3 of 9

Beautiful baroque office with green marble walls and floors and a large statue of a dragon in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 4 of 9

Sunny vatican courtyard with red robed cardinals and fountains in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 5 of 9

Looking down at in-game journal with notes and sketches of characters and locations from Indiana Jones and the Great Circle.

(Image credit: MachineGames)
Image 6 of 9

View of library reading room from balcony in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 7 of 9

First person view of cardinal holding candlestick in storeroom in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 8 of 9

Sumptuous interior with marble floors in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 9 of 9

Cozy office with bookshelves and chandeliers in Indiana Jones and the Great Circle

(Image credit: MachineGames)

Rather than the dual-wielding ultraviolence of Wolfenstein, The Great Circle is all about more brand-appropriate stealth, hand-to-hand brawling, exploration, and puzzle solving. There are guns, but limited ammo makes them more situational tools⁠—I only ever used Indy's trusty revolver to help take down a particularly meddlesome boss. Of The Great Circle's four pillars, the exploration is Indy's ace in the hole, while the rest range from "situationally tolerable" to "pretty fun." This game can be a bit wonky, but the overall experience, the immersion, atmosphere, and sense of fun are greater than the sum of its parts, with the magic and believability of MachineGames' vision of the 1930s and The Great Circle's fantastic storytelling carrying the day. I'm reminded of how the Elder Scrolls games have bad stealth, bad combat, samey dungeons, and uneven skill systems, but sell a fictional world so well, so completely, they're still some of the best RPGs around.

In a preview of Indiana Jones and the Great Circle, PCG senior editor Robin Valentine worried it would wind up being just a mediocre stealth game. I'd go so far as to say it's a bad stealth game, with the saving grace being that sneaking about Solid Snake-style is only core to a few story missions, while it can be minimized in the side stuff.

The Great Circle has simple, line-of-sight stealth, without much in the way of sound propagation, visibility control (camouflage or light/shadow), silent ranged takedowns, or tools for keeping track of enemies (a tagging system or radar). None of that stuff is particularly Indiana Jones, but only one of The Great Circle's stealth levels feels tailor-made to these limitations. A nighttime infiltration of Rome's Castel Sant'Angelo feels fit to task, with close quarters, limited numbers of guards, and lots of vertically-oriented alternate paths well out of their sightlines.

The other pure stealth sequences later are decidedly not that, with tons of enemies whose crisscrossing lines of sight preclude picking them off one by one, while the levels simultaneously lack reliable alternate paths around those guards. One main mission in particular felt like a base lifted straight out of Metal Gear Solid 5, but with less than half of that game's information and tools at my disposal. These sections absolutely sucked, and I found myself just crouch walking through them as quickly as possible, ignoring any side content and keeping my fingers crossed that none of the guards' "hey, I see you!" meters would fill up completely as I passed.

Marshall college view from Indiana Jones and The Great Circle showing a poster for the school chess club

(Image credit: MachineGames)

Thankfully there are only a few mandatory examples of these in the game, and stealth can be otherwise minimized through The Great Circle's disguise system. It's not Hitman, but it works well enough: Each open world hub has two disguises, one you get at the beginning that allows access to most of the zone, and another, hidden costume that lets you walk freely in restricted areas. In the Vatican, Indy starts with that priest outfit and can upgrade to an Italian fascist Blackshirt's uniform⁠ (that feels very weird to say). While disguised, there's no "that guy's acting weird" quotient for you to worry about⁠—you're free to climb about, grapple with your whip in full view of anyone, whatever, with the only hard stops on your activity being thievery in plain view or archeologist-detecting officers who aren't so easily fooled.

It's as simple as the line of sight stealth, but it works much better: It freed me up to explore and soak up The Great Circle's environments without having to worry so much, while the remaining danger of being spotted by an officer adds just enough tension to keep things interesting. I also dig how NPCs will react differently to you depending on your disguise. Story-wise, this setup results in a multitude of scenes as tense and memorable as Wolfenstein's train interrogation or Venus infiltration⁠—one in particular that involved a fascist and a confession booth gave me a real belly laugh.

You've seen the brawling here before in better and worse form. You can punch and counter enemies, with a super-generous parry window meaning the fights are fairly fun and easy without wearing out their welcome. On the second-hardest difficulty, I only had real trouble with some optional boxing prizefights against The Fascist Heavyweight Champion of Italy and The Strongest Nazi Who Ever Lived⁠—I kept getting up again and again for rematches like a brave glass jaw in an old cartoon, drastically increasing Indy's risk of developing chronic traumatic encephalopathy. The one real moment of brilliance for the combat was a boss fight that paired stand up fighting with some avoidance horror straight out of an Amnesia game. The improvised weapon system, meanwhile, doesn't add much mechanically, but it's an undeniable joy to see everything you can whack Nazis with, as well as their varied takedown animations.

In the great videogame scale of "BioWare Tower of Hanoi puzzle" on the simple end to Void Stranger or "Day one Destiny Raid" at most complex, The Great Circle's brain teasers are just barely upper middle class, like a notch above Uncharted. I was only really stumped once because I missed a super obvious aspect of a puzzle's mechanics, but otherwise it was a lot of sensible self-satisfaction at deducing safe combinations, finding hidden paths behind breakable walls, and using mirrors to bounce light beams around in that one puzzle you always see. Something I really appreciated was just how many little puzzles there were⁠—most side quests involved puzzle solving or navigational challenges of some kind, and I absolutely had more fun with that than I would've with yet more base clearing or whatever else most open-worlders have us doing these days.

Papal State of mind

Image 1 of 6

Indiana Jones in professorial outfit at night in Marshall College

(Image credit: MachineGames)
Image 2 of 6

Red-lit tomb interior in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 3 of 6

Indiana Jones in priest outfit talking to boy and mustachioed man in white suit

(Image credit: MachineGames)
Image 4 of 6

View of sumptuous palace interior in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 5 of 6

Indiana Jones climbing up his whip with a roman fresco visible in background

(Image credit: MachineGames)
Image 6 of 6

Darkened tomb with sarcophagus and skeletal statues in alcoves in Indiana Jones and the Great Circle

(Image credit: MachineGames)

Indiana Jones and the Great Circle is organized around three acts with their own open world hubs, with more linear, set piece missions stringing them together, and the design of these levels is the real triumph of the game: The open ended areas have the density, inspired design, and "feels like a real place" magic of a Deus Ex hub (Paris, Hong Kong, Detroit) or Dishonored map, and exploring them was pure joy.

Indy gets a range of different objectives in the hub zones, with your main story quest front and center, a selection of three or so story and cutscene-heavy side quests that are virtually indistinguishable from the main stuff in terms of production values, and then a wide array of bite-sized "mysteries" or one-off little side-puzzles whose stories are mostly told through notes and other clues in the environment.

Vatican City is the peak of the game for my money: It's a visual feast, all sumptuous Renaissance interiors⁠—including a digital facsimile of the Sistine Chapel⁠—and towering architecture over sunny cobbled streets. It 100% feels like there was an Assassin's Creed level of historical research (and budget) poured into The Great Circle's environments. The urban density of Vatican City also really works in its favor compared to the other two hubs, which have more of a sprawl to them. The Vatican looks like it has a small footprint based on its map, but then you're climbing up and down buildings, crisscrossing through shortcuts, or popping out of windows to shimmy to hard-to-reach areas.

first person view of suitcase with Indy's whip packed into it.

Rate my everyday carry. (Image credit: MachineGames)

The second act, which takes place in the shadow of the Pyramids of Giza, almost feels more like a slice of The Elder Scrolls. It verges into a little bit of that open world checklist gameplay, but it still worked for me thanks to The Great Circle's capacity for surprise. Even if the puzzles weren't too complex, the fact that each stop in the desert was unique⁠—often with a little surprise or twist of some kind⁠—had me wanting to see what each one had in store. I was reminded of Cyberpunk's gigs or The Witcher 3's monster contracts: This side content is fun to do on its own, independent of checklist clearing or character upgrade enticements.

The hub zones also reward exploration beyond formal quests. Money and ancient relics (that Indy is ostensibly returning to their rightful homes) are liberally scattered among the tasteful environment clutter and in all sorts of secret chambers or other nooks and crannies. I found a Thief-like appeal to waiting until guards' backs were turned to hoover up everything worth taking. They unfortunately didn't have terminals to hack into back in 1937, but The Great Circle provides the analogue version of that voyeuristic immersive sim thrill with inter-office memos and fascists' diaries for Indy to purloin.

For the climactic bits of the game, The Great Circle does console-style, minimally interactive "wow factor" set pieces better than maybe any game I've played. I usually don't even go for that sort of thing⁠—I like RPGs and systemic interactions and other crusty PC gaming things. Substance, not flash! But The Great Circle had scenes that were so imaginative and surprising, I was just 100% on-board. An early one sees Indy catching a ride on a departing zeppelin by hooking it with his trusty whip, followed by a short sequence of climbing and ledge shimmying on the outside of the dirigible as Rome sprawls below you. I don't want to spoil too much, but later on you literally pull off a prop plane version of the beloved Battlefield Jet Swap, and that had me hooting and hollering in my office.

Tenure track

Image 1 of 8

Sphynx of Giza in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 2 of 8

View of Nazis playing poker in dark room under lamp in Indiana Jones and the Great Circle.

(Image credit: MachineGames)
Image 3 of 8

Snake charmer playing for viper in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 4 of 8

Looking out of tomb toward its brightly lit entryway in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 5 of 8

Indiana Jones disguised as Italian fascist limply waving.

(Image credit: MachineGames)
Image 6 of 8

View of throne in Egyptian tomb in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 7 of 8

campaign map with miniatures in German headquarters in Indiana Jones and the Great Circle

(Image credit: MachineGames)
Image 8 of 8

reaching for pile of keys in Indiana Jones and the Great Circle

(Image credit: MachineGames)

Troy Baker's turn as Indy is killer. He can say he wasn't trying to impersonate Harrison Ford, but I would have confused him for the real thing if I didn't know better. And beyond that, he just gives a phenomenal performance that nails Indy's easy, everyman-concealing-academic charisma. I'm usually against the digital de-aging/necromancy thing—qualitative issues aside, I feel like it speaks to how stunted we've gotten culturally that we apparently can't make new movie stars and have to reheat the old ones—but I'll make an exception here and hope that Harrison Ford is getting a big fat check in the mail for it.

The rest of the cast is similarly full of winners, calling to mind the great ensemble from the Wolfenstein games: You've got a fiery antifascist reporter, a jovial and portly cardinal, a sophisticated aristocratic patron of archeology, and a guerilla freedom fighter, to name a few. I'd like to make special mention of the late, great Tony Todd's turn as one of the game's antagonists⁠—this may be one of his final performances, and he just oozes menace, but with a mysterious dignity and gravitas as the eight-foot-tall man who kickstarts the adventure.

MachineGames remains the industry leader at making new kinds of sneering Nazi freaks who twirl their proverbial mustaches while declaring "Oho, you sentimental fool, but it is I who has the upper hand." There's this awful fascist-sympathizing priest who serves as your act one starter villain at The Vatican—think a stuffy sitcom principal guilty of collaboration with world-historic crimes—and he winds up serving as the perfect punching bag to introduce our true villain, Voss, the freakiest, creepiest, most "I'll get you next time, Herr Jones!"-ass Nazi villain to ever come out of MachineGames' R&D skunkworks. I wanted to punch the guy and shove him down a hill the second he showed up onscreen.

As a final note on playtime and tech: The Great Circle is a great value as far as triple-A games go, laughing those pre-release playtime estimates of "12 hours or so" outta the room with a robust runtime more in the 30-hour ballpark for a thorough, but not completionist playthrough. Graphically, this thing is beautiful but a system hog, punishing my poor RTX 3070 with its always-on ray tracing and VRAM gluttony. But even with lighting and textures down to low (the only setting that wouldn't give me a big old VRAM warning) and DLSS Performance mode at 1440p, The Great Circle is an absolutely gorgeous game. I did notice some obnoxiously flickering, shimmering shadows in the foliage-dense introductory areas that persisted no matter what shadow quality I set things to. That feels day one patch-coded, but even if it's not, it proved less of an issue later on.

The Great Circle is an odd duck to be sure, combining immersive sim level design with set piece spectacles that put Call of Duty or the Sony first party stable on notice; incredible graphical technology and triple-A polish with some truly baffling and idiosyncratic design choices. Despite (and at least partially because of) that funkiness, it's one of my favorite releases of this year and a new outing from a favorite studio that was well-worth the wait.

]]>
https://www.pcgamer.com/games/adventure/indiana-jones-and-the-great-circle-review/ VQFXrUcJuepFoxaUgAsJHg Fri, 06 Dec 2024 00:00:10 +0000
<![CDATA[ Caves of Qud review ]]>
Need to know

What is it? A sprawling science-fantasy RPG
Expect to pay: $30/£25
Developer: Freehold Games
Publisher: Kitfox Games
Reviewed on: Radeon RX 6800 XT, Ryzen 9 5900, 32GB RAM (but it runs on a potato)
Multiplayer? No
Link: Official website

Across the great salt desert there is a jungled plateau encircled by mountains and speckled with the ruins of an advanced, ancient civilization. At its center lies the Spindle, a towering cable that stretches into and through the heavens. This is where your stranger arrives, the protagonist in an epic science-fantasy RPG full of engrossing stories and elegantly-designed, deep world simulation. It's the best one I've ever played.

Caves of Qud is a traditional roguelike RPG in form, a top-down, turn-based game that relies on text and simple-yet-evocative graphics to convey its world. It doesn't stick hard and fast to permanent death, though, instead letting you choose whether you want save points and even tweak how much fighting you'll have to do. Nor does Qud stick hard and fast to the traditional roguelike rules of having an opaque, frustrating user interface and arcane, entirely keyboard-driven control scheme—it even plays very well on a mobile PC like the Steam Deck.

You make your character from a variety of archetypes that describe normal humans or the far more numerous mutant inhabitants of Qud. You then build in a relatively freeform way, choosing new skills, upgrading your abilities, and gathering equipment for a dizzying and thrilling array of possibilities. From there you set off across Qud from your starting village in true RPG sandbox fashion, choosing to follow or not follow the many quests and whimsical distractions you may come across. A lot of that will involve carefully delving into the ruins of the ancient civilization of the Eaters of Earth, fighting the strange and deadly creatures you find there, and pilfering their treasures.

Each playthrough has the same world map but randomizes nearly every local area according to a complex system of generated histories chronicling the sultans who ruled Qud in ages past. What they did then influences what you find now, where you go, and what you can expect to find there—and it integrates that generated history into the main story quest's fixed objectives in clever and subtle ways that you might otherwise think had the hand of a designer behind them. Though it may take you a hundred hours to really get to grips with and master Qud, a seasoned player can burn through the main story in 15 hours.

Layer qake

The big-idea, high weirdness science-fantasy world of Qud's far future is of an old genre (think Dune) with its own tropes, such as influence from ancient mideast cultures and mythology, but one that's so underutilized in videogames that the setting comes off as refreshing. The spires of humanity's utterly ruined past sit below and on top of Qud, forming a layer cake of dungeons filled with objects from a civilization so outrageously advanced that they invented teleportation and mined the neutron-degenerate matter at the heart of dead stars. The Eaters' civilization was once so advanced that its fall was just as spectacular as its rise, leaving Qud a vast ruin cut off from the universe and scoured by terrible plagues—fresh water is so precious, for example, that a dram of water is the only true currency on Qud.

Image 1 of 4

An adventurer in Caves of Qud approaches a statue of a sultan from a previous age.

(Image credit: Kitfox Games)
Image 2 of 4

In Caves of Qud, the player's character is surrounded by a throng of hostile hyenafolk and plated worms.

(Image credit: Kitfox Games)
Image 3 of 4

Inspecting a dagger in Caves of Qud, which has an engraving detailing a duel between one of Qud's sultans and a political aspirant, after which the sultan had the defeated claimant

(Image credit: Kitfox Games)
Image 4 of 4

In Caves of Qud, the players converses with Yurl, a sentient plant.

(Image credit: Kitfox Games)

Pure-blooded "normal" people have become one of many different successor species and ancient genetic engineering combined with reality-altering phenomena have unleashed a tidal wave of strange creatures onto this world—the many talking plants, for example, barely register as exciting compared to a Twinning Lamprey, which exists in a strange quantum state where only when both its bodies are killed simultaneously will it die. Qud is a world where you start off fighting Hyena-people in a swamp with a bronze dagger and end it clad in zetachrome armor made of forgotten matter from the start of the universe, fighting mecha and cosmic terrors while wielding an eigenrifle that fires a subatomic particle beam capable of piercing through every single thing on the screen, friend or foe.

Rich storytelling

The main quest takes you on a journey well-suited to pit stops, dotted with odd characters like a deaf-mute albino bear-porcupine gunsmith and a very rude talking fungus you have to bring with you by letting it grow on your skin. It also stays varied: You'll delve ancient ruins for lost technology, yes, but also manage diplomacy among factions, make friends, solve riddles, and unpick that generated history. All of this is written in a prose so flowery and rich that it passes beyond purple and arrives in the same place as something like Lewis Carroll's Alice's Adventures in Wonderland, utterly unafraid to pillage the depths of the English language for the right words and then invent its own words when those don't quite cut it. I could fill this review with phrases cut from item descriptions. ("It's a large piece of rock, older than every idea" reads the description for a boulder. "Its shelled torso is downside up and sprouts a bouquet of crumpled legs," I'm told about the corpse of a giant crab.)

The world and writing and story are so appealing in part because Qud draws from a diverse set of sources that aren't just other videogames and obvious, obligate nerd pop culture stuff. Its cultural borders extend well outside the limits of so many games. It's rare we get the pleasure of playing a videogame influenced by Dune or The Book of the New Sun or Gamma World that also freely quotes Lord Byron and A.E. Housman while drawing elaborate allusions to the Hebrew Bible.

Get weird

Your character's traits and background will be just as rich as the world's. You might play a True Kin, one of the last unmutated remnants of old humanity close enough to their biology and genome that the Eaters' technology, robots, cybernetic implants, and miracle medicines still recognize and work for you, enhancing yourself with cyberware like extra-large hands and firearm hardpoints to wield four two-handed chainguns at once. You might instead be a mutant with a beak, wings, night vision, and talons whose mutations grow in power so that they can sunder steel. Your mutant might have psychic powers like cryokinesis and telepathy and disintegration. You might even play as a purely physical chimera that grows twelve arms, or a purely psychic esper who becomes so powerful that their Glimmer attracts extradimensional hunters and predators from beyond our own universe.

(Image credit: Kitfox Games)

It's not just variety that makes Qud so fun, it's the depth of simulation in the world you explore. Aside from your health value or the rare damage calculation that can go to the hundreds, you rarely have to deal with numbers larger than 30 when considering statistics and attributes. But Qud's world is powered by a physics simulation that lets you drill through walls with a jackhammer or melt rock into lava or spread clouds of corrosive gas that shift in the wind. The characters you meet all have a faction and a base attitude and an allegiance to it that plays out in hilarious ways: You might be beloved by apes, causing the shaggy white monsters that roam the canyons to leap to your defense, or hated by insects, causing typically chill giant dragonflies to come after you with a vengeance. Manipulating other factions' attitudes toward you is key to enjoying Qud, causing you to form alliances with groups like the machine-worshipping Mechanimists or become self-appointed protector of the villagers of a randomly generated town. I defended one with my life because I thought it was hilarious that its mayor was a talking pig that was hated by other swine for "defiling their holy places."

Stay weird

Random generation can of course make or break a game like Qud. Sometimes it's against you in the most hilarious ways, sometimes it's just frustrating as you plumb some deep stratum praying that the next chest has some upgrade—please, any upgrade—for your gun or armor. Sometimes it's for you as it spits out a unique relic that feels purpose-made to fit just the character you wanted to play—as a sword-wielding knight I once found a shield that could reflect enemy lasers back at them.

Your tolerance for unpleasant, run-ending (or save-reloading) surprises has to be pretty high as you first discover things like which robot is armed with missile racks and how to kill a Twinning Lamprey and how, when you see a biblically accurate angel, you should just run away. You can contract infections or lose a limb, but to enjoy yourself those must be interesting problems to solve or situations to exploit, part of your overall story rather than frustrations. The fun to be found in Qud is about learning—reading the flavor text for hints, figuring out which randomly-generated named creature to befriend, and carrying a variety of tools for different situations. Always have an EMP grenade on hand, for example, to deal with wayward robots. You have to be willing to try and fail, and to learn when it's better to run away, consolidate your gains, and try again tomorrow. The challenges in Qud weren't hand-made to be beaten. Quite the opposite, in fact.

(Image credit: Kitfox Games)

That is no surprise to fans of the traditional roguelike's deep details and convoluted systems and surprising random events. For others it will be unpleasant at first. Caves of Qud has what is likely the best, most modern interface and controls in a game of its kind, but the experience is still at times frustrating as you try to unravel just what happened in the last combat round that caused you to die. For some who have not played games like this before, simply the act of learning to control and maneuver your character will be frustrating. Despite this I cannot recommend Caves of Qud enough for its innovations in mechanics and storytelling, however anachronistic it may look.

Besides, on top of it all? You can easily mod this game.

]]>
https://www.pcgamer.com/games/roguelike/caves-of-qud-review/ ckQX9qYYsoKRgwF3q4Nskk Thu, 05 Dec 2024 17:42:28 +0000
<![CDATA[ Zotac Zone review ]]> The Zotac Zone is the latest stab by a manufacturer at Valve's throne and the Steam Deck. While it comes incredibly close to being the best attempt yet—the design and layout are 100% aimed at mimicking Valve's console—it is held back by software, and not just by the pesky hand of Microsoft and Windows 11.

It's not the flashiest of entries in the ever-growing list of Windows handhelds. It's a more subtle affair, despite its typical "gamer aesthetic" outer shell. It's all in the hardware, without the gimmicks.

However, it is the PC handheld that has felt the most detached from Windows during setup. After digging through Microsoft's endless requirements and agreements held vertically (there's a gyroscope), I wasn't met with any baked-in bloatware. I did find Zotac's own software pre-installed, but it was a much older version than the latest release, and it consistently crashed.

Once I got it working, Zotac—like everyone else—presents a far worse version of Steam Big Picture Mode (Valve's additional front end originally intended for TVs) to get around Microsoft's inability to launch its own handheld, console-like experience.

Zone specs

Zotac Zone handheld gaming PC

(Image credit: Future)

Processor: AMD Ryzen 7 8840U
GPU: Integrated AMD Radeon 780M
RAM: 16 GB
Storage: 512 GB
Screen: 7-inch 120 Hz AMOLED touchscreen
Controls: Hall effect analogue sticks, gyroscope, back paddles, dual touch pads
Connectivity: 2x USB4, 3.5 mm jack, Micro SD slot, Wi-Fi 6E, Bluetooth 5.2
Battery: 48.5 WHr
Dimensions: 285 x 115 x 35 mm
Weight: 692 grams
Price: $799 | £820

Zotac's launcher is just riddled with glaringly obvious oversights. On the ROG Ally systems, Asus has presets for the performance options you'd like. Zotac's "One Launcher" instead has you build your own presets, which, without prior knowledge, makes little sense. Does the regular consumer know that 17 watts is the agreed-upon "best middle ground" for the 8840U, or what a TDP even is?

Even the launcher's controller remapper left me wanting more. A lack of options just led to frustration: there's no way to map a keyboard key to one of the back paddles, and remaps wouldn't always function even when active.

More frustrating is that the branded buttons can't be remapped either, an issue I also found on the ROG Ally. It's wasteful, when each could very easily be a generic guide button rather than causing Zotac's proprietary launcher to lurch to the front on instinctive presses.

I'm quite down on Windows handheld software, simply because we've been at this for some time now. GPD and OneXPlayer didn't figure it out in the years before the Steam Deck. Not one manufacturer has cracked it since Valve's release, with only Ayaneo really getting close.

Zotac Zone handheld gaming PC

(Image credit: Future)

While I could witter on about Windows 11 on handhelds, it's old hat. Just know that it's the same here. You'll be thumbing at the desktop environment—even with the trackpads—and fumbling through Game Pass to try to get the wretched app to install anything.

It does make me look like a fanboy, but the honest truth is that Valve's custom-built version of Linux and its major rework of Steam Big Picture into a true frontend work so well precisely because they're built around the hardware. It's the complete package, something that Zotac is ever so close to figuring out.

While Zotac's software is lacking, its hardware decisions aren't. It feels as though at every turn, it took into consideration what people would want from a higher end handheld.

Underneath the hood is AMD's 8840U, a killer chip that is the current favorite of companies like Ayaneo. It provides just that extra smidge of power over the previous 7840U, but with 16 GB of RAM it still lags behind a fully upgraded Ayaneo 2S with 32 GB, or the Asus ROG Ally X with its 24 GB, in some applications.

The Zotac Zone performs best when you have total control of its hardware. In initial tests, I found that if I didn't push it up to 30 W myself, it'd play things too safe. Performance in 3DMark was significantly lower, circling nearer to 2,100 points than the 3,027 it eventually scored.

I also saw it in Cyberpunk 2077, where the average framerate sat at 30 fps until I gave it the juice it really needed. After fumbling around in the Zotac One Launcher and creating a profile to show off the true capabilities of the system, the Zone began to show how much more important RAM is this time around.

The Zotac Zone and Ayaneo Flip DS, for example, are split by RAM speed and a meager difference of two watts in total power draw. Ayaneo's slower 6400 MT/s RAM against Zotac's 7500 MT/s just proves that a simple chip upgrade isn't all it takes these days.
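To put that RAM-speed split in rough perspective, here's a quick theoretical-bandwidth comparison. The 128-bit memory bus is my assumption (it's typical for this class of handheld APU), not a figure quoted by either manufacturer in this review.

```python
# Rough theoretical memory-bandwidth comparison (illustration only).
# Assumes a 128-bit LPDDR5/LPDDR5X bus, which is typical for this class
# of handheld APU but is not a figure taken from either manufacturer.

bus_bits = 128
bus_bytes = bus_bits // 8  # bytes moved per transfer across the bus

for device, mts in [("Ayaneo Flip DS", 6400), ("Zotac Zone", 7500)]:
    # MT/s x bytes per transfer = bytes/s; divide by 1e9 for GB/s
    gb_per_s = mts * 1_000_000 * bus_bytes / 1e9
    print(f"{device:>15}: {mts} MT/s -> ~{gb_per_s:.1f} GB/s theoretical peak")
```

On those assumed numbers the Zone's memory has roughly 17% more theoretical bandwidth, and because the Radeon 780M shares that memory with the CPU, a gap like that can plausibly matter more to the iGPU than a couple of watts of power budget.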

However, with the 8840U I found that Metaphor: ReFantazio ran exceptionally well at 1080p, 60 fps. I've been playing it quite a bit on the Steam Deck, and the jolt between seeing everything at 720p, 30 fps and what the Zotac Zone can do is a little like whiplash.

Other titles such as Ace Combat 7, Amid Evil, and Half-Life 2 all played magnificently at the higher end of the frame rate. Half-Life 2 and Amid Evil obviously held down the 120 fps mark, while Ace Combat 7 held steady at 60 fps. Even in the newest Forza Motorsport game I was hitting mid-40s with FSR supersampling helping along the way.

These handhelds do need supersampling for some newer titles. You'll never be playing that new Indiana Jones game comfortably on one of these, but some of the Unreal Engine 5 games I played also needed that extra push.

Robocop: Rogue City ran extremely poorly without supersampling and frame generation. Both are software techniques that use algorithms to claw back performance: supersampling here means rendering the image at a lower resolution and blowing it back up to the desired output, taking work off the GPU, while frame generation literally tries to create the next frame based on the data already rendered.
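As a rough illustration of why that first trick helps so much on a handheld, here's a back-of-the-envelope sketch of the pixel maths at the Zone's 1080p output. The 67% render scale is an assumed "quality"-style preset picked for the example, not a setting taken from this review or from AMD.

```python
# Illustrative arithmetic only: how much per-frame pixel work an upscaler
# saves at the Zone's native 1080p output. The 0.67 render scale is an
# assumed "quality"-style preset, not a figure from this review or AMD.

native_w, native_h = 1920, 1080
render_scale = 0.67

native_pixels = native_w * native_h
internal_pixels = int(native_w * render_scale) * int(native_h * render_scale)

print(f"Native pixels per frame:   {native_pixels:,}")
print(f"Internal pixels per frame: {internal_pixels:,}")
print(f"The GPU shades roughly {internal_pixels / native_pixels:.0%} of the pixels;")
print("the upscaler reconstructs the rest, and frame generation can then")
print("interpolate an extra frame between two rendered ones.")
```

Shading less than half the pixels per frame is where most of the extra headroom comes from; the generated frames then smooth out what the GPU actually renders, at the cost of the latency mentioned below.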

I found that Rogue City introduced quite a bit of latency when using frame generation, but also, even with FSR, still needed to be bumped down a graphical notch in the settings to achieve a steady frame rate. Essentially, as with any of these handhelds, so long as you keep your gaming expectations in check, the Zotac Zone could be your next best friend.

Image 1 of 3

Zotac Zone handheld gaming PC

(Image credit: Future)
Image 2 of 3

Zotac Zone handheld gaming PC

(Image credit: Future)
Image 3 of 3

Zotac Zone handheld gaming PC

(Image credit: Future)

One area that I wish Zotac had considered more was the storage space. 512 GB is paltry in the modern era. After installing Cyberpunk 2077 for benchmarks, as well as a few other titles, I was fast running out and it hadn't even been a day. It really needs to be 1 TB or more for a modern handheld. However, upgrading should be cheaper and easier than on something like the Steam Deck. Inside is a full-sized 2280 NVMe drive, instead of the usual 2230 short stack. This is a fantastic move for tinkerers, making the system even more flexible.

Where Zotac obviously thought things through is the AMOLED screen. Measuring 7 inches, it is glorious. Rich, vibrant colors pour out of it. Horror titles like Crow Country ooze atmosphere as the deep blacks envelop everything. I found no major haloing or weird effects coming from it, and not much in the way of color inaccuracy either.

Buy if...

You want solid hardware, and the software be damned: While the software is awkward, it does offer the best actual hardware of the bunch.

You're after a best in class screen: The AMOLED screen truly is in a class of its own on the Windows handheld front, rivalled only by the handful of other OLED-equipped handhelds.

Don't buy if...

You want an affordable option: The Zotac Zone sits with the likes of Ayaneo and Lenovo with a meaty price tag.

You're expecting to play the newest games: These handhelds play great with older titles or indie games, so no, you won't be playing Stalker 2 at full whack on the go.

The Zone is incredibly comfortable to hold, even with the jagged edges on its palm rests. Aping Valve's design, it sports two small trackpads with a definite, satisfying click. There are two USB4 ports, making peripherals or docks super easy to use.

It even has a Switch- or Legion Go-style stand to prop it up. There's also a webcam on the front for Windows Hello, and I've even used it as a last-minute Teams meeting device, helped along by that stand.

One major addition is the Hall effect sticks, meaning that even after prolonged use, you shouldn't find them drifting. Even the rings around the sticks twist to adjust the brightness, volume, or RGB lighting. While I wish I could properly remap these, it shows Zotac's genuine care in the hardware space.

Zotac has taken notice of the push in the controller space for, well, more control. The software might not be helpful, but the physical switches beside the triggers mean you can choose between a quick snap press or a longer, more analogue pull, depending on the game you're playing.

It's things like these which set the Zotac Zone apart from the other Windows handhelds. The Zone feels like a considered idea, rather than something rushed to market. Combined with its performance prowess, if you can put up with some quirks—as with any of these machines—you should put this on your list of considerations.

]]>
https://www.pcgamer.com/hardware/handheld-gaming-pcs/zotac-zone-review/ RFKctxjtuiq3ScxvuykGQB Thu, 05 Dec 2024 13:11:48 +0000
<![CDATA[ Fantasian Neo Dimension review ]]> I thought I'd had enough of RPG heroes with dead parents and memories full of holes. I was done with save crystals, world maps, and feisty princesses with magical powers. What's that? The world's very existence is under threat because an ancient and all-powerful being got a bit sad? Spare me.

Need to know

What is it? A sweet RPG exterior hiding a rock-hard battle core
Release date: December 5, 2024
Expect to pay: £44.99/$49.99
Developer: Mistwalker Corporation
Publisher: Square Enix
Reviewed on: Intel i9-13900HX, RTX 4090 (laptop), 32GB RAM
Steam Deck: Verified
Multiplayer? No
Link: Official website

But it's hard to hold such timeworn clichés against Fantasian. The game's producer and chief storyteller is none other than Final Fantasy legend Hironobu Sakaguchi, who's been making RPGs for so long he practically invented some of these tropes, or at least knew the person who did. Besides, this retro-styled approach to the plot brings a fantastic focussed attitude along with it, never spending any longer than absolutely necessary on an energetic prison break sequence or the latest almighty god's dramatic monologue. The game wants to take me away on a grand adventure filled with magic and monsters, whisked from one place to the next before I've had the chance to think too hard about why I'm currently fighting a haunted gondola or a really angry sun.

The places I end up in are as traditional as they come—a dusty little town, a shining city, strange mechanical nowheres filled with danger—but they all look brand new thanks to the game's unusual visual style.

Here the locations I visit aren't created from 3D renders or detailed pixel art, but photographs of real handmade dioramas. The effect is so stunning I often ended up wandering around just to see more of the craftsmanship that's gone into these sets, places where every tiny glass bottle and folded bed sheet has been made and then positioned by hand. There are obvious paint marks on rocky outcrops, curled corners on tiny rugs, and all sorts of other wonderful imperfections that enliven these scenes in ways no quantity of raytraced polygons could ever hope to match. At a time when a hundred CEOs are insisting faceless AI mush is the future of "creativity," this proudly human touch is a very welcome respite.

As RPG tradition demands, wandering around these incredible spaces triggers random battles against anything from mechanical snakes to adorable little mouse-gicians, and again Fantasian has a novel twist up its sleeve that turns the ordinary into the extraordinary: the Dimengeon Machine.

Image 1 of 5

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)
Image 2 of 5

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)
Image 3 of 5

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)
Image 4 of 5

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)
Image 5 of 5

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)

When it's turned on, this handy little device will automatically hoover up any enemy types I've previously fought, allowing me to grab a distant treasure chest or make my way to the nearest save point without being interrupted. The fun thing is those enemies are stored, not destroyed, and the machine needs manually emptying from time to time in one large-scale gauntlet of a battle, unless I want to suddenly face a potentially fatal grab-bag of 30 monsters in one painful go when it overflows. It's a brilliant mix of reward and risk, allowing me to avoid fights when I'm tired or in a rush, while still making me put in the work at a more convenient time. Unique items appear to help tip these longer battles in my favour, from straightforward attack boosts and status clears to the chance to take an extra turn.

And unlike the random battles in so many RPGs, whenever I choose to fight Fantasian's menagerie of monsters I always have to take them seriously. The game may be perfectly charming in cutscenes, but it's never anything less than merciless when the swords come out.

Learning how to master the unusual manual targeting system is crucial. Here attacks aren't a simple choice between hitting one enemy or a neat row of them at once—they can pierce through an uneven line if I aim them at just the right angle, or even bend in a graceful curve around a defensive blocker to hit the fragile spellcaster standing behind them.

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)

Using an analogue stick to aim instead of jabbing my finger at a touchscreen feels so natural that, bar the odd inclusion of a battery meter on the in-game menu, it's easy to believe the game had always been made with PCs in mind, and it plays like it's just another RPG sitting on my SSD. The only weak point is the lack of graphical options, reduced to resolution, FPS (up to 120), and a choice between "low" and "high" detail settings. Although considering most of the time I'm looking at a photo with a few small 3D people on it, maintaining a visual consistency between these elements is more important than really cranking up shadow detail.

I'm usually too busy hacking away at one of Fantasian's many, many, bosses to notice fine details anyway. Some change stances as the fight goes on, making them more vulnerable to attack. Some have weapons or body parts I should aim for and destroy before they use them to wipe out my entire party. There's even action-style timing in here, the right move at the wrong time perhaps completely blocked by a rotating barrier or hard shell, or not landing as powerfully as it would have if it had hit the softest body parts.

This concept really goes into overdrive in the second half of the game, to mixed results. I appreciated being forced to use every last spell, skill, and item at my disposal instead of saving them for some imagined "next time" that never came. Debuffs, barriers, and turn-altering tactics are absolutely essential life-saving techniques in Fantasian, and it's extremely satisfying to shut down everything a boss tries to throw at me.

Fantasian Neo Dimension JRPG

(Image credit: Mistwalker)

But as the game wears on too many of these epic clashes start to hinge on how I set up my characters before the battle, rather than how clever I am once I'm in it. Equipping the right sort of status-nullifying or elemental resistance-boosting gems can be the difference between getting hit by a painful blow and an unrecoverable one, and if I didn't happen to have the right gear on my team already then my only real option is to wait to die so I can fiddle with everyone's equipment after a reload. Some enemies are incredibly weak to a specific elemental attack I might not currently have unlocked, and I can't do anything at all about that until I've struggled, failed, and then gone back to an earlier checkpoint to access a character's expansive skill tree and reallocate their points to counter the latest issue.

In spite of these setbacks the pleasure is generally worth the pain, and I always come away from my sessions exhausted but happy. These late-game challenges are just one surmountable issue in a game filled with fresh takes on old ideas, and I admire Fantasian for daring to be a turn-based RPG with real teeth to it—one that makes it all but impossible to out-level or overpower my troubles.

]]>
https://www.pcgamer.com/games/rpg/fantasian-neo-dimension-review/ oabHBQFsuPkmDC6v6tbwaW Wed, 04 Dec 2024 11:00:00 +0000
<![CDATA[ Infinity Nikki review ]]> What happens when you take a long-running mobile game series all about dress-up, wrangle it away from its cellular origins, shove the concept into a Genshin Impact mold and unleash it onto PC and console? You get Infinity Nikki, Infold Games' fifth entry in the series, but the first one many will have likely ever heard of.

Need to Know

What is it? An open-world adventure with pretty outfits.

Release date December 5, 2024

Expect to pay Free-to-play

Developer Infold Games

Publisher Infold Games

Reviewed on Nvidia GeForce RTX3070, AMD Ryzen 7 2700X, 32GB RAM

Steam Deck N/A

Link Official site

Don't worry though, you don't need any knowledge of the previous games to jump in. Infinity Nikki doesn't directly follow on from any of its predecessors, but it does retain their main themes: A human stylist named Nikki and her sassy talking cat Momo end up getting sucked into Miraland, a fantasy realm stuffed with all kinds of fashion-related lore and a history that implies some sort of war among eight garment wizards.

At least, I'm pretty sure that's what's going on, from what I've gleaned. My history with the series lies almost entirely with Love Nikki, the third game in the series, one which I remember having a heap of translation issues and an oft-confusing narrative. Huge improvements have been made for the former, but the latter still remains. Dialogue was occasionally entirely unnecessary—one moment where I had to listen to Nikki back-and-forth with a character over the fact they repeated a sentence had me questioning my sanity—and concepts are either over-explained or under-explained.

(Image credit: Infold Games)

I went through a good chunk of my time not really getting anything, narratively. There are a ton of lore books knocking around though, which I found added some much-needed background history to Miraland, giving me more insight into what was actually going on around me. I often preferred these flavour texts to the main narrative, and as I read more I had a much easier time piecing together the story I'd largely been thrown into the thick of.

Sew far, sew good

To be fair, part of my half-baked understanding definitely came from the huge time-gaps between completing quests, which happened entirely on account of me repeatedly venturing off the beaten path. Infinity Nikki is the first time the series has gone open world, and it's bloody gorgeous to look at. Every blade of grass, stone fence, and babbling brook is stunning, and I regularly found myself stopping in my tracks to make use of the game's excellent photo mode.

I spent far too many hours clicking through every pose—each of which has their own subtle animations you can cycle through for a frame-perfect shot—fiddling with the lighting and the framing to capture Miraland's stunning sunsets or the reflection of the moon off a large lake. I could even leave behind a 'snapshot' which allows friends to come along to my photo spot and take a snap alongside me.

There's no better pairing for a pretty landscape than some beautiful threads to go with it, and I had a hard time finding an item of clothing that I didn't like. There's a huge variety available, from giant princess dresses to pyjamas and sneakers. While I did have the occasional clipping issue between items that definitely should be able to be worn in harmony, I had a great time mixing and matching all the different pieces I had crafted, collected, and pulled from its gacha banner.

(Image credit: Infold Games)

When I wasn't busy turning into a professional photographer, I was putting a whole load of other fits to use: Infinity Nikki's ability outfits. They're really just a way to show off loads of different pretty clothes in a variety of scenarios—there's an outfit for fishing, bug catching, animal cleaning (yes, it's adorable), floating, combat, and story-specific fits that have more niche uses like electrical repair and one for playing the violin. Additional outfit options for these can be unlocked through rolling on the gacha banners, but thankfully they're merely a cosmetic change rather than offering anything different on a mechanical level.

I regularly found myself getting lost in simply collecting everything I could. Picking every flower, catching every bug, or sneaking up on skittish birds to wipe their wings or scrubbing away at a cat's paw, with every animal receiving their own adorable cleaning animation. Making full use of the ability outfits let me get my hands on all manner of materials to craft clothing items, so it's well worth doing. They also come in handy for collecting Whimstars, which are a sort of skill-tree currency, allowing me to unlock new outfits, small stat boosts and extra rewards.

Whimstars are one of the many ways Infinity Nikki stuffs small puzzles into the world—there are your standard platform puzzlers where I have to nab a star off a roof somewhere, but there are also ones that have me finding a hidden star shape in a certain area, or even hopping into another domain to try and puzzle out a path with blocks, balls, and using the ability outfits I've picked up along the way.

(Image credit: Infold Games)

I was a particular fan of the latter, especially as I gained more ability outfits such as one that lets me shrink down and hop on Momo's back, using him to navigate small crawl spaces or make use of weight-based puzzling. It was one of the rare occasions I had to actually use my brain or implement any real level of problem solving, something which I desperately wish Infinity Nikki made more use of.

Yeah, the game's biggest issue right now is its lack of any real challenge. Despite movement feeling really great, it's never used for any particularly tight or complex platforming. While the game's dungeons are a highlight in world design, creating fantastical areas like frog-filled sewer systems or bright-white rooms with floating bookcases to hop along, they have perfectly laid out paths that rarely branch out. Almost all of Miraland's mysteries are easily solvable in a way that feels infantilising at times.

The lack of challenge is felt most in its combat, however. It's incredibly primitive, with Nikki throwing out an orb and being able to one-shot the vast majority of enemies, called esselings, in the game. Most of them will stay stationary, maybe firing out the odd projectile. Nikki's own orb can be a pain to wield sometimes too, and I found myself in situations where it was shooting straight over an enemy's head because the game couldn't comprehend that I was slightly uphill.

(Image credit: Infold Games)

The only relief is that thanks to the one-shotting, encounters are over incredibly fast. That's even true of its bosses, which offer Infold Games a better opportunity to show off that it can make engaging battles. The boss battle in chapter seven was a particular highlight for me, giving me vibes of an MMO crossed with something like Super Mario Odyssey. Even then, I was able to fell my foe in a mere 90 seconds, which left me a tad gutted.

I only wish the mini-Nikki ability outfit came hours sooner, because it really does open up the world design and puzzles in a way I would've preferred to have from the start. The final zone I visited had tons of verticality, requiring me to regularly shrink down and hop on Momo's back to be fired through a cannon, crawl through a tunnel or hop up a rope with my big ol' cat claws.

Superstylin'

I do think Infold Games is perhaps finding its feet with all these new experimental open-world activities, so I'm glad to see it's still retaining some of its classic Nikki gameplay with styling battles. They're simple enough: Every item of clothing has a handful of tags—playful, cool, retro, ballroom, elegant, you get it—along with a score in each category. One dress might have an S-rank fresh rating, while another will specialise in elegance.

(Image credit: Infold Games)

Styling battles basically put these numbers and tags to use, with NPCs dotted around Miraland setting some kind of theme to meet. Ultimately it's not the theme that matters, but the tags attached to each item of clothing. It means I ended up presenting some downright ugly fits to my opponent, but that's the way it's always been in Nikki. I've always had a soft spot for how the actual number-crunching fashion battle aspect of the series has you throwing together some right atrocities, and I'm glad to see it's still present.

It does take more of a backseat than in previous games though, and I'm kind of glad about it. Getting hold of clothes that were able to hit some ludicrously high scores was difficult at times, as the game expects you to funnel a bunch of materials into levelling up individual garments.

What materials, you may ask? Well, a little bit of everything. Even for a gacha, Infinity Nikki is a bit oversaturated in the currency and growth mats department, having a staggering array of systems and items and even different methods of doing the same thing—for example, getting a palette-swapped version of an outfit is totally different depending on how you obtained the outfit. A 5-star outfit you pulled in the banner? You'll need one material for that. A 4-star outfit you crafted? You need something completely different. Oh, and sometimes you'll need a duplicate set of the outfit, too.

(Image credit: Infold Games)

It's the classic gacha obfuscation, and I did find myself getting dizzy trying to figure it all out at times, even as a battle-hardened veteran of the genre. That was before I even had to deal with any real-life money, which of course wasn't offered to me in the review build.

I did take time to peruse the shop though, and it seems largely on par with what you'd expect from other gacha offerings. There's a battle pass, which appears to be a carbon copy of the one Hoyoverse offers, and premium currency which can be used to buy shop-exclusive outfits, ranging anywhere from a $1 offer to a $50 outfit.

When it comes to free-to-play generosity, that's something I'm still feeling out. Right now, I'd wager it's somewhere in the middle: It's not ultra-generous, but not robbing me blind either. Something I did appreciate was that, on the permanent banner at least (there was no limited banner so I'm not sure if it works the same way), you're guaranteed a 4-star item every 10 pulls, and a 5-star item every 20 pulls. The game also won't feed you any duplicates until you've completed every outfit the banner has to offer which, on the one hand, is a bummer if you're jonesing for a palette-swapped version, but it also means that for a long time you'll be getting something new. I'm sure more glaring monetisation woes and wins will reveal themselves in the weeks post-release—it's a free-to-play gacha, it's gotta make money somewhere—but right now it seems fairly reasonable.

Whatever the grind may be, I'll most definitely be taking part. Despite its over-easygoing nature—which I think may have come from Infold overcorrecting itself in wanting to make an inclusive experience—I found myself falling in love with Infinity Nikki. I can't wait to continue to soak in its picturesque landscapes and dress up Nikki to match the vibes of wherever I am. Ultimately, that's what it's about, really. Dressing up, having fun, and sharing the views with pals. I do sincerely hope Infold Games stops being scared of making things hard, because Infinity Nikki already has the groundwork laid to stand up there with the gacha greats.

]]>
https://www.pcgamer.com/games/adventure/infinity-nikki-review/ yRBi9iTSs2xJcVhygXxs3G Mon, 02 Dec 2024 15:41:26 +0000
<![CDATA[ Microsoft Flight Simulator 2024 review ]]>
NEED TO KNOW

What is it? The latest in virtual aviation that includes the highest-fidelity recreation of our planet in any videogame to date
Expect to pay: $70/£70
Developer: Asobo Studio
Publisher: Xbox Game Studios
Reviewed on: Ryzen 7 3700X, GeForce RTX 2070 Super, 32 GB RAM, Logitech Extreme 3D Pro Joystick
Steam Deck: Unsupported
Multiplayer? Leaderboards
Link: Official site

Presenting the entire Earth in even greater fidelity than its 2020 predecessor, it's hard to overstate what a remarkable technical accomplishment Microsoft Flight Simulator 2024 represents. The aircraft and cockpits are astoundingly detailed, both visually and in how they function. The physics of flight are delightfully authentic, and while it has its quirks, no one has ever even tried to include our entire planet in a game with such fine-grain detail.

Before we get ahead of ourselves, I want to acknowledge the horrendous state of the servers at launch that prevented me and many others from even being able to play at all. These have been, at least for me, almost entirely resolved as of the writing of this, though I still do run into occasional issues like the fact that I can never seem to get all of the high-res geometry in the Grand Canyon to load. It was not an acceptable launch day experience, by any means. But I have seen no indication in my testing that you will have that bad of an experience today, so I'm willing to let the past be the past.

It is worth noting, though, that due to the streaming nature of modern Flight Simulator, you will have a variable experience depending on your network set-up. I'm lucky enough to be soaring on gigabit wired ethernet, and allocated 200 GB on a fresh SSD to FS2024's "rolling cache" that stores things like terrain data for frequently-visited locations. The load times, especially for a first-time launch, are much better than they were in FS2020 regardless, but Microsoft does recommend at least 100 Mbps of bandwidth for playing on max settings. I haven't needed that much—the most I've seen FS2024 use at any given time is around 46 Mbps. But it's something to be aware of.
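For a rough sense of what those streaming numbers mean in practice, here's some quick arithmetic. The one-hour session length is my own assumption for the example; beyond the 46 Mbps observation and the 200 GB cache, none of these figures come from Microsoft.

```python
# Illustrative arithmetic only: what the observed streaming rate adds up to.
# The one-hour session length is an assumption made for this example.

observed_mbps = 46                 # peak rate seen while testing
mb_per_second = observed_mbps / 8  # megabits -> megabytes
session_hours = 1

gb_per_session = mb_per_second * 3600 * session_hours / 1000
print(f"{observed_mbps} Mbps is about {mb_per_second:.1f} MB/s,")
print(f"or roughly {gb_per_session:.0f} GB streamed over a {session_hours}-hour flight,")
print("which is why a large rolling cache (200 GB here) pays off for")
print("scenery you revisit often.")
```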

You are a pilot

Microsoft has identified three types of players Flight Simulator aims to cater to: hardcore simmers who want the most realistic experience, gamers who want to earn rewards for completing challenges, and sightseers who want to visit cool world landmarks. I like to think I have a little bit of all three wolves inside of me. I appreciate a lot of the little touches of realism, I'm a geography and architecture nerd who is easily delighted by things like being able to fly under a faithful recreation of the Golden Gate Bridge, and I sure do like it when number goes up.

While I'd say the simmers and sight-seers are well served, FS2024's attempt at a goal-driven career mode is a little bit underwhelming. I did enjoy practicing and taking exams to earn new certifications for stuff like IFR flight and jet aircraft, but the actual economics of being a freelance pilot are shallow and not that interesting to manage. It takes many, many hours of mercenary work on borrowed wings, handing over most of your pay in finder's fees, before you can even afford to own the cheapest plane available. And while the mission variety—from dropping off skydivers at 10,000 feet to helping put out forest fires—is enough to keep things interesting, I never got the feeling of running a small business that I was looking for.

Blue Yonder

A small jet plane flying over a city

(Image credit: Microsoft)

It hardly mattered when I was cruising in and out of gorgeous volumetric clouds or diving down almost low enough to dip my toes in the sparkling Nile, however. FS2024 truly looks incredible. At least from 1000 feet up, or when you're admiring the many bespoke airports and landmarks—both new and returning from FS2020—it's probably one of the best-looking games I've ever played. Beyond the bounds that human artists have touched up, though, it handles certain things better than others at eye-level.

I know, I know. This isn't Microsoft Walking Simulator 2024. But if you give me the option to exit my aircraft and trudge around with a custom avatar, of course I'm going to put it through its paces. And what I found was a startling level of believability across a multitude of distinct biomes… as long as you stick to rural or wilderness, inland areas. Walking through the wooded hills around Divide, CO that I spent a lot of my childhood in was almost eerie in how faithful it felt.

But if you get to more populated areas, things get surreal and bizarre pretty quick. Don't expect to be able to walk around the streets of Tokyo without wondering if your graphics card is malfunctioning or you're maybe having a bad trip. This engine also still can't handle places where land and water meet that well, which is unfortunate if you want to take a cruise up the Pacific Coast or around the fjords of Norway. Waves often look painted on and unmoving, and river banks frequently defy gravity in wacky ways. It's still damn impressive. I mean, they modeled the whole Earth! But these details stick out more because of how unbelievable a recreation it is otherwise.

Different yokes for different folks

A small airplane in a hangar with a pilot standing nearby

(Image credit: Microsoft)

Aside from a good internet connection and a good GPU, the remaining entry cost to have an ideal experience with FS2024 is some kind of a dedicated peripheral. You don't have to splurge on a full HOTAS if you don't want to—my trusty Logitech Extreme 3D Pro, which will run you about $30 on sale, served perfectly admirably with a keyboard beside it to give me access to more hotkeys.

Each excursion in a new vehicle [feels] almost like getting to know a new person.

But I also tried to tough it out with an Xbox controller for about 12 hours, and I can only just barely recommend this experience if that's your only option. It's playable, but not having fine throttle control is a constant issue, and all of the control surfaces are really, really touchy with that tiny thumbstick. I definitely got cramps trying to hold the correct angle of attack for climbing, and I had to adjust the sensitivity per aircraft to not put myself into a death spin on every take-off and landing. I can't even imagine trying to play with a mouse and keyboard alone.

If you're less interested in the simulation aspects, there are a lot of options to tweak your flying experience. With all of the assists turned on, it can feel pretty arcadey. Much too arcadey for my taste, but it's nice that they give you that choice. On the other end of the spectrum, you can go through a pre-flight checklist and manually flip every switch in the cockpit before takeoff, and the air traffic control system is much more detailed than FS2020. You're still going to hear a lot of uncanny AI voices, which I don't love. But given the number of missions and different airports, it's not like they could have recorded human dialogue for all of it.

Final destinations

A military jet flying over the Grand Canyon

(Image credit: Microsoft)

The variety of aircraft, with 70 even in just the base edition, is also pretty incredible—with everything from commercial airliners to fighter jets to hot air balloons. Each has its own distinct quirks and handling challenges to learn, which makes each excursion in a new vehicle feel almost like getting to know a new person. Helicopters and I never quite saw eye-to-eye, but I'm particularly partial to the rugged "taildragger" bush planes that let me take off and land quickly in the middle of some random field in Africa.

And many of these wild areas are now populated by migrating local wildlife, too, with the highly-detailed and excellently-animated models borrowed from Frontier's Planet Zoo. They don't have a wide variety of behaviors, unfortunately. You can walk right up to a polar bear or a water buffalo and they really won't even acknowledge your existence. But it's a cool little touch, and yet another excuse to use the full-featured photo functions. There's even a whole new World Photographer mode that challenges you to snap pics of various animals and landmarks, which I found to be a relaxing break from career mode.

The control panel of a plane

(Image credit: Microsoft)

If you want to really put your piloting skills to the test, there's also a selection of challenges with weekly leaderboards, ranging from perfecting difficult landings to doing what I can only describe as "some Top Gun shit," trying to fly through locales like the Grand Canyon in an F-18 while maintaining as low an altitude as possible. These are neat, but definitely a lot more stressful than the other modes as well.

The core of what makes this long-running franchise great is stronger than ever.

And this sim is also not without some quirks, glitches, and oversights. One issue I ran into multiple times was at some of the smaller, procedurally-generated airports you might fly out of for certain missions, where my plane would spawn with one wing partially stuck inside of a structure, making it impossible to take off. Other times, I'd get dinged for infractions like using my flaps at too high of a speed when my plane was standing completely still with the parking brake on. These issues tend to be small, infrequent, and with an easy workaround of simply picking a different mission. But they are still annoying.

None of that gets in the way of what Flight Simulator 2024 represents in its entirety, though. It takes the mind-boggling ambition of the 2020 sim and executes on it even better—launch woes notwithstanding—which is already a massive accomplishment. Some of the new things it tries to do work better than others, but the core of what makes this long-running franchise great is stronger than ever.

]]>
https://www.pcgamer.com/games/sim/microsoft-flight-simulator-2024-review/ SubmnZJYQiJPNECGXucp88 Wed, 27 Nov 2024 22:52:41 +0000
<![CDATA[ Lenovo Legion Tower 5i (Gen8) review ]]> Lenovo isn't messing around with its Legion gaming PCs. They are resolutely, uncompromisingly just PCs. Sure, there's the faintest nod to 'gamer' styling with the RGB-illuminated front panel and the see-through side, but the Legion Tower 5i is, without wanting to sound at all mean, pretty basic.

Which I think is grand. I am all for that when it comes to affordable gaming PCs; I don't want to see money wasted on needless luxuries when I'm chasing down a good budget rig. And there certainly aren't any of those here. Pull that side panel off and you'll see what I mean; the CPU cooler is a no-name brand, there are no VRM heatsinks or SSD-cooling plates on the barebones motherboard, and the memory sticks are likewise bare PCBs, too.

The OEM Nvidia RTX 4060 graphics card (basically a non-branded one made for system builders) is similarly simple, but beautifully so. I'm into miniature cards where they make sense, and the low-end Ada Lovelace GPU is so efficient that it doesn't need a massive dual-slot, triple-fan cooling array to keep it running to full effect.

Now, you might be getting a bit of the fear with all this talk of limited cooling options, basic CPU cooler, and a small GPU. I get it, you're worried this thing's going to overheat and get hella loud when you boot up any game more demanding than Solitaire. Bury that fear, because at this level we have components smart and efficient enough they don't overly tax the cooling options available.

Legion Tower 5i specs

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)

CPU: Intel Core i5 14400F
Motherboard: OEM B660M
Memory: 16 GB (2x 8 GB) DDR5-4400
GPU: Nvidia RTX 4060
Storage: 1 TB Samsung PCIe 4.0 SSD
Warranty: 1 year
Price: $900

In my testing neither the CPU nor the GPU goes beyond the 76°C mark under heavy gaming loads, and, while the Intel processor will hit 95°C when all its cores are pinned at 100% load, that's actually entirely standard. In fact, other systems we've tested will go all the way up to a throttling 100°C, so the fact the Legion Tower 5i doesn't go that far highlights why it doesn't need an expensive chip chiller atop its CPU.

But what is this processor of which I speak? Lenovo has gone for the Core i5 14400F, a ten-core, 16-thread Raptor Lake Refresh chip, that's arguably one of the best budget CPUs around. We still recommend the Core i5 13400F as the best budget CPU but only because it's around $10 cheaper if you're buying it as an upgrade—they are otherwise the exact same processor.

It's no productivity beast, I'll be honest. Despite that effective ten-core labelling, what you're getting are actually just six Hyperthreaded Performance cores, with a further four Efficient cores for lighter workloads. But for gaming those P-cores are the important factor, and are more than capable of delivering data to the GPU in a speedy enough manner to keep the good frames rolling.

And that GPU is the budget-focused RTX 4060. It's the lowest-spec graphics card in the RTX 40-series, with a specs sheet that only serves to highlight that positioning. It's an effective enough 1080p gaming GPU, though, and comes with all the DLSS 3.5 goodness, which includes Nvidia's Frame Generation technology for some free fps in supporting games. It's that extra GeForce frosting which slightly pushes Nvidia's budget card ahead of the AMD alternative.

Alongside that well-balanced CPU/GPU combo, Lenovo has dropped in a 1 TB Samsung PCIe 4.0 SSD (and a pretty speedy one) and 16 GB of DDR5 memory. Sadly, that bare memory is running at 4400 MT/s, but as this isn't exactly designed as a workstation beast, that slow RAM isn't going to be a real hindrance.

So, how does it actually perform? Well, our benchmark suite for gaming PCs runs at 1440p to capture performance at both ends of the market, from high-end machines to low-end ones. And while the native performance isn't too hot—you can see why they call the RTX 4060 a 1080p GPU—as soon as you start to drop in DLSS and Frame Generation you actually start to see properly playable frame rates.

Hitting a smooth 60 fps on average in Cyberpunk 2077 at 1440p, supported by that 49 fps 1% Low figure, is pretty impressive—especially given that running natively without Frame Gen and DLSS you're only getting 24 fps. Still, even with the panacea of upscaling and interpolation, I wouldn't recommend this Legion Tower 5i as anything other than a 1080p gaming PC.

You're still well behind something like the RTX 4070 Super in terms of gaming performance, but you'll only find that GPU in gaming PCs which cost well over the $1,200 mark. Which is, I guess, where we need to start talking about money.

When I first received this machine for testing it was on offer for $850 at Best Buy, and had been for a number of months, but despite there being a ton of Black Friday gaming PC deals around at the moment, the cheapest this rig is on offer for is $900 at Lenovo's own store.

That is still a decent price, but there are RTX 4060 machines out there with the same spec discounted down to much less, such as the iBuyPower Scale. In more normal, non-silly-season sales periods, however, the Lenovo Legion Tower 5i remains one of the best budget gaming PCs from a known brand.

Image 1 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 2 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 3 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 4 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 5 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 6 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 7 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)
Image 8 of 8

Lenovo Legion Tower 5i Gen 8 gaming PC

(Image credit: Future)

The Legion's simple setup works for me, and it also means it's an easy system to upgrade down the line. There's a spare M.2 slot for an additional SSD to supplement the 1 TB drive already in there, and there are no clearance issues if you wanted to go for higher-spec memory. The only issue there is that the OEM B660M motherboard is very basic, even down to the BIOS, and that might cause some trouble getting XMP running on speedy RAM kits. It's also only got a handful of USB sockets, and a solitary Type-C connection on the back panel.

Buy if...

You want a simple entry-level gaming PC: The Legion Tower 5i is a no-frills gaming PC, that does the basics rather well.

You want a machine from a big-name brand: Alienware's obsession with proprietary parts means we'd rather have an equivalent Lenovo rig any day.

Don't buy if...

You want extended future-proofing: It will be a great PC for right now, and there is a spare SSD slot, but that limited PSU will make it tough to upgrade the graphics card without also upgrading the power supply.

You're happy shopping around: Such is the competition in the budget gaming PC space that there is a good chance you may well find similarly specced rigs from other system builders for less.

The other sticking point would be that 500 W PSU. You could probably get away with an RTX 4060 Ti as an upgrade, but even that might be pushing things a little. Realistically, if you wanted a big GPU upgrade you'd need a new PSU alongside it. But, unlike something like an Alienware PC, the PSU inside the Legion Tower 5i is entirely standard, too.
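To see roughly why that 500 W unit boxes you in, here's a back-of-the-envelope power budget. The component figures are approximate published ratings rather than Lenovo's own numbers, and the 75 W allowance for the rest of the system is my own assumption.

```python
# Back-of-the-envelope power budget against the stock 500 W PSU.
# Component figures are approximate published ratings, not Lenovo's own
# numbers, and the 75 W "rest of system" allowance is an assumption.

psu_watts = 500
cpu_peak = 148        # Core i5 14400F maximum turbo power (approx.)
rest_of_system = 75   # assumed: motherboard, RAM, SSD, fans, USB devices

for gpu, gpu_watts in [("RTX 4060", 115), ("RTX 4060 Ti", 160),
                       ("RTX 4070 Super", 220)]:
    total = cpu_peak + gpu_watts + rest_of_system
    print(f"{gpu:>15}: ~{total} W peak draw, {psu_watts - total} W of headroom")
```

On those rough numbers an RTX 4060 Ti still fits, but anything in the RTX 4070 Super class leaves very little margin for transient power spikes, which is why a bigger GPU really wants a new PSU alongside it.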

I've got a real soft spot for the Legion Tower 5i, in fact for the other Legion Towers I've tested, too. They're simple gaming PCs that don't pretend to be something they're not, and even that flowing RGB lighting can be disabled via a single physical button on the rear of the machine. But Lenovo has a lot of competition in the budget market, from the likes of Newegg's ABS range, and upstarts such as Yeyian. That is a level of competition that seems to be far more aggressive than Lenovo is, or maybe more aggressive than it can be.

If you're after a rock-solid build from a known brand, then the Tower 5i will be a great shout, especially as you'll be able to find it with a consistently solid discount. If you're purely chasing the best performance at the lowest price, however, there are better alternatives if you're willing to shop around.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/lenovo-legion-tower-5i-gen8-review/ 2srRURoMqkJTghxLDweqVc Tue, 26 Nov 2024 14:54:43 +0000
<![CDATA[ Void Crew review ]]>
Need to know

What is it? A first-person spaceship 'em up for 1-4 players.

Expect to pay: $25 / £22

Developer: Hutlihut Games

Publisher: Focus Entertainment

Reviewed on: Ryzen 7 5800X 8-Core Processor, RTX 4080, 32GB RAM

Steam Deck: Playable

Link: Steam

Void Crew is a chaotic first-person mix of FTL: Faster Than Light and Sea of Thieves that asks you to take your team of one to four players and tumble through the galaxy, trying to keep your ship intact as you bounce between increasingly difficult space battles.

The structure actually is FTL. At the end of each encounter you'll power up the ship's void drive and hop into space, before going to a glowy 3D map to choose which of three missions you want to take a swing at next. They are varied, and while you inevitably end up throwing down with clouds of enemy ships, it does feel like there's a difference between, say, ambushing a convoy and wiping out shipyards.

(Image credit: Focus Entertainment)

The end result is basically the best Firefly game ever made, with you and as many friends as you can cram into the ship waging a guerilla war in every direction, jumping away into the void just as things start to turn against you.

Your crew is divided into four different classes: Pilot, Engineer, Gunner and Scavenger. The roles are simple: the Pilot flies the ship, the Engineer fixes it, Gunners defend it, and the Scavenger… gets a grappling hook. The first three have skill trees that boost their specific niches and even give them a power-up that's perfectly suited for moments of do-or-die heroism, from letting Gunners supercharge their emplacements to the Engineer running faster and repairing quicker.

The Scavenger, meanwhile, has a loose hodgepodge of upgradable skills that makes them better outside of the ship. It's a role that's just not needed: while hull breaches require someone to go out into wide open space to fix them, you really want that someone to be an Engineer, who's better at the actual fixing. There are certain objectives that require you to venture outside of the ship, too—but generally speaking, having someone off-ship is a negative. As a result, the Scavenger simultaneously offers both the most exciting possibility and the most disappointing experience when you hop into a game for the first time. Compared to the other classes, there's no clear fit for the Scavenger, so you feel anomalous in the ecosystem of keeping your little boat safe.

(Image credit: Focus Entertainment)

These roles are loose and you can multiclass, but because the farthest reaches of the skill trees for each role are so powerful, it seems inefficient to spec into other classes. When I level up a bit further it might turn out that I can pick up every different skill in every class—I haven't yet found the upper limit of how many skill points I can get—but because each successive skill feels like a huge leap forwards in terms of what is possible, it feels foolish to spread out your points early on.

There's a narrative here, but you can ignore it. The real story is the one you make. I couldn't tell you whether the faction you are aligned with, the METEM, are good guys or bad. But I can't stop thinking about our Engineer, who remained preternaturally calm as he bounced around the belly of the ship, powering up the void drive, extinguishing flames and fixing shorted wires to keep us alive long enough to make the jump outta dodge. This isn't the only tale of improbable survival Void Crew has given me, and honestly several of them hinge on my friend Tim being a calm engineering presence while the rest of us try our best not to get crushed by the unfeeling universe.

(Image credit: Focus Entertainment)

Still, Void Crew's story engine is far from the only hardware you'll be getting your hands on. This is a game all about spaceships and tinkering with them.

Void Crew's ships feel like they were made by Fisher-Price, all big buttons for opening the airlock door and chunky levers for depressurising afterwards. You can interact with just about everything using a quick tap of the F key, and most things are quite self-explanatory. Constructing new modules in your ship is a simple case of dragging a box into an empty space and then turning a handle on the top. You deconstruct parts of your ship by pulling a big lever in the base of each part. You can do this at any time, but outside of slapping a shield generator down mid-fight, it's hard to imagine why you would want to do it anywhere but in the relaxing safety of the void tunnel.

Void Crew's ships feel like they were made by Fisher-Price, all big buttons for opening the airlock door and chunky levers for depressurising afterwards

The Frigate is easy to pilot and maintain, but the Destroyer is huge and feels like a stretch for a four-person team to keep afloat. You unlock several different configurations with their own specialities, whether that's a CQB (Close Quarters Battle) ship that starts with two whirring miniguns, or an energy boat that offers more laser guns (and power issues) than you can shake a stick at.

(Image credit: Focus Entertainment)

We didn't pick a favourite, but did find ourselves more comfortable on the smaller Frigate, and as we learned every inch of that ship, the game came alive—the team running around like little ants inside our colony. "I'm running out of ammo," I'd call out as I got to the last 600 rounds in my ridiculously fast-firing gatling gun. At this point, the Engineer would run from the engine room to the storage shelves, grab a box of ammo, leg it up to my emplacement at the back of the ship and load more ammo in. On the way back, he'd dip into the power room and take a battery from the charger, loading it into our shield and allowing us to take a tiny bit more punishment. From there, he'd take the empty battery from the shield to put it on charge, before returning to his nest in the engine room, keeping the engine trimmed and thrusters charged so our Pilot had as much mobility as possible.

It's a role he was born to play, and I think most groups will find people gravitating into their roles quite easily. My love of things that go boom makes me a natural Gunner, but I also enjoyed the twists and turns of piloting, and found some peace in darting around the inside of the ship during combat as an Engineer.

(Image credit: Focus Entertainment)

Ultimately though, it's the guns that draw me in. There's a range of different weaponry here, from long range snipers to gatling guns and everything in between. The energy weapons don't require ammo but draw a lot of power, while physical guns put out an unholy amount of damage and never overheat. Having to track enemies and then lead your shots is unusual in a game of this type, and a blast to get to grips with.

All in all, Void Crew feels like it has a good foundation here. There are a few omissions: as a Pilot it's weird that you can't control your pitch and yaw. And we've already covered how pointless the Scavenger feels. But the only major flaw is that there isn't quite enough content yet for sustained play. I've played a fun 20 hours but feel like I've just about seen everything, even if I haven't mastered much of it. My next challenge is to start topping the bosses: giant, screen-spanning things like a giant spike surrounded by turrets, or a hollow sun full of energy that will set your ship on fire from the inside as soon as its giant energy beam touches you. After that, I’m not sure what I’ll have left to do.

(Image credit: Focus Entertainment)

Still, Void Crew offers a compelling reason to go to a galaxy far, far away—and while there are a few quirks here and there, this is a sci-fi blast of co-op chaos. Grab some friends, get out there and make your own stories.

]]>
https://www.pcgamer.com/games/action/void-crew-review/ ebYRsNd8rpVwKZu9mU2RFL Mon, 25 Nov 2024 17:00:10 +0000
<![CDATA[ Empire of the Ants review ]]>
Need to know

What is it? Photorealistic third-person ant-vs-ant RTS
Expect to pay: $40/£35
Developer: Tower Five
Publisher: Microids
Reviewed on: Radeon RX 6800 XT, Ryzen 9 5900, 32GB RAM
Multiplayer? Online 1v1 and free-for-all PvP
Link: Steam

Zooming in on tiny things and imagining what life is like for them: it never gets old. Like 2016 platformer Unravel and Obsidian survival game Grounded, new RTS Empire of the Ants plops us into the world of macro photography. It does a great job of showing us the world from an ant's perspective, where pebbles are boulders and a beetle is an elephant, but you've really got to love that feeling for it to work, because as an RTS campaign, it's not great.

In Empire of the Ants you're 103,683rd, a warrior-caste red wood ant who fights for their confederation of ant colonies against the wonders and horrors of the much larger world. It's a wonderfully whimsical world—drawn from a series of French novels—and although the game mechanics aren't anything special, and the campaign missions are a letdown more often than not, there is an undeniable delight in exploring its tiny world. This is a game about beauty and enjoyable scenery above all else.

The art almost aggressively leaps at you, with as photorealistic a set of greenery and logs and other tiny things as Unreal Engine 5 can muster. The many insects and arachnids and other creatures have believable texture to their exoskeletons that varies between species and type. I was especially enamored by the huge ferns, grasses, and flowers your ant can climb all around on. The surfaces aren't always as detailed as insect carapace, but the way your ant's legs twist to clutch at the stem as you climb or spread out on smoother surfaces is just the kind of detail you hope for from a game about zooming in on details. It's simply a very fun, and pretty, game to move about in.

The huge artifacts of the larger human world are also delightful. A rugged old soccer ball features quite early on and looks realistic with its worn stitching and fraying panels and slick slug-trail from a circling, curious creature. These human objects are accompanied by ant-scale descriptions wondering what they might be: We can't eat the soccer ball, says one, it's not real leather. Avoid a glass jar in the summer, says another, as it gets very hot.

That does make the places that lack similar detail very obvious: You can build little wood walls for your nests, for example, but you just clip right through them when walking. Lack of attention to certain mechanical details is painful in your average mission, because much of Empire of the Ants isn't actually RTS missions—it's platforming around these worlds as a little ant. And that's fun when you're traversing and sightseeing, but most of those platformer segments boil down to timed scavenger hunts that have you rushing about and fighting with the awful UI for smelling pheromones more than enjoying the scenery.

Empire of the Ants screenshot

(Image credit: Tower Five)

The antgony of defeat

As a whimsical ant exploration simulator, Empire of the Ants does pretty well, but it's in theory a third-person RTS first and foremost. That part of it is sorely lacking. Running around and directing your ants to take new nests—the fixed-in-place capture points that also serve as your only base building—is often too simple. The battles are largely deterministic, where you can see from the start which side will win and which will lose because of the Warrior-Worker-Spitter unit triad forming a Rock-Paper-Scissors counter system. The only wrench in those gears is that sometimes you can use pheromone abilities from your otherwise non-combatant ant leader to do stuff like boost your units' movement or cause an enemy to flee. It's very simple stuff that doesn't inspire an interesting range of tactical scenarios.
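
To illustrate why those fights read as decided from the outset, here's a minimal sketch of a rock-paper-scissors counter check. The specific counter directions below are hypothetical placeholders of my own, not Empire of the Ants' actual rules.

```python
# Hypothetical rock-paper-scissors resolution, illustrating why such battles
# feel deterministic: the outcome is readable from the matchup alone.
# The counter directions here are placeholders, not the game's real rules.
COUNTERS = {"warrior": "worker", "worker": "spitter", "spitter": "warrior"}

def winner(a, b):
    """Return the winning caste, or 'draw' for a mirror match."""
    if a == b:
        return "draw"
    return a if COUNTERS[a] == b else b

print(winner("warrior", "spitter"))  # spitter (under the placeholder rules)
```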

That becomes painfully obvious in multiplayer, where the battles devolve pretty quickly into who's quicker on the draw to take resources and leverage them—I didn't have any interesting or surprising tactical interactions. There's little to nothing to recommend this as a committed multiplayer game over other, more strategically-varied games that'll feel fresh longer.

Image 1 of 6

Empire of the Ants screenshot

(Image credit: Tower Five)
Image 2 of 6

Empire of the Ants screenshot

(Image credit: Tower Five)
Image 3 of 6

Empire of the Ants screenshot

(Image credit: Tower Five)
Image 4 of 6

Empire of the Ants screenshot

(Image credit: Tower Five)
Image 5 of 6

Empire of the Ants screenshot

(Image credit: Tower Five)
Image 6 of 6

Empire of the Ants screenshot

(Image credit: Tower Five)

In the later campaign missions, it's often frustrating to tab through the long list of ant legions and allied creatures like beetles and wasps you command—and tabbing through the list is indeed the only precise way to select units other than facing and clicking them directly… which can be hard when a half-dozen enemy and friendly unit icons overlap. The UI for base upgrades can also frustrate: it's projected on the ground and can get covered by friendly ants.

The two halves of Empire of the Ants, exploration and tactics, are both mediocre and at war with each other. It is possible to blend puzzle-platforming and small-scale tactics—typified by the excellent Pikmin series—but Empire of the Ants strictly separates the two types of play. Campaign missions are always either exploration or strategic scenarios, and the exploration is only sometimes mysterious or surprising enough to justify itself. A Pikmin-style game blends the two gameplay types into a fluid whole, with larger levels or regions that reward exploration with strategic bonuses. Here, I feel like I'm hopping between a passionate environmental tech demo and a mediocre third-person RTS in stutters and stops, but those environments are at least enough to recommend it, with big caveats, to lovers of insects and other tiny things.

]]>
https://www.pcgamer.com/games/rts/empire-of-the-ants-review/ GxbrEw6fYp2QnyMvtxFPnf Fri, 22 Nov 2024 22:03:31 +0000
<![CDATA[ Asus ROG Maximus Z890 Hero review ]]> Intel's new Core Ultra 200S series of desktop processors launched to a mostly negative response, due to the regression in gaming performance compared to the previous generation of chips. However, they are pretty good at content creation tasks and they consume far less power in gaming, so they could be suitable for some folks. But what motherboard do you buy?

The new Arrow Lake CPUs only fit into an LGA1851 socket, so you can't fit one in a previous-generation motherboard. At the moment, there's only one chipset available (Z890) and most board vendors have focused on creating ATX-sized models that are quite expensive. Then again, if you're going to spend over $500 on a Core Ultra 9 285K, you're probably best off getting a suitably capable motherboard to ensure the setup lasts you as long as possible.

The ROG Maximus Z890 Hero isn't at the very top of Asus' Intel motherboard range but it's not far off, as it sports as comprehensive a set of features as one could possibly ask for. Naturally, that means the price tag is just as big, but that's par for the course for high-end motherboards these days.

Pretty much everything about the ROG Maximus Z890 Hero is big—opening the enormous packaging reveals a motherboard that weighs in at an astonishing 3.6 kg (8 lbs). From the huge primary M.2 slot heatsink to the massive polychromic display, nothing about it is subtle even though it doesn't sport in-your-face graphics.

Asus ROG Maximus Z890 Hero specs

A close up photo of the Asus ROG Maximus Z890 Hero motherboard

(Image credit: Future)

Socket: Intel LGA1851
Chipset: Intel Z890
CPU compatibility: Intel Core Ultra 200S desktop
Form factor: ATX
Memory support: DDR5-4800 to DDR5-9200+(OC), up to 192 GB, CUDIMM supported
Storage: 6x M.2, 4x SATA, 1x SlimSAS
USB (rear): 2x Thunderbolt 4 Type-C 40 Gbps, 1x USB 3.1 Type-C 10 Gbps, 4x USB 3.1 Type-A 10 Gbps, 4x USB 3.0 Type-A 5 Gbps
Display: 1x HDMI 2.1, 2x USB/Thunderbolt 4
Networking: Intel 2.5G LAN, Realtek 5G LAN, Wi-Fi 7
Audio: Realtek ALC4082
Price: $692 | £713 | AU$1,249

Just read through the specifications to see what I mean. Few motherboards come with six M.2 slots and fewer still have two Gen 5 slots and four Gen 4 ones. If that's not enough storage for you, then there are a further four SATA ports and a SlimSAS connection, for additional NVMe or SATA drives. How about 11 USB ports on the rear IO panel and a further 10 via headers?

If overclocking is more your thing, then you're not going to be disappointed. Power is handled by a total of 27 stages (22 of which are rated to 110 amps), covered by some of the chunkiest heatsinks I've seen for a very long time. I'm a little surprised that Asus didn't choose to use active cooling for the VRMs but given Arrow Lake's reduced power consumption compared to Raptor Lake, it's perhaps not necessary.

The engineers at Asus have worked hard at making the motherboard more user-friendly ('EZ DIY', as Asus calls it) and it's replete with lots of so-called 'Q' features—M.2 Q-Release, M.2 Q-Slide, M.2 Q-Latch, PCIe Slot Q-Release Slim, Q-Antennae, and so on. Fortunately, it's not just marketing nonsense, as the features genuinely make it easy to install and remove graphics cards, SSDs, and the Wi-Fi aerial.

Image 1 of 4

A close up photo of the Asus ROG Maximus Z890 Hero motherboard

(Image credit: Future)
Image 2 of 4

A close up photo of the Asus ROG Maximus Z890 Hero motherboard

(Image credit: Future)
Image 3 of 4

A close up photo of the Asus ROG Maximus Z890 Hero motherboard

(Image credit: Future)
Image 4 of 4

A close up photo of the Asus ROG Maximus Z890 Hero motherboard

(Image credit: Future)

I do wish Asus would refresh its BIOS layout and structure, though. It's packed full of options as always and now defaults to a 1080p resolution (so it looks crisp and clear) but it's somewhat overwhelming at times. The jump in complexity between Easy and Advanced modes can't be very welcoming for beginners, that's for sure. Still, at least the Q-Dashboard is a nice touch, displaying an overview of your motherboard's parts and connectors.

Performance testing motherboards is a somewhat tricky affair, mostly because there's usually very little difference between various models in most tests. It's only when one uses highly specific synthetic benchmarks that small differences begin to make an appearance, and these rarely translate into a noticeable discrepancy in real-world situations.

This is especially true in the case of Z890 motherboards because they're all brand new to the market, so there isn't a wealth of results to compare against. Hence there are only two other boards in the results below: an MSI MEG Z890 Ace and an MSI MAG Z890 Tomahawk.

PC Gamer test rig

CPU: Intel Core Ultra 9 285K
Cooler: Asus ROG Ryujin III 360 ARGB Extreme
RAM: 32 GB Lexar Thor OC DDR5-6000
Storage: 2 TB Corsair MP700
PSU: MSI MAG AB50GL 850 W
OS: Windows 11 23H2
Chassis: Open platform w/ 2x 140 mm fans
Monitor: Acer XB280HK

For the most part, you can see there's little to separate them, especially in gaming. However, the Factorio test really stands out, as the Asus ROG Maximus Z890 Hero achieves a 21% faster result than either of the MSI boards. I've spent some time discussing this with both companies without much success in understanding the result.

The only thing I can think that might be behind it is that the Asus motherboard applies a fixed clock speed for the Core Ultra 9 285K's internal ring bus, whereas the MSI boards allow it to vary. Intel tells me that Asus' approach is the correct one.

However, the ROG Maximus Z890 Hero is slower in the 7zip tests and not by a margin of error, achieving a 14% lower result in the decompression benchmark. Many games store their assets in compressed formats and decompress them on the fly when required, so the 7zip figure suggests that you might notice a decrease in gaming performance when using the Asus motherboard. But as you can see in the gaming results, it doesn't affect those particular games.

Tracking the CPU package power, VRM, and chipset temperatures during the main benchmarks will highlight any deficiencies in a motherboard's power delivery and cooling systems. While there are some disparities between the three motherboards, they're not significant. The ROG Maximus Z890 Hero's heatsinks do an excellent job of keeping everything cool but given that the MSI boards' heatsinks do likewise, I do wonder if their vast size was really necessary.

(Image credit: Future)

But I have no such concerns over the heatsink for the primary M.2 slot. It's a monstrous slab of metal and as you can see, it does a superb job of keeping a PCIe 5.0 SSD's temperature under control.

Buy if...

✅ You want as many features as possible: This motherboard has all the ports, slots, and sockets you're ever likely to need.

Don't buy if...

❌ You're on a tight budget: Handing over nearly $700 for a motherboard, even one as good as this, means it's not one for a sensibly priced gaming PC build.

At no point did it ever exceed 70 °C and this is the only motherboard I've ever tested that's managed to do this with a Gen 5 SSD. The fact that it's so easy to remove and reattach is the proverbial cherry on top.

There's an awful lot to like about the Asus ROG Maximus Z890 Hero. Despite its sheer mass, hulking heatsinks, and pretty polychromic display, it's a relatively understated design that should suit most PC enthusiasts.

Underneath all the metal is a motherboard that's loaded to the hilt with ports, slots, and sockets. If you want to build a new gaming PC with an Intel Core Ultra 200S processor, you're not going to be left wishing for anything.

Well, apart from its price, that is. At a few bucks over $690, it is very expensive—and while there are even pricier Z890 motherboards on the market, that price tag is enough to make anyone look twice before buying.

The Asus ROG Maximus Z890 Hero is an excellent motherboard, of that there's no doubt, and should easily last you through many years of use. Whether that's enough to justify spending that kind of money, especially when more LGA1851 motherboards will come out in 2025, is far less certain.

]]>
https://www.pcgamer.com/hardware/motherboards/asus-rog-maximus-z890-hero-review/ D5V5e73q48gBFBEu9NMFNh Thu, 21 Nov 2024 16:52:57 +0000
<![CDATA[ Nanoleaf 4D Screen Mirror Kit review ]]> Reactive RGB has come a long way over the years, from resource-hogging glowy speakers to AI-driven wall shapes you can control from your phone. But I’m always looking for a cleaner, simpler, and more accurate solution. This time, it’s Nanoleaf’s go. Yes, the same Nanoleaf cladding the walls of your favourite tech/gaming influencer or cosy game fan.

Reactive lighting isn’t new for the brand known for turning wall geometry into a nerdy subculture of home decor. Its new take is the Nanoleaf 4D Screen Mirror Kit—a trimmable light strip for screens up to 65 inches (or 85 inches for a premium), flexible corner brackets, a camera, and a tiny control hub. It’s a clean, affordable marvel that supports anything on your screen. However, some minor issues during setup can hold it back.

The Quick Start guide summarises the installation process in four steps: Apply the corner brackets, stick the lights down, place the camera, and link it to the box. A handy kickstand lets the camera sit in front of the display if you don’t want it perched atop your screen. It’s not ideal on a desk, but it’s an option for living rooms. Either way, plug it in, and you’re good to go. Oh, if only things were that simple.

The purpose of the flexible corner pieces is never explained. You’re told to skip that step if you don’t want them, but there’s no mention as to why you might. In reality, they’re a simple solution to a very real problem.

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)

Length: 4m
Dimmable? Yes, via app
Contents: light strip, camera, camera mount, camera privacy cover, hub, corner brackets, power supply
Connectivity: Apple Home, Amazon Alexa, Google Home, IFTTT, SmartThings, Razer Chroma
LED type: RGBIC (10 clusters per meter)
Power usage: 24W
Price: $80/£90

By wrapping the lights along the corner buffers, the LEDs can shine further afield. This keeps the corners from becoming too bright where the LEDs on the strip bunch up while allowing the lighting to spread further along the wall or, in a tight space, onto adjacent ones.

But that was the least of my worries. The 3M adhesive lining of the light strip just couldn’t hold on. Interestingly, the aforementioned corner strips performed much better. Could chilly conditions be blamed? The manual didn’t mention an optimal application temperature; Google suggested above 15 °C. The room? Around 13 °C. Warming things up and applying after-market adhesive improved things, but they’re still slowly peeling away.

With things in place and a docked Steam Deck as its source, it was time to give things a go. But now that the hardware issues had been ironed out, software issues needed to be tackled.

While clean next to Govee’s icon-laden app, the Nanoleaf app isn’t perfect. It’s good fun setting things up with a wide fisheye view from the camera displayed on your phone, but a minimal UI often means minimal instruction. The setup process was somehow too basic. Showing the camera each edge was no trouble, yet in cutting the calibration process short for simplicity, the performance of the reactive lighting was borked.

Image 1 of 4

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)
Image 2 of 4

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)
Image 3 of 4

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)
Image 4 of 4

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)

After cutting the light strip down to fit a 42-inch TV, the control box didn’t intelligently register the reduced number of LEDs, so it couldn’t accurately deduce how to translate a glowing object in one corner to the respective spot along the strip. It thought opposite corners were actually beside each other. This led to a top-right light source being displayed beside a dark corner in the bottom right.

Rectifying this isn’t a huge hurdle: it’s just hidden away. Worse still, the lights would lose connection. That issue was solved with a software update, but it was another annoying setback, furthering the complaint of poorly worded UIs.

Another thing never explained is the 4D mode. That’s the top end of “D” settings (they start at 1D of course). Even when trying each, it isn’t clear how they work. They’re immediately different, but an explainer would have been appreciated. There are interactive graphics on the Nanoleaf website to demonstrate the difference, but it does claim the lights offer “VR levels of immersion,” which is complete nonsense.

Image 1 of 3

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)
Image 2 of 3

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)
Image 3 of 3

Nanoleaf 4D Screen Mirroring kit set up on a gaming monitor.

(Image credit: Future)

In practice, they seem to affect the range and complexity of the reactive lighting. The higher you go, the more camera data is translated into illumination: 1D appears to display a simple bright or dim effect on the wall to suit the mood of a scene. At 4D, the contents of the screen edge make up the light show. Settings below that focus on highlighting the focal action in the centre of the screen. You can enable Rhythm to have sound taken into account as well.

On that 4D setting on a wide-ranging game like Dead Cells, you’ll notice the alluring warmth of a distant torch, light on a shallow pool reflected beneath the screen, and the rays bleeding through the high prison window highlighting the upper corners. In battle, explosions and effects burst beyond the panel. At a lower setting, the focal point of the action is what you’ll see blasted onto your wall.

Employing a camera to extract that data isn’t the fastest solution. Still, you won’t see the blood splatter appear off-screen long after damage has been dealt. It’s imperfect, but it’s practical. But like any camera pointed at a screen, it’s susceptible to glare. A darker room is recommended—as it would be for a light show—but the app does feature sliders to rectify imperfect conditions.

Amusingly, Nanoleaf managed to simplify the method of creating custom ambient lighting themes to a word or phrase. That’s where the AI comes in. I typed in YMCA and got a fittingly vibrant gradient the hub’s built-in microphone could manipulate. Generative AI is still a scourge, but this is a sensible use.

Despite the initial setbacks, the Nanoleaf 4D Screen Mirror kit entranced me. Move beyond the poor setup process and it’s a genuine joy. It’s a simple solution to the once-complex quest for reactive lighting. And at $80/£90, it’s considerably cheaper than the bulkier Govee T2 while still feeling as fast as the Govee AI Sync Box 2.

Buy if...

✅ You want reactive lighting on a budget: The Nanoleaf 4D Screen Mirror kit is one of the more affordable reactive lighting systems from a reputable brand. It works on basically any screen and can extend to other lights. It’s a simple solution that can grow with your lighting needs.

Don't buy if...

❌ You want bright, fast, and accurate lighting: Without the additional white LEDs found in some competing products, the range of colours available is limited. But it’s only something you’ll notice if you see what else is out there. The camera system will also never be as accurate as a cabled solution.

Sweetening the deal is compatibility with Nanoleaf stuff like the Smarter Essentials Bulb and its myriad wall decor. If you have other lights around your TV, you can save $40/£20 by foregoing the LED strips. Plug your current lights into the box and you’ll get the same feature set. And with support for Razer Chroma, you can add specific game compatibility for context-based lighting on PC—the sort you can configure to mirror health bars, react to damage, or celebrate a triple kill.

Want it all? Link the lot over Wi-Fi to spread reactive lighting throughout your room. Philips Hue and Govee have had this for years, but that doesn’t make it any less magical. And by never linking up to your PC, there’s zero overhead bogging down your system.

But there’s always room for improvement. The lights themselves do a terrific job of extending the visual flair of your media beyond the screen. It’s just a shame the sticky solution is rarely up to the task, the hardware can struggle for a connection, and the app is so simple it’s perplexing.

With the Nanoleaf offering 10 RGB clusters per meter, the Govee AI Sync Box 2 wins out with its 17 RGBWIC bulbs per meter, drastically improving contrast and colour mixing. But this Nanoleaf solution is ⅓ the price. With that in mind, great compatibility, a low cost, and in-store availability make the Nanoleaf 4D the easiest reactive RGB solution to recommend. It’s far from perfect, but it’s a very good option.

]]>
https://www.pcgamer.com/hardware/lighting/nanoleaf-4d-screen-mirror-kit-review/ Df68ovSw5RYkjnNfHPJEiW Thu, 21 Nov 2024 13:05:51 +0000
<![CDATA[ Govee AI Sync Box Kit 2 review ]]> From sconces and curtain lights to glowing geometric shapes and things seemingly named after bread rolls or vegetable form factors, Govee has a gargantuan catalogue of (frequently discounted) smart lighting options. There’s always something you’d like for any product to do better, and it’s nice when someone listens. The result of one such wish is the Govee AI Sync Box Kit 2.

The Govee AI Sync Box Kit 2 is the successor to the company’s reactive RGB solution from a couple of years back. It comprises a flexible strip light for your monitor, two lighting towers, and a sturdy hub that’s one part HDMI switch and one part processing unit. You get some good-quality HDMI cables, too.

Reactive lighting uses hardware or software to translate the colours on your screen into a dynamic light show that can respond to the image in real time. The result is a soft and immersive ambience that reacts to the scenes of your games and movies.
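
To make that concrete, here's a minimal sketch of the general screen-mirroring idea: split the border of each frame into segments and average every segment's colour into one LED value. It's an illustration of the technique under my own assumptions (frame layout, segment counts), not Govee's actual pipeline.

```python
# Minimal sketch of edge-sampling ambient lighting (illustrative only, not
# Govee's implementation). A frame is a grid of (R, G, B) tuples; the top
# border is split into segments and each segment's average colour drives one LED.

def average_colour(pixels):
    """Average a list of (R, G, B) tuples into a single colour."""
    n = len(pixels)
    return tuple(sum(channel) // n for channel in zip(*pixels))

def top_edge_colours(frame, leds=30, rows_sampled=16):
    """Map the top border of a frame onto `leds` colour values, left to right."""
    height, width = len(frame), len(frame[0])
    rows = min(rows_sampled, height)
    cols_per_led = width // leds
    colours = []
    for i in range(leds):
        block = [frame[y][x]
                 for y in range(rows)
                 for x in range(i * cols_per_led, (i + 1) * cols_per_led)]
        colours.append(average_colour(block))
    return colours

# Example: a dull grey frame with a bright red patch in the top-left corner.
frame = [[(40, 40, 40)] * 640 for _ in range(360)]
for y in range(16):
    for x in range(80):
        frame[y][x] = (255, 0, 0)

print(top_edge_colours(frame, leds=8)[:2])  # [(255, 0, 0), (40, 40, 40)]
```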

It’s a technology that’s been around for decades. But with Wi-Fi enabled lights everywhere now, the Govee AI Sync Box Kit 2 can technically fuel a house-wide solution of lamps, wall art, string lights, and spotlights. The use of AI is there to offset the traditionally processor-heavy workload needed to reflect specific in-game actions without requiring collaboration between individual game developers.

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)

Lighting: RGBWIC
Contents: Light strip, light towers, hub, braided HDMI cables
Power: 60 W
Reactive lighting? Yes
I/O: 4x HDMI 2.1 in, 1x HDMI 2.1 out, 2x USB-C (for lights), DC in
Price: $240/£280

Setting up this kit is identical to the last Sync Box Kit, which I've previously used. But while there are many similarities, there are some key differences. The strip’s LED density now pushes 75 bulbs per meter. We can’t count the bulbs in the light towers, but the whole package now supports RGBWIC: bright white LEDs that expand its range of colour temperature.

But I never had a problem with the apparent lack of LEDs on the old model. My issue was the main hub’s lack of DisplayPort output. With the box acting as the middle man between your graphics card and monitor, this locked popular gaming displays out of Nvidia G-Sync, creating a conundrum: immersive reactive lighting or silky smooth images.

Competitive players running high refresh rates and low resolutions were fine, with the lighting being a distracting way for an enemy flashbang to feel worryingly real. For those looking to add extra flair to their single-player adventures, the compromise wasn’t welcome.

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)

With no support for VRR solutions, I retired my original kit. Given the £200+ asking price, it was a demoralising decision likened to removing an extra monitor to clean up your space. Thankfully, all five ports on the Govee AI Sync Box Kit 2 are HDMI 2.1. G-Sync, Freesync, VRR, and ALLM are all fair game. Ultrawide 1440p displays can hit 240 Hz with 144 Hz supported at 4K. Above that, 60 Hz is the peak.

Still, the 34-inch length of the LED strips means anyone rocking a Samsung Odyssey Ark can forget it. There’s a version for larger screens in select regions. You won’t get the light towers, but you’ll get the hub and strips. That’s because the towers are designed to sit away from a screen of equal height, extending the glow beyond the backlights.

Unlike Corsair’s LT100 light towers, the tower base isn’t illuminated. So while the lighting can stretch across your wall, you won’t have bursts extending along your desk. With no option to choose specific parts of the screen to sample, manually matching the towers to ally and enemy resource bars is limited to what’s officially supported through the phone app.

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)

Speaking of the app, it’s required for setup. Want to link up multiple devices through its Dreamview system? You need the desktop software—but only after enabling LAN control through your phone. The problem? Despite mentioning Dreamview in the box, the app says the Sync Box doesn’t support LAN.

I’ve always had issues with Govee’s convoluted software solutions, but this takes the cake. The messy app’s poor English wording only makes matters worse. You’ll also need to jump into the app to enable profiles for supported games. They won’t kick in automatically. Still, reactive lighting support is there the moment you hook everything up.

Beyond its AI capabilities that generate ambient lighting profiles based on keywords, the four HDMI 2.1 inputs and eARC output on the hub can make it an essential part of your desk-based gaming setup. You can connect modern-day consoles and streaming sticks, removing the need to swap cables on monitors with limited I/O, introducing immersive reactive lighting to more than just your PC titles.

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)

Though I can only realistically shine these against a set of brown blinds, which is hardly ideal, I dug out a pure white projector screen in the name of science.

The result? Bright and enthralling reactive lighting that admirably reflects the warm glow of the forge (and Fatalis fire) in Monster Hunter World or the chill of its snowy climes. The popping colours of a frantic Overwatch match fared just as well. Even the smooth-panning landscape of Hyrule in Echoes of Wisdom on a connected console was an added delight.

There’s still a perceivable delay, and transitions can appear sudden and staggered. But when you’re not actively looking for faults and properly lose yourself in your on-screen entertainment, the effect is joyous and easy on the eyes. Especially in a dark room.

Image 1 of 4

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)
Image 2 of 4

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)
Image 3 of 4

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)
Image 4 of 4

Govee's AI Sync Box Kit 2 set up on a gaming monitor.

(Image credit: Future)
Buy if:

✅ Your entire entertainment experience revolves around a single space: The HDMI switch/control hub is what commands the high price here. If you can make use of up to 4 HDMI 2.1 ports, this is a good way to add highly versatile, bright, balanced reactive lighting to your universe.

Don’t buy if:

❌ Your rig relies on DisplayPort: The continued omission of the popular PC gaming port means the Govee AI Sync Box Kit 2 won’t suit your current setup. If your GPU and monitor lack HDMI 2.1 support, you’ll have to settle for some version of Freesync instead, which typically means lower refresh rates. It’s a very expensive bit of kit that could bottleneck another.

The Govee AI Sync Box Kit 2 fixes most of the major issues of the first, and most of my gripes. The lights are brighter, more colourful, and more even. The hub also no longer bottlenecks expensive gear, supporting higher resolutions and frame rates alongside its fanciful, immersion-enhancing ambient lighting. There’s still no support for DisplayPort, so you’ll still be sacrificing Nvidia G-Sync without an HDMI 2.1 monitor.

But there’s still the issue of just how expensive it is. With the box being the original thorn in my side, it’s irritating that Govee won’t sell it as a standalone upgrade. If you found no fault with the original lights, at nearly $300/£300, it’s a lot of money to right one wrong. At least there’s only a slim chance you’ll need to bin it if you buy another monitor.

Is it the best solution to the reactive lighting conundrum? For some, it very well could be. So long as it’s against a largely white surface, the bright LEDs help create a mesmerizing display with more punch than before. The powerful HDMI switch alone can solve an annoying issue of the complete gaming desk.

But if your desk is for PC gaming only, Corsair’s iCUE system still feels like the more premium offering with better build quality, additional lighting zones, cleaner software support, and a more manageable price tag.

]]>
https://www.pcgamer.com/hardware/lighting/govee-ai-sync-box-kit-2-review/ JLcwmMxrfjLT5wVoekuCuk Thu, 21 Nov 2024 11:42:21 +0000
<![CDATA[ Blacklyte Kraken review ]]> With the gaming chair market being somewhat saturated with plush perches, all looking very similar, it's hard for any company to stand out from the competition. Secretlab achieves this by making its chairs out of quality materials, to a high standard, and at a reasonable price. For the past few weeks, I've been using a Blacklyte Kraken gaming chair to see if the same holds for the Canada-based company.

At first glance, the Kraken gives off a distinct Titan Evo vibe but it's different enough to not be a direct clone. There is clear evidence of some strong influences in the design but the main reason why I'm using Secretlab as a reference is the Kraken's price. Blacklyte wants $519 for its chair, which also just so happens to be the same amount of money as a Titan Evo if one buys it directly from Secretlab.

There's only one size of Kraken available, recommended for people between 165 and 195 cm (65 - 77 inches) in height, and under 150 kg (330 lbs) in weight. I'm 184 cm and 70 kg (read: tall and thin) and I have to say that its dimensions don't fit my build particularly well.

It's not because the Kraken isn't spacious—the seat base is quite wide and filled with dense memory foam and covered with some nice PU leather (aka leatherette). The problem is that it's quite short and there's quite a large gap between the edge of the base and my knees. Tilting the chair back doesn't help, due to the large 'panel' that fills much of the seat back.

Blacklyte Kraken specs

A photo of the Blacklyte Kraken gaming chair

(Image credit: Future)

Sizes: R (165 - 190 cm, <150 kg)
Fabric: PU leather
Recline: up to 168°
Warranty: lifetime for frame, 2 - 3 years for seat, back, and accessories
Armrests: 4D adjustment
Price: $519 | £449 | €442

This is the Kraken's lumbar support and it's basically a spring-loaded section that moves forward with the pull of a small lever. Blacklyte claims that it "perfectly match[es] the natural shape of your lower back" but it absolutely doesn't with mine—all it does is push me further along the base, leaving my legs with even less support than they had before. A small pillow stuffed down my back does a far better job.

Speaking of pillows, the Kraken comes with one for your head/neck that magnetically attaches to the headrest and it's really quite nice. I'd prefer it to be larger than it is, though, because even with the seat back fully upright, it doesn't quite reach my head or neck enough.

It's a similar story with the armrests. They're fine for the most part but the amount of positioning available is a bit restrictive. I'd prefer it if they could go a little lower and more forward than they do. They're topped by a magnetically attached pad that's a little too firm for my tastes but where Secretlab offers alternate pads to apply to the Titan Evo, Blacklyte has nothing (so why are they removable?).

A photo of the Blacklyte Kraken gaming chair

(Image credit: Future)

The armrests are four-way adjustable (three-axis movement and rotation) but it's a rather clunky affair, with quite a lot of play in the fixtures. In fact, there's a lot of play in the whole chair and loosening the tilt mechanism, to allow one to rock back further, induces some rather worrying-sounding clunks.

A photo of the Blacklyte Kraken gaming chair

(Image credit: Future)

It might just be the specific sample I was sent, but the build quality was poor. The chair's height adjustment mechanism stopped working within a week and from the very start, the two levers controlling it had such a degree of slop to their fitting, I thought I'd broken them during assembly. The mechanism to adjust the lumbar support is also starting to fail, with the cable sticking in its sheath.

Buy if…

✅ You don't want to conform with the crowd: Should the thought of being seen with a Secretlab or Corsair gaming chair put you off, Blacklyte is at least a name that should make you stand out.

Don't buy if…

❌ You want a quality chair for a reasonable price: The Blacklyte Kraken costs double what it should do, given the relative lack of features and quality control.

Unlike Blacklyte's Atlas gaming desk, which was very straightforward to build, the Kraken was a fiddly affair, involving much juggling of heavy parts whilst trying to line up bolts with holes, many of which were partly hidden by material. At least all the fittings came in a labelled package, so judging which piece to use during the assembly process was easy enough.

I was disappointed to see that the Kraken is shipped with a lot of packaging material that's non-recyclable. Some of that can't be avoided, given how far the chair needs to travel in distribution, but in this day and age, having every component wrapped up in some kind of soft plastic isn't environmentally friendly.

But I could forgive all of this if the Blacklyte Kraken was comfortable to use for lengthy periods. In my case, that means spending a day at a desk working and then a few hours in the night gaming.

Unfortunately, it's not and I have to say it's one of the least comfortable gaming chairs I've ever used. Now, a good part of that is because it just doesn't suit my body's dimensions and shape, but I've asked a few people to try it out and none of them found it nice to use.

A photo of the Blacklyte Kraken gaming chair

(Image credit: Future)

As with all gaming chairs with a thick memory foam base, it takes a while for it to bed in, and my scant weight is probably delaying that process. But the unpleasant lumbar support, short seat base length, and overly high armrests all contribute to the general lack of comfort.

Now, if the Kraken only cost $250, like the Corsair TC100 Relaxed does, then I'd happily forgive its failings. But what's acceptable at a few hundred dollars is absolutely not at $500, even accounting for my atypical body dimensions. You might fit the Blacklyte Kraken better than I did, and you might not suffer any of the quality control issues, but are you willing to take the risk? For this kind of money, I suspect the answer is no.

]]>
https://www.pcgamer.com/hardware/gaming-chairs/blacklyte-kraken-review/ EXkp8CrCwBp9SiMtMLJYX7 Wed, 20 Nov 2024 17:36:24 +0000
<![CDATA[ Stalker 2: Heart of Chornobyl review ]]>
Need to know

What is it? A big wide systemic FPS sandbox in the Chornobyl exclusion zone.
Expect to pay: $60/£50
Developer: GSC Game World
Publisher: GSC Game World
Reviewed on: RTX 4080, Ryzen 7 3700X, 32GB DDR4 RAM
Steam Deck: Unsupported
Link: Official site

I admit it: I was scared. The gleaming trailers, the Microsoft showcases, the Unreal Engine 5—Stalker 2's marketing didn't look like any Stalker I've known and loved. Had GSC traded in all the series' beloved jank in pursuit of streamlined console success? Had it made Metro by another name?

Well, never trust a trailer. It might be shinier and it might have gamepad support, but Stalker 2 is still Stalker down to its bones, that unique and unreplicable mixture of FPS, survival horror, and immersive sim. Whether X-Ray or UE5, the game's ambition still strains against the seams of its engine. It's still filled with systems—factions, artifacts, anomalies, a world filled with people going about their business and the staccato thuk-thuk of Eastern Bloc weaponry—that at times push the whole thing to breaking point and beyond it.

(Image credit: GSC Game World)
Putting Stalker to the test

PCG hardware guru Nick Evanson has been hard at work putting Stalker 2 through an exhausting battery of performance tests across all sorts of hardware configurations, including handheld PCs. You can find his full Stalker 2 performance analysis here.

It's excellent, and undoubtedly my personal game of the year, but here comes the caveat. I meant what I said: Stalker 2 is Stalker to the bone, and that means the bad stuff too. There were errors, crashes, progress-halting bugs and at-times hilarious glitches in animation and AI, plus minor stuttering that I just came to accept as the price of admission, even at 1440p on my 3700X, RTX 4080, and 32GB RAM-equipped machine. And though a meaty day-one patch has helped a lot, the game still feels rickety: a bit stuttery, with AI that still sometimes fails to distinguish between friend and foe, and so on.

Lord knows I can't fault the devs—that the game exists at all despite its home country being invaded partway through development is a miracle—but it does mean the whole thing feels like it hasn't quite set just yet. Do I love Stalker 2? Yes. Does Stalker 2 make itself easy to love? No. Or at least, not without the patches that GSC promises are in the pipeline.

Fire and forget

(Image credit: GSC Game World)

You don't play anyone familiar in Stalker 2 (which is actually the fourth in the series, if you're new to the Zone). You're Skif, an ex-military type from the Mainland—the relatively normal Ukraine outside of Chornobyl—who awoke in the night to find his kitchen rudely annihilated by an artifact, one of the many priceless, semi-magical treasures that the Zone produces as a matter of course, and which make it catnip to fortune-seekers.

Down one apartment and rather annoyed about the whole thing, Skif ventures into the Zone to figure out just what the hell an artifact thinks it's doing spawning in his flat, and soon finds himself entangled in a multi-layered plot pileup of factional warfare, personal vendettas, competing ideological visions, and sporadic gangland violence, with plenty of choices for you to make about who to side with along the way.

(Image credit: GSC Game World)

Stalker has always, quietly, had one of the stranger and more philosophical stories in the videogame narrative pantheon, and boy does Stalker 2 make good on that legacy. It doesn't so much have a plot as it has a series of small ones in sequence that all eventually add up, each taking you to some new part of the astonishingly massive Zone map, which encompasses, so far as I can tell, every location featured in previous games and then some, and which looks more beautiful than ever with all the whizzbang bells and whistles of 2024. Never before have videogames let us descend into cursed basements with lighting so atmospheric or so eerie.

Never before have videogames let us descend into cursed basements with lighting so atmospheric or so eerie

Exploration feels never-ending, as do the 2-kilometre sprints every time the game sets down a new quest marker that seems almost spitefully far away (fast travel requires you to speak to paid guides in population hubs, and they only go to other hubs). Ah well, you'll probably pick up some neat artifacts on the trip.

I enjoyed the tales-within-a-tale structure, for the most part, but it does have the discombobulating effect of making you feel multiple times like you are approaching the final showdown, the great denouement, the climax of climaxes, only for the game to turn around and pretty much say 'and now, the rest of the story. Please turn your cassette to side B'. Its pacing gets fatiguing, particularly as you get into the true endgame but things still continue to go on. And on. And on.

(Image credit: GSC Game World)

The world at war

But you don't really come to a Stalker game for its story. You come for its stories: the bizarre anecdotes that the game's systems and "A-Life" AI can't help but spit out, and Stalker 2 has those in spades. This is a bonafide clockwork world, and its NPCs—who to this day retain their utterly charming procgen names like "Gena Badass," "Vanya Ampoule," "Max Sleepy," and so on—are just as much subject and victim to its whims as you are.

Take, for instance, the military checkpoint that solved itself. Sprinting to a quest marker, I ran into a roadblock set up by the bullying, military Ward faction. I was not on good terms with the Ward. The main plot had given me multiple opportunities to side with them and I responded to every single one with gunfire, so the troops weren't keen on letting me pass.

(Image credit: GSC Game World)

A conundrum. I stood, brow furrowed, rain pattering on my hardened military exoskeleton, pondering how best to resolve the situation. I could open fire, I could sneak through, I could search for some other means of ingress.

Or I could count on the enormous pack of feral dogs that descended out of the darkness to lay total waste to the camp. Stalker 2's stars had aligned in such a way that these dogs, on their travels, happened to walk right into the checkpoint I needed to get through. Like a plague sent by God, they devoured every troop in there, taking heavy casualties themselves as the panicked soldiers let rip a hail of gunfire that turned night into day. After 15 seconds my problems were a dim memory. I put down the few remaining pups and carried on.

(Image credit: GSC Game World)

This, to me, is Stalker's soul, and it burns bright in Stalker 2. It's not just the checkpoint dogs, it's the quasi-bossfight with a big scary mutant that got cut short before I even breached the basement he was hiding in, because he'd gotten caught in a burner anomaly and died. It's watching two hostile factions destroy each other in a woodland skirmish before you pick off whoever remains for the loot. It's being pursued across the map by a faction you've tanked your rep with by, uh, picking off its members in the aftermath of a woodland skirmish. It's a world that obeys its own rules without favour or bias: action meets consequence meets action, on and on and on.

But it's also here, in the knotty systemic stuff like the faction system, that some of the worst bugs reared their head for me. Although I went to great lengths—including multiple extended bouts of mass murder—to communicate my dislike for the Ward, the game never quite processed that I had completely severed ties with them.

One of their main quests, loose and orphaned, appeared in my log and sat there abandoned right up to the ending. This confused things multiple times when I had an objective along the lines of 'kill all these Ward guys.' The enemies wouldn't attack, and when I had finished the slightly disconcerting task of euthanising my foes like cows, the quest log wouldn't update, leaving me stranded. To be fair, GSC's day-one patch mostly fixed this. The quest was updated, but the Ward still refused to treat me like an enemy. Better, but not quite a clean bill of health.

(Image credit: GSC Game World)

The rough with the smooth

But as a Stalker die-hard, even the bugs feel a bit like an old friend. It's these idiosyncrasies (the quirky, unforgiving, systems-driven game design and, yes, even the jank) that I worried GSC might be tempted to sacrifice in the leap to whatever 'AAA' means.

More fool me. The devs have been resolute in refusing to sand down any of the series' rough edges for a new audience of potentially more casual players. Stalker 2 does not hold your hand and the Zone does not care about you. This is a game where the first mutant you fight is completely invisible, and that won't hesitate to chuck its strongest enemies at you if you happen to make a wrong turn early on. Even if you escape, you spent precious resources on staying alive and your equipment probably got chewed up in the tussle; they'll get you next time.

(Image credit: GSC Game World)

It's almost uncanny to play something with such unsentimental, old-school design sensibilities that looks as good and feels as modern as Stalker 2 does, and it's a welcome return for one of PC gaming's greatest and most eccentric series. Yes, there are bugs aplenty, so maybe give it a little while before you dive in, and yes, the abrasive mechanics might put some people off. But it's great to be back in the Zone after over a decade away, and to find that it hasn't sacrificed the things that made it so special all those years ago.

]]>
https://www.pcgamer.com/games/fps/stalker-2-review/ uDF2JsmLdT9SLEA8Uw3Hib Wed, 20 Nov 2024 14:00:10 +0000
<![CDATA[ BenQ MOBIUZ EX321UX review ]]> The high-end PC monitor market has been all about OLED for the last 18 months or so. It's been one big-money OLED after another. But this BenQ MOBIUZ EX321UX has just landed to remind us that, yes, there is actually an alternative. This is a conventional 4K 32-inch LCD panel but with a mini-LED backlight enabling OLED-baiting HDR visual sizzle.

It's a full-array affair with 1,152 zones. On paper, it enables HDR1000 support, which means a peak brightness of 1,000 nits. This panel will also sustain a full 700 nits full-screen, something that no existing OLED monitor can even approach. Even the latest generation large-screen OLED tech tops out at about 250 nits full screen.

Indeed, with a typical price of around $1,200, this BenQ really does need to do something a bit special to justify its existence. Currently, 4K 32-inch OLED panels can be had from around $800. An LCD panel for 50% more cash is a serious ask.

Anyway, what with all the attention on OLED of late, it's worth briefly recapping what this mini-LED, full-array shizzle is all about. The idea is to use an active backlight to compensate for the fact that LCD panels allow light to leak through even when a given pixel is supposed to be switched off or showing a very dark colour.

BenQ MOBIUZ EX321UX specs

A photo of a BenQ MOBIUZ EX321UX gaming monitor on a desk

(Image credit: Future)

Screen size: 32-inch
Resolution: 3,840 x 2,160
Brightness: 1000 nits HDR, 700 nits typical
Colour coverage: 99% DCI-P3
Response time: 1 ms
Refresh rate: 144 Hz
HDR: HDR1000
Features: IPS panel, 1,152 dimming zones, HDMI 2.1 x3, DisplayPort 1.4, USB-C, KVM switch
Price: $1,199 | £1,108

To fix that, theoretically at least, you use a backlight split into zones. That allows the backlight intensity to vary across the panel according to the image being shown. For brighter areas of the image, you crank up those zones, for darker areas, the opposite.

If that sounds like a neat trick and a good idea, it comes with downsides. We'll dig into the details momentarily, but the basics involve weirdnesses arising from the algorithms used to control the zones and the resolution of the backlight itself. 1,152 zones might sound like a lot, but it actually means each zone is lighting up 7,200 pixels. That's not exactly precise.
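
For a sense of scale, here's the quick arithmetic behind that point, sketched in a few lines. The resolution and zone count come from the spec sheet; the 48 × 24 grid shape is my own assumption purely for illustration.

```python
# Rough arithmetic behind the backlight-resolution point. The 3,840 x 2,160
# resolution and 1,152 dimming zones are from the spec sheet; the 48 x 24
# grid shape is an assumption for illustration, not BenQ's published layout.
panel_w, panel_h = 3840, 2160
zones = 1152

print((panel_w * panel_h) // zones)   # 7200 pixels driven by each zone

# If the zones were laid out as a 48 x 24 grid (48 * 24 = 1,152), each zone
# would light a block of roughly 80 x 90 pixels, so a single bright pixel on
# a dark background still lights up thousands of its neighbours (hence haloing).
print(panel_w // 48, panel_h // 24)   # 80 90
```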

Full-array backlighting aside, the BenQ MOBIUZ EX321UX has plenty more going for it. It's a 144 Hz 4K panel, for starters. In this age of 480 Hz-plus panels, that's a relatively modest refresh. But it actually makes sense with the 4K native resolution. Even with upscaling, realistically, you're not going to be playing the latest games at 4K and 200 Hz-plus, especially if you don't have an RTX 4090.

Image 1 of 2

A photo of a BenQ Mobiuz EX321UX gaming monitor on a desk

(Image credit: Future)
Image 2 of 2

A photo of a BenQ MOBIUZ EX321UX gaming monitor on a desk

(Image credit: Future)

This is a very bright, clean, punchy IPS panel.

It'll do its 4K60 thing over both HDMI and DisplayPort. Oh, it'll do it over USB-C, too, which includes 65 W of power delivery. That's not going to be enough to keep a proper gaming laptop juiced. But it does make this monitor ideal to share between a gaming desktop and laptop or, whisper it, a console. Speaking of connecting multiple PCs, there's full KVM switch support, too. Incidentally, the HDMI interfaces have eARC 7.1 support and there's a built-in audio DAC.

If the feature set is pretty strong, BenQ has also leaned into the aesthetics a little. There's white plastic for the stand and the rear of the screen that gives a similar overall vibe to Samsung's high-end Odyssey gaming monitors and helps to justify at least some of the price premium. It's a pretty good-looking monitor, then, just not anything truly exceptional.

Of course, it's the image quality where the BenQ MOBIUZ EX321UX really needs to stand out and first impressions are good. This is a very bright, clean, punchy IPS panel. Notably, BenQ has gone for a matte rather than a glossy anti-glare coating, the latter being borderline compulsory on OLED monitors of late.

Image 1 of 6

A photo of a BenQ Mobiuz EX321UX gaming monitor's menu options

(Image credit: Future)
Image 2 of 6

A photo of a BenQ MOBIUZ EX321UX gaming monitor, showing the rear side of the display

(Image credit: Future)
Image 3 of 6

A close-up photo of a BenQ MOBIUZ EX321UX gaming monitor's input/output ports

(Image credit: Future)
Image 4 of 6

A photo of a BenQ MOBIUZ EX321UX gaming monitor, showing the USB ports on the bottom edge of the display

(Image credit: Future)
Image 5 of 6

A photo of a BenQ MOBIUZ EX321UX gaming monitor

(Image credit: Future)
Image 6 of 6

A photo of a BenQ MOBIUZ EX321UX gaming monitor

(Image credit: Future)

Matte versus glossy is a pretty subjective thing. But one thing is for sure, this panel is easily bright enough to offset any perceived dullness that can come with matte anti-glare coatings. It looks good for every one of those 1,000 nits in HDR mode and SDR content can be set almost uncomfortably bright.

On the subject of SDR versus HDR, SDR content looks absolutely bob-on in HDR mode, with perfect colour calibration. So, this is very much a display you can set to HDR and just leave it, regardless of content type.

(Image credit: Future)

Another brightness-related foible is that this display allows you to adjust the overall brightness levels in HDR mode. That's unusual and somewhat counter-intuitive given that brightness data is inherent to an HDR signal. But it is actually a welcome feature and one we wish more monitors would include.

Whatever, the HDR experience is mostly impressive. This is a really zingy, impactful display. It's also more prone to light up a backlight zone for a small image element than many full-array LCD monitors.

In the end, there are unavoidable compromises that have to be made when the backlighting resolution is so low.

On the plus side, that means you don't lose small details on dark backgrounds. Less welcome is the more apparent popping on and off of backlight zones as a small bright object moves across a dark background, plus the inevitable bright halo that comes with it.

In the end, there are unavoidable compromises that have to be made when the backlighting resolution is so low. So low, that is, compared to the resolution of the panel itself. And thus also very low compared to the perfect per-pixel lighting of an OLED panel, which effectively has over eight million zones to this panel's 1,152. All that said, this thing still looks pretty bloody marvellous in, say, Cyberpunk 2077 with all the HDR bells and whistles fired up.

(Image credit: Future)

Arguably the other major contention when it comes to LCD versus OLED is the assumption that the latter is far faster for pixel response. And it is. But it is also debatable how much that matters.

As IPS panels go, the BenQ MOBIUZ EX321UX is very quick indeed. There's very little blur or smear and, just as important, no obvious overshoot or inverse ghosting. For most gamers, most of the time, the response and clarity will be just fine. But an OLED panel is definitely that little bit better, and a much higher refresh OLED is better still.

Buy if...

✅ You want a seriously punchy 4K panel: At 1000 nits HDR and 700 nits full screen, this thing packs a serious visual punch.

Don't buy if...

❌ You've seen how much OLED monitors cost: For a full $400 less, you could have a 32-inch 4K OLED panel.

Now, were this monitor significantly cheaper than an OLED alternative, it would be easy to rationalise away the response deficit. You could acknowledge OLED's advantage, note how much more it costs and conclude this monitor still looks pretty good for the money.

But at $1,200, it's much trickier. Along with the comparatively clunky HDR rendering, this panel is clearly slower and slightly blurrier than an OLED monitor. And yet it's priced up with the most expensive OLED options. In return, you do get superior full-screen brightness. But that really is about it.

In the end, the BenQ MOBIUZ EX321UX is a very good example of its breed. As LCD gaming monitors with mini-LED, full-array backlights go, we like it a lot. It's got one of the better backlight algorithms we've seen. But at this price point, it's just so hard to recommend. Mini-LED panels like this would be much more appealing as a budget alternative to OLED, priced slightly above a conventional LCD monitor.

At, say, $600 to $700 and roughly $100 to $200 more than a basic 144 Hz 4K 32 incher while still undercutting all the OLED options, this monitor would be intriguing. But at basically double that price point, it really doesn't make much sense.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/benq-mobiuz-ex321ux-review/ d5Mbh5rNHeW7cpqpinmcJf Wed, 20 Nov 2024 10:40:35 +0000
<![CDATA[ Fractal Design Refine review ]]> Ever since it launched back in 2021, the Secretlab Titan Evo has been the best gaming chair, and for good reason. With soft materials, a great look, and a good price point for that quality, it's hard to justify getting anything else at that cost. The Fractal Design Refine, with its 'refined' look and comfortable pads, has its work cut out for it if it wants to knock the king off that top spot.

Starting with the very first thing you will do when you get a new chair, putting the thing together: this is a very easy build. The Refine not only comes with a mini user guide but a huge poster going over the main steps. You only need a few screws and an Allen key, all of which are provided, and mine even came with a few spare screws, which can be a blessing if you happen to lose one while moving parts around.

The chair arrives as a wheelbase, backrest, the seat itself, and a handful of extra parts, and everything simply slots together before you put the screws in. The seat has a few screws already attached in the box, which take a little bit of effort to get out, but the process is otherwise pretty painless.

Because this chair is quite heavy, you will benefit from having a second person to help out, but I managed to put it all together in less than fifteen minutes by myself. Thanks to how easily the armrests move (we'll get to that later), I did misread part of the instructions and pop an armrest on backwards, but the hardest part of fixing this was checking my bruised ego in an office full of people.

Fractal Refine Specs

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)

Max rec. height: 6 ft 6 in
Max rec. weight: 125 kg (275 lbs)
Recline: 125°
Material: Cold-cured foam
Armrests: 4D
Colours: Fabric Light / Fabric Dark / Mesh Dark / Mesh Light / Alcantara
Launch price: $549 | £475

The Fractal Design Refine is a great-looking chair, with a whole host of black and white toned colourways to choose from. Mine came in the standard 'Fabric Light', which is an opaque off-white colour, whereas 'Mesh Light' is more of a breathable transparent shade.

You then have 'Fabric Dark' and 'Mesh Dark', which are similar to the white options, and the line is finished with the Alcantara model, which is much more expensive at $899 and made out of Alcantara, a suede-like fabric.

With the Refine, Fractal Design has created a gaming chair that is much closer to an office chair in aesthetic. It doesn't have stripes, logos, or textures typically associated with the gamer aesthetic and the more muted colour palette of its lineup is indicative of this. It is quite a tall chair with a very slim back and a short seat. You don't fall into it like with bigger gaming chairs, and the lumbar support helps keep that posture right.

Importantly, the lumbar support can be adjusted very quickly and intuitively. There's a plate on the back of the chair you can manually move up and down to adjust where your back rests on it, and a knob can be turned which pushes it further into or away from the chair. This means, crucially, you can adjust it while sitting down to find the perfect spot to support your posture.

Image 1 of 5

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)
Image 2 of 5

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)
Image 3 of 5

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)
Image 4 of 5

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)
Image 5 of 5

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)

As is expected from a good gaming chair, you can adjust the height, tilt tension, and seat depth with a number of small handles on either side. For the most part, these function well and are easy to pull, though the seat depth handle feels a tad inconsistent.

The others have defined moments where I can tell the chair has registered the pull but one is a bit mushy. They are directly on the side of the seat cushion, which means you don't have to reach under your chair, as is the case with many gaming chairs.

The price point puts it at basically the same range as the Secretlab Titan Evo and it makes for a decent competitor

Once sat down and set up, this chair is very comfortable and super sturdy. The materials aren't super soft, but nor are they so hard that they don't welcome the weight of your body as you plop it down.

Everything has a rigidity that feels quite comforting and makes me feel less guilty about my posture after a long gaming session. The recline on the Refine is comfortable too, giving just enough travel between each lockable position to lean back a little.

As well as this, there are some smart bits of design, like the headrest that can be popped off with a button and hooked back in with metal prongs on the back. This not only allows you to change where it sits on the chair but stops you accidentally knocking it out of place.

Image 1 of 3

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)
Image 2 of 3

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)
Image 3 of 3

Pictures of the Fractal Design Refine gaming chair

(Image credit: Future)

This rigidity is one of the better parts of the chair, for me, but it's also responsible for my biggest problem. The seat itself has curved edges that rise up at the sides, so if you have particularly big thighs (like myself) or like to pop a leg under yourself as you sit, the cushion will jut into you.

Buy if…

✅ You don't like the aesthetic of most gamer chairs: This is the least 'gamery' chair I've used in some time, with a clean and pretty aesthetic.

✅ You like a smaller seat: In both its length and its width, the shape of this chair helps sit you upright, which is a pretty smart design.

✅ You don't want a Secretlab: At its price point, this offers a good alternative to the best gaming chair you can currently buy.

Don't buy if…

❌ You have bigger legs: If your thighs are a little on the big side or you like to partly sit on your leg in your chair, the curves at the side of the cushion will likely dig into you.

❌ You like your armrests a certain way: Despite how sturdy everything else is, the armrests move back and forth with little pressure, which can be frustrating once you've gotten them just right.

❌ You don't like a firm chair: Though comfortable, the cushion is quite strong, and this means you can't fully collapse into it like you might some other chairs.

The recommended height and weight for the chair will, of course, warn you of that a little, but using weight to gauge how your legs will react to the cushion is an inexact science.

The armrests are comfortable, though being made out of plastic and being quite large, they move back and forward with very little effort. With a click of the button on the outer side, you can push them up and down and they snap into place, which is very intuitive, but there's no such defined lock for pushing them back and forward.

I have often found myself knocking an armrest out of its best position by casually moving my arms back, and this left me never quite relying on them. This is a shame, as the 4D aspect of the armrests works well and they can move relatively comfortably to most positions you might need them in.

Though these flaws definitely betray some of that otherwise premium feel, I do appreciate some of the nicest parts of this chair. The price point puts it at basically the same range as the Secretlab Titan Evo and it makes for a decent competitor with a nice feel and great look.

There's an elegance to a lot of Fractal Design's choices here that really comes through in the Refine's presentation and feel, but given some of the drawbacks, it's hard to choose this when you have so many excellent alternatives out there.

]]>
https://www.pcgamer.com/hardware/gaming-chairs/fractal-design-refine-review/ PA6PfZraXQ5L4oangkyhyE Wed, 20 Nov 2024 10:34:18 +0000
<![CDATA[ Half-Life 2 review ]]> 20 years ago, Half-Life 2 was the recipient of two of PC Gamer's highest review scores ever: 96% in PC Gamer UK, and 98% in the US version of the magazine, which at the time published separate reviews.

"This is the one unmissable game," UK reviewer Jim Rossignol concluded. "It's time to get that cutting-edge PC system. Sell your grandmother, remortgage the cat, do whatever you have to do. Just don't miss out."

I didn't promise any cats to the bank, but perhaps I would have if I'd needed to. My obstacle to playing Half-Life 2 was a pathetic lack of furniture: All I had in my studio apartment at the time was a futon, one Ikea chair, and a wobbly breakfast table that was too compact for a comfortable keyboard, mouse, and monitor arrangement. My Sony desktop PC, Logitech peripherals, and tiny LCD display instead lived on the floor under a window, flanked by a radiator that worked in tandem with the meager heat output of my early-2000s GPU to keep me warm.

If there was a good reason I was living like a disgraced detective in November 2004, it's been compressed in my memory to 'I was in college.' Maybe I spent all my desk money on Half-Life 2. Whatever the case, after setting up an account with a frustrating new service called Steam, I played through Half-Life 2 lying on my stomach with my neck bent at 90 degrees, as if posing for the 'what not to do' section of an ergonomics textbook. We did what we had to do, as Jim said.

20 years later, I'm proud to say that I now own a desk, but I like to think it hasn't changed me. Half-Life 2 certainly did, though, and along with Steam it set a new course for PC gaming. To celebrate the big anniversary of Valve's landmark shooter, we've republished the original text of PC Gamer UK's Half-Life 2 review below—enjoy the brief trip back to one of PC gaming's most exciting moments. —Tyler Wilde, US Editor-in-Chief

Half-Life 2 review - PC Gamer UK, November 2004

It was all in that moment when I just sat back and laughed. I couldn't believe it was quite this good. I chuckled in muddled disbelief, expectations utterly defied. My nervous fingers reloaded the level, knowing that I had to see that breathtaking sequence one more time. It was then that I knew for certain: Valve had surpassed not only themselves, but everyone else too. Half-Life 2 is an astounding accomplishment. It is the definitive statement of the last five years of first-person shooters. Everything else was just a stopgap.

Half-Life 2 is a magnificent, dramatic experience that has few peers.

Half-Life 2 is a near perfect sequel. It takes almost everything that worked from Half-Life and either improves on it, or keeps it much the same. But that simple summation undersells how the Valve team have approached this task. Half-Life 2 is a linear shooter with most of the refinements one would expect from years of work, but it is also a game of a higher order of magnitude than any of the previous pretenders to the throne. The polish and the stratospheric height of the production values mean that Half-Life 2 is a magnificent, dramatic experience that has few peers.

It would be madness for me to spoil this game by talking about the specific turn of events, so spoilers are going to be kept to a minimum. We're going to talk about general processes and the elements of style and design that make Half-Life 2 such an energising experience. Key to this is the way in which Gordon's tale is told. Once again we never leave his perspective. There are no cutscenes, no moment in which you are anything but utterly embedded in Gordon's view of the world. Everything is told through his eyes. And what a story it is. Gordon arrives at the central station at City 17—a disruptive and chilling dystopia. And from there? Well, that would be telling. This is not the contemporary America that Gordon seemed to be living in during the original Half-Life. The events of Black Mesa have affected the whole world. The crossover with Xen has meant that things have altered radically, with hyper-technology existing alongside eastern bloc dereliction.

(Image credit: Valve/PC Gamer)

The world is infested with head-crab zombies and the aliens that were once your enemies now co‑exist amongst the oppressed masses. This very European city is populated by frightened and desperate American immigrants, and sits under the shadow of a vast, brutalist skyscraper that is consuming the urban sprawl with crawling walls of blue steel. It's a powerful fiction. City 17 is one of the most inventive and evocative game worlds we've ever seen. The autocratic and vicious behaviour of the masked Overwatch soldiers immediately places you in a high-pressure environment. People look at you with desperate eyes, just waiting for the end to their pain, an end to the power of the mysterious Combine. Who are they? Why are you here? Who are the masterminds behind this tyranny? The questions pile up alongside the bodies.

Half-Life 2 isn't big on exposition, but the clues are there. You're thrust into this frightening near-future reality and just have to deal with it. Your allies are numerous, but they have their own problems. Your only way forward is to help them. And so you do, battling your way along in this relentless, compelling current of violence and action, gradually building up a picture of what has happened since Black Mesa. The Combine, the military government that controls the city in a boot-stamping-face kind of way, are a clear threat, but quite how they came to be and what their purposes are become aching problems. Once again Gordon remains silent, listening to what he is told so that you can find those answers for yourself.

(Image credit: Valve/PC Gamer)

But even with Gordon's vaguely sinister silence (something that is transformed into a subtle joke by the game's characters) there are reams of dialogue in Half-Life 2. It is spoken by bewilderingly talented actors and animated with almost magical precision. Alyx, Eli, Barney and Dr Kleiner are delightful to behold, but they only tell part of the story. There are dozens of other characters, each with their own role to play. And each one is a wondrous creature. They might be blemished, even scarred, with baggy eyes and greasy hair, but you can't tear your eyes away. People, aliens and even crows, have never seemed quite so convincing in a videogame. Doom 3's lavish monsters are more impressive, but Half-Life 2's denizens are imbued with life. More importantly, they offer respite. Half-Life 2's world is a high-bandwidth assault on the senses that seldom lets up. That moment when you see a friendly face is a palpable relief. A moment of safe harbour in a world of ultraviolence. As Gordon travels he is aided by the citizens of City 17 and the underground organisation that aims to fight the oppressors. Their hidden bases are, like the characters who inhabit them, hugely varied—an abandoned farm, a lighthouse, a canyon scrapyard and an underground laboratory—each superbly realised.

(Image credit: Valve/PC Gamer)

It is this all-encompassing commitment to flawless design that makes Half-Life 2 so appealing. Even without the cascade of inventiveness that makes up the action side of events, the environments become a breathtaking visual menagerie. Cracked slabs and peeling paint, future-graffiti and mossy slate, tufts of wild grass and flaking barrels, shattered concrete and impenetrable tungsten surfaces—City 17 and its surrounding landscape make you want to keep exploring, just to see what might be past the next decaying generator or mangled corpse. Whether you find yourself in open, temperate coastline or mired in terrifying technological hellholes, Half-Life 2 presents a perfect face. The first time you see ribbed glass blurring the ominous shape of a soldier on the other side, or any time that you happen to be moving through water, you will see next-generation visuals implemented in a casual, capable manner. Half-Life 2 doesn't have Doom 3's groundbreaking lighting effects, but objects and characters still have their own real-time shadows and the level design creates a play of light and dark that diminishes anything we've seen in other games. The very idea that people have actually created this world by hand seems impossible, ludicrous. The detritus in the back of a van, the rubbish that lies in a stairwell—it all seems too natural to have come about artificially. Add to this the split-second perfection of the illustrative music, as well as the luscious general soundscape, and you have genuinely mind-boggling beauty.

But these virtual environments are little more than a stage on which the action will play out. And what jaw-dropping, mind-slamming action that is. What's tough to convey in words, or even screenshots, is just how much impact the events of combat confer. This is a joyous, kinetic, action game. The concussive sound effects, combined with the physical solidity of weapons, objects, enemies and environment, make this a shocking experience. Each encounter is lit up with abrupt and impressively brutal effects. Explosions spray shrapnel and sparks, bullets whack and slam with devastating energy. The exploding barrel has never been such a delight. You think that you've seen exploding barrels before, but no: these impromptu bombs, like everything else in the game, are transformed by the implementation of revelatory object physics. Unlike previous games, the object physics in Half-Life 2 are no longer a visual gimmick—they are integral to the action and, indeed, the very plot.

Gordon can pick up anything that isn't bolted down and place, drop or hurl it anywhere you choose. Initially this consists of little more than shifting boxes so that you can climb out of a window, but gradually tasks increase in complexity. Puzzles, ever intuitive, are well signposted and entertaining. If they're tougher than before they're still just another rung up on what you've already learned. This is immaculate game design. There are a couple of moments in these twenty hours where something isn't perfect in its pace or placing, but these are minor, only memorable in stark contrast to the consistent brilliance of surrounding events. There is always something happening, something new. You find yourself plunging into it with relish. Just throwing things about is immediately appealing. You find yourself restraining the impulse to just pick up and hurl anything you encounter. (Free at last, I can interact!) Black Mesa veteran Dr Kleiner is remarkably relaxed about you trashing half his lab, just to see what can be grabbed or broken. Combine police take less kindly to having tin cans lobbed at their shiny gasmasks.

But the core process of this new physics, the key to the success of the game, is to be found in the Gravity Gun. Once you've experienced vehicular action and got to grips with combat, Half-Life 2 introduces a new concept—the idea of violently manipulating objects with this essential tool. The gun has two modes, one drags things toward you and can be used to hold, carry or drop them. The other projects them away and can either be used to smash and punch or, if you're already holding something, hurl it with tremendous force. A filing cabinet becomes a flying battering ram, dragged towards you and then fired into enemies, only to be dragged back and launched again to hammer your foe repeatedly, or until the cabinet is smashed into metal shards. Pick these up and you can blast them through the soft flesh of your enemies.

The gravity gun isn't just another a weapon, it's the soul of Half-Life 2.

Killing the badguys with nearby furniture becomes habitual, instinctive. Or perhaps you need cover from a sniper—picking up a crate will give you a makeshift shield with which to absorb some incoming fire. Likewise, you immediately find yourself using the gravity gun to clear a path through debris-blocked passages, or to pick up ammo and health packs, or to grab and hurl exploding barrels at encroaching zombies, setting them ablaze and screaming. You can even use it to grab hovering Combine attack-drones and batter them into tiny fragments on concrete surfaces. Soon the gravity gun is proving useful in solving puzzles, or knocking your up-turned buggy back onto its wheels. Yes, a buggy. I'll come back to that. The gravity gun isn't just another a weapon, it's the soul of Half-Life 2. Do you try to bodge the jump over that toxic sludge, or take the time to use the level's physics objects to build an elaborate bridge? Do you waste ammo on these monsters or pull that disc-saw out of where it's embedded in the wall? Of course, you always know what to do. When there's a saw floating in front of your gravity gun and two zombies shamble round the corner, one behind the other, well, you laugh at the horrible brilliance of it. Yeah, I think that was the moment that I sat back and laughed. It's just too much.

Sometime after these experiments in viscera comes Gordon's glorious road trip. Simplicity incarnate, the little buggy is practically indestructible, but also an essential tool for making a journey that Gordon can't make on foot. Dark tunnels, treacherous beaches and bright, trap-littered clifftops become the new battleground. Like the rest of the game there are oddities and surprises thrown in all the way through. The bridge section of this journey would make up an entire level in lesser shooters. And yet here it is, just another part of the seamless tapestry of tasks that Gordon performs. Also illustrative of the game as a whole is the way in which the coast is strewn with non-essential asides. OK, so you're zooming from setpiece to setpiece, but do you also want to explore every nook and cranny, every little shack that lies crumbling by the roadside? Of course you do. This is a game where every hidden cellar or obscure air-duct should be investigated; you never know what you might find.

(Image credit: Valve/PC Gamer)

Investigating means using the torch that, oddly, is linked to a minor criticism of the game. Both sprinting and flashlight use are linked to a recharging energy bank. It's clear why this restriction was imposed, but it's nevertheless a little peculiar. The quality of the game meant that I was searching, rather desperately, for similar complaints. Smugly I assumed that my allies in a battle were non-human because that way Valve dodged the lack of realism and other problems created by fighting alongside human allies. Of course my lack of faith was exposed a few levels later, when I found myself in the midst of the war-torn city fighting alongside numerous human allies who patched me up, shouted at me to reload, apologised when they got in the way and fought valiantly against a vastly superior force. What a battle that was. I want to go back, right now. The striders, so impressive to behold, are the most fearsome of foes. Fighting both these behemoths and a constant flow of Combine troops creates what is without a doubt the most intense and exhilarating conflict ever undertaken in a videogame. The laser-pointer rocket launcher is back and even more satisfying than ever before. Rocket-crates give you a seemingly infinite resupply to battle these monsters but it's never straightforward. Striders will seek you out, forcing you under cover, while the whale-like flying gunships will shoot down your rockets, inducing you to resort to imaginative manoeuvring to perform that killing blow. Even dying becomes a pleasure—you want to see these beasts smash through walls and butcher the rebels, again and again. Oh Christ, what will happen next?

I could talk about how those battles with the striders almost made me cry, or about the events that Alyx guides you through so cleverly, so elegantly. I could talk about the twitchy fear instilled by your journey through an abandoned town, or the way that the skirmishes with Overwatch soldiers echoes the battles against the marines in the original Half-Life. I want to rant and exult over this and that detail or event, this reference or that joke. I want to bemoan the fact that it had to end at all (no matter the excellence of that ending). And I'm distraught that we'll have to wait so long for an expansion pack or sequel. I even had this whole paragraph about how CS Source will be joined by an army of user-fashioned mods as the multiplayer offering for this definitively singleplayer game. But we're running out of space, out of time. There's so much here to talk about, but in truth I don't want to talk, I just want to get back to it: more, more, more... You have to experience it for yourself. This is the one unmissable game. It's time to get that cutting-edge PC system. Sell your grandmother, remortgage the cat, do whatever you have to do. Just don't miss out.

]]>
https://www.pcgamer.com/games/fps/half-life-2-review-uk/ 8GDqytpUUm2SVE5RqxMq5M Wed, 20 Nov 2024 01:38:29 +0000
<![CDATA[ Philips NeoPix 750 review ]]> The Philips NeoPix 750 is a $487 / £399 all-in-one projector with a focus on providing that cinematic experience on a budget. It tops out at 1080p, which is to be expected. However, that much cheaper price doesn't excuse its cheap build quality. Admittedly, projectors represent one area of tech that I'm quite snobby about, but with a sea of laggy Android TV boxes with a barely acceptable picture, this is a reasonable stance in my mind.

Immediately out of the box, I noticed that the NeoPix 750 is light, but it's not slim or easily packed away. Others in its bracket from competitors, such as XGIMI or Samsung, go for a much smaller form factor, and the bulky NeoPix is just a pain to set up unless I completely rearrange my room.

It's a similar size to a classic VHS player, except its footprint is almost as wide as it is deep. When I saw the promotional images I was imagining a far smaller device, but no, the NeoPix makes itself quite known.

The sense of cheapness seeps into the projector's focus ring. It's thankfully stiff, so there's no chance of it randomly going out of focus. However, it consistently felt like it might snap off with the wrong twist.

NeoPix 750 specs

Philips NeoPix 750 projector

(Image credit: Future)

Display Technology: LCD
Native Resolution: 1080p FHD
Throw distance: 120-inch @ 144-inch distance (367 cm)
Brightness: 700 ISO Lumens
LED lamp Life: 30,000 hours
Inputs: 1x HDMI, 1x USB-C, 1x USB-A, 1x 3.5mm headphone jack
Weight: 3.9 kg (8.6 lbs)
Size: 305 x 119.5 x 319 mm
Price: £399 | $487 | €399

With its plasticky remote, I muddled through the setup process. It uses Android, but is masked under Philips' custom "LumenOS", and even that feels mediocre. Seeing a near-stock version of an ancient onscreen Android keyboard in the setup menu really set the bargain-bin mood.

The setup wasn't painless, either. Getting things like Wi-Fi and the remote paired was simple enough, but the limitations of the projector itself reared their ugly head. More expensive projectors I've used in the past—and currently own—beam a much wider area for you to work with and display the image how you want.

Philips' NeoPix 750 doesn't project the largest image. In fact, it's quite dainty—not its fault, it's a budget-friendly device—but it will absolutely cause issues if you don't plan ahead.

It does feature automatic and manual keystone options, which let you manipulate the image to match whatever you're projecting onto. I couldn't get the automatic option to stop angling downwards, and the manual option only reached so far.

Even when using the included digital zoom, shrinking or enlarging the image couldn't get it to fit, and I eventually gave up and beamed it onto an adjacent wall. Altering the picture to fit a new placement constantly felt like a battle, though stretching a projected image across the different axes always does—even with more expensive projectors.

Eventually, I opted to just physically move it back as far as I could to get the size and image I wanted. I just wish it would let me break the boundaries a little more, but of course, the NeoPix probably ran out of runway to do so—both physically and due to the software’s limitations.
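For anyone trying to plan ahead, a rough sketch of the throw maths helps. This assumes the image scales linearly with distance, as it does for a fixed-lens projector, and uses the quoted 120-inch image at a 367 cm throw as the reference point:

# Rough NeoPix 750 throw estimates, scaled linearly from the quoted
# 120-inch image at a 144-inch (367 cm) distance.
quoted_diagonal_in = 120
quoted_distance_cm = 367

cm_per_diagonal_inch = quoted_distance_cm / quoted_diagonal_in  # ~3.1 cm of throw per inch of image

for diagonal in (80, 100, 120):
    distance_cm = diagonal * cm_per_diagonal_inch
    print(f'{diagonal}-inch image: roughly {distance_cm:.0f} cm ({distance_cm / 100:.1f} m) of throw')

In other words, even a modest 100-inch picture wants around three metres of clear space between the lens and the wall, which is exactly the planning-ahead problem described above.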

Image 1 of 2

Philips NeoPix 750 projector

(Image credit: Future)
Image 2 of 2

Philips NeoPix 750 projector

(Image credit: Future)

Placing this on your coffee table and against a naked wall will work great (just as Philips itself advertises), but anyone whose room doesn't conform to a marketing bod's ideal, or who has a centerpiece to build the "cinematic" experience around, will unfortunately find themselves fidgeting with placement for much longer than anticipated.

Part of the issue is that the power cable doesn't just slot into the back. Instead, it's supposed to sit flush underneath, keeping the area tidy. However, the cable groove ends before the projector's edge, meaning it just constantly looks like it's about to fall out of place.

Outside of the physical, LumenOS is inoffensive despite its user interface's ancient aesthetics. Downloading apps, navigating the menus, and watching things all feel quite smooth. Again, this is just Android with an additional layer, so if you've ever navigated a modern smart TV, you'll immediately understand what's happening.

So what about this "cinematic experience" then? Well, for your $500 / £400, you actually get quite a decent image, especially as the winter rolls in and it gets dark earlier in the day. The NeoPix 750 really benefits from a dark room, as with any projector, but using it during the daytime doesn't immediately present any serious viewing hardships.

Image 1 of 3

Philips NeoPix 750 projector

(Image credit: Future)
Image 2 of 3

Philips NeoPix 750 projector

(Image credit: Future)
Image 3 of 3

Philips NeoPix 750 projector

(Image credit: Future)

Unfortunately, there’s no specific gaming mode here.

Sure, exploring the grotty buildings of the original Silent Hill 2 at 2pm didn't present the best image, but it wouldn't have on a modern TV either. Once it gets dark, however, the image is really quite solid.

Unlike some other budget projectors, such as certain XGIMI options, the NeoPix natively supports Netflix and other apps, thankfully. That made testing for abnormalities far less of a headache.

The colours pop and have a delightful vibrancy. Most importantly, they also appear to be fairly accurate. Watching movies presents no major colour aberrations, and the 1080p image is crisp. Even when streaming lower-quality video from YouTube or my Plex server, the NeoPix doesn't falter.

Gaming is something you're also going to want to plan around. While I don't think the average player will notice if their game has input lag, I definitely noticed it in twitch-heavy games. There's a noticeable few milliseconds of delay between input and action on the screen, which is masked in slower-paced titles.

Esports and precision titles like Street Fighter will probably feel out of sync. Even a few levels of Warhammer 40K: Boltgun on the Steam Deck, with a wired controller, had more than a few deaths down to input lag. Unfortunately, there’s no specific gaming mode here.

This said, titles like the recent RPG Metaphor: ReFantazio and indie darling Balatro are perfect fits for the NeoPix. Balatro became a multiplayer affair as my partner and I dabbled in the higher-ranking stakes, and Metaphor really let the speakers shine.

Image 1 of 2

Philips NeoPix 750 projector

(Image credit: Future)
Image 2 of 2

Philips NeoPix 750 projector

(Image credit: Future)
Buy if...

You don't want to break the bank: Its $487 / £399 price point does give you a decent image and sound, despite the cheap body.

You want to game on an even bigger screen: The NeoPix can go up to 120 inches, so as long as you keep your gaming expectations in check, you'll have a grand time.

Don't buy if...

Your favourite game is an esports title: The input lag will drive you up the wall, making you miss key winning moments.

You're expecting a top-of-the-line image: While the picture is crystal clear, it's still a 1080p picture on a mid-tier projector.

Its thunderous choir soundtrack really comes to life when it's blaring out of a good set of speakers. Philips might have cheaped out on some of the build quality, but the actual experience of watching or playing something on it feels quite up to the task.

Thankfully its inbuilt speakers are quite loud, because so are the fans. Projectors have a tendency to hum (it's the nature of the beast), but the NeoPix 750 is loud, and hot. My partner mentioned on more than one occasion that the room felt much warmer with it on than not, something we've rarely experienced with other projectors that have come and gone.

The Philips NeoPix 750 isn't the best budget projector on the market right now. Despite their weird Android issues (pre-2024 models still have some apps unsupported), XGIMI still rules the roost in terms of form factor and quality. However, if you want something below that £500 mark, you can absolutely do far worse. Projectors will never be the first choice for gaming, and the NeoPix is no different. As long as you keep those expectations in check, you'll have a good time at a good price.

]]>
https://www.pcgamer.com/hardware/philips-neopix-750-review/ dqPsNd8YvohYPUentjRiL8 Tue, 19 Nov 2024 21:38:56 +0000
<![CDATA[ Turtle Beach Kone II Air review ]]> If you're looking for a review on the Roccat Kone 2 Air, you should know that the Roccat brand has been assimilated under the Turtle Beach brand name. So I'm actually looking at the next iteration of the Kone Air, here, minus the associated Roccat branding. Got it? Cool.

The Turtle Beach Kone 2 Air is a wireless gaming mouse marketed as an ergonomic wonder with up to 350 hours of battery life. Everything on the box makes me very excited, because it looks like Turtle Beach has finally rounded out the Kone lineup with a chonky beast of a gaming mouse. But does it live up to the hype in actual, everyday testing?

Previously, I spent some time with another wireless sister of the Kone 2 Air—the Kone Pro Air. Just getting the Kone 2 Air out of the box, I was already amazed at the leap in quality compared to both the Pro and what I've seen of its predecessor.

The clearest difference between the Kone 2 Air and its Pro counterpart is the weight. The Pro is more suited to FPS games, weighing in at just 2.6oz / 75g, while the Kone Air has always been the chunkier option for players who don't fling their mouse around willy-nilly. With two AA batteries under the hood, the old Kone Air was around 140g… not an issue in and of itself; people like chunky mice, myself included. It's the weight imbalance that users found problematic.

Kone II Air specs

A Turtle Beach Kone II Air gaming mouse with RGB enabled. And a cute dog sleeping.

(Image credit: Future)

Buttons: 7
Connectivity: USB Type-C, 2.4GHz wireless & Bluetooth dual wireless.
Sensor: Owl-Eye 26K Optical Sensor
Max DPI: 26,000
Max acceleration: 50g
Max speed: 650 IPS
Polling rate: up to 1,000 Hz
Battery life: Up to 350 hrs
Size: 13 x 4.4 x 8.2 cm / 5.1 x 1.7 x 3.2 in
Weight: 110g / 3.8oz
Features: Infinite scroll, RGB Lighting, Grip tape and USB-A to USB-C Transmitter Adapter included
Price: $120 / £120

Thankfully, the Kone 2 Air has done away with AA battery power. It's much more balanced, and you don't need to keep spare batteries to hand. You can just plug in the lovely, light braided USB Type-C cable Turtle Beach has included and click away, unhindered.

Where the build quality is concerned, the Kone 2 Air feels good and sturdy in hand. The ergonomics are lush, with a thumb rest and a button on it that isn't as easy to accidentally click as it looks. There's also something about the thumb groove I really like as a claw grip mouse user. It makes it easy to keep hold of even without slapping the included silicone grips on the sides.

Every click of the Kone 2 Air's seven well-placed buttons feels purposeful. There's a satisfying, light press that doesn't sound hollow like a lot of lighter gaming mice do.

The design has also moved away from those awful metal scroll wheels Kone used to rep, replacing them here with a soft silicone one that lights up with RGB at the edges. Not only does it have the fancy, programmable tilt click function, it also unlocks and goes into freespin mode. If you're thinking infinite scroll is a pointless novelty, I can vouch that it's one of the single best features I've found in a gaming mouse. I've been using the same function on the Logitech G502 and G502 X daily for years and it never stops being useful.

Image 1 of 2

A Turtle Beach Kone II Air gaming mouse with RGB enabled. And a cute dog sleeping.

(Image credit: Future)
Image 2 of 2

A Turtle Beach Kone II Air gaming mouse with RGB enabled. And a cute dog sleeping.

(Image credit: Future)
Buy if...

✅ You move around and switch between games/playstyles: The Kone 2 Air's Easy Shift function and five onboard profiles make it great for changing things up, and playing away. Plus, the battery life is pretty fantastic.

✅ You're looking for a well-rounded, ergonomic mouse: The Kone 2 Air is super comfortable to hold, with a nice grip and thumb rest, and it's great for larger hands.

Don't buy if...

❌ You're trying to save money: I know the prospect of a gorgeous ergonomic mouse with seven buttons sounds great, but there are cheaper wireless gaming mice out there that are almost as good.

❌ You're a bigtime FPS gamer: There's a bit of latency going on with the Kone 2 Air's sensor. It's accurate, but the polling rate is a little inconsistent. With that and its weight in mind, this might be better suited to slightly slower-paced games.

In testing, the Kone 2 Air's Owl-Eye 26K Optical Sensor is relatively accurate, with a nice smooth curve and minimal outliers on MouseTester's xCount vs. Time chart. But while 1000 Hz polling looks great on paper, there are quite a few inconsistencies when it comes to the update rate. It's not enough to notice in general use, but there's around 8ms latency and quite a lot of wavering when shifting the Kone 2 Air around quickly.
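To make 'inconsistent polling' a little more concrete, here's a generic sketch of the kind of check a tool like MouseTester is doing under the hood. The timestamps are made-up example values, not my actual capture data, and this isn't MouseTester's real export format: at a steady 1,000 Hz, reports should land every 1 ms, and anything that drifts from that interval shows up as jitter.

# Generic polling-jitter check: a steady 1,000 Hz rate means one report every 1 ms.
# The timestamps below are invented example values in milliseconds, not real capture data.
timestamps_ms = [0.0, 1.0, 2.1, 2.9, 4.4, 5.0, 6.0, 7.6]

intervals = [later - earlier for earlier, later in zip(timestamps_ms, timestamps_ms[1:])]
expected_ms = 1.0  # 1 / 1,000 Hz
deviations = [abs(interval - expected_ms) for interval in intervals]

print("intervals (ms):", [round(i, 2) for i in intervals])
print("worst deviation from 1 ms:", round(max(deviations), 2), "ms")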

One thing to note is that the Swarm 2 software has come a long way. It's got a lovely clean look, and it importantly lets you create your own macros for those seven buttons.

Taking more cues from the Logitech G502 and crew, the Kone 2 Air also includes an Easy Shift function, so you can switch some of the buttons up on the fly, and quickly back again. There's space for five on-board profiles, too, making it a godsend if you play away and switch between games/playstyles a lot. The fact you can get a good few days of battery life out of it over wireless connection (a week or more in Bluetooth mode) makes the portable aspect all the more enticing.

Honestly, I'm struggling to find something bad to say about the Kone 2 Air. Sure, it's not the cheapest wireless gaming mouse out there, nor is it the lightest, or the mouse with the most buttons. But having rectified the issues with previous models, the Kone 2 Air sits perfectly in that sweet spot for weighty, ergonomic mice, and it's worth your consideration even with that $120/£120 price tag.

]]>
https://www.pcgamer.com/hardware/gaming-mice/turtle-beach-kone-ii-air-review/ NFTevRbixXmHQ2URWgRnv8 Tue, 19 Nov 2024 17:20:23 +0000
<![CDATA[ Razer Freyja review ]]> Over the past year, it has become clear that Razer is starting to make a push toward immersive gaming with its Sensa HD Haptics. With both the Razer Freyja and the Razer Kraken V4 Pro featuring Sensa HD Haptics, gamers have a whole new way to immerse themselves in their gameplay.

Being a gamer in my late thirties with three children and all the aches and pains that come with that, comfort is one of my biggest priorities when it comes to gaming. So when I saw Razer had released a haptic feedback cushion for gaming chairs, my first concern wasn’t how well it would work, but instead, how comfortable it is. So I strapped the Razer Freyja to my Secretlab Titan Evo 2022 and was pleasantly surprised by just how comfortable this gaming cushion is.

The Razer Freyja is a one-size-fits-all gaming cushion, so I had expected it to move around and sit uncomfortably on my chair, but after testing it on several chairs (even my dining table chairs) I was surprised by just how well it stayed in place.

Not only does the Razer Freyja feel good, but it looks good too. Sporting the Razer logo on the top in the brand's signature green, as well as green stitching around the edge of the haptic gaming cushion, it's a stylish design that complements even the best gaming chairs.

Razer Freyja specs

A Razer Freyja cushion set up on a Secretlab Titan gaming chair.

(Image credit: Future)

Connectivity: 2.4 GHz, Bluetooth
Power: AC adapter (with quick release)
Compatibility: PC, Android
Haptic motors: 6
Max weight: 299 lbs
Price: $300/£300

But at $300/£300, the Razer Freyja is an expensive gaming cushion. Thankfully, it's so much more than just a cushion. With Sensa HD Haptics and six actuators, the Razer Freyja has been designed to rumble you so hard even your ancestors will feel it. This is achieved by offering haptics across your body, thanks to the cushion's six customizable zones.

Each zone can be individually adjusted with the Razer Synapse 4 software, letting you change the intensity without impacting the other zones. While I found customizing the zones pointless when playing Sensa HD-compatible games, I did find it useful in the Audio-to-Haptics mode while listening to music, as I was able to reduce the intensity on my back, without impacting my lower body.

The Razer Freyja’s Audio-to-Haptics mode includes four haptic profiles, each with its own input range and haptic gain. The controlled profile works well with bass and is almost never triggered by voice and high-range frequencies, whereas the dynamic profile offers the most immersive experience by being triggered by higher frequencies as well as lower.

If you feel comfortable playing around with the ranges, there’s also a custom profile where you can add multiple audio cues and set your own ranges. However, I found I was happy using balanced for gaming, and dynamic for music, so I didn’t see the need to invest too much time in a custom profile.
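For the curious, the broad idea behind these audio-to-haptics profiles is band-splitting: measure how much of the audio's energy sits in a given frequency range and turn that share into rumble intensity. The snippet below is a toy illustration of that general concept, not Razer's actual Synapse pipeline, and the band edges are arbitrary example values:

import numpy as np

# Toy audio-to-haptics idea: the share of a clip's spectral energy inside a band
# becomes a 0-1 rumble intensity. Illustrative only, not Razer's implementation.
def band_intensity(samples, sample_rate, low_hz, high_hz):
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band_energy = spectrum[(freqs >= low_hz) & (freqs <= high_hz)].sum()
    total_energy = spectrum.sum()
    return float(band_energy / total_energy) if total_energy > 0 else 0.0

# A pure 60 Hz bass tone puts nearly all of its energy in a 20-250 Hz "bass" band.
sample_rate = 48000
t = np.linspace(0, 0.1, int(sample_rate * 0.1), endpoint=False)
bass_tone = np.sin(2 * np.pi * 60 * t)
print(round(band_intensity(bass_tone, sample_rate, 20, 250), 2))  # close to 1.0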

A Razer Freyja cushion set up on a Secretlab Titan gaming chair.

(Image credit: Future)

Although at first I considered haptic feedback to be a novelty that wouldn't impress me when it came to music and movies, I must admit I've grown fond of being able to literally feel the bassline as I listen to music on my Razer Leviathan V2 Pro.

The other mode, Sensa HD Games, which is one of the Razer Freyja's main selling points, sadly didn't impress me as much as the Audio-to-Haptics mode. Sensa HD Games has been designed to offer haptic feedback at crucial stages of the game.

While actually being able to feel the casting of a spell in Hogwarts Legacy is exciting, the mode felt somewhat underwhelming, with long periods without haptic feedback, including during cutscenes, where it could have added to the cinematic experience.

Image 1 of 3

The Razer Synapse application set up for the Razer Freyja cushion.

(Image credit: Razer)
Image 2 of 3

The Razer Synapse application set up for the Razer Freyja cushion.

(Image credit: Razer)
Image 3 of 3

The Razer Synapse application set up for the Razer Freyja cushion.

(Image credit: Razer)

When Sensa HD kicks in, there’s no denying it’s an awesome technology. Not only was I able to feel the gameplay, but after a little time, I was even able to feel the different rumbles between the spells being cast and identify them simply from how they felt. But, as cool as that is, it didn’t take long before I found myself going back to the Audio-to-Haptics mode for a more immersive feel.

It’s not just the feature itself that I found underwhelming, the list of Sensa HD-compatible games is somewhat disappointing too. Although, with it being a relatively new technology, we can expect the list of developers including Sensa compatibility in their games to grow.

A Razer Freyja cushion set up on a Secretlab Titan gaming chair.

(Image credit: Future)
Buy if...

✅ You want to find a unique way to immerse yourself in your games: With its six fully customizable haptic zones, the Razer Freyja offers a new and exciting way to make games, music, and even movies more immersive than ever.

Don't buy if...

❌ You want a product that will be at its best out of the box with all games: While the list of Sensa HD games is rapidly growing, many games simply won't work as intended without resorting to the Audio-to-Haptics mode.

The Razer Freyja works with Bluetooth, as well as the supplied 2.4 GHz HyperSpeed wireless dongle. Setting up the gaming cushion is a breeze: it's as simple as plugging it in, powering it on, and connecting to it. It couldn't be easier.

The left side of the gaming cushion offers a simple control panel, with a power button, haptic intensity controls, and another button to change the source. I found these buttons a little stiff and difficult to press at times, but once you’ve got the gaming cushion set up there’s little need to use them, as everything can be controlled via Razer’s Synapse 4 software.

The Razer Freyja is an odd piece of kit that has somehow managed to surprise me. I’m seeing a lot of accessories these days to improve gaming immersion, including a device that literally lets you smell your games and movies. While most of these accessories have left me unimpressed, the Razer Freyja has given me hope that we haven’t reached the pinnacle of gaming immersion yet.

The Razer Freyja may have its flaws, though many of them stem from the fact that the technology is still in its infancy. But, with time and a few tweaks, it could easily become one of my favorite ways to further immerse myself in-game.

]]>
https://www.pcgamer.com/hardware/gaming-chairs/razer-freyja-review/ yFpJupt3V5SCTq7dHG4CLJ Tue, 19 Nov 2024 14:52:49 +0000
<![CDATA[ Lenovo Legion Pro 7i (Gen 9) review ]]> The Lenovo Legion Pro 7i makes a hard play for the best gaming laptop guide, and when you set it to full power it does so with flying colours. I'm not just talking about RGB-lit edges here. This chunky 16-incher has a heck of a lot going for it, with a sterling core config and great supporting spec, too.

Of course, there has to be a caveat, right? Something must be holding this beast back. It took a bit of testing, but aside from a base price tag to rival most RTX 4090-powered gaming laptops today, I think I've spotted it: gaming battery life. Seriously, what's a laptop without portability?

If you can lift its hulking aluminium mass out of the box, you'll find that the Legion Pro 7i chassis is not just thicc but beautifully understated: matte black with a silver Lenovo tag on the back, and Legion indented in larger letters on the opposite corner. The lid is soft close, and unlike the RedMagic Titan 16 Pro there are no over-the-top design elements giving away its gamer-ness. The back and sides have a distinct speckled look—like classy glitter—though you won't see it unless you get your nose right up in the vents.

None of it is offensive to the eye and, while some might be averse to the RGB that lights up the front edge and keyboard, you can always turn it off when your co-workers are looking.

Legion Pro 7i specs

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)

CPU: Intel Core i9 14900HX
GPU: RTX 4080 (175 W)
RAM: 2x 16 GB DDR5-5600
SSD: 2x 1 TB NVMe PCIe Gen4x4 (SKHynix_HFS001TEJ9X115N)
Screen: 16-inch | 2560 x 1600 pixels (16:10)
Refresh rate: 240 Hz
OS: Windows 11
Weight: 2.66 kg | 5.86 lb
Connectivity: Bluetooth 5.1, Intel Killer WiFi, 1x USB Type-C 3.2 Gen 2 (DisplayPort 1.4, 140W PD), 1x USB Type-C (Thunderbolt 4, DisplayPort 1.4), 3x USB Type-A 3.2 Gen 1, 1x HDMI 2.1 (8K @ 60Hz), 1x 3.5mm audio jack, 1x Ethernet, 1x USB Type-A 3.2 Gen 1 (Always On)
Dimensions: 363.5 x 262.1 x 21.95-25.9 mm | 14.31 x 10.32 x 1.02 in
Price: $2,729 | £2,584

There's a USB Type-A port placed on either side of the Pro 7i, as well as one of the USB Type-C ports, but all the other ports are located on the back—something that helps keep it looking clean on the desk, and it means wires don't get in the way of your mouse. Aside from the side ports, there's another USB Type-A on the back alongside a USB Type-C Thunderbolt 4 port, which doubles as DisplayPort 1.4. The other USB Type-C is a 3.2 Gen 2 port with DisplayPort 1.4 that also delivers an impressive 140 W of Power Delivery. A good selection of ports, then.

After four years of testing gaming laptops, you start to get a feel for them. If the sheer weight of the thing hadn't given it away, turning the Lenovo Legion Pro 7i on made it clear I was in the thick of premium gaming laptop territory. And boy, does this thing go.

One thing I did note after the initial setup is the popups. There's some bloatware installed, including Tobii and McAfee. Tobii runs in the background from startup and uses up over 150 MB of memory, which isn't ideal, so you might want to do some uninstalling when you first get hold of the Legion Pro 7i. That said, Lenovo's own included Arena software does a great job of keeping all my game libraries in one place, and Vantage is great for system monitoring and one-click overclocking.

Aside from that, just clicking around in everyday use I've been impressed by how speedy the Legion Pro 7i is. It's totally unfazed by the masses of tabs I have open in Chrome. That's down to the 32 GB of dual-channel DDR5-5600 RAM Lenovo has packed under the hood. Two sticks of 16 GB memory mean greater bandwidth, something that a lot of gaming laptops overlook. Lenovo has done no such thing with the Legion Pro 7i, and the attention to supporting components extends to storage, too. It's just frustrating that, in the cheaper 16 GB config, that means a single stick of memory rather than a pair of 8 GB sticks, which halves the potential bandwidth on offer.
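For a rough sense of why channel count matters, here's the standard peak-bandwidth arithmetic for DDR5, assuming a 64-bit (8-byte) bus per channel at the fitted 5,600 MT/s:

# Peak theoretical DDR5 bandwidth: transfer rate x 8 bytes per 64-bit channel.
transfer_rate_mts = 5600      # DDR5-5600, as fitted here
bytes_per_transfer = 8        # 64-bit channel width

per_channel_gbs = transfer_rate_mts * bytes_per_transfer / 1000
print(f"single channel: ~{per_channel_gbs:.1f} GB/s, dual channel: ~{2 * per_channel_gbs:.1f} GB/s")
# ~44.8 GB/s versus ~89.6 GB/s: the single-stick 16 GB config leaves half of that on the table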

Image 1 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 2 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 3 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 4 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 5 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 6 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 7 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)
Image 8 of 8

Lenovo Legion Pro 7i Gen9 gaming laptop

(Image credit: Future)

But the configuration I'm looking at includes two 1 TB NVMe SSDs. Not just your average proprietary storage either—these are SK Hynix PCIe Gen 4x4 drives, and they're acing every storage benchmark I throw at them. We're talking excellent bandwidth and some of the lowest storage access times I've seen in similar laptops.

Blaring out of the bottom are two 2 Watt Harman speakers, which I tested with a bit of Sleepnet. They're a bit tinny, but they handle deep, bassy synth really well and don't crackle when I whack them up to full volume. There are some lows missing, but they happily make the table shake.

When we get down to the core components, the Lenovo Legion Pro 7i is rocking a fantastic combo of Nvidia's RTX 4080 and an Intel Core i9 14900HX. That makes it a star in the CPU-heavy benchmarks, meaning the Pro 7i absolutely owns when it comes to rendering.

Out of the box I was getting roughly average frame rates for something of its weight class, but there's no discernible stuttering to note and the fans don't whirr up and blow my ears off throughout, so it's a pleasant gaming experience all round.

Turn on Thermal Mode and that 175 W GPU realises its full potential with an extra 10–20 frames per second, even stretching to another 30 fps in some instances, though it does push the components to their limit. When that 175 W RTX 4080 goes full blast, the cooling array is still able to maintain a decent GPU temperature, but the CPU tops out at about 100°C—hot enough to boil water—and you can bet it gobbles up the battery charge.

Which brings me to my main concern. Unplugged and under a heavy gaming load, the Legion Pro 7i only manages to eke out 40 minutes. That's one of the worst gaming battery life scores of this generation, and it even falls flat against most of the RTX 4090-powered gaming laptops we've tested this year.
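That 40-minute figure lines up with some simple arithmetic. Lenovo doesn't list the battery capacity in the spec box above, so treat the 99.9 Wh figure below as an assumption (it's the usual ceiling for desktop-replacement laptops), along with the ~150 W sustained gaming draw:

# Rough gaming-runtime estimate. Both figures are assumptions for illustration,
# not numbers from the spec sheet above.
battery_wh = 99.9          # assumed capacity, the typical flight-safe maximum
sustained_draw_w = 150     # assumed average system draw under a gaming load

runtime_minutes = battery_wh / sustained_draw_w * 60
print(f"~{runtime_minutes:.0f} minutes of gaming on battery")  # ~40 minutes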

Buy if...

✅ You're in need of hardcore rendering in a smallish package: The Legion is one of those productivity machines best suited for people who do a lot of rendering. Yes you can get some gaming done on it too, but what you're really paying for is that smashing 14th Gen Intel CPU.

✅ You're willing to forgo the masses of RAM and extra storage: In its 16 GB RAM / 1 TB SSD config, the Lenovo Legion Pro 7i starts to make far more sense price wise. It'll see you right for gaming, and save you the big bucks.

Don't buy if...

❌ You're looking to save money: There are far cheaper RTX 4080 machines out there that match it in terms of gaming benchmarks.

❌ You plan to use your laptop unplugged: The Pro 7i is what we call a 'desktop replacement' laptop. You're not going to get much gaming out of it unplugged, plus the weight alone is enough to put you off tucking it into a backpack.

The thing to remember is that this is a gaming laptop with an MSRP of $3,220/£3,060 in a sea of sub-$3,000/£3,000 machines of almost the same spec. For that price, you shouldn't have to compromise. Yes, you're getting a current generation Intel CPU backing this one up, but unless you're working as a game developer or in the film industry, this particular config is a little overkill for most gamers.

Of course there are configuration options, and it's already heavily discounted; if previous Lenovo Legion machines are anything to go by, I would expect that to become effectively its standard price. With white LEDs only it's £30 cheaper (though that's only an option in the UK). You also stand to save $200/£170 if you decide against the second SSD and opt for 16 GB of single-channel RAM. What I'm getting at here is that this is one configurable machine, and definitely more so in the UK.

The Legion Pro 7i is a lovely machine, but in this configuration you could spend far less cash to nab the similarly specced and super svelte, OLED-topped ROG Zephyrus G16—currently our favourite gaming laptop. And while it doesn't quite hit the same $3,600 premium of the newest Razer Blade 16 with the same CPU in it, our favorite 17-inch, RTX 4090-powered gaming laptop, the Gigabyte Aorus 17X AZG, comes in at the same price.

That said, the Pro 7i is often found with a huge discount on the Lenovo site, and is likely to be for some time. If you can forgo the extra memory and storage, it's well worth a look.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/lenovo-legion-7i-gen-9-review/ 3fVjjh2cVRKQUZ5tePiwtJ Mon, 18 Nov 2024 12:00:04 +0000
<![CDATA[ LEGO Horizon Adventures Review ]]>

What is it? A child-friendly adaptation of Horizon: Zero Dawn using the Lego formula
Expect to pay: £59.99/$59.99
Developer: Guerrilla, Studio Gobo
Publisher: PlayStation Publishing LLC
Reviewed on: Ryzen 7 7700X, RTX 4080, 64GB RAM
Steam Deck: Verified
Multiplayer? Offline and online co-op
Link: Official site

I’m not really sure what Sony or Guerrilla saw in the crossover potential of Lego Horizon Adventures. Lego and PlayStation's post-apocalyptic dino-hunting series don’t seem opposing, exactly, but before sitting down to play it I didn’t see any multiplicative Big Brand power in combining them either—especially when compared to Lego Harry Potter or Star Wars games.

Now that I’ve finished playing it, I see even less reason to justify Lego Horizon Adventures' conception. I honestly don’t know why this game exists, and frankly I’m not even sure the people behind it do either.

The Build Up

Aloy from Horizon Zero Dawn in Lego Form

(Image credit: PlayStation Publishing LLC)

Lego Horizon Adventures is a very loose adaptation of the plot of 2017's Horizon Zero Dawn (emphasis on loose). The story tackles most of hunter Aloy’s journey, including her battle with cult leader villain Helis and her attempt to understand cryptic tales of the "old world". A lot of the nuance of Zero Dawn’s story gets lost in the sauce here. Aloy’s outcast status within her tribe, a relatable and meaningful struggle for acceptance that I feel would have fit well in this family-friendly retelling, is a throwaway joke. Perhaps this should have been my first warning sign, as Lego Horizon Adventures would continue to flee from depth at every opportunity past this point.

A lot of the jokes didn’t land for me, but I'll cut it some slack for failing to make a 30-year-old man laugh. The writing, though, never ventures beyond the tired construction you've seen countless times in Dreamworks animated films. Haters of the "they’re right behind me, aren’t they?" style of Marvel-esque humour are in for a rough time. I think even its target audience will grow tired of its lazy writing by the time of the campaign’s conclusion—which is almost impressive, given how incredibly short this game is.

Mental Block

(Image credit: PlayStation Publishing LLC)

Unlike other Lego games, where characters tend to function more like skins rather than characters, Horizon Adventures' playable characters all have their own unique attacks and gadgets. For example, Aloy is ranged-focused with her bow whereas her companion Erend gets in close with his big hammer.

The levels feel exactly the same mere hours after you’ve started.

The issue is that the levels themselves don’t vary at all. Almost every single five-minute-long mission just tasks you with killing a bunch of enemies and traversing extremely similar maps, which makes all of the levels feel exactly the same mere hours after you’ve started.

It started to get so cut-and-paste that there were times I was convinced I’d somehow accidentally opted to replay a mission. There are some clever ways the game tries to disguise this blatantly short loop, as the tasks you’re given and the settings you complete missions in do change—but these tasks only result in different cutscenes, and the locales are merely visually distinct.

Combat is the videogame equivalent of snapping together 10,000 of the same basic 2x2 Lego brick: you’ll simply be spending most of your time spamming the attack button and dodging the odd projectile. The consumable gadgets you can find add a tiny and brief thrill of variety, letting you cast big AoE attacks or freeze enemies in place, but combat will still begin to feel repetitive early on regardless.

Enemy variety does steadily improve throughout, introducing new types of robot dinos every handful of missions. Tackling them does force you to mix up your offence, as you’ll have to attack some from behind or learn new patterns to dodge. Ultimately they can't keep the overly simplistic combat interesting for the entirety of Lego Horizon Adventures’ campaign, which is laughable considering its length.

Expand Your Horizons

Aloy from Horizon Zero Dawn in Lego Form

(Image credit: PlayStation Publishing LLC)

In the age of 100+ hour RPGs and never-ending live service games, a game that’s short but sweet can certainly have its charm. I’d take a focused, bite-sized adventure over a middling odyssey any day. Unfortunately I think Lego Horizon Adventures is the worst of both worlds in this respect. It's repetitive, not even trying to do anything creative with the Lego game formula, and you can comfortably complete it in a single day. Usually I don't care how long a game is relative to its price, but this one's $60 and won't even keep a kid's attention for more than an afternoon.

Even with a casual fondness for the Horizon series, I found myself getting bored after two hours. So, who exactly can I even recommend this game to—12-year-olds who love both Lego and Horizon? Adult Horizon fans who want a co-op game to play with their kids?

The character animations are all extremely smooth and the flashy particle effects easily make this the best-looking Lego game I’ve ever seen.

Maybe just people who really love looking at virtual Lego bricks instead of real ones. To be fair the game looks incredible on PC: I’m sure this is a benefit of the stop-motion effect that Lego Horizon Adventures is mimicking, but the character animations are all extremely smooth and the flashy particle effects easily make this the best-looking Lego game I’ve ever seen. A low bar perhaps, but one that the game comfortably clears.

Performance on PC was mostly fine too. I had all settings on max at 4K with the framerate uncapped, and it was flawless outside of a handful of stutters. These stutters seem to be directly tied to enemy deaths and only occurred when I defeated multiple machines in a single attack, most likely due to the resulting Lego brick debris that flies about when an enemy dies.

But outside of a blind Horizon obsession or an infatuation with digital bricks, there's little in Lego Horizon Adventures to appreciate, and even then I don’t think fans of either will come away particularly satisfied. Mildly entertained for an evening perhaps, at best.

]]>
https://www.pcgamer.com/games/action/lego-horizon-adventures-review/ TQzHYQR9yYs6MCWntHbDqV Fri, 15 Nov 2024 22:38:36 +0000
<![CDATA[ MSI Titan 18 HX A14V review ]]> What is the purpose of a gaming laptop? A simple question, and one that'll have different answers depending on who you ask. Most, myself included, will argue that a gaming laptop is for those who desire the power of a gaming PC in a chassis that you can take on the move.

That's not all gaming laptops, however. Some fall into the "desktop replacement" category, and these are built to serve a different purpose. For these mighty machines, the idea is that you could move your PC around with you, but most of the time it'll be chained to a desk, and as such the designers prioritise high-powered components, big screens, and little else.

They're a bit of a dying breed, as these days it's perfectly possible to have a supremely powerful laptop that's still capable of being shoved in your bag and taken on the train, perhaps with a little bit of back strain as a result.

No-one seems to have told MSI. Because, in the year of our lord 2024, it's still pumping out new versions of its Titan line. We reviewed one of the previous models, the MSI Titan GT77 HX, and gave it a score of 53%—thanks to its incredibly loud fans, ugly chassis, and ludicrous price, among other things. Now, the MSI Titan 18 HX A14V is here. So, have things improved?

MSI Titan 18 HX A14V Specs

The light-up MSI logo on the lid of the Titan 18 HX.

(Image credit: Future)

CPU: Intel Core i9 14900HX
GPU: Nvidia GeForce RTX 4090
Graphics power: 175 W
Memory: 128 GB DDR5-5600
Storage: 6 TB across 3 x 2 TB Gen4 SSDs
Resolution: 3840 x 2400
Refresh Rate: 120 Hz
Network: Intel Killer BE Wi-Fi 7
OS: Windows 11
Dimensions: 404 x 307.5 x 24-32.05 mm
Price: $4,800 | £4,800

Price-wise? Not really. The GT77 HX could be found for $5,300, and this new model retails for the same, although at the time of writing it's reduced to a much more reasonable $4,800/£4,800—which is surely a bargain if you ask me.

Just kidding, that's still a ridiculous amount to pay for a gaming laptop. Still, you're getting quite literally the most highly-specced laptop money can buy right now when it comes to internal components, and that seems to be the whole reason for this machine's existence.

So, for the CPU, it's the 24-core Intel Core i9 14900HX, paired with a truly monstrous 128 GB of DDR5-5600. Yep, I did a double-take, too. Who needs 128 GB of RAM in a gaming laptop? Almost nobody, but again, need is not the point. It's here, and you're paying for it.

It'll therefore come as no surprise that the GPU is the 175 W version of the mobile RTX 4090, a model we've continually criticised for being virtually pointless in laptop configurations thanks to the exceptional amounts of heat it produces. That means you're usually paying for performance you can't actually use, thanks to its tendency to overwhelm even the largest of mobile cooling solutions.

The MSI Titan 18 HX on a desk.

(Image credit: Future)

The MSI Titan 18 HX shot from the side.

(Image credit: Future)

The baby blue venting inserts on the back of the MSI Titan 18 HX.

(Image credit: Future)

A hand holding the MSI Titan 18 HX, showing the massive side vents and the huge amount of effort it takes to hold one aloft.

(Image credit: Future)

The MSI Titan 18 HX, on a desk, with its massive power brick alongside.

(Image credit: Future)

At first glance, however, you'd be forgiven for thinking the sizable chassis gave the Titan 18 HX at least half a chance. It's covered in truly gigantic vents big enough to wedge a finger in, and the deck feels like a slab of granite thanks to the thickness created as a result. I say slab of granite, but I only wish it felt as resistant to damage.

The weight is here, but the build quality is decidedly not. Flipping the laptop over (with some considerable effort, given that it weighs nearly eight pounds) reveals some incredibly hollow-sounding plastic protrusions on the underside, presumably to keep all those scalding hot components away from your legs.

They work, as it happens (and we'll get on to heat later, believe me), but the plastic feels tinny and worryingly breakable.

First impressions disappoint elsewhere, too. The base upon which you rest your wrists is decent enough (although incredibly fingerprint prone), but it contains a totally flush RGB trackpad with no edge whatsoever, and the lighting effect is horrible. It looks washed out, and feels worse.

The RGB trackpad of the MSI Titan 18 HX.

(Image credit: Future)

The click underneath has the same sort of tactile feedback and noise as pushing down on plastic food packaging, with a tacky ping that makes you feel like you're constantly breaking it, rather than clicking down on something substantial.

I could go on, and in fact I will. The whole theme here appears to be red, black and grey—apart from some venting inserts on the rear, which are baby blue. They look like they were grabbed from an entirely different machine, and had me staring at them with a look of genuine confusion for some time.

Most egregious of all when it comes to fit and finish, however, is the lid. Contained within is a 2400p 120 Hz Mini-LED screen, a fragile and rather good-looking panel that you'll want to keep safe from harm's way. No such worries when it comes to the outside of the lid—it feels sufficiently protective. But in an open position, it… tell you what, there's no point explaining this, let me just show you instead.

I'm sorry MSI, I really am, but that's completely unacceptable even in a budget laptop, never mind one at this price. I took this machine into the office with me once (just to prove that it could be done) and I spent the entire time fretting that the worryingly creaky chassis was going to result in a dead panel once I'd arrived.

It survived, for the record, although my back barely made it. As a desktop replacement model, you could argue that this was an unfair test, but I wanted to see what it would be like to buy one of these laptops and use it as a laptop—and the answer is, you simply wouldn't.

I know what you're thinking at this point—sure the chassis might be rubbish, but this is a machine designed to dominate the benchmarks. So here's the part where the Titan really comes into its own.

Sort of. Sometimes. In certain benchmarks. Perhaps. Looking at the 1080p real-world gaming figures, the Titan 18 HX manages to pull ahead of its elder sibling quite significantly in most cases, although there are outlier results. In Cyberpunk 2077 it's a frame slower, and I had some issues with the Metro Exodus Enhanced Edition benchmark tool.

Lows were, well, very low, and the frame timing was all over the place. The Titan performed smoothly everywhere else, so I'll write that off as an issue I couldn't track down despite multiple driver and game reinstalls.

At 1440p, the Titan 18 HX lags behind the Titan GT77 HX in Cyberpunk by a single frame once more. Other than that, however, it's much the same story. Speedier than the competition on average, sometimes significantly—and in F1 22's case, by a considerable margin.

In both the 1080p and 1440p tests, the Titan boosted itself to hypercar-levels of performance in this particular game, throwing all our previous RTX 4090-wielding laptops into the barriers.

At 4K, however, the wheels well and truly come off. While the Titan 18 HX keeps its F1 22 lead by a solitary average frame, in every other game it finds itself outmatched by the competition.

So what's going on here then? Well, at 4K you really are leaning on that mobile RTX 4090 quite heavily, and those of you that have read our RTX 4090 laptop reviews before will know where I'm going next.

At "extreme" performance settings (what else would you keep a laptop like this set to while plugged in?), the fans ramp up to hilariously loud levels—but they still can't stop that GPU from throttling back to save itself from melting a hole through the middle of the Earth.

It's the same story in the rendering benchmarks. At best, it slightly edges ahead of its rivals in the odd test, at worst, it lags behind the older model. These components simply do not wish to run flat out in a laptop chassis, even one this large—and that Core i9 CPU kicks the cooling system into high-gear panic in no time at all.

Speaking of fan noise, I know this is something that's complained about with every ultra-high spec laptop under heavy load, but the Titan 18 HX really is something to behold.

On my office adventure, I had to stop myself from benching this laptop at my desk lest it annoyed the entire floor with its jet-engine-like roar. I know this sounds like overstatement, but it really is loud to the point of embarrassing. I've included another video clip here, because you really need to hear it for yourself to get the point across.

I'd suggest turning the sound up for the full effect.

Battery life is, well, the sort of battery life you'd expect from a desktop replacement. I have to say that under real world usage, you can definitely get a few hours out of it, but give it the chance to spin up those fans (and it'll take any excuse it can get) and battery life plummets considerably.

There are yet more day-to-day teething issues that I can't let go, and seeing as I'm on a roll here I'd better list them. The keyboard, for one. This is a SteelSeries unit, complete with the SteelSeries text printed proudly on the top deck. I quite like SteelSeries keyboards, but the only thing reminiscent of a good SteelSeries keeb here is the font.

It crunches, pings, and pops, with the sort of resistance that makes me physically screw my face up as I type. It's mostly mechanical at least, by which I mean there's proper mechanical travel on most of the keys—apart from the odd few, like the Escape, the left Ctrl, and the arrow keys, which feel like scissor units.

To keep costs down, I would imagine. Got to save those pennies somewhere, ey?

Hear that springy pinging? Yep, once you've noticed it, there's no un-noticing it. Another personal record broken here, as I've never heard a laptop keeb sound quite as loud as this one.

The speakers are dreadful, too. I put the Titan side by side with my partner's default HP work machine, and somehow the cheap and cheerful model managed to exceed the volume of the big bruiser by a country mile, and sounded better balanced, to boot.

They're hollow units, and that lack of volume is going to be a real problem when those fans kick into high gear. In short, if you're buying one of these, you're going to need a headset. A properly loud one, ideally with noise cancelling.

The Mini LED panel found in the MSI Titan 18 HX.

(Image credit: Future)

This is all sounding very negative, isn't it? I'd better mention that screen once more for balance: it really is a lovely panel. It's just a shame it's only 120 Hz, which seems like an odd omission given everything else here is specced to the gills. The previous Titan featured a 17-inch 144 Hz Mini-LED display, so I can only imagine that 18-inch high-refresh panels are difficult to get hold of.

Still, it's a good-looking 4K display. It's just a shame that, err, 4K performance is where the Titan falls down compared to similarly-specced machines.

So then, what to say about this behemoth of a gaming laptop? In certain benchmarks it's the fastest we've ever tested, and for that it deserves a certain degree of acclaim. However, almost every other desirable facet of a modern laptop has been sacrificed as a result.

It's not slim nor sleek. It's not a premium feeling object. The battery life is rubbish, the keyboard is crunchy, and the trackpad looks awful—with a click that feels worse.

MSI Titan 18 HX, with banana, in repose.

(Image credit: Future)

It's loud, cantankerous, unwieldy, and at 4K it still gets beaten out by the Lenovo Legion 9i. That's a laptop with phenomenal build quality, a sleek form factor, a superb keyboard, and so many other desirable aspects that it's simply no contest as to which you'd rather have. Lenovo has released a refreshed version since our review that should do you rather nicely, and you can often pick one up for well under $4,000. That's still a silly amount of money for any laptop, but it's massively cheaper than the MSI. The price of a nice mini-break for two—saved.

Buy if...

You absolutely must have the top components: If you really, really need stonking amounts of RAM, a super-powerful GPU and all the trimmings, there aren't many options. The Titan 18 HX is certainly one of them.

Don't buy if...

You have any concerns about money at all: For much less, you can get better. Much better.

You value build quality and nice materials: While the Titan has a weight that suggests quality, the rest of the plastics suggest something far cheaper.

You like being able to hear yourself think: It's the loudest gaming laptop I've ever heard under load, and that's quite the achievement.

Should an 18-inch lappy be what you desire for desktop replacement duties, why not take a look at the Asus ROG Strix Scar 18? It's also horrendously loud, subtle as a brick, and angular to a fault. But having personally tested and reviewed both, I'd take the Asus all day long. It's an old-fashioned feeling machine, but at least it has build quality to match—and can often be found cheaper, too.

I just can't get my head around why MSI keeps making the Titan. Gaming and creator laptops have come such a long way in the past few years, and so much of that is down to taking powerful components and balancing them with the day-to-day niceties that make a premium object such a wonderful thing to behold. A thing of desirability, worthy of serious levels of cash for serious levels of luxury.

Fast, the Titan 18 HX A14V may be. But luxurious? Not a bit of it. If I was being generous, and at this point I feel I have to be for balance's sake, I like that it no longer has a large rear lip like the previous model. That feels like an improvement, and I've reflected it in the score.

But otherwise? It might be a performance monster compared to many, but charging this amount of money for a laptop with this little refinement is asking for trouble. I should be desperate to hang on to a machine with this kind of firepower inside it, but honestly? MSI, you can have it back.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/msi-titan-18-hx-a14v-review/ cd9qZNrEcFNuRbpufo6haL Fri, 15 Nov 2024 17:40:41 +0000
<![CDATA[ Xenotilt: Hostile Pinball Action review ]]>
Need to Know

What is it? Eyeball-searing cyberpunk pinball bliss.

Expect to pay £12.79/$14.99

Developer Wiznwar, Flarb LLC

Publisher Flarb LLC

Reviewed on Intel i9-13900HX, RTX 4090 (laptop), 32GB RAM

Steam Deck Verified

Multiplayer? No

Link Official site

I thought I could handle this. The developer's previous "occult pinball action" hit, Demon's Tilt, has been my go-to Steam Deck game for as long as I've owned Valve's portable. I knew I was in for another round of traditional pinball skill mixed with shmup-like sprays of bullets—been there, done that, got the pentagram-decorated t-shirt.

But no, I was initially as overwhelmed by Xenotilt's riot of colour, fireworks, and pixelled retro cyberpunk style as anyone. There are energy scythes, cyber-scorpion women, and dragon-adjudicated billiards here. I can pick up ammo drops and fire bright pink snaking lasers at swarms of points-awarding monsters. It's basically the pinball game I imagine NPCs from Cyberpunk 2077 playing.

Xenotilt has just one table, but it's so big it's essentially three different pinball machines stacked on top of each other, each zone feeling like a section of a derelict spaceship with multiple states, moving features, and animated flourishes. I never thought I'd want auto-firing turrets in my pinball games, but now I've been given a taste of heaven I'm not sure I could ever go back.

So I spent most of my very short first few attempts admiring all the lights, sometimes with my monitor tilted on its side so I could bask in the game's vertical glory, acquainting myself with the Game Over screen, and feeling… pretty good about it all, in spite of my poor performance. Even the smallest score multiplier awarded or most basic multiball mode activated sets off a wave of screen-sized celebrations. Xenotilt is always happy to throw a party or 10 in my honour, no matter where I end up on the high score table at the end of it all.

Time, practice, and determination eventually cut through the awe I had for the game's gorgeous bloom lighting and special effects (both thankfully extremely customisable, leaving me to decide just how distracting I want them to be). I finally spotted the skill shot ramp, and felt pleased with myself when I managed to coax the ball into landing somewhere near it. I started looking for specific lanes and labels, eager to claim the most recent jackpot I'd earned—or strategically add another ball to the perk-granting multiball matrix on the side of the screen.

Curving purple lasers arc across the pinball table.

(Image credit: Flarb LLC)

I never thought I'd want auto-firing turrets in my pinball games, but now I've been given a taste of heaven I'm not sure I could ever go back.

This matrix is a three-by-three grid, each space granting a unique ability (from straightforward things like more ammo to a vacuum-like effect on enemies, as if they're being sucked out of an airlock). They're all helpful, and there is no one right way to fill it up. Spread the balls evenly across all three columns and you'll get a great selection of basic boosts. Stick to one for a narrower but eventually more powerful array of abilities—and the chance to hit the multiball activator afterwards, unleashing these stored balls onto the table and causing points-scoring mayhem. But—of course there's a but—do you wait until you've filled all three, or play it safe and release the multiballs as soon as possible? And do you want to at all, considering you only get the matrix's perks when the balls are resting in their slots?

In spite of the outlandish nature of some of Xenotilt's features—it's not every day you play a pinball game with a combo meter, or one that actively fights back when you whack it—the game's reactions are always learnably consistent, and the table isn't as harsh as its menacing retro-cyber style and taunting vocal commentary (such as the somewhat disgusted "I expect more!" after the final ball drops into the gutter) might imply.

Every major potential point of run-ending failure has some subtle sort of limited-time or limited-use safety net—ready and waiting to stop a game from abruptly ending to a single moment of bad luck. And relative to conventional pinball games, the threshold for using the directional tilt is extremely lenient. It's practically expected in some areas of the board, as when I need to nudge balls into the desired left lane on the second tier—either sending the ball shooting to the top of the table or back around for some more points-milking in the same area.

A pinball table wearing the face of a skeleton glares down at the feeble efforts of the player.

(Image credit: Flarb LLC)

Crisis mode is so different it feels like a whole new game, even though it's still using the same table layout as every other variant.

By the seventh or eighth session, the deeper layers of this evil metal onion were revealing themselves. Scoring and surviving are important, but I was starting to figure out that Xenotilt is all about cycles of objective completion. And getting a bit lost in the jargon too, if I'm honest. At least I could take a peek at the quick help section if I wanted to go over the basics (like "What the heck is a 'charge pop' anyway?"). Every loading screen comes with a useful tip to try and keep in mind, the pause menus describe the multiball matrix bonuses in full, and whenever I try a new mode I'm greeted with a short explanation of its core features.

Xenotilt's EX variant includes score-enhancing survivors to permanently unlock and recruit, as well as "tri-gifts"—powerful effects only awarded after performing a "core overkill" on one of the table's bosses. Hardcore mode's most obvious change is the use of slightly smaller flippers, which demand tighter ball control and expert use of the tilt function to make sure the single life I'm given lasts for as long as possible. Crisis mode is so different it feels like a whole new game, even though it's still using the same table layout as every other variant. This is a high-pressure time attack mode with a completely different set of priorities: the only thing that matters here is surviving as long as possible, and to do that I have to risk it all by going after the time-adding targets on the table.

A shower of points hits the table all at once.

(Image credit: Flarb LLC)

There have been times I've started a new game so fast after finishing the last one that I didn't even take in my score.

Whatever mode I'm playing, it's always tense and exciting and I feel like my knowledge and abilities are being tested in an interesting way. For every one thing I master there are another dozen new challenges waiting in the wings, ready to send my scores higher than ever before—if I can get the hang of them. On very good runs I end up with random bonuses that dwarf scores I used to need three balls to earn. Not that I always notice my final tally: there have been times I've started a new game so fast after finishing the last one that I didn't even take in my score. Seeing where I've landed on the leaderboard could wait for later, because in those moments I just needed to play more Xenotilt.

Pinball paradise

A bright yellow explosion hits the centre of the pinball table.

(Image credit: Flarb LLC)

Xenotilt isn't just pinball for people who already know they like pinball games. It's pinball for people who love the thrill of seeing arcade-style skill converted into raw points. It's pinball for people who like the thought of suddenly having to attack the exposed heart of a cyber dinosaur skeleton that's busy pummelling a metal ball with neon pink breath lasers and ice attacks. It's pinball for people who dream of playing a table that lights up like the Terminator's Christmas tree. It's even pinball for people who think they don't like pinball, the game different enough—and encouraging enough, thanks to the help on offer and the presence of a challenge tracker that always seems to be on the cusp of recognising another achievement—to offer an attractive, understandable, and unusual approach to the genre.

The fact that a great game lasts mere minutes and a full restart takes just a few seconds makes this the perfect game for all occasions, whether I've got 15 minutes and a Steam Deck in hand, or an entire free evening in front of my PC.

I should warn you though, if you're anything like me you can kiss your life goodbye once you start playing.

]]>
https://www.pcgamer.com/games/xenotilt-hostile-pinball-action-review/ DabjNeuR6nahhiuF8WAQsf Fri, 15 Nov 2024 15:48:27 +0000
<![CDATA[ Void Sols review ]]>
Need to Know

What is it? A minimalist soulslike starring a triangle.
Release date November 12, 2024
Expect to pay $20/£17
Developer Finite Reflection Studios
Publisher Modern Wolf
Reviewed on Gigabyte G5 (Nvidia RTX 4060, Intel Core i5 12500H, 16GB DDR4-3200)
Steam Deck TBA
Link Official site

Oh God dammit. I was so looking forward to opening this review with ‘Void Sols? More like Avoid Sols!’ and then going down in history as the funniest game critic who ever lived. But then the selfish developers of this smart little soulslike had to spoil it all by making yet another great game in everyone’s favourite/most inescapable genre. Boo!

You play a little white triangle, which I like to imagine is the ship from Asteroids after a very unfortunate crash landing. Because a sinister collection of bitter shapes occupy this land, seemingly having deserted the Geometry Wars to spend around a dozen hours trying to slay you instead. Thanks to this game, I now know what a hexagon hating me looks like. So that’s fun.

Geometric animosity aside, the first thing that truly wowed me was the game’s phenomenal lighting. Void Sols is a dark game that may plunge you into inescapable depression on an OLED monitor. It’s up to you to brighten things up—well, a little—by lighting torches you find as you cautiously move through its mazes. The ancient soulslike trick of hiding enemies around corners gets a much needed refresh here, as it's now paths of light that are your true foe. Sometimes your stupid triangular body is blocking the light, hiding an enemy right in front of you. Move too quickly around a pillar and you may discover that the darkness on its other side was concealing several angry dagger-wielding squares. Eep.

You enter each new area completely on the back foot with no map or hint of what’s coming. Progress is made inch by inch as you light up torches, which also give you a teeny-tiny amount of text telling you where you are (e.g. Prison Cell, Torture Chamber, Somewhere Else Horrible, etc) which is about all the surface-level story you’re getting. Stealthing past foes is essential at first, but combat’s sharp and satisfying when you do have to break out your sword. Hopefully you’ll eventually find a map for the area, giving you a fighting chance to see exactly where you are and plan out an escape route.

Except whoever drew these maps clearly drank their way through cartography school. They show you only the very basics of an area, and item locations that don’t seem to be obtainable by following the paths. This is far from a complaint. I love a game that gives you enough info to get by but still knows how to keep a secret. Essentially every area is a maze, full of dead ends, destructible walls with treasures hidden on the other side, and clever navigation puzzles that reward smarter strategies than just smacking your blade against the infrastructure. I’d have appreciated the ability to label stuff on the map rather than relying on my horrible memory to recall certain locked doors. But who ever heard of a triangle with the ability to doodle on a map? That'd be just plain unrealistic.

(Image credit: Finite Reflection Studios, Modern Wolf)

Combat is all about dodging, though you can find shields later if you want to mould it into a more traditional block/parry experience. In fact, the more time you spend scouring Void Sols’ areas for secrets, the more tools you’ll find to fit your favoured soulslike. I found a relic that restored lost health Bloodborne-style when I hit an enemy back in time, and then could never bring myself to take it off. Your starting sword is a decent shape-slayer, but katanas, hammers, maces, oh my, all have vari… varied playstyles that are fun to… to… zzz…

…Huh? Oh, sorry! It’s hard to stay awake when writing a paragraph that could describe, well, practically any soulslike. This is yet another game where you lose all your levelling up currency when you die and have one chance to get it back. Where combat is all about carefully watching enemy behaviour and waiting for an opening. Where god damn multiphase boss fights are somehow still seen as acceptable game design. Void Sols is more a lick of paint over old staples than something truly revolutionary.

But it's a very pretty lick of paint. Prisons, forests, mountains—I’ve explored these videogame locations countless times, so it’s to Void Sols’ immense credit that they look so striking and feel even somewhat novel again here. The cold, dark mountains, where occasionally you’ll see the distant orange glow of a bonfire but will more often only be able to make out your immediate surroundings and a plummeting red bar as the cold digs into your health, are wonderfully atmospheric. Nice to be reminded that a mountain should be a hostile challenge again, and not just a pointy bit on a world map.

(Image credit: Finite Reflection Studios, Modern Wolf)

Its prisons are evil labyrinths of locked doors and nonsensical routes to nowhere-but-pain that are immensely satisfying to overcome. I could see a pitch document in the developers’ hard drive somewhere saying ‘Top-down 2D Dark Souls demake’, but towards the end it gets more surreal and ambitious, playing on your perceptions of how a game that looks like this is meant to behave.

I just wish there was more of that. It’s not Void Sols’ fault that there are now more soulslikes than there are people on Earth, but it does mean that the more conservative ideas are very, very overfamiliar—even if there are some clever quality-of-life improvements here I’d like the entire genre to be made to incorporate by law. I despise finite items in these games because knowing they’re gone forever discourages using them and means you never get to master them either. Here, any item you find and use is replenished whenever you find a resting point. You can level up speed, strength, dexterity and health, and all four of those stats can be reset and reassigned too. Likewise, all your weapons are levelled up simultaneously, and you can swap the stat boosts and buffs out as you see fit. It's the kind of stuff I wish was industry standard.

(Image credit: Finite Reflection Studios, Modern Wolf)

Void Sols threatens to go off the boil towards the end as the excellent pacing and difficulty curve start spiking nastily. Bizarrely, the third-from-last boss fight is the hardest. It’s an absolute pig of a fight with multiple phases (ugh), and one that grew really fond of constantly crashing my PC during its second phase (ugh!). Right when I was about to win too! OK, no I wasn’t, but the fact this fight is immediately followed by another boss, and then an area that chucks enemies at you remorselessly, feels suspiciously like a game padding out its runtime.

It recovers in its final stages, presenting a still meaty but fairer challenge that remembers what the game does best—sticking you in a pitch black maze and watching you fumble around, slowly solving its secrets. I can’t wait to scour it properly for everything I no doubt missed as I embark on my next run. Void Sols? More like Void So-Worth-Checking-Out-ls! HA!

]]>
https://www.pcgamer.com/games/action/void-sols-review/ WJ6dzT5etUhyCRMJXaTDH6 Thu, 14 Nov 2024 18:03:51 +0000
<![CDATA[ Logitech Pro X Superlight 2 Dex review ]]> I was initially quite confused by the Logitech Pro X Superlight 2 Dex. It's basically the older Superlight 2 with a new shape, complete with the same sensor, the same figures, and the same specs. Given the original mouse launched over a year ago now, it feels strange to put out an update that's essentially the same thing, only a little different. However, this is not only a good excuse to look back on the original Superlight 2 but also to reflect on its Hero 2 sensor, which has since seen an update that makes it even better.

The Logitech Pro X Superlight 2 Dex has a name that's more flashy than the mouse itself. Mine came in black, with only a few light white accents on the mouse wheel and a clear plastic Logitech G logo towards its base. This is all to say that the mouse is understated, like much of Logitech's other gear, which is relatively rare for mice aimed at pro gamers.

The most distinctive part of this mouse, aesthetically, is where it gets the name 'Dex' from. The mouse's shape curves to the left, to better accommodate right-handed gamers from an ergonomic point of view. The previous Superlight 2 is functionally ambidextrous (though the Logitech G Pro 2 Lightspeed does a much better job of that thanks to its customizable side buttons) whereas this one is a little uncomfortable to hold in the left hand.

Luckily, it makes up for the lack of an ambidextrous shape by fitting super comfortably into my rather large hands. That extra bit of support can allow for a more comfortable hand position, leaning some of the weight of the palm against the base. I really started to notice this positive change in longer gaming sessions especially.

Logitech Pro X Superlight 2 Dex specs

Logitech Pro X Superlight 2 Dex from the side with the 2.4 GHz receiver

(Image credit: Future / Logitech)

Buttons: 5
Connectivity: 2.4 GHz wireless, USB-A to USB-C wired
Sensor: Hero 2
Max DPI: Up to 44K
Weight: 60 g
Max acceleration: 88 G
Max speed: 888 IPS
Polling rate: Up to 8K
Battery life: Up to 95 hours
RGB lighting: No
Price: $159 / £150

Using the Superlight 2 Dex as my primary mouse in a litany of games, I found it handled the more granular and precise clicking of puzzle sniper game Children of the Sun with ease and thrived in twitch shooters, such as Counter-Strike 2 and Call of Duty: Black Ops 6.

The Hero 2 sensor is now capable of a max DPI of 44K and a max speed of 888 IPS, which means more erratic movements are picked up accurately and with ease.

Though you can boost the polling rate all the way up to 8,000 Hz, which can shave fractions of a millisecond off your input latency in most instances, I rarely found the need to. This mouse is snappy and responsive at the default 1,000 Hz. The sweet spot for many will likely be somewhere between here and 4,000 Hz.
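If you want a feel for where that saving actually comes from, here's a minimal back-of-the-envelope sketch (my own maths, not anything from Logitech): it only models the gap between reports, so treat the figures as an upper bound on what a higher polling rate can buy before the game, display and wireless link add their own delays.

# Rough polling-interval arithmetic: the mouse reports once every
# 1 ms at 1,000 Hz and once every 0.125 ms at 8,000 Hz, so the
# worst-case wait for the next report is one interval, the average roughly half.

def polling_delay_ms(rate_hz: float) -> tuple[float, float]:
    """Return (worst-case, average) report delay in milliseconds."""
    interval_ms = 1000.0 / rate_hz
    return interval_ms, interval_ms / 2.0

for rate_hz in (1_000, 4_000, 8_000):
    worst, avg = polling_delay_ms(rate_hz)
    print(f"{rate_hz:>5} Hz: worst ~{worst:.3f} ms, average ~{avg:.3f} ms")

Going from 1,000 Hz to 8,000 Hz trims the worst case from 1 ms to 0.125 ms, which is why the difference is real but very hard to actually feel.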

Despite it being a bit overkill, it's a nice addition to the mouse, especially when you consider the old Superlight 2 is also getting the same polling rate upgrade. The 8,000 Hz polling rate is not achievable wired, so this mouse is at its best without wires.

Importantly, the reason I opted for something a little less than the 8,000 Hz polling rate regularly is that it negatively affects the battery life. At 1,000 Hz, you get just over 95 hours of charge, which is frankly great for a mouse of this calibre. Though the spec sheet on Logitech's site quotes 95 hours, Logitech's software regularly gave me a bit more than this.

The Razer DeathAdder V3 HyperSpeed, which is the best gaming mouse right now, gets just a little more than that, meaning the Dex sits right about where it should in the market.

Logitech Pro X Superlight 2 Dex from the front

(Image credit: Future / Logitech)

Logitech Pro X Superlight 2 Dex from the back

(Image credit: Future / Logitech)

Logitech Pro X Superlight 2 Dex next to the standard Superlight 2

The original Superlight 2 on the left and the Superlight 2 Dex on the right (Image credit: Future / Logitech)

Unfortunately, once you have got this mouse all plugged in and ready to go, you will want to update it to get access to all those stats. To update the Hero 2 sensor, you go through Logitech's own G Hub software, where you can also set custom controls and profiles. This mouse doesn't have an easily accessible DPI switch, and makes up for it by letting you set a custom DPI for individual games.

Frustratingly, the G Hub is the worst part of setting this mouse up, consistently freezing or running into loading issues. I found Logitech's software crashed or otherwise became difficult to work with on three separate rigs, one of which required a clean reinstall to get things running again.

Once working, this software does what it should but the hurdles to get it working can be occasionally rather frustrating, especially when you really just want to get an update and never touch the G Hub again.

Once this is all sorted, the Dex performs very well, partially down to a very nice design and weight. At just 60 grams, this thing can glide across a desk super easily but isn't so light that I found myself accidentally moving it when I merely placed my hand down. If it did happen to get caught in movement and knocked off a desk, I never worried that it would break, as it feels surprisingly sturdy given its size.

xCount vs time graph in Mouse Tester

(Image credit: Future / Logitech)

xVelocity vs time in Mouse Tester

(Image credit: Future / Logitech)

Interval vs time graph in Mouse Tester

(Image credit: Future / Logitech)

Above: Tested at 1,000 Hz — The more erratic the dots are, the worse the tracking on a mouse is.

The only place this doesn't feel quite as sturdy is in the mouse wheel. Scrolling is super smooth but clicking the button itself feels a little mushy and inconsistent. The left and right click, thanks to the Lightforce switches, feel responsive, clean, and clear, yet the scroll doesn't feel quite as satisfying.

Picking up the original Superlight 2, this scroll wheel feels practically identical, and hints at something worth noting when looking at the Dex. Both current Superlight 2 models are functionally the same, with an equally impeccable sensor, but an equally off-putting scroll wheel and price tag.

Buy if…

✅ You really want that high DPI and polling rate: 44K DPI and 8K polling rate are undeniably impressive stats, but perhaps unnecessary for many. 


✅ You want something very light: Superlight is a very fitting name for this mouse as not only is it light in weight but its design makes it feel even lighter.  

Don't buy if…

❌ You like a minimalist mouse: Though the lack of a DPI switch and only a few buttons might suggest otherwise, this mouse's reliance on software and highly customisable stats make it a little finicky initially. 


❌ You want a flashy mouse: With only a little light to show when it's charging or turned on, this has very little RGB.  

The main hurdle the Superlight 2 Dex faces on the market right now is its price point. This is a very capable mouse that's easy to use and comes with great stats across the board. However, with so many of our favorite mice being cheaper to buy at their MSRP and a little older (meaning they're more prone to be seen in sales), it's hard to see the Dex really converting any gamers who aren't already into Logitech's design language and shape.

The max stats are impressive and just about noticeable in use, but you need to take gaming very seriously, and be at a certain skill level, to get real use out of them (if at all).

This mouse has better specs on paper than the likes of the Turtle Beach Burst Air II but is more expensive and has a worse battery life. That trade-off is true of many of the best mice and will be what you have to keep in mind if you plan on picking one up.

The Logitech Pro X Superlight 2 Dex is an excellent mouse in a market filled to the brim with fantastic alternatives. If you've had your eyes on the Superlight 2 but don't like the shape, this will do everything you want out of it, and it will do it quickly and smoothly, but you are paying extra for a speed that you may not even notice in the heat of battle.

]]>
https://www.pcgamer.com/hardware/gaming-mice/logitech-pro-x-superlight-2-dex-review/ 7e4zxP3TXktxe2JKznLQR5 Thu, 14 Nov 2024 17:14:19 +0000