‘KRANE’ CONFIRMED AS A PURE CHROME OS TABLET COMING SOON BY LENOVO

‘Krane’ is a Chrome OS device in development that we’ve not talked about too often. As a matter of fact, we mentioned it upon its arrival in our consciousness in April of this year and once again as Gabriel found it would be only the second Chrome OS device to make the questionable decision of ditching the headphone jack. Other than those two posts, we’ve not really tracked anything unique about this device, but we have a few more clues to add to the pile that tell us a bit more about this upcoming tablet-only device.

First up, let’s get the clues from the Chromium Repositories out of the way. We have two things of interest here. On one hand, we have proof that, in lieu of a headphone jack, Lenovo looks to be ensuring support for a USB Type-C to 3.5mm headphone converter. That bit of proof comes with a clear reference to Lenovo, so I suppose it’s a 2-for-1 clue. Second, we have proof that they will be removing the POGO pin support we expect most of the ‘Kukui’-based devices to ship with.

Let’s start with the latter. POGO support is being removed from ‘Krane’ and if you’ve been following along, ‘Krane’ is one of a slew of new devices based on the ‘Kukui’ baseboard. This ‘Kukui’ board is toting support for things like a MediaTek 8183 chip, ambient light sensing, Bluetooth 5, detachable/tablet form factors, and more. We know there are at least 10 of these devices in development and we expect to start seeing the first ones arrive at CES in just a couple weeks.

POGO pins are, in general, used for keyboard accessories in detachable devices like the Pixel Slate and HP X2. This port gives tablet devices a quick way to pair up with a keyboard for use as a laptop when necessary, and we expect to see quite a bit of this with ‘Kukui’-based devices. It would have made perfect sense for ‘Krane’ to leverage this connector, but it has clearly been removed, as seen in the language from this commit in the repositories:

krane: do not support POGO dock

krane doesn’t have a POGO dock, the config should be dropped.

So, why drop the POGO pin support? We’re looking to find more clues about this, but it seems that ‘Krane’ will likely be a tablet-first and tablet-only device. There have only been a few devices like this in the Chrome OS ecosystem thus far, and they were oddities. The most popular and notable was the Acer Chromebook Tab 10. With a very nice screen and pen support, it was a decent tablet held back mainly by the poor tablet mode of Chrome OS at the time and the sluggish Rockchip processor.

I suppose it is time to try again with a Chrome OS tablet and Lenovo looks to be one of the companies giving it a shot. With pen support likely, this device could end up being very similar to the Acer Chromebook Tab 10, but with a much-improved software interface and what we’re hoping will be a much faster ARM processor than the Rockchip RK3399.

We keep referring to ‘Krane’ as a Lenovo device, and that is for good reason. The other commit we’ve uncovered is a very clear nod not only to the maker of this device, but to the fact that they’ll be giving users a work-around for the fact that ‘Krane’ will ship without a headphone jack. The language we found in this particular commit gives all this away:

krane: Add Lenovo USB-C TO 3.5mm Adapter

Add Lenovo USB-C TO 3.5mm Adapter

First up, it is very clear now that this device will be made by Lenovo. Commits don’t get written to include manufacturer-specific hardware if the device itself isn’t made by that manufacturer. Just like we saw with Dell’s Chromebook Enterprise offerings earlier this year, commits with manufacturer-specific language around manufacturer-specific peripherals make a clear-cut case for the maker of the device in question.

The bigger concern here is the decision by Lenovo to drop the headphone jack in favor of requiring users to dig out a dongle in order to plug in headphones. Look, I know it is the 2019 thing to do, but why? With a tablet or Chromebook, it doesn’t seem necessary in the least. If the big argument for ditching the headphone jack is space, that argument just doesn’t hold water with larger devices. It seems, however, that Lenovo is doing it anyway.

With the removal of POGO pins and what looks to be the addition of a USB-C to 3.5mm headphone adapter in the box, Lenovo seems to be going all-tablet with ‘Krane’. Don’t expect this device to ship with a keyboard attachment in the box or show up with one available right away. My hope is that Lenovo is at least considering a solution for those who would need one. I hope that, unlike other Chrome OS tablets, ‘Krane’ will get an official folio case with a Bluetooth keyboard and trackpad (think Brydge G-Type for the Pixel Slate) that gives you some sort of stand function. Chrome OS on a tablet has come a long way, but that doesn’t take away from the fact that under that tablet UI is a great desktop OS that many users will want to use in that way.

No Motorola Razr comeback orders in 2019: Costly foldy nostalgia mobe pulled back

We’ve got some bad news for the deeper-pocketed nostalgia-tinged tech fans out there: Motorola is pushing back the launch of its foldable reboot of the Motorola Razr.

Preorders of the revived flip-phone were expected to open on December 26, with the first units trickling out on January 9, but those dates have been ditched in favour of an unspecified time next year.

In a statement issued by parent company Lenovo, Motorola said it was delaying the launch to “better meet consumer demand.”

“Demand has been high, and as a result, has quickly outgrown supply predictions,” it said.

For what it’s worth, every company that has attempted to launch a mass-market foldable phone has missed its original expected release dates. Huawei delayed launch of the Mate X no fewer than two times, while Samsung had to shelve its original release date over an embarrassing durability issue.

Early units of the Samsung Galaxy Fold had an awkward problem where removing the screen protector would essentially wreck the display. When the device finally reached shelves, screwdriver botherers at iFixit warned punters that the device was “alarmingly fragile”, particularly with respect to the screen. Furthermore, other publications, most notably CNET, cast doubt on the phone’s ability to withstand its promised 200,000 folds.

The Motorola Razr will be the first commercially available phone to use a folding, candybar-shaped portrait display. It also mirrors the design of the original Razr down to the protruding “chin” at the bottom of the device.

And crucially, at $1,500, it would drastically undercut the existing competition, making foldable mobes (unless Escobar’s brother makes good on his promise) more accessible to a wider swath of the population.

That said, it’s probably sensible to wait for the reviews before dropping your cash on this. The Razr uses a relatively underpowered Snapdragon 710 platform, which is typically found on £300-range mid-rangers. Motorola says this is due to thermal management, and while that might be true, it’ll leave consumers wanting more power.

And with just 6GB of RAM and 128GB of ROM, a puny 2,510mAh battery, and no wireless charging, this phone underwhelms in other areas. On paper, it doesn’t strike us as a device worthy of its steep price-tag. ®

Windows 7 dies Jan. 14. Here’s what you need to do

If you’re still using Windows 7, it’s time to move on. The venerable and much-loved 2009 version of Windows gets its last security update on Jan. 14, 2020 (unless your company pays for extended support) and the free Microsoft Security Essentials antivirus program will (probably) no longer be supported either.

As we get further into 2020, Windows 7 machines will be more likely to become infected by malware, and any deep-rooted flaws in the operating system that arise almost certainly won’t be fixed.

That’s not good for the roughly 28% of Windows users who still use Windows 7. If you’re among them, you should upgrade to Windows 10. Your Windows 7 PC will almost certainly support Windows 10, though you may have to ditch some programs.

How to upgrade to Windows 10 for free

Here’s a little secret: As long as your Windows 7 license key is valid, you can probably upgrade your Windows 7 machine to Windows 10 for free by downloading and running Microsoft’s own Windows 10 installation tool.

Microsoft’s user forums have instructions on how to do this, and Bleeping Computer has more detailed instructions. Bleeping Computer says you need to opt to keep all files and applications to get the free upgrade, but the Microsoft user forums say you just need the existing Windows license key. Here’s how to find the Windows license key.

How to survive with Windows 7

If you do decide to stick with Windows 7 after Jan. 14, you’ll need to take some precautions.

Update everything in the Jan. 14 Microsoft Patch Tuesday release. It’ll be your last chance to make sure Windows and other Microsoft software are as secure as possible.

Install one of the best antivirus programs. Microsoft is killing its own antivirus software for Windows 7, so you’ll need something else.

Uninstall the Microsoft Internet Explorer browser. It’s a huge security hole. Use Google Chrome or Mozilla Firefox instead.

Ditch Microsoft Office and any of its components, such as Word, Excel or Outlook. Switch to Google’s office suite or to LibreOffice. For an email client to replace Outlook, try Gmail or Thunderbird.

Turn off Java in your browsers.

Turn off Adobe Flash Player in your browsers.

Create a limited user account for yourself and use that for day-to-day computer tasks instead of the administrator account you originally created. Limited accounts restrict the damage malware can do.

This Casio Smartwatch Perfectly Illustrates the Struggles of Wear OS

The history of Wear OS watches is long, and mostly disappointing. Over the years, Qualcomm’s less-than-snappy Snapdragon Wear 2100 chip bore a decent chunk of the blame. The arrival of the 3100 was supposed to offset that, and it’s better, sure—but not by a degree that offsets years of stunted development. Knowing all this, I wasn’t expecting much while milling around a showcase of Casio watches a few weeks back. That is, until a proud spokesperson showed me the Pro Trek WSD-F21HRRD and said Casio designed its own processor for Wear OS.

It’s not unheard of for a company to design its own silicon. Apple does for the Watch, and so does Samsung for its smartwatches. Huawei also designed its own processor for its Watch GT, opting to forgo the wait for the 3100 chip, and Wear OS along with it. Now, Casio told me its decision to make its own processors wasn’t a dig at other suppliers; it was merely a strategic choice to keep all parts in house. Still, given that the rollout of 3100-powered watches was not only slow but also underwhelming, I was curious to see whether a proprietary chip would make a difference—or if Wear OS would always be mediocre regardless of hardware constraints.

The reality is not that simple. When I booted up the Casio Pro Trek, it felt faster and more nimble as I swiped through screens. Just to be sure, I also booted up a Misfit Vapor X, which runs Qualcomm’s Snapdragon Wear 3100 chip. The difference in speed was minimal, though anecdotally I never once experienced lag with the Casio while navigating around Wear OS. Apps loaded quickly and the screen never stuttered as I swiped through various widgets. That’s something that still happens, though less frequently, with the 3100 watches I’ve played around with.

What was more noticeable was battery life. I wouldn’t call it long-lasting, but with regular usage, I got about 1.5 days off a single charge. Sometimes a little more. Logging exercises also didn’t seem to tax the battery too much—a roughly hourlong run only ate up about 8 percent. On 3100-powered watches, I’m lucky if I get more than 20 hours. In both situations, I would need to charge nightly, but with the Pro Trek, I could still comfortably log a morning run if I forgot. It’s a small difference, but one I appreciate, since a stupid portion of my life is dedicated to figuring out which watches need to be charged when.
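As a back-of-the-envelope check, here is what those observed figures work out to. The drain rate and runtime are my measurements, not official Casio specs, and the extrapolation assumes the drain stays roughly linear:

```python
# Battery arithmetic for the Casio Pro Trek, using figures observed in
# testing (assumptions, not official Casio specs).
gps_drain_pct_per_hour = 8   # ~8% battery for a roughly hour-long GPS run
runtime_days = 1.5           # observed runtime with regular, non-GPS use

# Hours of continuous GPS exercise tracking a full charge could support
gps_hours = 100 / gps_drain_pct_per_hour
print(f"~{gps_hours:.1f} hours of continuous GPS tracking")  # ~12.5 hours

# Regular-use runtime in hours, versus the ~20 hours I see on 3100 watches
print(f"~{runtime_days * 24:.0f} hours of regular use")  # ~36 hours
```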

That’s important for the Pro Trek. If its name alone doesn’t tell you it’s meant to be an outdoor fitness watch, then the rugged design is a dead giveaway. It measures 2.4 by 2.3 inches and is about 0.7 inches thick. Wearing one sort of feels like having a tiny dinner plate on your wrist. This is not a watch I’d wear to a nice dinner, but then again, it’s not meant for that type of everyday wear. The Pro Trek looks and feels like a Garmin GPS watch—right down to the part where you can see GPS maps of your current location from the wrist. That part is genuinely cool, as it’s not something you really see from Wear OS watches. For good reason.

Wear OS itself is not an operating system that lends itself to intrepid adventurers—though Casio does an admirable job of trying to shoehorn a barrage of sensors into the ugly stepsister of smartwatch operating systems. For starters, out of the box, there are competing apps for tracking your data. You could use Google Fit, which is not great and will inexplicably rely on your phone’s GPS. Or you could use Casio’s suite of native apps, which actually makes use of the built-in GPS sensor, maps, altimeter, barometer, compass, and heart rate monitoring. These apps are better than Google Fit, but it also feels like there are a few too many. There’s one for tracking activities, one for the compass, one for “moment setting” (giving you reminders every 200m of altitude you climb), another for “point navigation” (directions), and so on. Surely some of these could have been condensed, but at the same time, this cluttered approach feels oddly appropriate for Wear OS.

I wish I could say all those nifty features worked perfectly for me, but I had GPS-related issues during testing. In Casio’s native activity app, I was prompted to wait and point my device at the open sky. I ended up shivering for 10 minutes in 30-degree weather as my watch repeatedly told me it couldn’t find a connection. I ended up running without the built-in GPS for a 2.14-mile run, which the watch logged as 2.2 miles. Not too shabby for a GPS-less run. Or so I thought. After a troubleshooting session with Casio, it turns out the watch did track my run via GPS but said it didn’t, because New York’s skyscrapers meant I was tracking on a significantly weaker signal. I did a second 4.6-mile run tracking via Google Fit, my phone, and the Apple Watch Series 5 to see if it was any better. It delivered roughly similar results, clocking me at 4.68 miles. That’s overreporting compared to the Apple Watch and my phone, but not egregiously so. Heart rate readings were also accurate compared to the Series 5, which I wore simultaneously while running.

My troubleshooting session with Casio, however, highlighted how annoying updates on Wear OS can be. Although I had auto-updates enabled, some hadn’t actually downloaded, or had been interrupted halfway through. The settings you need are unnecessarily buried in a nesting doll of menus, and searching for anything on-wrist in the Google Play Store can be a pain.

When you factor in the Pro Trek’s $500 price tag, the watch is a curious anomaly. It’s too expensive for casual users, but given the look and the feature set, this watch isn’t meant for them. That pricing is right on par with some of Polar and Garmin’s higher-end running watches, but because Casio’s chosen Wear OS, the watch also lacks their finesse. Polar and Garmin both have in-depth platforms and their own companion apps that truly make sense of your metrics. They make more sense for results-oriented athletes. Meanwhile, accessing your results off Wear OS has never been the smoothest experience. (Again, Google Fit is just…not good). The Pro Trek has so many sensors and advanced capabilities, but Wear OS is the reason why you wouldn’t get it.

I initially set out to review the Casio Pro Trek WSD-F21HRRD in an experiment to see if a custom piece of silicon would somehow breathe new life into Wear OS. The answer is sort of. Undoubtedly, the Pro Trek has better battery life. That’s a must for fitness watches, particularly those meant for the outdoors. But the battery savings aren’t so amazing that I’d burn the Qualcomm Snapdragon Wear 3100 chip at the stake. The Casio is a smidge faster with app loading and screen swiping. That’s just not enough.

Really, all this reconfirmed that Wear OS’s mediocrity isn’t a pinpointable problem that can easily be fixed. It’s not as simple as swapping out the chip, or making a good app, or wrapping everything in a pretty package. The problem is deeply embedded in the entire platform. Maybe there’s some hope given that Google just bought Fitbit, which has a stable of solid products under its belt. But this Casio watch—through no fault of Casio’s, really—felt like Frankenstein’s monster: something cobbled together from disparate parts that work fine, but ultimately is not the thing you initially envisioned.

README

Big outdoor fitness watch with compass, altimeter, barometer, built-in GPS, and heart rate monitoring.

Battery life is longer than average for Wear OS. That’s not saying much, but hey.

The GPS maps and tracking are pretty accurate.

However, Wear OS is still a pain and other fitness watches have better platforms for viewing your metrics.

Wear OS still bad.

RAZER BASILISK X HYPERSPEED REVIEW

The Basilisk X Hyperspeed is Razer’s latest stab at the affordable wireless mouse market. Its big selling point is that it boasts the same wireless tech found in the Viper Ultimate and the Basilisk Ultimate—but while those mice cost $150-plus, the Basilisk X Hyperspeed is just $60.

Razer claims the technology is 25 percent faster than any other wireless tech. In practice, as I said in my Viper Ultimate Review, that translates to ultra-speedy performance and near-zero latency. No matter what I threw at the Basilisk X Hyperspeed, from shooters to strategy sims, it felt lightning quick, and I never detected delays between my clicks and what happened on screen. But does its reduced price come with important compromises?

The sensor on the mouse’s base shows no signs of corner-cutting: its specs match those in the more expensive Razer Deathadder Elite, arguably the best gaming mouse on the market. You can ramp it up to 16,000 dots per inch (DPI)—the same as the pricier Logitech G502 Lightspeed, our favourite wireless gaming mouse—and Razer claims 99.4 percent sensor accuracy. That’s not quite as good as the Viper Ultimate (20,000 DPI, 99.6% accuracy), but it’s still impressive, and it can track up to 450 inches per second: for comparison, the G502 Lightspeed is only rated up to 400. In-game, I never noticed any hitches.

As well as connecting to your PC via a dongle, you can also hook it up via Bluetooth. That means you can connect to almost any desktop or laptop easily, and it worked quickly across every PC I tried. I’m not convinced the Bluetooth support is vital, because you can remove a section of the mouse’s body to slide in the dongle, which makes it easy to transport. Plus, Bluetooth performance won’t match the standard wireless connection. But I appreciate the flexibility.

I can’t complain about the battery life, either. On just a single AA battery, Razer claims it will last 285 hours. Even if you use it eight hours a day, that means it will last well over a month. I didn’t have long enough to verify that, but during a week I barely made a dent in it, according to the battery indicator in Razer’s Synapse software. To save extra juice, you can change how quickly it falls asleep when left idle, and it always woke up as soon as I moved it.
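Razer’s 285-hour claim is easy to sanity-check against that usage pattern. The hours-per-day figure below is just an assumption for illustration, not anything Razer specifies:

```python
# Convert Razer's claimed 285-hour battery life into days of heavy use.
claimed_hours = 285      # Razer's figure for a single AA battery
hours_per_day = 8        # assumed heavy daily usage

days = claimed_hours / hours_per_day
print(f"~{days:.0f} days at {hours_per_day} hours/day")  # ~36 days
assert days > 30  # comfortably more than a month, as claimed
```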

Top marks for tech, then, and the Basilisk X Hyperspeed doesn’t look too shabby either. The deep, swooping thumb rest and the pointy ends of the left and right click are a tad extravagant, but I like its mean, sharp angles, and the plain black finish—with the occasional streak of gloss—stops it becoming obnoxious.

It feels comfortable in the hand. The grippy thumb rest never rubbed during long sessions, and the coating on the rest of the mouse stops it slipping in your palm. My outer two fingers rested naturally on the outside bulge of the mouse, itself covered with the same surface as the thumb rest, which meant I never had to think about where my hand was sitting (a problem I had with the symmetrical, ambidextrous Viper Ultimate).

The scroll wheel is tuned perfectly: it’s never stiff, but you can still feel the individual clicks as you turn it, so you always know if you’ve applied enough pressure. The mouse buttons—two primary, plus two programmable on the side—feel reassuringly sturdy when you press them. One of my few gripes with the Viper Ultimate was that the buttons felt thin and flimsy: these buttons are thicker, and they even sound less tinny.

However, they do tend to move laterally at the slightest sideways pressure. By prodding them gently you can really make them tilt, which makes me question how long they’ll last. Also, whenever I picked the mouse up and moved it around, the scroll wheel area rattled. It wasn’t loose at all (unlike the mouse buttons, I couldn’t wiggle it side-to-side), so I couldn’t tell what was making the noise, but it made the mouse feel a bit cheap.

The lack of customisation might put some people off, too. It only has two programmable side buttons, which is fewer than, say, the Logitech G604 Lightspeed, another wireless mouse in the same price bracket. Personally, I don’t ever use more than two extra mouse buttons, but it’ll be a dealbreaker for some.

The mouse also lacks RGB lighting: with a lot of Razer’s mice you can choose the colour and effect of the light-up Razer logo, but there’s no light to speak of here. You can switch the DPI between five custom levels by hitting a button near the scroll wheel, but there’s no colour indicator to let you know your current setting. Instead, you have to rely on Razer’s Synapse software, which pops up a notification every time you switch sensitivity. If you’re not running Synapse on your PC, you’ll just have to guess.

And despite how pleased I am with the wireless tech, it once lost connectivity altogether. It only took me 10 seconds or so to reconnect—I turned the mouse off and on, then removed and inserted the dongle, and all was fine—but at the wrong moment it would’ve been really frustrating. Still, it only happened once in a week, so I wouldn’t think it’s a systemic problem.

On balance, none of these shortcomings stop me recommending the Razer Basilisk X Hyperspeed. It’s impressive where it matters most: performance, comfort, and battery life. The build quality in some areas feels less than premium, and it’s a shame you can’t customize it as much as other mice, but you won’t find wireless tech and a sensor this snappy for less than $60 anywhere else.

OWC launches Accelsior PCIe SSD for Mac Pro, up to 8TB with speeds up to 6000MB/s

Just as Apple announced the 8TB SSD option for the Mac Pro, OWC has launched a PCIe SSD with capacities up to 8TB and speeds hitting up to a blazing 6000MB/s. The OWC Accelsior 4M2 SSD is slot-powered and starts at $480 for the 1TB model.

OWC announced the launch of its new PCIe SSD in a press release today:

the all-new Accelsior 4M2 ultra high-performance PCIe M.2 NVMe internal SSD that delivers over 6,000MB/s real-world speeds in capacities up to 8TB. The Accelsior 4M2 is the fastest SSD ever built by OWC and is the perfect storage solution for large format video editing, VR/AR/MR environments, extreme gaming, compute-intensive applications and other high bandwidth needs.

Apple just opened up the 8TB SSD option when configuring a Mac Pro; the upgrade from the base 256GB SSD runs $2600, and that storage tops out at 3400MB/s.

In comparison, the Accelsior 4M2 SSD with 8TB of storage is priced at $1600 by OWC and offers almost twice the max speeds at 6000MB/s.

Meanwhile, the 1TB version runs just $480, $630 for 2TB, and $950 for the 4TB version. Orders are open now and expected to ship out starting on December 30th.
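Putting the quoted numbers side by side makes the comparison plain. All prices and speeds are the ones cited above; the ratios are simple arithmetic:

```python
# Compare Apple's 8TB Mac Pro SSD upgrade with OWC's Accelsior 4M2,
# using the prices and speeds quoted in this article.
apple_upgrade_usd = 2600   # 256GB -> 8TB build-to-order upgrade
apple_speed_mbps = 3400    # MB/s
owc_8tb_usd = 1600
owc_speed_mbps = 6000      # MB/s

print(f"Speed ratio: {owc_speed_mbps / apple_speed_mbps:.2f}x")  # 1.76x
print(f"Price difference: ${apple_upgrade_usd - owc_8tb_usd}")   # $1000

# Cost per terabyte across OWC's capacity tiers
tiers_usd = {1: 480, 2: 630, 4: 950, 8: 1600}
for tb, price in tiers_usd.items():
    print(f"{tb}TB: ${price / tb:.2f}/TB")
```

Notice that the per-terabyte cost drops steadily with capacity, from $480/TB at 1TB down to $200/TB at 8TB.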

We previously reviewed OWC’s Thunderbolt 3 version of this SSD and found it to be a good option for those working with lots of heavy data.

In addition to the new Mac Pro, the Accelsior 4M2 SSD is compatible with PCs and 2010 and 2012 Mac Pro machines.

OWC Accelsior 4M2 SSD highlights

Supercharges Macs and PCs – ideal for Mac Pro 2019, Mac Pro 2012 or 2010, and PC towers

Work faster – over 6,000MB/s real-world speed in RAID 0

OWC Aura P12 powered solutions – advanced PCIe 3.0 M.2 NVMe Technology

Store more – store up to 8TB of critical footage, images, files and games

Bootable – start working in seconds

Slot-powered – no extra power cables needed

Includes SoftRAID: robust software for TRIM, health monitoring, and custom creation and management of advanced RAID sets

Deployment ready – pre-configured solutions undergo performance verification

Silent cooling – finned heat sink cover for fan-less operation

Highly versatile – installs into a full-height, half-length x8 or x16 PCIe 3.0 or 2.0 slot

Plug and Play – no drivers needed

Worry-free – up to 5-year OWC Limited Warranty

Stay tuned, as we expect to give the Accelsior SSD a full review in the near future.

This Apple Watch charger plugs directly into a USB-C port so you can carry fewer cords

If you’ve been looking for an Apple Watch charger that isn’t yet another long cable that can get tangled up in your bag, Satechi’s new USB-C Apple Watch charging dock might be the charger you’ve been looking for.

The dock looks to be pretty straightforward — it’s just a magnetic Apple Watch dock that can plug into a USB-C port. You can also plug the dock into an included USB-C male to USB-C female cord and plug that cord into a USB-C port, if you need a little extra space between the dock and a USB-C port.

I actually think this could be kind of handy if you want to top up your Apple Watch’s battery while you’re out and about — but honestly, if you charge your Apple Watch every night, that will probably give you enough battery to get through a day already. This charging dock could be useful if you want to pack fewer things with cords when you’re traveling away from home with your Apple Watch, though.

Satechi says the charging dock is out now on its website for $44.99.

This pint-sized Radeon RX 5700 would be a perfect fit for a small form factor PC

PowerColor is apparently selling a mini-ITX version of AMD’s Radeon RX 5700 in Japan, and I really hope the company decides to make it more widely available. If so, this could end up being the best graphics card for small form factor (SFF) builds.

I say this because the full-size 5700 is already one of the top GPU options in terms of frames per dollar—the bang-for-buck factor is high, and compares favorably even to Nvidia’s Super counter-punch to Navi. Given that the mini-ITX landscape is somewhat of an afterthought for higher end GPUs, I think PowerColor’s pint-sized card could sell well.

Japanese publication Hermitage Akihabara spotted the card at Aiuto-jp, which as far as I can tell is a distributor in Japan. The site claims the mini-ITX model is exclusive to that region. I’ve reached out to PowerColor to see if that’s true or if there are plans to expand the release to other parts of the world, and will update this article when I hear back.

In the meantime, let’s go over the specs. PowerColor has opted to stick with AMD’s reference blueprint, meaning the mini-ITX model features a 1,465MHz base clock and 1,725MHz boost clock. It also wields 8GB of GDDR6 memory at 14Gbps, on a 256-bit bus.

It makes sense not to factory overclock this model, considering it’s designed to fit inside SFF systems where airflow (and heat management) can be a bit of a challenge. The card itself measures just 175mm long. That’s enough room to fit a single cooling fan, which works in conjunction with a standard heatsink and four 6mm copper heatpipes.

Mini-ITX PCs are smaller than even micro-ATX builds (usually, anyway). They’re great for places where space is in short supply, such as a college dorm or anywhere else a person might not want a hulking tower (or even a regular sized tower).

Despite their diminutive nature, it’s possible to build a potent gaming PC around the form factor. The GPU is the primary challenge, though, and to that end, PowerColor’s Radeon RX 5700 ITX would join a smattering of mini-ITX GeForce GTX cards. A quick check on Newegg, for example, yields a few options:

Gigabyte GeForce GTX 1650 Mini ITX OC—$149.99, Newegg

Gigabyte GeForce GTX 1660 Super Mini ITX OC—$229.99, Newegg

Gigabyte GeForce GTX 1660 Ti Mini ITX OC—$435.99 (marketplace seller)

The 5700 is faster than any of those models, making it an enticing option in the mini-ITX space. We’ll have to wait and see what PowerColor decides to do, though. Zotac is also known for making compact graphics cards, but I’m not seeing anything in the modern space that is shorter than around 210mm.

AMD Ryzen Threadripper 3970X vs Intel Core i9-10980XE: High End Flagships Fight

If you just so happen to be in the market for some serious firepower, both AMD and Intel have new high-powered HEDT (high-end desktop) processors designed for the heaviest workloads. Intel recently launched its Cascade Lake-X family, spanning from ten to 18 cores, while AMD unleashed the third-generation Ryzen Threadripper chips with models that come with either 24 or 32 cores.

The 32-core 64-thread Ryzen Threadripper 3970X is the current flagship for AMD’s Ryzen Threadripper lineup, although that will change soon when the chipmaker rolls out the 64-core Ryzen Threadripper 3990X next year. On the other side of the ring, we have the 10th-generation Core i9-10980XE, which serves as the flagship for Team Intel with 18 cores and 36 threads.

Unfortunately, the comparison between the two flagships is rather lopsided: Intel has ceded the upper tiers of the HEDT market to AMD and doesn’t have a comparable chip based on core counts or pricing. However, this is a battle of the most powerful consumer silicon from both companies.

To help you pick one, we put the flagship models from both chipmakers through a seven-round face-off, based on their features, overclocking, coolers, motherboards, performance, and value.

Features

The AMD Ryzen Threadripper 3970X comes packing AMD’s potent Zen 2 microarchitecture and is produced with TSMC’s 7nm FinFET manufacturing process. The core-heavy Ryzen Threadripper 3970X comes equipped with 32 cores and 64 threads that operate with a 3.7 GHz base clock and a 4.5 GHz boost clock. AMD feeds the cores with up to 144MB of total cache (128MB L3 cache).

The Ryzen Threadripper 3970X supports four channels of DDR4-3200 memory, works with ECC (error-correcting code) memory, and can accommodate up to 256GB of DDR4. The chip also exploits the latest PCIe 4.0 interface, providing up to 64 PCIe 4.0 lanes for high-speed storage and compatible graphics cards.

The Intel Core i9-10980XE is based on Intel’s new Cascade Lake microarchitecture and ultra-mature 14nm process node. It comes with 18 cores, 36 threads, and 24.75MB of L3 cache. The processor clocks in with a 3.0 GHz base and 4.8 GHz boost.

Like the Ryzen Threadripper 3970X, the Core i9-10980XE has a quad-channel memory controller and supports up to 256GB of DDR4 capacity. However, Intel sets official memory support at DDR4-2933, and the chip doesn’t support the ECC standard. Not to mention that the Core i9-10980XE is still on the PCIe 3.0 interface and only provides the user with up to 48 PCIe 3.0 lanes.
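To make the head-to-head easier to scan, here are the headline specs from the two paragraphs above collected in one place. All figures are the ones cited in this article; the ratios are just arithmetic:

```python
# Headline specs of the two flagships, as quoted in this article.
specs = {
    "Ryzen Threadripper 3970X": {
        "cores": 32, "threads": 64, "base_ghz": 3.7, "boost_ghz": 4.5,
        "l3_mb": 128, "pcie_lanes": 64, "pcie_gen": 4.0, "ddr4": 3200,
    },
    "Core i9-10980XE": {
        "cores": 18, "threads": 36, "base_ghz": 3.0, "boost_ghz": 4.8,
        "l3_mb": 24.75, "pcie_lanes": 48, "pcie_gen": 3.0, "ddr4": 2933,
    },
}
amd = specs["Ryzen Threadripper 3970X"]
intel = specs["Core i9-10980XE"]

print(f"Core advantage (AMD): {amd['cores'] / intel['cores']:.2f}x")   # 1.78x
print(f"L3 cache advantage (AMD): {amd['l3_mb'] / intel['l3_mb']:.1f}x")  # 5.2x
print(f"Extra PCIe lanes (AMD): {amd['pcie_lanes'] - intel['pcie_lanes']}")  # 16
```

Note that Intel does hold one headline number: the Core i9-10980XE boosts 300MHz higher than the Threadripper 3970X.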

Winner: AMD. The Ryzen Threadripper 3970X possesses the superior feature set: it sports PCIe 4.0 and ECC memory support, and it offers more cores, more cache, and more (and faster) I/O lanes, all of which can make a significant difference in the HEDT market.

Motherboard Options

Unlike consumer-class motherboards, there is no chipset segmentation in the HEDT market: One chipset from each vendor delivers all the features the processors have to offer. That would be the X299 chipset for the Core i9-10980XE, and the TRX40 chipset for the Ryzen Threadripper 3970X.

One caveat with the Ryzen Threadripper 3970X is that it resides on a new sTRX4 socket, locking out existing X399 motherboard owners from upgrading to the new chips. Like the processor itself, a TRX40 motherboard consumes a hefty chunk of your budget. The entry-level models typically start around $400, while high-end models stretch up to as much as $850.

The Core i9-10980XE, on the other hand, continues to slot into existing X299 motherboards with the LGA2066 socket. This means that as long as your motherboard is running the proper BIOS, you can just drop the 18-core part into your motherboard without spending a penny on another motherboard.

Nevertheless, motherboard manufacturers have released a fresh wave of revised X299 motherboards to accommodate the Core i9-10980XE. That’s because the new Cascade Lake-X chips come with four additional PCIe 3.0 lanes, which you can’t access on an older motherboard. As for pricing, original X299 and revised X299 motherboards start at $160 and $260, respectively. The premium offerings top out at around $750.

Winner: Tie. Upgrading to the Core i9-10980XE is cheaper because you’re not forced to invest in a new motherboard. If you’re starting from scratch, there are more X299 options out there to choose from. A quick search dug up as many as 62 different X299 motherboards out in the wild as opposed to the 12 TRX40 models that are currently available.

On the flipside, we suspect that X299 is probably a dead-end platform, meaning it won’t be compatible with next-gen chips, while TRX40 is barely in diapers. From a future-proofing perspective, TRX40 motherboards represent a better long-term investment, even though they are more expensive right now.

Overclocking Potential

The Ryzen Threadripper 3970X and Core i9-10980XE come with unlocked multipliers, opening the doors to manual overclocking. Overclocking aficionados will be happy to know that both processors use solder thermal interface material (sTIM), which improves thermal dissipation, and thus overclocking potential.

Since both processors are relatively new, there aren’t many statistics on their overclocking potential. For now, we’ll evaluate each chip in accordance with our experience.

Although AMD advertises the Ryzen Threadripper 3970X with a 4.5 GHz boost clock, our chip only hit that speed on a single core, and even then only under the right conditions. We prefer to let AMD’s auto-overclocking Precision Boost Overdrive (PBO) feature do its thing, so we continue to benefit from the single-core boost while also enjoying higher clocks in multi-threaded workloads.

AMD’s Ryzen 3000-series consumer processors are a bit stingy on manual overclocking headroom. We expect the new Ryzen Threadrippers to inherit the trait as well. The ceiling for the Ryzen Threadripper 3970X is likely around 4.1 GHz to 4.3 GHz — that is assuming you’re blessed with a good sample and own the necessary custom liquid cooling system to tame the processor.

The Core i9-10980XE features a single-core boost of 4.8 GHz. However, we got our sample to a whopping 4.8 GHz on all cores with a 1.2V Vcore and 2.1V VCCIN. That’s pretty amazing considering we only got the previous-gen Core i9-9980XE to 4.4 GHz, but as always, the silicon lottery applies. You’ll also need a beefy custom watercooling loop to extract the utmost performance.

Winner: Intel. The Core i9-10980XE might be at a core disadvantage, but the 18-core part holds tremendous overclocking potential.

Cooling Solutions


The Ryzen Threadripper 3970X and Core i9-10980XE are multi-core monsters that can’t be tamed with a typical CPU air cooler, so neither AMD nor Intel includes a stock cooler with the processor.

The Ryzen Threadripper 3970X is rated with a 280W TDP (thermal design power), meaning you’ll need a very beefy cooler to keep the temperatures in check. AMD recommends you pair the Ryzen Threadripper 3970X with a capable liquid cooling solution. Thankfully, the cooler mounting mechanism for the sTRX4 socket is identical to the previous-gen TR4 socket. That means you can repurpose your existing Threadripper cooler for the 3970X, assuming it’s up to the task.

The Core i9-10980XE rocks a 165W TDP, which is 115W lower than the Ryzen Threadripper 3970X. It’s important to remember that TDP ratings don’t correlate directly to power consumption, but as expected, the 10980XE’s lower overall power consumption equates to lower thermal output. Intel’s chips also have a better efficiency rating in AVX workloads, but Threadripper does hold the absolute performance crown.

Winner: Intel. The Core i9-10980XE has a substantially lower peak power consumption rating. As a result, there are more cooling options available and, if you aren’t overclocking, you don’t have to spend as much on cooling as you would with AMD’s 32-core beast. However, it is noteworthy that AMD’s chips come with far more cores and take the absolute performance lead in threaded workloads, so the tradeoff of increased thermal output is acceptable.

Gaming Performance

The chances of anyone picking up a Ryzen Threadripper 3970X or Core i9-10980XE solely for gaming purposes are very slim. However, streamers will surely appreciate the abundance of cores. Besides, even professional users need to relax every once in a while.

In general, the Ryzen Threadripper 3970X is a better gaming processor than the Core i9-10980XE. AMD’s 32-core part delivers up to three more average frames per second in five out of nine titles. The Core i9-10980XE only beat the Ryzen Threadripper 3970X in three titles, with differences up to eight frames per second.

Winner: AMD. The Ryzen Threadripper 3970X might not be a gaming monster, but it’s still marginally faster than the Core i9-10980XE. If you plan to do a fair bit of gaming, the Ryzen Threadripper 3970X has your back.

Productivity Performance

The Ryzen Threadripper 3970X and Core i9-10980XE really excel with productivity workloads. Judging by the specifications alone, it’s easy to see why the AMD chip should come out as the winner: The Ryzen Threadripper 3970X has 14 more cores than the Core i9-10980XE.

The Ryzen Threadripper 3970X outperforms the Core i9-10980XE in both workstation and Adobe benchmarks, and its dominance continues in rendering, encoding, compression and general office workloads. The Core i9-10980XE only beats the Ryzen Threadripper 3970X in a couple of benchmarks.

Winner: AMD, without a doubt. If you’re looking for a processor specifically for work, the Ryzen Threadripper 3970X is the way to go.

Value Proposition

The Ryzen Threadripper 3970X carries a $1,999 price tag, while the Core i9-10980XE should cost $1,000. That works out to $62.47 per core for the AMD chip and $55.56 per core for the Intel chip. Neither HEDT processor is in stock at the moment, though.
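The per-core figures come straight from dividing list price by core count; a quick check:

```python
# Price-per-core figures quoted in the article.
amd_price, amd_cores = 1999, 32
intel_price, intel_cores = 1000, 18

amd_per_core = round(amd_price / amd_cores, 2)
intel_per_core = round(intel_price / intel_cores, 2)
print(amd_per_core, intel_per_core)  # 62.47 55.56
```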

Factoring in the cost of a TRX40 motherboard and a liquid cooling solution if you don’t already own one, you might need to spend another $600 at the very least if you’re rolling with the Ryzen Threadripper 3970X.

In the Core i9-10980XE’s case, you could get away with spending around $350 on an X299 motherboard and a cooler that can handle 165W if you’re going the budget route. But then again, you’ll be buying a platform that already has one foot in the retirement home.
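Putting those ballpark figures side by side (the $600 and $350 add-on floors are the rough estimates quoted above, not exact builds):

```python
# Minimum platform cost using the article's ballpark figures.
amd_total = 1999 + 600    # 3970X plus ~$600 floor for TRX40 board and liquid cooling
intel_total = 1000 + 350  # 10980XE plus ~$350 for a budget X299 board and cooler
print(amd_total, intel_total, amd_total - intel_total)  # 2599 1350 1249
```

In other words, the entry ticket to the AMD platform is nearly double the Intel one before you've bought anything else.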

Winner: Tie. The Ryzen Threadripper 3970X clearly requires a heftier investment, but you also get far more performance out of the chip. Strong competition has forced Intel to reconsider the pricing on its products. Presently, the X299 platform is as cheap as an HEDT platform can get.

The Bottom Line

Some might argue that it’s not fair to compare the Ryzen Threadripper 3970X to the Core i9-10980XE given the huge core difference, but this is a battle of the flagships, after all. Sadly, Intel has been stagnant in the HEDT space, and its flagship part stops at 18 cores. It just so happens that AMD doesn’t currently have an 18-core chip to go head-to-head with the Core i9-10980XE. So, for the time being, it is what it is.

The Ryzen Threadripper 3970X outshines the Core i9-10980XE in both productivity and gaming workloads. It’s hard to not give AMD the win in a market where performance is the main concern. Performance is expensive, but so is cutting-edge technology like PCIe 4.0, which we think is worth the additional cost.

One of the main advantages of picking up an HEDT platform is that you know you won’t have to replace it every year. Fortunately, AMD has the upper hand since the chipmaker just introduced the sTRX4 socket. There is no promise that AMD will commit to the sTRX4 socket long term, but if it’s anything like the original TR4 socket, it should last for a good couple of years.

Adata announces stylish XPG Hunter DDR4 memory for desktops and laptops

The new XPG Hunter DDR4 modules come in both conventional DIMM and SO-DIMM form factors.

Adata is expanding its memory line with a new XPG Hunter DDR4 series, available for both desktop PCs (conventional DIMMs) and laptops (SO-DIMMs). The latter form factor also applies to some mini-PCs.

“The XPG Hunter modules are made with high-quality chips selected through a strict filtering process. They are equipped with the finest PCBs and pass rigid reliability and compatibility tests to ensure longevity and rugged durability, which are vital for overclocking, gaming, and extreme benchmarking,” Adata says.

Otherwise known as binning, the “strict filtering process” Adata references is a common practice for high-performance RAM. Companies test individual chips for their ability to hit and maintain certain speeds, and this is one of the things that can separate an enthusiast RAM kit from a run-of-the-mill one.

The modules come wrapped in a stylish heatsink, though if you were hoping for RGB lighting, you won’t find it here. As for the specs that matter (and yes, RAM speed and capacity matter for gaming, to an extent), there are 8GB, 16GB, and 32GB options. Here’s the desktop lineup:

8GB DDR4-2666—CL16-18-18, 1.2V

8GB DDR4-3000—CL16-20-20, 1.35V

16GB DDR4-2666—CL16-18-18, 1.2V

16GB DDR4-3000—CL16-20-20, 1.35V

32GB DDR4-3000—CL16-20-20, 1.35V

The SO-DIMM variants for laptops and mini PCs come in the same speeds and capacities, but the timings and voltage are a little different. Here’s a look:

8GB DDR4-2666—CL18-18-18, 1.2V

8GB DDR4-3000—CL17-19-19, 1.2V

16GB DDR4-2666—CL18-18-18, 1.2V

16GB DDR4-3000—CL17-19-19, 1.2V

32GB DDR4-3000—CL17-19-19, 1.2V

None of the kits are showing up in retail channels yet. However, Adata provided some MSRP info, saying the 8GB module costs $39.99 in DIMM form and $49.99 in SO-DIMM form, while the 16GB module costs $79.99 in DIMM form and $99.99 in SO-DIMM form.
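Assuming those MSRPs apply per module as listed, the implied cost per gigabyte is easy to work out:

```python
# Implied price per gigabyte from the quoted MSRPs.
dimm_prices = {8: 39.99, 16: 79.99}    # capacity (GB) -> USD, DIMM form
sodimm_prices = {8: 49.99, 16: 99.99}  # capacity (GB) -> USD, SO-DIMM form

for gb, usd in dimm_prices.items():
    print(f"{gb}GB DIMM: ${usd / gb:.2f}/GB")     # both land at ~$5.00/GB
for gb, usd in sodimm_prices.items():
    print(f"{gb}GB SO-DIMM: ${usd / gb:.2f}/GB")  # both land at ~$6.25/GB
```

The SO-DIMM premium works out to roughly 25% per gigabyte, which is steep but not unusual for laptop memory.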

I assume those numbers apply to the DDR4-2666 kits, but we’ll have to wait and see how things shake out once these modules and kits show up at places like Amazon and Newegg.