Huawei introduces servers with Intel Cascade Lake Refresh compatibility

Intel refreshed its Cascade Lake silicon in an effort to compete with AMD’s EPYC Rome server-grade processors. The refresh consists of high-end models such as the Xeon Gold 6258R (which effectively matches the spec of the Platinum 8280 at a lower price), mid-range Gold and Silver processors (such as the Gold 6246R as opposed to the 6246) and some Bronze-class products (the 3206R, and so on).

The upper tier launched at competitive prices relative to comparable Rome-series CPUs, although the AMD options often still delivered better cost-per-core, much larger caches and higher TDPs. The mid-range Gold and Silver refreshes also offer 4 to 6 more cores than the original Cascade Lake silicon. Huawei has announced that its FusionServer Pro servers have been adapted to use these newer Intel processors.

This line comprises high-density, blade and rack servers; the 1288H V5 and 2288H V5 variants are examples of the latter. Huawei claims that they offer intelligent data and power-consumption management features. In terms of memory and storage, the 2288H V5 offers 24 DDR4 DIMM slots as well as 10 PCIe expansion slots, and supports up to 28 NVMe SSDs, 20 3.5-inch drives, or 31 2.5-inch drives.

These servers also boast in-house technologies such as Huawei Fault Diagnosis & Management (FDM) and Dynamic Energy Management Technology (DEMT). The OEM claims that these features help maximize return on investment while reducing costs.

OnePlus 8 Pro: full camera specs & new features leaked, along with more color options

Some existing reports have pointed to upgraded quadruple rear cameras in the upcoming OnePlus 8 Pro premium smartphone. They are thought to be made up of 48+48+5+8MP sensors; now, a fresh leak purports to outline what each of them does and what they may offer the user.

This new tip, allegedly obtained by Ishan Agarwal and published through PriceBaba, states that the 8 Pro’s new main shooter will be the IMX689. This is the novel f/1.78 Sony sensor that debuted “exclusively” in the OPPO Find X2 flagship series. It is paired with the also-48MP IMX586, which served as the primary sensor in this upcoming OnePlus’ predecessor.

In the newer phone, it will apparently take on an ultra-wide role instead, with a new 120-degree field of view and an f/2.2 aperture. The remaining sensors are allegedly for “Color Filter” (5MP) and telephoto (8MP, f/2.44, with 3X optical & 30X digital zoom capabilities) purposes.

This new set-up reportedly comes with new features, each with a novel marketing name. They include a “Night Portrait Mode”, along with “Cinematic Effects”. OnePlus may have also developed something called “3-HDR Video” for the 8 Pro, not to mention improved OIS.

These new abilities have already been rumored to come packaged in a chassis that has not diverged greatly from the previous generation in terms of design – on the other hand, the 8 series is said to offer new colors: Green, Black and a more mysterious-sounding “Glow”. The blog WinFuture now claims to have discovered what this last term means: an innovative gradient in which an almost Thunder-like purple transitions to a gold-orange sunset effect.

Otherwise, the OnePlus 8 Pro is expected to offer the Snapdragon 865 SoC, a possibly next-gen form of Warp Charge and the OEM’s first implementation of wireless charging. However, all of these potential details await confirmation at the phone’s launch event.

THIS CHROME BOOKMARK TRICK TAKES 5 MINUTES TO SET UP BUT SAVES TONS OF TIME

The bookmark bar for Chrome is prime real estate. Once it’s full, your bookmarks fall off the edge and disappear! 😩 But wait! There is a simple bookmark trick that will double the number of bookmarks you can save to your bookmarks bar! 😀

The elements of a bookmark

A bookmark has three basic elements: 

The title of the page

The location of the bookmark

The favicon (icon) of the website

To bookmark a website, click on the star at the end of the Omnibox (search box). You will see a window that looks like this.

The name field is filled in with the page description created by the author, but you can change it (that’s important…we’ll come back to that later). You can also select a folder for your bookmark. Most people want to see their bookmarks on the thin shelf under the Omnibox – the bookmarks bar. Once you click “done” you get a bookmark that looks like this:

HOW TO FIX YOUR BOOKMARKS AND MAKE THEM SMALLER

Look at how much space that bookmark consumes! It’s the same size as the 5 bookmarks to the left of it! Follow these simple steps to shrink your bookmarks and double the number of bookmarks you can fit on the screen. 

Right-click on an existing bookmark and select “edit”

Delete the description for the bookmark

Save

Your bookmark is now represented by the unique favicon for the website! By using this bookmark trick you can fit approximately 30 bookmarks on the bookmark shelf – double (or more) the number of pages you can typically save. The next time you add a new bookmark, make sure you delete (or shorten) the description so that it takes up as little space as possible. 

If you need to bookmark multiple pages from your work or school website, you will probably need to place them all in a folder with a description. To create a folder, right-click on your bookmarks bar and look for “add folder.” 

HANDY PAGES YOU SHOULD BOOKMARK (BUT PROBABLY DIDN’T THINK TO)

Now that you have all of this extra space on your bookmarks bar, why not fill it with something useful! Here are some productivity-boosting bookmarks: 

Chrome Utility Pages: You can bookmark the settings, flags, or extensions page for Google Chrome (for example, chrome://settings)

Creator Accounts: if you are a content creator, bookmark your YouTube creator studio or Facebook page manager, for easy, one-click access. 

Developer Tools: I save a direct link to my Google Cloudshell instance for quick access. 

Google Drive Files: I bookmark links to important Google Drive files for one-click access to my budget, content calendar, sales forms, etc. 

Windows 10’s next update will make you fall in love with your PC all over again

Microsoft has released a new Insider preview build for Windows 10, known as 19592, that introduces a “new tablet experience” for the operating system that should make using 2-in-1 PCs more intuitive than ever before. If you’ve ever used a convertible PC, you’ll know that using the full-fat version of Windows 10 with only touch input or stylus isn’t exactly intuitive.

In fact, that’s probably why Apple has refrained from putting macOS on its iPad series of tablets – the hardware’s form factor lends itself to customised software.

It seems Microsoft knows this and is making changes to Windows 10 accordingly. The new tablet experience makes a number of alterations to the hugely popular operating system, three of the most substantial being:

• Taskbar icons are spaced out

• Search box on taskbar is collapsed into icon-only mode

• Touch keyboard auto invokes when you tap a text field

The new tablet experience was initially rolled out to Insiders in an earlier Windows 10 preview build – 18970. However, Microsoft said it removed the feature in build 19013 and is reintroducing it after making some tweaks.

Announcing the feature, the Redmond tech giant said: “We are beginning to roll out (again) the new tablet experience for 2-in-1 convertible PCs as a preview to some Windows Insiders in the Fast ring.

“This experience is separate from the Tablet Mode experience that you will still be able to use. This new experience allows users entering tablet posture to stay in the familiar desktop experience without interruption with a few key touch improvements.”

Windows 10 has had a dedicated Tablet Mode for some time. However, this dramatically changes the look and feel of the software, placing app tiles front and centre.

The new tablet experience is designed for users who still want to use Windows 10 in its original form, but introduces a number of nifty tweaks to make using it more natural.

Microsoft has said the latest addition will arrive in a “future Windows 10 update”, although it’s unclear if this will be the forthcoming 20H1 or something coming later.

LineageOS ROM releases first builds based on Android 10 (Updated)

LineageOS is the most popular custom ROM in existence, and the project prides itself on bringing newer versions of Android to unsupported devices. However, Lineage has been a bit slow to roll out a version based on Android 10 ⁠— the Pie-based ROM was already available by this time last year. Thankfully, the next major version of LineageOS seems to be just around the corner.

Even though unofficial Lineage ROMs based on Android 10 have been available for months, the project has yet to release an official build for any of its supported devices. That could change soon, as the project’s build server has been updated to list all the phones that will receive nightly builds of LineageOS 17.1 (based on Android 10).

The list currently contains the Fairphone 2, the original Moto Z, the much-loved HTC One (M8), and a shit-ton of Samsung Galaxy S4 and LG G2/G3/G5/V20 variants. Some of the models below are completely new to LineageOS, while the rest are already on LineageOS 16 (Pie).

Fairphone FP2 (FP2)

Motorola Moto Z (griffin)

HTC One (M8) (m8)

HTC One (M8) Dual SIM (m8d)

Samsung Galaxy S4 Active (jactivelte)

Samsung Galaxy S4 (SGH-I337) (jflteatt)

Samsung Galaxy S4 (SCH-R970, SPH-L720) (jfltespr)

Samsung Galaxy S4 (SCH-I545) (jfltevzw)

Samsung Galaxy S4 (GT-I9505/G, SGH-I337M, SGH-M919) (jfltexx)

Samsung Galaxy S4 Value Edition (GT-I9515/L) (jfvelte)

LG V20 (AT&T) (h910)

LG V20 (T-Mobile) (h918)

LG V20 (International) (h990)

LG V20 (Sprint) (ls997)

LG G5 (Unlocked US) (rs988)

LG V20 (US Unlocked) (us996)

LG V20 (Verizon) (vs995)

LG G2 (AT&T) (d800)

LG G2 (T-Mobile) (d801)

LG G2 (International) (d802)

LG G2 (Canadian) (d803)

LG G3 (AT&T) (d850)

LG G3 (T-Mobile) (d851)

LG G3 (Canada) (d852)

LG G3 (International) (d855)

LG G3 (Korea) (f400)

LG G3 (Verizon) (vs985)

Interestingly, builds of LineageOS 16 (Pie) have already been disabled for the above-mentioned phones, so they won’t receive any more updates until the Android 10 ROM is available. Lineage typically maintains both branches for at least some devices after a new major OS update is released (e.g. the Nexus 6 had Oreo and Pie ROMs available until earlier this year), but that’s seemingly not the case here.

A Real-World Review of the Canon 1D X Mark III

As many of you know, I have been lucky enough to have a Canon EOS 1D X Mark III in my possession for more than a month now. People have been asking me to review this new top-of-the-line camera, but I really wanted to put it through its paces in order to do a fair review.

There are lots of photographers or tech reviewers who write reviews of a new product, basically looking at the spec sheets, or holding it in their hands for a couple of minutes. But in my mind, there is no better way to review a product than to use it as my primary camera for a while and really get to know it in detail.

Now that I have become pretty familiar with the ins and outs of this camera, it is time to share my findings with all of you.

So… on to the testing.

I took the camera out of the box and was happy to see that the body is very similar to the previous models, with buttons and joysticks right where I expect them. I was also happy to see a familiar battery and charger that is basically the same as the previous model.

The one big difference is that the new camera has two CFExpress card slots, which as many of you know, I was really hoping for. I like this for two reasons:

1. I like having the two extremely fast cards instead of one fast card and one legacy card format which slows everything down. This is really important because I always shoot RAW images to both cards for redundancy.

2. I like having 2 card slots using the same card format. I always found it frustrating to have a CFast slot and a CompactFlash slot in the same camera.

The first photos taken with the Canon 1D X Mark III were taken in my backyard. I like to use a new camera for non-client shoots for a while to build trust and familiarity with the camera and memory cards. The last thing I would do is use this camera on a paying job before I knew how to control it. I need to know that the images will be captured correctly in the camera and stored correctly on the memory cards before using it in a real-world situation.

This was also a time for me to try out the new CFExpress cards from ProGrade Digital. I had inserted a 512GB card in slot 1 and a 1TB card in slot 2, so capacity was not a problem!

The first couple of photos were of my dog, Cooper, who was nice enough to pose for me. It was my first time holding the camera and trying out the new smart controller for moving the focus point (more on that in a little bit). No fast action here, but it gave me a chance to inspect the image quality of the camera, which looked really great.

We were dog-sitting for a friend and our dog Cooper decided to play with Milo and give me some action shots. This was the first time trying the fast burst shooting of 16fps. The first thing I noticed with the 1D X Mark III was that it felt totally familiar in my hands.

Having used a 1D X and a 1D X Mark II in the past, I felt right at home shooting with the new body. The one big difference is that the new model has a touch screen LCD. I have gotten used to this on my Canon 5D Mark IV and find it very useful when shooting in the field.

Shooting at the fast burst rate enabled me to catch this shot of Cooper with all four paws off the ground. (Cooper forgets that he is 8 years old and still thinks he is a puppy).

This was my first chance to play around with the new smart controller. What is the smart controller? Canon took the back button focus button and added a new twist. This button now acts as a virtual joystick, so that if I move my thumb along the back of the button, the focus point will move accordingly. This can be incredibly handy, but also takes some getting used to.

There were a couple of times when I pushed the back button to focus and inadvertently moved the focus point to a location I did not want. But, with time, I have gotten used to this and have come to really appreciate the feature. What I have found is that the smart controller works best when shooting portraits, but I still prefer a locked single point of focus for sports.

My last trip, before all this COVID-19 craziness, was to Las Vegas for the WPPI show. I was not planning on bringing my 1D X Mark III to Las Vegas, but right before leaving, I had the offer to meet up with my buddy Drew, Canon USA’s top tech guy, who offered to help me customize the settings to get the most out of the new features of the camera. That turned out to be awesome, and I will tell you more about that in a minute.

While at the show, there was a rain booth set up for people to photograph models dancing in water. I saw this as a perfect time to try out these new settings.

I used the new 1D X Mark III at its full speed of 16 frames per second, with a Canon 24-105mm lens to capture the dancers. The super-fast frame rate of the camera allowed me to capture them at the peak of action.

The newer focus system also did a very good job of locking in on the dancers as they moved around at a fairly quick pace.

As I mentioned, Drew sat down with me to give me pointers on the new camera. And there is a lot to learn on this new piece of hardware. The Canon 1D X Mark III looks a lot like the Canon 1D X Mark II, but looks can be deceiving. What is under the magnesium alloy body is very different from the previous model.

One of the biggest differences of the 1D X Mark III is the new face and head detection. I was shown how to tweak the camera to take advantage of the face and head detection covering most of the frame. This means that once I locked in on a person, it would follow them even if they moved off-center from the lens.

I got credentials to shoot the San Jose Earthquakes game and put the camera to a test. I mounted the Canon 200-400mm lens to the 1D X Mark III and found the focusing system to be noticeably faster and more accurate than the 1D X Mark II.

I would lock focus on a particular athlete and then let the camera follow them from that point. As long as I kept the athlete in the frame, the tracking stayed on them, even if someone briefly ran in between them and me. This allowed me to capture images like this, where the Earthquakes player is in perfect focus even though he is not in the center of the image.

The camera is capable of shooting 16 frames per second (fps) when using the shutter and 20 fps when in live view mode. This is great, except that I cannot imagine shooting a sporting event in live view and trying to follow fast action using the screen on the back of the camera. But, needless to say, 16 fps is plenty fast and allowed me to easily capture the peak of action during the game.

Even though I was shooting in RAW mode using the ProGrade Digital CFExpress memory cards, I never once filled the buffer of the camera. These cards can transfer 1,600MB/s, which is nothing short of amazing.
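As a rough sanity check (a back-of-the-envelope sketch, assuming the roughly 26MB full-resolution RAW file size I mention later in this review), the sustained data rate of a 16 fps RAW burst sits well below the card’s rated transfer speed:

```python
# Back-of-the-envelope check: sustained data rate of a 16 fps RAW burst.
# Assumes ~25.8MB per RAW frame (the size quoted later in this review);
# real file sizes vary with ISO and scene detail.
raw_frame_mb = 25.8      # MB per full-resolution RAW frame
burst_fps = 16           # mechanical-shutter burst rate
card_rate_mb_s = 1600    # quoted CFExpress transfer rate, MB/s

sustained = raw_frame_mb * burst_fps
print(f"{sustained:.0f} MB/s sustained, "
      f"{sustained / card_rate_mb_s:.0%} of the card's rated speed")
# -> about 413 MB/s, roughly a quarter of the rated speed, which helps
#    explain why the buffer never fills even with a copy going to each card.
```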

I kept the camera in Auto White Balance for the entire afternoon and found it to be very accurate in the representation of the colors.

Note: Some of you may be wondering about the video capabilities of the new camera. But since I am primarily a still photographer, I will leave the video review to the experts who know that side of the business way more than I do.

My next test of the camera was in a completely different environment. My niece and her husband asked if we could take portraits of their one-year-old son. This time I was using the camera at higher ISOs indoors and going outside with different lenses.

Patrick did not move at the speed of a soccer player, but he definitely moved faster than a year ago, when I took his baby photos. Once again, the 1D X Mark III (this time combined with the Canon 70-200mm f/2.8 lens) was tack sharp on his eyes.

After taking a bunch of portraits of the little guy on the grass and standing, they asked if I could get some photos of him in the swing. As soon as I started photographing him, I realized that this was a perfect test of the new focus system.

The following images really help tell the story of this new face and head tracking.

Using back button focus, I locked focus on Patrick and then hammered the shutter at the full speed of 16 fps. Even though his head was moving off-center of the frame, the focus stayed perfectly on him. You can scroll through the following images to see how accurate this was!

I figured that the black swing would interfere with the focusing of his face, but that was not the case.

This sequence is a perfect example of how I set up the shot. In this image (above) I locked focus on Patrick when he was dead center and the focus point was right on his face.

Then, as he was going back and forth, I just held down the back button and the focus points moved with him.

You can see here that his face is well off the center of the image, but the focus is still perfect. If I were to try this with the previous Canon models, I would have had to move the camera and lens to keep the focus point on his face. This would have been very difficult to do and would have yielded far fewer usable images.

The Canon 1D X Mark III has a newly designed 20.1-megapixel CMOS sensor which is ample for most of my photography. Do I wish for a little more resolution? Maybe. I do like the file sizes of the Canon 5D Mark IV which captures at 30.4MP, but having clean images at higher ISOs is still the most important thing to me. And I know that cramming more megapixels onto a sensor can degrade the high ISO sensitivity.

A couple of weeks ago, I was doing a portrait shoot for a young lady who was about to have her bat mitzvah. Well…until it was postponed due to the Covid-19 virus outbreak. For this shoot, I used the Canon 1D X Mark III with a Canon 600EX-RT flash mounted on the hot shoe of the camera.

Canon has designed a new low pass filter for better lens sharpness, and the image quality of the camera is exceptional, with the colors, skin tones and clarity being everything I was expecting from a pro camera. I don’t fully understand how the new DIGIC X image processor works, but I can tell you that everything in this camera is fast, from focusing speed to image processing to data transfer to the card.

There was one anomaly though. When I take portraits, I almost always do so in a slow burst mode. There is no need to shoot at 16 fps, and yet I never have my cameras set to a single-shot mode. I don’t like the single-shot mode since I always want to be prepared to shoot multiple images if a perfect moment arises.

With every other Canon DSLR I have used, the slow burst mode is a predictable sequence of shots. I hit the button and I get “click….click….click”. Weirdly enough, when I had my flash on the camera and I was shooting outdoors, the frame rate was a bit erratic. I expected “click…click…click” at a predictable pace and instead I got “click..click…click.click.click” or “click…click.click…..click”. I am hoping that this is something that Canon will fix in a future firmware update.

After using the new camera for numerous shoots, I felt comfortable using it to create images at a client’s bar mitzvah. For their portraits, I loved using the smart controller to easily move the focus point out of the center and take full advantage of the 191 focus points.

While spending time with Canon in Las Vegas, I was also shown how to use the 1D X Mark III in mirrorless mode. Since the mirror is locked up and out of the way, this allows me to shoot with absolutely no shutter noise at all. Combining this silent mode with the face tracking autofocus is a real game-changer.

For this bar mitzvah, I was using the Canon 200-400mm lens on the 1D X Mark III, mounted on a Gitzo gimbal fluid head and tripod. It was awesome to lock focus on the boy’s face and let the camera track his movements while I silently took photos.

While shooting this way, I came across another weird anomaly. As I mentioned earlier, I like to shoot in a slow burst most of the time. When taking these photos, I had the camera in Live View mode (essentially shooting mirrorless) and also had the camera set to slow burst. But when I hit the shutter release I saw that the camera was capturing at the fastest burst rate of 20 fps.

This is complete overkill for an event like this. I sent a text to the Canon expert from the back of the Temple, and he replied that when in Live View, the camera will capture either a single shot or full speed. There is currently no in-between. This is something else that I hope is changed in a future firmware release.

When I photograph events, it is quite common for me to shoot full RAW for the service and then switch to a smaller file size for the party. In the past, that meant that I would switch my files from RAW to MRAW. On the Canon 5D Mark IV, that meant that I was switching from a file size of 30MP to 17MP and a resolution of 6720×4480 down to 5040×3360.

So you can imagine my surprise when I got to the party and went to change the 1D X Mark III to MRAW and it wasn’t there. All I saw was RAW and something called CRAW, but both were listed at the same resolution of 5472×3648. It was time for another text message to my Canon contact asking for urgent help.

He explained to me that MRAW has been replaced with CRAW (in the new CR3 files) and that even though they are the same resolution, the CRAW file is more compressed. I recently tested this and found that an image taken in RAW was 25.8MB and the same exact image at CRAW was 14.3MB in size. When zooming in at 400%, I could see how the increased compression decreased the quality a bit, but it was only a slight difference.

I love the idea of keeping the same resolution with higher compression, rather than dropping to a smaller resolution.

There are certain key moments during a bar mitzvah celebration, and the family members being lifted in the chair is one of them. For the last 6 years, I have relied on the Canon 1D cameras to capture this moment. Why? Because the focus system is more accurate than the Canon 5D and the camera can write to two cards faster than the less expensive cameras.

The Canon 1D X Mark III definitely proved that it could lock focus even in low light, and wrote to the two CFExpress cards faster than my flash units could keep up.

With all of this said, there are still features of the Canon 1D X Mark III that I have yet to explore, and I look forward to doing so in the near future. As many of you now know, the Summer Olympics in Tokyo have been postponed. This postponement is a major disappointment for the organizers, the athletes, the public, and me. I was so excited to use this new camera at the Games, but I guess I will have to wait a while longer before I get that chance.

Looking on the positive side, it gives me that much more time to get familiar with the new camera before the big event.

8 Chrome extensions that help you stay productive on your Chromebook

A lot of people are starting to get accustomed to remote working, considering that many countries have mandated shutdowns and quarantines to battle the novel coronavirus. While working from home may be comfortable and fun at first, you need to have the right tools at your disposal to be efficient and stay productive. If you have a Chromebook, the best way to tailor your working experience to your needs is via browser extensions, and we’ve got a collection of eight great tools for you.

Text Blaze

Text Blaze is a tool that can save you a ton of time when you need to type the same words over and over. It does so by replacing user-defined snippets with any text you could imagine, completely customizable. For example, you can create a “/sig” snippet to add your email signature, and “/letter” could be expanded to a template for a form letter. It even helps you quickly write ASCII emoji, such as “/shrug” for ¯\_(ツ)_/¯. It’s also possible to automatically add values from your clipboard or the current date, time, and more. An expanded text can even contain custom input fields, so you know which placeholders to remove before publishing some text or sending an email.
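To give a rough idea of how this kind of tool works, here is a minimal, hypothetical sketch of trigger-based expansion. It is not Text Blaze’s actual implementation, and the snippet names simply reuse the examples above:

```python
from datetime import date

# Hypothetical sketch of trigger-based text expansion in the spirit of
# Text Blaze. The real extension adds forms, placeholders and clipboard
# values; this only shows the core idea of swapping triggers for text.
SNIPPETS = {
    "/sig": "Best regards,\nJane Doe",              # made-up signature
    "/shrug": r"¯\_(ツ)_/¯",
    "/date": lambda: date.today().isoformat(),      # dynamic value
}

def expand(text: str) -> str:
    """Replace every known trigger in the text with its expansion."""
    for trigger, replacement in SNIPPETS.items():
        value = replacement() if callable(replacement) else replacement
        text = text.replace(trigger, value)
    return text

print(expand("Sent on /date. /shrug"))
```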

The extension is free to use up to a certain number of snippets and folders; to unlock more, you need to subscribe.

Get the extension on the Chrome web store: Text Blaze

Clipboard History Pro

We’ve yet to find the perfect clipboard manager for Chrome OS, but Clipboard History Pro comes closest at the moment. It automatically saves every string you copy and makes it available for later pasting. To quickly access saved snippets without using your trackpad or mouse, type chrome://extensions/shortcuts into your address bar and create a keyboard shortcut for the extension once you’ve installed it. That makes it pretty easy to use, but it’s still a little harder to find and paste older snippets compared to clipboard managers such as Alfred on macOS.

Basic functionality is free of charge, but if you want to automatically sync your clipboard across different devices, you need to subscribe.

Get the extension on the Chrome web store: Clipboard History Pro

Peek

The open-source extension lets you preview all kinds of files on the web before you download them. This is particularly useful if you need to research lots of PDFs, as you can quickly view these by hovering over the download link right in Google Search thanks to the add-on. It’s not limited to PDF, either. Peek also supports TXT, RTF, Word and Excel documents, PowerPoints, WebM, GIFV, MP4, and OGG/OGV videos, MP3s and WAVs, and most common web image files. It’s developed by our own Corbin Davenport and could’ve saved me a lot of time and headaches back when I was a student.

Get the extension on the Chrome web store: Peek

Save to Pocket

Save to Pocket isn’t a productivity tool per se, but if you find interesting articles while researching or mindlessly surfing around the web, the extension allows you to save them for later consumption. Posts will be added to Pocket, a Mozilla-owned reading service available as a web app or on Android and iOS. That way, you can keep long and potentially distracting content out of your working hours and read it when you have time. Pocket is free to use, but some more advanced features like unlimited highlights and automatic tags are tucked away behind a subscription.

Get the extension on the Chrome web store: Save to Pocket

The Great Suspender

Since many entry-level Chromebooks come with only 4GB of RAM, you might fill up all that space quickly when working with many open tabs. If closing all of those websites to alleviate memory pressure isn’t an option, an extension like The Great Suspender could do the job. It automatically kicks tabs out of memory when you haven’t touched them for an adjustable amount of time. The extension won’t shut down sites playing audio or video, and you can also whitelist specific domains like Slack. Similarly, websites that contain filled-out text fields won’t be suspended, either.

Get the extension on the Chrome web store: The Great Suspender

WasteNoTime

WasteNoTime helps you manage how much time you spend on distracting websites and can be used to completely block social media, YouTube, and any other address you can think of. As a less draconian measure, you can also set a maximum amount of time per day that you want to spend on specific websites and have them blocked once the limit is reached. A detailed report lets you see if you missed any other time sinks. It’s also possible to set up different limits during working hours and leisure time, which is perfect if you don’t want to manually turn off the blocker every time you want to scroll through Twitter at night.
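The bookkeeping behind such a limit is straightforward. Here is a minimal, hypothetical sketch of a per-site daily time budget; the domains and limits are made up, and this is not WasteNoTime’s actual code:

```python
from collections import defaultdict

# Hypothetical sketch of per-site daily time budgets, in the spirit of
# WasteNoTime: accumulate the seconds spent on each domain and block it
# once the configured daily limit is reached.
DAILY_LIMITS_S = {"twitter.com": 30 * 60, "youtube.com": 20 * 60}
usage_today_s = defaultdict(float)   # seconds spent per domain today

def record_visit(domain: str, seconds: float) -> None:
    usage_today_s[domain] += seconds

def is_blocked(domain: str) -> bool:
    limit = DAILY_LIMITS_S.get(domain)
    return limit is not None and usage_today_s[domain] >= limit

record_visit("twitter.com", 29 * 60)
print(is_blocked("twitter.com"))     # False: one minute of budget left
record_visit("twitter.com", 2 * 60)
print(is_blocked("twitter.com"))     # True: over the 30-minute limit
```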

Get the extension on the Chrome web store: WasteNoTime

News Feed Eradicator

While some people have no issues with blocking social media altogether while working from home, others rely on Facebook and similar platforms for their job. If you need some Facebook features but tend to get sucked in by the network’s newsfeed, the News Feed Eradicator extension might be for you. It allows you to access Facebook, but it replaces the newsfeed with random inspirational quotes — you can even add your own.

Get the extension on the Chrome web store: News Feed Eradicator

Toggl Button

Toggl is a time-tracking tool you can use to keep tabs on how much time you spend on which projects. The Toggl Button extension adds a simple one-click way to start tracking the time you work on a project, but you can also use it as a Pomodoro timer, which you can activate in the extension’s settings. Pomodoro is a technique that has you fully focused on a task for 25 minutes, followed by a 5-minute break. Thus, the extension lets you see how much time you spend on a project and provides you with predefined breaks, which could help you build some routine.
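The Pomodoro rhythm itself is simple enough to reproduce. Here is a minimal command-line sketch, purely illustrative and unrelated to Toggl Button’s code, that alternates 25-minute focus blocks with 5-minute breaks:

```python
import time

# Illustrative command-line Pomodoro timer: 25 minutes of focus followed by
# a 5-minute break, repeated. Not related to Toggl Button's implementation.
FOCUS_MIN, BREAK_MIN = 25, 5

def countdown(minutes: int, label: str) -> None:
    print(f"{label}: {minutes} minutes, starting now.")
    time.sleep(minutes * 60)   # a real tool would update a visible timer here
    print(f"{label} finished.")

def pomodoro(cycles: int = 4) -> None:
    for i in range(1, cycles + 1):
        countdown(FOCUS_MIN, f"Focus block {i}")
        countdown(BREAK_MIN, "Break")

if __name__ == "__main__":
    pomodoro()
```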

If you don’t want to dig into Toggl Button’s settings to make it work that way, you can also check out Focus To-Do as a similar alternative that functions as a Pomodoro timer right out of the box.

Get the extension on the Chrome web store: Toggl Button

Alternative: Focus To-Do

All of these extensions also work for Google Chrome and other Chromium-based browsers, like Microsoft Edge. Some are even available as standalone apps for Windows and macOS if you’d rather go that route on a device other than a Chromebook. If you think we missed any other great Chromebook productivity extensions, go ahead and share them in the comments.

The groundwork is set for Apple ‘Pro’ ARM Mac chips

ARM chips power most of the world’s smartphones and tablets, but there are high-performing ARM chips in data centers now, too. Apple may not need to wait long, if at all, for speed in a high-end ARM Mac.

For most desktop, laptop or datacenter applications, Intel’s x86 chips have long been the industry standard. But amid rumors of Apple’s switch to first-party ARM silicon for its Macs, there are a handful of manufacturers pushing high-performance ARM chips to the market.

Here’s why that’s significant, and how Apple’s abandonment of Intel could kickstart a broader switch.

Differences between ARM and x86 chips

ARM chips are much more power-efficient than Intel x86 chips, and generally offer better performance-per-watt. That’s due to a variety of reasons, including a simpler instruction set, the use of fewer transistors and overall slower clock speeds.

The power efficiency of ARM, along with other factors like a low cost of production and development, has largely led ARM chips to become the industry standard for most mobile devices like smartphones, tablets, and lightweight PCs such as select Google Chromebooks.

But when it comes to actual high-performance chips, particularly for use in desktop computers or laptops, the assumption has long been that Intel’s x86 is the natural choice. That’s been the case for a long time, and Intel’s steadily increasing dominance could even be seen in Apple’s switch from PowerPC to x86 in 2005.

The same is true for data centers and servers. While minor exceptions do exist, the vast majority of the web’s architecture is still based on Intel’s chip design instructions. Intel is still the leader in terms of market share (nearly to a monopolistic degree), with AMD x86-based chips making up the lion’s share of the scraps.

There have been talks of a broader shift to ARM-based servers since the early 2010s. Nearly a decade later, ARM chips are used in some server applications, but their overall market share pales in comparison to that of their x86 competitors.

But as the lines blur between mobile devices and laptops, and datacenter operators increasingly look at more efficient and cheaper server options, 2020 is the year when things could start to change.

The current state of ARM processors

One interesting thing about the current ARM chip industry is that the majority of ARM processors in use are based on custom chip designs.

Apple’s A-series system-on-chips (SoCs) are a prime example, especially because they show off the potential power of ARM processors. Apple currently makes the fastest smartphone chips on the market, which allows its iPhones to keep up with or beat rival Android devices that appear to have better specifications on paper. To get this done, Apple designs its chips in-house and only relies on the larger ARM ecosystem for processor instructions.

The latest A13 Bionic iPhone chip, for example, approached the performance of some desktop CPUs. And beyond Apple, companies like Qualcomm, MediaTek and other manufacturers also take the ARM chip design instructions and apply them to their own custom silicon.

While ARM chips aren’t widely used in servers or datacenters, many third-party manufacturers have been designing their own ARM-based chips aimed specifically at that market. These server processors aren’t a straight drop-in to existing desktop hardware, but given Apple’s use of Intel Xeon processors, it isn’t that far off.

In March 2020, Ampere debuted an ARM-based 80-core server processor called the Altra, which the firm projects will offer 2.11 times better power efficiency and up to 2.23 times better raw performance than an Intel Xeon Platinum 8280.

For comparison’s sake, the Xeon Platinum 8280 is a Cascade Lake chip released in the second quarter of 2019 that sports 28 cores, a 2.7GHz base frequency and a 205W thermal rating. Ampere even goes so far as to say that, in a rack setting, Altra processors can offer up to 120% better raw performance than the 8280, Ars Technica reported.

Back in 2018, Amazon announced its first ARM-based server chip, the Graviton. Though that server chip didn’t appear to make a lasting impact on the market, Amazon announced a new chip in March 2020 called the Graviton 2, which the company says offers better price-performance than AMD and Intel in many server workloads.

Amazon says users of its Elastic Compute Cloud, or EC2, web service can expect 40% better price-performance. For cloud-based companies or software-as-a-service firms, that could offer serious competitive advantages, since they would be spending less money on comparable service speeds.

Marvell, the volume leader in ARM server chips, also unveiled a new ThunderX3 “Triton” chipset, a piece of 240W silicon with 96 cores. According to Marvell, the ThunderX3 consistently offered better performance than Intel’s 2019 Cascade Lake-SP chips across several cloud-based workloads, such as MySQL and CDN serving.

While third-party ARM chips are out there, ARM Holdings, the company that designs the ARM chip instructions, is also getting into the market itself. Perhaps most interesting is the company’s laptop chips, which have long lagged behind Intel but appear to be catching up.

The Cortex-A76, first implemented in hardware in 2019, is a laptop-class chip that offers roughly the same performance as an Intel Core i5-7300, ARM chief architect Mike Filippo told CNET. That isn’t exactly stunning performance, but it suggests that ARM is looking to catch up to Intel and beat it in its own market.

ARM Holdings’ first-party server architecture has been making a splash in the arena, too. In 2018, the company announced a revamp of its server architecture dubbed Neoverse N1. (Which, it’s worth noting, is the architecture that most of the aforementioned chips, like the Altra and Graviton 2, are based on.)

Since then, ARM Holdings has released some very ambitious server chips, such as 2019’s Ares design.

The significance of high-performance ARM chips

It’s no secret that Intel’s pace of performance upgrades has been slowing down, and the company is suffering from other problems, too. The issue is that the company essentially argues that the performance ceiling has largely been hit and it’s already taking full advantage of what’s possible with modern-day computers.

But ARM is offering a different take with its chips, promising in 2018 that each generation of its future chips would be at least 30% faster than past generations. That’s far beyond what Intel is promising with its current chip design.

That’s just the first-party silicon designed by ARM itself. That doesn’t address the third-party custom chips made by Apple and other companies, which have largely proven to be massive market successes in tablets, smartphones and IoT devices.

To put all of this plainly, it’s looking like ARM and other chipmakers are investing serious resources into creating chips that can compete with x86 processors, not just in low-end consumer use cases but also at the desktop and server level, leaning more towards high-performance computing.

As for the impact on users, there are some big ones. As mentioned before, ARM chips are generally more power-efficient. With their speeds catching up, that could lead to devices with much longer battery life that still pack a performance punch. They also produce less heat than Intel chips and, when combined with computer or server cooling mechanisms, could suffer less from performance throttling.

The cost of manufacturing and ease of deployment may also play major roles, perhaps bringing down the cost of computers or other electronics over the long-term at the consumer level.

And high-performance ARM chips are coming at a time when Intel is floundering. The company has made various broken promises on performance upgrades and missed deadlines, not to mention the fact that chip-level Intel vulnerabilities have made a lot of devices less secure.

ARM hasn’t had those problems, historically. While that’s largely attributable to market share, it could mean that many computer manufacturers will be given a fresh start.

The future of ARM chips

ARM has largely taken a backseat to Intel’s x86 when it comes to the devices, like servers and workstations, that power our world’s work. But there are some undercurrents in the industry that could suggest the tides are shifting.

Apple’s expected ARM MacBooks will likely be a tipping point, depending on what kind of performance Apple’s custom Mac chips will offer. But Project Catalyst could also be a major factor for a broader shift to ARM, since it’s encouraging popular app developers to seriously consider supporting the ARM architecture.

Microsoft, Apple’s chief competitor in the laptop space, also debuted a new line of Surface devices in 2019 that sport what the company calls the first 3GHz ARM-based chip, the SQ1.

The ongoing COVID-19 outbreak could also spur a shift to ARM in the server and datacenter spheres since major companies are going to seriously consider their performance-per-dollar metrics on critical hardware going forward. ARM server chips are cheaper than Intel ones, and as we’ve covered, their performance-per-watt tends to be better.

There are other factors, too. The needs of data centers everywhere are rapidly evolving. Because Intel is on the verge of becoming a monopoly in the server space, switching to ARM servers is a good way for data center operators to source processors from a wider range of suppliers.

Of course, it will undoubtedly take some time for ARM to become as firmly entrenched in our day-to-day Macs as Intel. But there are now signs on the horizon that it’s becoming a serious possibility, rather than a far-fetched prediction.

Apple and an ARM Mac

Apple probably isn’t going to take an A-series processor and drop it straight into a Mac. A custom chip, tailored for Mac use like other ARM chips are tailored for the server market, is the most likely scenario. It doesn’t have to wait for high-end chips, but it probably will in the interest of a smooth transition for that market.

The conventional wisdom is that Apple will start at the low end, like the MacBook and Mac mini. Specifically, with laptops, ARM-based chips are particularly well-suited to deliver excellent battery life with no compromises to performance. After some period of time that isn’t clear yet, it will move the chips up to the “Pro” level hardware, probably when the “Pro” user base demands it.

Right now, today, Apple could deliver high-end performance with workstation chips similar to the Altra and ThunderX3 in the same thermal and power envelopes that the iMac Pro and Mac Pro use now with the Xeon processor. The software would have to follow, which is why that transition probably won’t be day-and-date with the lower-end models.

Looking at the broader picture, Apple would obviously benefit from having tighter control over its entire stack, a goal that it has already attained on the iPhone and iPad. And consumers will probably enjoy Mac performance gains akin to the iPhone’s outpacing of most rival devices, not to mention significant battery life improvements and a potential reduction in cost for Apple.

The transition may not be the smoothest, particularly for developers or users reliant on non-updated plugins and software. But taking developments like Project Catalyst into account, it does seem inevitable.

Google resumes Chrome updates with Chrome 81 coming the week of April 7

Last week, Google announced that it’s temporarily pausing updates for its Chrome browser and Chrome OS. Due to the COVID-19 coronavirus, many people are working from home and relying on their browser more for day-to-day work, so the company wanted to focus on stability and security.

Today, the firm announced that it’s resuming updates, but on a new schedule. Here’s how it’s going to work. In the stable channel, Chrome 81 will arrive the week of April 7 (two weeks from now), and Chrome 83 will arrive in mid-May, which is actually earlier than originally planned. Chrome 82 is canceled completely.

Canary, Dev, and Beta channel updates are all arriving this week. The Beta channel is going to be bumped up to Chrome 81, while Canary and Dev will both get Chrome 83. Google says that it will provide timing for Chrome 84 in a future update.

Presumably, these changes will be reflected across the board with other Chromium-based browsers. That includes Microsoft’s Edge, Brave, Vivaldi, and more. You can expect to see announcements from those companies soon.

FreeSync vs. G-Sync 2020: Which Variable Refresh Tech Is Best Today?

For the past seven years or so, the best gaming monitors have enjoyed something of a renaissance. Before Adaptive-Sync technology appeared in the form of Nvidia G-Sync and AMD FreeSync, the only thing performance-seeking gamers could hope for was a refresh rate above 60 Hz. Today, not only do we have monitors routinely operating at 144 Hz and some going even further, Nvidia and AMD have both been updating their respective technologies. In this new age of gaming displays, which Adaptive-Sync technology reigns supreme, G-Sync or FreeSync? 

For the uninitiated, Adaptive-Sync means that the monitor’s refresh cycle is synced with the rate at which the connected PC’s graphics card renders each frame of video, even if that rate changes. Games render each frame individually, and the rate can vary widely depending on the processing power of your PC’s graphics card. When a monitor’s refresh rate is fixed, it’s possible for the monitor to begin drawing a new frame before the current one has completed rendering, which shows up as a visible tear. G-Sync, which works with Nvidia-based GPUs, and FreeSync, which works with AMD cards, solve that problem: the monitor draws every frame completely before the video card sends the next one, thereby eliminating any tearing artifacts.
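As a concrete illustration (the frame times below are invented, and this is not either vendor’s implementation), the arithmetic shows how Adaptive-Sync maps each frame’s render time directly onto a refresh interval, while a fixed 60 Hz display must either tear or wait for its next scanout:

```python
# Illustrative only: how variable frame render times map onto refresh
# behaviour. With Adaptive-Sync the panel refreshes the moment a frame is
# ready; with a fixed 60 Hz refresh, a frame either tears (V-Sync off) or
# waits for the next 16.7 ms scanout boundary (V-Sync on), adding judder.
FIXED_HZ = 60
frame_times_ms = [14.0, 22.0, 18.0, 31.0, 16.0]   # hypothetical render times

for ms in frame_times_ms:
    adaptive_hz = 1000.0 / ms                      # refresh matches the frame
    vsync_wait_ms = (-ms) % (1000.0 / FIXED_HZ)    # delay until next fixed refresh
    print(f"{ms:5.1f} ms frame -> Adaptive-Sync refreshes at {adaptive_hz:5.1f} Hz; "
          f"V-Sync would add {vsync_wait_ms:4.1f} ms of waiting")
```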

Today, you’ll find dozens of monitors, even non-gaming ones, boasting some flavor of G-Sync, FreeSync or even both. If you haven’t committed to a graphics card technology yet or have the option to use either, you might be wondering which one is best. And if you have the option of using either, will one offer a greater gaming advantage over the other?

Fundamentally, G-Sync and FreeSync are the same. They both sync the monitor to the graphics card and let that component control the refresh rate on a continuously variable basis. 

Can the user see a difference between the two? In our experience, there is no visual difference when frame rates are the same.

We did a blind test in 2015 and found that when all other parameters are equal between two monitors, G-Sync had a slight edge over the still-new-at-the-time FreeSync. But a lot has happened since then. Our monitor reviews have highlighted a few things that can add or subtract from the gaming experience that have little to nothing to do with refresh rates and Adaptive-Sync technologies.

The HDR quality is also subjective at this time, although G-Sync Ultimate claims better HDR due to its dynamic tone mapping. 

It then comes down to the feature set of the rival technologies. What does all this mean? Let’s take a look.

G-Sync Features

G-Sync monitors typically carry a price premium because they contain the extra hardware needed to support Nvidia’s version of adaptive refresh. When G-Sync was new (Nvidia introduced it in 2013), it would cost you about $200 extra to purchase the G-Sync version of a display, all other features and specs being the same. Today, the gap is closer to $100.

There are a few guarantees you get with G-Sync monitors that aren’t always available in their FreeSync counterparts. One is blur-reduction (ULMB) in the form of a backlight strobe. ULMB is Nvidia’s name for this feature; some FreeSync monitors also have it under a different name, but all G-Sync and G-Sync Ultimate (not G-Sync Compatible) monitors have it. While this works in place of Adaptive-Sync, some users prefer it, perceiving it to have lower input lag. We haven’t been able to substantiate this in testing. Of course, when you run at 100 frames per second (fps) or higher, blur is a non-issue, and input lag is super-low, so you might as well keep things tight with G-Sync engaged. 

G-Sync also guarantees that you will never see a frame tear, even at the lowest refresh rates. Below 30 Hz, G-Sync monitors double the frame renders (thereby doubling the refresh rate) to keep them running in the adaptive refresh range.
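Here’s a rough sketch of that frame-repetition idea, generally known as Low Framerate Compensation. It’s illustrative only, not Nvidia’s actual module logic, and the 30 to 144 Hz range is just an example panel:

```python
# Illustrative sketch of low-framerate compensation: when frames arrive
# slower than the panel's minimum refresh rate, show each frame more than
# once so the effective refresh rate lands back inside the variable range.
PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144   # example variable-refresh range

def effective_refresh(fps: float) -> float:
    """Refresh rate after frame repetition, kept within the panel's range."""
    multiplier = 1
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1                # display the same frame one more time
    return min(fps * multiplier, PANEL_MAX_HZ)

for fps in (24, 18, 12):
    print(f"{fps} fps -> panel refreshes at {effective_refresh(fps):.0f} Hz")
# 24 fps -> 48 Hz (each frame shown twice), 18 -> 36 Hz, 12 -> 36 Hz (x3)
```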

FreeSync Features 

FreeSync has a price advantage over G-Sync because it uses an open standard created by VESA, Adaptive-Sync, which is also part of VESA’s DisplayPort spec.

Any DisplayPort interface version 1.2a or higher can support adaptive refresh rates. While a manufacturer may choose not to implement it, the hardware is there already, hence, there’s no additional production cost for the maker to implement FreeSync. FreeSync can also work with HDMI 1.4. 

Because of its open nature, FreeSync implementation varies widely between monitors. Budget displays get FreeSync at a 60 Hz or greater refresh rate and little else. You won’t get blur reduction, and the lower limit of the Adaptive-Sync range might be just 48 Hz, compared to G-Sync’s 30 Hz.

But FreeSync’s Adaptive-Sync works just as well as any G-Sync monitor’s. Pricier FreeSync monitors add blur reduction and Low Framerate Compensation (LFC) to compete better against their G-Sync counterparts.

G-Sync vs. FreeSync: Which Is Better for HDR? 

To add even more choices to a potentially confusing market, AMD and Nvidia have upped the game with new versions of their Adaptive-Sync technologies. This is justified, rightly so, by some important additions to display tech, namely HDR and extended color. 

On the Nvidia side, a monitor can support G-Sync with HDR and extended color without earning the “Ultimate” certification. Nvidia assigns that moniker to specific monitors that include 1,000 nits peak brightness (which all the currently certified monitors achieve via the desirable full-array local dimming, or FALD, backlight technology). There are many displays that are plain G-Sync (sans Ultimate) with HDR and extended color.

A monitor must support HDR and extended color for it to list FreeSync Premium on its specs sheet. If you’re wondering about FreeSync 2, AMD has supplanted that with FreeSync Premium. Functionally, they are the same. 

Here’s another fact: If you have an HDR monitor that supports FreeSync with HDR, there’s a good chance it will also support G-Sync with HDR (and without HDR too). We’ve reviewed a number of these. We’ll provide a list at the end of this article with links to the reviews. 

And what of FreeSync Premium Pro? It’s the same situation as G-Sync Ultimate in that it doesn’t offer anything new to core Adaptive-Sync tech. It simply means AMD has certified that monitor to provide a premium experience with at least a 120 Hz refresh rate, LFC and HDR. There is no brightness requirement nor is a FALD backlight part of the spec. Many FreeSync panels will qualify for Premium Pro status simply by supporting HDR and extended color. 

Running G-Sync on a FreeSync Monitor 

We’ve covered this subject in multiple monitor reviews and in this article on how to run G-Sync on a FreeSync monitor. It’s pretty simple, actually. 

First, you make sure you have the latest Nvidia drivers installed (anything dated after January of 2019 will work) and hook up a FreeSync monitor. Chances are that the FreeSync monitor will run G-Sync with your Nvidia graphics card — even if Nvidia hasn’t officially certified the monitor as G-Sync Compatible. And if the monitor supports HDR, that will likely work with G-Sync too. 

A visit to Nvidia’s website reveals a list of monitors that have been certified to run G-Sync. Purchasing an official G-Sync Compatible monitor guarantees you’ll be able to use our instructions linked above. 

FreeSync Monitors We’ve Tested That Support G-Sync 

The monitors in the list below were all tested by Tom’s Hardware on systems with both Nvidia and AMD graphics cards, and all of them supported both G-Sync and FreeSync without issue. This includes monitors that Nvidia hasn’t officially certified as being G-Sync Compatible and, therefore, aren’t on Nvidia’s list. Additionally, the ones with HDR and extended color delivered Adaptive-Sync in games that support those features. 

Acer Nitro XV273K

Acer XFA240

Alienware AW5520QF

AOC Agon AG493UCX

Dell S3220DGF

Gigabyte Aorus CV27F

Gigabyte Aorus CV27Q

Gigabyte Aorus FI27Q

Gigabyte Aorus KD25

HP Omen X 25f

Monoprice Zero-G 35

MSI Optix G27C4

MSI Optix MAG271CQR

Pixio PXC273

Razer Raptor 27

Samsung 27-Inch CRG5

ViewSonic Elite XG240R

ViewSonic Elite XG350R-C

Conclusion

So which is better: G-Sync or FreeSync? Well, with the features being so similar there is no reason to select a particular monitor just because it runs one over the other. Since both technologies produce the same result, that contest is a wash at this point.

Instead, those shopping for a PC monitor have to decide which additional features are important to them. How high should the refresh rate be? How much resolution can your graphics card handle? Is high brightness important? Do you want HDR and extended color? If you’re an HDR user, can you afford a FALD backlight?

It’s the combination of these elements that impacts the gaming experience, not simply which adaptive sync technology is in use. Ultimately, the more you spend, the better the gaming monitor you’ll get. These days, when it comes to displays, you do get what you pay for. But you don’t have to pay thousands to get a good, smooth gaming experience.