Rumor: ‘Apple TV 6’ increasing storage to 64/128GB, tvOS adding Kids Mode, Screen Time, more

Alongside the continued expansion of the Apple TV+ streaming service, Apple is also believed to be working on a new Apple TV 6 set-top box. 9to5Mac has reported select details about the upcoming refresh, and a new report today offers more information.

The YouTube channel iUpdate and The Verifier are jointly reporting a few details about the so-called “Apple TV 6” set-top box today and upcoming tvOS updates.

According to the report, Apple will update the storage configuration options for the Apple TV. Currently, the Apple TV is available in 32GB and 64GB options, but today’s report says that will change to 64GB and 128GB. The idea is to make sure users have enough available space to play Apple Arcade titles.

Apple is also reportedly planning to add a new “Kids Mode” to Apple TV. This wouldn’t be unique to the Apple TV 6 set-top box, but rather a new feature for tvOS itself. It would allow Apple TV owners to set up a separate account for their kids, with control over which applications can be used.

Next up, the report says that Screen Time is also coming to tvOS, for Kids Mode and normal users alike. The report also adds that Apple is working on “redesigned Apple TV+ with a greater focus on content.” What exactly this means is unclear as of now.

In terms of availability, today’s report says to expect the new Apple TV hardware sometime before the end of the year. As with all hardware timelines right now, that could change due to the ongoing COVID-19 pandemic. The physical appearance of the Apple TV is said to be virtually unchanged.

The Verifier has accurately reported certain details about Apple software and hardware in the past — but its timing has sometimes proven to be off. That being said, 9to5Mac found evidence in iOS 13.4 indicating that Apple is developing a new Apple TV box with the A12 or A13 processor. We’ve also found evidence of a new Apple TV remote.

BRINGING 330 PETAFLOPS OF SUPERCOMPUTING TO BEAR ON THE OUTBREAK

IBM, Amazon, Microsoft, and Google are teaming with the White House, the US Department of Energy, and other federal agencies to bring a massive amount of supercomputing power and public cloud resources to scientists, engineers and researchers who are working to address the novel coronavirus global pandemic that is expected to bear down hard on the United States in the coming weeks.

Through the Covid-19 High Performance Computing Consortium announced over the weekend, the companies and organizations are making available more than 330 petaflops of performance across 16 systems, which hold an aggregate of more than 775,000 CPU cores and 34,000 GPUs, to researchers to help them better understand the virus, possible treatments, and potential vaccines and cures. And because the current economic crisis is tied to the pandemic, anything that can be done to resolve the coronavirus outbreak will also slow the cratering of the economy and soften the recession that’s coming, if it’s not already here.

The move to pool all this supercomputing power comes as the coronavirus continues to spread around the globe. Estimates have put the number of confirmed cases around the world at almost 337,000 resulting in more than 14,700 deaths. In the United States, the numbers are just over 39,000 cases and 455 deaths, with the brunt of the pandemic expected to hit over the next several weeks.

“How can supercomputers help us fight this virus? These high-performance computing systems allow researchers to run very large numbers of calculations in epidemiology, bioinformatics, and molecular modeling,” Dario Gil, director of IBM Research, wrote in a blog post. “These experiments would take years to complete if worked by hand, or months if handled on slower, traditional computing platforms. By pooling the supercomputing capacity under a consortium of partners … we can offer extraordinary supercomputing power to scientists, medical researchers and government agencies as they respond to and mitigate this global emergency.”

Included in the consortium are not only the tech companies but the Argonne, Lawrence Livermore, Los Alamos, Sandia and Oak Ridge national laboratories, the Massachusetts Institute of Technology, Rensselaer Polytechnic Institute, the National Science Foundation, and NASA.

LINING UP THE COMPUTE POWER

Supercomputers already have been enlisted in the fight against the virus. Using the massive Summit system at Oak Ridge, scientists this month ran simulations of how 8,000 compounds would interact with the coronavirus and were able to isolate 77 that may be able to stop it from infecting host cells, a crucial step toward finding a vaccine. Summit, first on the Top500 list, is a huge system with more than 2.4 million cores across its Power9 CPUs and Nvidia Volta V100 GPUs, delivering more than 200 petaflops of performance. Researchers also have used the Tianhe-1 supercomputer in China and supercomputers in Germany for everything from diagnoses to research. Summit is included in the systems available to the consortium.
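
That kind of compound screening is, at heart, an embarrassingly parallel search: score each candidate molecule against the viral target independently and keep the best performers. As a purely illustrative sketch (the scoring function, threshold, and compound names below are hypothetical, not Oak Ridge’s actual pipeline), the shape of the job looks like this:

```python
from concurrent.futures import ProcessPoolExecutor

def docking_score(compound: str) -> float:
    """Hypothetical stand-in for a docking simulation that estimates how
    strongly a compound binds a viral protein (lower = tighter binding).
    Real pipelines run tools like AutoDock across thousands of HPC nodes."""
    return (hash(compound) % 1000) / 100.0  # placeholder score, 0.00-9.99

def screen(compounds, threshold=2.0):
    # Each compound is scored independently, which is why the workload
    # spreads so naturally across a supercomputer's cores.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(docking_score, compounds))
    return [c for c, s in zip(compounds, scores) if s <= threshold]

if __name__ == "__main__":
    library = [f"compound_{i}" for i in range(8000)]  # toy 8,000-molecule library
    hits = screen(library)
    print(f"{len(hits)} candidates passed the screen")
```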

The new Covid-19 consortium will bring to bear compute power from more than a dozen systems. Lawrence Livermore is opening up its 23 petaflops Lassen supercomputer (788 compute nodes, Power9 chips and V100 GPUs), Quartz (3.2 petaflops, 3,004 nodes and Intel Xeon E5 “Broadwell” chips), Pascal (900 teraflops, 163 nodes, Xeon E5 Broadwell CPUs and Nvidia Pascal P100 GPUs), Ray (1 petaflops, 54 nodes, Power8 CPUs and Pascal P100 GPUs), Surface (506 teraflops, 158 nodes, Xeon E5 “Sandy Bridge” chips and Nvidia Kepler K40m GPUs) and Syrah (108 teraflops, 316 nodes and Xeon E5 Sandy Bridge chips).

Los Alamos systems are Grizzly (1.8 petaflops, 1,490 nodes and Xeon E5 Broadwell CPUs), Snow (445 teraflops, 368 nodes and Xeon E5 Broadwell CPUs) and Badger (790 teraflops, 660 nodes and Xeon E5 Broadwell chips), while Sandia will make its Solo supercomputer (460 teraflops, 374 nodes and Xeon E5 Broadwell chips) available.

The consortium also will have access to five supercomputers supported by the NSF, starting with Frontera and Stampede 2, both operated by the Texas Advanced Computing Center (TACC). Stampede 2 provides almost 20 petaflops of performance designed for scientific, engineering, research and educational workloads. It uses 4,200 Intel Knights Landing nodes and Xeon “Skylake” chips. Frontera is aimed at simulation workloads, data analytics and emerging applications such as artificial intelligence (AI) and deep learning. It offers a peak performance of 38.7 petaflops and is powered by “Cascade Lake” Xeon SP Platinum chips.

Comet is a 2.76 petaflops supercomputer at the San Diego Supercomputer Center powered by Xeon E5 chips and Nvidia K80 and P100 GPUs. Bridges, operated by the Pittsburgh Supercomputing Center, is a mix of Xeon E5 and E7 chips and Tesla K80, Tesla P100 and Volta V100 GPUs. Jetstream, run at Indiana University’s Pervasive Technology Institute and powered by Xeon E5 “Haswell” chips, uses elements of commercial cloud computing.

NASA is making its high-performance computing (HPC) resources available to researchers, while MIT is offering its Supercloud, a 7 petaflops cluster powered by Intel chips and Volta GPUs, as well as Satori, a 2 petaflops Power9/Volta system oriented toward AI workloads. RPI’s Artificial Intelligence Multiprocessing Optimized System (AiMOS), an 8 petaflops Power9/Volta supercomputer, is being made available to the consortium to explore new AI applications.

Google Cloud, Microsoft Azure and Amazon Web Services (AWS) are making their infrastructure and cloud services available to researchers. Microsoft will provide grants to researchers via its AI for Health program, and the program’s data scientists will be available to collaborate on consortium projects. IBM’s Research WSC 56-node cluster, powered by Power9 chips and V100 GPUs, also will be available. In addition, IBM will help evaluate proposals that come in from researchers.

CARVING UP THE WORK

Consortium members expect a range of projects to be run on the supercomputers, from studies of the molecular structure of Severe Acute Respiratory Syndrome (SARS), another coronavirus that started in China in 2002 and quickly spread to other parts of the globe, to the makeup of Covid-19, how it’s spreading and how to stop it. Such work around bioinformatics, epidemiology, and molecular modeling requires a huge amount of computational capacity, which is what the consortium is offering.

Scientists and medical researchers who are looking to access the consortium’s compute capabilities can submit a two-page description of their proposal on the NSF’s Extreme Science and Engineering Discovery Environment (XSEDE) website. The proposal shouldn’t include proprietary information – the consortium expects teams that get access to resources will not only publish their results but also produce an ongoing blog during the research process.

The proposal should include scientific and technical goals, an estimate of how much compute resources will be needed, whether collaboration or additional support from consortium members will be needed, and a summary of the team’s qualifications and readiness for running the project.

Once a proposal is submitted, it will be reviewed by the consortium’s steering committee on such metrics as potential impact, computational feasibility, resource requirements and timeline. A panel of scientists and computing researchers will then work with the proposing teams to evaluate the public health benefits of the project. Speed is of the essence; an emphasis will be placed on projects that can ensure rapid results, the organization said.

Hamilton is bringing back the original digital wristwatch with an OLED twist

Hamilton is bringing back the original digital watch with the PSR, a 50th anniversary tribute to the company’s legendary Pulsar Time Computer — the first commercially sold digital watch, which was released to massive hype in 1972. (The watch was first announced in 1970, hence the anniversary release this year.)

Displaying the time not with mechanical hands but with an LED display that lit up when a button on the side was pressed, the original Pulsar (and its space-age, stainless steel design) was once viewed as the future of technology. James Bond (as played by Roger Moore) even famously wore one in Live and Let Die.

As Hodinkee’s in-depth history of the rise and fall of digital LED watches explains, however, the boom for Pulsar’s watches (and the inevitable copycats) was relatively short-lived. Cheaper, less power-hungry LCD watches would soon follow, with the added advantage of being able to display the time all the time, instead of just when a button was pressed.

In fact, the reason the new watch is being sold under the Hamilton brand, instead of the original Pulsar one, is that the company no longer has the rights — it sold the name off in 1977 (rival watchmaker Seiko now owns the branding).

The new PSR looks to improve on the original Pulsar in a few ways. In an effort to make the watch a little more useful, the display is now a hybrid LCD and OLED panel that shows the time constantly using the dimmer LCD portion and only lights up the brighter OLED component when the button is pushed. There’s also an antireflective-coated sapphire crystal and a 100-meter water-resistance rating, both of which were absent on the original model.

The Hamilton PSR isn’t cheap, though, especially compared to a standard digital watch — it’ll run you $750 for a stainless steel model or $995 for limited-edition 1970 pieces in PVD gold. Comparatively, the original Pulsar was sold for $2,100 in a solid gold case, making the $750 price tag a (relative) bargain. Although you’d have to take into account that Pulsar also sold cheaper $275 steel-case models later on, which make the price here feel a little hefty.

All in all, the Hamilton PSR is a neat tribute to an iconic wristwatch and a great example of how far display technology has come in such a short time. It’ll be available later in May.

BenQ Unveils SW321C: A 32-Inch Pro Monitor with Wide Color Gamuts & USB-C

BenQ has introduced a new 32-inch professional-grade display designed for photographers and post-production specialists. Dubbed the SW321C, the monitor is for professionals who need wide color spaces like the Adobe RGB and the DCI-P3, as well as HDR transport support. And, like many other contemporary displays, BenQ’s new LCD is equipped with a USB Type-C input.

Under the hood, the BenQ AQColor SW321C uses a 10-bit 32-inch IPS panel featuring a 3840×2160 resolution, a typical brightness of 250 nits, a 1000:1 contrast ratio, a 5 ms GtG response time, a 60 Hz refresh rate, and 178° viewing angles. The monitor uses LED backlighting that is tailored to ensure brightness uniformity across the whole surface of the screen.

The LCD can display 1.07 billion colors and can reproduce 99% of the Adobe RGB, 95% of the DCI-P3, as well as 100% of the sRGB color gamuts, all of which are widely used by professional photographers as well as video editors and animation designers who do post-production work. Meanwhile, the monitor has a 16-bit 3D LUT (look-up table) and is calibrated to DeltaE ≤ 2 to ensure accurate colors and smooth color gradients. The LCD can even display content in different color spaces side-by-side at the same time in PIP/PBP modes.
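
For context, Delta E measures the perceptual distance between the color a display was asked to show and the color it actually produced, and a value at or below 2 is generally considered imperceptible to the eye. Here is a minimal sketch using the classic CIE76 formula (calibration reports often use the newer CIEDE2000 variant, and the sample values below are made up):

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two colors in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Requested color vs. what a (hypothetical) calibration probe measured:
target = (50.0, 20.0, -10.0)    # L*, a*, b*
measured = (50.5, 21.0, -10.8)
print(round(delta_e_cie76(target, measured), 2))  # 1.37 -- within the <= 2 spec
```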

As for HDR support, things aren’t quite as stellar. The monitor supports HDR10 as well as the relatively uncommon HLG transport format. However, the monitor doesn’t have the kind of strong backlighting required for HDR, let alone the full-array local dimming (FALD) setup necessary to deliver anything approaching pro-grade HDR. So the inclusion of HDR support seems to be largely for compatibility and checking HDR content, rather than doing actual content editing in HDR.

As far as connectivity is concerned, the display comes with one DisplayPort 1.4 input, two HDMI 2.0 ports, and a USB Type-C input. The latter can deliver up to 60 W of power to the host, which is enough for most laptops. All the connectors support the HDCP 2.2 technology required for protected content. In addition, the BenQ SW321C monitor has a dual-port USB hub and an SD card reader that is certainly useful for photographers.

Since we are dealing with a professional display, it is naturally equipped with a stand that can adjust height, tilt, and swivel, as well as work in portrait mode. In addition, the SW321C comes with BenQ’s hockey puck controller to quickly adjust settings.

The BenQ AQColor SW321C monitor is currently listed by BenQ Japan, so expect it to hit the market shortly. Exact pricing is unknown, but this is a professional-grade display, so expect it to be priced accordingly.

Doom Eternal speedruns are already clocking in under 90 minutes

I shouldn’t be surprised that people are finishing the fastest shooter on the block even faster than I imagined. Doom Eternal speedruns are already clocking in under two hours, with these runs from BloodThunder and CreeperHntr shaving an extra 30 minutes off. 

Granted, these are primarily exploratory runs for testing out early glitches and strategies. These runners are Doom experts, no doubt, but don’t expect to see an endless reel of supreme Nightmare difficulty FPS skill on display here. Do expect to see Doomguy floating through and above nearly every environment. The thing only released last Friday, after all. This is the mining period, where speedrunners lay out some shaky tracks and do their best to break the intricate, layered work dozens of people spent years building.

Even so, there’s some rad movement on display. The ballista makes for a good Doomguy propellant, sending him gliding over vast expanses and skipping over the ground like a stone on the surface of a placid pond. And since you don’t die from falling into pits, playing around above and behind level geometry makes for some elaborate checkpoint skips. Give ’em a watch below. 

This Gigabyte gaming laptop with a 4K screen is on sale for $1,300

You probably know Gigabyte best for its PC motherboards, but the company also produces tons of other hardware, including gaming laptops. The Gigabyte Aero 15 OLED is a great machine, and now one variant of it is down to $1,299.00 at Newegg.

The Aero 15’s main selling point is the 15.6-inch 4K AMOLED screen, which should give you deeper blacks than the screens on most other gaming laptops. Other specifications include a 6-core/12-thread Intel Core i7-9750H processor, an Nvidia GeForce GTX 1660 Ti graphics card, 8GB RAM, and a 256GB NVMe SSD for storing all your data. There’s also per-key RGB lighting, HDMI 2.0 output, three USB 3.0 ports, one USB 3.1 Gen 2 Type-C connector, and Thunderbolt 3 support.

This is definitely a content-creation laptop first and a gaming laptop second, as indicated by the 4K screen (which maxes out at 60Hz) paired with a GTX 1660 Ti graphics card. You won’t be able to play most games at the native 4K resolution, but since 1080p scales up perfectly to 4K (and the 1660 Ti can definitely handle 1080p gaming), the AMOLED screen will still give you a great experience. The 1660 Ti should be capable of playing less-demanding games at 4K, too. The 8GB of RAM is less than we like these days for intensive tasks, so you might want to upgrade it to 16GB down the road for improved performance.
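
That “scales up perfectly” point is simple arithmetic: 3840×2160 is exactly twice 1920×1080 in each dimension, so every 1080p pixel maps cleanly onto a 2×2 block of 4K pixels with no blurry fractional interpolation. A toy sketch of that integer upscale:

```python
def upscale_2x(frame):
    """Nearest-neighbor 2x upscale: each source pixel becomes a 2x2 block."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # duplicate each column
        out.append(doubled)
        out.append(list(doubled))                       # duplicate the row
    return out

frame_1080p = [[1, 2], [3, 4]]  # tiny stand-in for a 1920x1080 frame
print(upscale_2x(frame_1080p))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```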

Our friends at TechRadar reviewed the higher-end version with an RTX 2070 Max-Q graphics card, and found that the laptop’s “new 4K HDR screen is better than anything we’ve seen as a color-calibrated portable work display or high-resolution on-the-go gaming monitor.”

Latest Nvidia GPU driver is optimized for Half-Life: Alyx and Resident Evil 3

This is also the first driver to introduce initial support for DLSS 2.0 on GeForce RTX cards in a couple of games.

The newest ‘Game Ready’ graphics driver (v445.75) from Nvidia adds day-one support for Half-Life: Alyx, Valve’s exclusive built from the ground up for VR headsets. It’s also optimized for Resident Evil 3, due out in April, and Ghost Recon Breakpoint’s new Ghost Experience.

A driver labeled as Game Ready basically means Nvidia’s software engineers focused on specific games to eke out the most performance possible, while stomping out bugs along the way. The trade-off, of course, is the unknown when it comes to installing graphics drivers right when they come out—some people prefer to wait a day or two (or longer) to see if any unforeseen issues rear their ugly heads.

Interestingly, I’m not seeing any fixed issues in the release notes (PDF) this time around, only a rundown of known issues (pages 8 and 9). Several of them are carryovers from the previous driver release and affect Doom Eternal in specific situations. For example, Nvidia highlights an issue in which the framerate drops in Doom Eternal when using Steam’s in-game fps counter.

As Alan reported earlier today, Nvidia rolled out its second-generation Deep Learning Super Sampling technology, otherwise known as DLSS 2.0. It is supposed to be a much-improved iteration. As part of that, this driver release also brings Game Ready optimizations to Control and MechWarrior 5: Mercenaries with DLSS 2.0 enabled.

You can download the latest driver through GeForce Experience or grab it manually from Nvidia’s website. And if you’re looking for installation tips, check out our guide on how to update drivers.

Google Play GPU driver updates in development for Pixel 4, S10, and Note 10

In addition to unveiling the Snapdragon 865, Qualcomm last December announced that it would update GPU drivers through the Play Store. At the Google for Games Developer Summit today, both companies provided an update on the effort, including how the Pixel 4 will be one of the first devices to see GPU updates.

Google today announced the Android GPU Inspector to make building and optimizing games on the platform easier. The tool provides detailed information, previously unavailable, about a game’s render stages and GPU counters. These insights can be used to improve frame rates and lower power usage for longer battery life:

While working with a game partner using the Android GPU Inspector and a Pixel 4 XL powered by Snapdragon, Google was able to discover an optimization opportunity that saved the game 40% in GPU utilization.

Qualcomm worked with Google on the Inspector given the Adreno GPU family’s wide adoption. In using the tool to make optimizations, developers can suggest driver enhancements directly to the chipmaker. An Adreno Graphics Development Driver will be made available to select devs to allow for fast testing of optimizations.

These improvements to the Adreno GPU driver will be compiled and released to consumers as an update through the Play Store. Much like on PCs, this will bring new features and performance enhancements. The end-user experience will be akin to updating an app, just like with Project Mainline and its modularization of security and other key Android components.

Today’s update reveals how Qualcomm and Google are “currently collaborating” on upgradable GPU drivers for the Snapdragon 855-powered Pixel 4, Samsung Galaxy S10, and Note 10. More devices are “coming later.”

HANDY 2-IN-1 CHUWI HI10 X OFFERS GREAT BANG FOR THE BUCK

CHUWI is definitely one of the big dogs in the Chinese tablet/laptop industry, and the brand has always focused on the price/performance ratio. The company recently brought to market the new CHUWI Hi10 X convertible 2-in-1 tablet, and it more or less embodies that philosophy: it packs quite a punch with rock-solid specs while keeping the pricing acceptable. The full metal unibody chassis, with a sleek 8.8mm thickness and 600g of weight, certainly helps too. How about the rest of the specs?

For starters, the 2-in-1 convertible tablet gets a solid processor in the Gemini Lake Intel Celeron N4100, paired with 6 GB of LPDDR4 RAM and 128 GB of internal storage. The 10.1-inch fully laminated G+FF IPS display keeps its FHD resolution, and with added support for the HiPen 3 stylus and its 1,024 pressure levels it has gotten even better. The tablet also boasts a full metal unibody build, an optional docking keyboard, and a new set of ports, which even includes two USB Type-C ports (one slower 2.0 port for charging and one 3.0 port for data). Battery life is not bad either, with estimates ranging from 6 to 8 hours. Everything, of course, runs on the pre-installed Windows 10 system as usual.

And now we are getting to the best part of the whole thing, because the CHUWI Hi10 X is kept pretty affordable, with base prices on the official Aliexpress, Amazon and Banggood channels at roughly $210. Ideally, though, you will want to throw the keyboard dock and the stylus pen into the mix for a couple of dollars more to get all the perks. Still, the pricing is really good all things considered. You can find more information about the model, including buying links, on the official website.

Intel Rocket Lake leak promises a new CPU core architecture, finally

A leaked slide suggests that Intel’s Rocket Lake is going to be the desktop equivalent of Tiger Lake, just a bit more portly.

Intel Rocket Lake is rumoured to be the next next-gen desktop processor from the chip giant, and a freshly leaked slide looks to be serving up our first concrete info on the new Intel CPUs. Of course we need to get the Intel Comet Lake processor update out of the way first, but the Rocket Lake silicon should be our first taste of a new Core architecture on the desktop since Skylake when it finally arrives. When that might be… well, all hardware launches are subject to change right now.

The slide, leaked by Videocardz reportedly via its sources at Intel, promises “Increased performance with new processor core architecture.” The 10nm Tiger Lake mobile chips launching in laptops (near you) later this year look to be the forerunners of the Rocket Lake desktop parts, and those sport the new Willow Cove Core architecture.

Willow Cove wraps in the extra single-threaded performance boost of the Sunny Cove architecture, but includes a cache redesign, new transistor optimisations, and some much-needed security features. That should give Rocket Lake some of the edge it needs to compete with AMD’s Ryzen chips, which may be in their fifth generation by the time Rocket Lake launches some time in 2021. Tough times.

Alongside the new architecture will be the Xe graphics silicon, but as Rocket Lake S is a desktop part, don’t expect Intel to ship too much GPU grunt inside it. After all, you’re going to be slamming a next-gen AMD RDNA 2 or Nvidia Hopper graphics card into your hypothetical Rocket Lake rig.

The slide also promises PCIe 4.0 support for the touted Intel 500-series motherboard platform, something which Comet Lake’s 400-series boards won’t be offering. Again, a long while after AMD dropped the inaugural PCIe 4.0 platform. The PCIe 4.0 support comes directly from the CPU itself, offering an x16 interface for the GPU and another x4 lanes for an NVMe SSD. That there’s bandwidth, people.
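
To put rough numbers on that: PCIe 4.0 signals at 16 GT/s per lane with 128b/130b encoding, which works out to just under 2 GB/s of usable bandwidth per lane in each direction. A quick back-of-the-envelope check:

```python
GT_PER_SEC = 16e9      # PCIe 4.0 raw signaling rate per lane
ENCODING = 128 / 130   # 128b/130b line encoding overhead
BITS_PER_BYTE = 8

lane_gbps = GT_PER_SEC * ENCODING / BITS_PER_BYTE / 1e9
print(f"per lane: {lane_gbps:.2f} GB/s")       # ~1.97 GB/s
print(f"x16 GPU:  {lane_gbps * 16:.1f} GB/s")  # ~31.5 GB/s
print(f"x4 SSD:   {lane_gbps * 4:.1f} GB/s")   # ~7.9 GB/s
```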

So, will this be the first 10nm desktop processor from Intel, a node akin to AMD’s 7nm chips? There’s no confirmation in the latest slide, but the expectation all along has been that Rocket Lake will be another roll of the 14nm dice, with a huge number of +++++ characters behind it, back-porting the advanced core architecture into an older process node.

Given the performance Intel has squeezed out of the 14nm node up to now, that’s not necessarily a bad thing. The best CPUs for gaming are still Intel… for now. But we’ve got to wade through the Comet Lake CPU sludge first, and who knows when that’s going to happen.