In yesterday's update, we questioned how long it would take for the iTunes and AirPlay 2 features to proliferate to other television platforms. We didn't have to wait long for an answer. Today, LG announced support comparable to Samsung's for its new models, as did Vizio. While Samsung may maintain an edge by not leaving owners of fairly new 2018 sets out in the cold, the proliferation of options at this point leaves only Roku and Sony announcements up in the air. Whether or not the expected uptick in the use of Apple's services by individuals firmly entrenched outside of the macOS/iOS ecosystem materializes, this approach is certainly unprecedented. We expect these developments will make more sense after Apple's WWDC later this year.
Can you feel the excitement in the air?! Over the next week, product announcements and presentations from a myriad of companies will shape the technological landscape for this year. While we're expecting some surprises over the next few days, the initial wave of marketing publications and early releases of information has left us… slightly disappointed. Why is this?
For starters, the next wave of Chromebooks may come equipped with an AMD processor. This is good news, as it provides additional options for consumers. Mass-market Chromebooks have been versatile, easy-to-use products that fulfill many use cases where a traditional laptop isn't necessary. It also provides AMD with another revenue stream. While trade-offs have to be made at some level to reach the sub-$300 price point where most of these units fall, the use of AMD's legacy cores is a cause for concern. When paired with the abysmal 720p screens that HP still foists upon unsuspecting consumers, the end result will most likely be a display that skews toward blue/purple while failing to deliver the battery life that Chromebooks have been known for. Acer's inclusion of a 1080p screen will warrant serious consideration at the $279 MSRP if the CPU can pass muster. If we're lucky, CES 2020 will offer Chromebooks with an Athlon 200GE-level APU at a comparable price point.
Nvidia will be going all-out very shortly. The early reveal pertained to their BFGD (Big Format Gaming Display) finally coming to market: a 65″ screen, 4K resolution, HDR support, and G-SYNC with a 144Hz refresh rate. All for the low price of $4,999. No, that's not a typo. While I'm sure it will be impressive, the less-than-spectacular lifespan of the Dell S2716DG (replaced once under warranty, with the refurbished unit failing about eight months later) may have tainted our view of the longevity and benefit of G-SYNC. When it worked, frame rates were noticeably better, even if the panel was less than satisfactory from a color accuracy perspective. Hopefully, these displays find a market and last long enough to be paired with a future-generation Nvidia GPU that can drive this beast to the point where the high refresh rate can be realized with all graphical options set to their maximums.
Finally, in the “strange bedfellows” department, Apple will be bringing its services and AirPlay 2 to Samsung TVs. A potential admission that Apple TV (4K or otherwise) isn't doing well? A realization that there may be a non-trivial number of customers outside of the walled garden who would like to interface with a large screen sans additional purchases? Whatever it is, it feels questionable. The salient points raised by Nilay Patel over at The Verge highlight the potential for lost sales of Apple's streaming device. This development also makes it less likely that a lower-cost “streaming stick” will be coming out of Cupertino. If Apple is bringing its services to the Tizen platform, what's holding it back from doing the same for webOS (LG), Roku (Sharp, TCL, Hisense, and Roku streaming devices with sufficient horsepower), or the Fire TV platform (Toshiba, Insignia, and the millions of Amazon Prime subscribers with a Fire TV device)?
While we still hate the new Apple TV remote with a passion, the image quality and performance of Apple TV are silky smooth and dependable. Support for the Dolby Vision and HDR10 formats helps provide an optimized picture for compatible sets. Additionally, Apple TV plays nicely with a myriad of network-based ad-blocking solutions, something we can confirm from first-hand experience that the Roku does not. That said, Apple still hasn't fixed the network flapping issue on the 4K box. While enabling these technologies in mass-market products is one of the more consumer-friendly actions taken by Apple, we'd wager that the majority of consumers have already standardized on something that is NOT an Apple product for media consumption. A factory-calibrated set with impeccable fit and finish and a “storage compartment” for an Apple TV device would have been an epic win. We can only conclude that Apple was unable to achieve its desired margins on a Cupertino-developed television. Time will tell how this partnership works out.
The fine folks at Netflix advertised their first foray into an interactive streaming experience, Black Mirror: Bandersnatch, on December 27th. The release of the movie the following day came with an undisclosed caveat: it can't be watched on Apple TV and Chromecast devices. We discovered this deficiency within an hour of the MacRumors article that highlighted the issue.
While this limitation could theoretically be accepted for the third-generation units, which did not possess a dedicated App Store, voice interface, touchpad remote, or support for Bluetooth controllers, experiencing this failure on a fourth-generation Apple TV (or an Apple TV 4K) is certainly unacceptable. Given the 2012 release date of the third-generation hardware, the limitations of a single-core A5 CPU, and the absence of an available app ecosystem, Netflix would get a pass there. However, it's pure laziness or a lack of proper preparation on Netflix's part when the newer fourth-generation platform shares commonality with iOS and contains ample performance capability.
Before cancelling the Netflix account, we tested the movie on a 2018 iPad. The concept has some novel branches but ends up being disruptive to the experience. Some choices have minor continuity implications for later scenes in the movie, while other branches don't appear to be fully fleshed out. It's not a bad experiment, except that Netflix failed to recognize that it automatically excluded a portion of its viewing audience, a portion whose size should be evident from sales data. Engagement with customer service was lackluster, to say the least. Any acknowledgement of, or disclaimer explaining, why this movie doesn't “work” with Apple TV or Chromecast was missing entirely from the chat session we established.
While there is a possibility that Netflix will modify the tvOS app to support this type of content (and future implementations, if this is a direction they're going), there have been too many poor decisions without a viable explanation in 2018. The overabundance of “everyone gets a comedy special,” paired with the abrupt cancellation of the various Marvel series (Iron Fist, Luke Cage, Daredevil, and, theoretically, Punisher and Jessica Jones once they air their next seasons), demonstrates an inability to maintain franchises consistently. The half-completed efforts for Arrested Development, shorter seasons of marquee shows, and the impending loss of a non-trivial portion of Netflix's content library create an opening for alternative services to take a slice of market share for good.
To quote any of the “sharks” from Shark Tank when they decline to invest in a product: “We're out.”
Those of us who experienced TechTV will have fond memories of The Screen Savers. TWiT brought the series back as The New Screen Savers (TNSS) with many familiar faces from the original. At the end of the latest episode on Saturday, Leo Laporte dropped the bombshell that the program is being eliminated. While there is never a lack of content for viewing on the Internet in general, we feel this series shouldn't have ended up on the chopping block. The format and variety of topics covered made technology accessible to a sizable audience. While there are certainly costs associated with developing and presenting content of this nature, the program was well differentiated and didn't appear to address an oversaturated market. Hopefully, the “new” program coming in 2019 will integrate portions of what was unique to TNSS.
Mere days ago, HandBrake 1.2.0 was released, bringing significant enhancements to this fantastic, open-source transcoding solution. Thanks to the change in core decoding library to FFmpeg, which provides much of the boost on supported hardware, Mac users can now leverage the performance improvements offered by VideoToolbox support within macOS. Windows users with a non-Intel GPU also benefit from this newest version thanks to NVENC and AMD VCE acceleration for Nvidia and AMD GPUs, respectively. Our initial cross-platform runs have been a mixed bag. The macOS iteration was indeed faster in converting a one-hour video from a .ts container to an .mp4 container. However, the third item in the queue faulted and crashed HandBrake. While there may be some minor bugs such as this, the bulk of the enhancements provide significant benefits that won't break workflows. Be sure to head over to HandBrake's site to download the latest version!
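For the command-line inclined, the new hardware encoders are exposed through HandBrakeCLI as well. A minimal sketch, assuming HandBrakeCLI 1.2.0 is on your PATH; the file names are placeholders, and encoder availability depends on your build and hardware:

```shell
# Convert a .ts capture to an .mp4 container using a hardware encoder.
# vt_h264 selects the VideoToolbox encoder on macOS; on Windows,
# nvenc_h264 (Nvidia) or vce_h264 (AMD VCE) should be available,
# hardware permitting. recording.ts / recording.mp4 are placeholders.
HandBrakeCLI -i recording.ts -o recording.mp4 -e vt_h264 -q 22 --all-audio
```

Software x264 remains the fallback (-e x264) if the hardware encoder isn't present in your build.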
Joel Hruska’s article over at ExtremeTech doesn’t account for a number of considerations in AMD’s operating model and its contractual obligations with suppliers. The publicly available data on the Zen 2 architecture, with its chiplet-and-controller paradigm, certainly yields the benefits Hruska notes in a multi-chiplet design such as what is common for current-generation Threadripper and EPYC processors. While he does properly identify the candidate for the controller chip, he neglects to recognize that AMD still has a contractual obligation with GlobalFoundries due to the wafer supply agreement.
Furthermore, Economics 101 teaches us that the per-unit cost of manufacturing a “widget” falls with scale; manufacturing more of the “glue” for AMD’s chiplet-based solutions will benefit AMD’s bill of materials. The AdoredTV video, which is certainly worthwhile to view regardless of how many grains of salt the presented information requires, provides a realistic concept that isn’t far-fetched. Not every central controller chip is going to meet the specs for providing the maximum number of memory channels. Repurposing lesser silicon for the mainstream line will make better use of the output from GlobalFoundries while strengthening the potential performance of the Ryzen 3000-series product line.
The proposition related to AMD’s willingness to pay penalties for wafers produced at competing fabs may indeed be part of the long game. This would make more sense if and when a sub-12nm design is required for power, performance, or logical specifications for the controller. Incurring this financial penalty prior to the actual need for an advanced manufacturing process would be irresponsible of AMD from a cost optimization perspective.
Moving back to clock speeds and the transition from Ryzen 2000-series processors, Hruska’s skepticism about potential base and boost clocks for the proposed lineup appears to neglect the benefits of TSMC’s 7nm manufacturing process, along with further optimizations of the individual chiplet design that AMD may have achieved. Moving from 14nm in Zen to 12nm in Zen+ provided a 200-400MHz improvement on average. In the chiplet model, each individual “processing” chiplet is theoretically less encumbered, with core functionality offloaded to the dedicated central controller. A 7nm process plus a central controller equals more headroom for base and boost clocks.
The only part of Hruska’s article that’s firmly grounded in reality is the recognition of bandwidth starvation for an integrated GPU. We’ve put the Ryzen 5 2400G through its paces and found it to be a very competent performer at 720p, yet we had to start turning down settings at 900p and 1080p to achieve playable frame rates. The power draw and cost of HBM2 don’t make that approach viable at the proposed price points. The move to Navi will certainly bring benefits in performance and power consumption, but integrated graphics will most likely remain a solid choice only for 2D and eSports titles. While the clock speeds and number of Navi compute units for applicable Ryzen 3000-series processors may be slightly lower than what has been shared via the rumor mill, I’d suggest buckling up and preparing to be razzle-dazzled when Dr. Lisa Su speaks at CES!
Our review of the Drobo 5C has been published. If you’re in the market for direct-attached storage and would benefit from a solution that can be expanded as needed, you’ll definitely want to check out our analysis! Hope you enjoy!
In case you haven’t heard by now, Marriott announced that Starwood’s guest database has been breached. The number of records obtained by the perpetrators exceeds the quantities noted in yesterday’s article on improperly configured ElasticSearch databases. At five hundred million records, this lapse in security is going to have a significant financial impact on Marriott. While the company is offering identity-monitoring services in some regions, the larger expense will come from fines associated with non-compliance with GDPR. The fact that adversaries were able to retain access to this trove of data for four years raises serious questions related to the following considerations:
- Information security policy/program at Starwood and Marriott: The merger between these two hotel giants was completed in September of 2016. For the two years prior, the adversary went undetected while conducting their reconnaissance and data exfiltration scheme. The tools, sensors, or policies that govern how Starwood monitors its systems for suspicious or malicious behavior were either not being properly used or did not exist. As part of the integration of the two organizations, it would also appear that Marriott’s tools and sensors were unable to detect this pre-existing compromise. While Marriott’s response does provide a level of acknowledgement and accountability, the damage to the brand may persist for months or years to come.
- Proper vetting of the acquisition target: Prior to the merger, Marriott ideally would have had vulnerability scans performed against all Starwood assets. That effort might have surfaced some proverbial breadcrumbs, allowing some of the holes used as attack vectors to reach this trove of data to be closed. The extended delay in detection raises additional questions about Marriott’s vulnerability management and risk management programs. Do they exist? Are they being followed on an established and repeated cadence? I’m optimistic that the post-mortem on this event will address gaps and deficiencies without blaming a single person, as the fine folks at Equifax did.
As we approach the end of 2018, the consumer base can only hope that organizations allocate larger budgets for resources and solutions that will provide strong, holistic cybersecurity capabilities in 2019 and beyond.
There has been an overabundance of articles within the past day highlighting organizations that placed an ElasticSearch database in the wild without a password.
- The fine folks at Urban in the UK offered up 309,000 customer profiles.
- Sky Brasil told Urban to “hold my beer” and upped the ante with data related to 32 million subscribers.
- A to-be-officially-confirmed-as-Data & Leads told Sky Brasil and Urban to “hold my beer” and went all out with a multi-week exposure of data related to 57 million US citizens.
While the security researchers noted in the respective articles have helped prevent further exposure of this data, the lack of any due diligence by the individuals or organizations that served up a smorgasbord of freely available intel for adversaries is incredibly frustrating. When staff don’t take the time to RTFM, or to engage with peers who have experience with a given platform, volumes of data get published without basic “Day 1” security principles or controls, and the result will continue to be unfettered access to potentially sensitive data.
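As a sketch of how trivial the adversary’s job is here, assuming a hypothetical exposed node at es.example.com: an unauthenticated ElasticSearch instance happily answers REST queries from anyone who can reach port 9200.

```shell
# If this returns cluster metadata (name, version) without credentials,
# the node is wide open; a secured node would answer 401/403.
curl -s http://es.example.com:9200/

# Listing indices without auth confirms the stored data is readable too.
curl -s "http://es.example.com:9200/_cat/indices?v"

# The "Day 1" fix: bind the node to localhost in elasticsearch.yml
# (network.host: 127.0.0.1) or front it with an authenticating proxy.
```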
In events such as those that transpired at Dunkin’ Donuts and Dell, considerable effort was required to gain access to the noted systems of those companies. Those organizations didn’t leave the door wide open, as Urban, Sky Brasil, and the to-be-confirmed Data & Leads chose to do with their ElasticSearch deployments.
As we had previously noted, the transformation from an excessively loud hyper-converged solution to a commercial off-the-shelf solution took place a few months back. About a week ago, we also noted the procurement of a 2018 Mac mini. Today’s efforts to troubleshoot Safari’s inability to access the management interface when HTTPS is being enforced have resulted in a game of finger-pointing between vendors.
Apple refuses to acknowledge that there is a bug in the way Safari handles untrusted sites. This may be a more common occurrence than one would realize: default certificates from untrusted sources are commonly found on consumer gear. When such a site is accessed by a modern browser, the person behind the keyboard is normally presented with a warning of some variety. Using Chrome as an example, visiting a site with a certificate mismatch will result in the warning dialog pictured below.
Selecting Advanced and proceeding to the site of a device with a mismatched or invalid certificate will normally let you through. Using Chrome, Internet Explorer, Edge, Firefox, and Opera in this scenario provides the ability to log into the QNAP management interface. When Safari performs the exact same process, the results are far different.
Selecting Show Details, clicking “visit this website” in the details section, and entering account credentials to confirm that you legitimately want to change your Certificate Trust settings results in an infinite loop of the same prompts. “Charles” from Apple Support decided that the solution is to “use another browser” instead of fixing an identified bug. One has to chuckle at circa-2018 support from Apple. While the certificate appeared to have the proper trust settings in Keychain Access, the results of trying to access the management page proved this to be false. Deleting the offending certificate and allowing it to re-import fixed the glitch.
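For anyone stuck in the same loop, the cleanup can also be done from Terminal with macOS’s security tool; “QNAP” here is a placeholder for whatever common name your device’s self-signed certificate carries, and the hash placeholder is filled in from the first command’s output.

```shell
# Find the offending certificate by common name and print its SHA-1 hash.
# "QNAP" is a placeholder common name; adjust for your device.
security find-certificate -c "QNAP" -Z ~/Library/Keychains/login.keychain-db

# Delete it by that hash, then revisit the management page in Safari so
# the certificate is re-imported and trust can be granted cleanly.
security delete-certificate -Z <sha1-hash-from-above> ~/Library/Keychains/login.keychain-db
```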