Those of us who experienced TechTV will have fond memories of The Screen Savers show. TWiT brought the series back as The New Screen Savers (TNSS) with many familiar faces from the original series. At the end of the latest episode on Saturday, Leo Laporte dropped the bombshell that the program is being eliminated. While there is never a lack of content for viewing on the Internet in general, we feel that this series shouldn’t have ended up on the chopping block. The format and variety of topics covered made technology accessible to a sizable audience. While there are certainly costs associated with developing and presenting content of this nature, the program was well differentiated and didn’t appear to address an oversaturated market. Hopefully, the “new” program coming in 2019 will integrate portions of what was unique to TNSS.
Mere days ago, Handbrake 1.2.0 was released, bringing significant enhancements to this fantastic, open source transcoding solution. Thanks to the switch of the core decoding library to ffmpeg, Mac users can now leverage the performance improvements offered by VideoToolbox support within macOS. Windows users with Nvidia or AMD GPUs also benefit from this newest version thanks to support for NVENC and AMD VCE hardware-accelerated encoding. Our initial cross-platform runs have been a mixed bag. The macOS iteration was indeed faster in converting a one-hour video from a .ts container to an .mp4 container. However, the third item in the queue faulted and crashed Handbrake. While there may be minor bugs such as this, the bulk of the enhancements provide significant benefits that won’t break workflows. Be sure to head over to Handbrake’s site to download the latest version!
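For those scripting batch conversions, the new hardware encoders are exposed through HandBrakeCLI as well. Here's a minimal sketch of building such an invocation; the encoder names and quality value are assumptions based on the 1.2.0 release notes, so adjust for your platform:

```python
import subprocess

def handbrake_cmd(src, dst, encoder="vt_h264", quality=22):
    """Build a HandBrakeCLI invocation for a hardware-accelerated encode.

    Encoder names assumed from the 1.2.0 release notes:
      vt_h264    - VideoToolbox on macOS
      nvenc_h264 - Nvidia NVENC
      vce_h264   - AMD VCE
    """
    return ["HandBrakeCLI", "-i", src, "-o", dst,
            "-e", encoder, "-q", str(quality)]

# Run the conversion (uncomment once HandBrakeCLI is on your PATH):
# subprocess.run(handbrake_cmd("episode.ts", "episode.mp4"), check=True)
```

Swapping the encoder argument is all it takes to move the same script between a Mac and a Windows box with an Nvidia or AMD GPU.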
Joel Hruska’s article over at ExtremeTech doesn’t account for a number of considerations in AMD’s operating model and contractual obligations with its suppliers. The publicly available data on the Zen 2 architecture, with its chiplet-and-controller paradigm, certainly yields the benefits Hruska notes in a multi-chiplet design such as those used in current-generation Threadripper and EPYC processors. While he does properly identify the candidate for the controller chip, he neglects to recognize that AMD still has a contractual obligation with GlobalFoundries due to the wafer supply agreement.
Furthermore, Economics 101 teaches us that the per-unit cost of manufacturing a “widget” falls with scale; manufacturing more of the “glue” for AMD’s chiplet-based solutions will benefit AMD’s bill of materials. The AdoredTV video, which is certainly worthwhile to view regardless of how many grains of salt the presented information requires, does provide a realistic concept that isn’t far-fetched. Not every central controller chip is going to meet the specs for providing the maximum number of memory channels. Repurposing lesser silicon for the mainstream line will make better use of the output from GlobalFoundries while strengthening the potential performance capabilities of the Ryzen 3000-series product line.
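The scale argument can be made concrete with a toy amortization model. The dollar figures below are purely illustrative, not AMD’s actual costs:

```python
def unit_cost(fixed_cost, variable_cost, volume):
    # Per-unit cost = amortized fixed costs (masks, tooling, design work)
    # plus the variable cost of each die; every additional unit produced
    # shrinks the fixed-cost share baked into each "widget".
    return fixed_cost / volume + variable_cost

# Illustrative only: $50M of fixed costs spread over 1M vs. 10M controller dies
low_volume = unit_cost(50_000_000, 10, 1_000_000)    # 60.0 per unit
high_volume = unit_cost(50_000_000, 10, 10_000_000)  # 15.0 per unit
```

Reusing one controller die across EPYC, Threadripper, and mainstream Ryzen pushes the volume term up and the per-unit cost down, which is exactly the bill-of-materials benefit described above.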
The proposition related to AMD’s willingness to pay penalties for wafers produced at competing fabs may indeed be part of the long game. This would make more sense if and when a sub-12nm design is required for power, performance, or logical specifications for the controller. Incurring this financial penalty prior to the actual need for an advanced manufacturing process would be irresponsible of AMD from a cost optimization perspective.
Turning to the transition from Ryzen 2000-series processors and their associated clock speeds, Hruska’s skepticism about potential base and boost clocks for the proposed product lineup appears to neglect the benefits of TSMC’s 7nm manufacturing process, along with further optimizations of the individual chiplet design that AMD may have achieved. Moving from 14nm in Zen to 12nm in Zen+ provided a 200–400MHz improvement on average. In the chiplet model, each individual “processing” chiplet is theoretically less encumbered because non-compute functionality is offloaded to the dedicated central controller. 7nm process + central controller = more headroom for base and boost clocks.
The only part of Hruska’s article that’s firmly grounded in reality is the recognition of bandwidth starvation for an integrated GPU. We’ve put the Ryzen 5 2400G through its paces and found it to be a very competent performer at 720p, yet we had to turn down settings at 900p and 1080p to achieve playable frame rates. The power draw and cost of HBM2 don’t make that approach viable at the proposed price points. The move to Navi will certainly bring benefits in performance and power consumption, but integrated graphics will most likely remain a solid choice only for 2D and eSports titles. While the speeds and number of Navi compute units for applicable Ryzen 3000-series processors may be slightly lower than what has been shared via the rumor mill, I’d suggest buckling up and preparing to be razzle-dazzled when Dr. Lisa Su speaks at CES!
Our review of the Drobo 5C has been published. If you’re in the market for direct-attached storage and would benefit from a solution that can be expanded as needed, you’ll definitely want to check out our analysis! Hope you enjoy!
In case you haven’t heard by now, Marriott announced that Starwood’s guest database has been breached. The magnitude of records obtained by the perpetrators exceeds the quantities noted in our improperly configured ElasticSearch article yesterday. At five hundred million records, this lapse in security is going to incur a significant financial impact for Marriott. While the company is offering identity monitoring services in some regions, the larger expense will come from fines associated with non-compliance with GDPR. The fact that adversaries were able to retain access to this trove of data for four years raises serious questions related to the following considerations:
- Information Security Policy/Program at Starwood and Marriott: The merger between these two hotel giants was completed in September of 2016. For the two years prior to the merger, the adversary went undetected while conducting their reconnaissance and data exfiltration scheme. The tools, sensors, or policies that govern how Starwood monitored its systems for suspicious or malicious behavior were either not being properly used or did not exist. As part of the integration of the two organizations, it would also appear that Marriott’s tools and sensors were unable to detect this pre-existing compromise. While the reactive response from Marriott does provide a level of acknowledgement and accountability, the damage to the brand may persist for months or years to come.
- Proper vetting of the acquisition target: Prior to the merger, one would expect Marriott to have had vulnerability scans performed against all Starwood assets. That effort may have surfaced some proverbial breadcrumbs and closed some of the holes used as attack vectors to reach this trove of data. The extended delay in detection raises additional questions about Marriott’s vulnerability management and risk management programs. Do they exist? Are they being followed on an established, repeatable cadence? I’m optimistic that the post-mortem on this event will address gaps and deficiencies without blaming one person, as the fine folks at Equifax did.
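To put the GDPR exposure in perspective: Article 83(5) caps fines for the most serious infringements at €20 million or 4% of worldwide annual turnover, whichever is higher. A quick sketch of that ceiling, with the turnover figure below being an illustrative assumption rather than Marriott’s audited revenue:

```python
def max_gdpr_fine(annual_turnover):
    # GDPR Art. 83(5): the greater of a flat 20M or 4% of worldwide
    # annual turnover for the most serious infringements.
    return max(20_000_000, 0.04 * annual_turnover)

# Assuming roughly 20B in annual turnover for a hotel giant of this size:
exposure = max_gdpr_fine(20_000_000_000)  # 800 million
```

Even before regulators weigh in, that ceiling dwarfs the cost of the identity monitoring services currently being offered.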
As we come closer to the end of 2018, the consumer base can only hope that organizations allocate larger budgets to gain resources and solutions which will provide strong and holistic cyber security capability in 2019 and beyond.
There has been an overabundance of articles within the past day highlighting organizations that placed an ElasticSearch database in the wild without a password.
- The fine folks at Urban in the UK offered up 309,000 customer profiles.
- Sky Brasil told Urban to “hold my beer” and upped the ante with data related to 32 million subscribers.
- A to-be-officially-confirmed-as-Data & Leads told Sky Brasil and Urban to “hold my beer” and went all out with a multi-week exposure of data related to 57 million US citizens.
While the security researchers noted in the respective articles have helped prevent further exposure of this data, the lack of any due diligence by the individuals or organizations that provided a smorgasbord of freely available intel for adversaries is incredibly frustrating. When staff don’t take the time to RTFM or to engage with peers who have experience with a given platform, volumes of data get published without basic “Day 1” security principles or controls, and the result will continue to be unfettered access to potentially sensitive data.
In events such as those that have transpired with Dunkin’ Donuts and Dell, considerable efforts were required to facilitate access to the noted systems of these respective companies. These organizations didn’t leave the door wide open as Urban, Sky Brasil, and the to-be-confirmed-as-Data & Leads organizations chose to do with their ElasticSearch solutions.
As we had previously noted, the transformation from an excessively loud hyperconverged solution to a commercial off-the-shelf solution took place a few months back. About a week ago, we had also noted the procurement of a 2018 Mac Mini. Efforts today related to troubleshooting Safari’s inability to access the management interface when HTTPS is enforced have resulted in a game of finger pointing between vendors.
Apple refuses to acknowledge that there is a bug in the way that Safari handles untrusted sites. This may be a more common occurrence than one would realize. Default certificates from untrusted sources may be commonly found on consumer gear. In the event that such a site is accessed by a modern browser, the person behind the keyboard is normally provided with a warning of some variety. Using Chrome as an example, visiting a site with a certificate mismatch will result in the warning dialog pictured below.
Selecting Advanced and proceeding to the site of a device with a mismatched or invalid certificate will normally let you through. Using Chrome, Internet Explorer, Edge, Firefox, and Opera in this scenario provides the ability to log into the QNAP management interface. When Safari performs the exact same process, the results are far different.
Selecting Show Details, clicking “visit this website” in the details section, and entering account credentials to confirm that you legitimately want to change your Certificate Trust settings results in an infinite loop of the same prompts. “Charles” from Apple Support decided that the solution is to “use another browser” instead of fixing an identified bug. One has to chuckle at the circa-2018 support from Apple. While the certificate appeared to have the proper trust settings in Keychain Access, the results of trying to access the management page proved this to be false. Deleting the offending certificate and allowing it to reimport fixed the glitch.
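For anyone stuck in the same loop, the re-trust dance can also be scripted. A sketch, where the device address and PEM path are hypothetical examples (and the `security` command requires sudo):

```python
import ssl

def fetch_device_cert(host, port=443):
    # Pull the device's self-signed certificate as PEM text so it can
    # be re-imported after deleting the offending Keychain entry.
    return ssl.get_server_certificate((host, port))

def trust_cert_cmd(pem_path):
    # macOS `security` invocation that marks the certificate as a
    # trusted root in the System keychain.
    return ["security", "add-trusted-cert", "-d", "-r", "trustRoot",
            "-k", "/Library/Keychains/System.keychain", pem_path]

# Example flow (uncomment on the affected Mac, using your NAS address):
# with open("qnap.pem", "w") as f:
#     f.write(fetch_device_cert("192.168.1.50"))
# then run: sudo security add-trusted-cert ... qnap.pem
```

This sidesteps Safari's looping dialog entirely by establishing trust before the browser ever sees the site.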
Steve Burke over at Gamers Nexus pulled the trigger on a “top of the line” Overpowered DTW3 gaming rig from Walmart. No explanation can do justice to the YouTube summary. There are considerable deficiencies beyond price and build quality when it comes to this solution. The race to the bottom on component selection highlighted by Steve in his teardown demonstrates either a complete lack of technological understanding by the systems integrator Walmart subcontracted to construct a “gaming PC,” or a corporate giant that doesn’t grasp the concept of a value proposition. Beyond the outrageous markup on the identified components, the fact that the hours of operation for the company making these systems don’t align with the 24×7 support model of HP, Dell, and the like will only further impact Walmart’s earnings. The already antiquated component selections will be passed over by the alleged target audience. Overpowered will never be the Supreme of PC manufacturing. There is a wide variety of options available for having someone construct a high-performance computer if one doesn’t have the time, skills, or inclination. Every alternative is better than what Walmart is attempting to foist upon consumers.
We received our BTO Core i7 Mac Mini today after a brief delay due to customs. Upon receipt, we followed the memory upgrade procedure provided by iFixit. A lesson learned in the process: the amount of thumb pressure required on the duct to free the logic board from the enclosure varies, and the BTO model required more force than one of the default configurations. Unfortunately, some blood was shed to facilitate the upgrade, skin being weaker than the duct plastic.
The adventure was certainly worthwhile in light of the non-trivial cost savings versus ordering a memory upgrade directly from Apple. The speed with which we upgraded the memory in the Core i5 model was drastically improved: full teardown and reassembly occurred in under 10 minutes. The most frustrating part (for those with big hands) is reattaching the wire for the antenna plate. Patience and a sprinkle of profanity may be required to complete the effort.
We’ll be finishing the entire setup process as time permits this week. External cable management will be required for the variety of peripherals to be connected, providing the utmost flexibility for our solution. While the screenshot is crude, a preview of a multi-pass run of Blackmagic Disk Speed Test is provided below and mirrors what other reviews have stated. The internal flash is fast!
The October 30th Apple event resulted in significant product updates for neglected platforms. The MacBook Air and Mac Mini have been dramatically renewed and are first-class citizens of the ecosystem once again. We have been using the new entry-level MacBook Air as our daily driver for portable compute and have been astonished by the overall experience. The third generation keyboard alleviates concerns over the potential for an untimely keyboard failure. While a 7W “Core i5” may sound somewhat under-powered in the Windows ecosystem, macOS coaxes rather respectable performance while driving a vastly improved (when compared to the prior MacBook Air) 2560×1600 display using integrated Intel graphics. Apple’s demonstrated commitment to USB-C (and Thunderbolt for macOS platforms) provides an acceptable path toward future expansion and upgrades. We’ll be taking delivery of a BTO Core i7 Mac Mini tomorrow and will be putting it through its paces this week.
The sheer horsepower and capabilities of the revised iPad Pros appear to be held back by their reliance on iOS. While Apple has done well enhancing the multi-window and multitasking capabilities of iOS for large-screen platforms, the discussion and demonstrations provided by Leo Laporte and Jason Snell on Episode 182 of The New Screen Savers invite some astute observations. In an 11 or 12.9″ form factor, the concept of the dock more closely matches the desktop experience. Concerns raised about getting to the desired app as quickly as possible without a more traditional input device are valid. The speed with which app switching occurred highlights how powerful the Apple-designed solutions have become. When Apple finally pulls the trigger on converting from Intel x86 offerings to their A-series solutions, we’d expect it to be introduced in the following waves:
Wave 1 – The MacBook, MacBook Air, and iPad Pro conundrum becomes a thing of the past. A single category of device replaces all three of the aforementioned products. The significant problems of supporting touch and traditional inputs simultaneously may finally drive an innovative solution. Like the new MacBook Air and iPad Pro, Wave 1 will consist of portable devices that offer few customized configurations. You’ll get the current-generation A-series processor with all of its enhancements, multiple flash storage configurations, and the potential for cellular connectivity. The default Apple apps will just work, but some products that are bound to the x86 architecture will require time to be ported, re-written, or replaced by something superior.
Wave 2 – The Wave 1 approach extends to the Mac Mini. Additional maturity in the conversion/translation space will exist by then. The “and one more thing” would involve a new Apple display that contains “storage” for the new Mac Mini in its base. If all expansion can be suitably addressed externally via Thunderbolt or USB-C, the new display could be made paper-thin, since cooling for the internals would no longer be required as on the rear of the iMac. While the iMac may have its sales cannibalized by the tandem of Mac Mini and custom monitor, the impact to Apple’s margins may not be significant: a fully loaded 2018 Mac Mini is $4,199, while a fully loaded 2017 iMac is $5,299, and neither price includes input peripherals. Paired with a matching monitor that conceals the Mac Mini in its base, the potential for less waste in the long run while enabling consumers to cosmetically match all elements may be realized.
Wave 3 – This is when the Mac Pro comes into the fold. Virtualization and software would need to be ready for this type of transition. The largest issue at hand stems from the empty gesture from the fine folks at Intel. In 2017, the licensing fees for Thunderbolt 3 were supposed to disappear in an effort to drive adoption of the standard. Fast forward to today, and we’re in a situation where Thunderbolt 3 on a PC is the functional equivalent of finding Bigfoot or the Loch Ness Monster. The liars over at Gigabyte sold a Threadripper motherboard on the hopes and dreams of a future upgrade for Thunderbolt 3 support. Excluding Intel’s newest NUCs, which contain integrated AMD graphics with high-bandwidth memory, the majority of the cookie-cutter systems on store shelves and the myriad motherboards available to system builders contain no trace of increased Thunderbolt adoption. Without the PC moving in lockstep on this interface, it will continue to remain primarily within the domain of the Mac. How Apple will continue to leverage Thunderbolt on a non-x86 platform is a valid concern that will require significant engineering effort and a sizable dose of courage.