Security experts level criticism at Apple after Big Sur launch issues

Users took to social media to complain about slow systems, with one report pointing to an OCSP responder as the culprit.

Apple announced at its November 2020 event that macOS 11 Big Sur would arrive Nov. 12. 

Image: Apple

Apple was forced to issue a statement Monday on its data collection policies after the release last week of Big Sur led to complaints of slow systems, which morphed into a larger debate about privacy on Macs and iPhones. The statement said the verification process is part of Apple's efforts to protect users from malware.

Apple released macOS Big Sur on Nov. 12, and within hours hundreds of people took to social media to complain about problems they were having with certain applications on their Macs. Security expert Phil Vachon explained what happened on his blog, Security Embedded, writing that the Online Certificate Status Protocol (OCSP) responder macOS uses to check the certificate of each and every application was to blame after an Apple server went down.

Vachon said that in an effort to protect users and customers from malware, Apple uses an OCSP responder so that “at every launch of an app, macOS would dutifully check if the certificate used by the signer is still valid, per the OCSP responder. Of course, if macOS couldn’t reach the OCSP responder, it would go about its merry way launching an app. After all, a computer needs to work offline, too.”
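The fail-open behavior Vachon describes can be illustrated in a few lines. In the sketch below, the responder host is the Apple endpoint widely reported as the culprit; the timeout value and the decision logic are illustrative assumptions, not Apple's actual implementation.

```python
import socket
from typing import Optional

OCSP_HOST = "ocsp.apple.com"   # responder reported as the culprit in the incident
OCSP_PORT = 80                 # OCSP requests are conventionally sent over plain HTTP
TIMEOUT_SECONDS = 5.0          # illustrative value; Apple's real timeout isn't public

def query_responder() -> Optional[bool]:
    """Return True if the certificate is revoked, False if good, None if unreachable."""
    try:
        with socket.create_connection((OCSP_HOST, OCSP_PORT), timeout=TIMEOUT_SECONDS):
            pass  # a real client would POST a DER-encoded OCSP request here
    except OSError:
        return None            # offline or responder down: no answer at all
    return False               # placeholder verdict for this sketch

def launch_allowed(revoked: Optional[bool]) -> bool:
    """Fail-open policy: only a definitive 'revoked' answer blocks the app launch."""
    if revoked is None:        # no answer, so let the app run anyway
        return True
    return not revoked

if __name__ == "__main__":
    print("launch allowed:", launch_allowed(query_responder()))
```

What reportedly made Nov. 12 painful was that the responder was degraded rather than fully unreachable: connections still succeeded, so the offline fallback never triggered and app launches hung waiting on slow responses.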

“If Apple finds that an app they issued a certificate to is actually malware, they can rapidly revoke this certificate and prevent the malware from running, even on machines it has already installed itself on. This does put a lot of policy control in Apple’s hands. This is where you have to make a business decision as to whether or not you trust Apple to be benevolent or not,” Vachon wrote. 

“In the aftermath of the OCSP responder outage, and the dust settling on the macOS Big Sur release, there are a lot of folks reasonably asking if they can trust Apple to be in the loop of deciding what apps should or should not run on their Macs. My argument is—who better than Apple?”

As more security experts began examining the problem, a number of other thorny issues cropped up. 

Some took issue with the idea that Apple felt the need to verify each and every application, while others, like Berlin-based hacker and security researcher Jeffrey Paul, highlighted that for each instance of verification, macOS sends a hash back to Apple “of each and every program you run, when you run it.” 

It also sends your IP address to Apple, which Paul said was concerning given Apple’s reported participation in government surveillance efforts such as the National Security Agency’s notorious PRISM program. Paul noted that while most people trust Apple, some may have concerns about the fact that the company has allowed police forces and militaries to gain access to user data.
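For context, the check in question uses the Online Certificate Status Protocol, and a standard OCSP request as defined in RFC 6960 identifies the certificate being checked by a hash of the issuer's name, a hash of the issuer's public key, and the certificate's serial number. Below is a minimal sketch of constructing such a request with the Python cryptography package; the certificate file names are placeholders for illustration, not anything macOS actually reads.

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.x509 import ocsp

# Placeholder inputs: a signing (leaf) certificate and the CA certificate that issued it.
with open("developer_id_leaf.pem", "rb") as f:
    leaf = x509.load_pem_x509_certificate(f.read())
with open("developer_id_ca.pem", "rb") as f:
    issuer = x509.load_pem_x509_certificate(f.read())

# The request identifies the certificate being checked via hashes of the issuer's
# name and public key plus the leaf certificate's serial number (RFC 6960).
builder = ocsp.OCSPRequestBuilder().add_certificate(leaf, issuer, hashes.SHA1())
request = builder.build()

print(request.public_bytes(serialization.Encoding.DER).hex())
```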

Apple openly acknowledges cooperating with such requests and publishes figures showing how many it has approved; it granted thousands of requests for user data in 2019. Paul said that the data from OCSP requests gives the company a significant amount of information about how you use your device, when you use it, and where.  

“This means that Apple knows when you’re at home. When you’re at work. What apps you open there, and how often. They know when you open Premiere over at a friend’s house on their Wi-Fi, and they know when you open Tor Browser in a hotel on a trip to another city,” Paul wrote. 

“This data amounts to a tremendous trove of data about your life and habits, and allows someone possessing all of it to identify your movement and activity patterns. For some people, this can even pose a physical danger to them.”

Apple has since sought to explain the problem and defend its practices, writing in a release that the process is part of efforts to protect users from malware embedded in applications downloaded outside of the App Store. 

The company explained that macOS checks the Developer ID signature of any Mac apps, plug-ins, and installer packages from outside the App Store, and requires such software to be notarized.
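On a Mac, the assessment described above can be inspected locally with Apple's spctl utility, which asks the system's security policy whether a given bundle would be allowed to run. The sketch below simply shells out to it from Python; the application path is a placeholder.

```python
import subprocess

APP_PATH = "/Applications/SomeApp.app"   # placeholder; point this at a real bundle

# 'spctl --assess' asks the system security policy whether the bundle passes
# Gatekeeper; execution policy is the default assessment type.
result = subprocess.run(
    ["spctl", "--assess", "--verbose", APP_PATH],
    capture_output=True,
    text=True,
)

# The verdict and the source of trust (e.g. a notarized Developer ID signature)
# are printed to stderr; an exit code of 0 means the bundle was accepted.
print("exit code:", result.returncode)
print(result.stderr.strip())
```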

If you have a Mac, you’ve probably seen the message that appears when you open a newly downloaded application, asking whether you’re sure you want to open it. 

“macOS has been designed to keep users and their data safe while respecting their privacy. Gatekeeper performs online checks to verify if an app contains known malware and whether the developer’s signing certificate is revoked,” the Apple statement said. 

“We have never combined data from these checks with information about Apple users or their devices. We do not use data from these checks to learn what individual users are launching or running on their devices. Notarization checks if the app contains known malware using an encrypted connection that is resilient to server failures. These security checks have never included the user’s Apple ID or the identity of their device.” 

The company added that it has stopped logging IP addresses associated with Developer ID certificate checks and will ensure that any IP addresses already collected are removed from logs. 

Over the next year, the company plans additional changes to these security checks, including a new encrypted protocol for Developer ID certificate revocation checks, stronger protections against server failure, and a new preference allowing users to opt out of these protections.

The statement did little to quell the debate over Apple’s data collection policies. Some, like Paul, were not persuaded, while others, like Vachon, said Apple was not doing anything necessarily nefarious.

Vachon noted in his blog post that malware is “hiding in more places than I have creative brain cells to think of, and every executable package I run makes me wonder who I might be handing the keys to the kingdom over to, especially if there’s no way to tie the package back to the original developer who built it.”

“App signing is actually a very good thing, in my opinion. If anything, it gives me the confidence that the package wasn’t tampered with between when the developer built it and when I installed it,” he wrote. He added that it would be nearly impossible for one person to keep an updated list of trustworthy or untrustworthy parties like Apple does. 

“No matter who you are, you will end up outsourcing this to someone—most users [aren’t] capable of running a security program that monitors for malicious apps (the same applies to most corporations, for that matter). While I’m going to sound like an Apple apologist, I think the privacy arguments are far-fetched. Even if we took them to their extreme conclusion and Apple allowed users to disable all the controls they provide, we would cause more harm than good.”
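The tamper-evidence Vachon credits to app signing can also be checked locally with Apple's codesign tool, which re-verifies a bundle's contents against the hashes sealed into its signature. A sketch in the same spirit, again with a placeholder path:

```python
import subprocess

APP_PATH = "/Applications/SomeApp.app"   # placeholder; point this at a real bundle

# '--verify --deep --strict' re-hashes the bundle's contents and checks them
# against the hashes sealed in the developer's signature, so any modification
# made after signing causes a non-zero exit code.
result = subprocess.run(
    ["codesign", "--verify", "--deep", "--strict", "--verbose=2", APP_PATH],
    capture_output=True,
    text=True,
)

status = "signature intact" if result.returncode == 0 else "verification failed"
print(f"{status}: {result.stderr.strip()}")
```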

David Heinemeier Hansson, the famed Danish programmer who created Ruby on Rails, wrote a lengthy thread on Twitter explaining that while Apple was not doing anything nefarious, the company had to understand how this new information looked in light of its size and global power.

“I don’t think Apple is gathering this data because they want to sell it to advertisers (like a Google or Facebook would),” Hansson wrote. “Completely believe that the creators of this system thought they were doing right by users. But that’s the conceit. Apple is late to rendering its actions and intentions through the lens of a two-trillion-dollar conglomerate with a proven record of using its systems and dominance for anti-competitive behavior.” 
