Jekyll2023-10-05T15:43:40+00:00https://sneak.berlin/feed.xmlJeffrey PaulThe personal website of Jeffrey Paul. Apple OSes Are Insecure By Design To Aid Surveillance2023-10-05T00:00:00+00:002023-10-05T15:43:29+00:00https://sneak.berlin/20231005/apple-operating-system-surveillance<p>I have a theory that I believe is supported by enough evidence for you to believe it, as well.</p> <p>Mind you, this is <em>definitionally</em> a conspiracy theory; please don’t let the connotations of that phrase bias you, but please feel free to read this (and everything else on the internet) as critically as you wish.</p> <p>I believe that Apple is preserving unencrypted server connections in their operating systems in an effort to enable global location tracking of their userbase by passive monitoring of major internet backbones.</p> <p>This is supported by timelines and context, which will be provided.</p> <p>Several important connections (TSS, OCSP) are made from Apple devices in plaintext (that is, completely unencrypted). This began for historical reasons, but has been repeatedly reported to Apple. They have not fixed it.</p> <p>TSS checks happen on update. OCSP checks happen, among other times, on app launch.</p> <p>A few major versions ago (i.e. ~3 years), <a href="/20201112/your-computer-isnt-yours/">when I made a stink about it</a>, Apple committed in writing to providing, within one year, a preference setting for disabling online OCSP checks in macOS. Not only did this not happen within a year (a rare instance of Apple actually outright lying), but someone was kind enough to write me and tell me that Apple has <a href="https://support.apple.com/en-us/HT202491">edited the webpage to remove this promise</a>. 
Presumably there are no plans to offer users the ability to disable OCSP checking, which leaks which apps are being launched on your system, when you launch them.</p> <p>Not only can you not disable these checks, they’re still not happening over encrypted (<code class="language-plaintext highlighter-rouge">https:</code>) connections. This is straightforward to fix, but it hasn’t happened.</p> <p>Apple’s webpage says:</p> <blockquote> <p>We have never combined data from these checks with information about Apple users or their devices.</p> </blockquote> <blockquote> <p>We do not use data from these checks to learn what individual users are using on their devices.</p> </blockquote> <blockquote> <p>These security checks have never included the user’s Apple ID or the identity of their device. To further protect privacy, we don’t log IP addresses associated with Developer ID certificate checks, and we make sure that any collected IP addresses are removed from logs.</p> </blockquote> <p>The problem is that the connections are still unencrypted! Anyone in the world who can watch the internet traffic to or from your computer, or to and from Apple, or between your computer and Apple (this is a <em>lot</em> of people), can make their own logs, with all of the IP addresses and all of the launched apps. If you use enough apps, the specific constellation of <em>your</em> apps is probably pretty close to uniquely identifying you.</p> <p>The OCSP checks (“Gatekeeper”, in Apple’s terminology) are not the big deal, however.</p> <p>When you update a modern mac, it needs something called a boot “ticket” for the new OS. 
This ticket is cryptographically signed by Apple, and is unique to your specific <code class="language-plaintext highlighter-rouge">Mx</code> Apple Silicon CPU/SoC (or your specific T1/T2 security chip, if you are using an Intel mac).</p> <p>The request to Apple for this boot ticket is via an API (called TSS), and includes specific unique identifying serial numbers of your computer (such as your chip’s ECID) that never, ever change. It’s done on every major OS update, and, you guessed it, <em>it’s done totally unencrypted</em>. Anyone watching the backbone traffic on the internet will be able to pair ECIDs with client IP addresses on every major macOS update. (Client IP addresses identify city-level location. With the information available to DHS/FBI from the carriers and cable companies, they identify a specific building and subscriber name.)</p> <p><img src="/s/img/202310/2023-10-05-plaintext-tss.jpg" class="img-rounded img-responsive" /></p> <p><small>This is a pcap taken today of the OS updater included in 13.4, released in May of this year, transmitting my ECID in plaintext.</small></p> <p>I <a href="/20220409/apple-is-still-tracking-you-without-consent/">screamed loudly about this in April 2022</a> and ended my post with “Continued transmission of plaintext identifiers will be assumed as malicious intent. Fix it.”</p> <p>It was not fixed. Not in the next release, not in the next major version.</p> <p>It’s <em>still</em> not fixed in Ventura 13.x, at least as of 13.4 (May 2023). Apple has been leaking this customer PII across the internet unencrypted for <em>seven years</em>, ever since the introduction of the first T1 chip in the original Touch Bar MBP.</p> <p><img src="/s/img/202310/2023-10-05-apple-analytics-nonconsensual.jpg" class="img-rounded img-responsive" /></p> <p><small>First, during the update, Apple’s updater has to send your activity data back to the mothership even when analytics transmission is explicitly disabled. 
(This is why <code class="language-plaintext highlighter-rouge">xp.apple.com</code> is in so many hosts file privacy blocklists.)</small></p> <p><img src="/s/img/202310/2023-10-05-ls-tss-plaintext.jpg" class="img-rounded img-responsive" /></p> <p><small>Then the updater can get to the important work of leaking your plaintext ECID across the whole internet.</small></p> <p>I would be inclined to give Apple the benefit of the doubt here, if not for two very important aggravating factors:</p> <ol> <li> <p>Apple does not allow plaintext server communications in apps released by developers in the App Store. This is explicitly against the rules, and they have tools available for app developers to use that make 100% encrypted connections the unavoidable status quo (<a href="https://developer.apple.com/documentation/bundleresources/information_property_list/nsapptransportsecurity/">App Transport Security</a>). But for some reason, for seven years and counting, they haven’t mandated this for their own OS updates.</p> </li> <li> <p><a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT">Apple has a documented history of preserving cryptographic backdoors to aid US government surveillance.</a></p> </li> </ol> <p>You might argue that the latter is now invalid, given that there is now an option to enable end-to-end encryption (on this page, hereinafter referred to as E2EE, but called “Advanced Data Protection” by Apple) for iCloud and thus iCloud Backup. That’s also not valid, and I’ll explain why.</p> <p>First, iCloud E2EE is opt-in. The setting is buried, and there are no prompts to enable it, so approximately 0% of iCloud users have turned it on. It might as well not exist.</p> <p>Second, iCloud E2EE is woefully incomplete. 
When you iMessage with someone, they have iCloud Backup on by default, and <a href="https://support.apple.com/en-us/HT202303">non-E2EE by default</a>, which means that approximately <em>all</em> of your iMessages (including all image and document attachments) will <em>still</em> be readable by Apple and the FBI because they are backed up <em>twice</em>: once from each end of the conversation. Unless you <em>and ALSO everyone you iMessage with</em> have enabled E2EE (which, today, is never, ever true), your iMessages are subject to surveillance by Apple and the whole of the US government, which can force them to turn them over.</p> <p>Furthermore, the E2EE for iCloud Photos is not designed to preserve privacy. Even though iCloud Photos now supports E2EE for the content of the photos and videos stored, <a href="https://support.apple.com/en-us/HT202303">the file metadata is not E2EE, and the metadata includes the FILENAME and also a unique hash of the <em>unencrypted</em> file content</a>. This means that if you make a first-of-its-kind Winnie the Pooh meme and save it to your camera roll (hooked up to an E2EE-enabled iCloud Photos account), then send it via secure means (Signal, or in-person AirDrop, or whatever) to <em>another person</em> who has iCloud Photos enabled (also with E2EE) and they save it, Apple can see that you both have the same file, the only two people in the world with it.</p> <p>They can see who had it first. They can see who had it next, and when. They can see where it went after that. This is with full “E2EE” turned on, and without even using Apple messaging apps. This leaks your social graph, too, and it provides a nice list of dissidents who all have the same file.</p> <p>This applies to all files in iCloud Drive, too, not just your photos. Share a file securely within a group of people who all have E2EE enabled for their iCloud Drive? 
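</p>

<p>Here is a toy model of that linkage (the users and file bytes below are fabricated; the point is only that a hash of the <em>unencrypted</em> content is stored server-side even when the content itself is end-to-end encrypted):</p>

```python
# Toy model: server-side metadata under "E2EE" that still includes
# a hash of the plaintext file content. All data is fabricated.
import hashlib

def upload_metadata(user, filename, plaintext):
    """What the server learns per file: account, name, plaintext content hash."""
    return (user, filename, hashlib.sha256(plaintext).hexdigest())

meme = b"first-of-its-kind meme bytes"

server_sees = [
    upload_metadata("alice", "meme.heic", meme),    # saved by the creator
    upload_metadata("bob", "received.heic", meme),  # sent via Signal, then saved
]

# The server never saw the content, but it can still link the accounts:
matching = {content_hash for _, _, content_hash in server_sees}
print(len(matching))  # both uploads collapse to one content hash: 1
```

<p>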
Guess what, Apple now knows they’re a group, and by extension the US and Chinese governments can know it, too.</p> <p>If you enabled E2EE for iCloud Drive, would you expect that Apple sysadmins can read all of your filenames? How about the FBI without a warrant?</p> <h2 id="privacy-thats-apple">“Privacy, that’s Apple.”</h2> <p>Any repressive government can now go to Apple with an image file or document (or its hash) and demand a list of every single phone number, payment card number, full name, and last known device location of everyone in iCloud who has a copy of the file, even if all of those people have opted in to Apple’s false-sense-of-security E2EE system.</p> <p>Note that this was always possible before. But it’s still possible, even under the current E2EE system. So far, it’s a farce.</p> <h2 id="why-bother">Why Bother?</h2> <p>What exactly is the point of rolling out this E2EE feature if:</p> <ul> <li>you’re not going to prompt to migrate people to it</li> <li>you still enable authoritarian repressive surveillance of your users even when it’s turned on, because its design sucks</li> <li>you still leak the social graph of the users based on the path and time of spread of unique files</li> </ul> <p>Apple <em>has</em> designed and operated truly private systems before. iCloud Keychain and Health are two bits that have been E2EE and are, as far as I can tell, immune to surveillance orders or corrupt governments (such as those in Apple’s two biggest markets).</p> <p>(Please don’t write me about how the iCloud plaintext content hashes are to support deduplication. Apple already did in <a href="https://support.apple.com/en-us/HT202303">HT202303</a>. That’s not the point, and it’s irrelevant. Also please don’t write me about how E2EE is bad for most users because users will lose all their keys and then lose all their photos and be sad. It’s called <a href="/20181022/sneaks-law/">sneak’s law</a> and I wrote it. 
It’s irrelevant to the topic today.)</p> <p>Apple says in HT202303 that they are “committed to ensuring more data, including this kind of metadata, is end-to-end encrypted when Advanced Data Protection is enabled”. Maybe if they hadn’t dragged their feet for the FBI years ago, privacy would actually be baked into this product by now.</p> <p>Note well: You can’t use HomePods, Apple Pay, Handoff, or Passkeys without opting in to iCloud; each year choosing not to use iCloud comes with a larger list of disabled functionality on your device. I imagine the Apple Vision will be similarly hobbled without sending Apple and the FBI a constant realtime stream of your life’s activity in the form of iCloud metadata. A ton of the functionality of Apple devices is missing if you don’t submit to the ability to be surveilled.</p> <h2 id="wrapping-up">Wrapping Up</h2> <p>iCloud is a privacy nightmare. iMessage is <em>not</em> end-to-end encrypted due to <em>both</em> endpoints escrowing their secret keys to Apple by default in the non-E2EE iCloud Backup. Even if you turn on E2EE on one end, it’s ineffective because the other end still has it off.</p> <p>On every macOS update, you transmit plaintext TSS API requests which include your unchanging ECID, alerting everyone on the internet backbones to your ECID-to-IP mapping, allowing your movement to be tracked over time.</p> <p>On many app first launches (not every launch), you transmit plaintext identifiers leaking which developer’s app you have launched and when (and client IP, again). 
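</p>

<p>To make the fingerprinting risk concrete, here is a toy sketch of how a passive observer could turn logged plaintext checks into a per-user “app constellation” (the developer IDs and IP addresses below are made up for illustration; real OCSP checks identify the developer certificate, not these strings):</p>

```python
# Toy model of a passive observer correlating plaintext OCSP-style
# checks into per-client "app constellations". All identifiers here
# are fabricated for illustration.
from collections import defaultdict

# (client_ip, developer_id) pairs as an eavesdropper might log them
observed = [
    ("203.0.113.5", "DEV_AAAA"),
    ("203.0.113.5", "DEV_BBBB"),
    ("203.0.113.5", "DEV_CCCC"),
    ("198.51.100.7", "DEV_AAAA"),
]

constellations = defaultdict(set)
for ip, dev in observed:
    constellations[ip].add(dev)

# The set of developers seen from one IP acts as a fingerprint:
fingerprint = frozenset(constellations["203.0.113.5"])
print(sorted(fingerprint))  # ['DEV_AAAA', 'DEV_BBBB', 'DEV_CCCC']

# Later, the same constellation seen from a new IP re-identifies
# the user despite the address change:
new_observation = {"DEV_AAAA", "DEV_BBBB", "DEV_CCCC"}
print(new_observation == set(fingerprint))  # True
```

<p>The more apps you run, the more unique that set becomes, and it follows you across networks. 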
(Note that most developers only publish a single app, so this aliases to which app you have launched.)</p> <p>Furthermore, Apple knows all of these things, and has opted to do nothing meaningful to mitigate the threats.</p> <p>I think that macOS has too many <em>plaintext</em> network privacy leaks, for far too long (in the context of everything else I’ve enumerated here) for this to be carelessness, coincidence, or deprioritization.</p> <h2 id="finally">Finally</h2> <p>Please take the time to go to Berlin, and <a href="https://www.stasimuseum.de/en/enindex.htm">visit the Stasi museum</a>, if you think my <a href="https://en.wikipedia.org/wiki/COINTELPRO">warnings about the potential for abuse by the FBI</a> when they are not constrained by the need for probable cause or search warrants are overblown.</p> <p>This is the same FBI that <a href="https://en.wikipedia.org/wiki/FBI%E2%80%93King_suicide_letter">wrote Martin Luther King Jr. anonymous letters telling him to kill himself</a>.</p> <p>Presently, the US government is accessing the full content of around seventy thousand(<em>!</em>) Apple accounts per year (as of 2022) <em>without a search warrant</em>, per <a href="https://www.apple.com/legal/transparency/">Apple’s own transparency report</a>. The numbers are much higher when you include warrants issued with probable cause.</p> <p>My concern is not so much that Apple is the threat, as Apple is not in the business of oppression or <a href="https://en.wikipedia.org/wiki/Julian_Assange">imprisoning journalists</a>, but the governments that can meaningfully compel Apple: the United States, and the People’s Republic of China. (Google could pull out of China and lose their customers; however approximately 100% of everything Apple sells is made in China, by Chinese people, in factories subject exclusively to Chinese law. China has more control over Apple presently than the USA does.) 
Anything Apple can know, the CCP or USG can know.</p> <p>A society that is under constant and total suspicionless police surveillance is a society that is, given enough time, actually doomed.</p> <p>Apple is complicit in building this society currently, and it poses an existential threat to freedom worldwide if it is not rectified.</p> <p><small> Finally, if you still don’t believe that Apple would be game to play ball in such a manner, <a href="https://tidbits.com/2020/08/17/the-case-of-the-top-secret-ipod/">read this</a>. There isn’t much wiggle room for large corporations to refuse the direct demands of the military in the USA; there’s simply too much asymmetry in terms of organizational goals. Corporations are ultimately extremely fragile and far too vulnerable to the state’s will.</small></p> <h2 id="footnote">Footnote</h2> <p>I’m experimenting with blogging more quickly and with less time spent editing. Please provide feedback on this or any other post <a href="https://bbs.sneak.berlin">on the BBS</a> or via email.</p>I have a theory that I believe is supported by enough evidence for you to believe it, as well.Apple Has Begun Scanning Your Local Image Files Without Consent2023-01-15T00:00:00+00:002023-01-15T18:27:55+00:00https://sneak.berlin/20230115/macos-scans-your-local-files-now<p>Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I use macOS software on Apple hardware.</p> <p>Today, I was browsing some local images in a subfolder of my <code class="language-plaintext highlighter-rouge">Documents</code> folder, some HEIC files taken with an iPhone and copied to the Mac using the Image Capture program (used for dumping photos from an iOS device attached with a USB cable).</p> <p>I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. 
I have all network access denied for a lot of Apple OS-level apps because I’m not interested in transmitting any of my data whatsoever to Apple over the network - mostly because Apple turns over data on more than 30,000 customers per year to US federal police <em>without any search warrant</em> per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.</p> <p>Imagine my surprise when, while browsing these images in the Finder, Little Snitch told me that macOS is now connecting to Apple APIs via a program named <code class="language-plaintext highlighter-rouge">mediaanalysisd</code> (Media Analysis Daemon - a background process for analyzing media files).</p> <p><img src="/s/img/202301/2023-01-15.mediaanalysisd.jpg" /></p> <p>It’s very important to contextualize this. In 2021 Apple announced their plan to begin clientside scanning of media files, on device, to detect child pornography (“CSAM”, the term of art used to describe such images), so that devices that end users have paid for can be used to provide police surveillance in direct opposition to the wishes of the owner of the device. CP being, of course, one of the classic <a href="https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalypse">Four Horsemen of the Infocalypse</a> trotted out by those engaged in misguided attempts to justify the unjustifiable: violations of our human rights.</p> <p>Apple has repeatedly declared in their marketing materials that “privacy is a human right”, yet they offered no explanation whatsoever as to why those of us who do not traffic in child pornography might wish to have such privacy-violating software running on our devices. 
It was widely speculated at the time that they were doing this to get the FBI off their back so that they could roll out clientside end-to-end encryption for iCloud, something they were not at the time doing (which provided and preserved a <a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT">backdoor in iMessage privacy, specifically for the FBI</a>).</p> <p>There was a large public backlash. Apple likely expected this.</p> <p>Some weeks later, in an <em>apparent</em> (but not really) capitulation, Apple published the following statement:</p> <blockquote> <p>Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.</p> </blockquote> <p>The media <a href="https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/">erroneously reported this as Apple reversing course</a>.</p> <p>Read the statement carefully again, and recognize that <em>at no point</em> did Apple say they reversed course or do not intend to proceed with privacy-violating scanning features. As a point of fact, Apple said they <em>still intend</em> to release the features and that they consider them “critically important”.</p> <p>Apple is very good at writing technically truthful things that say one thing that cause reporters to report a different thing (which is not factual). This becomes an “everybody knows” sort of thing where the narrative that is widely believed and accepted by the public is <em>not</em> what Apple actually said. 
Apple PR exploits poor reading comprehension ability, while maintaining some sort of imagined moral integrity because they never made any factually false statements during <a href="/20201112/your-computer-isnt-yours/">their attempts to explicitly confuse and induce misreporting</a>.</p> <p>To recap:</p> <ul> <li> <p>In 2021, Apple said they’d scan your local files using your own hardware, in service of the police.</p> </li> <li> <p>People got upset, because this is a clear privacy violation and is wholly unjustifiable on any basis whatsoever. (Some people speculated that such a move by Apple was to appease the US federal police in advance of their shipping better encryption features which would otherwise hinder police.)</p> </li> <li> <p>Apple said some additional things that did NOT include “we will not scan your local files”, but did include a confirmation that they intend to ship such features that they consider “critically important”.</p> </li> <li> <p>The media misreported this amended statement, and people calmed down.</p> </li> <li> <p>In late 2022, <a href="https://www.apple.com/newsroom/2022/12/apple-advances-user-security-with-powerful-new-data-protections/">Apple shipped end-to-end encrypted options for iCloud.</a></p> </li> <li> <p>Today, Apple scanned my local files and those scanning programs attempted to talk to Apple APIs, even though I don’t use iCloud, Apple Photos, or an Apple ID. This would have happened without my knowledge or consent if I were not running third-party network monitoring software.</p> </li> </ul> <p>Who knows what types of media governments will legally require Apple to scan for in the future? Today it’s CP, tomorrow it’s cartoons of the prophet (PBUH <small>please don’t decapitate me</small>). 
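</p>

<p>The mechanics of this kind of scanning are not exotic. The sketch below uses plain SHA-256 over a fabricated blocklist purely for illustration (deployed systems, such as Apple’s proposed CSAM detection, use perceptual hashes like NeuralHash over image content, not cryptographic hashes); the point is how little code it takes to check local files against an opaque, remotely updatable database:</p>

```python
# Sketch of clientside scanning against an opaque hash database.
# SHA-256 and the "banned" list are illustrative stand-ins only;
# real systems use perceptual hashing of image content.
import hashlib

def file_hashes(blobs):
    """Hash local file contents, as a scanner daemon might."""
    return {name: hashlib.sha256(data).hexdigest() for name, data in blobs}

def scan(blobs, banned_hashes):
    """Report which local files match the opaque database."""
    return [name for name, h in file_hashes(blobs).items() if h in banned_hashes]

local_files = [
    ("meme.heic", b"fake image bytes 1"),
    ("cat.heic", b"fake image bytes 2"),
]

# The database operator can add entries at any time; the device
# owner only ever sees opaque hashes, never what they represent.
banned = {hashlib.sha256(b"fake image bytes 1").hexdigest()}

print(scan(local_files, banned))  # ['meme.heic']
```

<p>Whoever controls that hash list controls what gets reported. 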
One thing you can be sure of is that this database of images your hardware will now be used to scan for will regularly be amended and updated by people who are not you and are not accountable to you.</p> <p>This is your first and only warning: Stock macOS now invades your privacy via the Internet when browsing <em>local</em> files, taking actions that no reasonable person would expect to touch the network, with iCloud and all analytics turned off, no Apple apps launched (this happened in the Finder, via spacebar preview), and no Apple ID input. You have been notified of this new reality. You will receive no further warnings on the topic.</p> <p>Integrate this data and remember it: macOS now contains network-based spyware <em>even with all Apple services disabled</em>. It cannot be disabled via controls within the OS: you must use third-party network filtering software (or external devices) to prevent it.</p> <p>This was observed on the current version of macOS, macOS Ventura 13.1.</p> <p><small>Aside: I can’t recommend <a href="https://www.obdev.at/products/littlesnitch/index.html">Little Snitch</a> enough. It’s literally the first software I install (via USB) on a fresh macOS, before I even enable Wi-Fi or plug in a network cable.</small></p> <p>A final reminder: if you’ve nothing to hide and you’ve done nothing wrong, those are the times when it is most important to limit information transfer to law enforcement. Law enforcement obtaining data on criminals is not a tragedy. Law enforcement investigating innocent people leads to extreme injustice. You should reject all law enforcement surveillance attempts, obviously if you are a criminal, but <em>especially</em> if you are an innocent.</p> <p>As usual, you can <a href="https://bbs.sneak.berlin/t/apple-has-begun-scanning-your-local-image-files-without-consent/633">comment on and discuss this post on the BBS</a>.</p>Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. 
I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I use macOS software on Apple hardware.Apple is Still Tracking You Without Consent2022-04-09T00:00:00+00:002022-04-10T00:41:36+00:00https://sneak.berlin/20220409/apple-is-still-tracking-you-without-consent<p>In the current version of macOS, Monterey, on every system update on a system containing an M1 chip, such as all the new shiny/fast ARM (“Apple Silicon”) macs, the update process phones home to Apple to obtain a special boot signature, known in Apple jargon as a “ticket”.</p> <p>It does this in a totally unencrypted fashion, over standard plaintext port 80 HTTP (the exact same protocol they banned for use by third-party app developers in the App Store when transmitting private data like unique identifiers that serve as PII) to the host <code class="language-plaintext highlighter-rouge">gs.apple.com</code>. The HTTP request includes unchangeable hardware unique identifiers (chip serial numbers known as ECIDs) that function as a supercookie, and it is visible to your local LAN, your ISP (or hotel or coffee shop), anyone monitoring the network backbones, and of course Apple.</p> <p>This permits anyone listening to see the approximate location of the device, even if they are not proximate to it, because they can observe the client IP (which is equivalent to approximately city-level geolocation) and the serial number of the device.</p> <p>Anyone watching the internet backbones and internet exchanges can see in which city each chip serial number (ECID) is located, and can see where they travel, as these updates are released several times per quarter. 
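</p>

<p>As a toy model of that observer’s view (every ECID, IP address, and city below is fabricated), correlating update requests over time reduces to a trivial join on the unchanging identifier:</p>

```python
# Toy model: a backbone observer logging plaintext update requests.
# Each record is (date, client_ip, ecid); all values are fabricated.
from collections import defaultdict

# Hypothetical city-level geolocation of the observed client IPs
geo = {"203.0.113.5": "Berlin", "198.51.100.7": "Lisbon"}

captured = [
    ("2022-01-26", "203.0.113.5", 1234567890123456),
    ("2022-03-14", "198.51.100.7", 1234567890123456),
]

movements = defaultdict(list)
for date, ip, ecid in captured:
    movements[ecid].append((date, geo[ip]))

# The permanent ECID stitches the sightings into a travel history:
print(movements[1234567890123456])
# [('2022-01-26', 'Berlin'), ('2022-03-14', 'Lisbon')]
```

<p>Because the ECID never changes, every OS update hands such an observer a fresh location fix for the same device. 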
A new request is made on each system update, and users are prompted to enable automatic updates, enabling unattended tracking.</p> <p>The requests are transmitted to a server called <code class="language-plaintext highlighter-rouge">gs.apple.com</code> which is an API run by Apple that provides a service called <a href="https://www.theiphonewiki.com/wiki/Tatsu_Signing_Server">TSS</a> which has been in use since the old iOS days to provide “tickets” (boot signatures) to allow Apple-designed ARM processors (like the Ax and Mx series) to boot.</p> <p>Here’s what it looks like doing an update the last week of Jan 2022 on a <code class="language-plaintext highlighter-rouge">MacBookPro18,2</code> (M1 Max 16” MacBook Pro, 2021):</p> <p>Note that this is the <em>request</em> sent to the server, and that this is sent <em>totally unencrypted</em>.</p> <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>POST /TSS/controller?action=2 HTTP/1.1
Host: gs.apple.com:80
Content-Type: text/xml; charset="utf-8"
Proxy-Connection: Keep-Alive
Pragma: no-cache
User-Agent: com.apple.MobileSoftwareUpdate.UpdateBrainService/1 CFNetwork/1327.0.4 Darwin/21.2.0
Content-Length: 15xxx
Connection: close

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>@ApImg4Ticket</key>
  <true/>
  <key>@BBTicket</key>
  <true/>
  <key>@Baobab,Ticket</key>
  <true/>
  <key>@HostPlatformInfo</key>
  <string>mac</string>
  <key>@SE,Ticket</key>
  <true/>
  <key>@VersionInfo</key>
  <string>libauthinstall-850.0.2</string>
  <key>ANE</key>
  <dict>
  --- some lines redacted ---
  <key>ApBoardID</key>
  <integer>10</integer>
  <key>ApChipID</key>
  <integer>24577</integer>
  <key>ApECID</key>
  <integer>REDACTED UNIQUE APPLICATION PROCESSOR EC ID (a 16 digit integer)</integer>
  <key>ApNonce</key>
  <data>
  base64 data redacted
  </data>
  <key>ApProductionMode</key>
  <true/>
  <key>ApSecurityDomain</key>
  <integer>1</integer>
  <key>ApSecurityMode</key>
  <true/>
  --- additional lines redacted ---
</code></pre></div></div> <p>If you block this connection, or you block the (also unencrypted) insecure OCSP connections made by the mac (<a href="/20201112/your-computer-isnt-yours/">previously reported on here</a>) during the update process, the update will fail and your computer will remain out of date and vulnerable to unpatched security flaws.</p> <p>You can be insecure to everyone, or you can be insecure to Apple. There’s no third choice.</p> <p>Apple does not seem to be interested in any way whatsoever in providing you security from malicious acts undertaken by Apple (either willfully, or acts which they are compelled to undertake). This is a huge risk, given that Apple is in the same jurisdiction that earnestly tried to assassinate Julian Assange for doing a journalism. You’d think they’d know better by now. (Or, maybe, given that the CCP has forced them to maintain backdoors for all iCloud users in China, they’ve just decided that you don’t need a weatherman to know which way the wind blows.)</p> <p>This insecurity exists in the latest/current version of macOS, macOS Monterey 12.3.1. I assume without verification that it exists on iOS as well.</p> <p>Apple, please stop transmitting tracking identifiers in plaintext across the network. You’ve already <a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT">maintained the backdoors in your end-to-end encryption for the FBI</a>; you’re well past the line where someone may reasonably suspect you of preserving tracking ability for the US military intelligence community as well. There’s no excuse for using plaintext any longer; you even banned its use by developers in the App Store under the whole App Transport Security initiative. 
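</p>

<p>For a sense of how little effort an eavesdropper needs, extracting the tracking identifier from a captured request body like the one shown above takes a few lines of standard-library Python (the plist below is a trimmed stand-in with a fabricated ECID):</p>

```python
# Parse the ApECID out of a captured plaintext TSS request body.
# The plist here is a trimmed, fabricated stand-in for the real payload.
import plistlib

captured_body = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>ApChipID</key>
    <integer>24577</integer>
    <key>ApECID</key>
    <integer>1234567890123456</integer>
</dict>
</plist>"""

request = plistlib.loads(captured_body)
ecid = request["ApECID"]  # the permanent, device-unique supercookie
print(ecid)  # 1234567890123456
```

<p>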
Shape the fuck up.</p> <p>You’ve even built a (premium, paid) service (called iCloud Private Relay) specifically for concealing client IPs from advertisers and email tracking services for precisely this reason. It’s not like you can claim ignorance of the privacy issue. For some reason you don’t use this same service for protecting some of your users’ most security sensitive (and potentially life safety critical) information (their location) during a critically important and frequent/routine operation (operating system updates) that nobody should be skipping.</p> <p>Continued transmission of plaintext identifiers will be assumed as malicious intent. Fix it.</p>In the current version of macOS, Monterey, on every system update on a system containing an M1 chip, such as all the new shiny/fast ARM (“Apple Silicon”) macs, the update process phones home to Apple to obtain a special boot signature, known in Apple jargon as a “ticket”.Unstoppable Payments Are Coming.2022-02-13T00:00:00+00:002022-02-14T08:39:07+00:00https://sneak.berlin/20220213/unstoppable-payments-are-coming<p>Unstoppable payments are coming. Society must integrate this inconvenient fact.</p> <p>There’s a protest going on in Canada at the moment. It may or may not have transformed into a riot. I don’t know enough about it to know if I agree with the protesters or not, and I don’t care if you have an opinion on the protest either.</p> <p>The protest needed money for supplies and practical logistic needs. They raised a bunch from sympathetic people all over the world (and especially the USA) on GoFundMe. GoFundMe, not wanting the controversy associated with its brand, censored the fundraising effort on their site. 
<a href="https://www.npr.org/2022/02/10/1080022827/a-canadian-judge-has-frozen-access-to-donations-for-the-trucker-convoy-protest">The funds were then frozen by a Canadian judge, post-donation, pre-disbursement-to-protesters.</a> This is, of course, only possible on a centralized platform that is in custody of the money as a middleman.</p> <p>There are a lot of opinions about whether or not you should be free to give money to whomever you want. I have my own (somewhat extreme) opinions here which longtime friends and readers can probably guess at, but they’re not relevant to the point of this post. You no doubt have your own opinions as well, which I also ask you to set aside for a moment.</p> <p>The protesters switched to using a less well-known and less willing-to-censor-due-to-controversy fundraising website called <a href="https://www.givesendgo.com/">GiveSendGo, at this moment “down for maintenance” after being hacked today</a>.</p> <p>Hacktivism used to be a thing of which I was almost always personally supportive, but in the last decade we’ve seen an emergence of hacktivists of all ideological stripes, including pro-military, violent nationalistic types, so you can no longer remotely infer any semblance of righteousness based solely on “able to engage in hacktivism”. 
(I think that the <code class="language-plaintext highlighter-rouge">#OpISIS</code> effort by Anonymous to get CloudFlare to start arbitrary censorship of customers in 2015 was the first time I noticed this state of affairs.)</p> <p><a href="https://twitter.com/MikaelThalen/status/1493056860638773248">Today we have learned that some people with some strong opinions about the protest in Canada have breached GiveSendGo’s private databases and have defaced their website</a> and are sending out an archive file containing tens of thousands of names of donors to journalists, although it would appear that the data was probably already semipublic due to <a href="https://twitter.com/MikaelThalen/status/1493062098288447488">some pre-existing clownshoes incompetence by the administrators at GiveSendGo</a>.</p> <p>I told you that story to tell you about the future. I don’t have strong opinions one way or another about the protest, or about these specific payment/fundraising systems, or this hack. (I care a lot about the concept of deplatforming in general, but that’s, again, a topic for another time.)</p> <p>I think that rather than celebrating or condemning this, we should prepare, both emotionally on a personal level and collectively as a society, for what will happen when the inevitable occurs: totally censorship-resistant payments that even governments cannot meaningfully stop (only hinder/make more inconvenient).</p> <p>This has already occurred, so we know unambiguously that it is coming. In the next ten years, such payments will be widely available to everyone with uncensored internet access who wants them.</p> <p>The integration of Mobilecoin with Signal is a great example.
The wallets are pseudonymous and can be paid without using Signal, the Mobilecoin blockchain itself has transactional privacy built in (who is paying whom is not public), and it uses Signal’s existing end-to-end messaging encryption and privacy features to arrange for coordinating payments between parties, such that the service itself has no idea what anyone’s balances are or who is paying whom. You simply can’t stop payments to anyone on the system without stopping the whole system. Even if you ban a single user from Signal based on their phone number, every user can still pay the payment address directly, and those addresses can be easily disseminated from person to person because the end-to-end encrypted nature of Signal makes it impossible to selectively censor messages.</p> <p>Regardless of how you feel about Signal, or Mobilecoin, or Signal’s integration with Mobilecoin, or cryptocurrency in general: this is a demo of the fact that the concept of end-user-accessible, completely private payments is technically possible, and indeed more than technically possible: it exists today, albeit with a tiny userbase. It’s only a matter of time now until that technology (whether Signal or a competitor) is widely available and evenly distributed to everyone who wants to use it.</p> <p><small> Before you go there: the Mobilecoin blockchain does not use the extremely energy intensive Proof-Of-Work (PoW) system used by most popular blockchains that has become such a point of complaint against them; not that that is relevant to the point of this post, just that I think it would be remiss in introducing the chain without mentioning that fact.</small></p> <p>You may, of course, think that Signal should be banned from the App Store. Perhaps it will be over this. Schneier <a href="https://www.schneier.com/blog/archives/2021/04/wtf-signal-adds-cryptocurrency-support.html">thinks that adding this functionality to Signal was “an incredibly bad idea”</a>, presumably for this reason. 
It’s a legitimate claim that broadly enabling private payments may actually hinder their ability to broadly enable private messaging. From a technical standpoint, however, there is no difference whatsoever, and Schneier should be smart enough to recognize this.</p> <p>Now that privacy-focused cryptocurrencies exist, there is no way to have private and uncensorable end-to-end encrypted messaging without also simultaneously enabling private payments. Payment coordination is a subset of messaging.</p> <p>If it’s yanked from the <a href="https://www.vox.com/recode/2019/10/23/20927577/apple-hong-kong-protest-app-democracy">totalitarian, censorship-happy Apple App Store</a> and unavailable on iPhones, people needing such payments functionality will switch to or augment with a platform that allows sideloading of apps. The most popular smartphone OS in the world is such a platform.</p> <p>You may, of course, think that use and operation of such completely unstoppable payment infrastructure should be prohibited by law as it eliminates existing government/legal control over who can pay whom and why. They could outlaw such payments, just as they successfully outlawed cocaine and speeding, but, critically, that doesn’t <em>actually stop them from happening</em>; it only introduces penalties for people when caught.</p> <p>The cat is now out of the bag.</p> <p>Fact is, though, it is now technically feasible, and the only effective counters from the authoritarians who would stop it in advance are aggressive censorship of apps that implement it (which is, in short order, any end-to-end encrypted messenger), or generalized internet network censorship that blocks certain protocols like these sorts of p2p payment systems.
This is the sort of infrastructure that China and Russia have built into the internet in their countries.</p> <p>It’s only a matter of time before these fundraising efforts aren’t going to operate on centralized websites that can be trivially shut down, and anyone can pay anyone they want, in almost any country they want, without anyone being able to stop them.</p> <p>What does our world look like, then? Is that okay? Is that terribly bad? What are the potential upsides? What are the failure modes? Think beyond throwing out terrifying <a href="https://en.wikipedia.org/wiki/Thought-terminating_clich%C3%A9">thought-terminating cliche</a> set pieces like “financing terrorism” and consider down to the actual practical details what true global any-to-any payments enable.</p> <p>I think that a lot of people don’t like to confront certain difficult facts, such as the one that there are many things that the government and entire “justice system” cannot actually meaningfully control. Use of drugs, the drug trade, and the sex trade stand as shining examples of the fact that there are many things that governments simply cannot actually stop with the tools they usually use for such things.</p> <p>I don’t think the US, for example, wants to go full <a href="https://en.wikipedia.org/wiki/Great_Firewall">Great Firewall-style internet censorship</a> (they didn’t for Bitcoin), so that leaves only practically ineffective methods of attempting to prevent or dissuade determined users from using this new technology.</p> <p>It didn’t stop people’s taste for liquor during prohibition, and it didn’t stop people’s taste for hookers and blow then or now.</p> <p>My own personal theory (possibly biased due to my general optimism regarding human societies, borne out by the progression of civilization the last 100 years or so) is that the threat of terrorism and violence is massively overblown (“if it bleeds, it leads”…) and that this will ultimately be a net benefit to all people who have
access to such systems. Being able to beam cash to anyone you want, instantly, privately, and without any third party applying second-guessing to what you are allowed to do with your own cash is insanely powerful stuff, and I think that, like with most sharp tools unleashed upon society to wield as they see fit, the benefits to people will massively outweigh the harms. We’ve seen this again and again. I’m not sold on the idea that the reason terrorists don’t more often do mass murder is simply a lack of financing.</p> <p>It’s also possible that I’m missing something! My optimism is sometimes misplaced. You are encouraged to directly answer the questions posed above, via the link below.</p> <div class="well"> Community comments on this post can be made and read on <a href="https://bbs.sneak.berlin/t/unstoppable-payments-are-coming/578">the BBS</a>. All are welcome to participate. </div>

<h1><a href="https://sneak.berlin/20210425/signal-is-wrecking-your-images-and-videos">Signal Is Wrecking Your Images and Videos</a> (2021-04-25)</h1>

<p><img src="https://dl.sneak.cloud/2021/2021-04-18.sleepybeans.a7r4.07733.jpg" class="img-fluid" alt="sleepy beans" style="max-width: 100%; height: auto;" /></p> <p>I’m a photographer. I care a lot about pixels and image quality and resolution. My main workstation has three 5120x2880 5k displays that run at 218ppi.</p> <p>I’m also someone who really likes music and audio engineering, and who has been known to DJ occasionally.</p> <p>I also come from a network and storage engineering background, so I’m not entirely ignorant of the considerations that are made in terms of trade-offs for size/cost versus quality. I remember once in the 90s being totally flabbergasted upon performing an ABX (computer-assisted double-blind test) in ideal conditions to find 160kbps MP3 (lossy) compression entirely acoustically transparent across a range of different audio samples.
I encourage you to try the same sometime, and I’ll bet you any amount of money that you wish that you cannot reliably distinguish 320kbps MP3 from FLAC/WAV, ever.</p> <p>Today I sent <a href="https://dl.sneak.cloud/2021/2021-04-18.untitled.a7r4.07723.jpg">a picture I’d taken of Tutu</a> with my Sony A7R4 camera via Signal. I’d loaded the raw in Lightroom, done some of the usual silly traditional photographery things one does to portraits in post-production, and exported it using one of my export presets, which limits the long edge resolution to a maximum of 4000 pixels and the overall total file size to a maximum of 25MB (along with slapping my email address on the bottom corner because we still haven’t figured out how to reliably attach metadata to bitmaps across multiple generations/edits). I think in 2021, 25MB or so is probably the upper “reasonable” limit of file size for passing around a single compressed image. In this case, the exported JPEG file was only 3.9MB, quite a bit below that.</p> <p>Signal took that 4000x2667 JPEG image comprising precisely 3,916,886 bytes, encrypted it, and transmitted it to my friend. She received a <em>different</em> 4000x2667 JPEG image comprising 784,524 bytes: 80% smaller.</p> <p>Signal threw away 80% of the data in my already-compressed image. (The original image, not the export, was 9504x6336 and 67,358,106 bytes.) 
I had already compressed it once by eliminating 94% of the data from when it came out of my camera.</p> <p>Camera raw: <strong>67MB</strong></p> <p>My “reasonable person” standard for maximum filesize for an image to be sent across the internet in 2021: <strong>25MB</strong></p> <p>My actual file export: <strong>3.9MB (-94%)</strong></p> <p>Signal’s unasked-for, silent edit: <strong>0.8MB (another -80%)</strong></p> <p>Did you know that the everyday, normal-sized (<4MB) images you send via Signal are being silently altered in transit to look like dogshit?</p> <p>I didn’t.</p> <p>Now we both do.</p> <p>I don’t think Signal should do this.</p> <p>If they feel it’s absolutely necessary for them to do this to continue to exist as a free service: they should be much, <em>much</em> clearer about the fact that this is going to happen, at the time of recompression, and permit you to opt out of sending an altered file. As well as, just, you know, not touching images that are already reasonably sized.</p> <p>What if I were sending evidence to my attorney? You just fucked that up, both from a file integrity standpoint, as well as an image quality one. I hope it wasn’t exculpatory!</p> <p>You’re a messenger. Don’t edit the text in the messages I send without my consent, and don’t edit the bytes in the attached files I send without my consent. Neither one is okay.</p> <p>(It happens with videos too, much worse. I’m just using photos as an example.)</p> <h2 id="security-footnote">Security Footnote</h2> <p>Before you get worried about message privacy, note that Signal messages are fully end-to-end encrypted, and all of this quiet fuckery happens on your own device, in the Signal client software, before it gets encrypted and transmitted. 
This isn’t happening inside the Signal <em>service</em>, this is a misfeature in the Signal <em>software</em>, designed to benefit the operators of Signal-the-service, at your expense (or at least at the expense of the quality/integrity of your files).</p> <p>This behavior was observed on 2021-04-25 using Signal Desktop for macOS version <code class="language-plaintext highlighter-rouge">1.37.3</code>.</p> <h1 id="a-note-about-the-open-source-signal-client">A Note About The Open Source Signal Client</h1> <p>It’s a common misconception that you can’t make forks of the <a href="https://github.com/signalapp/Signal-Desktop">Signal client software</a> that work with the main, Signal-organization-operated centralized Signal service. This is false. The Signal client software is AGPL (sort of a silly choice for client software, tbh) and can be forked at will by anyone, and you are free to leave the main upstream Signal service API URLs in the code. (You are probably not free to use the string “signal” or “Signal” in the name of the derived work though, but that’s an entirely separate trademark issue.) The Signal service Terms of Service (TOS) govern <em>API clients</em>, not the publication of software, and at the moment the Signal service TOS does not prohibit use of the Signal service by authorized users that are using non-official clients (although it’s fairly obvious that the Signal founder is personally opposed to that). In any case they apply to different entities: the AGPL license for the Signal client software applies to distribution of derived works; the Terms of Service for the Signal service API governs end-users who connect to those APIs.</p> <p>(I’m not even sure you can govern the use of client software in a Terms of Service for an API, that would be like Google claiming that you’re not authorized to use google.com if you’re loading it in Firefox. 
That’d be insane.)</p> <p>As you can probably tell, I’m really hoping that someone makes a much better client for the Signal service. The “official” ones from the Signal service operators are pretty lame.</p> <div class="well"> Community comments on this post can be made and read on <a href="https://bbs.sneak.berlin/t/signal-is-wrecking-your-images-and-videos/216">the BBS</a>. All are welcome to participate. </div>

<h1><a href="https://sneak.berlin/20210424/how-not-to-run-a-vulnerability-disclosure-program">How Not To Run A Vulnerability Disclosure Program</a> (2021-04-24)</h1>

<p>I found a small vulnerability in American Express today. Nothing major, but something they’ll definitely want to fix.</p> <p>I searched and found that they <a href="https://www.americanexpress.com/us/security-center/cybersecurity.html">have a special dedicated vulnerability reporting email address</a>. Cool.</p> <p>I wrote them an email outlining the issue, and mentioning that I intend to publish the finding in 30 days.</p> <p>They sent me back an autoresponse, saying that to “complete the submission of this report” I have to follow a link. That’s no issue; however, they also (inaccurately) claimed:</p> <blockquote> <p>Before you submit the report please read the <a href="https://www.hackerone.com/privacy">HackerOne Privacy Policy</a>. By clicking on the link below, you confirm that you have read and agree to the terms of the Policy.</p> </blockquote> <p>First off, fuck you. You don’t get to assert that I agreed to some legal contract (with a third party, no less!) because I clicked a link you sent me. That’s not how any of this works.</p> <p>Second, why would you make the <em>free donation</em> of important business security information by a complete volunteer to a <em>financial institution used by millions</em> contingent upon agreeing to some random third party service’s legal contract? Are you fucking stupid?
You want <em>zero</em> friction in this, even if you have to hire three shifts of inbound security disclosure email ticket triage staff. It’s not like this is Joe Appalachia’s Bumblefuck Credit Union: you’re in the <em>top 10</em> of the Fortune 50 and have like a billion customers, and the economy of scale here is very nearly visible from the Kuiper belt.</p> <p>I clicked the link (without reading the HackerOne Privacy Policy, because I can’t possibly have agreed to a contract I have never even seen before). I use <a href="https://noscript.net/">NoScript</a>, which blocks Javascript from running in my browser. The page completely failed to render: a blank white screen, not even an “enable javascript, pretty please!” error. I can only assume it did not complete my submission.</p> <p>AmEx has now twice refused my free donation of security information: first when they handed me off to some third-party service bot which demanded I agree totally to their terms or fuck directly off, and second when the third-party service they picked turned out to be run by idiots that have decided that <a href="https://developer.mozilla.org/en-US/docs/Glossary/Graceful_degradation">graceful degradation in the face of feature incompatibility</a>, one of the core foundational tenets of the world wide web <em>since its invention</em>, despite a nice two-decade run simply isn’t important anymore in 2021, and that serving blank pages to… you know, security professionals with javascript disabled (pretty much browser security tip #1), is totally fine.</p> <p>(Anyone competent and serious about workstation security is either using a whole boatload of different virtual machines for browsing the web, which is an inconvenient pain in the dick, browses with javascript off by default, or both.)</p> <h2>"Disable the fucking scripts." 
—Ed Snowden</h2> <p><small>This is a real, actual Snowden quote, as reported verbatim by Bart Gellman in his book, <a href="https://www.penguinrandomhouse.com/books/316047/dark-mirror-by-barton-gellman/">Dark Mirror</a>. (Highly <a href="https://vk.com/wall-160012486_149">recommended</a>, btw.) He was talking about browser Javascript <em>specifically</em>.</small></p> <p>I fired up a VM just for them, not because I want to submit the ticket, but because now I’m curious how deep this shit-filled rabbit hole goes. Loading the submission page prompts me to create an account (value donation friction event #3, OF COURSE) and includes remote-loaded Google ad tracking spyware javascript, too. Why do I even bother? Ain’t one hacker over at HackerOne, it seems.</p> <p>Check back here on 24th May (my 30 day embargo period was specified in the email they seem to now have ignored) to read about AmEx doing some super duper bush-league amateur hour security mistake, which I’ll give you dollars to donuts is still happening on that date. Cheers!</p>

<h1><a href="https://sneak.berlin/20210202/macos-11.2-network-privacy">macOS 11.2 Network Privacy</a> (2021-02-02)</h1>

<p>Yesterday on 1 Feb 2021, Apple released macOS 11.2 (Big Sur .2), which <a href="https://www.zdnet.com/article/apple-removes-feature-that-allowed-its-apps-to-bypass-macos-firewalls-and-vpns/">removes the <code class="language-plaintext highlighter-rouge">ContentFilterExclusionList</code> that prevented user content filtering applications from blocking Apple first-party apps from accessing the network</a>.
This misfeature, originally introduced on 12 November 2020 in macOS 11.0 (the only OS available on the new M1/ARM “Apple Silicon” macs) and <a href="https://twitter.com/patrickwardle/status/1349487256180908032">initially publicized on Twitter by Patrick Wardle, creator of LuLu</a>, was a major privacy violation, permitting Apple apps to phone home even when you used a firewall like <a href="https://objective-see.com/products/lulu.html">LuLu</a> or <a href="https://www.obdev.at/products/littlesnitch/index.html">Little Snitch</a> to block them.</p> <p>Fortunately, it’s now removed, and users like me who don’t use iCloud, the App Store, Siri, iMessage, or FaceTime (and have Apple’s analytics disabled) can now upgrade to Big Sur, knowing that it is possible to prevent the device from constantly sending usage data across the network to the manufacturer.</p> <p>There are several privacy/usage leaks remaining in the OS, but now they can be effectively blocked without affecting the overall operation of the device.</p> <h1 id="test-prep">Test Prep</h1> <p>The following testing occurred today, using an M1 Macbook Air with Wi-Fi MAC <code class="language-plaintext highlighter-rouge">18:3e:ef:b9:2a:01</code>.</p> <p>The system was booted from 11.2 USB installer media, and <code class="language-plaintext highlighter-rouge">diskutil zeroDisk force /dev/disk0</code> was run from terminal. <code class="language-plaintext highlighter-rouge">nvram -c</code> was used to wipe all stored NVRAM settings, including network configuration and credentials. 
The system upon reboot was missing its OS as well as its macOS and system recovery partitions (as expected), and booted to the <a href="http://support.apple.com/mac/restore">“exclamation point in a circle”</a> awaiting OS and firmware restore.</p> <p>The machine was <a href="https://support.apple.com/guide/apple-configurator-2/revive-or-restore-a-mac-with-apple-silicon-apdd5f3c75ad/mac">placed in DFU mode</a>, connected with a cable to another machine running <code class="language-plaintext highlighter-rouge">Apple Configurator 2</code>, and <a href="http://updates-http.cdn-apple.com/2021WinterFCS/fullrestores/071-00846/DCCBFDF4-0B4E-4628-A843-F8755C863FB0/UniversalMac_11.2_20D64_Restore.ipsw"><code class="language-plaintext highlighter-rouge">UniversalMac_11.2_20D64_Restore.ipsw</code></a> (SHASUM <code class="language-plaintext highlighter-rouge">7315f657df2a14b838b9d51eb49cdfff5be090e7</code>) was dragged directly onto the “DFU” icon for the USB2-attached MBAir, restoring the firmware and OS.</p> <p><img class="img-fluid" src="/s/2021/02/sysinfo.jpg" /></p> <p>On first boot, the system did not connect to the previously-automatically-connected available Wi-Fi network, suggesting successful clearing of NVRAM (which normally persists your Wi-Fi password across reinstalls/OSes).</p> <p>The system completed the out-of-box-setup (initial country selection, user account creation, et c) without being connected to the network. Location services, analytics, “Screen Time”, Siri, and Touch ID were all declined.</p> <p>(Has anyone else noticed that Apple’s setup wizards are getting more and more aggressive about prompting you to enable their services, even after you click no?)</p> <h1 id="whats-it-doing">What’s It Doing?</h1> <p>Pcapping began.</p> <p>The desktop appeared, and the system was connected to a dedicated wi-fi network for the purpose, <code class="language-plaintext highlighter-rouge">DeviceUnderTest</code>. 
60 seconds or so of “no activity” <a href="https://www.youtube.com/watch?v=ISpsK5MwnXY">simply staring at the desktop</a> elapsed, and the system was rebooted, automatically reconnecting to the test network.</p> <p>Wi-Fi was then disabled. The pcap on the gateway was filtered thus:</p> <p><code class="language-plaintext highlighter-rouge">tcpdump -lne -r 2021-02-02.firstboot-unfiltered-11.2.pcap \ -w 2021-02-02.firstboot.filtered-b92a01-11.2.pcap ether host 18:3e:ef:b9:2a:01</code></p> <p>The following is all packets to or from <code class="language-plaintext highlighter-rouge">18:3e:ef:b9:2a:01</code>: <a href="/s/2021/02/2021-02-02.firstboot.filtered-b92a01-11.2.pcap.gz"><code class="language-plaintext highlighter-rouge">2021-02-02.firstboot.filtered-b92a01-11.2.pcap.gz</code></a></p> <p>In this few minutes, the system generated <strong>38 megabytes of network traffic</strong>.</p> <p>The following is a list of all hostnames queried in DNS by Big Sur in this pcap:</p> <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>nostromo:~$ tshark -r 2021-02-02.firstboot.filtered-b92a01-11.2.pcap \ -T fields -e dns.qry.name | sort | uniq | grep -v "_tcp.local" | grep -v "_dns-sd" | grep -v "in-addr.arpa" 1-courier.push.apple.com 1-courier.sandbox.push.apple.com 46-courier.push.apple.com 49-courier.push.apple.com a1051.b.akamai.net a1864.gi3.akamai.net a2047.dscb.akamai.net a239.gi3.akamai.net albert.apple.com albert.gcsis-apple.com.akadns.net api.apple-cloudkit.com api.smoot.apple.com apple-finance.query.yahoo.com appleid.apple.com bag-smoot.v.aaplimg.com bag.itunes.apple.com c.apple.news captive.apple.com captive.g.aaplimg.com cdn.apple.com.c.footprint.net cf.iadsdk.apple.com configuration.apple.com configuration.ls.apple.com cs9.wac.phicdn.net e10499.dsce9.akamaiedge.net e11408.d.akamaiedge.net e12919.dscd.akamaiedge.net e1329.g.akamaiedge.net e16126.dscg.akamaiedge.net e17437.dscb.akamaiedge.net 
e5977.dsce9.akamaiedge.net e673.dsce9.akamaiedge.net e6858.dsce9.akamaiedge.net e6987.a.akamaiedge.net e6987.e9.akamaiedge.net gateway.fe.apple-dns.net gateway.icloud.com gdmf.apple.com gdmf.apple.com.akadns.net geo-applefinance-cache.internal.query.g03.yahoodns.net get-bx.g.aaplimg.com gsa.apple.com gsa.apple.com.akadns.net gsp-ssl.ls.apple.com gspe1-ssl.ls.apple.com gspe21-ssl.ls.apple.com gspe35-ssl.ls.apple.com help.apple.com iadsdk.apple.com init-p01md.apple.com init.ess.apple.com init.itunes.apple.com init.push-apple.com.akadns.net init.push.apple.com internalcheck.apple.com lcdn-locator-usnkq.apple.com.akadns.net lcdn-locator.apple.com mesu.apple.com ocsp.apple.com ocsp.digicert.com pancake.apple.com pancake.g.aaplimg.com pds-init.ess.apple.com stocks-sparkline.apple.com swcdn.apple.com swdist.apple.com swscan.apple.com time.apple.com weather-data.apple.com weather-edge.apple.com weather-edge.news.apple-dns.net www.apple.com xp.apple.com nostromo:~$ tshark -r 2021-02-02.firstboot.filtered-b92a01-11.2.pcap \ -T fields -e dns.qry.name | sort | uniq | grep -v "_tcp.local" | grep -v "_dns-sd" | grep -v "in-addr.arpa" | wc -l 74 nostromo:~$ </code></pre></div></div> <p>All of these 73 hostname lookups happened <em>without launching any apps</em>: no App Store, analytics off, no iTunes, nothing. No Apple ID has been used on the device. 
The device was set up, analytics and network services declined, connected to network, sat at desktop, rebooted, sat at desktop, and then shut down.</p> <p>Let’s see what happens when we launch some apps:</p> <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>nostromo:~$ ssh root@gateway.local "tcpdump -s 9999 -i ens3 -w - ether host 18:3e:ef:b9:2a:01" | pv > 2021-02-02.applaunch.11.2.pcap </code></pre></div></div> <p>The system is reconnected to Wi-Fi, and rebooted.</p> <p>One by one, the following apps are launched (via the command-space launcher, which also sends each and every keystroke typed into it over the network):</p> <ul> <li>App Store</li> <li>News</li> <li>TV</li> <li>Books</li> <li>Maps</li> </ul> <p><img class="img-fluid" src="/s/2021/02/appstore-spyware.jpg" /></p> <p>The “we’re going to upload your hardware serial number” non-consent “consent” screen on these was clicked through, and then the splash pages were permitted to load, and then the apps were closed. The system was shut down.</p> <p>An additional 47 megabytes of traffic was generated in this additional 2-3 minutes: <a href="/s/2021/02/2021-02-02.applaunch.11.2.pcap.gz">2021-02-02.applaunch.11.2.pcap.gz</a>.</p> <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>nostromo:~$ tshark -r 2021-02-02.applaunch.11.2.pcap -T fields -e dns.qry.name | sort | uniq | grep -v "_tcp.local" | grep -v "_dns-sd" | grep -v "in-addr.arpa" tshark: The file "2021-02-02.applaunch.11.2.pcap" appears to have been cut short in the middle of a packet. 
1-courier.push.apple.com 1-courier.sandbox.push.apple.com 26-courier.push.apple.com 34-courier.push.apple.com a1806.dscb.akamai.net a1838.dscb.akamai.net a1864.gi3.akamai.net a1956.dscb.akamai.net a2047.dscb.akamai.net amp-api.apps.apple.com api-edge.apps.apple.com api-glb-den.smoot.apple.com api-glb-usw2c.smoot.apple.com api.apple-cloudkit.com apple-finance.query.yahoo.com apple.com apple.comscoreresearch.com appleid.apple.com apps.mzstatic.com ax.itunes.apple.com bag.itunes.apple.com books.apple.com buy.itunes.apple.com c.apple.news captive.apple.com captive.g.aaplimg.com cdn.smoot.apple.com cdn.smoot.g.aaplimg.com cdn2.smoot.apple.com cf.iadsdk.apple.com client-api.itunes.apple.com configuration.apple.com configuration.ls.apple.com cs9.wac.phicdn.net e10499.dsce9.akamaiedge.net e11408.d.akamaiedge.net e12919.dscd.akamaiedge.net e1329.g.akamaiedge.net e14313.g.akamaiedge.net e16126.dscg.akamaiedge.net e17437.dscb.akamaiedge.net e3925.dscx.akamaiedge.net e5949.dscg.akamaiedge.net e5977.dsce9.akamaiedge.net e673.dsce9.akamaiedge.net e673.dscx.akamaiedge.net e6858.dsce9.akamaiedge.net e6987.a.akamaiedge.net e6987.e9.akamaiedge.net e8143.dscb.akamaiedge.net gateway.fe.apple-dns.net gateway.icloud.com gdmf.apple.com gdmf.apple.com.akadns.net geo-applefinance-cache.internal.query.g03.yahoodns.net get-bx.g.aaplimg.com gsp-ssl.ls.apple.com gspe1-ssl.ls.apple.com gspe19-ssl.ls.apple.com gspe21-ssl.ls.apple.com gspe35-ssl.ls.apple.com help.apple.com humb.apple.com iadsdk.apple.com init.itunes.apple.com init.push-apple.com.akadns.net init.push.apple.com is1-ssl.mzstatic.com is2-ssl.mzstatic.com is3-ssl.mzstatic.com is4-ssl.mzstatic.com is5-ssl.mzstatic.com itunes.apple.com js-cdn.music.apple.com mesu-cdn.origin-apple.com.akadns.net mesu.apple.com news-assets.apple.com news-client.apple.com news-client.news.apple-dns.net news-edge.apple.com news-edge.origin-apple.com.akadns.net news-events.apple.com news-events.news.apple-dns.net ocsp.apple.com ocsp.digicert.com 
pancake.apple.com pancake.g.aaplimg.com pds-init.ess.apple.com play.itunes.apple.com s.mzstatic.com sb.tv.apple.com se-edge.itunes.apple.com sf-api-token-service.itunes.apple.com smoot-api-glb-den.v.aaplimg.com smoot-searchv2-usw2c.v.aaplimg.com stocks-sparkline.apple.com support-sp.apple.com swscan.apple.com time.apple.com uts-api.itunes.apple.com weather-data.apple.com weather-edge.apple.com weather-edge.news.apple-dns.net www.apple.com xp.apple.com nostromo:~$ tshark -r 2021-02-02.applaunch.11.2.pcap -T fields -e dns.qry.name | sort | uniq | grep -v "_tcp.local" | grep -v "_dns-sd" | grep -v "in-addr.arpa" | wc -l tshark: The file "2021-02-02.applaunch.11.2.pcap" appears to have been cut short in the middle of a packet. 106 nostromo:~$ </code></pre></div></div> <p>105 unique hosts looked up this time. Again, this is on a system that <em>does not opt in to any Apple services</em>: no iCloud, no FaceTime, no iMessage, no App Store apps, no Siri, no analytics.</p> <p>The system was reconnected to the network, and <a href="/s/2021/02/11.2-block-apple-hosts.sh">11.2-block-apple-hosts.sh</a> was downloaded and executed, placing the combined list of hosts into <code class="language-plaintext highlighter-rouge">/etc/hosts</code> in an attempt to perform some simple hostname-based blocking. 
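That script isn’t reproduced here, but the general shape of this kind of hosts-file blocking is easy to sketch in shell; note that the file names and the three sample hostnames below are placeholders, not the actual combined list from the pcaps:

```shell
# Sketch only: turn a list of observed hostnames (one per line) into
# /etc/hosts-style null-route entries. "observed-hosts.txt" stands in for
# the combined hostname list extracted from the captures above.
printf '%s\n' ocsp.apple.com iadsdk.apple.com xp.apple.com > observed-hosts.txt
while read -r host; do
  printf '0.0.0.0\t%s\n' "$host"
done < observed-hosts.txt > blocked-hosts.txt
# Review blocked-hosts.txt, then append it to /etc/hosts as root.
cat blocked-hosts.txt
```

This only covers names that were on the list when it was generated, of course; any hostname Apple adds later resolves normally until you re-capture and regenerate.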
The system was then shut down.</p> <p>Pcapping resumed.</p> <p>The system was booted to the desktop, the cursor moved around, and rebooted to the desktop once more.</p> <p>The same apps were again one by one launched in order and terminated:</p> <ul> <li>App Store</li> <li>News</li> <li>TV</li> <li>Books</li> <li>Maps</li> </ul> <p>The system was shut down and the resulting file: 104KB (~one tenth of a megabyte) comprising <a href="/s/2021/02/2021-02-02.hostblocking.11.2.pcap.gz">2021-02-02.hostblocking.11.2.pcap</a>.</p> <p>It’s still sending DNS traffic trying to look up the following hostnames:</p> <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>nostromo:~$ tshark -r 2021-02-02.hostblocking.11.2.pcap -T fields -e dns.qry.name | sort | uniq | grep -v "_tcp.local" | grep -v "_dns-sd" | grep -v "in-addr.arpa" tshark: The file "2021-02-02.hostblocking.11.2.pcap" appears to have been cut short in the middle of a packet. 24-courier.push.apple.com 36-courier.push.apple.com a1806.dscb.akamai.net a1864.gi3.akamai.net api-edge.apps.apple.com api-glb-usw2c.smoot.apple.com apple-finance.query.yahoo.com apple.comscoreresearch.com apps.mzstatic.com c.apple.news captive.apple.com captive.g.aaplimg.com cf.iadsdk.apple.com configuration.apple.com configuration.ls.apple.com e10499.dsce9.akamaiedge.net e11408.d.akamaiedge.net e12919.dscd.akamaiedge.net e1329.g.akamaiedge.net e17437.dscb.akamaiedge.net e5977.dsce9.akamaiedge.net e673.dsce9.akamaiedge.net e673.dscx.akamaiedge.net e6987.a.akamaiedge.net e6987.e9.akamaiedge.net gateway.fe.apple-dns.net gateway.icloud.com gdmf.apple.com gdmf.apple.com.akadns.net geo-applefinance-cache.internal.query.g03.yahoodns.net get-bx.g.aaplimg.com gsp-ssl.ls.apple.com gspe1-ssl.ls.apple.com gspe19-ssl.ls.apple.com gspe35-ssl.ls.apple.com help.apple.com iadsdk.apple.com init.itunes.apple.com init.push-apple.com.akadns.net init.push.apple.com mesu-cdn.origin-apple.com.akadns.net mesu.apple.com 
news-edge.apple.com news-edge.origin-apple.com.akadns.net ocsp.apple.com pds-init.ess.apple.com play.itunes.apple.com push.apple.com smoot-searchv2-usw2c.v.aaplimg.com swdist.apple.com.edgekey.net swscan.apple.com xp.apple.com nostromo:~$ </code></pre></div></div> <p>All in all I think that’s a pretty good improvement. I’m going to be blocking all/most of these hostnames/third-level domains at <a href="https://nextdns.io">NextDNS</a>, my DNS provider, and I imagine you could accomplish much the same in Pi-Hole or some similar project. This, coupled with a (now working) user firewall (I intend to use <a href="https://objective-see.com/products/lulu.html">LuLu</a>), should be sufficient for my purposes.</p> <p>It will be interesting (read: time-consuming and toilsome) work trying to figure out which hostnames beyond the obvious ones (such as <code class="language-plaintext highlighter-rouge">swscan.apple.com</code>) will need to be unblocked to receive OS security updates. Apple, if you’re listening (like you <a href="/20201112/your-computer-isnt-yours/">apparently were last time</a>), putting critical OS patches/checks on a small number of consistently named hostnames, and publishing/documenting that list, would be hugely beneficial for those of us who want to avoid vulnerabilities while hard opting out of Apple’s other nonconsensual network services.</p> <p>Blocking this list of hostnames, coupled with the fact that macOS will now no longer prevent <a href="https://objective-see.com/products/lulu.html">LuLu</a> and <a href="https://www.obdev.at/products/littlesnitch/index.html">Little Snitch</a> from working against Apple’s own embedded OS spyware in 11.2, means that I’m saved from having to switch to KDE and the shitshow that is PC laptop hardware for at least a few more months.
(Did you know there are precisely <em>zero</em> good GUI email clients for Linux?!)</p> <p>(I recently received an XPS 13” for testing, and while the 300+ ppi screen is amazing (also, it’s an option), you’re fooling yourself if you think this thing can compete with Apple laptops.)</p> <p>The quest for a computer that allows me to boot, open a local text editor, write some words, save them on a disk, quit the program, and shut down without snitching to the network continues. (This is mild hyperbole: <a href="https://pop.system76.com/">Pop! OS</a> on the XPS 13 is already there; I just can’t stand the plasticky keyboard, the shitty GPU, and the terrible desktop environments available for Ubuntu-alikes.) Winston Smith would be proud, I think.</p> <p>I’m starting a video series soon that’s going to highlight these sorts of widespread, invisible processes that happen with our tools and technology, generally without our knowledge. It’s a scary thought, but Apple devices and software, while chock-full of network surveillance, are actually some of the <em>least</em> intrusive in the industry. Make sure you sign up at <a href="https://sneak.berlin/list">sneak.berlin/list</a> to be notified when my video series launches if you’re interested in this kind of stuff.</p> <h1 id="final-notes">Final Notes</h1> <p>If you’re really concerned about privacy for these things, do what I do, and keep it behind a separate VPN router at all times (a VPN client on the device itself is insufficient). At my home and office I have a dedicated vlan/ssid for such things, with a <a href="https://www.pcengines.ch/apu2.htm">PC Engines apu2</a> as WireGuard client and gateway, and on the road I have a <a href="https://www.amazon.com/gp/product/B082X2DLMY">GL.iNet GL-E750</a> which takes a SIM card, speaks LTE/4G to the tower, runs OpenWRT and a WireGuard client (on which I have root and access to iptables/tcpdump!), has a battery, and provides Wi-Fi. 
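</p>

<p>For concreteness, the WireGuard client side of such a gateway looks roughly like the following config fragment; every value here is a placeholder for illustration, not from my actual setup:</p>

```ini
# /etc/wireguard/wg0.conf — illustrative placeholder values only
[Interface]
PrivateKey = <gateway-private-key>
Address = 10.66.0.2/32
DNS = 10.66.0.1

[Peer]
PublicKey = <vpn-server-public-key>
Endpoint = vpn.example.com:51820
# Route everything from the LAN through the tunnel:
AllowedIPs = 0.0.0.0/0, ::/0
PersistentKeepalive = 25
```

<p>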
I don’t put SIM cards in any of my Apple devices any longer, nor do I ever connect them to the unfiltered, un-VPN’d internet. My iPhone 12 Pro (sadly, my last) has never had a SIM card in it, and never will.</p> <p>This may seem a bit extreme, but all iPhones/iPads (and now Macs!) maintain a persistent connection to Apple at all times (to APNS, for the receipt of push notifications), with a unique identifier, which permits Apple to construct a detailed track log of your device via the geolocation of the client IP that connects to the service. This, coupled with the unique identifier, lets them (and any other service your devices frequently connect to with a unique ID) see when you switch from your home ISP to your mobile carrier, from your mobile carrier to your work ISP, or when you go to a different city or town. I’ve nothing to hide, but I’d prefer that the people who constructed my laptop not forever receive a realtime log of when I’m at or away from home; that strikes me as unreasonable. Apple’s marketing around privacy seems to assume that people desire privacy from everyone but Apple themselves.</p> <p>Stay safe out there, and <a href="https://sneak.berlin/list">keep in touch</a>.</p>Yesterday on 1 Feb 2021, Apple released macOS 11.2 (Big Sur .2), which removes the ContentFilterExclusionList that prevented user content filtering applications from blocking Apple first-party apps from accessing the network.
This misfeature, originally introduced on 12 November 2020 in macOS 11.0 (the only OS available on the new M1/ARM “Apple Silicon” macs) and initially publicized on Twitter by Patrick Wardle, creator of LuLu, was a major privacy violation, permitting Apple apps to phone home even when you used a firewall like LuLu or Little Snitch to block them.On Trusting Macintosh Hardware2020-12-04T00:00:00+00:002020-12-04T08:27:35+00:00https://sneak.berlin/20201204/on-trusting-macintosh-hardware<p>Modern Apple computers can no longer be fully used and maintained in 100% offline environments, or in ways that will reasonably ensure that the computer is free of state-ordered tampering.</p> <p>Buy a new mac, Intel (with T2 chip; all recent Intel machines) or M1, and it comes “activated” from the factory, with the OS installed and ready to use. Inside all current and recent Apple computers is a chip made by TSMC for Apple, to Apple’s design and specification. In the M1 this is the main (central) processing unit, in all of the Intel macs this is a chip called T2, which is responsible for allowing the Intel chip to boot.</p> <p>If you wish to fully and completely restore these systems to their factory state for whatever reason, be it a virus or malware, reverting a testing or research configuration, preparing for resale, disk data corruption, whatever: the operating system for the special secured section of the processor (in M1-land) or the separate security/encryption chip (the T2, in Intel-land) must be restored, first and foremost, for the computer to even think about booting an OS. The only way to do this is to obtain a cryptographic signature from Apple, specific to that hardware. 
Apple provides these freely (provided your hardware isn’t marked as locked to a particular Apple ID, in which case they provide it freely if you provide the authentication data (password, et c) for that Apple ID).</p> <p>Freely <em>over the internet</em>, that is.</p> <p>I’m not talking about the main OS for the computer. This is the special, security-specific OS that runs either on a dedicated part of the M1 or the external T2 on Intel systems (BridgeOS, it’s called on T2). If the internal disk is <em>totally</em> blanked and wiped to restore the computer’s software to exactly as it was from the factory (or at least exactly as it was the last time you freshly wiped and reinstalled it from known-quantity, checksummed media), you <em>must</em> connect it to the internet to “activate” (that is, provide the appropriate cryptographic proofs to the security chip that will convince it to function) to begin using the internal disk again.</p> <p>This means that macs, even the recent Intel ones, are now entirely unsuitable for certain critically important industrial applications:</p> <ul> <li>airgapped systems that never touch the internet, such as: <ul> <li>those that are used in cryptocurrencies,</li> <li>high value encryption ceremonies,</li> <li>SCIFs,</li> <li>or other secure/offline data processing facilities</li> </ul> </li> <li> <p>systems that must maintain cryptographic integrity (such as being fully wiped and reinstalled, offline, from known-checksummed boot media)</p> </li> <li>systems that are offline for extended periods of time, where no internet access is readily available, and where such systems <em>must</em> be able to be repaired/reinstalled/restored whilst underway, such as: <ul> <li>remote research stations, such as those in Antarctica</li> <li>ships at sea</li> <li>space</li> </ul> </li> </ul> <p>Also, any country that decides to fully censor or block access to Apple or the US (without a special deal with Apple, such as in China) will render inoperable 
any mac or iPhone/iPad that gets wiped, as it won’t be able to reactivate.</p> <p>This is a major bummer. I maintain a small number of high assurance systems that have never been connected to the internet, and have only come into contact with a limited set of devices that can possibly move data in or out of them. They were purchased in ways that are not linked to me or my companies, in person, off the shelf, for cash, to avoid targeted attacks. They’ve been fully wiped, zeroed out, and then booted from cryptographically verified (locally, on other high assurance systems) boot media. Some have had their internal firmwares dumped directly from the chips to verify that they are what they should be. Then a known-good OS image (again, verified on other machines) is used to install them, and they are used for certain extremely sensitive or high value cryptographic tasks, usually permanently offline (airgapped). Some are connected in very specific, limited ways to exclusively non-internet-connected “island” networks.</p> <p>To repurpose these machines, or to audit/verify them, the steps are repeated: dump their data, wipe them fully, verify their firmware, reinstall from cryptographically verified media. (Optionally, audit to ensure that the data dumps are what they are expected to be.)</p> <p>This is now impossible on any modern mac (and 100% of all recent macs with decent keyboards). Between the wipe and reinstallation step, the machine <em>must</em> connect to Apple to obtain a tiny bit of cryptographic signature data that allows it to be “activated”, enabling the security chip (which mediates all disk i/o), which then permits it to be reinstalled. The code that makes this request is closed source, and I presume (and will verify soon) that the request itself is encrypted (and perhaps certificate pinned; more info on that soon).
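</p>

<p>One quick way to triage that, once the traffic is captured: the first byte of a TLS stream is the handshake record type <code class="language-plaintext highlighter-rouge">0x16</code>, while plaintext HTTP begins with an ASCII method. A small sketch (hypothetical filenames; it assumes the TCP payload has already been extracted from the pcap):</p>

```shell
# Hypothetical triage of an extracted TCP payload: TLS streams begin
# with record-type byte 0x16 (handshake); plaintext HTTP begins with
# an ASCII method like GET or POST.
classify_stream() {
  first=$(head -c 1 "$1" | od -An -tu1 | tr -d ' ')
  if [ "$first" = "22" ]; then   # 22 decimal == 0x16
    echo "likely TLS"
  else
    echo "likely plaintext"
  fi
}

printf '\026\003\001' > /tmp/payload.bin   # fake TLS ClientHello prefix
classify_stream /tmp/payload.bin           # prints: likely TLS
```

<p>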
Based on the security system design, we know that the request necessarily includes a unique identifier of the system (likely the system serial number, or a different unique identifier inside of the processor that maps 1-to-1 in Apple internal records with the assembled system serial number). Because the connection back to Apple includes the IP address from which you’re connecting, and IP addresses are coarse (city-level) location, Apple knows that serial number X is at location Y at time Z when you reactivate. If you don’t reactivate, the computer simply won’t work at all, even if you have full, local reinstallation media.</p> <p><small>Note that iPhones and iPads have all been like this since their first day of release: you can wipe them offline, but they’re very close to bricked until you can get them back online to “activate” them, which, just like the computer, exposes your device’s unique identifier, IP (coarse location), and timestamp to Apple. It does so again whenever you install an app, as a similar online authorization is required to put any app on such a device.</small></p> <p>This means that the machine can potentially be tampered with by Apple (or anyone who can coerce Apple), on a system-by-system basis. There’s no way to take a system, wipe it, and reinstall it offline from a cryptographically verified exactly-the-same-OS-as-everyone-else-on-the-planet reinstallation USB. There’s now a step in between, where it identifies itself to Apple and persists some unique-to-that-system data to the long-term storage on the device, and this process is totally opaque to you. Will the security chip OS, or the main system OS, do something differently based on the contents of that system-unique data? There’s no simple or straightforward way to know, especially considering that the OS for the security chip is encrypted in such a way that only the security chip can decrypt it: you’re not allowed to inspect it at all. 
Reverse engineering these systems takes huge amounts of time; it was years after their release that the first software dumps of the iPhone’s security enclave code even became available (and, even then, the dumps are executable, not source code, and must be reverse engineered like any other compiled binary). This software is periodically updated with new versions, so any such previous in-depth public analysis is then totally worthless (from a security assurance standpoint) a short while later.</p> <p>This is a <em>huge</em> degradation in the <em>trustworthiness</em> of the mac platform as a whole. It’s now impossible to know that a mac is obeying the wishes of its owner (via wipe and reinstall from known-good media) and hasn’t had some subtle “backdoor the disk encryption hardware because the FBI made us do it” flag jammed down its throat from half a planet away. To achieve that, one must now buy a brand new shrinkwrapped system (anonymously, in person, for cash, to avoid interdiction and other targeted attacks) and never, <em>ever</em> fully wipe its disk, or it’s now useless without letting Apple reach into it over the internet and modify its lowest level code however they wish.</p> <p><small>(Before you Apple-bashers start shouting, I absolutely do not think this is some “but Apple is a hardware company!” conspiracy to sell a few thousand more single-use macs to the tiny segment of the population that requires high assurance systems.)</small></p> <p>Recall also that this is <a href="https://www.nytimes.com/2019/10/09/technology/apple-hong-kong-app.html">the same Apple who, under pressure from the CCP, censored apps used by pro-democracy protesters in Hong Kong</a>. The ability to ensure that your device, once manufactured by Apple, is now no longer <em>remotely controlled</em> by Apple (and the governments with whom they cooperate), has been taken away from you in their latest models. (No one has ever been able to have confidence in this fact on an iPhone or iPad.)
Even if you trust Apple 100%, the fact that the state can effectively put a gun to their head and demand that they disable or surveil certain specific devices means that this platform as a whole can no longer be made trustworthy, even with extreme and professional measures.</p> <p>Every single mac sold today or recently is running under this system; you cannot put them into a locally known/verified state, a function where you control all of the inputs: it simply won’t allow you to use the disk at all without an external opaque input to that function, one that you cannot inspect or modify.</p> <p>It’s no longer your machine; wipe its storage and that machine won’t function without explicit permission from Apple. Apple wants your machine to be secure: secure from everyone except Apple (and the governments to whom Apple must answer). Apple wants your data to be private: private from everyone except Apple (and the governments to whom Apple must answer).</p> <p>These systems are now insecure <em>by design</em>: there is <em>no way</em> for them to be made secure.</p> <p><br /> <br /></p> <hr /> <p><br /> <br /></p> <p><small>I’ve the M1 Air in hand and will have more specifics (and pcaps) about specifically what it sends and receives both during activation, as well as normal system operation, soon.</small></p>Modern Apple computers can no longer be fully used and maintained in 100% offline environments, or in ways that will reasonably ensure that the computer is free of state-ordered tampering.Your Computer Isn’t Yours2020-11-12T00:00:00+00:002020-11-24T18:13:58+00:00https://sneak.berlin/20201112/your-computer-isnt-yours<!-- note that you can't use a div here with markdown inside --> <p><em>There have been several updates appended to this page as of 2020-11-16, please <a href="#updates">see below</a>.</em></p> <p><small>Also available in:</small></p> <ul> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.tr/">Türkçe</a></small></li> <li><small><a
href="/i18n/2020-11-12-your-computer-isnt-yours.fr/">Français</a></small></li> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.es/">Español</a></small></li> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.pt/">Português</a></small></li> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.pt-br/">Português brasileiro</a></small></li> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.ru/">русский</a></small></li> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.zh/">简体中文</a></small></li> <li><small><a href="/i18n/2020-11-12-your-computer-isnt-yours.ja/">日本語</a></small></li> <li><small>others: email translations in markdown format to <a href="mailto:sneak@sneak.berlin">sneak@sneak.berlin</a></small></li> </ul> <p>It’s here. It happened. Did you notice?</p> <p>I’m speaking, of course, of <a href="https://www.gnu.org/philosophy/right-to-read.en.html">the world that Richard Stallman predicted in 1997</a>. The one <a href="https://craphound.com/pc/download/">Cory Doctorow also warned us about</a>.</p> <p>On modern versions of macOS, you simply can’t power on your computer, launch a text editor or eBook reader, and write or read, without a log of your activity being transmitted and stored.</p> <p>It turns out that in the current version of the macOS, the OS sends to Apple a hash (unique identifier) of each and every program you run, when you run it. Lots of people didn’t realize this, because it’s silent and invisible and it fails instantly and gracefully when you’re offline, but today the <a href="https://news.ycombinator.com/item?id=25074959">server got really slow</a> and it didn’t hit the fail-fast code path, and everyone’s apps failed to open if they were connected to the internet.</p> <p>Because it does this using the internet, the server sees your IP, of course, and knows what time the request came in. 
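</p>

<p>To see why a hash works as a unique identifier at all: any stable digest fingerprints its input, and two different inputs virtually never collide. (As the updates below discuss, what’s actually sent is tied to the developer certificate rather than the file contents, but the fingerprinting principle is the same.) A quick illustration with <code class="language-plaintext highlighter-rouge">sha256sum</code> (<code class="language-plaintext highlighter-rouge">shasum -a 256</code> on macOS), using dummy files standing in for real app binaries:</p>

```shell
# Two different app binaries virtually never share a SHA-256 digest,
# so the digest alone identifies which one was run. The dummy files
# here stand in for real application binaries.
printf 'binary of app A' > /tmp/app-a
printf 'binary of app B' > /tmp/app-b
sha256sum /tmp/app-a /tmp/app-b   # two different 64-hex-digit digests
```

<p>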
An IP address allows for coarse, city-level and ISP-level geolocation, and allows for a table that has the following headings:</p> <p><code class="language-plaintext highlighter-rouge">Date, Time, Computer, ISP, City, State, Application Hash</code></p> <p>Apple (or anyone else) can, of course, calculate these hashes for common programs: everything in the App Store, the Creative Cloud, Tor Browser, cracking or reverse engineering tools, whatever.</p> <p>This means that Apple knows when you’re at home. When you’re at work. What apps you open there, and how often. They know when you open Premiere over at a friend’s house on their Wi-Fi, and they know when you open Tor Browser in a hotel on a trip to another city.</p> <p>“Who cares?” I hear you asking.</p> <p>Well, it’s not just Apple. This information doesn’t stay with them:</p> <ol> <li> <p>These OCSP requests are transmitted <em>unencrypted</em>. Everyone who can see the network can see these, including your ISP and <a href="https://en.wikipedia.org/wiki/Room_641A">anyone who has tapped their cables</a>.</p> </li> <li> <p>These requests go to a third-party CDN run by another company, Akamai.</p> </li> <li> <p>Since October of 2012, Apple is a partner in <a href="https://en.wikipedia.org/wiki/PRISM_(surveillance_program)">the US military intelligence community’s PRISM spying program</a>, which grants the US federal police and military unfettered access to this data without a warrant, any time they ask for it. <a href="https://www.apple.com/legal/transparency/">In the first half of 2019 they did this over 18,000 times, and another 17,500+ times in the second half of 2019.</a></p> </li> </ol> <p>This data amounts to a tremendous trove of data about your life and habits, and allows someone possessing all of it to identify your movement and activity patterns. 
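</p>

<p>As an illustration of how little work that table takes to assemble, here is a toy sketch: join passively sniffed (time, client IP, identifier) records against a coarse IP-to-location lookup. All data and filenames below are invented.</p>

```shell
# Toy illustration (all data invented): what a passive observer could
# assemble from unencrypted OCSP traffic by joining sniffed records
# with a coarse IP-to-ISP/city lookup table. Note the same identifier
# (DEADBEEF) appearing from two different locations: pattern of life.
cat > /tmp/sniffed.csv <<'EOF'
2020-11-12T09:01,203.0.113.7,DEADBEEF
2020-11-12T19:44,198.51.100.9,DEADBEEF
EOF
cat > /tmp/geo.csv <<'EOF'
203.0.113.7,ExampleISP,Berlin
198.51.100.9,OtherISP,Hamburg
EOF
# First pass loads the geo table; second pass annotates each request.
awk -F, 'NR==FNR { geo[$1] = $2 "," $3; next }
         { print $1 "," $2 "," geo[$2] "," $3 }' /tmp/geo.csv /tmp/sniffed.csv
# prints:
# 2020-11-12T09:01,203.0.113.7,ExampleISP,Berlin,DEADBEEF
# 2020-11-12T19:44,198.51.100.9,OtherISP,Hamburg,DEADBEEF
```

<p>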
For some people, this can even pose a physical danger to them.</p> <p>Now, it’s been possible up until today to block this sort of stuff on your Mac using a program called <a href="https://www.obdev.at/products/littlesnitch/index.html">Little Snitch</a> (really, the only thing keeping me using macOS at this point). In the default configuration, it blanket allows all of this computer-to-Apple communication, but you can disable those default rules and go on to approve or deny each of these connections, and your computer will continue to work fine without snitching on you to Apple.</p> <p>The version of macOS that was released today, 11.0, also known as Big Sur, has new APIs that prevent Little Snitch from working the same way. The new APIs don’t permit Little Snitch to inspect or block any OS level processes. Additionally, the <a href="https://appleterm.com/2020/10/20/macos-big-sur-firewalls-and-vpns/">new rules in macOS 11 even hobble VPNs so that Apple apps will simply bypass them</a>.</p> <p><a href="https://twitter.com/patrickwardle/status/1327034191523975168">@patrickwardle lets us know</a> that <code class="language-plaintext highlighter-rouge">trustd</code>, the daemon responsible for these requests, is in the new <code class="language-plaintext highlighter-rouge">ContentFilterExclusionList</code> in macOS 11, which means it can’t be blocked by any user-controlled firewall or VPN. In his screenshot, it also shows that CommCenter (used for making phone calls from your Mac) and Maps will also leak past your firewall/VPN, potentially compromising your voice traffic and future/planned location information.</p> <p>Those shiny new Apple Silicon macs that Apple just announced, three times faster and 50% more battery life? They won’t run any OS before Big Sur.</p> <p>These machines are the first general purpose computers ever where you have to make an exclusive choice: you can have a fast and efficient machine, or you can have a private one. 
(Apple mobile devices have already been this way for several years.) Short of using an external network filtering device like a travel/vpn router that <em>you</em> can totally control, there will be no way to boot any OS on the new Apple Silicon macs that <em>won’t</em> phone home, and you can’t modify the OS to prevent this (or they won’t boot at all, due to hardware-based cryptographic protections).</p> <p><small><strong>Update, 2020-11-13 07:20 UTC:</strong> It comes to my attention that it may be possible to disable the boot time protections and modify the Signed System Volume (SSV) on Apple Silicon macs, via the <a href="https://keith.github.io/xcode-man-pages/bputil.1.html">bputil</a> tool. I’ve one on order, and I will investigate and report on this blog. As I understand it, this would still only permit booting of Apple-signed macOS, albeit perhaps with certain objectionable system processes removed or disabled. More data forthcoming when I have the system in hand.</small></p> <p>Your computer now serves a remote master, who has decided that they are entitled to spy on you. If you’ve <a href="https://www.apple.com/macbook-air/">the most efficient high-res laptop in the world</a>, you <em>can’t turn this off</em>.<a href="#turnitoff">*</a></p> <p>Let’s not think very much right now about <a href="https://lapcatsoftware.com/articles/revocation.html">the additional fact that Apple can, via these online certificate checks, prevent you from launching any app they (or their government) demands be censored</a>.</p> <h1 id="dear-frog-this-water-is-now-boiling">Dear Frog, This Water Is Now Boiling</h1> <p>The day that Stallman and Doctorow have been warning us about has arrived this week. It’s been a slow and gradual process, but we are finally here. 
You will receive no further alerts.</p> <h1 id="see-also">See Also</h1> <ul> <li>21 Jan 2020: <a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT">Apple dropped plan for encrypting backups after FBI complained</a></li> </ul> <h1 id="probably-unrelated">Probably Unrelated</h1> <p>In other news, Apple has quietly backdoored the end-to-end cryptography of iMessage. Presently, modern iOS will prompt you for your Apple ID during setup, and will automatically enable iCloud and iCloud Backup.</p> <p>iCloud Backup is not end to end encrypted: it encrypts your device backup to <em>Apple</em> keys. Every device with iCloud Backup enabled (it’s on by default) backs up the complete iMessage history to Apple, along with the device’s iMessage secret keys, each night when plugged in. Apple can decrypt and read this information without ever touching the device. Even if <em>you</em> have iCloud and/or iCloud Backup disabled: it’s likely that whoever you’re iMessaging with does not, and that your conversation is being uploaded to Apple (and, via PRISM, freely available to the US military intelligence community, FBI, et al—with no warrant or probable cause).</p> <p><small><a href="https://signal.org/">Use Signal.</a></small></p> <h1 id="updates">Updates</h1> <p><a name="updates"></a></p> <p><strong>Update, 2020-11-16 16:06 UTC:</strong></p> <blockquote> <p>“What are the facts? Again and again and again – what are the facts? Shun wishful thinking, ignore divine revelation, forget what “the stars foretell,” avoid opinion, care not what the neighbors think, never mind the unguessable “verdict of history” – what are the facts, and to how many decimal places? You pilot always into an unknown future; facts are your single clue. Get the facts!”</p> </blockquote> <p>— Robert Heinlein</p> <p>That guy jacopo who supposedly debunked my primary claim is lying. 
It’s <em>evidenced on his own page</em>, which you can <a href="https://blog.jacopo.io/en/post/apple-ocsp/">go see for yourself</a>:</p> <p><img class="img-fluid" src="/s/2020/20201112.yourcomputer/wrong.jpg" /></p> <p>Oops.</p> <p>He also claims that “macOS does actually send out some opaque information about the developer certificate of those apps”. It’s actually not opaque at all: it’s a <em>publicly known</em> unique identifier for the developer of an app (which for almost all apps is a public unique identifier for that app, as most developers only publish a single app).</p> <p><small>This nicely illustrates the danger of trusting any expert that jams some technical gibberish in your face under a clickbait <a href="https://en.wikipedia.org/wiki/Betteridge%27s_law_of_headlines">Betteridge headline</a>. Make sure you do your homework, and, always, always, our guiding light: <em>What are the facts?</em></small></p> <p>The thing that’s sent is <em>indeed a hash</em>, is <em>indeed a unique identifier for almost all apps</em>, and is <em>indeed sent to Apple unencrypted in realtime with your IP</em>. I simplified the explanation above to avoid having to explain OCSP and x509 and the PKI, and was deliberately careful not to claim that it was a hash of the file content of the application binary.</p> <p>TL;DR: This post is, was, and remains accurate. 
Clickbait gonna clickbait.</p> <p><a name="turnitoff"></a> The <strong>good news</strong> is that <a href="https://support.apple.com/en-us/HT202491">Apple has, just today, publicly committed</a>, presumably in response to this page, to:</p> <ol> <li> <p>deleting the IP logs</p> </li> <li> <p>encrypting the communication between macOS and Apple to prevent the privacy leak</p> </li> <li> <p>giving users an option of disabling these online checks that leak what apps you’re opening and when.</p> </li> </ol> <p>(Their update is at the very bottom of that page, under the oddly-capitalized headline “Privacy protections”.)</p> <p>A quote from Apple’s 16 November update:</p> <blockquote> <p>Gatekeeper performs online checks to verify if an app contains known malware and whether the developer’s signing certificate is revoked. We have never combined data from these checks with information about Apple users or their devices. We do not use data from these checks to learn what individual users are launching or running on their devices.</p> </blockquote> <blockquote> <p>Notarization checks if the app contains known malware using an encrypted connection that is resilient to server failures.</p> </blockquote> <p>They use deliberately confusing language here to lead you into conflating Gatekeeper with Notarization, so that you will believe that the connections are currently encrypted, while not lying. The Gatekeeper OCSP checks described in this post (“Gatekeeper performs online checks”) <strong>are not encrypted</strong>. 
(The notarization ones, which aren’t relevant here, are.)</p> <p>Apple’s spin doctors are among the best in the world, and my hat’s off to them.</p> <p><small>This even totally fooled <a href="https://www.theverge.com/2020/11/16/21569316/apple-mac-ocsp-server-developer-id-authentication-privacy-concerns-encryption-promises-fix">Jon Porter at The Verge</a> into misreporting their insinuation as a statement of fact, inside of a hyperlink to the Apple post itself which says no such thing! Honestly, I’m amazed and impressed; this sort of say-one-thing-but-readers-read-another is to me like magic tricks to a 6 year old. The Verge, to their credit, amended their reporting after I emailed them to point this out, but still: wow. That’s world-class work.</small></p> <p>Further:</p> <blockquote> <p>In addition, over the next year we will introduce several changes to our security checks:</p> </blockquote> <blockquote> <p>A new encrypted protocol for Developer ID certificate revocation checks</p> </blockquote> <p><small> (All of you that are <a href="https://www.epsilontheory.com/too-clever-by-half/">too-clever-by-half</a> incorrectly commenting about TLS trust circular dependencies and how OCSP <em>has</em> to be unencrypted to work <a href="https://news.ycombinator.com/item?id=25096990">can stop now</a>.) </small></p> <p>It sucks that they’ve let the NSA, CIA, your ISP, et al slurp up this unencrypted pattern-of-life data off the wire for the last 2+ years, and they’re still going to transmit the data (encrypted) to <em>Apple</em> in realtime, on by default on every single mac, but at least the 0.01% of mac users who know about it now can turn it off, so Apple will only get a realtime log of what apps you open, when, and where for the other 99.99% of mac users.
Cool.</p> <p><small>It’s possible they’ll use a bloom filter or some other privacy-preserving way of distributing the certificate revocation data that doesn’t actually transmit app launch activity, but given that <em>every single version</em> of iOS now begs me to re-enable analytics no matter how many times I repeatedly opt out, I’m not holding my breath here. We won’t know until they update this process, which they’ve only committed to doing sometime in the next <em>year</em>, which shows you how much of a priority your privacy is to them.</small></p> <p>This is, sadly, about as close as you can possibly get to a “we fucked up” from Apple PR: they’re deleting their IP logs, encrypting their shit, and letting you turn it off. This is great, but they have <a href="https://thenextweb.com/plugged/2020/11/16/apple-apps-on-big-sur-bypass-firewalls-vpns-analysis-macos/">remained totally silent on the fact that their OS apps will still bypass your firewall and leak your IP and location past your VPN on Big Sur</a> and how they’re still <a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT">not fixing the key escrow backdoor in iMessage’s encryption so Apple sysadmins and the FBI can keep seeing your nudes and texts in iMessage</a>.</p> <p>We need to be happy with little victories, I guess.</p> <p><a href="https://twitter.com/dhh/status/1328337941769367552">dhh puts it best</a>:</p> <blockquote> <p>The whole process of having Apple mix these “protections against malware” into a system that’s also a “protection of our business model” remains deeply problematic.</p> </blockquote> <blockquote> <p>We need to remain vigilant, and resist these power grabs masquerading purely as benevolent security measures. Yes, there are security benefits. No, we don’t trust Apple to dictate whether our computers should be allowed to run a piece of software. 
We already lost that on iOS.</p> </blockquote> <blockquote> <p>Anyway, this is promise of progress. Right now, Apple is still linking your IP address to app openings in an unencrypted way over the open internet. And in Big Sur, have prevented tools like Little Snitch from blocking that. So until the fixes roll out, maybe don’t upgrade?</p> </blockquote> <blockquote> <p>What this change to logging and promise of future improvements also does, though, is hanging all the Apple apologists that were oh-so-quick to dismiss these revelations as nothing out to dry. Yikes jumping on that boat the day before Apple sinks it themselves with this admission.</p> </blockquote> <p>Three cheers for intelligent voices of reason. Thanks, dhh!</p> <p><strong>Update, 2020-11-14 05:10 UTC:</strong> There is now a FAQ.</p> <h2 id="faq">FAQ</h2> <p>Q: <em>Is this part of macOS analytics? Does this still happen if I have analytics off?</em></p> <p>A: This has nothing to do with analytics. It seems this is part of Apple’s anti-malware (and perhaps anti-piracy) efforts, and happens on all macs running the affected versions of the OS, independent of any analytics settings. There is no user setting in the OS to disable this behavior.</p> <p>Q: <em>When did this start?</em></p> <p>A: This has been happening since at least macOS Catalina (10.15.x, released 7 October 2019). This did not just start with yesterday’s release of Big Sur, it has been happening silently for <em>at least</em> a year. <a href="https://lapcatsoftware.com/articles/notarization-privacy.html">According to Jeff Johnson of Lap Cat Software</a>, this started with macOS Mojave, which was released on 24 September 2018.</p> <p>Each new version of macOS that comes out, I install on a blank fresh machine, turn analytics off and log into nothing (no iCloud, no App Store, no FaceTime, no iMessage) and use an external device to monitor all of the network traffic that comes out of the machine. 
The last few versions of macOS have been quite noisy, even when you don’t use any Apple services. There have been some privacy/tracking concerns in Mojave (10.14.x), but I don’t recall if this specific OCSP issue existed then or not. I have not yet tested Big Sur (<a href="https://sneak.berlin/list">keep in touch for updates</a>), and the concerns about Apple apps bypassing user firewalls like Little Snitch and VPNs have come from reports by those who have. I imagine I’ll have a big list of issues I find with Big Sur when I install it on a test machine this week; it just came out yesterday, and I don’t spend my limited time testing betas that are in flux, only released software.</p> <p>Q: <em>How do I protect my privacy?</em></p> <p>A: It varies. There’s a ton of traffic coming out of your mac talking to Apple, and if you’re concerned about your privacy you can start with turning off the things for which there <em>are</em> knobs: disable and log out of iCloud, disable and log out of iMessage, disable and log out of FaceTime. Ensure Location Services is off on your computer, iPhone, and iPad. These are the big tracking leaks that you’ve already opted in to, and there is a way out that could not be simpler: turn it off.</p> <p>As for the OCSP issue, I believe (but have not tested!) that</p> <p><code class="language-plaintext highlighter-rouge">echo 127.0.0.1 ocsp.apple.com | sudo tee -a /etc/hosts</code></p> <p>will work for now for this specific issue. I block such traffic using Little Snitch, which still works correctly on 10.15.x (Catalina) and earlier. (You have to disable all of the Little Snitch default allow rules for “macOS Services” and “iCloud Services” to get alerts when macOS tries to talk to Apple, because Little Snitch permits them by default.)</p> <p>If you have an Intel mac (which is pretty much all of you right now), don’t worry too much about OS changes.
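<p><small>If you go the <code class="language-plaintext highlighter-rouge">/etc/hosts</code> route above, it’s worth double-checking that the entry parses the way you intended. Here’s a small sketch (assuming the standard hosts-file format of one IP address followed by hostnames, with <code class="language-plaintext highlighter-rouge">#</code> comments) that reports which hostnames a hosts file pins to localhost:</small></p>

```python
def localhost_pinned(hosts_text: str) -> set:
    """Return hostnames that /etc/hosts-style text maps to 127.0.0.1."""
    pinned = set()
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        fields = line.split()
        if fields[0] == "127.0.0.1":
            pinned.update(fields[1:])
    return pinned

sample = """
127.0.0.1 localhost
255.255.255.255 broadcasthost
127.0.0.1 ocsp.apple.com  # block plaintext Gatekeeper checks
"""
print("ocsp.apple.com" in localhost_pinned(sample))  # True
```

<p><small>On a real system you’d feed it the contents of <code class="language-plaintext highlighter-rouge">/etc/hosts</code>. Note this only affects software that respects the hosts file; per the Big Sur firewall-bypass reports above, Apple’s own system daemons may not.</small></p>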
If you’re willing to get your hands dirty and change some settings, you’ll likely always be able to modify every OS that Apple ever ships for your machine. (This is especially true for slightly older Intel macs that do not have the T2 security chip in them, and it’s likely that even T2 Intel macs will always be permitted to disable all boot security (and thus modify the OS) if the user desires, which is the case today.)</p> <p>The new ARM64 (“Apple Silicon”) macs that were released this week are the reason for my sounding the alarm: it remains to be seen whether it will be possible for users to modify the OS on these systems at all. On other Apple ARM systems (iPad, iPhone, Apple TV, Watch), disabling or modifying parts of the OS is cryptographically prohibited. In the default configuration for these new ARM macs, it will likely be prohibited as well, although hopefully users who want the ability will be able to disable some of the security protections and modify the system. I’m hoping that the <a href="https://keith.github.io/xcode-man-pages/bputil.1.html"><code class="language-plaintext highlighter-rouge">bputil(1) utility</code></a> will permit disabling of the system volume integrity checks on the new macs, allowing us to disable certain system services at boot, without disabling all of the platform security features. More information will be forthcoming when I have the new M1 hardware in hand this month and have the facts.</p> <p>Q: <em>If you don’t like Apple or don’t trust their OS, why are you running it? Why did you say you’re buying one of the new ARM macs?</em></p> <p>A: The simple answer is that without the hardware and software in hand, I can’t speak authoritatively about what it does or does not do, or steps one might take to mitigate any privacy issues. The long answer is that I have 20+ computers that comprise ~6 different processor architectures and I variously run all of the OSes you’ve heard of and some of the ones you probably haven’t.
For example, here in my lab, I have 68k macs (16 bit, almost-32 bit (shoutout to my IIcx), and 32 bit clean), PowerPC macs, Intel 32 bit macs, Intel 64 bit macs (with and without the T2 security chip), and I’d be a total slacker if I didn’t hack at least a little bit on an ARM64 mac.</p> <p>Q: <em>Why is Apple spying on us?</em></p> <p>A: I don’t believe that this was explicitly designed as telemetry, but it happens to serve insanely well for that purpose. The simple (assume no malice) explanation is that this is part of Apple’s efforts to prevent malware and ensure platform security on macOS. Additionally, the OCSP traffic that macOS generates is not encrypted, which makes it <em>perfect</em> for military surveillance operations (which passively monitor all major ISPs and network backbones) to use it for the <em>purpose</em> of telemetry, whether Apple <em>intended</em> that when designing the feature or not.</p> <p>However: Apple recently backdoored iMessage’s cryptography with an iOS update that introduced iCloud Backup, and then <a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT">didn’t fix it so the FBI could continue to read all the data on your phone</a>.</p> <p>As Goldfinger’s famous saying goes: <em>“Once is happenstance. Twice is coincidence. The third time it’s enemy action.”</em> There is a finite and small number of times Apple (who employs many absolute stone-cold cryptography <em>wizards</em>) can say “oops sorry it was an accident” that their software transmitted plaintext or encryption keys <em>off of the device and to the network/Apple</em> and remain credible in their explanations.</p> <p>The last time I reported an issue to Apple involving the transmission of plaintext across the network back in 2015, <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-3774">they fixed it promptly</a>, and that was only for dictionary word lookups.
Shortly thereafter they introduced <a href="https://developer.apple.com/documentation/bundleresources/information_property_list/nsapptransportsecurity">App Transport Security</a> to help third-party app developers stop fucking up their use of network crypto, and made it way more difficult for those same app developers to make unencrypted requests in App Store apps. It’s quite strange to me to see Apple making OCSP requests unencrypted, even if that is the industry default.</p> <p>If Apple truly cares about user privacy, they should be looking long and hard at every single packet that comes out of a mac on a fresh install before they release a new OS. We are. The longer that they don’t, the less credible their claims about respecting user privacy will become.</p> <p>Q: <em>Why are you crying wolf? Don’t you know that OCSP is just to prevent malware and keep the OS secure and isn’t meant as telemetry?</em></p> <p>A: The side effect is that it <em>functions as telemetry</em>, regardless of what the original intent of OCSP is or was. Additionally, even though the OCSP responses are signed, it’s borderline negligent that the OCSP requests themselves aren’t encrypted, allowing anyone on the network (which includes the US military intelligence community) to see what apps you’re launching and when.</p> <p>Many things function as telemetry, even when not originally intended as so. The intelligence services that spy on everyone they can take advantage of this when and where it occurs, regardless of designer intent.</p> <p>It’s not worth putting everyone in a society under constant surveillance to defeat, for example, violent terrorism, and it’s not worth putting everyone on a platform under the same surveillance to defeat malware. You throw out the baby with the bathwater when, in your effort to produce a secure platform, you produce a platform that is <em>inherently insecure</em> due to a lack of privacy.</p> <p>Q: <em>They backdoored iMessage’s end-to-end encryption?! 
WTF?!</em></p> <p>A: Yup. More technical details in my HN comments <a href="https://news.ycombinator.com/item?id=25078317">here</a> and <a href="https://news.ycombinator.com/item?id=25078388">here</a>.</p> <p>TL;DR: They even say as much on their website; from <a href="https://support.apple.com/en-us/HT202303">https://support.apple.com/en-us/HT202303</a>:</p> <blockquote> <p>Messages in iCloud also uses end-to-end encryption. <strong>If you have iCloud Backup turned on, your backup includes a copy of the key protecting your Messages.</strong> This ensures you can recover your Messages if you lose access to iCloud Keychain and your trusted devices. When you turn off iCloud Backup, a new key is generated on your device to protect future messages and isn’t stored by Apple.</p> </blockquote> <p>(emphasis mine)</p> <p>Note that iCloud Backup itself is <em>not</em> end-to-end encrypted, which is what results in the iMessage key escrow issue that backdoors the end-to-end encryption of iMessage. 
There’s a section on that webpage that lists the stuff that <em>is</em> end-to-end encrypted, and iCloud Backup ain’t in there.</p> <p><a href="https://sneak.berlin/20200604/if-zoom-is-wrong-so-is-apple/">Neither are your iCloud photos.</a> Apple sysadmins (and the US military and feds) can <a href="https://youtu.be/XEVlyP4_11M?t=1493">totally see all your nudes</a> in iCloud or iMessage.</p> <h2 id="further-reading">Further Reading</h2> <ul> <li><a href="https://developer.apple.com/videos/play/wwdc2020/10686/">https://developer.apple.com/videos/play/wwdc2020/10686/</a></li> <li><a href="https://lapcatsoftware.com/articles/ocsp.html">https://lapcatsoftware.com/articles/ocsp.html</a></li> </ul> <hr /> <h1>Goodbye, Twitter.</h1> <p><em>Published 2020-10-31, updated 2020-11-10. <a href="https://sneak.berlin/20201031/goodbye-twitter">https://sneak.berlin/20201031/goodbye-twitter</a></em></p> <h2 id="preface">Preface</h2> <p>I encourage you to read this article, and, if you find it valuable or useful information, go to <a href="https://sneak.berlin/list">https://sneak.berlin/list</a> and enter your email address to receive updates from me a few times per year. In the next few moments you’ll read why I’m no longer using these centralized, censorship platforms, and why I’m falling back to direct communication with people like you via email and my own website.</p> <h2 id="twelve-years-is-a-long-time">Twelve Years Is A Long Time</h2> <p>A few years ago I celebrated a decade together with my partner; my Twitter account dates from <em>before I met her</em>.</p> <p>Today, I deleted it.</p> <p>The choices we make affect the world around us, and shape our society.</p> <p>Twitter has always mildly censored their site, deciding, up front, what is or is not allowed to be posted, even within the bounds of the law. Contravene these rules, and Twitter might delete your account.</p> <p>This is terrible, but legal: their site, their rules.
It’s censorship, and abhorrent, but to make it illegal would mean dictating that Twitter host things they don’t want, and that would violate the human rights of the people who run and operate Twitter.</p> <p>For most of its existence, these rules were mostly irrelevant, as they weren’t consistently applied, and were quite permissive.</p> <p>Presently, Twitter has become massively popular, and has seen an explosion in signups and attention paid to their platform. This has caused them to abandon their original values, and make several changes, all of which made the site (and the world) worse off:</p> <ul> <li> <p>Twitter rewrites any URLs you post to go through their URL redirector so that they may spy on your traffic, and, worse yet, <em>truncates</em> the display of the original URL even inside the <code class="language-plaintext highlighter-rouge"><a></code> tag, so that your original URL <em>is no longer available to readers at all</em>.</p> </li> <li> <p>The categories of things you’re not allowed to post have been increased and widened, due to public and political pressure.</p> </li> <li> <p>World leaders (as defined by Twitter, natch) are <a href="https://help.twitter.com/en/rules-and-policies/public-interest">allowed to post things that they would ban your pleb ass for posting</a>.</p> </li> <li> <p>Twitter has decided that they will be the arbiter of truth on certain topics, and posting false or misleading information may mean that they editorialize over, or even clickwall, your posts.</p> </li> </ul> <p><small>This is not an endorsement of false or misleading information. This is a rejection of Twitter appointing themselves the sole authority on what is or is not false, and applying that unchecked authority to the publications of millions of people.
That’s a terrible thing, and way, way worse, even if you’re vehemently against false information.</small></p> <p><small>How they expect to retain 47 USC §230 (CDA) protections with this behavior is anyone’s guess; it seems like screaming “please make an example out of me” in the regulator’s face. It’s exceptionally difficult to credit Twitter with anything remotely resembling intelligent strategy these days.</small></p> <ul> <li> <p>What is now a long time ago, Twitter removed RSS feed URLs for Twitter profiles, making it much harder (and, for most, impossible) to consume someone’s Twitter feed not using a Twitter app. It used to be you could subscribe to a Twitter profile’s RSS feed just like subscribing to a blog’s RSS feed in your feed reader.</p> </li> <li> <p>Twitter hobbled, and then eventually entirely removed, their unauthenticated API, and put insanely draconian rate limits on their authenticated API, so that you can’t even consume your own feed, logged in, via their API without constantly getting rate limited. This was done to force you into first-party Twitter apps, which:</p> </li> <li> <p>Started displaying fucktons more ads.</p> </li> </ul> <p><small>Quit Twitter, seriously. I’m not joking. If you still don’t get how Twitter and other, similar censorship platforms such as Facebook/Instagram and Google are making the world worse, or simply don’t give a fuck and want to continue to mindlessly consume repetitive election outrage porn and thirst trap bullshit even to the detriment of wider society and the world, at least do this: every time you see an ad on Twitter, click on the account that posted it, and block the profile. You’ll need to do this for days or weeks, blocking hundreds, perhaps a thousand accounts. 
Eventually, you’ll see no (or very few) ads, because the Twitter apps won’t display ads from blocked accounts, and you’ll reach a point where you’ll have blocked almost every account that targets your segment with ads.</small></p> <ul> <li> <p>The imposition of these draconian API limits, as well as fail-closed antispam measures, now means that you can’t do simple things like write a small program to DM every one of your followers, people who <em>explicitly opted in</em> to receiving your content.</p> </li> <li> <p>The default feed display, instead of being a list comprised exclusively of tweets made by or retweeted by people you have explicitly followed, now includes posts made by people you’ve never followed, simply based on the “likes” and engagement actions (other than retweets) of people you do follow. This, in my experience, totally shits up one’s timeline, as following someone is absolutely not a vote of confidence in their taste or curation ability. Many people have things to say that I wish to read, but I absolutely do not want things that they “like” to appear in my own feed, because things that many people “like” on Twitter frequently amount to generic, boring, normie memes, or endless, endless, endless repetition of the standard depressingly predictable, <a href="/20200808/partisan-politics-are-boring/">utterly boring American culture war</a>. Twitter basically removed the ability to self-curate, forcing you into seeing what your follows liked.</p> </li> <li> <p>This new engagement-based timeline, being the default for everyone, means that, unless you’re already quite famous and have lots of engagement and likes on <em>all</em> of your tweets, <em>most of your followers</em> will not see <em>most of your tweets</em>, as they’ll be preempted by popular likes-by-friends clickbait injected into your followers’ feeds. It doesn’t just shit up your own timeline, it (de facto) hides your posts from your own followers. 
Remember, it’s impossible to promote some content to the top of the page without demoting other content to the bottom, until you invent a way to bend spacetime at will.</p> </li> <li> <p>In what may be the biggest dick move in the history of product decisions, Twitter would allow you to turn this new feature off (which they unhelpfully made default for everyone) and return to the previous, only-posts-from-those-I-follow behavior, which they retroactively named “Latest Tweets”. It would, however, <em>periodically revert this setting that you made</em> back to their “shit up your timeline with stuff your follows liked” mode, <em>contrary to your explicit wishes</em>. You’d then have to go back and set it back to “Latest Tweets” again.</p> </li> </ul> <p>I hope I never learn the name of the person at Twitter who made this product decision, for their own reputation’s sake. I must have changed this back to “Latest Tweets” hundreds of times, each time only noticing when suddenly my feed would start filling with culture war clickbait bullshit. I’d go to unfollow whomever was posting (or retweeting) such crap into my feed, only to find out that I wasn’t following them in the first place, and that Twitter had deleted my settings change to benefit Twitter at my expense. This is abusive behavior.</p> <ul> <li> <p>On the bot front, you’re now not allowed to @-mention people from API-posted tweets. No more fun hacks using Twitter as a notifications bus/backend.</p> </li> <li> <p>If you understand the importance of anonymous publishing, this last is perhaps most troubling of all: brand new Twitter accounts get locked a few minutes after creation, demanding a telephone number to unfreeze it, and you <em>can’t reuse a phone number that is already used on a different account</em>, which seems to indicate that Twitter doesn’t want you to have more than one account. Removing the phone number instantly re-locks the account. 
In 2020, you have to dox yourself to Twitter to use the site. I’m from a time when people would commonly make joke or bot accounts that would post interesting or <a href="https://twitter.com/godtributes">funny</a> things or <a href="https://twitter.com/pavels_bot">other</a> <a href="https://twitter.com/itsdennian_bot">periodic</a> <a href="https://twitter.com/zurvollenstunde">information</a>; it appears those days are now gone.</p> </li> </ul> <p><img src="/s/2020/20201031.twitter/nophonereuse.png" class="img-rounded img-responsive" /></p> <blockquote> <p><a href="https://www.buzzfeednews.com/article/alexkantrowitz/how-saudi-arabia-infiltrated-twitter">“State agents pressuring Twitter employees to deliver private information wasn’t uncommon at the company, former employees tell BuzzFeed News.”</a></p> </blockquote> <ul> <li>This is totally safe and fine however, because Twitter would never <a href="https://www.eff.org/deeplinks/2019/10/twitter-uninentionally-uses-your-2fa-number-targeted-advertising">misappropriate your phone number to link your identity to advertisers</a> and certainly won’t ever <a href="https://www.buzzfeednews.com/article/alexkantrowitz/how-saudi-arabia-infiltrated-twitter">leak your personal information in exchange for bribes from the Saudi government that has a history of assassinating journalists they don’t like</a>.
What are you worried about?</li> </ul> <h2 id="2020">2020</h2> <p>All of this I reluctantly tolerated, but recently Twitter went a step further, from deciding what you’re allowed to post, or how you’re allowed to access the service, to crossing a new line: they started deciding what you’re allowed to <em>read</em>.</p> <p><strong>Twitter’s search is now censored.</strong></p> <p>Go and search for the (quite popular) hashtag <code class="language-plaintext highlighter-rouge">#WWG1WGA</code>, an initialism for “Where We Go One, We Go All”, a slogan of the growing <a href="https://en.wikipedia.org/wiki/QAnon">Qanon conspiracy theory movement</a>.</p> <p>Seriously, go look:</p> <p><a href="https://twitter.com/search?q=%23wwg1wga&f=live"><code class="language-plaintext highlighter-rouge">https://twitter.com/search?q=%23wwg1wga&f=live</code></a></p> <p>For some reason, Twitter has decided that you shouldn’t be able to search for this, that whatever this search would turn up is something that you’re not allowed to read. It’s allowed to be posted, even under Twitter’s new heavy-handed censorship terms, but <em>you</em>, you’re not allowed to read it.</p> <p>Somewhere inside the machine, Twitter has summarily decided that censoring search results for journalists, political scientists, sociologists, and other researchers who would like to keep an eye on these whackjobs and track the spread of this movement is acceptable collateral damage in their (misguided) attempt to stop the spread of this movement.</p> <p>It’s fucking stupid, and not only because it <em>simply won’t work</em>. You can’t stop a social movement with censorship of a single website, even if that website is Twitter or Facebook. You’d think the operators of one of the biggest social media websites in the world would have heard about the <a href="https://en.wikipedia.org/wiki/Streisand_effect">Streisand Effect</a>—this is Social Media 101 stuff. 
What epic dumbfucks these product people are!</p> <h2 id="who-cares">Who Cares?</h2> <p>Publishing is one of the most important functions in our society. It’s how we learn accurate information about the current state and status of that society, even if the things we are observing directly are themselves inaccurate statements. The facts about who is saying what, whether or not that “what” is accurate, and how frequently they’re saying it, and to whom, are critical pieces of information. Without it, we’re flying blind.</p> <p>Banning the publication of false or misleading information is, paradoxically, banning access to the ground truth, as people paying attention will not be able to accurately assess the spread or prevalence or popularity of false or misleading information, or the popularity and influence of groups that deal in such deceit or propaganda. It also inappropriately presupposes that the self-appointed censor knows better than you what you should be permitted to read, for your own good.</p> <p><small>In the <a href="https://xkcd.com/137/">ever-relevant words of Randall Munroe</a>:</small></p> <p>This is very important, so I want to say it as clearly as I can:</p> <div class="well well-lg"> <h1><b>FUCK. THAT. SHIT.</b></h1> </div> <p>I’m out.</p> <p><img src="/s/2020/20201031.twitter/micdrop.gif" /></p> <p><small>I encourage you to leave, as well: on a personal level, never tolerate services that tell you what you’re allowed to see.</small></p> <h2 id="what-now">What Now?</h2> <p>Automatic, widespread, targeted, customized, invisible censorship is presently <a href="/20200421/normalcy-bias/">one of the largest existential threats to free societies on Earth</a>. 
It’s essential that each and every one of us take practical steps to deliberately avoid circumstances where we could be subjected to it, because, if we are, we’ll never have any clear indication that it has happened, and news organizations will be unable to clearly identify and reproduce it in most test cases, and thus be unable to accurately report on it. It is happening silently, without documentation or record.</p> <p>This is happening today, and is not a theoretical threat. Right now, on the major platforms, there are links you are simply not allowed to communicate to your friends, family, and audience—even in DMs. I’m not talking spam or malware, just links to articles that these platforms have decided that readers are simply too impressionable to be allowed to read.</p> <p><strong><em>This is your warning; note that there will be no further announcements to alert you to the danger as it silently affects the personal communications of you and yours.</em></strong></p> <p>We must be proactive in avoiding this cancerous trend of censorship products and platforms, and we must help those around us be proactive as well.</p> <p>To that end, take the following steps now:</p> <ul> <li> <p>Opt out of these platforms deciding what you’re allowed to read and <a href="https://sneak.berlin/list">enter your email address so that I can contact you directly to send you updates a few times per year about any important posts here or other time-sensitive developments.</a></p> </li> <li> <p>Switch to an email provider that doesn’t <a href="https://themarkup.org/google-the-giant/2020/02/26/show-your-work-wheres-my-email">censor your incoming mail</a> or <a href="https://en.wikipedia.org/wiki/PRISM_(surveillance_program)">make your complete email history available to the cops without suspicion of a crime</a>. This means you have to ditch gmail, hotmail, and the like, but it’s much easier than you think. 
<a href="/20201029/stop-emailing-like-a-rube/">I recently wrote a detailed guide on exactly how to do this, with clear, step-by-step instructions that anyone can follow.</a></p> </li> <li> <p><a href="/20200211/instagram/">Stop donating your time and attention to censorship platforms, because every user’s attention makes them more valuable and more attractive to others</a>. “I have to be on Instagram and Facebook for my business” is <a href="https://www.cnbc.com/2020/02/10/elon-musk-delete-facebook-its-lame.html">a lie that you tell yourself because it’s more convenient</a>. Delete your Facebook, Instagram, WhatsApp, and Twitter accounts, and make sure your friends have your direct email address (<a href="/20201029/stop-emailing-like-a-rube/">and not a gmail account</a>).</p> </li> <li> <p>Replace WhatsApp and Facebook Messenger with <a href="https://signal.org/">Signal</a>. When people start communications with you on these surveillance and censorship platforms, stop them and instruct them to install Signal instead, and chat there in private.</p> </li> <li> <p>If you run a business, rather than sharecropping by spending time and resources building an audience for a platform run by a giant company that rents access to that same audience back to you, make a website on a domain that you own, and establish <em>direct communication</em> with your current and future customers—disintermediate. If you don’t <a href="/20201029/stop-emailing-like-a-rube/">have your own <code class="language-plaintext highlighter-rouge">.com</code></a>, you’re a nobody.</p> </li> <li> <p>Spend time finding and deliberately interacting with <a href="https://instances.social/list#lang=en&allowed=nudity_nocw,nudity_all,pornography_nocw,pornography_all,illegalContentLinks,spam,advertising,spoilers_nocw&prohibited=&min-users=&max-users=">smaller, community-operated online communities</a>. 
Social networks designed for the general public such as Facebook, Instagram, Twitter, and the like are chokepoints for surveillance and censorship, which you should vehemently reject on moral grounds as damaging to all of society, even if you personally have nothing to hide or nothing to publish. Don’t delegate to others the power to invisibly decide what you’re allowed to read or see, or you’re viewing the world and society through a lens designed by people you’ve never met.</p> </li> </ul> <p>I’ll have some good news on this front for you in the future.</p> <ul> <li>Teach your friends and family and neighbors how to make these same choices.</li> </ul> <h2 id="offer-of-assistance">Offer Of Assistance</h2> <p>If you’ve any questions about how to implement these recommendations, or undertake these actions and find yourself getting stuck, please feel free to <a href="mailto:sneak@sneak.berlin">contact me</a> and I’ll do my best to help you however I can.</p>