
LostArchivist



Modern Computing sucks. · 4:25am Jun 18th, 2018

I just had an interesting and slightly depressing walk through the archives of Alyssa Rosenzweig (ironically, also an MLP fan!) about the attempt to reverse-engineer the Mali GPU system used in ARM embedded processors. That archive binge had just followed a fun romp through the docs for me_cleaner.

Why don't I tell you what those are, and why you should care:

Firstly, every "Intel Inside" PC produced since 2006 has a little security flaw (AMD did the same thing shortly after; I only mention Intel in this explanation as shorthand). Never mind the recently discovered speculative execution attacks that just buggered everyone who owns a computer; this is a different one.

This flaw is a dedicated coprocessor (the Management Engine, hereafter the ME) in your device, running proprietary software (you have no idea of, or control over, what it does) and with transparent (read: undetectable and unpreventable) access to ALL system resources. If your computer has power (it does not even need to be powered on!), this chip can access the network (I do not know whether wireless networks are exempt, but I would not assume they are) and call home to Intel's servers. If your device is running, the ME can secretly read or modify any data that exists in any form on your computer (even encrypted data, since that must be decrypted to be used, and both the key and the decrypted data are available to the chip), as well as turn the machine on or off on a whim.
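If you're curious whether your own machine even exposes the ME's host-side interface, here is a minimal sketch for poking around (assuming a Linux box with the kernel's MEI driver loaded; the /dev/mei* path and the lspci keyword matching are my assumptions, and not finding anything does NOT mean the ME isn't there):

```python
#!/usr/bin/env python3
"""Rough check for signs of the Intel ME host interface on a Linux system.

Assumptions: the kernel MEI driver exposes /dev/mei* nodes, and lspci output
mentions the Management Engine Interface / HECI device (names vary by
platform). Absence of both proves nothing -- it only means the host-side
interface isn't visible from the OS.
"""
import glob
import subprocess

def mei_device_nodes():
    # The Linux MEI driver creates /dev/mei0, /dev/mei1, ... when it binds.
    return glob.glob("/dev/mei*")

def mei_pci_lines():
    # Look for the ME's PCI function in lspci output.
    try:
        out = subprocess.run(["lspci"], capture_output=True, text=True,
                             check=True).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []
    keywords = ("management engine", "mei", "heci")
    return [line for line in out.splitlines()
            if any(k in line.lower() for k in keywords)]

if __name__ == "__main__":
    print("MEI device nodes:", mei_device_nodes() or "none visible")
    print("Suspicious PCI devices:")
    for line in mei_pci_lines() or ["none matched (this proves nothing)"]:
        print("  ", line)
```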

That level of access is disturbing: even if the ME is free of zero-day bugs (unlikely, and any that do exist can be assumed to go unpatched*), the government probably also has a backdoor into it**, and, further, Intel has to be trusted not to add one, or to use the existing interfaces (scary enough) for reasons of its own.

*The ME firmware is usually delivered with BIOS updates. Most people do not even know what those are, and their existence relies on ongoing support from a manufacturer that has little incentive to keep producing them, even for fairly new hardware.

**Post-Snowden, it was revealed that our kind and benevolent :facehoof: government had touted a cryptography standard, Dual_EC_DRBG, that contained a massive and ingenious theoretical backdoor. Exploiting it was inconvenient and difficult, but it was possible in principle, especially with the NSA's known computing power. A recent and frankly hilarious bug report on old Canon-brand printers showed damning evidence that the backdoor was intentional and had actually been used. For those who still want to say that our government has reason to do such things, see this video about all the many reasons that this is a bad idea in practice.
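For the curious, here is a toy sketch of how that kind of backdoor works in principle. To be clear, this is NOT the real Dual_EC_DRBG (the real thing lives on an elliptic curve and truncates its outputs, which makes the attack messier); it just copies the structure using plain modular exponentiation and made-up constants, so you can see why knowing the secret relationship between the two public constants lets an eavesdropper recover the generator's internal state from a single output:

```python
#!/usr/bin/env python3
"""Toy analogue of the Dual_EC_DRBG backdoor.

The real standard uses two elliptic-curve points P and Q; this sketch uses
exponentiation mod a prime instead and skips the output truncation, so all
the numbers below are illustrative only. The key idea survives: if whoever
chose the constants knows d with P = Q^d, one output reveals the next state.
"""

PRIME = 2**127 - 1           # a convenient (toy) prime modulus
Q = 5                        # public constant, analogue of the point Q
SECRET_D = 123456789         # known only to whoever picked the constants
P = pow(Q, SECRET_D, PRIME)  # public constant, analogue of the point P

def generator_step(state):
    """One step of the (toy) DRBG: emit an output, then update the state."""
    output = pow(Q, state, PRIME)      # what gets handed to applications
    next_state = pow(P, state, PRIME)  # the generator's new internal secret
    return output, next_state

def backdoor_recover_next_state(output):
    """Anyone who knows SECRET_D turns a public output into the next state."""
    return pow(output, SECRET_D, PRIME)  # (Q^s)^d = (Q^d)^s = P^s

if __name__ == "__main__":
    state = 31337  # the generator's starting secret
    out, next_state = generator_step(state)
    recovered = backdoor_recover_next_state(out)
    print("real next state:     ", next_state)
    print("attacker's recovery: ", recovered)
    assert recovered == next_state  # attacker can now predict future output
```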

Worse, you cannot fully disable the ME by hacking the BIOS image. There is a ROM in the hardware that refuses to boot without valid ME firmware (and the ME firmware contains most of the code needed to power up the system in the first place), and attempts to neuter that firmware anyway run straight into code signing. The firmware has been cryptographically signed (think of the signature on a check, for those of you old enough to know what those are!) so that any change to the code is instantly detectable as tampering (the actual process is complicated, but understand that it is computationally infeasible, for all practical purposes, to create a "blank check", strip the signature off, or forge a new one). Bad or tampered firmware results in the computer hard shutting down after 30 minutes, assuming it ever booted in the first place. Your only recourse is a hack that removes SOME of the ME code so that the processor crashes after boot but before it can arm that watchdog timer (or that sets one of two undocumented flags which seem to do something similar without removing any code).
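To make the "signed" part concrete, here is a minimal sketch of how code signing works in general, using RSA via the common Python cryptography package. This is a generic illustration of the concept, not Intel's actual scheme or formats (which, as I understand it, also bake a hash of the vendor's public key into the hardware); the point is simply that flipping even one byte of a signed blob makes verification fail, and forging a fresh valid signature without the private key is computationally out of reach:

```python
#!/usr/bin/env python3
"""Generic code-signing sketch: sign a firmware blob, then verify it.

Illustration only -- not Intel's format. Requires the 'cryptography' package.
"""
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The vendor generates a key pair once; the public half (or its hash) ships
# with the hardware, while the private half never leaves the vendor.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

firmware = b"\x7fFIRMWARE" + b"\x00" * 64  # stand-in for the real blob

# The vendor signs the blob.
signature = private_key.sign(firmware, padding.PKCS1v15(), hashes.SHA256())

def boot_rom_check(blob, sig):
    """What the boot ROM conceptually does: refuse to run unverified code."""
    try:
        public_key.verify(sig, blob, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

print("untouched firmware verifies:", boot_rom_check(firmware, signature))

# Flip one byte, as a would-be modder must, and verification fails.
tampered = bytearray(firmware)
tampered[10] ^= 0xFF
print("tampered firmware verifies: ", boot_rom_check(bytes(tampered), signature))
```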

None of those workarounds can be proven to completely remove the risk this system adds. They are also quite risky in themselves and require fairly advanced knowledge of your system to apply.

Sure, there's the whole spiel about how only the guilty need to hide (previous video), but there is a constitutional amendment against unreasonable searches of your home. Tools like the ME, however, allow sudden and very thorough searches of your computer, at any time and without your knowledge. What makes your private electronic data different from your physical belongings? These days the former are often more valuable, too! (Bank information and intellectual property, among other things.)

And can you really say that there is NO reason for someone to want to either read or write your data? Analytics engines would love more profiling information*, the rights holders behind the DMCA would love to prevent you from even remembering copyrighted material (the link is, while not serious, rather too plausible for my taste), and criminals would love to steal your data, or to use this system to PLANT incriminating material and get you in trouble (or to stash it away from themselves, since free hosting that can't be traced back to them is great!). Even without all that, there are still people who would hack these systems just for the heck of it or to cause damage, not merely to read your information (but who am I kidding; cybercrime and data theft are THE most lucrative crimes today).

*I cannot recall where I saw this, so the prudent reader should disregard unverifiable information***, but after a fun-yet-disturbing read of several Optimalverse stories, I found a news article discussing an actual (allegedly successful!) attempt to use illegally acquired, data-mined preferences to influence large parts of the population, whether by swaying them directly or by making them sit out important debates. You could also look into the Facebook psychology experiment of a few years back, where the posts users saw and/or their content were changed to try to make those users happier or more depressed, without their knowledge or permission; or consider that Facebook and Google, among others, make (or made) the majority of their money off of your data and third parties' access to it, and are thus incentivised to collect ever more of it, with ever finer granularity.

***The reason you should ignore unverifiable data is that I could say anything I wanted, especially if I knew how to use several well-known psychological techniques to influence the way you think, and so long as it seemed plausible, you'd believe it and be more likely to discard later arguments that conflict with it. (This technique probably falls under Cognitive Override, as mentioned in the link.)
Suppose you know that I am going to try to tell you what to do. You'd do the opposite, but I knew that, so you'd still end up doing what I wanted. Thus you should ignore the data and do neither of the things I suggest. And, as you can see, you believe me anyway, despite my having forgotten where I read this one too, simply because you read it and it sounds reasonable. Now I have you trapped in a logic loop where I tell you to ignore what I tell you! Muahahaha! :pinkiecrazy:


Now, to disillusion those of you who have phones or tablets and think you are safe...

See https://rosenzweig.io/blog/no-cellphones.html for why stock systems, or devices with any cellular connectivity whatsoever, are a bad idea for security. In short, they run software, often also signed to prevent replacement or changes (and, in the case of most Android devices, updated even less frequently than your BIOS, for the exact same reasons), that cannot be reviewed for security (usually thanks to DMCA protections that could cost you your house should you try), so there is no way to ensure that it is not leaking your data or even that it is actually secure (as in, cannot be hacked into doing unwanted things). Beyond that, the radio systems that connect to the cell network have independent control over your device on a level comparable to, or worse than, the ME.
This is done by a subsystem called the baseband, which cannot legally run open firmware, is rather less than secure, and can often grab or view any information visible to the phone's mane processor: location (never mind that it already lets the network locate you via tower triangulation whenever the battery is in), microphone and camera access, and CPU state (screenshots, in-memory data, etc.). That's a bit scarier now, since phones have a lot more sensors and interfaces than most home computers do (laptops being an exception) and many people use them far more often.

Worse, unless you are a software developer of a far greater caliber than I am (on par with the aforementioned person trying to reverse engineer the graphics, who actually does this), there is no good way to run actually useful software on these devices even if it's physically possible (as in, you were lucky and got one that's not using a signed bootchain).

Let's say you have an Odroid or a Chromebook. These devices are minimal, and while they use a cellphone-class (commonly ARM) processor, they don't have the snooping hardware to worry about. The Odroid (and most boards built around similar chips, for that matter) requires a signed (and thus secret) bootloader, meaning you can't quite trust the system not to be doing funny things behind your back. They also commonly use a closed-source graphics stack. The only really useful operating systems for them at this point (go look up some of the hilarity caused by Windows RT or ME) are open-source and GNU/Linux-based (some variant of Android, or a stock desktop distribution, my personal choice), because the applications can be ported to the foreign CPU architecture (only partially, in most cases, and usually by people who know far more about this than we do, so it's not perfect :trollestia:), yet they are still crippled by the inability to actually show pretty pictures (or anything whatsoever, really) on the monitor.

The graphics stack being closed source understandably holds back any widespread use or development of these systems (the devs won't touch it, since it can cause unknown and unfixable bugs and might not be safe or secure). Further, the licensing on the rights to actually develop for these systems FOREEEVEEEEERRR :pinkiehappy: prevents you from even attempting to make your own (see this poor man's struggles: if he wants to try to make the world better by writing a competing driver (and competition does drive innovation and proper treatment of consumers -- in theory, at least), he can never again work on anything that uses the official one, and any further code he contributes could get him sued).

As it was, libv's driver died out anyway under exactly this kind of pressure, which is the other reason the devs won't touch it. Permanently curtailing the ability to write or change certain code is a massive problem for people who do that as a job or a hobby.
