Still Flogging, Still Dead

A day or so ago, I was reminded of one of Apple's stupidest decisions ever, one that has left everything running macOS open to a DOS attack. Not only do they refuse to fix it, they defend it, because it "only" affects user data. An app could rampantly wreck your ability to use your files without deleting or harming their contents at all: by modifying one piece of file metadata, it can make it hard, if not impossible, to use those files correctly.

The post:

And this is why you don’t use file name metadata as your sole determinant for what a file is, does, or belongs to.

<stares at macOS where a script/app that yanked all the file name extensions from every file in your home directory would be a (sadly) extremely effective way to DOS you on your own computer because without filename, UTIs fail to work and the OS (and every app that didn’t exist prior to OS X) will not only not know what to fucking do with the file…>

My post on Mastodon

It was a sort of quote tweet of a post by someone else talking about how having TLDs that end in .mov or .zip could be a bad idea. I don't completely agree; we've had .com as a TLD for a long time, though to be fair, the days of .com executables being a common thing are somewhat in the past.

However, it did remind me of how, unless the dev takes steps to prevent it, if I want to force you to do a lot of work, like file restores, all I have to do is embed a…between 1 and 3-line script in an application that has a good reason to access things like your Documents directory, your iCloud directory, your OneDrive directory and similar (not hard to do for any document-based application), and all that script has to do is delete the filename extension from every file it can find, and well, unless you know what's up, you're…what's the phrase…oh, right:

You’re Fucked
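To be concrete, here's a minimal sketch of the whole "attack" in PowerShell; the target path is hypothetical, and seriously, only point this at a scratch folder you made for the purpose:

# Strip the extension from every file under a directory. That's it. That's the whole DOS.
Get-ChildItem ~/Documents -Recurse -File |
    Where-Object Extension |
    Rename-Item -NewName { $_.BaseName }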

Now, if you have a decent backup/archiving system, and you can just completely restore all the contents of those directories, well, it’s still a lot of tedium, especially if you use cloud backups and aren’t in a location with lots of bandwidth, but at least then it’s fixable.

However, if you don't, or your backup strategy has…holes…then thanks to Apple's complete reliance on one piece of file metadata, the filename extension, you have little chance of getting your files back to normal anytime soon. Here, a tour:

So here we have a Pages file; it's a homework file for a class I'm taking. If I want to get details reasonably easily, a bit of fun with AppleScript's "info for" command and here's what Script Debugger has for us:

That looks normal. We see the file name, with the extension (.pages), the kind is a “Pages Document”, and the UTI is “com.apple.iwork.pages.sffpages.” If we double-click on the file, it opens in Pages. If we use File -> Open in Pages, we can open the file. If we right-click the file and select “Open”, it opens in Pages. If we right-click the file and select “Open With”, Pages is the default. All is as it should be, right down to the icon in the Get Info Window:

All Is Well

But

What happens if we change the file name, specifically, if we remove the Pages extension?

Fail

First, the file icon changes to a generic Unix command-line executable icon. No, really:

Same file, only thing that changed was part of the file name. Just happened to be the “wrong” part. But surely, that’s not correct for everything, surely the OS still knows what kind of file it is…

Um…nope:

Pages won’t open it at all:

Gotta say, that circled part is my favorite thing ever. And no, "Open With" doesn't work. Unless you happen to know what the "proper" filename extension for that file is, or can restore an earlier version, congrats, that file is lost to you. Even if you look at the raw contents of the file in, say, BBEdit, which will show you the file structure:

The files in the metadata folder? Nope. Here’s what they have:

BuildVersionHistory.plist

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
<string>Template: Blank (12.1)</string>
<string>M12.2.1-7035.0.161-2</string>
</array>
</plist>

DocumentIdentifier.plist:

EE9FEC88-81E3-4BA8-AA2E-C51C5E429CD8

Properties.plist

The only chance you have is the preview.jpg file, which shows you a preview of the first page of the document. That's it. All the .iwa files are Snappy-compressed, and of course, the Archive Utility won't touch them. BRILLIANT.
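You don't even need the screenshots to watch the type detection fall apart; you can do it from a shell. A quick sketch using mdls, which ships with macOS (the file path is hypothetical, and run it on a copy):

# Before: Spotlight knows exactly what this is
mdls -name kMDItemContentType ~/Documents/homework.pages
#   kMDItemContentType = "com.apple.iwork.pages.sffpages"

# Strip the extension and ask again
Move-Item ~/Documents/homework.pages ~/Documents/homework
mdls -name kMDItemContentType ~/Documents/homework
#   kMDItemContentType = "public.data" (or something equally useless)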

Now, it's not just Apple, mind you. Most of what old-timers would call Cocoa apps have the same problem. Here's Pixelmator Pro.

File with extension:

Same file, no extension:

(If you try to open it with TextEdit, it does not work)

What happens if you use "Open With" and force Pixelmator Pro to try? Well…

So that's a fail. Look, I'm not really bagging on the Pixelmator folks, right? They're doing what Apple recommends. The problem is, this isn't i(Pad)OS, where there are other ways to manage this and I can't just arbitrarily mung the filename like this. At least with Pixelmator, if you open the file in BBEdit, the metadata.info file is a SQLite database with the application name in it, so at least theoretically, if you know how to trawl through SQLite files and you know what you're looking for, you have a chance to fix it. iWork files (or really any file from Apple), lol, you're screwed.
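If you do end up trawling through that Pixelmator metadata.info file, here's the least-painful approach, a sketch assuming the stock sqlite3 on macOS and no desire to guess at the schema:

# Dump the whole database as SQL text and look for an application name
sqlite3 ./metadata.info .dump | Select-String "Pixelmator"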

Oh, and if you use the default view in the Finder, which hides file extensions? You can't even tell the name was changed!

The only visual indication is the icon. But that’s not the most egregious example. Wonder what happens if you yank the extension off a Shortcuts file? Of course you do, you’ve read this far, WHY STOP THE AGONY NOW?

Just by removing the extension, POOF! IT’S A FOLDER.

By the way, to all you nerds who love to laugh at people who think you can make a JPEG into a Word doc just by changing the filename extension? CRAP LIKE THIS IS WHY! PUSHING AND ADVOCATING FOR THIS SHIT IS WHY THAT HAPPENS! IT'S YOUR FAULT, 100%.

But surely there’s an application that isn’t stupid. Why yes, yes there is. My favorite example, MICROSOFT WORD.

With the filename extension:

All as expected. But now, let us remove the extension…

Okay, that's expected, a generic icon. At least it's a document icon, and not a Unix executable or a gods-damned folder.

See, just like…wait, it’s still a Word file, even without an extension. That can’t be right…

But it is right. And the reason, for the youngins out there, isn't obvious, so let me explain.

If you look at the file info for either Word version, you're going to see two values that don't exist for Pages/Shortcuts/Pixelmator Pro/"new" application files: file creator and file type. Back in the before (OS X/macOS) times, on a Mac, the filename extension truly was metadata. It was mostly there for sharing files with other platforms, like Windows, which heavily relied on the filename extension. But the older Mac OS didn't care, because of those two values. If you hear old-timers talking about resource forks, that's a lot of what they did: they told the OS and the Finder what kind of file a thing was. It wasn't just for document files, either:

No need for .exe or .app or what have you. Just a bit of metadata that existed outside of what you saw, so changing the name and/or filesystem didn’t change a file into a folder or even appear to.

Now, the implementation of this in the Classic Mac OS was anything but perfect. It created its own brand of magical thinking, i.e. "Rebuild the Desktop" was the classic version of "Repair Permissions", but it was a way to ensure a file was what you thought it was regardless of name, and clearly it still works. It's not the best thing, right? I mean, prior to the Office XML formats, the file type was "WDBN", and I think at one point there was a "WD4N" or something. Having only four characters meant things got weird, but the concept of file types that aren't solely reliant on something as trivially changed as the file name makes a lot of sense, and prevents the kind of DOS attack that every copy of macOS is pathetically vulnerable to.
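You can still see those codes today on files that carry them. Here's a sketch that pulls the type and creator out of the com.apple.FinderInfo extended attribute; the path is made up, and plenty of modern files won't have this EA at all:

# The FinderInfo EA starts with the 4-byte type code, then the 4-byte creator code
$hex = (xattr -px com.apple.FinderInfo ~/Documents/old-report.doc) -split '\s+' | Where-Object { $_ }
if ($hex.Count -ge 8) {
    $type    = -join ($hex[0..3] | ForEach-Object { [char][Convert]::ToInt32($_, 16) })
    $creator = -join ($hex[4..7] | ForEach-Object { [char][Convert]::ToInt32($_, 16) })
    "Type: $type  Creator: $creator"    # for an old Word doc, something like WDBN/MSWD
}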

Oh, and please, Office people, CC people, really, anyone who’s been building apps since before OS X, please keep using File Types and Creator Codes. I know it’s a pain in the ass, I really do, but the few times you need them, you need them and they will literally save your ass. Thank you for that extra work, it’s a genuine help, and we appreciate it.

Before the pedants feel the need to bother me with it: if I render someone unable to actually use their computer or work on their files until they do a massive document restore, probably multiple restores given that the defaults hide extensions so you can't even see the problem until you try to open your folder, I mean file, that is a Denial of Service attack. It is denying you the ability to use your computer and your files; it is most definitely a DOS attack. Or DoS for the overly case-sensitive.

Just because the OS and the applications can sit there and quietly navel-gaze doesn't mean there's no problem. At that point, the difference between hundreds or thousands of files you can't use and an erased hard drive is not as great as you wish to think, especially for folks who are not coders, are not sysadmins, and just want to use their computer to do work: to get through a class, to write stories or make paintings, or all the other things computers are used for that aren't sysadmin work or coding. I know it's shocking, but non-tech workers do "real" work too. Some of you may need some time to internalize that, but it needs to be done.

The sad thing, the absolutely sad thing, is there's a fix that's almost trivial: let applications set UTIs within file EAs (extended attributes) instead of deriving them from the filename or making people use file types and creator codes. Most of the work is already done: applications can declare the UTIs they use/conform to, and you can still associate UTIs with filename extensions/MIME types (which you'd need for files coming from non-Apple OSes); none of that has to go away, nor should it. But give the app creating the file the ability to write the pertinent type identifier into the file's metadata, and not just have it derived from something so trivially changed. Hell, make it read/write in Standard Additions so people can automate changing it for specific files en masse, outside of the somewhat kludgy "Get Info" window.
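To show how small the ask is: the EA plumbing already exists, so anything can write a UTI onto a file right now. A sketch with a hypothetical attribute name (nothing in macOS reads this today, which is the entire complaint):

# Write the UTI into an extended attribute, where it travels with the file, not the name
xattr -w com.example.content-uti com.apple.iwork.pages.sffpages ./homework.pages

# Rename all you want; the attribute survives
Move-Item ./homework.pages ./homework
xattr -p com.example.content-uti ./homework    # still com.apple.iwork.pages.sffpages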

Adding this capability would cause effectively zero harm and remove a vector for causing harm, or at least make it harder to exploit. All the mechanisms are in place, it would be a trivial API change, and no, unlike what one fine young dingus claimed the last time I talked about this, it would not keep you from having your web browser be the default for JPEGs, you wee dork. It would just mean that if your file lost the .jpg/.jpeg extension, your web browser would still know it's a goddamned JPEG. Nerds are the worst people at moving anything forward for anyone but themselves, STG, SMDH.

But this won't happen, because the powers that be at Apple and everywhere else think doing an entire home directory restore from the cloud is trivial, because it's not like anyone ever doesn't have gigabit bandwidth at their fingertips.

S I G H

Like so many other things, Apple can improve this, they just neither care nor want to. Pity.


But Y Tho?

As those of you who follow me on Mastodon know, a while back I had given up on my Apple Watch. I'd actually put it in a drawer, because I'd gotten so frustrated with the changes to the run tracking on it.

I run, but not because I'm any kind of runner. It's a way to exercise, to get my cardio in, and during the pandemic, well, a pretty easy way to not get completely out of shape. I run anywhere between 3 and 5 miles on a given run, so I'm not like Adam Engst or Heather Kilborn, friends of mine who are Runners. I'm not a serious runner. So what I wanted out of my watch was simple shit: Distance, Elapsed Time, Calories, Heart Rate, Pace. That's it. I know there are all kinds of things you can track, but those are the things I care about, and for years, that was what I saw.

Then in a recent watchOS update, it all went to hell. Split times, lap times, a half-dozen stats I'd care about if I were a Serious Runner. And no matter what I tried, I couldn't get it to show me what I wanted. So I was like, fuck this shit, I'm out. I carry my phone anyway, so the watch was a convenience. And if I couldn't use it for the one really "smart" thing I had it for, why bother with it?

Fortunately, someone on Mastodon (and I am blanking on who, my apologies) helped me find the secret settings. Wherein, if you go to an outdoor workout, tap on the settings widget, then do the counterintuitive thing of creating a custom workout, you can enter the what-feels-like-3m-tall list of settings that you can turn off and on, and eventually get what you want.

So much scrolling.

SO

MUCH

SCROLLING

But why? Why is this only on the watch, and not in the Watch app on the iPhone, where this would be trivial? This is the kind of thing where whoever was designing this update at Apple decided most of the users are Serious Runners, and so would only want these settings on the watch. That's the only theory that makes sense. I don't have a problem with the settings being on the watch; I have a problem with the settings ONLY being on the watch and being unable to change them on the phone, which has an infinitely better UI for this. I have a problem with the settings being badly labeled on the watch, at the end of a long list of things that have nothing to do with the basic display settings. I have a problem with Apple deciding only Serious Athletes use the watch.

If you make the defaults useful only to the high end, and completely painful to change, then you’re telling the vast majority of your customer base to go fuck off. May not be your intent, but that is in fact the message you’re sending. Maybe don’t do that? I know Apple sucks at communication, they have always sucked at communication, but maybe think about fixing this?

Mastodon is, in fact, better than Twitter. BUT…

Okay, so for those of you not used to it, here’s your warning: this post will have some *profanity.* Those of you who actually know me will not be surprised by this.

So first, in the ways that mostly count, Mastodon is better than Twitter. Now, my Twitter experience was better than most because I never used the official Twitter client that inundated you in shit. I used Twitterrific because, and I know this is just weird as hell, when I use something that sucks, I look for a replacement. That doesn't suck.

Like, were y'all getting handies or something from Twitter for using the "official" client? Because it was either orgasms or the most astounding case of Stockholm Syndrome ever; no one puts up with that much pain for nothing. Wait, it was like the IPA thing, wasn't it? You convinced yourself that if you just drank enough shit, it'd taste like ambrosia after a while.

Y’ALL

So even with my "don't just stand in the effluvial stream" habit, I do like Mastodon better. But like Carlin once said…the people.

First, the overpopulation of effete honkies in Masto is…like being in a Hellmann's factory in Weston. The overweening sense of entitlement, and the just happy-go-lucky willingness to tell Black people, Indigenous people, and other People of Color, when they're talking about how Western European society is literally dangerous to them on a daily basis, on every level, that they need to hide that behind a Content Warning, CW, because it's disturbing Bramberly's daily positivity meditations. I got nothing. Like, I have no issue with CWs, because they are useful. They do have value. I didn't always get them, but you grow, you learn, you empathize, so yeah, CWs are a generally good idea.

But not when they're used by honkies to preserve order over justice. Yeah, no.

Then there's the "JuSt MaKe YoUr OwN InStaNce/FiNd a BetTer InStaNcE" lot, which is the response too many flavors of Mastonerd have to every fucking complaint about how mastonerds act. Listen, it's a social media platform, not the lost tribe of Trevor; we are not walking the earth until we find the perfect place. This is not a Hanna-Barbera cartoon series from the 1970s. There are two aspects of this that bother me. First, it is literally the tech version of "If you don't like it here, go back where you came from." Second, it's a pathetically transparent attempt to segregate "those people" into redlined instances so they can be defederated (guillotined from the Fediverse at large) and ignored. I literally saw, and was appalled by, someone suggesting that anyone engaging in protest should have to do so from a special instance/server so that this person could filter it out when "they just weren't up to deal with it."

Well, Jimothy, Black people just aren't up for being attacked for existing, yet here we fucking are. Reminds me of the people who were more angry that protests against police brutality were messing up their commute times than about, you know, Homan Square.

Honkies gonna honkie.

Then there’s the recreation of OH GOD, AOL JUST CONNECTED TO THE INTERNET vapors.

SIGHHHH

Like, I know nerds are the most gatekeeping assholes who ever lived, they're better at that than whatever it is they're a nerd about, and I know they're the worst kind of petty, delicate jerks, but dear god: you wanted people to see that Mastodon/the Fediverse was better than Twitter, but somehow you expected them to…leave their POVs and worldviews back on Twitter and silently comply with how you want things done? Do any of you, A N Y of you, ever deal with other humans? That was never going to happen; you don't even do that yourself. Don't even front, I've seen you walk into someone else's space and demand to be catered to by the locals. Don't cry when you get what you send out.

Like yeah, I get it, it sucks: you had this quiet little space, and now thousands of people a day have realized you were right (why can't nerds ever be happy with being right?) and are showing up, and they're doing what all people do in a new sitch: clinging to the familiar while they adjust to the new. "They expect it to be like Twitter!" you all sniff. No kidding, of course they have that expectation; Twitter was all they knew. WTF, you expect signing up for an account on a Mastodon server to shove the knowledge of the ages into your brain? Ye gods, y'all need to take some social science classes.

BADLY

Then there's the "four legs good, two legs bad" shit with every suggestion that not every feature Twitter had was evil. Like Quote-Tweeting, QTs. Oh. MY GOD, the drama that is causing. All the objections to QTs boil down to one of two things:

  1. Twitter had it
  2. The only use for it is abuse

As for the first, Twitter had posting; you seem okay with that. Twitter had retweets and replies; you seem okay with those. Mastodon is literally based on the Twitter concept. Grow up.

As for the second, every feature of social media can be used for abuse. If you think it's impossible to stampede, dunk on, or abuse people without QTs, you clearly follow no Black people on Mastodon, for if you did, that fantasy would be blown out of the water. It would be yeeted from your mind. Well, maybe not; honkies (and by and large, the source of all the crying I'm talking about is thoroughly mayolenated) gonna honkie. But no, there is value in the QT concept, and the whole OMG TWITTER thing is just stupid. First, a core reason QTs could be used to abuse people was the thing Twitter has that Mastodon and the Fediverse at large are deliberately constructed not to have: a central controlling algorithm. I know Mastonerds know that; it's practically the subject of every third sentence they utter.

Without the algorithm, a huge chunk of the abuse is not possible. Now, you can still use QTs to be an asshole, but that goes for every other feature of Mastodon. Given how badly the protocol manages link preview bandwidth, you can be a far bigger asshole with some REST magic and an image search, and yet I don't see anyone demanding the removal of link previews. As well, you can indeed simulate a lot of the QT featureset manually, which makes the arguments against properly implementing QTs weaker, not stronger. The thing is, some of the ways you can simulate it, specifically screenshots, are more suited for abuse than the core QT featureset. Screenshots also raise ethical implications if you're wanting to QT artists. Maybe legal implications too, but definitely ethical ones.

It is entirely possible to implement QTs in a way that provides what most people want, the ability to add one's own context to a post, without enabling abuse. (When properly pressed, the OMG QT ARE TEH DEBILLL!!! lot will admit that. But it takes some work.) Replies don't do that, nor do retweets, nor do overly tedious combinations of the two. (I will not be using the word "toot"; it makes my skin crawl only slightly less than spiders. It just does.) As many Black "Mastonauts" have pointed out, the QT is a valuable feature in their communities, and I agree, personally, with all their reasons. Sometimes, I not only want to boost a post, but add my own thoughts. I don't necessarily want to reply, because they're not reply kinds of thoughts; they're "Hey, so this is a thing, and it reminds me of this other thing I think relates to it that y'all might want to read about" thoughts that one wants to share with one's audience. That's not unavoidably evil, it is not a form of abuse, and unlike screenshots, it does not cut the person whose thought you're adding to out of the conversation.

As well, given Mastodon and ActivityPub, the idea that you can't implement this in a manageable, configurable way is nonsensical. What this reminds me of is back when HTML email was this radical thing, and when people tried to say to the PTB behind various email protocols, "No one cares if you like it, it is going to happen, people want actual bold and italic text and to put a fucking picture of their dog in their emails," the PTB stamped and pouted and said "we can too stop it AND WE WILL!!!"

How’d THAT work out Skeezix?

(Before you get too smug about early Outlook, the HTML implementation in Apple Mail is nothing to be smug about.)

Implementing QTs with management features at the instance, account, and client level(s) is absolutely possible and a good idea, because here’s the thing…

You.

Cannot.

Stop.

It.

At some point, someone with a decent client is going to add it as a client-side feature, and then, my angry flailing nerds, you're fucked. Just like the email versions of you were, with their impotent tantrums that caused far more problems than a grown-up approach would have. As Angry Drunk pointed out, almost all the features of Twitter that people just see as part of things first showed up in clients. Including the concept of a Twitter client, and the word "tweet"! It is going to happen with or without you, and if you set aside your reflexive I DON'T WANNA tantrums about it, you'll realize that saying "since it is going to happen with or without our approval, let's implement it in the protocol in a way that allows it to be managed well and for abusers to be more easily cut off at the knees" is the smarter play. One of the better ideas I've seen is to have it disabled by default and then enabled by the individual or the server admin as they see fit. A decent client could help make this as un-tedious as possible, so sure, ship in the "safer" mode. That's a good, constructive approach. Better than throwing your toys out of the pram like y'all are currently doing.

Finally, the OMG CAPITALISM and STOP TRYING TO JUST BOOST YOUR FOLLOWER NUMBERS crowd. Look, I agree capitalism is bad, but as someone so deftly pointed out, commerce is not capitalism. There are a lot of artists of various stripes, and other people, who relied on their Twitter audiences to feed and clothe and house themselves, and I am not going to dismiss people who make money from social media either. Until we reform society, you have to engage in some form of commerce to survive in this world.

L I T E R A L L Y

Mastodon is no different. Artists et al. need followers; it's how they survive. So instead of shitting on them, figure out ways to help them establish themselves on Mastodon so they can keep making art and doing the things many people appreciate, and/or love them for. When a writer talks about a deal in their ebook shop, just let it go. Don't fuckin' hector them for contaminating your purity with their inconvenient need to eat.

Mastodon et al. have a lot of potential (although the hagiographies about the paradise of the early web need to stop; y'all are literally lying at this point), and I think that because it is distributed, it will be a lot harder to stop as a vehicle for improving social justice and the human condition. It is at least possible for a dictator to shut down a single service like Twitter when they get up to their foolery. Shutting down Mastodon is like trying to drink water with your fists.

But the gatekeeping shit has got to stop, christ, it’s like the early days of Twitter/everything else on the internet all over again.

Use PowerShell to make QR Codes

Since Elon Musk has decided to have a complete mantrum about posting links to Mastodon or any other social media, I thought I’d talk about a fun workaround: using QR Codes to post links.

As it turns out, doing so with PowerShell is really trivial:

  1. In a PowerShell window (on any platform you can run PowerShell on, this is not Windows-only at all), install QRCodeGenerator: Install-Module -Name QRCodeGenerator
  2. Once that's installed, you can import the module for your session, although installing it in a root PowerShell session makes it available to everyone. To import: Import-Module -Name QRCodeGenerator
  3. There's a few commands available, but really, the basic New-QRCodeText will work well for this: New-QRCodeText -Text "<URL you want to encode>" -OutPath <pathtofile.png>. There's an optional -Show parameter if you want to see the QR code before sending it.
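Put together, the whole thing is three lines; the URL and output path here are placeholders:

Install-Module -Name QRCodeGenerator -Scope CurrentUser
Import-Module -Name QRCodeGenerator
New-QRCodeText -Text "https://example.social/@you" -OutPath ./mastodon-link.png -Show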

That's it. You can integrate that into any script that can call pwsh in the shell: bash, AppleScript, whichever you prefer. So in five minutes, you can piss off an overly hemotional manbaby billionaire, and really, isn't that what automation is for?

Adding Help to PowerShell Scripts

The Scripting Version of “Be Kind, Rewind”

There’s two hard parts to writing a script:

  1. Getting the silly thing to work correctly
  2. Showing people who aren’t you how to use it correctly

The first one is, I maintain, the easier. By far. Getting someone who isn't you to see what you mean is significantly harder. This is one area where most scripting languages fall down, in that they don't have a built-in help system. So you have to add some home-built thing, which you then have to maintain. Man pages are okay, but they're a separate set of files from the script, requiring additional work, and we all know how much coders love writing documentation.

No one can do much about the tediousness of writing docs, but PowerShell has an awesome built-in help system that not only applies to PowerShell binaries, but can be built into literally any PowerShell script. The basic documentation is available, as always, from Microsoft:

There's a lot of really good info there, but you can build a simple help system for a script without trying very hard. Basically, it's all comments, just specific comments. Here's the one from my Get-MacInfo script:

<#
.SYNOPSIS
This is a powershell script for macOS that replicates, or tries to, the "Get-ComputerInfo" command for Windows Powershell

.DESCRIPTION
It's not a 1:1 replication, some of it wouldn't make any sense on a Mac. Also, it does check to make sure it's running
on a mac. This pulls information from a variety of sources, including uname, sysctl, AppleScript, sw_vers, system_profiler,
and some built-in powershell functions. It shoves it all into an ordered hashtable so there's some coherency in the output.
If you run the script without any parameters, you get all the items in the hashtable. If you provide one key as a parameter, 
you get the information for that key. You can provide a comma-separated list of keys and you'll get that as a result.

Note: the keys labeled "Intel Only" don't exist for Apple Silicon.

Current keys are:
macOSBuildLabEx
macOSCurrentVersion
macOSCurrentBuildNumber
macOSProductName
macOSDarwinVersion
SystemFirmwareVersion
OSLoaderVersion
HardwareSerialNumber
HardwareUUID
ProvisioningUDID
HardwareModelName
HardwareModelID
ActivationLockStatus
CPUArchitecture
CPUName
CPUSpeed (Intel Only)
CPUCount (Intel Only)
CPUCoreCount
CPUL2CacheSize (Intel Only)
CPUBrandString
L3CacheSize (Intel Only)
HyperThreadingEnabled (Intel Only)
RAMAmount
AppMemoryUsedGB
VMPageFile
VMSwapInUseGB
BootDevice
FileVaultStatus
EFICurrentLanguage
DSTStatus
TimeZone
UTCOffset
DNSHostName
LocalHostName
NetworkServiceList
CurrentUserName
CurrentUserUID
CurrentDateTime
LastBootDateTime
Uptime

.EXAMPLE
Get-MacInfo by itself gives you all the parameters it can output

.EXAMPLE
Get-MacInfo TimeZone gives you the current timezone for the computer

.EXAMPLE
Get-MacInfo TimeZone,FileVaultStatus gives you the current timezone and the FileVault status for the computer

.NOTES
This can be used as a Powershell module or as a standalone script. 

.LINK
https://github.com/johncwelch/Get-MacInfo
#>

As you can see, it's one really long block comment with specific headers (.SYNOPSIS, .EXAMPLE, etc.) that work when someone enters Get-Help <module or script name>. So if you have my Get-MacInfo script or module and you enter Get-Help Get-MacInfo (tab completion works here, because PowerShell's tab completion is TOTAL R0XX0RZ), you see:

Get-Help Get-MacInfo                                

NAME
    Get-MacInfo
    
SYNOPSIS
    This is a powershell script for macOS that replicates, or tries to, the "Get-ComputerInfo" command for Windows 
    Powershell
    
    
SYNTAX
    Get-MacInfo [[-keys] <Object>] [<CommonParameters>]
    
    
DESCRIPTION
    It's not a 1:1 replication, some of it wouldn't make any sense on a Mac. Also, it does check to make sure it's 
    running
    on a mac. This pulls information from a variety of sources, including uname, sysctl, AppleScript, sw_vers, 
    system_profiler,
    and some built-in powershell functions. It shoves it all into an ordered hashtable so there's some coherency in 
    the output.
    If you run the script without any parameters, you get all the items in the hashtable. If you provide one key as a 
    parameter, 
    you get the information for that key. You can provide a comma-separated list of keys and you'll get that as a 
    result.
    
    20221001 added code for Apple Silicon
    
    Note: the keys labeled "Intel Only" don't exist for Apple Silicon.
    
    Current keys are:
    macOSBuildLabEx
    macOSCurrentVersion
    macOSCurrentBuildNumber
    macOSProductName
    macOSDarwinVersion
    SystemFirmwareVersion
    OSLoaderVersion
    HardwareSerialNumber
    HardwareUUID
    ProvisioningUDID
    HardwareModelName
    HardwareModelID
    ActivationLockStatus
    CPUArchitecture
    CPUName
    CPUSpeed (Intel Only)
    CPUCount (Intel Only)
    CPUCoreCount
    CPUL2CacheSize (Intel Only)
    CPUBrandString
    L3CacheSize (Intel Only)
    HyperThreadingEnabled (Intel Only)
    RAMAmount
    AppMemoryUsedGB
    VMPageFile
    VMSwapInUseGB
    BootDevice
    FileVaultStatus
    EFICurrentLanguage
    DSTStatus
    TimeZone
    UTCOffset
    DNSHostName
    LocalHostName
    NetworkServiceList
    CurrentUserName
    CurrentUserUID
    CurrentDateTime
    LastBootDateTime
    Uptime
    

RELATED LINKS
    https://github.com/johncwelch/Get-MacInfo

REMARKS
    To see the examples, type: "Get-Help Get-MacInfo -Examples"
    For more information, type: "Get-Help Get-MacInfo -Detailed"
    For technical information, type: "Get-Help Get-MacInfo -Full"
    For online help, type: "Get-Help Get-MacInfo -Online"

As you go through the docs, you'll see you can do a lot more, but that's how amazingly simple it is to write useful, accurate, updateable documentation for a PowerShell script or module that lives where it should live: in the script or module. Best of all, that's the standard PowerShell way to add help.

It doesn't take much to write a decent help system for a PowerShell script or module, and if you do, you save yourself a lot of tech support time, which is way more annoying than writing documentation. So really, there's no excuse not to.
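If you want a starting point, here's a minimal skeleton; the script name and parameter are hypothetical, but the dotted headers are the real ones Get-Help reads:

<#
.SYNOPSIS
One line on what Do-Thing.ps1 does.

.DESCRIPTION
The longer version: what it needs, what it touches, any caveats.

.EXAMPLE
./Do-Thing.ps1 -Name "widget"

.LINK
https://example.com/your-repo
#>
param([string]$Name)
Write-Output "Doing the thing for $Name"

Drop that block at the top of any script, and Get-Help ./Do-Thing.ps1 just works.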

A Modest Proposal

No, this doesn't involve eating the Irish, nor is it satire. But it does involve the killing of certain 'sacred' cows, so it's not a completely bad title…

Having lived in both the Windows and macOS admin worlds for three decades, I’ve had some time to deal with the foibles of both, and while Windows is a capable, usable, feature-rich OS, it is also a gods-damned mess in ways that are, in 2022, almost 2023, inexcusable. The registry, the issues with library and runtime versions…no matter how hard deploying software on macOS can be, it is orders of magnitude easier than on Windows. So with that in mind, looking at it from a macOS perspective, how can we solve this problem in a sane manner? What ideas can we steal from macOS and Linux to help, while applying them in a way that works for Windows?

Get off the 32-bit pot

It is 2022, almost 2023. MS needs to just pull an Apple and say, "As of <version>, we will no longer support 32-bit anything in Windows." OEMs/ISVs will either go along or they won't, but hanging on to 32-bit support is no longer justifiable given the amount of work it creates. Stop it. Stop it for every application, too. The fact that there's even an option for a current 32-bit Office (and I would be thrilled to now be wrong about that) is even more inexcusable. Dumping all the 32-bit legacy code would be a massive improvement on every level: OS layout, security, simplicity, etc. That users still have to care about the "bittedness" of their OS or applications? Stupid. Like straight-up stupid. Yes, I know, enterprise customers still have 32-bit apps they use.

If they’re big enough to try to stop this, they’re big enough to have or hire the staff to update their crap. It’s 2022 almost 2023, there is no justification for 32-bit Windows or Windows apps. Just. Stop. It. No, don’t even allow them to run sandboxed. Put them to rest like they should have been years ago. Spine up and do it.

Partition, Partition, Partition

Not in the filesystem sense, but in the OS-structure sense. The way Windows as an OS is set up, the difference between user data and system data is not as clear as it should be. But that's something that can be fixed.

First, let’s create some core structures, some of which are already there:

  • Windows
  • AD
  • MyComputer
  • Users

Okay, admittedly the AD/MyComputer names are lame, but given MS branding over the years, they're not as bad as some of what MS has come up with (anyone want to get a brown Zune squirt?). So what are the purposes of each of these partitions?

Windows

This is the local OS, as it is now. But we're going to steal an idea from Apple and make C:\Windows read-only. Like, hardcore. There's no reason not to; it's a perfectly usable method, as Apple has shown, and while it can cause minor bumps, the management tools on Windows are, or should be, mature enough to handle it. The only thing that goes in here is installed during the OS install. By actual OS installers, not imaging. To be blunt, imaging needs to die, but that means MS has to stop allowing OEMs to modify Windows to add their own nonsense as part of the OS. Install it somewhere else; the Windows OS install has to be carved in stone, as it were, by MS.

This also means MS has to stop tarting up the OS install itself. The Xbox stuff and the other non-OS-essential items don't have to be eliminated as products, but they can't be part of the core OS. Windows as an OS needs to be slimmed down. This would be a huge boon to enterprises everywhere, especially in high-security areas. If they can rely on the OS they get on the machine being just the OS, and everything outside of Windows being removable without breaking the operation of the computer, a major need for imaging (the "de-tarting" of the OS) goes away. The OS is the OS, the OS directory structure is read-only, periodt. I'm not saying do the partition tricks Apple does, although those have much to recommend them in terms of locking down the OS, but at the very least, you should never be a single password away from having your computer owned at the "burn it to the ground to fix it" level.

Again, the only thing in C:\Windows is the OS, the things needed for it to be a copy of Windows. Nothing more. Also, stop with the stupid stub files for Office; does anyone not despise those?

AD

This is a nod to the ubiquity of, and integration with, AD in the Windows world. This is where everything required for AD management goes, and it's created only on binding with AD. Policies? Here. Configs? Here. Device management configs? Here. Needed scripts? Here. Once it's in place, the only source that can modify it is AD. No local fuqery allowed. The only thing a local admin can do is wipe the drive to get rid of it. If you want to delete the directory/unbind from AD, the minimum privileges required for a local interactive user should be Enterprise Admins. Yes, you have to lock this down like that to make it work. If the machine is removed from AD remotely via AD tools/processes, then part of that is deleting the C:\AD directory.

Which also means that any AD-only users lose their ability to log in. This would require the thought process of "do we want to allow this user to be a hybrid AD/local user?", which should require some thought. The cases where this would be a problem should be relatively small, but it has to be the starting point. Yes, I am quite serious about how hard I want to lock local admins/users out of modifying C:\AD. It's needed, and since the only way to create C:\AD requires AD (or some other LDAP server that can play AD games correctly), the legitimate needs to modify/delete C:\AD as a local machine admin/user are small.

MyComputer

This is analogous to /Library on macOS. This is for local settings/policies that affect every user on a specific computer. If it affects all users, or is a system setting, it goes here. Obviously, local admins can mess with this, it would exist on every Windows computer. But, you’d have to have local admin rights to do so. Note that nothing here should be required to boot the computer and log in. But if you just delete stuff, your apps or local settings may be very strange. No, it’s not hidden. Hidden directories for this kind of thing are silly, and I very much include hiding ~/Library on macOS in this.

But yes, application settings, login settings for all users, etc., that all goes here. It’s basically the current C:\ProgramData directory, with some updates. Like not being hidden.

Users

I hope this is obvious, since it already exists. This is for User data, including per-user installed applications. We should already understand this concept, so I’m not going to go into details on this. There will be one significant change that I’ll discuss in a bit, but it’s a good change that will make things like user migration easier. I will say get rid of the stupid Roaming/Local/LocalLow, there’s little need for that.

Whither Applications?

There's no real need to change this; C:\Program Files works fine. What I will say, again: get rid of 32-bit support. That C:\Program Files (x86) still exists, and is still needed, is an embarrassment. So we keep C:\Program Files, but we're going to take an idea from macOS and modify it a bit to make certain things easier: all applications go in their own folder in C:\Program Files, and all application-specific data, files, and libraries go in the application's folder. No more dropping an executable in C:\Program Files and then vomiting library files everywhere, including runtimes that other apps use because "oh, that VC++ runtime is there, no need for me to install mine."

If your application needs 34 runtimes, they go in your application’s folder. Not in C:\MyComputer, that’s only for things needed by every user on the computer. That your application needs a specific version of the .NET runtime that is different than the OS version? That’s fine, but it goes in your application’s folder, periodt, and only your app can access stuff in that folder. This has a number of effects that are hugely positive:

  1. Uninstalling becomes orders of magnitude simpler. Along with any directories you put in C:\MyComputer, the entire uninstall process no longer requires complex executables. A handful of Remove-Item statements in a PowerShell script are all you need; see the sketch after this list. (Yes, I know about the elephant. Patience, children, patience.)
  2. Installing is simpler. You copy a folder to C:\Program Files, put some shortcuts on the current user's desktop (other users, if this is a shared-use machine, can be handled via a First Run action), add shortcuts to the Start Menu, create any necessary services, and you're done. Everything else can, and should, be handled by a First Run action on that computer when the human initiates it. So the only reason for .msi at all is integration into managed deployment systems, but those become so much simpler. Also, it helps put an end to setup.exe, the bane of admins everywhere. There is nothing about your installer so clever that you need to write your own. Wank somewhere else, not on my computer.
  3. It makes troubleshooting easier, because it creates known places for all your stuff. You can make assumptions.
  4. It makes reinstalls easier
  5. It makes updates easier
  6. It removes the need for the OS to manage application-specific library needs
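Here's what item 1 looks like in practice, a sketch with a hypothetical app name and the partition layout from above:

# Uninstalling under this scheme: no setup.exe, no registry spelunking
Remove-Item "C:\Program Files\ContosoDraw" -Recurse -Force            # the app, its data, its runtimes
Remove-Item "C:\MyComputer\ContosoDraw" -Recurse -Force               # machine-wide settings, if any
Remove-Item "$env:USERPROFILE\Desktop\ContosoDraw.lnk" -Force         # shortcuts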

This is such an obvious change that I’m really surprised it hasn’t happened. It simplifies so many things, it removes so much confusion. Seriously, this change alone makes things ridiculously better.

So about that elephant…

So what about the registry? How do we update that for this brave new world?

We don’t. We kill it. We remove it. We obliterate it. We treat it like the Death Star treated Alderaan.

There's nothing about the registry that is objectively good. It's awful on every level, and if you look at the actual data it contains, a huge part of it is file paths, which are better managed in literally any other way: via settings files. Text, XML, JSON, I don't care. There is nothing good about the registry. It's a trivially modified place for critical system settings, a hard-to-read, hard-to-use database that is a glorious target for every bad actor out there…it was never really a good idea. Just admit it and make it go away.
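For contrast, here's everything it takes to keep that same kind of data in a settings file instead; the path and key are made up, but the shape is the point:

# The app's settings live next to the app, as plain, diffable JSON
$settings = Get-Content "C:\Program Files\ContosoDraw\settings.json" | ConvertFrom-Json
$settings.DataPath    # a file path you can read, audit, and back up like any other text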

If you get rid of the registry, a lot of things get easier. Installing software. Uninstalling software. Updating software. Migrating users to a new machine. Removing a user from a machine. Adding a user to a machine. In fact, almost everything that uses the registry now gets easier if you remove the registry.

Just kill it. Kill it dead. It was never a good idea, and the only thing the Windows registry does better than any other method is be the Windows Registry. Make it stop, delete it from the computer/IT lexicon. If the path “HKLM:/…” is never seen again, it will be a good day. Kill the registry. Do it.

This is not complete

Obviously, this is a broad strokes post. The details on this are numerous and important, but I really don’t like to complain without at least attempting to offer a solution, and I think this is not the worst attempt.

The question is, does anyone on the Windows team have the spine to actually make the changes to make things better, or are they too stuck in "we can eventually fix it without changing anything"-ville? Because if that's the case, Windows will never get better. We've watched decades of failed incrementalism on the platform. Time to blow some things up and make it actually better.

Application Scripting is Weird

There's a tendency in the Apple world to paint AppleScript as some uniquely weird, inconsistent language. I'm usually amused by that, because the person doing it will then hold up shell as an example of a consistent language. Which is highly amusing.

But here's the thing: the core language, the core AppleScript syntax, is really quite consistent. It's when you get into scripting applications that things get weird, because app devs are not consistent in how they implement things.

So let's take a look at it via Excel, which has the advantage of being scriptable in wildly different languages on different platforms. We're going to do a fairly simple set of steps:

  1. Open an Excel file
  2. Set a range of columns to be formatted as a table
  3. Sort that table by the first column, ascending

AppleScript

Here’s how we do this in AppleScript:

set theExcelFile to "pathtofile" as POSIX file
tell application "Microsoft Excel"
     activate
     open theExcelFile
     set theWorkbook to the active workbook
     set theWorksheet to active sheet of theWorkbook
     set theRange to get entire column of range ("A:H")
     set theList to make new list object at theWorksheet with properties {range object:theRange}
     set theSort to sort object of theList
     set theSortRange to sortrange of theSort
     sort theSortRange key1 (column "$A:$A" of theSortRange) order1 sort ascending
end tell

I mean, if you know Excel, and know that "Format as Table" actually creates a list object, and that sorts are weird within a table/list object, and you script Excel a LOT, this makes sense. You:

  1. Create the path to the file
  2. Tell Excel to start/activate
  3. Tell Excel open the file
  4. Create a reference to the active workbook
  5. Create a reference to the active (work) sheet of the active workbook reference
  6. Create a range of columns
  7. Make a new list object (format as table) in the active sheet for the range you just created and create a reference to that list object
  8. Create a reference to the built-in sort object of the list object
  9. Create a reference to the sortrange of that sort object reference
  10. Sort the sortrange reference by the first column of the sortrange in ascending order

Okay, so what about, say, PowerShell on Windows? That has to be way less application-specific, right? Surely it's not that weird…

PowerShell

$fileToOpen = "fullpathtofile"
$excelObject = New-Object -ComObject Excel.Application
$excelFileObject = $excelObject.Workbooks.Open($fileToOpen)
$excelFileWorksheet = $excelFileObject.ActiveSheet
$excelFileWorksheetRange = $excelFileWorksheet.Range("A1","H1").EntireColumn
# "Format as Table" is really "add a ListObject over this range"
$excelFileTableObject = $excelFileWorksheet.ListObjects.Add([Microsoft.Office.Interop.Excel.XlListObjectSourceType]::xlSrcRange,$excelFileWorksheetRange,$null,[Microsoft.Office.Interop.Excel.XlYesNoGuess]::xlYes)
$excelFileTableObject.Sort.SortFields.Add($excelFileTableObject.Range.Columns.Item(1),0,1)
$excelFileTableObject.Sort.Apply()   # actually perform the sort
$excelObject.Visible = $true

Obviously this is totally different, because here we:

  1. Create the path to the file
  2. Tell Excel to start/activate
  3. Tell Excel to open the file
  4. Create a reference to the active workbook
  5. Create a reference to the active (work) sheet of the active workbook reference
  6. Create a range of columns
  7. Make a new list object (format as table) in the active sheet for the range you just created and create a reference to that list object
  8. Create a sort object made up of the list object specifying the column to search on and how
  9. Apply the sort object to the list object
  10. Actually make the Excel file visible

Oh yeah, that's totally different, and that syntax is just as bog-standard PowerShell as can be, unlike that Excel AppleScript, which has nothing to do with core AppleScript. 🙄

If you were to do the same thing in Numbers, you'd see similar syntax, because applications have specific needs that a core language for an OS does not. Any language that can be extended to fit the needs of a specific application is going to get weird based on the needs, features, and naming conventions of that application. For example, "Table" in Excel can cover a lot of very different things. "Table" in Numbers covers basically one thing; it's not like Numbers has Pivot Tables. So doing table operations in Numbers is similar, but not identical, to Format as Table in Excel.

Any language supporting application scripting is going to get weird as more applications use it. It’s the unavoidable nature of the beast.

Get-MacInfo Update

tl;dr, updated for Apple Silicon

During my talk at JNUC, a few folks pointed out that my Get-MacInfo script didn't work well on Apple Silicon. I wasn't surprised, but as I don't have an Apple Silicon Mac, I can't exactly test for that. However, some of y'all really came through with details on command results, and with the help of folks, in particular Kelly Dickson and Dr. Michael Richmond, I was able to get the info I needed.

For Apple Silicon, in the system profiler hardware report, the following values:

  • CPU Speed
  • CPU Count
  • L2 Cache
  • L3 Cache
  • Hyperthreading

don't exist. Not a shock, but as that query is dumped into an array, five missing items meant my array references were all wrong.
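The fix is the obvious one: figure out the architecture first, and only parse the Intel-only lines when they can exist. A sketch of the idea, not the shipping code (the real parsing is in the repo):

$arch = uname -m                                 # "arm64" on Apple Silicon, "x86_64" on Intel
$hwReport = system_profiler SPHardwareDataType
if ($arch -ne "arm64") {
    # These lines only exist on Intel, so only index/parse them there
    $cpuSpeed = ($hwReport | Select-String "Processor Speed").ToString().Split(":")[1].Trim()
}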

I've got the first update for Apple Silicon up at ye olde GitHub site, so anyone with an Apple Silicon Mac who wants to look at it, and feels like installing/running PowerShell on their Mac (if they don't already have it), can beat on it. It still seems to work correctly on Intel.

Again, thanks to everyone who helped out, it's really appreciated, and if anyone has anything they'd like to see added to the list of things Get-MacInfo reports on, I'm happy to add it where I can.

Thanks!

JNUC 2022

Sitting in the airport with an hour to kill, I thought I'd jot down my thoughts. First, I really enjoyed the conference. The Jamf folks did a solid job, San Diego was great, the hotel location was perfect, and having an event at a naval aviation museum? PLANE NERDGASM. Even as a non-Jamf user/admin, the sessions were varied enough that I had no trouble basically double-booking myself. Really good choices, even if I am slightly biased.

Speaker Thoughts

As a speaker, I've a couple of minor nits: the speaker room, the "Green Room", was a bit sparse. It could use some sprucing up and more coffee (to be fair, there's never enough coffee as far as I'm concerned, so don't take that too seriously; I'm an E D G E C A S E when it comes to coffee consumption). But a bit more space and a few more amenities would be appreciated; it's good to have a place where one can quietly go over a presentation one last time.

On the flip side of that, the ability to see, almost in real time, how many people attended vs. registered is really useful. I was blown away by the numbers my PowerShell session pulled; I honestly didn't expect more than about ten people to show up. That feature is really useful for speakers, so kudos to Jamf on that one. The pre-show prep covered literally months, but that allowed it all to be spaced out and unhurried, which is something I really liked. Whoever set that up did a fantastic job of making it really clear what I needed to do and when, and that prep was a massive help for me, so thank you all.

However, while I understand that 30-minute sessions need to be tight, the inability to have any live demos during a presentation was a killjoy and a half. I'd timed my session with demos in mind, and without those, I went from a solid 20+ minute session to an okay 15+ minutes. I'm good at vamping, but that's a lot of time to tap-dance through. Being able to demo things, especially when talking about something like PowerShell that a lot of the audience is unfamiliar with, is critical. I hope that for 2023, Jamf adjusts their setup to allow for better demos.

Other than the demos issue, the overall presentation was fantastic. The speaker monitors were perfectly placed, so there was never a need to do the “turn head to see what you’re talking about” dance. Having the presentation display there so I could also see my notes was really useful, and greatly appreciated.

Overall as a speaker, I think other than the very minor nit of the speaker room, and the demos issue, Jamf did a great job here.

Attendee Thoughts

Knocked it out of the park. The location, as I said, was amazing. Being able to run along the bay between the end of sessions and any after-events was a really great way to unwind a bit, and the solid spacing between the end of sessions and the after-events was hugely appreciated. That's something a lot of conferences overlook; after two days of non-stop go from 7/8am to 11pm or later, one runs out of energy for the last day or so. Having that break kept that from happening. Please don't lose it.

The room setups were great: easy to see and hear from everywhere in the room, and the chairs were quite comfortable. Having a good bit of elbow room at the tables in the rooms was so nice, like just so nice. Having breakfast and lunch provided was nice, and the quality of the food was aces. The overall conference had a hard WWDC vibe, but from the older days, when you got actual decent food, not box lunches that make USAF flightline meals look luxurious. If you're going to provide food, spend more than a buck a meal, and Jamf did this about perfectly. The plethora of coffee stations, again, I loved. Could have used more. And bigger cups. But that's (literally) just me.

Finally, the attendees. First, to everyone (and there were a lot of you) who told me how much they enjoyed my session: thank you so much. I was kind of unsure how such a non-Jamf-specific session would be received at all, and given it was on PowerShell, I had braced myself to be mostly ignored. Instead, I had an almost packed room for the 'live' session, and twice that for the virtual one, and oh my goodness, thank you all SO MUCH, it means a lot. That sense of community I'd been missing for the last few years was there in abundance, and it really, really felt good. Y'all are amazing and wonderful.

Fin…

For my first-ever JNUC, I was genuinely impressed on every level. The few things that stood out as less-than-amazing did so more because everything else was so good, and I think all of them are fixable without a huge amount of work or expense. I really enjoyed the experience as both a speaker and an attendee, and I am absolutely figuring out how I can make 2023 work.

Finally, dear Jamf folks, my loves, my sweets…if you're looking to hold future JNUCs in places that aren't Minneapolis, I understand Kansas City is a conveniently located venue. Middle of the country, good-sized airport, lots of hotels, and for the big event, an absolutely amazing art museum that regularly hosts such things, or a just-as-amazing train station, and literally the best chocolatier in the country, if not the planet. Oh, and a few solid hotels and convention centers. Just sayin'…;-P