Some Reasons why Installers are a mess

The other day, I was kvetching about installers (if you’re new here, I do that. A lot. You get used to it) and got an interesting reply on Twitter:

Installers should be a 100% solved problem these days. Use a freakin’ msi on Windows, and use a pkg on macOS and all will be right with the world. Do not use bloated Java-based installers. Yes, I’m looking at you, InstallAnywhere and whatever they use (used to use?) for Archicad.

So the thing is, yes and no. Installers are, as this person said, something that should be a solved problem, but they are not, and the problem is far, far worse on Windows, even if one uses MSIs, than it is on macOS. Like far worse, for a variety of reasons.

First, and this is true regardless of platform, outside of a very small number of companies, the amount of care put into installers is basically none. It’s scutwork, it generates zero direct income, it’s the first circle of hell for interns, etc. One of the best arguments for drag-and-drop installs on macOS is that they avoid installer executables entirely.

The problem with this is that an application that has unit tests galore, UI/UX research by the pound, and endless resources devoted to making sure it runs in a way that makes angels sing gets installed by an executable that takes all of that and shoves it behind some of the most awful UI imaginable, with almost zero testing on anything but the programmer’s machine. (I have literally dealt with installers that had hardcoded paths to the dev’s machine in the config files. More than once.)

It is also a place where the worst sorts of unnecessary cleverness live, especially if we’re talking about high-end applications that have roaming license servers. In some cases, I have seen installers for Linux servers that:

  1. Require manual, undocumented hacks to install on Red Hat
  2. Don’t have a command-line option, so the only documented way to install is to install a GUI. On a Linux server. To install software. Shoot. Me.

There is no realm of software that can begin to touch installers for sheer, mind-numbing WAT. The things I have seen…

But even if you do things “the right way,” there are still a lot of problems. I’m going to talk about Windows here, because it’s…well, honestly, it’s the easy target.

First, MSIs are not a magical be-all and end-all. For example, I have seen MSI after MSI that has to be run as an admin, but if you’re not physically logged in as an admin and you double-click the MSI, you’re never asked to elevate to run the installer; it just fails with a mostly useless error message. This is a trivial thing to avoid, and yet it happens a lot. I’m pretty sure I know why: the one dev building the installer logs in as an admin, so this problem doesn’t exist for them.

Sometimes you get a “you need to be an admin, start over” dialog. Thanks a pantload Chet, you could have handled this as part of the install. But you didn’t, and now you suck.

You end up getting really good at running msiexec from administrator PowerShell windows after a while.

MSI has a fairly standard set of switches for quiet installs, logging, and so on, but for some reason a lot of MSI installers don’t support them all. So the install either fails (sans logging, and logging on Windows is awful enough already) or it runs in default interactive mode with nary an error message to be found.
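
For reference, here’s the kind of thing you end up typing by hand way too often: a minimal sketch of running an MSI quietly, with verbose logging, from an elevated session. The paths and package name here are placeholders, not a real product.

# Minimal sketch: quiet install with verbose logging, elevated via UAC.
# The paths below are placeholders.
$msi = "C:\Installers\SomeVendorApp.msi"
$log = "C:\Logs\SomeVendorApp-install.log"

# Start-Process -Verb RunAs triggers the elevation prompt the installer
# itself never bothered to ask for.
Start-Process -FilePath "msiexec.exe" `
    -ArgumentList "/i `"$msi`" /qn /norestart /l*v `"$log`"" `
    -Verb RunAs -Wait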

Did I mention installers are regularly awful? Because they are. Autodesk, for example, has arbitrary filename-length limits for its installers. Limits that it regularly violates itself. Let me say that again: the company that sets the rules for how long a filename can be to work with its installer names files that break that rule. Not the filename plus the path, just the filename. Make it make sense.

But even if you don’t hit that kind of nonsense, even if the MSI is perfect, there’s still the library issue: the endless string of VC++ and .NET runtimes that have to be installed, often in a specific order. And when you uninstall, those all stay behind, because if some other app is also using one of them (and you have no way of knowing this), removing it breaks other things, often, again, with no useful error message.
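
If you’re wondering what’s actually on a given box before you start that dance, this is roughly the spelunking involved; a sketch that reads the standard Uninstall registry keys in both the 64-bit and 32-bit views. The display-name filter is just a convenient pattern, not anything official.

# Sketch: list the VC++ runtimes already on a machine by reading the
# standard Uninstall registry keys (64-bit and 32-bit views).
$uninstallKeys = @(
    "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*",
    "HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*"
)

Get-ItemProperty -Path $uninstallKeys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like "*Visual C++*Redistributable*" } |
    Select-Object DisplayName, DisplayVersion |
    Sort-Object DisplayName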

This is one place where the package format macOS uses is sheer brilliance, a brilliance I did not truly appreciate until I had to deal with Windows deployments again. In some cases, we’re talking about nine or more separate MSIs that have to run in a specific order before the actual installer for the app you want can run. None of these will be visible to the user in Settings or the Control Panel. So when you “uninstall” the application, you’re only uninstalling the actual application, not the n other things that were installed alongside it. Because there’s no safe way to do that in Windows.

On macOS, you just shove all the libraries in the application bundle and you’re good to go. For example, Adobe Photoshop 2022 has 121 frameworks, 8 “required” plugins, 52+ other “required” pieces of code and image files, and hundreds of various localization files. All of them are in the application bundle. There are some other things outside the bundle in the Applications folder, some config files in ~/Library/Application Support/Adobe, some in /Library/Application Support/Adobe, and really, that’s mostly it. Compared to how things work in Windows, that’s almost nothing.
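
Since pwsh runs on macOS too, you can eyeball this yourself; a quick sketch, with the caveat that the exact bundle path is an assumption about where Photoshop landed on a given machine:

# Quick check of the "it's all in the bundle" claim, from pwsh on macOS.
# The bundle path is an assumption; adjust for whatever is actually installed.
$bundle = "/Applications/Adobe Photoshop 2022/Adobe Photoshop 2022.app"

(Get-ChildItem -Path "$bundle/Contents/Frameworks" -Directory).Count
(Get-ChildItem -Path "$bundle/Contents/Resources" -Recurse -File).Count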

There’s also, on Windows, no good way around it. The architecture of the OS forces you into doing stuff like the VC++ redistributable/.NET Runtime dance. You don’t have a choice, because you absolutely can’t make assumptions about anything. Linux is literally more predictable than Windows.

However, that being said, there’s ways to make things easier.

  1. Fully support deployment tools, and no, I do not mean your weird .hta file. I mean SCCM/Intune/MEM or other managed deployment tools, out of the box, with support documentation. Autodesk is particularly good here. Solidworks is particularly not good here. If you create the software, it’s your job to make it work with managed deployment tools. Not the IT department’s, not the VAR’s, yours. If you cannot actually test with any deployment tools, then at the very least, do the work so that your installer can be slotted into said tools without weird dances and undocumented tricks. (There’s a minimal sketch of what this can look like after this list.)
  2. When you uninstall, clean up after yourself as much as possible. You can at least delete the application folders. That’s not too much to ask. Mathsoft is a rather annoying offender with this one.
  3. Make it easy to slipstream/update your deployment point. I should not have to build a new deployment point from scratch just to add a .x update file to it. Yes, I know that’s not zero work for you, but it’s far more work for me when I have to push that update out to a few thousand machines, and I should start charging consulting fees to ISVs that make this harder than it should be.
  4. If on Windows, use the registry as sparingly as possible, and for the love of christ, do not put application-level variables needed to run the app in the user-specific hive. Deployment tools don’t run as a logged-in user, so doing stuff like that adds a lot of work to deployments. There’s no good reason to do this. Stop it.
  5. Avoid environment variables. Those suck to manage outside of your installer.
  6. Document your installers and the process in excruciating detail. Seriously, we don’t mind.
  7. Virtual machines are literally free. It costs only time to test your installer basics. Do that.
  8. If your installer doesn’t work perfectly smoothly in fully manual mode (manual double-click on the installer file), it is not ready to go. Periodt.
  9. If there are multiple post-install steps that have to be done before the installer quits, those aren’t post-install steps. Those are installer steps. Automate them as much as possible. Don’t pause things just so I can click “next”. If I have to click next to finish the install, then just assume “next” will always be clicked, and bake those steps into the installer. Don’t make people click when there’s no real need.
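
Since I keep harping on items 1, 4, and 8, here’s a minimal sketch of the kind of silent-install wrapper a tool like Intune or SCCM can run as SYSTEM. The product name, paths, and registry values are all placeholders; the point is the shape, not the specifics.

# Minimal sketch of a deployment-friendly install wrapper. Everything named
# "Example" here is a placeholder.
$msi = "$PSScriptRoot\ExampleVendorApp.msi"
$log = "$env:ProgramData\ExampleVendor\install.log"
New-Item -Path (Split-Path $log) -ItemType Directory -Force | Out-Null

# Silent install, no reboot, verbose log. If this doesn't work with the
# standard switches, the installer isn't deployment-ready (see item 8).
$result = Start-Process -FilePath "msiexec.exe" `
    -ArgumentList "/i `"$msi`" /qn /norestart /l*v `"$log`"" `
    -Wait -PassThru

# Machine-level settings go in HKLM, not the user hive (see item 4), because
# deployment tools don't run as the logged-in user.
New-Item -Path "HKLM:\SOFTWARE\ExampleVendor\ExampleApp" -Force | Out-Null
Set-ItemProperty -Path "HKLM:\SOFTWARE\ExampleVendor\ExampleApp" `
    -Name "LicenseServer" -Value "license.example.com"

exit $result.ExitCode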

Finally, the installer is not just code, it is the first code of yours your customers will run. Why do you want them to hate you that fast?

OS X Server is Gone

“I read the news today, oh boy…”

Trite, but kind of how I feel. I get that it’s weird to feel anything for a product that in all honesty had ceased to be much of anything over the last few years, but for those not in the “greybeard” section of macOS née Mac OS X Server née Rhapsody, it’s kind of hard to explain what that product, which, mind you, used to be neither free nor even cheap, meant to a lot of people. Especially to those of us coming from the “dark ages” of AppleShare IP et al. There’s not a lot these days that creates the kind of community OS X Server did. It was a confluence of a lot of things that I don’t think could exist today.

OSXS, as it was more commonly known, started the careers of so many people and jumpstarted the careers of many others, like, well, me. It did something that hadn’t happened in the server space in years: it created something new. No one was an expert at it when it came out, so everyone was reset to equal, and that created so many ways for products and people to flourish. It was a boost to the people already familiar with Unix-based server OSes who understood a bit of LDAP (especially after 10.5, when NetInfo was finally put out to pasture. Oh my, the parties about that.)

It wasn’t magical, right? The product itself was always kind of an afterthought, and you could tell which part of it Apple was using to sell Macs to people depending on the year. NetBoot was huge for a long, long time, then Open Directory, then other things. For orgs that didn’t want to move to Active Directory from NT Domains, or couldn’t, it was a way to delay that move. And it gave Apple at least the ability, along with the Xserve and the Xserve RAID, to say “We have a place in the server room.” Which, in the halcyon days before we handed our entire infrastructure to Amazon and/or Microsoft Azure, was important.

There were a lot of people who learned how to be sysadmins because of that product. Which I think created the biggest thing about OSXS: the community. It was a weird community, but it was definitely there, and at times, it was just the most amazing, warm, welcoming thing. Ten seconds later, you wanted to burn it to the ground, but it was awesome just a little more than it was awful. There’s nothing like that any more. No, the MacAdmins Slack channel ain’t it. Not even close.

I met a lot of people who I’m still close to, or at least in touch with, because of that product. I’m glad it existed the way it did, and yes, it was time for it to go. But for a while, it was a really cool thing to be a part of, and I guess that’s the best thing you can say about any software. So to Eric, Doug, DaveO, Tom, Richard, all the people who helped create and make OSXS, and other folks like Chuck, Bartosh, Schoun, Kevin, Greg, Josh, Nigel, Joel, Andrina, Pam, Sara, Mark, and so many others, I will always feel privileged to have been able to share in such a special, weird, and regularly wonderful community with y’all.

Thanks.

(Not) Fried Ice Cream

It’s not really fried, it just looks that way when you’re done. There are variations that do actually fry the ice cream, but this isn’t one of them. This is a bit tedious, so you probably aren’t going to make these all the time, but for a special occasion, they rock mightily.
Ingredients:

• 1 half gallon Breyer’s Vanilla Ice Cream
• Whipped Cream, preferably hand-whipped, but good Cool Whip works too
• 1 box Special K cereal, crushed well, but not to a powder.
• Chocolate Fudge topping
• Molasses
• Wide and Shallow Sundae dishes

Keep all the ingredients but the ice cream in the refrigerator until you use them, and put them back as soon as you’re done. Obviously, keep the ice cream in the freezer. The harder you can freeze the ice cream, the better. You really want to make these one at a time so that the ice cream doesn’t have a chance to melt any more than necessary.

Scoop enough ice cream so that you can make a ball of ice cream about the size of a softball. Once you’re done, put the ice cream ball in the sundae dish, and put both in the freezer until it’s hard frozen again.

Once the ice cream is re-frozen, then you coat the ice cream ball in molasses, and roll it in the crushed Special K until it’s thoroughly covered, i.e. you can’t see ice cream or molasses. Put the ball back in the dish, put the dishes back in the freezer, and let it reset.

The last two steps are done right before serving.

Pull the dishes with the Special K-covered ice cream out of the freezer, cover the exposed part of the ice cream with the chocolate fudge topping, add the whipped cream and serve.

Margaritas from Scratch

It’s actually pretty simple to make Margs from scratch, but it’s tedious, since you have to juice the lemons and limes yourself. But I find that making them fresh this way gives them a better, fuller taste than you get from a mix.
Ingredients:

• Whole lemons
• Two whole limes for every lemon
• White sugar
• Salt
• Tequila

Juice the lemons and limes until you have enough juice for however many margaritas you want to make. Add sugar until it tastes good. Salt is totally optional for the rim of the glass. The amount of tequila is up to you.

Tips:
If you want frozen margaritas, don’t use an underpowered blender. That will just beat the ice until it’s more melted than not. You really want a proper bar blender, one of the metal Hamilton Beach blenders, like a 700-watt model, or a Ninja. Fill the blender with ice cubes, preferably the skinnier ice maker ones. Add the margarita and tequila until the non-ice space in the blender is filled. HOLD THE LID ON, and hit the switch until the insides are moving smoothly. You may have to hit the switch a few more times. Once the consistency is mostly smooth, pour and drink. You’ll probably always have some chunks of ice, unless you use crushed ice to begin with. Just use a bar strainer to catch the chunks of ice.

Pina Coladas from Scratch

This recipe is one you may have to play with a bit to get it to taste right. It’s also VERY rich, so if you’re drinking it straight or on the rocks, you’re going to fill up quick.

You can also hide a VERY large amount of rum with this recipe, so be careful. This is particularly good if you’re going with frozen piñas.

Ingredients:

  • 1 can unsweetened pineapple juice
  • 1 can cream of coconut
  • 1 pint half & half

Add the full can of the cream of coconut into the blender.

Pour in the half & half until the blender is about 3/4 full.

Shake the pineapple juice WELL, and fill the blender the rest of the way.

Run the blender until it’s mixed well. You may have to adjust the proportions until they work for your blender.

Add rum and enjoy.

If you want to make this ahead of time, it refrigerates well for a couple of days. Separation of the ingredients is normal, and is easily remedied in the blender.

Tips:
If you want frozen piñas, don’t use an underpowered blender. That will just beat the ice until it’s more melted than not. You really want a proper bar blender, one of the metal Hamilton Beach blenders, like a 700-watt model, or a Ninja. Fill the blender with ice cubes, preferably the skinnier ice maker ones. Add the piña colada and rum until the non-ice space in the blender is filled. HOLD THE LID ON, and hit the switch until the insides are moving smoothly. You may have to hit the switch a few more times. Once the consistency is mostly smooth, pour and drink. You’ll probably always have some chunks of ice, unless you use crushed ice to begin with. Just use a bar strainer to catch the chunks of ice.

Fun with Get-Command

So when doing shell things on macOS (or Linux, for that matter), you use things like “which” and “locate” a lot. They’re handy. Well, PowerShell, unsurprisingly, has its own version, Get-Command. (Full syntax and examples are at the Get-Command documentation page.)

There’s a lot there, but I wanted to talk about just two cases: getting a single command, and getting a list of similar commands. So if you just run Get-Command on a single command, e.g. Get-Command pwsh, you get:

single command results

So this looks nice, but it’s also functional. Let’s redo that command, but assign the result to a variable: $thePwshCommand = Get-Command pwsh

If we just run the variable name, $thePwshCommand, we see the same thing as we would for Get-Command pwsh. So it’s like any generic string, just with some nicer display formatting, right? Well, no. As it turns out, the result of Get-Command is not just a string, as shown by $thePwshCommand.GetType():

results of GetType()

So as we see, it’s not a string, but rather something called CommandInfo. What does this mean? Well, let’s say you don’t care about the type of executable a command is, or the version; you just want the path. No string parsing needed, just use $thePwshCommand.Source:

Et voila, no parsing of any kind needed

Since CommandInfo is a kind of object, you have some fun options for using what it returns without having to dink around with sed, grep, or what have you. The language and runtime take care of it for you.
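
For instance (and this is just one way to slice it), you can pull whichever CommandInfo properties you care about directly:

# Because Get-Command returns a CommandInfo object, you can grab just the
# properties you care about. No text parsing required.
$thePwshCommand = Get-Command pwsh

$thePwshCommand.Source        # full path to the executable
$thePwshCommand.CommandType   # Application, Cmdlet, Alias, etc.
$thePwshCommand.Version       # version info, where it exists

# Or several at once:
$thePwshCommand | Select-Object Name, CommandType, Version, Source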

But wait, there’s more. Now, as we can see, I’m running a preview version of PowerShell. Which means there’s a chance there are other versions of pwsh on my system. (By default, Get-Command returns only the first hit it finds.) So how do we get all the versions of pwsh? Wildcards: Get-Command pws*:

all the pws* things

So as we can see, there are a few things. But what happens if we use a var for that? Well, it’s kind of neat…first the command: $allThingsPwsh = Get-Command pws*. If we just display the var, $allThingsPwsh, we get about what we expect:

No surprises here

So that’s another CommandInfo object, right? Nope, as $allThingsPwsh.GetType() shows us:

Well that’s different

So when you get multiple items returned, it’s now an array of CommandInfo objects. If we want to see the second item, we ask for just that entry with $allThingsPwsh[1]:

Second item in the array

But as it’s an array of objects, we get more flexibility. If we just want to see the path for a given entry, we use $allThingsPwsh[2].Source:
/usr/local/bin/pwsh-preview

If we just want the paths for all the items, we use $allThingsPwsh.Source, and we get:
/usr/local/microsoft/powershell/7-preview/pwsh
/usr/local/bin/pwsh
/usr/local/bin/pwsh-preview

If we just want the name and the path separated by a tab, that’s easily done with
$allThingsPwsh[2].Name + "`t" +  $allThingsPwsh[2].Source:
pwsh-preview /usr/local/bin/pwsh-preview

So yeah, because PowerShell understands objects and arrays in a way shells don’t, you get a lot of flexibility from a single command, saving you from a lot of the endless string/output-parsing legerdemain you have to do in shell.
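
To make that concrete, a couple of quick sketches of the kind of filtering you get for free once the results are objects:

# The multi-result return is an array of CommandInfo objects, so you can
# filter on any property instead of grepping text.
$allThingsPwsh = Get-Command pws*

# Just the entries living in /usr/local/bin:
$allThingsPwsh | Where-Object { $_.Source -like "/usr/local/bin/*" }

# Name and path for everything, as a table:
$allThingsPwsh | Select-Object Name, Source | Format-Table -AutoSize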

For more info on arrays:
https://docs.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-arrays?view=powershell-7.2 and https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_arrays?view=powershell-7.2

Enjoy!

Porting an Application from AppleScriptObjectiveC to Xamarin/C#

This ended up taking less time than I thought it would. As it turns out, about 90% of the work I did on the Swift version was usable with little modification in the Xamarin version. The changes were mostly differences between variable declarations in C# vs. Swift, object inits, etc.

Links:

ASOC Original
Swift Version
Xamarin/C# Version

So in terms of just coding, MS and the Xamarin team have done a really good job. There are very few things that actually tripped me up going from Cocoa to Xamarin. Usually it was odd things, like having to use Environment.UserName to get the current user name because NSUserName() isn’t an option, or similar. Little things that are annoying more than OH GOD NO. The syntax MS uses for the .NET Core Mac bits is really similar to the Swift implementation in Cocoa, which is a huge benefit to anyone looking to use VS.

I think a lot of that is just C# and .NET being things that have existed for a while outside of the Mac, and so you’re going to have odd little differences where they’re not going to implement NSUserName() when they have something that already works.

As an aside, Git is just awful. Can someone please invent something that doesn’t have a UI/UX that only Linus Torvalds could love? Also, VS’s Git interface is…it is a thing, it exists, but it is not very good even for simple things. And if you have any idea you may ever want to use GitHub et al. for your project, ALWAYS SAY YOU WANT TO USE GIT WHEN YOU CREATE THE PROJECT. Ye gods did dealing with that suck.

The major thing is dealing with the UI. It’s very reminiscent of when Interface Builder was not a part of Xcode. To build a UI in Visual Studio, you basically shell out to Xcode, build your storyboards there, then drop back into VS. No SwiftUI yet, but there’s no reason you couldn’t do it. (It is still amusing to me that VS will have native ARM support on the Mac before it does on Windows.) You also need Xcode to publish to the MAS if that is your wish. I’m honestly not sure there’s a way to not need Xcode for the UI bits that wouldn’t be more trouble than it’s worth, but I could be wrong, and I’d be happy to be wrong.

Could you use Visual Studio: Mac for “real” development or cross-platform development? I think you could, and if they fix a few relatively minor issues, it would not be awful at all.

I mean, for the purposes of my experiment, I was rather happy with VS/Xamarin/C#/.NET, and I am glad I’ve done the work I have in PowerShell; that was quite a help. Now, if you could use PowerShell in VS:Mac, that would be awesome, since, you know, PowerShell is available for the Mac and all.

Just sayin’…

Porting an Application from AppleScriptObjectiveC to Swift

Okay, first, the links:

AppleScriptObjectiveC (ASOC) version: https://github.com/johncwelch/ScutilUtil
Swift version: https://github.com/johncwelch/scutil

So this took about…two days to do. Obviously, I am not an experienced Swift anything, right? So there are a lot of things I did that someone more experienced would have done differently. No, it’s not done in SwiftUI; life is too short for that UI-as-CSS torture. I’m glad y’all like it, but I find it tedious as hell. Way too much work to just put a button somewhere.

It is really hard to state just how much a higher-level language like ASOC spoils you. A LOT. It spoils you SO MUCH. So many things you just don’t ever have to think about. I comment the hell out of the code, so I’m not going to go into detail about it here. I will say that running a shell command with sudo privileges was way more difficult to manage than I thought it would be. NSAppleScript() works, but it’s really fragile, and without Swift playgrounds, I might have given up. Being able to just test a thing rapidly is really handy.

Also, for the love of god, think hard before you put a project you’re actively working on in a OneDrive folder. OneDrive handles code really badly. Like VERY BADLY.

I am also working on a Xamarin version just to see the differences. I just realized that doing the Swift version first would end up making the Xamarin version way easier.

Sometimes, High-Level Languages don’t suck

So recently, as an exercise, I wanted to see about moving one of my simpler ASOC (AppleScript ObjectiveC) applications to a different language. Then I thought, why not two? So the first one I’ve been working with is C# via Visual Studio on the Mac (the 2022 preview). It’s um…interesting. First, you still need Xcode to do all the UI setup. I’m not sure why, and it’s a bit janky, but it’s not bad per se.

I think the Xamarin docs need a lot more sample code; it’s really intimidating if you’re new to the platform. But anyway, part of this app, ScutilUtil (https://github.com/johncwelch/ScutilUtil), uses the scutil command to pull info about the computer’s host name. (There may be a “proper” way to do it, but that’s not worth the search when scutil is right there.) To do so, you run scutil with various switches, e.g. /usr/sbin/scutil --get ComputerName to get the computer name.

In ASOC, this is trivial, literally one line thanks to AppleScript’s ‘do shell script’ command:

set my theCurrentComputerName to do shell script "/usr/sbin/scutil --get ComputerName"

Easy-peasy. In C#/Xamarin…sigh. The best way I found was:

// Requires System.Diagnostics. Set up a Process to run scutil directly.
Process scutilTest = new Process();
scutilTest.StartInfo.UseShellExecute = false;
// Capture stdout so the command's output can be read back as a string.
scutilTest.StartInfo.RedirectStandardOutput = true;
scutilTest.StartInfo.FileName = "/usr/sbin/scutil";
scutilTest.StartInfo.Arguments = "--get LocalHostName";
scutilTest.Start();
string localHostName = scutilTest.StandardOutput.ReadToEnd();
scutilTest.WaitForExit();

Like, it works, right? But really? I mean, yes, this is a very cross-platform way to do it, but all that to run a single one-line command…it seems inelegant, and it seems inelegant on any of the platforms C#/Xamarin run on. macOS, Linux, and Windows all have a command-line environment. They all have command-line utilities that ship with the OS that you might want to use, because they’re there, and they’re pretty fast.

Why make calling them so hard? Why not have something that gets you closer to do shell script? I mean, the only command/script environments you will never see on all three are cmd.exe, which is Windows-only, and AppleScript, which is macOS-only (and not something you’d run often in cases like this anyway). But shell? All three can have it. PowerShell? All three.

So wouldn’t it make more sense to have a way to test for the presence of a command environment, and a way to just use a command environment? Like say:

string localHostName = RunCommand -Environment zsh -Command "/usr/sbin/scutil" -Arguments "--get localHostName" -NoNewWindow -Wait

I’m using PowerShell-ish syntax here, because VS and .NET et al, but you get the idea. You could even have a way to check for a specific command environment if you wanted to be fancy, etc. Again, all the non-phone/non-tablet .NET platforms have a command-line environment of some form. Why not just allow for that to be used easily?
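
For comparison, in actual PowerShell (pwsh on macOS, in this case) the whole Process dance above collapses to more or less this:

# In pwsh, external commands are first-class; the call operator runs the
# binary and the output comes back as a string, no plumbing required.
$localHostName = & /usr/sbin/scutil --get LocalHostName
$computerName  = & /usr/sbin/scutil --get ComputerName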

It’s times like these I wish Visual Studio Mac supported PowerShell as a language. Sigh.

A Good Example of a Bad Example

As most of y’all know, I’ve been kind of up in the OneDrive debacle for a while, going back to when the “new” OneDrive was only available on the Insider builds and breaking everything. But like everything that goes this epically wrong, there’s a lot that others (and the OneDrive team) can learn from this epic situation.

  1. One Major Change At A Time
    One of the biggest things hitting OneDrive Mac users is that there are two fundamental changes hitting them at once. This isn’t new chrome or a slight reordering of things; OneDrive on the Mac has made two fundamental changes that affect everyone using the app. One of those, the moving of the OneDrive root, was imposed on them via API changes from Apple. The other is Files On Demand no longer being optional. My guess is the latter was in the works when the former hit, and rather than delay the FoD changes, they went ahead with them anyway: “may as well have all the pain at once.”

    Developers, if anyone ever suggests this to you, fight back as hard as possible, because this will cause you pain for years. Years. The real problems from the OneDrive root change include:

    Using OneDrive on an external drive is now a real problem, one that may not be fixable

    A lot of workflows that depended on those files being in a specific place are broken

    Backing up OneDrive files to other systems may be a problem depending on the backup

    The OneDrive root change alone will take months to sort out, along with any bugs caused or discovered. Throwing the FoD change on top of it was just foolishness.

  2. Beta Programs Are Not Enough
    If you read the forum posts on this, it becomes obvious that there’s a host of users that are being whacked by the OneDrive root changes:

    People with workflows that rely on files being here, not there.
    People using document/media management applications like Adobe Bridge
    People using database-driven apps like DEVONthink

    There are a lot of apps and workflows that rely on files not being moved by third parties in the background. Heck, just scripting stuff relies on files not arbitrarily being moved or deleted. Yes, there’s the Insider Program, but that’s a highly self-selected audience, one that leans heavily toward IT, because we’re the ones who need to know what’s coming down the pipe, and even then it’s not a lot of people. A lot of IT folks, for good or bad, don’t test until the final release. I think that’s foolish, but that’s me.

    In any event, even though I know for a fact that the problems people are seeing now were reported during the beta cycle, I think the team either blew off the data as “well, not that many people will be harmed by this” or pulled a New Coke: they paid a great deal of attention to the data they liked (“We think this tastes better than old Coke”) and blew off the data they did not like (“But don’t replace Coke with this”). For example, for me, in and of itself, the OneDrive root change is not a real problem. It’s mildly annoying, but it doesn’t break anything.

    For folks with different, more complicated workflows, the OneDrive root changes break a lot of things. Like, a lot. But if I’m more representative of the people in the betas, then you’re getting skewed data that leads to bad conclusions.

  3. Major changes need a lot of warning
    The level of information people had about this at the start was somewhere between slim and none, and almost all of it was buried in MS dev/technical sites. That was stupid. Even if the only change had been the OneDrive root change, it should have been far more publicly announced and talked about, with examples of how to manage the change for things like OneDrive on external drives, Adobe Bridge and similar users, etc. It’s not like Microsoft didn’t have the money or resources to do this; they chose not to.

    Yes, I’m aware that notification doesn’t always work. Apple notified people about the elimination of 32-bit apps in macOS for over a year before it happened, with warnings appearing on application launch. I know of at least one entire university whose IT team ignored those warnings completely, so for six months, Mac users who had updated or gotten a new Mac couldn’t use the school’s VPN client, because the IT team hadn’t updated their stuff and made the 64-bit client available.

    But people ignoring information does not justify not having it available. If nothing else, having the documentation available ahead of time makes it easier to direct people with problems covered by that documentation.

  4. Make sure your support teams are well-informed of the change
    There’s really good data showing that the OneDrive team did not make sure the MS support staff were fully up to date on the changes and likely issues resulting from them. That borders on inexcusable.

  5. Don’t victim-blame
    Don’t pull a Steve Jobs and tell them they’re holding it wrong. If someone has had their stuff working for years with your product, and you make a change that breaks it, and now they have to do a lot of work to get back to where they were, telling them they’re doing something wrong is a bad idea. It’s not their fault, especially if you didn’t warn them well.

  6. Don’t Gaslight
    Telling people the new changes are really better for them and their objections/problems aren’t real or “there aren’t that many people who didn’t like the change”? Bad idea, very bad, almost guaranteed to motivate people to think hard about not being your customer anymore. Your opinion on the changes is not more important than the pain and work you’re causing folks.

  7. Make sure your “solutions” aren’t more trouble than they’re worth
    This applies more to the FoD “just keep clicking and downloading until it’s all done” coda. The level of anger created by making a customer do a lot of work just to get back to where they were three seconds prior to your change is not small. Ideally, the OneDrive team would have had scripts and other tools available to help make managing the OneDrive root changes and the FoD changes easier before the changes were implemented. That would have been a big help. It also would have helped if the command-line tools that could remedy this actually worked correctly and weren’t more work than clicking through a hundred folders.

  8. ONLY ONE MAJOR CHANGE AT A TIME
    Yes, I know, repeat. Enough people do this that a reminder is called for.

The OneDrive team got caught between the rock of API changes and a hard place of their own making. One I have sympathy for, the other…especially given the really bad response of the OneDrive team to the problems their bad decisions caused? Not so much.