Margaritas from Scratch

It’s actually pretty simple to make Margs from scratch, but it’s tedious, since you have to juice the lemons and limes yourself. Still, I find that making them fresh this way gives them a better, fuller taste than you get from a mix.
Ingredients:

• Whole lemons
• Two whole limes for every lemon
• White sugar
• Salt
• Tequila

Juice the lemons and limes until you have enough for however many margaritas you want to make. Add sugar until it tastes good. Salt is totally optional for the rim of the glass. The amount of tequila is up to you.

Tips:
If you want frozen margaritas, don’t use an underpowered blender. That will just beat the ice until it’s more melted than not. You really want a proper bar blender: one of the metal Hamilton Beach blenders, like a 700-watt model, or a Ninja. Fill the blender with ice cubes, preferably the skinnier ice-maker ones. Add the margarita and tequila until the non-ice space in the blender is filled. HOLD THE LID ON, and hit the switch until the insides are moving smoothly. You may have to hit the switch a few more times. Once the consistency is mostly smooth, pour and drink. You’ll probably always have some chunks of ice, unless you use crushed ice to begin with. Just use a bar strainer to catch the chunks.


Pina Coladas from Scratch

This recipe is one you may have to play with a bit to get it to taste right. It’s also VERY rich, so if you’re drinking it straight or on the rocks, you’re going to fill up quick.

You can also hide a VERY large amount of rum with this recipe, so be careful. This is particularly good if you’re going with frozen piñas.

Ingredients:

  • 1 can unsweetened pineapple juice
  • 1 can cream of coconut
  • 1 pint half & half

Add the full can of the cream of coconut into the blender.

Pour in the half & half until the blender is about 3/4 full.

Shake the pineapple juice WELL, and fill the blender the rest of the way.

Run the blender until it’s mixed well. You may have to adjust the proportions until they work for your blender.

Add rum and enjoy.

If you want to make this ahead of time, it refrigerates well for a couple of days. Separation of the ingredients is normal, and is easily remedied in the blender.

Tips:
If you want frozen piñas, don’t use an underpowered blender. That will just beat the ice until it’s more melted than not. You really want a proper bar blender: one of the metal Hamilton Beach blenders, like a 700-watt model, or a Ninja. Fill the blender with ice cubes, preferably the skinnier ice-maker ones. Add the piña colada and rum until the non-ice space in the blender is filled. HOLD THE LID ON, and hit the switch until the insides are moving smoothly. You may have to hit the switch a few more times. Once the consistency is mostly smooth, pour and drink. You’ll probably always have some chunks of ice, unless you use crushed ice to begin with. Just use a bar strainer to catch the chunks.

Fun with Get-Command

So when doing shell things in macOS (or Linux, for that matter), you use things like “which” and “locate” a lot. They’re handy. Well, PowerShell, unsurprisingly, has its own version: Get-Command. (Full syntax and examples are at the Get-Command documentation page.)

There’s a lot there, but I wanted to talk about just two cases: getting a single command, and getting a list of similar commands. If you just run Get-Command on a single command, e.g. Get-Command pwsh, you get:

[screenshot: single command results]

So this looks nice, and it’s functional. Let’s redo that command, but assign it to a variable: $thePwshCommand = Get-Command pwsh

If we just run the variable name, $thePwshCommand, you see the same thing as you would for Get-Command pwsh. So it’s like any generic string, but with some nicer display formatting, right? Well, no. As it turns out, the result of Get-Command is not just a string, as shown by $thePwshCommand.GetType():

[screenshot: results of GetType()]

So as we see, it’s not a string, but rather something called CommandInfo. What does this mean? Well, let’s say you don’t care about the type of executable a command is, or the version; you just want the path. No string parsing needed, just use $thePwshCommand.Source:

[screenshot: et voilà, no parsing of any kind needed]

Since CommandInfo is a kind of object, you have some fun options for using what it returns without having to dink around with sed, grep, or what have you. The language and runtime take care of it for you.
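For example, since the properties are sitting right there on the object, you can grab just the bits you want (a quick sketch; the exact values on your machine will differ):

$thePwshCommand = Get-Command pwsh
# pick out just the properties you care about, no text munging required
$thePwshCommand | Select-Object Name, Version, Source
# or build a string straight from the properties
"{0} lives at {1}" -f $thePwshCommand.Name, $thePwshCommand.Source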

But wait, there’s more. Now, as we can see, I’m running a preview version of PowerShell, which means there’s a chance there are other versions of pwsh on my system. (By default, Get-Command returns the first hit it finds.) So how do we get all the versions of pwsh? Wildcards: Get-Command pws*:

[screenshot: all the pws* things]

So as we can see, there are a few things. But what happens if we use a variable for that? Well, it’s kind of neat. First, the command: $allThingsPwsh = Get-Command pws*. If we just display the variable, we get about what we expect for $allThingsPwsh:

[screenshot: no surprises here]

So that’s another CommandInfo object, right? Nope, as $allThingsPwsh.GetType() shows us:

[screenshot: well, that’s different]

So when you get multiple items returned, it’s now an array of CommandInfo objects. So if we want to see the second item, we get just that entry with $allThingsPwsh[1]:

[screenshot: second item in the array]

But as it’s an array of objects, we get more flexibility. If we just want the path for the third entry, we use $allThingsPwsh[2].Source:
/usr/local/bin/pwsh-preview

If we just want the paths for all the items, we use $allThingsPwsh.Source, and we get:
/usr/local/microsoft/powershell/7-preview/pwsh
/usr/local/bin/pwsh
/usr/local/bin/pwsh-preview

If we just want the name and the path separated by a tab, that’s easily done with
$allThingsPwsh[2].Name + "`t" + $allThingsPwsh[2].Source:
pwsh-preview /usr/local/bin/pwsh-preview
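And if you want that name/path pair for every item in the array, a quick ForEach-Object does it (a minimal sketch):

# emit a tab-separated name/path line for each CommandInfo in the array
$allThingsPwsh | ForEach-Object { $_.Name + "`t" + $_.Source }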

So yeah, because PowerShell understands objects and arrays better than things like shell, you get a lot of flexibility in the command, saving you from the endless string/output parsing legerdemain you have to do in shell.
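As a quick illustration of the difference (a sketch; the shell side is just one way you might do it):

# shell: flat text that you then carve up with awk/sed/grep
which -a pwsh
# PowerShell: objects you can filter and project directly, no parsing
Get-Command pws* | Where-Object CommandType -eq 'Application' | Select-Object Name, Source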

For more info on arrays:
https://docs.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-arrays?view=powershell-7.2 and https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_arrays?view=powershell-7.2

Enjoy!

Porting an Application from AppleScriptObjectiveC to Xamarin/C#

This ended up taking less time than I thought it would. As it turns out, about 90% of the work I did on the Swift version was usable with few modifications in the Xamarin version; it was mostly differences between variable declarations in C# vs. Swift, object inits, etc.

Links:

ASOC Original
Swift Version
Xamarin/C# Version

So in terms of just coding, MS and the Xamarin team have done a really good job. There are very few things that actually tripped me up going from Cocoa to Xamarin. Usually, it was odd things, like having to use Environment.UserName to get the current user name because NSUsername() isn’t an option, or similar. Little things that are annoying more than OH GOD NO. The syntax MS uses for the .NET Core Mac bits is really similar to the Swift implementation in Cocoa, which is a huge benefit to anyone looking to use VS.

I think a lot of that is just C# and .NET being things that have existed for a while outside of the Mac, and so you’re going to have odd little differences where they’re not going to implement NSUsername() when they have something that already works.

As an aside, Git is just awful. Can someone please invent something that doesn’t have a UI/UX only Linus Torvalds could love? Also, VS’s Git interface is…it is a thing, it exists, but it is not very good even for simple things. And if you have any idea you may ever want to use GitHub et al. for your project, ALWAYS SAY YOU WANT TO USE GIT WHEN YOU CREATE THE PROJECT. Ye gods, did dealing with that suck.

The major thing is dealing with the UI. It’s very reminiscent of when Interface Builder was not a part of Xcode. To build a UI in Visual Studio, you basically shell out to Xcode, build your storyboards there, then drop back into VS. No SwiftUI yet, but there’s no reason you couldn’t do it. (It is still amusing to me that VS will have native ARM support on the Mac before it does on Windows.) You also need Xcode to publish to the MAS, if that is your wish. I’m honestly not sure there’s a way to not need Xcode for the UI bits that wouldn’t be more trouble than it’s worth, but I could be wrong, and I’d be happy to be wrong.

Could you use Visual Studio: Mac for “real” development or cross-platform development? I think you could and if they fix a few relatively minor issues, it would be not awful at all.

I mean, for the purposes of my experiment, I was rather happy with VS/Xamarin/C#/.NET, and I am glad I’ve done the work I have in PowerShell; that was quite a help. Now, if you could use PowerShell in VS:Mac, that would be awesome, since, you know, PowerShell is available for the Mac and all.

Just sayin’…

Porting an Application from AppleScriptObjectiveC to Swift

Okay, first, the links:

AppleScriptObjectiveC (ASOC) version: https://github.com/johncwelch/ScutilUtil
Swift version: https://github.com/johncwelch/scutil

So this took about…two days to do. Obviously, I am not an experienced Swift anything, right? So there are a lot of things I did that someone more experienced would have done differently. No, it’s not done in SwiftUI; life is too short for that UI-as-CSS torture. I’m glad y’all like it, but I find it tedious as hell. Way too much work to just put a button somewhere.

It is really hard to state just how much a higher-level language like ASOC spoils you. A LOT. It spoils you SO MUCH. So many things you just don’t ever have to think about. I comment the hell out of the code, so I’m not going to go into detail about it here. I will say that managing running a shell command with sudo privileges was way more difficult than I thought it would be. NSAppleScript() works, but it’s really fragile, and without Swift playgrounds, I might have given up. Being able to just test a thing rapidly is really handy.

Also, for the love of god, think hard before you put a project you’re actively working on in a OneDrive folder. OneDrive handles code really badly. Like VERY BADLY.

I am also working on a Xamarin version just to see the differences. I just realized that doing the Swift version first would end up being way easier for the Xamarin version.

Sometimes, High-Level Languages don’t suck

So recently, as an exercise, I wanted to see about moving one of my simpler ASOC (AppleScript ObjectiveC) applications to a different language. Then I thought, why not two? The first one I’ve been working with is C# via Visual Studio on the Mac (the 2022 preview). It’s, um…interesting. First, you still need Xcode to do all the UI setup. I’m not sure why, and it’s a bit janky, but it’s not bad per se.

I think the Xamarin docs need a lot more sample code; it’s really intimidating if you’re new to the platform. But anyway, part of this app, ScutilUtil (https://github.com/johncwelch/ScutilUtil), uses the scutil command to pull info about the computer’s host name. (There may be a “proper” way to do it, but that’s not worth the search when scutil is right there.) To do so, you run scutil with various switches, e.g. /usr/sbin/scutil --get ComputerName to get the computer name.

In ASOC, this is trivial, literally one line thanks to AppleScript’s ‘do shell script’ command:

set my theCurrentComputerName to do shell script "/usr/sbin/scutil --get ComputerName"

Easy-peasy. In C#/Xamarin…sigh. The best way I found was:

// (requires using System.Diagnostics;)
// set up a process to run scutil and capture what it prints
Process scutilTest = new Process();
scutilTest.StartInfo.UseShellExecute = false;        // don't go through the OS shell
scutilTest.StartInfo.RedirectStandardOutput = true;  // so we can read stdout ourselves
scutilTest.StartInfo.FileName = "/usr/sbin/scutil";
scutilTest.StartInfo.Arguments = " --get LocalHostName";
scutilTest.Start();
string localHostName = scutilTest.StandardOutput.ReadToEnd();
scutilTest.WaitForExit();

Like, it works, right? But really? I mean, yes, this is a very cross-platform way to do it, but all that to run a single one-line command…it seems inelegant, and it seems inelegant on any of the platforms C#/Xamarin runs on. macOS, Linux, and Windows all have a command-line environment. They all have command-line utilities that ship with the OS that you might want to use, because they’re there, and pretty fast.

Why make calling them so hard? Why not have something that gets you closer to do shell script? Out of all of them, the only command/script environments you will never see on all three are cmd.exe, which is Windows-only, and AppleScript, which is macOS-only (and not something you’d run often in cases like this anyway). But shell? All three can have it. PowerShell? All three.

So wouldn’t it make more sense to have a way to test for the presence of a command environment, and a way to just use a command environment? Like say:

string localHostName = RunCommand -Environment zsh -Command "/usr/sbin/scutil" -Arguments "--get localHostName" -NoNewWindow -Wait

I’m using PowerShell-ish syntax here, because VS and .NET et al, but you get the idea. You could even have a way to check for a specific command environment if you wanted to be fancy, etc. Again, all the non-phone/non-tablet .NET platforms have a command-line environment of some form. Why not just allow for that to be used easily?
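Amusingly, PowerShell itself already gets close to this ideal: running a native command and capturing its output is a one-liner via the call operator (a sketch assuming the macOS path; Start-Process is there too if you want the -NoNewWindow/-Wait style of control):

# call the native binary directly; its stdout comes back as a string
$localHostName = & /usr/sbin/scutil --get LocalHostName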

It’s times like these I wish Visual Studio Mac supported PowerShell as a language. Sigh.

A Good Example of a Bad Example

As most of y’all know, I’ve been kind of up in the OneDrive debacle for a while, going back to when the “new” OneDrive was only available on the Insider builds and breaking everything. But like everything that goes this epically wrong, there’s a lot others (and the OneDrive team) can learn from this epic situation.

  1. One Major Change At A Time
    One of the biggest things hitting OneDrive Mac users is that there are two fundamental changes hitting them at once. This isn’t new chrome or a slight reordering of things; OneDrive on the Mac has made two fundamental changes that affect everyone using the app. One of those, the moving of the OneDrive root, was imposed on them via API changes from Apple. The other is Files-On-Demand no longer being optional. My guess is the latter was in the works when the former hit, and rather than delay the FoD changes, they went ahead with them anyway: “may as well have all the pain at once.”

    Developers, if anyone ever suggests this to you, fight back as hard as possible, because this will cause you pain for years. Years. The real problems from the OneDrive root change include:

    • Using OneDrive on an external drive is now a real problem, one that may not be fixable
    • A lot of workflows that depended on those files being in a specific place are broken
    • Backing up OneDrive files to other systems may be a problem, depending on the backup

    The OneDrive root change alone will take months to sort out, along with any bugs caused or discovered. Throwing the FoD change on top of it was just foolishness.

  2. Beta Programs Are Not Enough
    If you read the forum posts on this, it becomes obvious that there’s a host of users being whacked by the OneDrive root changes:

    • People with workflows that rely on files being here, not there
    • People using document/media management applications like Adobe Bridge
    • People using database-driven apps like DEVONthink

    There are a lot of apps and workflows that rely on files not being moved by third parties in the background. Heck, just scripting stuff relies on files not being arbitrarily moved or deleted. Yes, there’s the Insider Program, but that’s going to be a highly self-selected audience, one that leans heavily toward IT, because we’re the ones who need to know what’s coming down the pipe, and even then, not a lot. A lot of IT folks, for good or bad, don’t test until the final release. I think that’s foolish, but that’s me.

    In any event, even though I know for a fact that the problems people are seeing now were reported during the beta cycle, I think the team either blew off the data as “well, not that many people will be harmed by this,” or pulled a New Coke: they paid a great deal of attention to the data they liked (“We think this tastes better than ‘old’ Coke”) and blew off the data they did not like (“But don’t replace Coke with this”). For example, for me, in and of itself, the OneDrive root change is not a real problem. It’s mildly annoying, but it doesn’t break anything.

    For folks with different, more complicated workflows, the OneDrive root changes break a lot of things. Like, a lot. But if I’m more representative of the people in the betas, you’re getting skewed data that ends up being bad.

  3. Major changes need a lot of warning
    The level of information people had about this at the start was between slim and none, and almost all of it was buried in MS dev/technical sites. That was stupid. Even if the only change had been the OneDrive root change, it should have been far more publicly announced and talked about, with examples of how to manage the change for things like OneDrive on external drives, Adobe Bridge and similar users, etc. It’s not like Microsoft didn’t have the money or resources to do this; they chose not to.

    Yes, I’m aware that notification doesn’t always work. Apple notified people about the elimination of 32-bit apps in macOS for over a year before it happened, with warnings appearing on application launch. I know of at least one entire university whose IT team ignored those warnings completely, so for six months, Mac users who had updated or gotten a new Mac couldn’t use the school’s VPN client, because IT hadn’t updated their stuff and made the 64-bit client available.

    But people ignoring information does not justify not having it available. If nothing else, having the documentation available ahead of time makes it easier to direct people with problems covered by that documentation.

  4. Make sure your support teams are well-informed of the change
    There’s really good data showing that the OneDrive team did not make sure the MS support staff were fully up to date on the changes and likely issues resulting from them. That borders on inexcusable.

  5. Don’t victim-blame
    Don’t pull a Steve Jobs and tell them they’re holding it wrong. If someone has had their stuff working for years with your product, and you make a change that breaks it, and now they have to do a lot of work to get back to where they were, telling them they’re doing something wrong is a bad idea. It’s not their fault, especially if you didn’t warn them well.

  6. Don’t Gaslight
    Telling people the new changes are really better for them, and that their objections/problems aren’t real, or that “there aren’t that many people who didn’t like the change”? Bad idea. Very bad; almost guaranteed to motivate people to think hard about not being your customer anymore. Your opinion on the changes is not more important than the pain and work you’re causing folks.

  7. Make sure your “solutions” aren’t more trouble than they’re worth
    This applies more to the FoD “just keep clicking and downloading until it’s all done” coda. The level of anger created by making a customer do a lot of work just to get back to where they were three seconds prior to your change is not small. Ideally, the OneDrive team would have had scripts and other tools available to make managing the OneDrive root changes and the FoD changes easier before the changes were implemented. It would also have helped if the command-line tools that could remedy this actually worked correctly and weren’t more work than clicking a hundred folders.

  8. ONLY ONE MAJOR CHANGE AT A TIME
    Yes, I know, repeat. Enough people do this that a reminder is called for.

The OneDrive team got caught between the rock of API changes and a hard place of their own making. The first I have sympathy for; the other…especially given the really bad response of the OneDrive team to the problems their bad decisions caused? Not so much.

Dealing with “new” OneDrive

So the changes that bit me in the tuchas in this post have come to production OneDrive, and I’m almost surprised to see my reaction was on the mild side. I’m not surprised it went this way, though: the forcing of FoD, aka “deleting a bunch of files from your hard drive without so much as a by-your-leave,” was a shortsighted idea on par with the decision to make the Oozinator and its infamous ad.

For the more technically astute, there are two technotes that apply:

https://docs.microsoft.com/en-us/onedrive/files-on-demand-mac gives you some tips on configuration, including the infamous /setpin switch, which in theory would let you better automate keeping copies of things locally. In practice, it’s awful to use; see my original post for my experience. Like, really, it’s a dumb implementation, and that is the kindest thing I can say.

https://docs.microsoft.com/en-us/onedrive/deploy-and-configure-on-macos has more useful information, in particular the FilesOnDemandEnabled key, which can be set to off. The OneDrive team of course doesn’t recommend doing this. Right now, the effort involved in caring less about the OneDrive team’s wishes with regard to Files-On-Demand would be considerable, so I shan’t bother. Suffice it to say, their wishes on this issue have as much value to me as my turning off Files-On-Demand did for them.
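For reference, on the standalone (non-App Store) build, setting that key is a one-line defaults write in Terminal. This is a sketch based on the deployment doc above; the preference domain differs for the App Store build, so verify against the doc before relying on it:

# disable Files-On-Demand for the standalone OneDrive build (domain per the MS deployment doc)
defaults write com.microsoft.OneDrive FilesOnDemandEnabled -bool False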

To be blunt: were a random script or executable to do what OneDrive is doing here, namely deleting data without so much as a warning, we would call that script malware and warn the world about it so suitable countermeasures could be implemented. That OneDrive gives you an (as yet manual) method to eventually get back all the files that were already local doesn’t change the malware-like behavior OneDrive is engaging in here.

As far as what I’ve seen from the OneDrive team on this:

“We didn’t have many users turning off Files-On-Demand; this is not a real problem.”

The usage or lack of usage of Files-On-Demand is not the problem. The problem is OneDrive deleting large amounts of data without the user having any real warning about it, or any option to refuse. It is even worse for people such as myself who had turned Files-On-Demand off, and had OneDrive decide, “nah.”

“We give you a way to still have all your files local.”

If I total your car while you’re asleep and pay for its replacement, you are still without a car for a non-zero amount of time. The fact that it’s not costing you any money doesn’t reduce the time it takes to get back to where you were before I totaled your car to zero. Same thing here. This also assumes that everyone using OneDrive has an unmetered, high-bandwidth (at least 100Mbps) connection to the public internet, and the time to re-download potentially gigabytes of data; 26.2GB in my case. Even with Google Fiber, that’s not zero time. There is no small amount of elitism and arrogance in that statement.
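To put numbers on that: 26.2GB is roughly 210 gigabits, so even at a sustained 100Mbps, that’s about 2,100 seconds, call it 35 minutes, of doing nothing but re-downloading files you already had.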

“Apple’s API changes required this.”

They required you to move the OneDrive folder; they most certainly did not require forcing everyone onto Files-On-Demand. Insinuating otherwise is quite insulting to your customers’ intelligence.

That there is still some method to undo this…nonsense does not make it zero bad, or zero damage. This needs to be fixed. There is nothing so amazing about Files-On-Demand that makes this decision worth the damage it has caused OneDrive, and I think it a shame that the team’s mindset comes from such a place of elitism and deliberate ignorance.

Portable PowerShell In an Application Bundle

So obviously, I have a fondness for PowerShell on macOS, but one of the downsides with really integrating it into things is being able to rely on it as a resource. It’d be nice if you could just include it with a script or bundle, so that it was available for that script or application to use as needed.

Well, it turns out you can, thanks to the PowerShell team making some really good decisions in how they distribute PowerShell on macOS. If you download the installer package from the PowerShell GitHub site, https://github.com/PowerShell/PowerShell/releases/tag/v7.2.1 (the current stable version as of this post), and you open the package up in something like Pacifist, you realize that all the important PowerShell bits, outside of the man page and an app launcher, are in the same directory:

[screenshot: everything is in that directory]

When I saw that, I decided to do a really quick and dirty test of my “Portable PowerShell” thought. Using Script Debugger, I created a script bundle, which is an AppleScript in a bundle format, which means it has the Contents/Resources/ tree. I then copied the usr/local/microsoft/powershell/&lt;version&gt; directory tree from the install package into the Resources folder. In the script bundle itself, I set up the path to that copy:

set thePwshAlias to ((path to me as text) & "Contents:Resources:powershell:usr:local:microsoft:powershell:7-preview:pwsh") as alias
set thePwshPath to quoted form of (POSIX path of thePwshAlias)

Then the statement that does the actual test:

do shell script thePwshPath & " -c \"Get-PSDrive\""

And the expected result:

Name           Used (GB)     Free (GB) Provider      Root
----           ---------     --------- --------      ----
/                 782.49        149.06 FileSystem    /
Alias                                  Alias
Env                                    Environment
Function                               Function
Temp              782.49        149.06 FileSystem    /var/folders/0_/h5qr4rfj2
Variable                               Variable

Now, some caveats:

  1. This is literally the extent of my testing. Making this work with custom modules, etc, is going to involve more work on the setup, if that’s even possible.
  2. I HAVE NO IDEA IF THE POWERSHELL LICENSE ALLOWS THIS. If the lawyers get flustered and stern, telling them “Well, John said you could do it” will result in me laughing and saying “Yes, technically this is possible; I made no claims as to legality. It may be fine, it may be verboten, I have not checked either way.”

My thought on this is that if you were creating a cross-platform app and you wanted to implement PowerShell for that app, this is a way to ensure it works on macOS without making the user do a bunch of separate installs. It’s a proof of concept that happens to work, so, coolness. As I get time, I’ll beat on it more; there’s some real potential here.

Also, many thanks to the PowerShell team for doing things in a way that made this trivial to test and play with.

Things AppleScript does better

Seeing as I’ve (deservedly) been slightly hard on the Apple automation team(s), I’d like to point out that for all its faults, there are things AppleScript does far better than PowerShell, and things that Microsoft could learn from to improve.

  1. Basic UI functions: I’m talking about really basic things that you might want to put in a script, like a UI for choosing a file, a folder, etc. This isn’t even close: PowerShell at best is almost as easy as AppleScript for some things, but at worst…ugh. To choose a file, it’s pretty simple:

    Add-Type -AssemblyName System.Windows.Forms

    $FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{ InitialDirectory = [Environment]::GetFolderPath('Desktop') }

    $null = $FileBrowser.ShowDialog()


    That’s not significantly worse than AppleScript’s “choose file” dialog, which forgoes only the ShowDialog() method (and the Add-Type line to load WinForms in the first place). The issue is that depending on how you’re running the script, the file browser may show up behind other windows, and there’s no easy way to force it to the front.

    Where PowerShell falls over is if you want the user to enter text in a dialog box. For AppleScript, it’s literally a single line:

    set theReply to text returned of display dialog "I am a dialog" default answer "default text"

    That’s it. That’s all you need. You can also get the button returned, or whether the “gave up” flag was hit; it’s all in a single record if you want all three.

    In PowerShell, this is around 30 lines of code where you’re building the OK and Cancel buttons by hand, setting their sizes, etc. (see the sketch after this list). That’s ridiculous for something that simple. The same goes for building a list of text items to choose from. In AppleScript:

    set theChoice to choose from list {"one", "two", "three"} with title "Choose from the list" with prompt "Pick something" default items {"one"} with multiple selections allowed without empty selection allowed

    The result is a list of choices.

    PowerShell, again, makes you build the entire dialog by hand. Like, if I’m doing .NET dev in a text editor, sure. But for what should be a higher-level scripting implementation, that’s ridiculous.

    Ironically, PowerShell on macOS can integrate with AppleScript UI primitives like display dialog more easily than you can do the equivalent on Windows. Y’all, come on. This is just stupid.
  2. Interacting with applications. The syntax for AppleScript gets a lot of guff, but launching an application and the like is far more intuitive and easier to learn in AppleScript than in PowerShell:

    tell application "Microsoft Excel" to launch

    or if you want it to launch and become active:

    tell application "Microsoft Excel" to activate

    With PowerShell, it can be simple if the application is in a known location, but if not, you have to know the path, etc. Which can be more than a little painful, but that’s an OS issue.

    Being able to just tell an application to do stuff (tell application "Microsoft Excel"…) is also nice when actually interacting with the running application. Compare attaching to the running copy of Excel in PowerShell:

    $theExcelApp = [Runtime.InteropServices.Marshal]::GetActiveObject("Excel.Application")

    This is the kind of thing a higher-level language shouldn’t require this much work for. The level of detail needed to do simple stuff in PowerShell is way higher than it should be.
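Here’s the input-box sketch promised above, so you can see what “building it all by hand” means. This is a minimal version of the standard WinForms pattern (Windows-only; the sizes, positions, and strings are just illustrative):

Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing

# the dialog window itself
$form = New-Object System.Windows.Forms.Form
$form.Text = 'I am a dialog'
$form.Size = New-Object System.Drawing.Size(300, 170)
$form.StartPosition = 'CenterScreen'

# the prompt text
$label = New-Object System.Windows.Forms.Label
$label.Location = New-Object System.Drawing.Point(10, 15)
$label.Size = New-Object System.Drawing.Size(260, 20)
$label.Text = 'I am a dialog'
$form.Controls.Add($label)

# the text field, with the default answer pre-filled
$textBox = New-Object System.Windows.Forms.TextBox
$textBox.Location = New-Object System.Drawing.Point(10, 40)
$textBox.Size = New-Object System.Drawing.Size(260, 20)
$textBox.Text = 'default text'
$form.Controls.Add($textBox)

# OK and Cancel buttons, built and placed by hand
$okButton = New-Object System.Windows.Forms.Button
$okButton.Location = New-Object System.Drawing.Point(115, 80)
$okButton.Text = 'OK'
$okButton.DialogResult = [System.Windows.Forms.DialogResult]::OK
$form.AcceptButton = $okButton
$form.Controls.Add($okButton)

$cancelButton = New-Object System.Windows.Forms.Button
$cancelButton.Location = New-Object System.Drawing.Point(195, 80)
$cancelButton.Text = 'Cancel'
$cancelButton.DialogResult = [System.Windows.Forms.DialogResult]::Cancel
$form.CancelButton = $cancelButton
$form.Controls.Add($cancelButton)

# show it and pull the text back out; the part AppleScript does in one line
if ($form.ShowDialog() -eq [System.Windows.Forms.DialogResult]::OK) {
    $theReply = $textBox.Text
}

All of that to get the functionality of one line of AppleScript.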

There are other items, but some of them come down to syntax preferences. I find the English-ish nature of AppleScript helpful; other people hate it. I hate dot languages; I find them unnecessarily obtuse. The two items I listed here cover rather a lot, though, so it’s not just “oh, two things.”

Honestly, this is part of the reason I find Apple’s hostility toward user-created automation via things like AppleScript so infuriating. The OS has a good base, needing only a better system-level automation framework for more coherency across the board; but instead of building on that foundation, it really seems like Apple is moving to gut the entire thing and force you into Shortcuts, Swift, or nothing. That’s really quite dumb.