Jack of All Trades…

Is pretty damned awesome

Because it’s come up a few times on LinkedIn, I wanted to talk a bit about “job-hopping.” You know that awful thing people do where they don’t work at a job “long enough” and something, something, it’s a bad idea to hire you.

I don’t know why this bullshit persists…

Okay, I do know, and it’s stupid. It’s a holdover beloved by boomers from the days when you worked for one company your whole life. Boomers love that shit. “Loyalty”…man, they go on about loyalty and how no one is loyal anymore. You know who’s really not loyal?


Yeah, none of these jerks want to talk about that. About how the days when you got regular cost-of-living raises, pensions, retirement benefits, promotions, and all the other boomer-era employment participation awards are long gone, even though boomers act like they still exist. (And yeah, that’s what a COLA raise is: a participation award. You showed up, you didn’t screw up, here’s more money. Boomers will crap all over participation awards while remaining preciously ignorant about how their entire lives were wrapped around them.)

Hell, boomers still believe that the main way you get a job is by walking into a place and showing gumption and potential.

OK Boomer

What nonsense.

Like seriously, they bang on about loyalty, but when a company yeets thousands of people into the streets, “Oh well, you have to make hard choices.”

Boomers are curiously unidirectional with their loyalty crap. The company can crater your benefits, make your health insurance nigh-unaffordable, make you pay twice for health coverage (That’s what an HSA is. Paying twice for the same thing. Tax benefits don’t change that), and make you pay almost all the cost for your own pension while acting like a 5% match is some kind of gift. Honkie please.

Oh, and they hate unions. I mean, not when it meant their parents had some protection, because it was okay when they did it. But now that they’re at the top, them workers gotta stop being so uppity and acting like their labor has any value beyond what we feel like paying.

Boomers demand total loyalty and show precisely zero. Boomers, as a rule, can kiss my ass. (Thank god my parents were Depression-era. I lucked out that way.)

When your only way to get a raise is to switch jobs, of course you’re switching jobs on the reg. When the only way to get better benefits is to switch jobs? Get a promotion that actually includes more money? When there is no job security because the CEO only cares about their exit package? When your retirement is almost entirely dependent on how well you play the market? When companies fight for “at will” employment, which means not only can they fire you because it’s Tuesday, but you have almost no protection against discrimination because any pushback and “So we’re making your position redundant, get out.”

(Don’t give me that shit about the law. If you can find a lawyer who will basically work for free for years, and if you can keep fighting for decades, and if you accept that once word gets out about the lawsuit you’ll be basically unemployable and can afford that…then you can sue over discrimination. If all of those aren’t true, you’re stupid to even try to sue for it. The law just makes it possible to win, not easy.)

So anyone giving you crap about job-hopping, if it’s at all possible, walk out. Working for that person will suck low-quality balls. That is a person who wants loyalty from you and will show you none.

The Upside

However, there’s actually an upside to “job hopping”: breadth of experience.

If you want well-researched details, “Range”, by David Epstein, is an amazing book. But just in my own career, where I’ve changed jobs, on average, every 4-5 years, across different positions and industries, my “lack of loyalty” means that I have this bizarrely wide range of experience as a sysadmin and IT tech. While the tech I’ve worked with has been basically the same from gig to gig, the implementations and management requirements of that tech have been really different, and that has been huge for me.

It’s let me avoid getting locked into a “magically perfect” solution, because how? When you go from a university to a financial services company, how the hell would what works for the former even be an option for the latter? If you tried to impose a financial service’s network requirements on an advertising company, you’d be fired out of a cannon into a brick wall, and rightfully so, because it would be a remarkably stupid idea.

Because I’ve had such a wide range of experience and situations, I can handle a wide range of problems and come up with solutions I’d never even begin to think of if I’d worked in the same situation for 28 years. In a sense, it’s a lot like being a consultant: I’ve never had the luxury of consistency. Honestly, doing the same thing for too long, I get to where I’m just phoning it in. I’ll automate it within an inch of its life, make it all into a system, and get it to where a reasonably smart marmoset can run it.

That’s what I do to everything eventually. And while it seems nice, honestly, only having to work an hour a day or so is boring as hell. Which is when the resumes go out. If I can find a new thing to do, then that’s cool again, but, a lot of companies make that remarkably hard to do, which is just a “WHY???” thing. But it happens a lot.

So yeah, I may not have deeeeep experience in any one thing (other than maybe SNMP and AppleScript), but thanks to my “Bored now, bye” thing, I’ve encountered damned near everything, so nothing is all that surprising, and if it is new, I can figure it out pretty quickly.

So when you see a resume that shows a wandering-assed path through the world, realize that just may be a gift you’re looking at, and take more than six seconds to really read it. May work out better than you’ve been trained to think.

More Complex PowerShell-AppleScript Interactions

What If You Need More Than One Line?

So in the process of creating my series of ways to have PowerShell connect to AppleScript, thereby giving PowerShell far more access to the richness of what macOS has to offer, I realized that what I’d been doing had been limited to one-line commands. Choose File, Display Dialog, etc. Those are valuable, but sometimes, you need more than one line.

Now, usually, you’ll see a lot of really weird calls to osascript -e with complex quoting and escaping, things that tend to not work terribly well within the PowerShell world on macOS. But there is another way to use the AppleScript runtime from the shell environment. You create a shell script, e.g. “osatest.sh”, but for the shebang, you use #!/usr/bin/osascript instead of the usual #!/bin/bash.

The advantage of this method is that you can then write fairly complex scripts with multiple lines, user input, probably even application control, and then provide a return to the PowerShell environment in a number of ways, e.g. return values or a return file path.

As a short example, here’s one I used in testing:


display dialog "Look ma, multiple steps with osascript and no funky quotes!"
display alert "THIS IS SO COOL"
set theList to {firstword:"This",secondword:"Is",thirdword:"a",fourthword:"list"}
return theList

So this does a few simple things: displays a dialog, displays an alert, builds a record, assigns it to a variable, and returns the variable to the calling process. But it’s literally bog-standard AppleScript. No special escaping or quoting needed; multiple lines are fine. You could copy everything but the shebang line into the macOS Script Editor, and it would work fine. (I literally did that; it did in fact work correctly.)

If you call it from PowerShell, it works as expected, nothing funky:


firstword:This, secondword:Is, thirdword:a, fourthword:list

So you can return basic data types, numbers, strings, records, lists to PowerShell from AppleScript and then use that within PowerShell. Which creates some complex potential for more than just one-off commands.
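Calling that script from PowerShell is nothing special; you run it like any other executable. A quick sketch, assuming the script above was saved as osatest.sh (my name for it, nothing magic): either make it executable so the shebang does the work, or hand the file to osascript directly.

```powershell
# Option 1: make the script executable so the shebang does the work
/bin/chmod +x ./osatest.sh
$results = ./osatest.sh

# Option 2: skip the executable bit and pass the file to osascript
$results = /usr/bin/osascript ./osatest.sh

# Either way, $results now holds the returned record as a string
$results
```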

You can even have some back and forth, wherein you use the one-off commands to get information from the user to build a proper .sh script that is run against the AppleScript runtime, do things in other apps, like the Finder, Word, Photoshop, <other scriptable thing>, and then return usable results to PowerShell.
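A sketch of what that back-and-forth could look like (the file path and dialog text here are made up for illustration): use a one-off command to get input from the user, build a multi-line script from the answer, write it to a file, and run that file against the AppleScript runtime.

```powershell
# Ask the user something with a one-off command; the reply comes back as
# "button returned:OK, text returned:whatever", so grab the text portion
$reply = 'display dialog "Alert text?" default answer ""' | /usr/bin/osascript
$theText = $reply.Split(",")[1].Split(":")[1]

# Build a multi-line AppleScript using the answer
$scriptText = @"
display alert "$theText"
return "done"
"@

# Write it out and hand the file to the AppleScript runtime
Set-Content -Path /tmp/generated.applescript -Value $scriptText
$results = /usr/bin/osascript /tmp/generated.applescript
```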

I’m a bit busy for the next month or so, but I’ll try to get some examples going to show just how neat this can be. I’m also going to start using the tag “powershell-applescript-bridge” to help folks find these posts easier.

Using AppleScript’s “Display Dialog” with PowerShell

Okay, I didn’t have a clever title. Sue me.

One of my biggest complaints about PowerShell is the lack of UI primitives. Coming from AppleScript, I’m used to having a variety of very basic UI options for the using, without needing to do anything to set them up. Need to get a simple response from the user? Display Dialog. Want them to choose from a list? Choose From List. Choose Folder, Choose File, etc. They’re all a part of core AppleScript.

You can eventually do the same with PowerShell, but even on Windows, there’s a lot of .NET setup and silliness you have to deal with. Which is dumb, and something the PowerShell team should address at some point. But that’s way off in the future, if ever. So I set about solving this for myself, and came up with a method that seems to work, albeit in an awkward way. The sample script and readme are up on my GitHub site, and it’s not all that complicated. I spend more time shoving the results of the command into a hashtable than I do running the command.

I used “Display Dialog” because it’s a simple command that is fairly representative of most AppleScript UI primitives. It’s a single-line command with various parameters, and it returns a record (hashtable in PowerShell parlance) as a comma- and colon-delimited string, at least as far as using it this way is concerned.

This uses the osascript command to run the AppleScript command(s). One thing that failed miserably was trying to use Invoke-Expression -Command for this. Primarily, this seems due to the complex use of single and double quotes within osascript that causes PowerShell to vomit all over itself. However, if you pipe the command to osascript, then it works alright, as seen below:

$results = 'display dialog "this is a test" default answer "default answer" with icon caution'|/usr/bin/osascript

This runs display dialog as expected, and gives you a single, comma-delimited string for the record returned. Each item in the record is separated by a comma, so the initial return looks like:

button returned:OK, text returned:default answer

Okay, that works, and it’s consistent. The next thing I do is split the string into an array with the Split() method:

$results = $results.Split(",")

So if all you want is each record in its own element, you can stop there. For my case, I wanted the return in PowerShell to be a hashtable, aka AppleScript record, so there’s a couple more steps. First, we iterate through $results to look at each item separately. Since the second item in the array is going to have a leading space, we run each item through Trim() to remove leading/trailing whitespace characters.

$result = $result.Trim()

Then I take each item and split it on the colon, which gives a two-element array of name and value.

$temp = $result.Split(":")

From there, I take the elements of the array and insert them into the hashtable:

$dialogReply.Add($temp[0], $temp[1])
If you’re dealing with truly large returns/strings, this would be a suboptimal method. But, AppleScript display primitives don’t really return massive amounts of data, so it’s not a big deal. Here’s the entire code block for the whole script below. The first line creates the dialogReply hashtable:

$dialogReply = [ordered]@{}
$results = 'display dialog "this is a test" default answer "default answer" with icon caution'|/usr/bin/osascript

$results = $results.Split(",")

foreach ($result in $results) {
     $result = $result.Trim()

     $temp = $result.Split(":")

     $dialogReply.Add($temp[0], $temp[1])
}

Then to display the hashtable, just call $dialogReply and you get:


Name                Value
----                -----
button returned     OK
text returned       default answer

This should work with a large number of, if not all, the AppleScript primitives. I’ll be looking at playing with different ones as I have time.
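Since every primitive hands back the same comma-and-colon format, the parsing steps above are easy to wrap in a helper function. This is just a sketch, and the name ConvertFrom-AppleScriptRecord is my own invention, not anything built in:

```powershell
# Sketch: turn "button returned:OK, text returned:default answer"
# into an ordered hashtable
function ConvertFrom-AppleScriptRecord {
     param([string] $recordString)

     $record = [ordered]@{}
     foreach ($item in $recordString.Split(",")) {
          $temp = $item.Trim().Split(":")
          $record.Add($temp[0], $temp[1])
     }
     return $record
}

# Usage with any one-line primitive:
$results = 'display dialog "this is a test" default answer "default answer"' | /usr/bin/osascript
$dialogReply = ConvertFrom-AppleScriptRecord $results
```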

From My Heart and From My Hands

It’s ALIVE!!!

So the script I talked about here is now an actual importable PowerShell module. It wasn’t hard, but it was a pain in the ass, more than I think it should have been. Most of this is because, unsurprisingly, almost all the documentation on modules is highly Windows-centric. So hopefully, this post will help with that.

This is a very basic module. It exports no cmdlets, just a single function. Which means this post is not going to be a huge help for some gigantic thing, but it hopefully is a start.

The first thing you want to do is make sure you have a working script, aka a .ps1 file. Doesn’t have to be complicated, Get-MacInfo really isn’t. It just grabs a bunch of info, shoves it into a hashtable and shows you what it found. Once your basic code and logic is working, you’ll want to copy the .ps1 file to a .psm1 file. That’s important in the PowerShell world, as that’s the traditional extension for a PowerShell Module.

Next, wrap all your code that isn’t comment-based help in a function. In my case, it was just: function Get-MacInfo {<code>}. Since it’s a single-function module, the only function is, well, Get-MacInfo. If you’re going to have multiple functions, including setter functions, you really want to read the pertinent MS docs, starting here. Understanding those will help. Once you’ve wrapped it in a function, you want to export that function so it’s available to PowerShell. To do that, add an Export-ModuleMember line to the bottom of your .psm1 file like so:

Export-ModuleMember -Function 'Get-MacInfo'

This is the only function I have, so I only need a simple line. If you’re exporting cmdlets or both functions and cmdlets…you’re probably well beyond what I know or can help with.
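For reference, the skeleton of a single-function .psm1 ends up looking like this (the function body here is a stand-in, not the actual Get-MacInfo code):

```powershell
# Get-MacInfo.psm1 -- skeleton only, the real body is much longer
function Get-MacInfo {
     # ...all the actual info-gathering code lives in here...
     $macInfoHash = [ordered]@{}
     $macInfoHash.Add("macOSDarwinVersion", (/usr/bin/uname -r))
     return $macInfoHash
}

# Make the function visible to whatever imports the module
Export-ModuleMember -Function 'Get-MacInfo'
```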

Next, we want to create the module manifest. This is fairly straightforward, but there are some gotchas that can make you a bit bonkers. The basic creation is simple: you use the New-ModuleManifest cmdlet. There are a lot of parameters you can use, but the only one that’s required is, I believe, the -Path parameter. Mine looked like this:

New-ModuleManifest -Path 'Path for where you want the .psd1 file to be created' -ModuleVersion '1.0' -Author 'John C. Welch' -Company 'Bynkii.com' -Description 'A macOS version of the Get-ComputerInfo module' -ProjectUri 'https://github.com/johncwelch/Get-MacInfo' -ReleaseNotes 'First module release' -HelpInfoURI 'https://github.com/johncwelch/Get-MacInfo/wiki'

That will create a basic .psd1 file in the location that you specify in the -Path parameter, and you’re almost good to go.


So there are a couple things you have to do manually. First, you have to actually tell the manifest what it’s referring to. If you open the manifest file (it’s not XML, just a PowerShell data file, basically one big hashtable), look for the # RootModule = '' line. MAKE SURE YOU UNCOMMENT IT. I didn’t, and lost my fool mind for a few hours. Put the name of the .psm1 file in between the single quotes. Mine looks like:

RootModule = 'Get-MacInfo.psm1'

Next, look for the FunctionsToExport line, make sure IT is uncommented (it should be) and put in the name of the function(s) you’re exporting from the .psm1 file. Since I only have the one, mine looks like this:

FunctionsToExport = @('Get-MacInfo')

Everything else should be okay as is. Once you have those two files set up, you want to put them in the right place. You can put them anywhere, but putting them in the paths that PowerShell knows about makes your life much easier. On my machine, since I’m running the 7.1 preview, my paths (and the command to show them) look like this:

$env:PSModulePath -split ':'
~/.local/share/powershell/Modules
/usr/local/share/powershell/Modules

If you only want to have the modules available to a single user, put them in the home directory Modules folder. For everyone on the machine, put them in one of the others. I have specific install instructions on the GitHub wiki for the project, so I won’t belabor those here.

If you put the module in the right paths, running Get-MacInfo will automatically import the module, and it will just work. If you put it somewhere else, you have to deal with manually managing Import-Module, and that’s on you.
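If you do put it somewhere nonstandard, the manual import is at least just one line; the path here is obviously an example, not where it has to live:

```powershell
# Manual import from a nonstandard location (example path)
Import-Module /Users/Shared/Modules/Get-MacInfo/Get-MacInfo.psd1

# Confirm it loaded
Get-Module Get-MacInfo
```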

Please note this is a really simple project. Modules can be really, really complicated and include binary files. This guide will be of no help at all for those. But if you’re just getting started, this may be of use. In any event, if you want to use Get-MacInfo, it’s on GitHub, with the install instructions.

PowerShell Fun

As some of you may know, I’ve been dabbling in PowerShell for some time now, and have started trying to make it more useful on the Mac. There’s a lot that you can do on Windows that should work on the Mac and as of yet does not. One of the sillier gaps is “Get-ComputerInfo”, which on Windows shows you all kinds of neat things about your computer, and doesn’t exist in the macOS version of PowerShell. I thought this was a shame, so I set about trying to replicate this functionality, so that you get at least some of it. (There are a lot of Windows things that don’t make sense on the Mac, so I didn’t worry if I didn’t get those.)

The name of the script is “Get-MacInfo” and it’s available from my GitHub site. It’s a pretty simple script that should work under current versions of PowerShell. I’ve built and tested it under the 7.1 preview versions, but there’s nothing in there that shouldn’t work under 7.0. Prior to 7.0, no idea.

The script itself pulls in data from a number of sources, including uname, sw_vers, system_profiler, osascript, sysctl, and some built-in PowerShell functions. It dumps them all into a hashtable so they can be displayed and retrieved more easily. If you run the script without any parameters, you get the full table dump, as seen here:

Name                           Value
----                           -----
macOSBuildLabEx                17.7.0:
macOSCurrentVersion            10.13.6
macOSCurrentBuildNumber        17G12034
macOSProductName               Mac OS X
macOSDarwinVersion             17.7.0
SMCVersion                     1.70f6
HardwareSerialNumber           ***************
HardwareUUID                   ********-****-****-****-************
HardwareModelName              MacBook Pro
HardwareModelID                MacBookPro8,3
CPUArchitecture                x86_64
CPUName                        Intel Core i7
CPUSpeed                       2.2 GHz
CPUCount                       1
CPUCoreCount                   4
CPUL2CacheSize                 256 KB
CPUBrandString                 Intel(R) Core(TM) i7-2720QM CPU @ 2.20GHz
L3CacheSize                    6 MB
RAMAmount                      16 GB
AppMemoryUsedGB                9.9869
VMPageFile                     /private/var/vm/swapfile
VMSwapInUseGB                  1.5210
BootDevice                     /dev/disk1s1
FileVaultStatus                Off
EFICurrentLanguage             English (United States)
DSTStatus                      True
TimeZone                       America/New_York
UTCOffset                      -05:00:00
DNSHostName                    ********.local
LocalHostName                  *********
NetworkServiceList             Ethernet, iPhone USB, Wi-Fi, iPad USB, FireWire, Bluetooth PAN, Thunderbolt Bridge
CurrentUserName                ******
CurrentUserUID                 ******
CurrentDateTime                5/20/2020 8:42:19 PM
LastBootDateTime               May 7 17:26
Uptime                         13.03:16:50

If you just want one or more of the parameters, then you’d supply those as a comma-delimited list, e.g.:

./Get-MacInfo.ps1 RAMAmount,NetworkServiceList,macOSCurrentBuildNumber                
Name                          Value                                                                                               
----                          -----                                                                                               
RAMAmount                     16 GB                                                                                               
NetworkServiceList            Ethernet, iPhone USB, Wi-Fi, iPad USB, FireWire, Bluetooth PAN, Thunderbolt Bridge                  
macOSCurrentBuildNumber       17G12034                     

The script itself takes about a second or two to run, as it collects all the info first, then displays what you want. Yes, that’s wasteful to some, but a) it takes less than two seconds to run on ten-year-old gear and b) I’m lazy. You’re welcome to improve on it.

Actually, that last part is serious. I didn’t get every possible parameter; I can’t think of them all. The script itself is extensively commented, so even a novice should be able to figure out what’s going on and add to it. I’m also not that clever with output formatting, as you can see from the code.
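To give you an idea of what adding to it looks like: since everything ends up in one hashtable, a new parameter is mostly one more Add() call. A hypothetical sketch, where the key name and the hashtable variable are illustrative, not lines from the actual script:

```powershell
# Hypothetical addition: grab the logical CPU count via sysctl and
# add it to the hashtable the script builds up
$threadCount = /usr/sbin/sysctl -n hw.logicalcpu
$macInfoHash.Add("CPUThreadCount", $threadCount)
```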

Anyway, now that I have the parameter input done, my next step is to turn it into a proper PowerShell module so you can incorporate it into your system. Hopefully, that will be sooner than later, and I’ll try to put a post up about it when I can.

In terms of using PowerShell, as a language, it’s really very nice. I like it better than most, it’s really easy to get up and running quickly and there is a LOT of documentation and support from both Microsoft and third-parties. (HEY APPLE! YOU SHOULD THINK ABOUT STEALING HOW MICROSOFT DOCUMENTS THEIR AUTOMATION STUFF. IT’S REALLY USEFUL. HINT!!!!)

Visual Studio Code (VSC) is a really nice dev environment for the Mac, and you really only have to install like two extensions to get it working well with PowerShell and GitHub, specifically “PowerShell” and “GitHub Pull Requests and Issues”. Both are easy to use and in conjunction with VSC give you most of the tools you need to get things done.

My biggest complaint about PowerShell is the complete lack of UI primitives, like simple dialog boxes and lists. Building those without external modules is just ridiculously tedious and kind of inexcusable. Come on Microsoft, AppleScript has had “Display Dialog” and “Choose from List” for how many decades? It’s just silly that a modern language doesn’t let you create simple UI elements without a gob of setup code.

My second biggest is that its OS integration on macOS is pants. I mean, look at the reason for this post. What would be neat, and probably doable, instead of trying to replicate Apple’s existing scripting implementations in PowerShell, would be for the PowerShell team to build an event broker that could take commands from PowerShell, spit out Apple Events to the OS and other applications, and return the results to the script. I’m not saying that’s easy, but over time it’s probably less work to build a daemon that does that than to replicate many decades of work in terms of OSA languages.

Either way, PowerShell on macOS is quite usable, and y’all should give it a try. The instructions for doing so are here, and you can do it via either Homebrew or just downloading the installer from MS.

It’s A Trap!

Why “If it ain’t broke…” needs to be abandoned

Before you read the rest of this, I want you to read this: https://www.us-cert.gov/sites/default/files/publications/AA20-133A_Top_10_Routinely_Exploited_Vulnerabilities_S508C.pdf

It’s not terribly long, or hard to read, but I want to note some things. First, out of ten listed items, four of them pertain to Office 2007. That’s over 13 years old, yet evidently in widespread enough use to still be a problem. There are items for Windows Vista on that list.

It’s not just Office. There’s people using Apache Struts versions that are three years old. The last version of Visual FoxPro was in 2007. Versions of Flash that are years old. Flash. One of the biggest security holes in the world, and people are a) still running it and b) running old versions. Over and over, and over…old software. Not months old, but *years* old. Many years in some cases, and I guarantee you one of the justifications for this?

“It still works”

Followed by:

“It’s too much work to update”

There’s more, but you see the point. All of it tends to be driven by the “If it ain’t broke, don’t fix it” philosophy, which is usually driven by a desire to reduce costs, either labor or fiscal. But does it work? Really? I have severe doubts that it ends up saving time. Because when you hang on to old software across that many years or decades, the cost to update, to move on, gets higher every year. I mean, I hope no one is still running critical software based on Visual FoxPro, but if they are, the cost premium to move those applications off 13-year-old database software to something modern? Supported? It won’t be small. It’s never small.

And before I hear the “you don’t know…” nonsense start up, yeah Seymour, I do know. I know real well, I have since the mid 1990s. I’ve known since I moved from DOS to Windows to OS/2 to Windows. I’ve known since I moved from System 6 to System 7 to MacOS 8 to MacOS 9 to every version of OS X and macOS since. Solaris, ditto. AIX. OS/400. Linux. I’ve known on a loooot of platforms. In a lot of different industries. At a lot of different levels. I know, all too well.

But that’s part of IT, of tech. You have to stay current. Yes, that’s a pain. Y’all, I’ve been in the Mac ecosystem for a long time, I am well-aware of how people can expect that vendors support old versions of software forever for no cost whatsoever. But that can’t happen. Codebases have to be updated to stay current. That means moving on. It means accepting that even if it’s working correctly, you may have to change it, or “fix” it, or update it. Because if you don’t, you find yourself with ransomware. Or a dead server. Or network.

You have to stay current.

Part of not being “broke” has to include “up-to-date”. Even when that’s inconvenient, or expensive. And sometimes, that means having to move away from something that works well, and is easy to maintain. It means changing platforms. Sometimes it means moving from on-prem to cloud which is even more complicated, because now your security posture, your DR/BCP, your procedural stuff, all of it gets so much more complicated. Deployment gets easier with cloud-based software, sometimes, but other things get worse.

And yes, I get it. Sometimes, there’s stuff you have to stick with longer than you want. But there are ways to manage that too. Virtualization is your friend there. Yes, that’s more complicated, but again, look at that list. Look at the sheer amount of years-old software. Sometimes decades-old. I have extreme, extreme doubts that many, much less most, of those Office 2007 installs are necessary, that there’s a hard reason that can’t be overcome.

This can’t continue. This isn’t a case of Company A’s network being completely disconnected from Company B’s like it was when I started in IT. With rare exception, every network is connected, on some level, to every other network. The cheese does not, in fact, stand alone. As well, everything happens too fast for humans now. A ransomware attack, a DDoS attack, they happen far too fast for human reaction and response speed. A great example is the Maersk attack from a few years ago. There is no way for humans to respond at all to a modern attack, much less well. I don’t care how smart you are; you’re a human, you’ve evolved in a world where milliseconds are your speed limit. Meanwhile, the computing world is operating in nanoseconds. You’ve lost before you start.

You have to stay current.

If you don’t stay up to date, even with all the pain that can involve, you’re at risk. If you have customers, their data is at risk. If you’re a medical company in any way, that data is at risk. I’m not just talking about someone stealing the data either. What happens to a hospital when their entire EMR system is encrypted by ransomware that has no unlock key? Where you pay up and the thieves run off with their bitcoins and you’re screwed. Backups can help there, but if you’re not planning on how to do network-wide restores, that’s still a non-zero amount of time, and depending on your industry, even if there’s no data exfiltration, you still have a lot of explaining to do.

I am also aware that often, the software you use is based on multiple packages, and those vendors don’t always keep up to date, or vet the sources they use when building their software. Open source can be even more annoying, because “you have the source just build it yourself” is so often the refrain of lazy, well, assholes, who want the fun part of being a software vendor, the cool parts, but not the boring, dull grind that is support and updates to avoid problems.

“Just build it yourself” only works if you have the ability, time, and skills to do so. As well, if I’m going to do all the work to build and maintain software, why would I pay someone else to do it? There’s an economic downside to being an asshat here; people should stop. If my company is an auto repair shop, I’ll do your work for you for free when you show up and fix my customers’ cars for free and pay my rent. (Funny how no one ever takes up that offer. They get all “MY time has value” about it. Well Seymour, so does mine.)

Customer support, software support, sucks. The economics, in most cases, are awful. I read a while back that the third support call on a given piece of software effectively erases any profit made from that sale. Subscriptions may have changed this, but not by much. It’s tedious, but that’s part of the gig. Which means keeping all your stuff current, even that really cool framework you got from someone’s site. Which can be really hard when the person who made that framework decides “nah, I’m done” and punches out.

(There are many reasons why I am not convinced the “build your software from 49087435 remote distribution sites” model is as good as people think. The above is one of them. So is security.)

It’s even worse when you consider the lackadaisical approach to secure coding in CompSci/coding bootcamp programs. Too many effectively ignore it beyond trite warnings, and a big chunk of the others have a token class or lecture on it. It’s treated like a bolt-on rather than an integral part of coding starting at the most basic levels. I sometimes think ethics gets taken more seriously.

But you can’t do anything about that. You can’t fix how the people writing your software were taught, how they view things. You have no power there, other than that of the checkbook (assuming you’re paying for your software.) If you do have a “I’m paying you money” relationship, then you have more power than you think. Don’t be an ass about it, but if you’re paying money, then I think you have a reasonable expectation that the company/person accepting your money will keep their stuff up to date so you can as well. But that does mean that when they update, you have to update too.

(No, not immediately. Did I say be stupid? No, I did not. But 13 years is not a reasonable amount of time to test an update.)

Ultimately, if you don’t want to be on the “list of companies that got hacked because of old stuff”, you have to stay current. There’s no other option.

Speaking of ARMs and Legs…

Okay, it’s an awful title, whatever.

Once again, the “OMG ARM-BASED MACS ARE COMING” news is making the rounds. In the past, I’ve been fairly dismissive of this, because it’s been kind of ridiculous for a number of reasons.

But lately…

So looking at it logically, and based on actual past data from the previous CPU architecture changes Apple’s done with the Mac, there are a number of things that have to be as close to zero-effort as possible:

Application support: This should go without saying, but these days, assuming is a bad idea. The last time Apple moved chip architectures, when they moved to Intel, it was…well, it wasn’t painless, and it obviously wasn’t overly painful. It worked more often than not, and really the only major initial casualties were Classic Mac apps that people were still using. To the best of my knowledge, Apple never even tried to get Classic working on Intel, at least not seriously, and honestly, that was a smart decision. Classic was done, it was time to move on. That left a transitionary period for PowerPC apps to move to Intel, which by and large, Apple handled well. Or at least well enough. There are a few apps that didn’t make it, but honestly, I think the knifing of 32-bit support in Catalina probably caused more code casualties than the removal of any support for PPC apps. That transition took longer than the 64-bit-only one did, and by the time it happened, PPC-only apps were ones that were never going to move.

So is there currently application support for ARM? Mostly. Starting with 10.14 and accelerated by 10.15, the ability to have the same app code run on ARM (i(Pad)OS) and Intel (macOS) is pretty far along. It’s not completely there yet; the gaps are mostly at the UI level. SwiftUI is very much an i(Pad)OS-first, macOS-second tool. Every piece of macOS code I’ve seen come out of SwiftUI is very clearly an i(Pad)OS app in appearance.

That’s not to say SwiftUI can’t be improved; I mean, it’s not even a year old yet. But at the moment, would I use it for complex macOS apps that fill UI needs you just aren’t going to have on iPads or iPhones? No. But it’s improvable.
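To make the “same code, both platforms” point concrete, here’s a minimal sketch (mine, not from anything Apple ships; all names are illustrative) of the kind of SwiftUI view that compiles unchanged for iOS, iPadOS, and macOS targets:

```swift
import SwiftUI

// A minimal, illustrative sketch: this same source compiles for
// iOS, iPadOS, and macOS targets without modification. The view
// and property-wrapper APIs are shared across all three.
struct GreetingView: View {
    @State private var name = ""

    var body: some View {
        VStack(spacing: 12) {
            TextField("Your name", text: $name)
                .textFieldStyle(RoundedBorderTextFieldStyle())
            Text(name.isEmpty ? "Hello!" : "Hello, \(name)!")
                .font(.headline)
        }
        .padding()
    }
}
```

The catch is exactly the one above: this compiles everywhere, but on the Mac it still looks and behaves like a ported iPad view. Dense toolbars, multi-window document workflows, menu bar integration, the desktop-only stuff, is where SwiftUI, circa 10.15, runs out of road.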

Apple’s also spent the last decade or so building developer tools that not only let devs not have to get down into the iron as it were, but have actively discouraged it. The imminent death of kexts is another example of this. The more distance between code and the underlying hardware you have, the less you care about the underlying hardware. If the frameworks all handle the low-level stuff for you in the ways you need, why waste time fiddling other than to prove you can? Even complex apps don’t have the same need for hand-tuned assembly that they once had.

The apps aren’t there yet, but man, it’s close, and I think you’ll see it get a lot closer in 10.16.

Hardware Performance: Yeah, this is there. In fact, it may be more of a jump in performance than going from PPC to Intel was. At this point, it’s not even close, and given the crazy performance ARM provides in resource-limited things like iPhones and iPads, ARM in a MacBook Pro? A Mac Pro? Hell, a Mini? Come on, this will be crazy fast. The GPUs might be an issue, but I can’t see AMD or even Nvidia deciding to tie their fortunes to Intel on that scale. As well, Apple’s internal GPU design teams have been doing amazing work, again, in environments where power and thermal headroom are really limited. In a Mac Pro?

They’ve gone plaid!

This isn’t even a serious question. The ARM architecture has been performing the way Intel wishes it still did. “Redonkulous” is a word I would use for it. That’s important because…

Virtualization/Boot Camp: Honestly, this is the only real question in all this. If Apple goes to ARM, whither Boot Camp and virtualization with acceptable performance? Anyone thinking this is a minor feature is…I will be kind and say they are thinking suboptimally. Being able to run Windows/Linux/*BSD/etc. on the physical hardware, or at acceptable speeds in various hypervisors, is a serious feature the Mac has. It’s not always as easy as it should be, but it is there, and it is something a lot of people use. Losing it would force a lot of people off the platform, especially in corporate quarters, an area where the Mac has done very well over the last few years.

It’s an important factor, a critical question, and one that Apple can’t just “sort of answer.” It would have to solidly answer it, and demonstrate that answer, should the announcement of ARM Macs be made.

So is Apple going to announce ARM Macs at the 2020 WWDC? Maybe, but I’d not want to put money I cared about on it. 2021? That might be different based on the WWDC, and I think anyone interested in this issue in a serious manner needs to spend quality time with the WWDC sessions this year. Not just the keynote, but pay particular attention to the “State of the Union” sessions, especially the things scrolling by in the background during the “other features we don’t feel like talking about” part. Yeah, it’s a bit of Kremlinology, but those have been pointers before.

I think ARM Macs are far more of a possibility than they were a year or two ago. But I think 2021 is the more realistic year for them.

Well Isn’t That Just Dandy

Yes, yes it is

So for some of you following me on Instagram/Facebook/Twitter, you’ve noticed the tie/vest pics I’ve been posting. Honestly, it started as a lark, but folks seem to dig it, so cool.

If you’re wondering where I get stuff, to date, that’s been easy: The Men’s Wearhouse Clearance rack. Seriously. I get really good, quality shirts/vests/ties for cheap. Cheaper than Target, and much better quality. Like shirts and/or vests for $12. Pants, meh, I get those at Costco.

The tie knots come from tutorials at a couple of places. https://agreeordie.com/features/fashion/616-how-to-tie-a-necktie-eldredge-knot/ is a good start. They have videos and diagrams. For a lot of the others, it’s videos or nothing, and honestly, for some of them, even a good diagram doesn’t work that well. There’s one guy who’s my go-to on these, Patrick Novotny. He’s engaging, and he does a really solid job of showing you how to tie some of these really complex knots in a way that doesn’t leave you falling behind or floundering.

I’ve found some good potentials for vests, so as I add to the collection, I’ll show them off. But as a start, Men’s Wearhouse is solid.

It Is Not Your Place

At some point, men have to stop being jerks

First, read this Twitter thread, and its associated sub-threads. It’s not a pleasant read, and if you feel called out by it, good, you should. I genuinely hope you are feeling a good amount of shame and guilt, as the behavior it talks about is bad, and anyone engaging in it should feel bad. The idea, the mental breakdown that passes for a coherent thought, that a random guy has the right to go up to a stranger and “pop quiz” her on self-defense, to “educate” her on how she’s being unsafe is…there are not enough letters in “appalling” to properly get across how appalling this is.

First, it’s not your damned place to do that. It never is. And those of us not engaging in this awful behavior, regardless of where we identify on the gender spectrum, wish you would stop. Yesterday. Just…just what kind of egotistical, self-centered, full-frontal idiocy has taken residence in your brain to where you think it is your job to require a random woman to perform for you? I want it to be brain worms, or an undiagnosed tumor, because that way, it’s not fully you doing it.

It is never brain worms or an undiagnosed tumor. It is however, as neat an example of male privilege and patriarchal behavior as one could wish for. The assumption that any woman exists for you to “educate” on some random subject. I mean, there is an unintended benefit. Everyone witnessing this will indeed be aware of how dangerous men are. Starting with you. Because what you are doing is attacking a random woman.

If you wish to raise “but I meant well”, save your breath, and your effort, your intent is meaningless. In fact, it makes things worse, because you actually think, according to your intent, your stated intent, that you are behaving not as an attacker, as someone to treat with extreme wariness and a healthy amount of fear, but as a “friend”, as someone doing someone else a favor. You are literally so…stupid, so self-centeredly, preciously, stupid that you think you are behaving in a societally beneficial way.

You are not. In fact, the only difference between you and a rapist is the lack of sexual assault. You have, however, still committed several crimes; of this there is neither doubt nor argument. When you come at someone, when you touch them or verbally engage with them in a threatening manner, deliberately designed to trigger a fight-or-flight reflex, you are committing crimes. You are a criminal. In a just world, you would be arrested and tried as such. But given how (American) society (at least) treats such paternalistic fuckwittery, you’d instead get a pat on the back and encouragement.

“Gee, why do so many women hate men.”

You. It’s because of you, and your ilk, and I rarely use the word “ilk” with such satisfaction as I do here. You and your ilk are why. If “asshole” could be a gender, so many people would make you its spokesperson. It is not women’s job to exist in a perpetual state of battle readiness just to satisfy you. It is your job to not be an asshole. Alas, we’ve already seen you’re a failure in that arena.

As you are so determined to be “that guy”, allow me to point out that you are also lucky beyond measure, because if one of the women you attacked were actually trained or were actually concealed carrying, you know, the things you (don’t really) want them to be so they can not be (very much be) helpless against “bad” guys (like you), then you would be at best getting a ride in an ambulance, if not a hearse. Let us look at the scenario with the woman who was locking her bike. (for background, I have more than two decades of experience with martial arts/self-defense and am a currently active practitioner. In the last year I’ve had multiple bruised ribs, at least one fracture, and a possible fracture of another bone. Or two. I am not an idle observer in this area.)

I’m kneeling down to lock up my bike on a busy corner; a man steps over the center bar of the bike and onto the shoulder strap of my bag. He then explains, while I’m immobilized, that this was a little free safety advice. No shoulder bags!

So, there are two positions that “kneeling” can apply to; I’m going to go with the most common, on one knee, with one foot on the ground. The woman has some form of shoulder bag, probably in an over-the-shoulder, ’round-the-neck, under-one-arm configuration; that would be the easiest way to carry a shoulder bag on a bike. (We’ll assume bicycle, given her description.) She also, given her activity, has either a rather large U-lock or a reasonably heavy padlock/chain combination.

Dipshit comes up, probably from behind, and steps on her strap, probably right where it meets the bag, that would be the easiest way for him to pull off this bugfuckery. She’s helpless now, right? Nope.

Since she can’t easily turn, her first priority is to gain some freedom of movement. As the pressure on the shoulder strap would be pulling her down and to one side, her best bet would be to use the foot already planted and shove as hard as she can into the foot/lower leg of her attacker. She only needs to move his foot a bit to get some free space. Even against a large man, this would be effective, especially given our Dudley-Dumb-Right here.

Once she gets his foot off that strap, the variables increase. She can pull back the other way, creating more space between her and her attacker, possibly getting the bike or whatever between her and him (a good choice; it gives her more overall options). If she’s really successful, and more mad than scared, then he’s stumbling as well, and she has a very heavy, very hard weapon in her hands, a solid case of self-defense, and a chucklehead about to have the worst day ever. If he’s lucky, she only hits him in the nuts with the lock. If he’s unlucky, she lays his ass out with it. Possibly permanently.

If she’s got a good grip on the chain, since she knows where at least one leg is, and a quick look will verify the location of the other, then she has a really good triangulation on where his skull is. She also has a chain, with a padlock of some kind on it, and one functioning arm. Even a near-direct hit will work for her immediate needs.

See, here’s the thing these yutzes never think about: while he knows he’s “helping”, she, dear reader, she does not. She only knows that someone is attacking her. And I know women, such women, that would turn that attacker into fuckin’ hamburger, especially if they have such a weapon as a bike lock.

But that’s the benefit of patriarchal privilege, isn’t it? To be so sure, solely based on observed gender differences, that you can attack a woman and “reasonably” expect not only that she won’t hurt or kill you, but that she’ll thank you. That is privilege writ large; it is the most perfect example of privilege one will ever find.

It is my fondest hope that the next time one of these preciously privileged anthropomorphized sphincters pulls something like this, that he does in fact do so on the wrong woman, and she only stops hitting him because her arms and legs are tired. Then, maybe, calls the cops. Or better yet, just goes on about her day and leaves him to the mercy of other random strangers.