Embracing IT alternatives

Are system administrators becoming mere users of technology?

Posted on: 2025-11-25

A few days ago I came across a Reddit thread where someone was asking for advice on moving away from Microsoft online services like Entra ID and Office 365. The premise was that their organization was now managed by a foreign owner who didn't want to be tied to US tech giants. What surprised me, however, was the overall response from the top commenters, which basically boiled down to: if you're not using Microsoft's services, you're doing it wrong. I will admit that Microsoft has become the default option for things like authentication and device management in the enterprise, but the number of people who flat-out couldn't think of an alternative solution was a bit shocking.

Being in IT used to mean managing many different devices and software stacks, using ingenuity and experimentation to come up with the best solution, and focusing on the actual use case rather than defaulting to a specific set of large providers. The IT landscape has consolidated to an extreme degree over the past decades, whether in hardware, operating systems, software or cloud offerings, to the point where many system administrators would be hard-pressed to name more than three cloud providers, all US-based of course.

As someone who worked with computers back in the 1990s, I remember a world of MS-DOS, Windows NT, Novell NetWare, FreeBSD, Linux and more. Networking involved actual switches, routers and cables. Setting up an email account could mean dealing with Sendmail, Postfix or Microsoft Exchange. Updating a DNS entry meant editing your BIND9 configuration file. Even directory services, which Microsoft's Windows Server platforms took by the reins very quickly, still left room for options like LDAP and Samba. This isn't to say that I wish we could go back to the good old days, or that new service offerings don't bring significant improvements over the older options, but IT work felt a lot more meaningful in the 90s than it does now, when all of the actions I've listed above basically amount to clicking some buttons in your Azure or AWS portal, or pushing a Terraform commit. And if something breaks? Your only recourse is hoping that the Tier 1 support person (or AI chatbot) on the phone at least has a decent checklist to follow, since all of those interfaces are proprietary and closed source.
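
To make the contrast concrete, here is roughly what adding a DNS record looked like then versus now. First, a minimal BIND9 zone file sketch; the example.com domain, hostnames and the 203.0.113.10 address are made-up placeholders:

    $TTL 3600
    @    IN  SOA  ns1.example.com. hostmaster.example.com. (
                  2025112501 ; serial, bumped on every edit
                  7200       ; refresh
                  3600       ; retry
                  1209600    ; expire
                  3600 )     ; negative-caching TTL
    @    IN  NS   ns1.example.com.
    www  IN  A    203.0.113.10   ; the record being added

And the modern equivalent, sketched as a Terraform resource for AWS Route 53, again with made-up names. The record is declarative either way, but here the name servers, the API and the state all live with the provider:

    # Assumes an aws_route53_zone.example resource is defined elsewhere.
    resource "aws_route53_record" "www" {
      zone_id = aws_route53_zone.example.zone_id
      name    = "www.example.com"
      type    = "A"
      ttl     = 300
      records = ["203.0.113.10"]
    }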

Being in IT means being a problem solver. System administrators are craftsmen: we're here to solve problems and to provide the best IT infrastructure we can to the organization. But as IT workers we don't control the budget or purchase decisions. We're often handed a directive and told to implement it, regardless of any objection. And for the past few years, Big Tech companies have been extremely successful in wooing corporate executives. Trying out the cloud became cloud-first, then cloud-native, and for many organizations it went all the way to cloud-only. I can't blame IT workers for short-sighted executive decisions to go in a specific direction, although there does seem to be a shift happening lately, with some businesses starting to leave the cloud.

The bigger problem, in my opinion, is that a lot of IT workers seem to have lost the knowledge, the skills, and even the desire for a more competitive ecosystem. Administrators went from being builders to being operators, and finally to being users of a SaaS product. When everyone follows the same line of thinking, it gets reflected in online forums, social media and education, and soon you'll be hard-pressed to find anything related to user authentication or device management that doesn't automatically lead to a Microsoft, AWS or Google documentation page. Hence the bias I observed in that Reddit thread. This is where being a life-long learner pays off, and I'm thankful that, after more than 20 years in the industry, I still find playing with technology late at night fun.

And I know this may come across as an old guy shouting at the cloud, but there are very good reasons why this is a bad trend, especially among younger IT workers. First, consolidation leads to single points of failure: when a large part of the Internet relies on a handful of companies, incidents like the recent Cloudflare outage, the Microsoft 365 one, or the earlier AWS US-East-1 outage should prove that point. With AI being pushed into everything, this centralization is only going to get worse, since most organizations don't have the skills or resources to train their own models, and people will demand access to ChatGPT and other productivity tools. That doesn't mean the cloud is bad, but a hybrid approach works best in many cases.

Then there's the fact that all those companies are US-based. I won't repeat what I wrote in a previous post, but IT is critical infrastructure, and our critical infrastructure shouldn't be in the hands of a small number of large US corporations. It puts your entire digital estate at the mercy of those companies, of US law, and possibly of the whims of the current US administration. This has implications for data sovereignty, privacy, attack surface, regulatory compliance, and so on.

The good thing is that modern alternatives do exist, and from my research they seem much more prevalent in European environments than in American ones. Using open source, using local providers, or simply building custom solutions may take more time and a bigger resource investment, but that's exactly how new products get created. If everyone defaults to a single product, innovation cannot happen. And while IT workers tend to have little say over corporate decisions, it would be a shame if our profession became nothing more than being users of technology, left wondering whether things have always been this way.