This is a featured article used with permission from the original author.

Please see the 'About the Author' section at the bottom of this page.

Linux/Unix people

Turbo Fredriksson
Lead Cloud Architect and veteran Debian Linux contributor

How has IT changed in the last forty years?


The short answer, if you don't have the time or inclination to read the whole article, is that it hasn't!


The Past


I'm now 56 years of age.
I started with computers when I was twelve or thirteen.
That was in the early '80s, when I managed to partly convince and partly trick my mother into buying me a Commodore VIC-20.
That's the predecessor to the famous C64!


It effectively had 3.5 kB (kilobytes!) of memory, but you could disable the “OS” and get more to use.
It had no disk drive; hard drives weren't even available for personal use at the time.
The ones that did exist cost a horrendous amount of money, and only the very richest institutions and companies could even consider buying one.


Floppies were really floppy!
8” in size, round discs in a square, thin cardboard envelope.
But I couldn't afford even that; a floppy drive cost at least a month's salary, and I didn't have a salary.
I was still in school.


“You can store your recipes in there”.


Yeah, well, she (Mom) was actually one of a small team working for Ericsson in Gothenburg (well, actually Mölndal, which is a sister city of Gothenburg) in Sweden, on one of the first, and at the time only, supercomputers in Sweden.
She managed to figure out how it worked and how to use it, and she wrote documentation for the rest of the team and for the people who needed to use it.
So she was already fairly computer savvy.


I learned programming, BASIC, and was fairly good at it.
Then the Commodore Amiga 500 came along – I missed the Amiga 1000; it was just too expensive.
It was targeted at the office, at that kind of price!
With its graphical, point-and-click user interface, it was many years ahead of everyone else – Windows 3.0 wasn't released until 1990.


So, I learned a different BASIC, finally outgrew that and learned C.
Unfortunately, the compiler I had at the time was buggy (or just a crappy implementation, we never really figured out which), so I learned pointers wrong :(.
I'm still suffering from that – it's hard to relearn something!!


Then Commodore got into financial trouble, and the Amiga 600, 1200 and 2000 were never game changers.
They improved the hardware, but not substantially enough for me.
So instead, I got a PC.
A 386.
But with that I could grow!
I could add more memory, hard drives had become affordable, and you could swap the screen or the case and keep the machine running for much longer.


With the 386, then the 486 and then the Pentium – I, II, and III, and I think I even had a Pentium 4 there for a short period – I got into Linux in the early 1990s (I think it was the fall of '91, but my memory is a bit fuzzy on that; it was a while ago).
I was introduced to it by some friends at the local computer club and loved it – it was free!
Completely free!
Something I had got used to with the Amiga and all the Public Domain software.
Now, those were free to download and use, but you didn't have access to the source code.
So not completely free.
But free enough for me at that time.


With Linux, I learned more about networking, real operating systems, multi-user systems, and got a lot better at C, because the GNU compiler (gcc) was actually working correctly.
I still had problems with pointers (which are really an absolute must to know inside and out – the whole language is built on them).


I worked as a teacher in and around Gothenburg for a few years, “teaching IT” (programming, networking, security, Linux administration etc).
I sucked at it!
Just because you're really good at something doesn't mean you can teach it!
(And by then I had already outgrown my betters – people who had started with old-school UNIX and had played around with Linux longer than me, though only by a few months; I started about eight or nine months after Linus released his first version on Usenet.)
But I figured it out and got reasonably good at it.
It's something I've received a lot of good remarks for from my managers over the years – “Turbo can explain very complicated subjects in a way that even I can understand”.
High praise, and one that I cherish more than anything.
Another: “Turbo is the sharpest knife in the drawer; we put him in places where everyone else fears to tread”.
Ok, so I sometimes let that go to my head, but as my grandfather used to say, “it's difficult to be humble when you're the best”.


But I'm getting ahead of myself.
After the teaching, I got into consulting.
Not really where I thought I would go, “it just happened”.
Somehow.
But I always ended up in very deep water, with no idea what I was doing or how to do what I was asked to do.
For a very long time I thought I was cursed, or that someone really disliked me and wanted to put me in my place.
It was VERY uncomfortable!


However, somewhere along the line, I started to .. appreciate it!
Funny how the mind works.
One impossible mission after another.
But I solved it – I learned what I needed to learn, figured out how to solve the problem, and then it was a fairly easy matter of .. “just doing it”.
That's around the time I learned the expression “work outside [of] your comfort zone”.
Well, that place became the only place where I could feel .. properly useful – “think not of it as a problem, but as a solution”.
Yeah, that is a lot of BS!
The problem is what makes life worth living!
The solution is in there, I knew that, I just had to roll up my sleeves and start digging.
Once I'd dug deep enough, there it was.
Always!


This isn't going to be a job application, so I'll leave a lot of it out.
But I've designed, built, racked and set up Cisco networks with hundreds of nodes for Ericsson.
I've designed and built large compute clusters with thousands of cores for AstraZeneca.
The list is quite long – 40 years is a long time!

The Present


So fast forward to today, 2025.
With all the experience I've gained from impossible missions in every single field of IT – almost every set of tools, languages and people interactions – I've been designing and building large cloud environments with hundreds, sometimes thousands, of machines and networks, with security designed to repel even the NSA (or at least it could in theory; I'm not ENTIRELY sure about in practice).


Multi-account setups with dozens of accounts in AWS, GCP and Azure.
Even a multi-cloud setup, with fallback to another cloud, is in the CV.
I have access to unlimited resources and compute power, and an unlimited amount of storage.
I have access to networks that dwarf anything I could even have dreamed of twenty or thirty years ago.
All at the tips of my fingers, within seconds or minutes, whenever I want.
As much as I want to!


So with all this access to unlimited resources, how could I even remotely say that what I do today is exactly the same as what I did almost forty years ago!?
Back then I had to petition the bosses and upper management for maybe hundreds of thousands of dollars for hardware.
Then wait weeks for it to be delivered, then weeks more to unpack it, rack it, connect it and install it, and then spend another few weeks making sure it all worked correctly.


How could that possibly be the same!?


Well, because I do exactly the same thing now.
Granted, I just use completely different tools.
They're faster, smarter, and just .. better!
But the idea behind them is the same – automate, automate and automate!


In “the old days” (a phrase I absolutely loathed when my grandfather started one of his life lessons with it when I was younger), we had to hand-roll everything.
We wrote shell scripts, with the occasional C binary and maybe some Perl (Python wasn't even a glimmer in its creator's eye at the time), to automate the install and setup of a database, or a network, or an Internet gateway, or .. whatever we needed to do.


These scripts were hardwired to one very specific use-case.
When the use-case changed, it was next to impossible to adapt them, so we had to create another set of automation for the new use-case!
VERY time consuming, although over time we managed to build up a “core set” that we could reuse.
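
To give a flavour of it, here's a minimal sketch of what such a hardwired script might have looked like. To be clear, this is my reconstruction for illustration only, not a real script from back then; the host name, paths and package are hypothetical stand-ins for the values we used to bake straight into the scripts:

    #!/bin/sh
    # Hardwired setup for ONE specific database server (hypothetical values).
    # Change the use-case, and most of this has to be rewritten from scratch.
    set -e

    HOST="db01.example.com"      # baked-in host name
    DATADIR="/var/lib/mysql"     # baked-in disk layout

    # Install exactly the packages this one machine needs.
    apt-get update
    apt-get install -y mysql-server

    # Prepare the data directory on the one disk layout we assumed.
    mkdir -p "$DATADIR"
    chown mysql:mysql "$DATADIR"

    # Start the service and enable it at boot.
    service mysql start
    update-rc.d mysql defaults

    echo "Database setup finished on $HOST"

Everything – names, paths, packages – lives inside the script itself, which is exactly why a new use-case meant a new script.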


Today, there's a lot of this already done – Ansible, Puppet, Chef, Terraform, CloudFormation.
There's loads of these to simplify things for you.
Whatever your need, you can just google your problem, and someone has infrastructure as code for you!
But if you look deep enough, they do exactly the same thing that we did, just soooo much simpler, better and faster!
AND more scalable!
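
As a small illustration (mine, not from any real setup – the AMI ID, instance type and key name are placeholders), the modern equivalent of weeks of purchasing, racking and cabling can be a single AWS CLI call:

    #!/bin/sh
    # Launch a server in the cloud in seconds instead of weeks.
    # The AMI ID, instance type and key name are placeholder values.
    aws ec2 run-instances \
        --image-id ami-0123456789abcdef0 \
        --instance-type t3.micro \
        --key-name my-key \
        --count 1 \
        --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=db01}]'

Tools like Terraform and CloudFormation then wrap calls like this in declarative code, so a whole environment can be recreated, changed or torn down on demand.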


So although the tools have changed and made things simpler and better, they do the exact same job – automating the creation and setup of infrastructure, as well as testing and installing software onto multiple machines at the same time.
Or staggered, if one prefers.
Scripting that would have taken months forty years ago can now be done in an afternoon.


The Future


However.
There is a set of technologies (plural!) in the works right now, tech that has gone just beyond the theoretical stage.
They're at a stage where they “kinda” work – experimental, and not quite ready for general consumption yet.
They don't work exceptionally well yet, but they do work!


And the primary one is AI.
AI is here, and it's here to stay.
Another saying that I partly don't like and partly like, depending on what mood I'm in, is “you can't put the genie back in the bottle”.


AI is exactly that – it can already produce code!
It's not very good code.
Actually, it's absolute crap.
But it is creating it!
If there's one thing I've learned in my 56 years on this wretched planet, it's that even something really bad will get better.
IF there's a need for it!


And there's absolutely a need for AI.
It will replace a large (!) chunk of IT personnel – and no, that's not a good thing, but it is a thing!
It will start with Ops, DevOps and developers!
Ops is probably the first to be replaced – computers are really good at prediction.
DataDog is proof of that.
I used it quite heavily in a big cloud project a few years ago, and it warned me in advance when something would go wrong.
It was right in every single case!
I don't know if they use AI as we know it today (this was before the big AI bandwagon; we called it “Big Data” then – same thing, without the “smartness”), but whatever tech they did, or do, use, it is very good at predicting what will happen.
It's just a matter of feeding it enough data and information about your setup, and giving it enough data points.
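
Just to show how far even the dumbest version of that idea goes – and this is a toy of my own, not how DataDog or any real monitoring product works – here's a least-squares fit over a handful of made-up disk-usage samples, extrapolated to predict the day the disk runs full:

    #!/bin/sh
    # Toy prediction from data points: fit a straight line through disk-usage
    # samples and extrapolate when the disk hits 100% full.
    # Sample format (made up): "<day> <percent-used>".
    printf '%s\n' '1 62' '2 64' '3 67' '4 69' '5 72' > samples.txt

    awk '
        { n++; sx += $1; sy += $2; sxx += $1*$1; sxy += $1*$2 }
        END {
            slope     = (n*sxy - sx*sy) / (n*sxx - sx*sx)  # least-squares slope
            intercept = (sy - slope*sx) / n
            full      = (100 - intercept) / slope          # day the line hits 100%
            printf "Disk grows %.1f%% per day; full around day %.0f\n", slope, full
        }
    ' samples.txt

Real products obviously use far richer models and far more signals, but the principle is the same: enough data points plus a model equals a warning before things break.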


Problem #1 – Better AI


There is one potential caveat here with AI.
There's not enough computing power in the world for it to be really (really!) good today.
The Nvidia GPUs simply can't be produced in enough quantity, and cheaply enough, for AI to get that extra edge.


Cue DeepSeek!
Where the big, multi-billion-dollar companies spent billions of dollars to develop their AIs, the Chinese did it with a few million.
AND it didn't require highly specialized hardware!


Problem #2 – Energy


But none of this helps in the long run; DeepSeek will hit the second issue soon enough – power!
Electricity!
There isn't enough of it, cheap enough and readily available enough, to support the next few stumbling steps of AI.
Even the previous-generation CPUs and GPUs that DeepSeek uses still need electricity.
And plenty of it!


So that's the next, second, giant leap that's needed – abundant and cheap energy.
Cue the next generation of electrical power generation – fusion instead of fission.
This would be tokamaks.


There are a few experimental reactors around the world, but they're not getting enough funding, so progress is slow.
But one facility in the US – the National Ignition Facility, which uses lasers rather than a tokamak – has reached scientific breakeven!
As in, the fusion reaction released more energy than the laser energy delivered to the target.
There's nothing left over to sell (the facility as a whole still consumes far more power than it produces), but the reaction itself is breaking even!
Which is better than getting less energy out than you put in.


Then there's the much older technology of thorium molten-salt reactors.
The US ran an experimental molten-salt reactor at Oak Ridge back in the 1960s, so the concept has been proven for decades.
They're also (as are the tokamaks) inherently safe by design.
A thorium reactor is simple, cheap and quick to build, and can produce a large amount of cheap electricity.
Also, the raw material, thorium, is abundant in the earth's crust and isn't very expensive to mine.
Nor is getting it as dangerous as getting the raw materials we need for the batteries in our electric cars.

Power generation then ties in with cooling.
Cooling all these computers, storage and network devices – and most of all, the huge number of GPUs – costs a fortune in power.
So even more electricity is needed.


Problem #3 – Quantum Computers


However, abundant energy on its own doesn't help.
The next, third, problem after that is size vis-à-vis compute power.
The biggest supercomputers today take up huge amounts of real estate – thousands of square meters.


This is the next huge leap society needs – more compact compute power.
This is where quantum computers would play a big role!
They would be millions of times more effective, use a fraction of the power, and require a fraction of the real estate of even the biggest and most powerful supercomputer today.


IBM had the biggest one at the time of writing: the 433-qubit Osprey, announced in 2022.
I have no idea how many qubits you'll need to be effective, but they're working on it!
They've gone from two qubits (1998) to 433 in about 24 years.
That doesn't sound like speedy technical progress, until you fully understand the complexity of the problem, and how each cycle of progress takes less time than the previous one.
My personal opinion, and guess (!), is that it's like going from inventing the spear to landing on the moon in the same amount of time.


A Giant Leap For Mankind


Combine these three – quantum computers, unlimited cheap energy and better AIs – and we would not recognize “IT” (Information Technology) as we know it today!
It will be more revolutionary than the industrial robot was in its day, when it effectively replaced the vast majority of the manual assembly-line workforce!


Is this a good thing or not!?
No idea, but you can't put the genie back in the bottle.
This WILL change the world.
Maybe not for the better, at least not for the foreseeable future (or my lifetime!), but it will change!


Prediction


MY amateurish prediction – if we don't blow ourselves up in the meantime (and considering that “The Doomsday Clock” now stands at 1 minute and 29 seconds to midnight, and what's going on in world politics today, that's a real possibility!) – is that this will all happen within the next ten years!
Maybe twenty, but I'm absolutely sure it won't take as long as thirty!


I base that .. prediction on Moore's law – “the number of transistors on a microchip doubles every two years”.
Not quite applicable, but close enough.


So where does that leave people working in IT today?
Well, learn and work with one of these three upcoming techs, and you'll be safe.
For the rest of us, who are too old to start over from scratch, well..


If I ever figure that out, I'll write another article.


End Note


Thanx for reading, and as usual, these are my opinions and observations.


Just remember, not that long ago someone predicted that there would only ever be a use for four, maybe five, computers in the world.
And Bill Gates supposedly said that 640 kB of RAM was more than enough.


Predictions about the future are inherently unreliable and nearly impossible to get right.
But every now and then someone does get it right – I just hope I'm not one of them.









Nota bene


Debian Linux is one of the most influential and long-standing Linux distributions, forming the foundation for many other distributions, including Ubuntu.
Its history dates back to the early 1990s, when the free software movement was gaining momentum.
It was founded by Ian Murdock on August 16, 1993.
At the time, most Linux distributions were either commercial or difficult for users to manage.
Murdock, a computer science student at Purdue University, wanted to create a fully free, community-driven operating system that adhered to the principles of the GNU Project.
He initially released Debian Linux 0.01 as an announcement on a Usenet newsgroup (comp.os.linux.development).


He named the project "Debian," a combination of his and his then-girlfriend's (Debra Lynn) names: Deb + Ian.


Early Development and the Debian Social Contract (1994–1997)
As the project grew, Murdock envisioned Debian as a stable, well-maintained alternative to existing Linux distributions.
To ensure openness and transparency, the Debian Free Software Guidelines (DFSG) and the Debian Social Contract were drafted in 1997.
These documents defined Debian’s commitment to free software and later inspired the Open Source Definition used by the Open Source Initiative (OSI).
The Debian community model set it apart from other Linux distributions.
Instead of being controlled by a company, Debian was governed by a group of developers, making it one of the first truly community-driven distributions.


Expansion and Key Innovations (1998–2004)
Debian continued to grow, attracting developers from around the world.
Key developments during this period included:
APT (Advanced Packaging Tool) (1998): Introduced an easy way to install and manage software dependencies, revolutionizing package management.
Debian expanded beyond Intel x86 to support various hardware architectures, including PowerPC, ARM, and more.
Debian gained a reputation for being one of the most stable and secure Linux distributions, making it a favorite among servers and enterprises.
During this period, many Linux distributions (such as Ubuntu in 2004) started using Debian as their base.

Debian adopted a democratic leadership structure, with a Debian Project Leader (DPL) elected annually.
The leadership model ensured no single entity controlled the project.
Ian Murdock left Debian in 1996 to pursue other projects, but his vision continued under new leadership.
With the launch of Ubuntu in 2004, Debian’s influence grew dramatically.
Ubuntu, based on Debian, brought a more user-friendly experience, making Debian’s core system accessible to a broader audience.
Other Debian derivatives, such as Linux Mint, Knoppix, and Raspbian, also gained popularity.


Debian continues to release major updates every few years.
The release names are always based on characters from Pixar’s Toy Story.
It powers everything from personal computers to high-profile servers and embedded devices.
Debian’s principles and guidelines have influenced countless projects.
Ubuntu, Linux Mint, Kali Linux, and others trace their roots to Debian.
Debian's democratic approach set a precedent for open-source governance.
Debian has played a pivotal role in shaping modern computing.


What Does It Mean to Be a Debian Linux Contributor?

Being a Debian contributor means actively participating in the development, maintenance, and improvement of the Debian Project, one of the largest and most influential Linux distributions.
Contributors help Debian grow by writing code, packaging software, testing, documenting, translating, and providing community support.
Unlike many projects controlled by a single organization, Debian is community-driven, and anyone with the right skills, dedication, and interest can contribute.
Debian consists of tens of thousands of software packages, and maintaining these packages is a key contribution.
Users and developers can also contribute by reporting and fixing bugs in Debian's Bug Tracking System (BTS), and by testing new versions of Debian (especially the unstable and testing branches).
Good documentation is essential for Debian's usability, so contributors write guides, manuals, and FAQs.
Debian is used worldwide, so translating it into multiple languages is important.
Debian also has a large community, and helping others is an important contribution: answering questions on forums, mailing lists, and IRC channels.
Finally, Debian runs many servers and services that require maintenance, and contributors can help administer that infrastructure.
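
To give a concrete flavour of the packaging side, the everyday maintenance loop looks roughly like this – a sketch only, not a full guide, using the small "hello" package as a stand-in:

    #!/bin/sh
    # Rough sketch of the classic Debian package-maintenance loop,
    # using the "hello" package as a stand-in example.
    apt-get source hello            # fetch and unpack the package's source
    cd hello-*/

    # ... make your fix in the unpacked source tree ...

    dpkg-buildpackage -us -uc       # rebuild the package, unsigned

    # Report, or follow up on, a bug in the Bug Tracking System:
    reportbug hello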

Being a Debian contributor is a rewarding experience that allows you to give back to the open-source community.









About the Author


Turbo Fredriksson - Debian Developer



I have 35+ years of working in IT, starting out as a developer and then becoming both a system and a network administrator.
After that I worked as an architect and designer, managing large-scale compute clusters and networks with thousands of nodes and tens of thousands of cores.
For the last 8 years I've been working with both the private cloud (VMware and OpenStack) and the public cloud (mainly AWS, but also some Azure and GCP).

Debian Developer Jan 1997 - Present · 28 yrs

ZFS On Linux Developer

Programming Languages
Infrastructure
Docker Products
Microservices
Linux Firewalls
AWS Web Application Firewalls (WAF)
Amazon SQS
Amazon Simple Notification Service (SNS)
Amazon Route 53
AWS Identity and Access Management (AWS IAM)
Amazon Elastic File System (EFS)
Amazon ElastiCache
Amazon Dynamodb
AWS Directory Services
AWS CodeDeploy
Amazon Cognito
Cloud9 IDE
AWS Workspaces
Internet Security
OpenStack
Linux System Administration
Puppet (Software)
AWS Backup
AWS APIGateway
Elasticsearch
Amazon EC2
Amazon Relational Database Service (RDS)
Python (Programming Language)
Kubernetes
Amazon CloudWatch
Microsoft Azure
Bash
Jenkins
Google Cloud Platform (GCP)
Amazon S3
AWS CloudFormation
Software as a Service (SaaS)
AWS CodePipeline
AWS CodeBuild
Amazon Elastic Container Registry (ECR)
Amazon Fargate
Amazon EventBridge
Amazon ECS
Amazon EKS
AWS Lambda
DevOps
AWS Command Line Interface (CLI)
Terraform
Amazon Web Services (AWS)
Linux
Perl
Open Source
Unix
Apache
Shell Scripting
C
MySQL
Integration
Software Development
Telecommunications
JavaScript
HTML
CSS
Analysis
Security
Network Security
Firewalls
Cisco
Web Services
PostgreSQL
VMware
VPN
Virtualization
Computer Security
Active Directory
Solaris
TCP/IP
Switches
CVS
Vulnerability Assessment
Cloud Computing
System Administration
Disaster Recovery
git
Subversion
VirtualBox
Mac OS X
Routers
Router Configuration
Route Planning
Wireless Networking
Wireless Security
Qmail
Bind
Bacula
LDAP
OpenLDAP
Kerberos
WAN


Causes
Animal Welfare
Civil Rights and Social Action
Environment
Human Rights

https://github.com/FransUrbo


See you in the next one!


If you wish to support our project

Donation link (Buy me a coffee):

https://buymeacoffee.com/Alex_Cyber_Synapse