From Monochrome to Machine Learning: Reflecting on a Lifetime in Tech Before the Next Leap

The first computer most people encounter today fits in their pocket.
Mine filled most of a desk. Actually, a desk and a printer cabinet next to it.

My very first computer experiences were on our school’s Apple II machines, playing Oregon Trail on either black-and-white TVs or bright green monochrome monitors. Later at home, in the late 1980s, we got an Epson Equity I+ with a 10 MHz CPU, 640K of RAM, a 10 MB hard drive, a 5.25-inch floppy drive, and an EGA monitor that felt like stepping into full color after years of monochrome. If you lived through that transition, you remember how magical it felt. Not just more pixels, but more possibility.

Since then I have watched wave after wave of transformation reshape software and the world around it.

  • The evolution of the CPU, from the simplicity of that first 10 MHz machine to modern processors capable of teraflops.
  • Gaming advancing from Pac-Man at the arcade or on the Atari at home to Super Mario on the Nintendo, from 8-bit to 16-bit to 64-bit, and eventually to fully immersive 3D experiences.
  • The rise of the internet, starting with dial-up BBSes and rapidly growing into the global information network we rely on today.
  • Communication shifting from letters and postcards to electronic messaging, long before anyone imagined what smartphones would become.
  • Social platforms growing from AOL chatrooms and small forums to worldwide communities where billions share ideas.
  • The mobile phone evolving from a clunky brick attached to a briefcase to a pocket-sized computer that dwarfs the power of my first PC.
  • Music and movies becoming instantly accessible. No more waiting for release days or storing shelves full of physical media.
  • Breakthroughs in biology that finally let us map DNA and understand life in new ways.
  • Automation taking form in 3D printing, robotics, and smart home technology.
  • Machine learning taking root, followed by modern generative AI that can now write, reason, and assist in ways we only imagined in science fiction.

And these are just the highlights. After 25 years in the industry and a lifetime of tinkering, the list is a long one.

What is interesting about right now, especially with 2026 on the horizon, is that it feels like another one of those rare pivot points. Except this time the shift is not about faster processors or prettier screens. It is about the relationship between humans and software fundamentally changing.

We are entering an era where tools are not just tools. They are partners. They help us think, reason, and build. They fill gaps. They accelerate ideas. They lift creativity instead of replacing it.

For the first time in my career, I am building something that sits directly in the center of that shift. Something that focuses on how to evolve the bigger picture rather than profit from it. Something designed with everyone in mind, not tied to advancing any particular agenda.

Very soon, I will be announcing something big for software development. At least, I think it is big. For now, I will simply say this:

Everything I have learned since that Epson Equity, from the GW-BASIC and MS-DOS 3.2 manuals, through distributed computing, microservices, and machine learning, all the way to modern AI, has been quietly converging into a project I have been shaping for a very long time. I just did not realize it until recently.

It is close now.

And once it is ready, I believe it will change how a lot of people think about building software, and possibly grow into something far larger than that.