Looking Back at 2023
A look back at what I learned, built, and changed my mind about this year.

End of year, time for a recap of what shifted in my work and thinking.
SSR is Back
The industry swung back from Static Site Generation (SSG) to Server-Side Rendering (SSR) this year. For a while there, the default was to pre-render everything at build time - faster initial loads, simpler hosting, just throw it on a CDN. But SSR started making more sense again for a lot of the projects I worked on.
The shift happened because of what we’re building now. When you’ve got user-specific content, real-time data, or pages that change frequently, SSG means either constant rebuilds or client-side fetching that defeats the purpose. SSR gives you fresh content on each request, without the fetch-then-rerender jank of loading data on the client.
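To make the trade-off concrete, here’s a minimal sketch of the per-request idea in plain Node and TypeScript - no framework, and `fetchLatestScores` is a made-up stand-in for whatever data actually changes between requests:

```ts
// Minimal SSR sketch: HTML is rendered per request with fresh data,
// instead of being baked once at build time the way SSG does it.
import { createServer } from "node:http";

// Hypothetical data source - stands in for a DB query or API call
// whose result changes between requests.
async function fetchLatestScores(): Promise<string[]> {
  return [`rendered at ${new Date().toISOString()}`];
}

const server = createServer(async (_req, res) => {
  const scores = await fetchLatestScores(); // fresh on every request
  const items = scores.map((s) => `<li>${s}</li>`).join("");
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(`<!doctype html><ul>${items}</ul>`);
});

server.listen(3000);
```

Astro, Nuxt, Remix, and Next.js all wrap this loop with routing, caching, and hydration on top, but the underlying trade is the same: a render per request instead of a render per deploy.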
I’ve been using Astro for content-heavy sites (this blog runs on it), Nuxt.js for Vue projects, and dabbled with Remix and Next.js across other projects. Each has its quirks. Astro’s partial hydration is clever - ship zero JavaScript by default, add interactivity only where needed. Nuxt 3’s composables and auto-imports took some getting used to, but the DX is solid once you’re in.
I still reach for SSG when the content is genuinely static. A marketing site that changes once a month doesn’t need server rendering. But the “SSG everything” mentality from a couple years ago feels dated now.
AI Tools: Initial Hype, Then Reality
ChatGPT was everywhere at the start of the year. I used it heavily at first - explaining concepts, drafting boilerplate, working through problems out loud. It felt like having a junior dev available 24/7 for the tedious stuff.
But as the year went on, I noticed something. The AI was most useful when I already knew roughly what I wanted. It could fill in syntax I’d forgotten or generate repetitive code. Where it fell short was the harder stuff - architectural decisions, debugging subtle issues, understanding why something wasn’t working. For those, I still needed to actually understand the system.
What ended up mattering more was investing in better tooling. Properly configured TypeScript catches type mismatches before they become runtime bugs. A good debugger with breakpoints and watch expressions beats console.log archaeology. ESLint rules tailored to the project prevent entire categories of mistakes. These aren’t as flashy as AI, but they’re reliable.
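For a concrete (and contrived) example of what that buys you - the `User` type and `emailDomain` function below are made up, but the compiler errors in the comments are what a strict config actually reports:

```ts
// With "strict": true and "noUncheckedIndexedAccess": true in tsconfig.json,
// both commented-out lines fail at compile time instead of at runtime.
type User = { name: string; email?: string };

function emailDomain(user: User): string {
  // return user.email.split("@")[1];     // error: 'user.email' is possibly 'undefined'
  // return user.email?.split("@")[1];    // error: 'string | undefined' is not assignable to 'string'
  return user.email?.split("@")[1] ?? ""; // compiles: both gaps handled explicitly
}
```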
I still use AI tools, just more selectively. Quick lookups, brainstorming, explaining unfamiliar APIs. But the fundamentals - understanding the code, knowing the patterns, reading error messages properly - that’s still where the real work happens.
Getting Into the Weeds
The project that stretched me most this year was a game auto-scaling system. The requirement was straightforward on paper: spin up game servers when demand increases, spin them down when it drops, don’t let players experience lag or disconnects.
The reality was messier. I had to understand how game servers actually work at a lower level than I’d needed before. How do you gracefully drain connections before shutting down an instance? How do you route players to the right server based on latency? What metrics actually predict demand versus react to it?
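The draining question is the one that maps most cleanly to a small example. Here’s a generic sketch of the idea in Node and TypeScript - not the actual game-server code, and the SIGTERM hook plus the 30-second window are assumptions about how an orchestrator typically signals shutdown:

```ts
// Graceful drain sketch: stop accepting new connections on shutdown,
// let in-flight ones finish, and only then let the instance be reclaimed.
import { createServer } from "node:http";

const server = createServer((_req, res) => {
  res.end("ok");
});
server.listen(8080);

process.on("SIGTERM", () => {
  // 1. Refuse new connections; existing requests keep running.
  server.close(() => {
    // 3. Every open connection has finished - safe to exit.
    process.exit(0);
  });

  // 2. Safety net: if connections outlive the drain window, force the exit
  //    so the orchestrator isn't left waiting on a zombie instance.
  setTimeout(() => process.exit(1), 30_000).unref();
});
```

A real game server also has to tell the matchmaker to stop routing new players to the instance before the drain begins, but the ordering is the same: stop new work, finish in-flight work, then hand the machine back.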
This pushed me into territory I’d mostly avoided: writing bash scripts to manage server lifecycle, understanding Linux process management, figuring out VM orchestration and networking rules. The OSI model stopped being something I vaguely remembered from a course and became something I had to actually think about.
It was uncomfortable. I spent a lot of time reading documentation for tools I’d never used, debugging issues in systems I didn’t fully understand. But coming out the other side, I have a much better mental model of what’s happening below the application layer. When something goes wrong now, I have more vocabulary for describing it and more ideas for where to look.
Back to Uni
I enrolled in a Master of IT at Deakin. This wasn’t about credentials - I’ve been working professionally and the self-taught path has been fine. It was about getting a different perspective.
When you learn on the job, you learn what you need for the current project. That’s efficient, but it leaves gaps. You might never encounter certain algorithms, or database theory, or formal methods, because your projects didn’t require them. University forces you through material you wouldn’t have chosen, and some of it turns out to be useful in unexpected ways.
There’s also something to be said for the structured approach. Self-teaching means constantly choosing what to learn next, which can lead to chasing whatever’s new rather than building foundations. Coursework has a progression that builds concepts in order.
Early days still, but it’s been a good complement to the practical experience.
For 2024
A few things I’m aiming for:
- AWS Solutions Architect cert - I’ve got Azure Fundamentals already, and most of my cloud work has been Azure. AWS is different enough that it’s worth learning properly, and the cert gives me a structured path through the material.
- Finish the Masters - Keep chipping away at the coursework.
- Distributed systems - I’ve worked with distributed systems but haven’t studied them systematically. Things like consensus algorithms, CAP theorem tradeoffs, eventual consistency patterns. I want the theoretical grounding, not just “it works in production.”
- TypeScript and Go - These are my main languages now. TypeScript for web work, Go for backend services. Both reward going deeper - advanced type patterns in TS, concurrency patterns in Go.
Solid year overall. Built some things I’m proud of, learned some things that were overdue, changed my mind about a few things. Keen to see what 2024 brings.