The Developer That Nobody Wants To Be
Dev Leader Weekly 127
TL;DR:
Maintaining legacy systems is some of the most real software engineering
Saying “AI tools don’t work here” might be a cop-out
Someone has to bridge the communication gap, and it might need to be you
No livestream this week, sorry!
Stuck Maintaining Legacy Code -- Now What?
I saw a post on the ExperiencedDevs subreddit that I think a lot of us can relate to. A developer is the sole maintainer of a legacy .NET application. They’re dealing with brittle code, zero test coverage, and a product manager who doesn’t seem to understand the challenges. Oh, and to top it all off? The PM is telling leadership that this developer isn’t using AI tools and that’s why things are slow.
Spoiler: the developer says they ARE using AI tools.
There are a few different angles I want to unpack here, because this scenario hits on so many things at once. Legacy systems. AI tool adoption. And the communication gap between engineers and PMs. Let’s get into it.
You can check out my full thoughts on this in the video below:
Legacy Systems Are REAL Software Engineering
Here’s the thing that bugs me when people say, “I’m just maintaining legacy code, it’s not real engineering.” That is sometimes the most real software engineering, because you have some of the worst constraints to work within, and you have to get creative.
Think about it. You’re working in a codebase where:
There’s little to no test coverage
Everything is tightly coupled
You touch one thing and five other things break
The “tribe” that built it has long since disappeared
And in this person’s case, they’re not just doing bug fixes -- they’re actively developing new features on top of this. That’s a completely different ballgame. If you’re just patching a system that’s end-of-life, you might get away with minimal fixes to keep things running. But if you’re adding features? People are paying for this. The product might be legacy from a code perspective, but it’s not legacy from a product perspective.
If you’re wondering where to even begin with this kind of situation, I’ve written about refactoring legacy code and what you need to be effective -- it’s a good starting point for thinking about how to approach these codebases.
So what do you do?
Start Building Confidence Through Tests
You need to work towards having test coverage that gives you confidence. And that might mean writing tests you don’t love.
Here’s what I mean. I like writing code that can be unit tested -- almost in isolation from everything else. But in a lot of legacy systems, including ones that I wrote that became legacy, the code was not designed that way. Dependencies aren’t mockable. Concerns aren’t separated. That whole class of tests? Just not available to you.
So you go to the other end of the spectrum. Big integration tests. Maybe even UI tests that click through the application. Are they brittle? Absolutely. But if you can get some tests in place that cover your most critical use cases -- the things that absolutely cannot break for your users -- you now have some confidence.
I’ve talked about dealing with legacy code and how to make anything more testable if you want a deeper dive on practical strategies for this. The key idea is that you don’t need perfect tests -- you need tests that give you a signal.
And by the way -- don’t be fooled by coverage numbers alone. I’ve written about why test coverage can be misleading and how to avoid false confidence. High coverage with low-quality tests can be worse than no tests at all because it gives you a false sense of security.
Actionable Tip: Start with confidence-building tests, even if they’re ugly. Cover your most critical user flows first. As you refactor, you can replace the crappy tests with better ones. But having something is way better than having nothing. Here’s what I’d actually recommend as a starting point:
Identify your top 3-5 user workflows -- the ones that, if they broke, would generate support tickets or lose customers
Write end-to-end tests for those -- even if they’re slow, brittle, or ugly
Use those tests as a safety net while you start refactoring the code underneath
As you refactor, add focused tests at lower levels where you can
The point is: start introducing mechanisms to get confidence. Then, as you refactor and clean things up, you can write better tests and eventually ditch the ones you don’t love. If you’re working in a .NET codebase specifically, check out my article on refactoring C# code with 4 essential techniques for some practical patterns.
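One concrete way to get that kind of confidence-building test in place is a "characterization" (or golden-master) test: capture what the legacy code does today, and alert on any change. The newsletter's context is .NET, but the pattern is language-agnostic; here's a minimal sketch in Python, where `generate_invoice` is a hypothetical stand-in for any tangled legacy entry point:

```python
# Characterization (golden-master) test sketch: we assert the code's
# CURRENT behavior, not its "correct" behavior, so refactoring has a
# safety net. All names and values here are hypothetical.

def generate_invoice(customer, items):
    # Stand-in for the legacy function you dare not touch.
    total = sum(price for _, price in items)
    return f"{customer}: {len(items)} items, total={total:.2f}"

def test_invoice_characterization():
    # This expected string was captured by running the code once and
    # pasting its output here -- if it ever changes, we want to know.
    golden = "ACME: 2 items, total=30.00"
    assert generate_invoice("ACME", [("widget", 10.0), ("gadget", 20.0)]) == golden
```

These tests are intentionally ugly: they pin down behavior you may not even fully understand yet. As you refactor and learn the code, you replace them with focused, intention-revealing tests.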
AI Tools “Don’t Work Here” -- Really?
Let’s pivot to the AI tool situation. This developer says they’re using Gemini, which is great. But then they mention having access to Cursor and say something like, “Cursor can’t work on legacy .NET projects.”
I have a challenge with statements like this.
Look, maybe Cursor isn’t the most effective tool for this specific scenario. But saying it can’t be used? That feels like a cop-out to me. I’ve used Copilot CLI to build legacy .NET projects -- not with dotnet build, but with Visual Studio tooling on the command line. No UI. It worked.
You can build on the command line. When you start having tests, you can run your tests on the command line. Cursor can go through files on a file system. If you can trigger your build from the command line, there’s a path forward -- even if it’s not the polished IDE experience you’re used to.
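For a legacy .NET Framework project, that command-line path typically goes through MSBuild rather than `dotnet build`. As a rough sketch (solution and test DLL names are hypothetical; both tools ship with Visual Studio and are run from a Developer Command Prompt):

```shell
# Build a legacy .NET Framework solution without opening the IDE:
msbuild LegacyApp.sln /t:Build /p:Configuration=Release

# Run the test assembly with the VSTest command-line runner:
vstest.console.exe Tests\bin\Release\LegacyApp.Tests.dll
```

Once a build and a test run are just shell commands, an AI tool that can execute commands and read files has everything it needs to iterate.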
I actually wrote a practical guide on getting started with AI coding tools that covers how to think about tools like Cursor, Copilot, and Claude in different development contexts. The reality is that these tools aren’t all-or-nothing -- you can use them for specific parts of your workflow even if they can’t handle everything.
For legacy codebases specifically, here are some concrete ways AI tools can help even when the project structure isn’t ideal:
Understanding unfamiliar code -- paste a class or module and ask “what does this do?” before you start modifying it
Generating test scaffolding -- even if the tool can’t run the tests, it can help you write the boilerplate for test setup
Exploring refactoring options -- describe what a piece of code does and ask for suggestions on how to decouple it
Documentation -- generate documentation for undocumented legacy code so the next person (or future you) has context
Actionable Tip: Don’t rule out tools because they don’t work the way you expect. Get curious about finding ways to make your tools effective for you. The more familiar you get with what tools are capable of -- and how to use them in different scenarios -- the more options you have.
At work, you might not have the freedom to pick whatever tool you want. I get that. But within the tools you do have access to, there’s probably more you can squeeze out of them than you think.
Bridging The Communication Gap
Now for the part that I think matters the most: the relationship between this developer and their PM.
We have two people saying different things. The developer says, “I AM using AI tools.” The PM is telling leadership, “this person isn’t using AI tools, that’s why things are slow.” The truth? It’s probably somewhere in the middle.
This developer is probably using AI tools. But are they using them effectively? That’s open for debate -- and I’m not saying that to be harsh. They’re already acknowledging that some tools don’t work well for their scenario, and I’ve just argued that I think they could push harder on that front.
From the PM side? They’re probably right that things are behind their expectations. The reasoning they’re giving might be wrong, but the core observation -- that delivery isn’t where they want it -- is probably true.
Someone has to bridge this gap. And it sounds like it hasn’t been either person up to this point.
If I knew someone was saying things about me that I didn’t believe were true, I’d confront them. Not aggressively -- but I’d need to get to a point of truth. I’d make time and say something like: “Hey, I will show you how I’m using AI tools. If you think I’m not, let me demonstrate.”
But clearing that up alone won’t solve the problem. These two people need to work together to figure out what they need from each other.
Communication is one of those soft skills that engineers often undervalue, but it makes or breaks your ability to operate effectively on a team. And it’s not just about talking more -- it’s about talking in ways that land with your audience.
Actionable Tip: Before you go into a conversation like this feeling defensive, try preparing with these three things:
A demo, not a debate -- Show your PM exactly how you’re using AI tools. Screen share, walk them through your workflow. It’s a lot harder to argue with a live demonstration.
An honest self-assessment -- Are there areas where you could be using tools more effectively? Acknowledging that proactively builds trust and shifts the conversation from “are you doing it?” to “how can we do it better?”
Questions, not accusations -- Ask your PM what success looks like to them. What timelines are they working against? What are they being asked to deliver? Understanding their constraints helps you propose solutions that actually work for both of you.
Talk About Systems, Not Classes
Here’s something practical: as an engineer, you might totally understand why things are brittle. You know that Class A is coupled to Class B, which is coupled to Class C, and touching any of them cascades. But your PM? They don’t know that. And talking to them about classes and methods is not going to help.
Instead, uplevel the discussion. Talk about feature areas. Draw some higher-level block diagrams. Say something like: “Hey, some of the logic for this feature lives here, but it’s actually spread across these other areas. If I touch one, we’re going to see problems across all of them. And right now, I have nothing that tells me when that’s going to happen.”
That’s something a PM can understand and reason about. It gives them the context they need without drowning them in implementation details.
This is actually the same skill that I see trip up engineers when they’re working with junior developers -- the ability to adjust your communication to your audience. You wouldn’t explain coupling the same way to a PM as you would to a senior engineer. Meeting people where they are is how you actually get alignment.
If you’re dealing with technical debt specifically and need a way to frame those conversations with leadership, I’ve written about how to balance technical debt in a way that connects engineering concerns to business outcomes.
Actionable Tip: Try to understand what your PM needs. They have goals, expectations, and things they need to report on. If you can understand their motivations, you might find that how they’re approaching things isn’t effective for you -- but the why behind it makes sense. And then you can propose alternatives that work better for both of you.
Here’s a framework that has worked for me when bridging these kinds of gaps:
Translate risk into time -- instead of “this code is tightly coupled,” say “changing this feature has a 50/50 chance of breaking these other two, and debugging that takes 2-3 days each time”
Propose trade-offs, not ultimatums -- “We can ship this feature in 2 weeks, or we can spend 1 week stabilizing first and ship in 3 weeks with fewer bugs after. Which would you prefer?”
Make the invisible visible -- track the bugs caused by coupling, the time spent on regressions, the incidents. Data changes conversations.
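That "translate risk into time" framing is really just expected-value arithmetic, and doing the multiplication out loud is often what lands with a PM. A tiny sketch, using the hypothetical numbers from the example above:

```python
# Back-of-the-envelope "risk into time" math. All figures are
# hypothetical illustrations, not measured data.

def expected_delay_days(break_probability, debug_days_per_break, coupled_features):
    # Expected extra time per change = chance a change breaks each
    # coupled feature, times the debugging cost, summed over features.
    return break_probability * debug_days_per_break * coupled_features

# A 50/50 chance of breaking each of 2 coupled features, at roughly
# 2.5 days of debugging per break:
print(expected_delay_days(0.5, 2.5, 2))  # -> 2.5 expected extra days per change
```

"Every change to this feature costs an expected 2.5 extra days" is a sentence a PM can plan around; "Class A is coupled to Class B" is not.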
Wrapping Up
Legacy systems are challenging. AI tools in constrained environments are challenging. Working with someone who doesn’t seem to understand your reality is challenging. But none of these are unsolvable.
Start building confidence through tests -- even ugly ones. Get creative with the tools you have -- don’t dismiss them before you’ve really explored what’s possible. And put in the effort to bridge that communication gap -- even when you don’t want to. Especially when you don’t want to.
If you want to go deeper on any of these topics, here are some resources:
Dealing With Legacy Code - How To Make Anything More Testable
Getting Started with AI Coding Tools - A Developer’s Practical Guide
Software Engineering Soft Skills - 6 Focus Areas That You Need
I wish this person all the best. Legacy systems are a lot of fun in their own way -- lots of interesting constraints and lots of interesting challenges.
Join me and other software engineers in the private Discord community!
Remember to check out my courses, including this awesome discounted bundle for C# developers:
As always, thanks so much for your support! I hope you enjoyed this issue, and I’ll see you next week.
Nick “Dev Leader” Cosentino
social@devleader.ca
Socials:
– Blog
– Dev Leader YouTube
– Follow on LinkedIn
– Dev Leader Instagram
P.S. If you enjoyed this newsletter, consider sharing it with your fellow developers!