When I started in IT (1991!!), we were pulling cables through ceilings and fixing servers when they crashed – the internet was dial-up and often restricted to one PC in the office.
There was no such thing as “managed services”.
Over the last month I have been lucky enough to see quotes presented to clients by a variety of managed service providers. They have ranged from the sublime single page with a price, to Chinese-menu quotes where the client decides what to buy [select whether you want security options on some of your PCs!], to reasonable presentations that really define what the client needs.
The problem is that the client doesn’t understand security needs and benefits, especially in an industry strewn with acronyms – and I am fairly convinced that there are a number of managed service providers who don’t understand what they are selling or installing to protect their clients’ technology.
So the client ends up buying on price and telling the providers who are actually delivering well in their space ‘you are too expensive’, based on nothing more than guesswork.
Driven a little by an excellent LinkedIn post (‘How do you know if you’re appointing a company to manage your IT without trustworthiness, experience, competencies, or morals?’), I started looking at who is holding your managed service provider to account.
We have proper trade bodies for alarm companies, electricians and plumbers – NACOSS, SELECT, NICEIC, SNIPEF – but nothing for the companies managing your systems, protecting your data, and acting as the first line of defence against cyber threats.
If your managed service provider tells you they’ve got ISO 27001 or Cyber Essentials Plus, that’s great. But here’s the problem: those standards don’t look at what actually happens on the ground. They check internal documentation and data practices once a year, maybe every two. They are frameworks.
They don’t test how well your backups are really working or whether the technician resetting your password followed secure identity checks.
We need a new kind of validation. Here are my thoughts:
- Assesses service desk practices. Are all technicians trained on security, and ideally accredited? At a very basic level, is identity verified before a password reset? Are tickets logged and decisions auditable?
- Checks for internal ‘fire drills’. Does the provider test what happens if their own security is breached, and how often?
- Randomly selects clients to test real-world controls like backups, MFA enforcement, patching, and admin account hygiene (see the sketch after this list for the kind of check I mean).
- Evaluates operational maturity. Are internal systems hardened? Are staff trained? Are risks logged, reviewed, and mitigated, not just guessed at? Is there a security budget?
- Reviews the product stack. What tools do they use? Are they current, supported and reliable?
- Tests resilience. What happens if your account manager quits or your key engineer is off sick for months?
- Tracks team health and culture. Are salaries fair? Are people supported and retained? Is this a provider built to last?
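
To make the ‘real-world controls’ point a bit more concrete, here is a minimal sketch of what an evidence-based spot check could look like: a short script that reads exported account and backup data and flags missing MFA, stale admin accounts and backups that haven’t succeeded recently. The file names, column names and thresholds are all assumptions for illustration, not any particular vendor’s export format – a real scheme would define its own evidence formats.

```python
"""A minimal sketch of an evidence-based spot check (all file and column names are hypothetical)."""

import csv
import json
from datetime import datetime, timedelta, timezone

NOW = datetime.now(timezone.utc)
BACKUP_MAX_AGE = timedelta(hours=24)    # a backup should have succeeded within the last day
ADMIN_STALE_AFTER = timedelta(days=30)  # admin accounts unused for a month are worth questioning


def parse_ts(value: str) -> datetime:
    """Parse an ISO 8601 timestamp; assume UTC if no timezone is given."""
    ts = datetime.fromisoformat(value)
    return ts if ts.tzinfo else ts.replace(tzinfo=timezone.utc)


def check_accounts(path: str) -> list[str]:
    """Flag accounts without MFA and stale admin accounts.

    Assumes a CSV export with columns: username, mfa_enabled, is_admin, last_login.
    """
    findings = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            user = row["username"]
            if row["mfa_enabled"].strip().lower() != "true":
                findings.append(f"{user}: MFA not enforced")
            if row["is_admin"].strip().lower() == "true":
                if NOW - parse_ts(row["last_login"]) > ADMIN_STALE_AFTER:
                    findings.append(f"{user}: admin account unused for over 30 days")
    return findings


def check_backups(path: str) -> list[str]:
    """Flag backup jobs whose last success is older than the allowed window.

    Assumes a JSON file containing a list of {"name": ..., "last_success": ...} objects.
    """
    findings = []
    with open(path) as f:
        for job in json.load(f):
            last_success = parse_ts(job["last_success"])
            if NOW - last_success > BACKUP_MAX_AGE:
                findings.append(f"backup '{job['name']}': last success {last_success:%Y-%m-%d %H:%M}")
    return findings


if __name__ == "__main__":
    issues = check_accounts("accounts_export.csv") + check_backups("backup_jobs.json")
    for issue in issues:
        print("FAIL:", issue)
    print(f"{len(issues)} issue(s) found")
```

The point isn’t the script itself. The point is that checks like these can be run against real client environments, repeatedly and at random, rather than relying on a paper audit once a year.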
This isn’t about creating more red tape. It’s about giving clients clarity and giving good managed service providers a way to prove it.
Because the truth is, most clients can’t tell the difference between a high-performing managed service provider and one that’s barely scraping by.
Until something breaks.
We don’t need another box-ticking scheme. We need something continuous, real-time, and backed by evidence. Something that moves us away from marketing fluff and towards a visible standard of excellence.
And if you’re a good technology managed service provider, you should want to be measured this way.
I’ve been speaking with a few people who are thinking the same, and I would be interested in hearing thoughts from both clients and suppliers.
Get in Contact – a simple email, phone call or coffee.

