A Week in the Life of a Cyber Security Analyst
Oxygen Technologies is proud of the time and effort we have put into creating a Cyber Security program. It has taken time, money, resources, partnerships, and a whole lot of gumption to take on responsibility for something as ever-changing as this pillar of technology.
There are numerous things you need to check off as you roll out any new service or product. For Oxygen, the first step was aligning ourselves with industry leaders to partner with, to collaborate with, and to provide guidance. The next, and most rewarding, was recruiting talented people who could take those first steps and execute. The interesting part about cyber security is that you go from dipping your big toe into the proverbial water to realizing there are immediate issues that need to be addressed, and you have to cannonball in. This is where putting the right people with the right technology creates great outcomes. So, I'm going to take a few minutes of your time to talk about one of the first major cyber activities we proactively executed on behalf of our clients.
Approximately two weeks ago, our cyber security team became aware of a severe issue with Microsoft's print spooler service. I won't go into the gory technical details, but pretty much every Microsoft server going back 25 years had this issue.
Our team became aware of this issue at approximately 2 p.m. on Monday, July 6th. The first step was to verify the severity of the exploit with both Microsoft and members of our advisory group (part of our secret sauce at Oxygen is that we utilize third-party experts outside of the Oxygen team). While we waited for validation, we began internal discussions on the best course of action. We needed to weigh the disruption it would cause our clients against the importance of risk mitigation, given the severity of the exploit. Our decision was to implement Microsoft's recommendation of stopping the print spooler service across our entire client base, effectively shutting off print services for our clients. While this planning was occurring, validation came in from our advisory group agreeing with our course of action.
Late that afternoon, Oxygen's Operations Manager sent our clients an email detailing the execution plan: effectively, we were turning off their print services. We let clients know that we would be available to support them, and that once Microsoft released the patch, we would deploy it via our Managed Services console. I must stress that the proactive work we did, both to shut off print services and to apply the patch, was made possible through automation by our Managed Services team. We were able to use scripts and group policies to streamline the process.
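For the technically curious: these aren't our exact scripts, but stopping and disabling the Windows Print Spooler service across a fleet generally comes down to two PowerShell commands like the following (or the equivalent group policy setting), pushed out through a management console:

```powershell
# Stop the running Print Spooler service immediately
Stop-Service -Name Spooler -Force

# Prevent the service from starting again on reboot
Set-Service -Name Spooler -StartupType Disabled
```

Running these through a remote management tool rather than by hand is what lets a small team cover hundreds of servers in hours instead of days.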
Microsoft started releasing patches by end of day Tuesday, July 7th. We first tested the patches within our own environment; we have a testing sandbox for exactly that. Once the patches were confirmed to be good (Microsoft's track record for patches isn't flawless), we began updating close to 350 servers across our clients' environments via scripts, Managed Services, and manual touches, starting at 7 a.m. Central Time Wednesday morning and completing Thursday at 6 p.m. (yep, our team worked all night getting this done).
On Thursday, we were notified by our advisory group that there was still an issue with the print spooler exploit (this is why you collaborate). We took additional measures to close off the remaining vulnerabilities this presented.
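Again without giving away our exact playbook, Microsoft's follow-up guidance for this spooler vulnerability centered on tightening the "Point and Print" registry settings so that printer driver installs require elevation. A sketch of that hardening in PowerShell looks roughly like this:

```powershell
# Tighten Point and Print so driver installation requires admin elevation,
# per Microsoft's follow-up guidance for the spooler vulnerability
$key = "HKLM:\SOFTWARE\Policies\Microsoft\Windows NT\Printers\PointAndPrint"
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name NoWarningNoElevationOnInstall -Value 0 -Type DWord
Set-ItemProperty -Path $key -Name UpdatePromptSettings -Value 0 -Type DWord
```

The point isn't the specific keys; it's that a patch alone isn't always the whole fix, which is exactly why we keep an advisory group in the loop.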
By Friday, all clients had their environments patched and their print spooler services re-enabled.
Here are some things we learned through our Cyber Security team's first critical event:
- Clients truly appreciated how proactive we were. I will not lie: it helped having major news media sources reporting on the exploit, which validated our approach.
- Oxygen had a lot of institutional hesitancy about turning off a critical service. There were comments like "we should ask our clients' permission first" or "is the vulnerability really that big a deal?" Our cyber security team was adamant that we needed to address it, and the thing we realized is that this is why companies partner with Oxygen: sometimes you need to make a hard decision. It may cause some issues, but as I said to my leadership team, I'd rather deal with upset clients because WE DID SOMETHING than an upset client because WE DIDN'T.
- We had about five clients who needed workarounds due to print volumes. Our On Demand team was able to address that by noon on Tuesday.
- We had one upset client who asked us to turn the print service back on before patching was complete, accepting full responsibility if they were exploited.
- We had clients letting us know that other IT vendors warned them about this vulnerability three days after we had started working on their environments. This provided further validation of our approach; in the business of cyber security and zero-day exploits, being 72 hours ahead of your competition is an affirmation of your processes.
- For all this work in research and technical fulfillment, the cost to each of our clients was one hour of support. This is an example of driving value for our clients.
If you have any questions about our processes or our team, please feel free to reach out. We're looking forward to writing more blogs about our successes, and yes, we will have failures, and we won't shy away from sharing those as well.