As an Application Security Specialist, part of my role is to help teams with engagements, ensuring they get a well-considered and well-executed penetration test. It's important that teams can not only understand the report findings but also learn from them, roll those learnings into their coding standards, and come back stronger for the next audit, showing their pen testers that their real value has been understood. For this you need to run the right testing first; if that doesn't happen, the real value is lost.

I have worked with engineering teams who have a reasonable level of security maturity and other teams who are just starting out on their journey in writing more secure code.

It struck me one day that both high and low maturity teams really struggle with working closely with external penetration testers and with running a good penetration testing engagement.

I started to think about what was common across all the teams I have worked with and what could be the problem, and looked a bit into my own past life as a QA tester for answers.

I figured out what it was one day when I was thinking about a particular engagement where I was remotely testing some gas power-plant infrastructure for a client. I shipped a build to them with complete confidence that our engineers had provided excellent software that both met the specification and was robust, and that my testing was solid evidence this was the case.

This was back in around 2007, when the way of producing software where I worked was the waterfall methodology. As a QA test engineer I was barely involved in planning or system design, and only a little in development.

Testers would instead be brought in at the very end of the project to test as much as possible and triage the issues at the close-out of testing, in the hope they could sign off a build and drop it into the client's user acceptance testing environment for review.

Without all the right context I had tested to the spec and done some fantastic automation and performance testing. Based on my understanding from the spec I thought we had done a great job. I had near-perfect traceability and test evidence in the reporting to back it up.

Sadly we had missed several very important issues which showed up instantly and led to a rather intense call with the client not long after pushing a build to their user acceptance testing environment.

It turned out that while I had excellent coverage of the specification and had done well-considered exploratory testing, I was not an expert in gas networks and accordingly had missed some pretty critical issues.

It was a matter of context handover. The client had never worked with a professional QA tester before and didn't really understand what I did or what I needed to be given in order to keep testing their system well. I was also offsite, so not really part of the internal team building things on their end.

Missing the kinds of things I did was entirely understandable given this, and to course correct the client put together a package of documentation, gave me business and technical support contacts, and ran some peer testing sessions with me, which completely fixed the problem.

Much later in my software QA career, having been part of high-performing agile teams shipping to production 10+ times a day, I knew one of the many key elements that helped these teams build great software at such a cadence was fast feedback loops and constant conversations throughout everything they did. It was all about sharing and understanding each other's context.

Yet despite all the evidence that successful engineering teams are those working much more closely together across disciplines, with QA and Security in particular 'shifting left' throughout the entire SDLC, we are still doing security right at the end.

It's like waterfall all over again. Are we throwing a build over the fence to penetration testers offsite somewhere and asking them, "Did we build this securely?" It certainly seems so, and I feel their frustrations mirror my own from my days of testing software built waterfall style.

I have seen how damaging this is for both the engineers / defenders and the penetration testers. The context is not handed over as well as it could be, and the penetration testers aren't always getting the level of support they need to execute a great engagement.

I set out a few years ago to start fixing this, and have written this post (and done a few conference talks) to share what I have learned, help close these gaps, and make life better for both teams.

Understanding the engineers' perspective on pen tests

Firstly I'd like to talk about one of the most significant gaps between engineering teams and penetration testers. With the two teams often in different locations and handovers done via short calls and email, neither team gets the opportunity for a well-considered, meaningful handover to understand each other's needs.

It feels like pen testing engagements are never forecast and outlined to the people building the software, especially their need to be ready and supportive of the penetration testers. The pen test 'sneaks up' on engineering teams, and before the engagement it's always a mad rush to assemble the minimum pre-testing requirements while also rushing to meet sprint deadlines and ship.

Engineers have often never been told what happens during penetration testing, and do not understand the kind of context and support that would give pen testers the right information to execute their best testing.

Further to this, some teams have a certain fear or anxiety around their work being audited by a security professional, most likely because of the potential fallout and delays created by receiving a report with critical findings. That, and in many cases the engineers are facing an audit of a skill set they have most likely received minimal to no training in.

It results in engineers not having the headspace or inclination to do a great job of handing over to the penetration testers.

Often the result is a penetration test that is not sufficiently scoped or supported, and everything suffers, from execution to reporting and remediation.

Penetration testers, trust me: teams do struggle to work with you and to understand how best to support you.

If you can respect this and really help them understand what you need, they will start to see you as a teammate they can work with to ship a secure build together, not something to be fearful of.

Engineers and defenders, if the penetration testers get more involved, you need to welcome them into your team and support them every step of the way.

So how have I helped teams, despite often being separate, make this work better?

Scoping the engagement

  • Make the best use of a kick-off call. Have engineers, and especially QA people, available to walk through the application and requirements on a brief call with the penetration testers. A screen-sharing session can work really well and help both sides consider potential pain points to investigate further.

  • Let's get good at asking each other leading questions:

    • “What are the most important aspects of this system?”

    • “What are the first things you’d attack in a system like this?”

    • Then perhaps talk through the test approach: “Anything else I should look at?”

  • Sense-check that both teams have a mutual understanding of the business elements as much as the technical elements of what you are trying to release.

  • Non-standard penetration testing engagements are OK! If you're building a risky feature or something new with lots of unknowns, why not ask for a split engagement where a code and design review happens at 20% complete? Work with your engagement manager to scope the test that makes sense, not just testing at the end.

Can you have onsite testing?

This is one of the most effective ways I have helped teams that are fearful of or new to penetration testing overcome those fears and get a good context handover done at the same time. Any expense on flights and accommodation pays for itself ten times over because:

  • Teams that are new to penetration tests, or struggling, will begin to understand the aims and needs of penetration testers and start to build a good relationship with them as teammates.

  • The most effective place to get operations, technical and business support is within the team building the software.

  • Both teams get fast Q&A / feedback loops as testing is being executed.

Know testing is coming and build the pre-testing requirements early

Book and socialise the penetration test well in advance; even consider adding it as a calendar item for team members so they are mindful this is an activity they need to support and be involved in making a success.

Have the team build out a well-considered documentation package as part of the pre-testing requirements your penetration testers ask for. Include anything that helps them understand what you have built. Examples include:

  • Network and component diagrams, and a good overview of the environments

  • Solution architectures and specification documents

  • Early source code and issue tracking access

  • Early access to the test / staging environment

  • Two sets of testing credentials for every type of system user. Ensure these users are not touched by other employees in your business during the testing (see the sketch after this list)

  • Test scripts
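To make the credentials point concrete, here is a minimal sketch of generating two dedicated test accounts per user role ahead of the engagement. The role names, account naming scheme and CSV handover are assumptions for illustration only; your provisioning process and secure handover channel will differ.

    import csv
    import secrets
    import string

    # Hypothetical user roles for illustration -- substitute your own system's roles.
    ROLES = ["admin", "standard_user", "read_only", "api_client"]

    def generate_password(length: int = 20) -> str:
        """Generate a random password from letters, digits and a few symbols."""
        alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
        return "".join(secrets.choice(alphabet) for _ in range(length))

    def build_pentest_accounts(roles):
        """Return two dedicated test accounts per role, kept separate from real users."""
        accounts = []
        for role in roles:
            for n in (1, 2):
                accounts.append({
                    "username": f"pentest_{role}_{n}",
                    "role": role,
                    "password": generate_password(),
                })
        return accounts

    if __name__ == "__main__":
        # Write the credentials out so they can be provisioned early and then
        # handed to the penetration testers through a secure channel.
        with open("pentest_accounts.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["username", "role", "password"])
            writer.writeheader()
            writer.writerows(build_pentest_accounts(ROLES))

A consistent "pentest_" naming convention also makes it easy to spot, and later clean up, these accounts so nobody else in the business uses them during testing.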

Support each other to run a great engagement

To run a great engagement the teams should commit to supporting each other.

  • Engineering leads should specify a point of contact who will ensure they have business, dev, QA and Ops people on standby

  • Make it your priority to unblock penetration testers during the engagement if they are missing context, locked out, or missing access

  • For longer engagements, penetration testers should make sure they check in with the teams, especially if there is a high or critical finding which is going to impact delivery or, worse, is known to be in production.

Make the time for remediation and involve the penetration testers

Time for remediation needs to be budgeted for, and engineering teams need to take advantage of any post-testing review and regression testing activities on offer from the penetration testers. Don't be afraid of the report findings; get their help to understand them, and improve the coding standards and education in your team to come back stronger for the next penetration test.

  • Engineering teams must defend the right to make changes to ship a secure product. Put all findings into tickets and make a commitment to address them in order of risk (see the sketch after this list).

  • Use the opportunity provided to talk through findings and fix approaches with the penetration testers

  • Developers should participate in regression testing / code walkthroughs if they can.
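As an illustration of putting findings into tickets in order of risk, here is a minimal sketch that orders findings by severity and shapes them into generic ticket payloads. The finding data, severity ordering and payload fields are made up for this example; in practice you would take them from the report and push them to your own issue tracker's API.

    import json

    # A severity ordering used only for this sketch; map it to your own risk ratings.
    SEVERITY_ORDER = {"critical": 0, "high": 1, "medium": 2, "low": 3, "info": 4}

    # Hypothetical findings as they might be transcribed from a pen test report.
    findings = [
        {"id": "PT-03", "title": "Reflected XSS in search", "severity": "high"},
        {"id": "PT-07", "title": "Verbose error messages", "severity": "low"},
        {"id": "PT-01", "title": "SQL injection in login", "severity": "critical"},
        {"id": "PT-05", "title": "Missing security headers", "severity": "medium"},
    ]

    def findings_to_tickets(items):
        """Order findings by risk and shape them into generic ticket payloads."""
        ordered = sorted(items, key=lambda f: SEVERITY_ORDER.get(f["severity"], 99))
        return [
            {
                "title": f"[{f['severity'].upper()}] {f['id']}: {f['title']}",
                "labels": ["pentest", f["severity"]],
                "description": "Remediate as agreed in the post-test review.",
            }
            for f in ordered
        ]

    if __name__ == "__main__":
        # Print the payloads; in practice you would POST these to your tracker.
        print(json.dumps(findings_to_tickets(findings), indent=2))

Keeping the severity in both the title and the labels makes it easy to report on how quickly each risk level is being burned down before the next engagement.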

Learn

The most painful thing for me as an Application Security specialist, and no doubt for our penetration testers, is seeing different teams in the same company make the same mistakes. Show your penetration testers you are able to realise their true value to you, beyond just doing another engagement, by coming back stronger next time.

Don't lock up the penetration testing reports. While these are not for everyone in the business to see, I encourage the technical report to be shared across all engineering teams, and ops / SRE teams too.

Applaud quickly recovering from and remediating issues just as much as not having them in the first place. While a clean test report is nice, when you have findings that you fix, you are learning.

Trace report findings back to support articles and entries in your coding standards. Help the engineers get all the support they need to be educated and not repeat history.

I hope these ideas can help you and your teams work more closely with external pen testers. In due course I will have a follow-up blog post containing the YouTube video of the conference talk that supports this post.

As always, I'd love to hear your thoughts and any ideas you have, be it from the engineering side or as a penetration tester. I feel we must get both teams working more closely together so they have a much better experience and deeper learning.

The best way to do this is to reach out to me on Twitter: @sparkleOps.