A truly independent awards programme

DevOps Industry Awards judges are appointed based on their extensive experience in the DevOps field. These seasoned professionals, all of whom currently hold senior management roles, ensure that each entry is judged fairly and accurately.

To ensure complete impartiality, all entries are judged anonymously: company and individual names, products, and references to any identifiable solution or service are removed before the judges receive them.

This stringent process means that every award is won purely on merit. So, whether an entrant is a vendor, academic, end user or consultant, and regardless of company size, budget, customer base, market share or influence, the DevOps Industry Awards truly is an independent awards programme that recognises and rewards outstanding achievement.


Chair of the Judging Panel

Paula Cope
Director of Quality Assurance, DevOps, Release and Environments
Deutsche Bank AG

Judging Panel 2020

Andrew Dalmeny
Director of DevOps

Darren Griggs
Director of Engineering
Vanquis Bank

Bhavik Gudka
Director, Software Engineering
Capital One

Dávid Jámbor
Head of Systems Engineering
Vodafone UK

Rachel Jones
Head of IT Operations
HM Land Registry

Danny Myers
Head of Equities DevOps
J.P. Morgan Asset Management

Declan O’Gorman
Head of Enterprise Engineering
Royal Bank of Scotland

Andrew Sheppard
Chief Digital Information Officer
HM Revenue & Customs

Jérôme Tassel
Director, TV & BB Services and Systems Engineering

Judges’ feedback

The following are comments gathered from the 2017, 2018 and 2019 Judging Panels after they had reviewed all entries. They may help you decide what information to include in your entries.

As a whole, the entries did not take user experience into account as much as the Judging Panel would have liked.

Additionally, entrants should remember that they are being judged by a panel of industry peers, and so would do well to pitch the entry to that audience. A few entries were considered “overly simplistic.”

Entries that fared better tended to:

  • give explicit evidence of project success (time/money saved, etc.)
  • show empirical evidence of what the problem was, what they changed, and what they measured to demonstrate success
  • show willingness to adopt more modern DevOps practices and show some initiative to research how others are improving
  • include voices of customers/clients, which was helpful in showing successful outcomes
  • include a strong introduction summarising the project (timelines/scope) and what the expected outcomes were/what the business stakeholders were looking for
  • give strong evidence of communication skills in the ‘best individual’ categories
  • give strong evidence of work/involvement outside of their main organisation in the ‘best individual’ categories
  • emphasise the role of DevOps throughout the SDLC
  • demonstrate holistic approach to DevOps and upskilling of team members
  • show commercial awareness
  • not be overly perfect – there is no such thing as a perfect project – it was interesting to hear about the occasional blip in the road and how the team worked to overcome it
  • clearly discuss project challenges and how they were overcome
  • give context to metrics to fully justify their inclusion
  • avoid metrics that are broadly regarded as bogus (e.g. simply quoting test case numbers) and instead provide a range of metrics that demonstrated success

Weaker entries tended to:

  • not focus on a project, or come across as too much of a sales pitch
  • not give evidence of the project’s purpose/scope/timeline/success
  • not consider the larger picture
  • not justify the reasoning behind including metrics
  • list a large number of acronyms, tools or technologies (this just distracted from the original problem and the eventual outcome)
