Judges’ feedback

The following comments were gathered from the 2020 Judging Panels after they had reviewed all entries. They may help you decide what information to include in your entries.
As a whole, the entries did not address the criteria as fully as the Judging Panel would have liked. Additionally, entrants should remember that they are being judged by a panel of industry peers and so would do well to pitch their entry to that audience. A few entries were considered “overly simplistic.”

Strong Entries

  • Emphasised the business importance and criticality of the project, and clearly identified what was innovative about the approach being adopted.
  • Clear project goals coupled with quantified outcomes/successes; adaptability to mitigate unexpected challenges/risks; and the use of a wide variety of testing approaches and techniques aligned with the complexity of their environments.
  • Very clear evidence of the project methodology and justification of technology choices, with a clearly detailed description of the stakeholders’ needs, the importance of the project, and the overall goals.
  • Explained well the approach to selecting tools and the reasons for selecting them. The tools selected represent best practice, and the results speak for themselves.
  • Gave context to metrics to fully justify their inclusion; challenges were described at length and in good detail.
  • Very clear about exactly what technology they adopted, why they did so, and the challenges they faced both technically and from a skills perspective. The benefits to both product development and product support were clearly articulated.
  • Good leadership traits and some positive results from across commercial engagements; evidence of personal learning, engagement with the broader DevOps community, and an approach to systemic embedding.
  • Focus on quality first, shift left, and automation of testing and deployment, along with the important cultural factors: collaboration, teamwork, and a real coming together of minds, experience, and different dynamics.
  • Good business context and a good account of the issues and challenges faced by the organisation. Highlighted the cause and effect of the challenges faced across a number of departments.
  • Evidence of real and significant change to roles in the organisation and how these changes improved value within the delivery capability. Really highlighted the challenge of cultural shift within an organisation, and brought to life the level of personal investment required by individuals to achieve that shift.
  • Strong evidence of a quality approach, especially the standard of their own people and the service they provide to customers. Went above and beyond with their focus on ‘capability building’, forming partnerships, and soft skills, on top of their technical prowess; free, ongoing “DevOps assessments” were a good example of value-add activity (certificate outages).
  • Very strong evidence of external engagement (hackathons, networking events, meetups, speaking events, government advisory bodies, etc.) and genuine thought leadership.
  • The vendor seems to operate on a solid cultural foundation of L&D and knowledge sharing, and this is reflected in their willingness to engage with the wider DevOps community.

Weak Entries

  • Did not describe challenges well, talking about business, project, or architectural challenges rather than challenges in their automation journey. Unable to justify their choice of implementation approach.
  • Did not cover all the criteria defined, which made for slightly weaker applications.
  • Lacked detail around the approach to security. A diagram summarised the tools used, but there was limited detail on what challenges were encountered and how they were overcome through automation.
  • Did not demonstrate evidence of delivering on time, within budget, or of engagement with stakeholders, nor reflect back on goals to establish the project’s ultimate success.
  • Concentrating on the merits of a tool or method, rather than on the actual project deliverable, is less likely to score highly.
  • Did not justify the reasoning behind the metrics included.
  • Missed some important stages in their CI/CD pipelines that other entries incorporated, such as static code analysis, vulnerability management, and a monitoring framework.
  • Very little technical information in the application on which to judge how and why they implemented what they did.
  • Focused too much on application changes or the agile process but could not articulate how DevOps played a key role; some could not demonstrate how they implemented DevOps principles such as investment in CI/CD, building cross-functional teams, etc.
  • Leant completely towards detailing a technical implementation rather than the primary purpose of DevOps, which is to provide not only an engineering benefit but also a business/customer benefit.