How to Achieve Successful 360-Degree Reviews
Over the last year or so we have completed quite a number of 360-degree review assignments for our clients, and we'd like to share some of our experiences with you.
We've generally been engaged to design, administer and report on 360-degree reviews for individual assessment against competency frameworks. They are also used for individual performance assessment, and the 'rules' for success are pretty much the same in both cases.
We think that the rules for success look like this:
Platform, data storage and online access
Select a software platform that will meet your current needs as well as your potential future ones. Might you want to repeat reviews in the future and compare outcomes? Might you want to expand from competency assessment into performance management and/or succession management? Might you want to change your questions or rating scales later, or offer the review in languages other than English?
A good platform choice will pay dividends; a poor one will hurt you. At ODI we use Centranum TMS for our client engagements: it is eminently programmable, modular, New Zealand-based, and supported by a highly experienced registered psychologist.
Testing
Set up dummy reviewers and reviews so you can test the whole process, from the invitation to review, through the review itself, to report generation. Work on the basis that anything that can go wrong will go wrong. Engage others in the testing to get diverse user perspectives.
Review questions
Take a reviewer-centric view of the questions to make sure they are clear, unambiguous and readily understood by the reviewers; this will improve the quality of responses as well as the usefulness of the review outcome reports. Professional help from an organisational psychologist to get the language right can add significant value.
Ratings and comments
Use a rating scale language that works for your business (you may already have a favourite one). An odd number of rating points plus a ‘not observed’ option works best; either five or seven points gives a meaningful spread.
You can rate either on a frequency basis (none of the time, some of the time, all of the time, etc.) or on a degree basis (poor, satisfactory, very good, excellent, etc.), depending on your situation. Professional advice on the rating scale best suited to you can pay dividends.
Ratings should be compulsory for each question, with ‘not observed’ giving the necessary flexibility; this helps prevent reviewers from skipping the harder questions. Comments should be encouraged but not made compulsory, as not every reviewer will be confident enough to offer useful comments.
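By way of illustration, here is a minimal sketch in Python of the rating design described above: a five-point frequency scale plus a ‘not observed’ option, with the rating compulsory and the comment optional. The scale labels, field names and validation rule are our own assumptions for the example, not settings from Centranum TMS or any other platform.

    # Illustrative sketch only: a five-point frequency scale plus 'not observed',
    # with a compulsory rating and an optional comment for each question.
    FREQUENCY_SCALE = {
        1: "None of the time",
        2: "Some of the time",
        3: "About half of the time",
        4: "Most of the time",
        5: "All of the time",
    }
    NOT_OBSERVED = 0  # always offered, so reviewers are never forced to guess

    def validate_response(rating, comment=""):
        """A rating (or 'not observed') is compulsory; a comment is optional."""
        if rating != NOT_OBSERVED and rating not in FREQUENCY_SCALE:
            raise ValueError("Each question needs a rating or 'not observed'.")
        return {"rating": rating, "comment": comment.strip()}

    # Example: a reviewer rates a question 4 and adds a brief optional comment.
    print(validate_response(4, "Keeps the team well informed."))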
Reviewer selection
Surround each reviewee with a suitable mix of reviewers: self, manager(s), peers and team members. If reviewees select their own reviewers, moderate their choices so the outcomes are truly representative.
Anonymity
Reviewers need to feel safe right from the beginning of the process. Reinforce that their feedback is important and anonymous, and make sure you walk the talk on that.
Review instructions
Provide simple and clear instructions to reviewers about how to access the online review portal and how to step through the survey. State (accurately) how long each review is likely to take. Test your instructions to make sure they work and are as simple as they can be.
Some reviewers will get stuck in the process; run a helpdesk while the reviews are open so you can support them through to completion.
Reviewer follow-up
Some reviewers won’t ‘get around to it’ in the timeframe you want. Reviewer-specific follow-ups within the review period will encourage completions, but don’t overdo it: stalking generally doesn’t help. In particular, don’t follow up reviewers who have already completed; always work from up-to-date completion information.
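To show what working from up-to-date completion information can look like, here is a minimal Python sketch. The reviewer addresses and the reminder step are hypothetical, and in practice the completion list would come from your review platform’s own data.

    # Illustrative sketch only: remind only those reviewers who have not yet
    # completed, working from the latest completion information each time.
    def reviewers_to_remind(invited, completed):
        """Return the invited reviewers who have not yet completed their reviews."""
        return sorted(set(invited) - set(completed))

    invited = ["alex@example.com", "bea@example.com", "chris@example.com"]
    completed = ["bea@example.com"]  # refresh this immediately before each follow-up

    for reviewer in reviewers_to_remind(invited, completed):
        # Replace this print with your actual reminder email or platform notification.
        print(f"Reminder queued for {reviewer}")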
Tailored reports
Design your review reports so they are clear and unambiguous, taking a reviewee-centric viewpoint. Introductory notes are very useful to reinforce what the review is about, and what it is not. Make sure the notes and tables are true to the language of the questions and don’t introduce new concepts at the reporting stage.
Have electronic copies available for your organisation’s records, but give each reviewee a bound hard copy of their individual report; this encourages future use of the information.
Feedback
Reviewees will vary in how sensitive they are to the disclosure of review outcomes. It’s best to hand out review reports only when they are accompanied by a planned feedback session, and feedback should be provided by an experienced leadership coach or organisational psychologist.
If you're interested in running a 360-degree review in your organisation, contact Nicky on 021 133 1201 or email info@odi.org.nz.