It is a well-known fact that selecting the wrong software package can doom an ERP implementation before it gets started. The good news is that plenty of information and tools exist for managing software evaluations; all you need to do is look for them. These cover the evaluation steps to perform, how to define requirements, how to prepare an RFI, and independent insight into just about any ERP package.
TWELVE TIPS TO MANAGE SOFTWARE DEMOS
However, much less has been written about one of the biggest pitfalls in selecting ERP systems: the failure to properly manage the software demonstrations.
The “demos” are where the rubber meets the road. If they are not done correctly, the evaluation will cost more and take longer than necessary, the selection process will turn into a software beauty contest, and bias and subjectivity will taint the evaluation team (and the results). In that case, even after doing everything else right, you may still end up with a terrible package.
Formal software demonstrations should begin with the short-list of two or three packages. If three packages are under consideration, it is best to reduce the list to two as early as possible. The inputs to the demonstrations include high-level “must have” business needs, detailed requirements for each module, and scripts to guide the demonstrations (demo scripts). The following are some tips for managing the demos:
1. For each package, plan two rounds of software functionality demonstrations (lasting roughly 4-5 hours each) and one short wrap-up session.
This should be sufficient time to complete the functionality evaluation and tie up loose ends. Beyond this point, the law of diminishing returns quickly sets in: you are simply trading the issues of one package for those of another, and typically none of them has any bearing on the final decision. As a rule, an evaluation that takes more than five or six months is taking too long.
2. Establish a clear agenda for each session (including a timetable for each topic).
Work with the vendors to establish the agenda, but do not let them determine it. The agenda should be identical for every vendor for a given module, at least in the first round. In addition, it is important to focus first on the requirements that drive the evaluation in each module: the unique requirements and those weighted highest in priority.
3. Do the vendor introductory meeting and basic software navigation first.
This initial meeting allows the vendor to get their standard sales pitch out of the way and demonstrate the navigational aspects of the software to the entire evaluation team (instead of redundantly covering these topics in each module demonstration).
4. Create an equal forum for all vendors.
For a given round of demonstrations, the meeting place and delivery method (sales team on-site or remote) should be the same for each vendor. This is about leveling the playing field. For example, if one vendor demonstrates a module on-site and another covers the same module remotely, who do you think has the opportunity to leave the better impression? That impression may have little to do with the software. Encourage all vendors to be on-site for the demos.
5. Request that each vendor send their A-Team (people who really understand the software and how to demonstrate it).
If the demonstrator is not familiar with the software, he or she can make a good package appear inadequate. This is not necessarily just the vendor’s problem since it can become yours if the wrong package is selected. It is in your best interest to understand what each package can really do.
6. Educate the team on how to apply the package scoring method.
Most methods for evaluating software functionality include a scoring system that quantifies the extent to which the software addresses a particular business need. The team needs to understand how to apply that system so they can properly score what they see in the software and in the vendor's responses to specific questions.
For example, when a vendor responds to one of your requirements with statements such as “minor software change”, “write a report”, “future release” or “I will get back to you” (and they don't), score the item a zero (or close to it). Too often, teams give the vendor more credit than deserved when the software does not address a business need right out of the box. Also, make sure the team understands it is part of their responsibility to seek clarification from the vendor when responses are unclear. (A simple sketch of how such scores might be tallied appears after this list.)
7. If one vendor uses sample data representative of your business for the demos, make the other vendors do the same (or have no vendor use company data).
Otherwise, one vendor gains an unfair advantage. All vendors understand that using your company's data for a demo makes their software look “less foreign” and more appealing to the evaluation team. However, this has nothing to do with what the software can do.
8. Make sure all vendors perform demonstrations with a copy of the production version of the software and with the technologies typically deployed.
The goal is to avoid demos that use software, databases, and other technologies not representative of the actual product. Though not common, some vendors modify screens and program code prior to demonstrations to make the software appear to meet requirements.
9. Require the vendor to demonstrate the software release to be implemented (usually the current release).
When a vendor wants to demonstrate a previous release, this could mean the current release is not ready for prime time. Check references from customers using the current release. If the information you obtain is troubling, run away from the package as quickly as possible.
10. The evaluation project manager must be an interpreter.
Though the vendor and the team are speaking to each other during the demo, this does not mean they are communicating. Vendors and users come from different worlds and use different terminologies, and meaning can get lost in translation. It is the project manager's job to ensure that communication is truly occurring.
11. The evaluation project manager must be a referee.
This includes ensuring adherence to the agenda, ensuring that all evaluation team members participate, and ensuring that the vendor truly answers the questions and actually demonstrates the software.
12. Conduct a team follow-up meeting immediately after each demonstration.
Allow time at the end of each demonstration for the team to discuss what they learned, reconcile differences, score the package, and document follow-up questions. Of course, this segment of the meeting does not include the vendor.
When this is not performed immediately following each demo, the team will be hard-pressed to remember what they saw. In addition, two people attending the same demo can walk away with very different perceptions.
Independently, each team member records the score for each requirement during the demo. Afterwards, individual scores are discussed and reconciled into a single team score for each requirement. When a consensus score is not possible, it becomes a follow-up item.
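To make the scoring guidance in tips 6 and 12 concrete, here is a minimal sketch of how a weighted tally and consensus check might work. Everything in it is assumed for illustration only: the 0-to-4 scale, the priority weights, the requirement names, and the divergence threshold are hypothetical and are not taken from any particular evaluation framework, so substitute whatever scoring method your own framework prescribes.

```python
# Minimal sketch of a weighted scoring tally with a consensus check.
# Assumptions (hypothetical): scores run 0 (not supported out of the box)
# to 4 (fully supported); each requirement has a priority weight; items
# where team members' scores diverge too widely become follow-up items.

from statistics import mean

# Hypothetical requirements and their priority weights (higher = more important)
weights = {"lot traceability": 5, "multi-currency GL": 3, "drop-ship orders": 4}

# Scores recorded independently by each team member during the demo
member_scores = {
    "lot traceability":  [4, 3, 4],
    "multi-currency GL": [1, 0, 1],   # "future release" answers score at or near zero
    "drop-ship orders":  [4, 1, 3],   # wide spread -> needs discussion / follow-up
}

DIVERGENCE_THRESHOLD = 2  # assumed cutoff for "consensus not reached"

weighted_total = 0.0
max_total = 0.0
for requirement, scores in member_scores.items():
    spread = max(scores) - min(scores)
    if spread >= DIVERGENCE_THRESHOLD:
        print(f"FOLLOW-UP: {requirement} (scores {scores} too far apart to reconcile)")
        continue  # exclude until the team agrees on a single score
    team_score = mean(scores)          # reconciled team score for this requirement
    weighted_total += weights[requirement] * team_score
    max_total += weights[requirement] * 4  # 4 = top of the assumed scale

print(f"Package fit: {weighted_total / max_total:.0%} of weighted maximum")
```

The point of the sketch is the discipline, not the arithmetic: each member scores independently, the team reconciles to one score per requirement, and anything that cannot be reconciled on the spot is documented as a follow-up question for the vendor rather than quietly averaged away.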
If you like my articles here at ERP.com, you will love my new book "Control Your ERP Destiny". It is the first ERP book to present strategies and techniques that enable companies to take charge of their ERP projects to drive success. The book is endorsed by several industry leaders and was Amazon's No. 1 new release in the Computer and Technology Project Management category in June 2012. It is also available at Amazon UK, Borders AU and as an e-book at Google Books.