1. The tool is generally the least important part of the process.
In my white paper on More Data/Less Cost, I speak about the three Masters you need to create perfect reports – the Business Master, the Tool Master, and the Data Master. The Business Master and the Data Master are by far the more important roles – the tool is the smallest part of the equation. Once you understand the data and the business requirements, both Microsoft SSRS and Crystal Reports are simply ways to get a well-designed query to the end user. Getting the query right is what takes the training and work – anyone who is pretty good with Excel can figure out how to change formatting or sorting.
2. Tools don’t make messes, people make messes.
I see so many instances in which correlation is confused with causation – i.e. “We bought a new tool and things got much better.” But that doesn’t mean that the tool was the issue – as we say, nine times out of 10 we can get you the reports you need from the software you already have. If you look closely you’ll usually find that it wasn’t the tool change that mattered; it was that people changed their processes. Reporting had been an ad hoc affair, and the new tool was the occasion for a new set of controls and processes that ensured reports were tested thoroughly before being used. That’s often what actually makes the difference.
3. Be honest about who’s going to use the tool.
When you evaluate tools, you need to be realistic. If you buy a home range and you’re not just trying to impress your friends, you ignore what’s used in restaurants because you’re never going to plate 200 meals a night. If you want to impress people you’ll buy the most expensive, shiniest thing you can find – but that’s never our goal where reporting is concerned.
The great promise of Crystal Reports is that end users can write their own reports, that they don’t have to wait for IT or work with geeky developers they don’t like. Indeed, that’s the biggest of the (many) lies that software salespeople tell. While it sounds good, it almost never happens. People know this, but they still evaluate tools with that holy grail in mind. Is it easy for a user to understand? Can a user make it work? That’s ridiculous, because all that user will probably ever do is run a report. Their feedback on ease-of-use is pretty meaningless.
I’ll be continuing this series after the New Year, but keep in mind that, while tools may be different, they are almost never the key to the success or failure of a reporting project.