Benchmarking


Benchmarking is “the practice of recognising and examining the best industrial and commercial practices in an industry or in the world and using this knowledge as the basis for improvement” (Naylor 1996). Benchmarking is used worldwide and no doubt receives attention due to the tangible nature of its underlying methodology. It has long been utilised in the UK NHS and, whilst it may therefore have comparable value to the case study, the benefits and potential pitfalls of the technique in this arena are well reported (Bullivant 1996). Indeed, Black et al (2001) warn of the potential dangers of externally imposed benchmarks and an over-reliance on performance indicators as measures of quality. Again, the situation of externally imposed benchmarks has relevance to the case study, where standards are set by an external body.

Whilst restrictions on space prevent discussion, it is worth noting that other techniques such as the Six Sigma method (see Behara et al 1995), and a systems approach (see Johnson et al 1995) also exist to measure quality.
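To make the Six Sigma method slightly more concrete, a minimal sketch follows of a defects-per-million-opportunities (DPMO) calculation, the basic quantity on which Six Sigma quality levels rest. All figures here are invented for illustration and are not drawn from the case study or from Behara et al (1995).

```python
# Illustrative sketch only: the defects-per-million-opportunities (DPMO)
# metric used in Six Sigma programmes. All numbers are hypothetical.

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# e.g. 15 documentation errors found across 2,000 patient records,
# each record offering 5 opportunities for error
print(dpmo(15, 2000, 5))  # 1500.0
```

A DPMO figure can then be mapped to a sigma level via standard conversion tables, allowing processes of very different kinds to be compared on one scale.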


In conclusion, whilst a universally agreeable definition of service quality remains elusive, it is clear that service quality generally equates to an evaluation, albeit an indefinable one, of the superiority of a service. Indeed, any attempt at a more exacting description than this is futile, as would be any attempt to conclude which system for measuring quality is most effective. The basis of a preference for a particular method may be as varied as its extolled merits and deficiencies, and would appear to depend on the prevailing circumstances.

As Crosby (1979) famously stated, “Quality is free”. Well, is it? This may have been true of 1960s manufacturing, where end consumers directly compensated the manufacturer for their product and the costs of implementing quality control processes were therefore effectively covered by the financial savings brought about by lower defect rates, a reduced need for after-sales servicing and a decrease in warranty claims. However, in a 21st-century public sector service industry, where additional processes and requirements must be met from a limited budget, the majority of which is provided by government funding, and where the consumer does not always directly compensate the service provider, it may be argued that quality is not free. Indeed, it is apparent from the case study (s. 4.13) that the rural ambulance service is continually seeking a balance between the cost of providing a high quality service and the risk to the community should that quality be reduced. Thus, in real terms, the provision of a quality service may be very costly.

Consider, if you will, the situation where a rural ambulance service is required to meet previously defined response times. The only way these times may be achievable is by substantial investment in technology, or by an increase in the number and location of ambulance crews. Neither of these solutions comes without a price, whether financial, as alluded to by Crosby (1979), or political. Yet are response times the only indicator of quality and, if not, how can other aspects of quality be identified, defined and, thus, measured?

Historically, response times have been the holy grail of emergency service performance indicators and, at the time of the report, the Victorian Rural Ambulance Service (VRAS) relied predominantly on this factor to measure its effectiveness (s. 4.25). However, problems with accurately obtaining this measurement, related to differing approaches and inconsistencies (see s. 4.26), serve to highlight the danger of relying on a single measurement.
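One way to reduce the reliance on a single average figure is to summarise response times in more than one way, for instance as the proportion of incidents answered within a target alongside a high percentile. The sketch below illustrates this; the times and the 15-minute target are hypothetical and are not taken from the VRAS report.

```python
# Hedged sketch: summarising response-time performance with two complementary
# measures rather than a single average. All data here are invented.

def within_target(times_min, target_min):
    """Percentage of incidents answered within the target time."""
    return 100 * sum(t <= target_min for t in times_min) / len(times_min)

def percentile(times_min, p):
    """Simple nearest-rank percentile of the response times."""
    s = sorted(times_min)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

times = [8.2, 11.5, 14.0, 9.3, 21.7, 13.1, 16.4, 10.0, 12.2, 18.9]
print(within_target(times, 15.0))  # 70.0
print(percentile(times, 90))       # 18.9
```

Reporting both figures makes it harder for a small number of very slow responses to hide behind an acceptable-looking mean.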

It would also appear from the report that VRAS places great emphasis on the evaluation of financial performance as a means by which to judge overall performance. Whilst also a useful indicator, there are growing concerns that an over-reliance on this measurement can lead to long-term problems (Kloot 1999).

The other measurement used by VRAS to monitor quality, as highlighted by section 4.27 of the report, is the comparison of the number of external compliments and complaints received. Whilst it is accepted that this method is only used to provide a broad indication, it must be noted that this measurement is unsolicited and, as such, is reliant on the impulses of third parties. Indeed, if a patient is the recipient of poor service, they may not always be capable of raising a complaint at a later date.

It is evident from the report that there is a preference for the use of benchmarking as the means by which quality is to be measured. To this end, it is necessary to identify performance indicators that are capable of being benchmarked in order to assess quality. Indeed, as alluded to above, the accurate collection of data relating to response times would be one of several suitable measurements. However, whilst the measurement of response times is undoubtedly an important factor in providing a quality service, it is by no means the only factor.

The VRAS’s main responsibilities are given as the maintenance of a suitable standard of clinical care in providing the initial healthcare response, ensuring the operation of an effective communication system, and determining the location and availability of ambulance service resources in order to provide a timely and quality ambulance service (s. 2.4). As such, any initial benchmarks should seek to measure the performance of these responsibilities. It is important to note that benchmarking is only as effective as the data on which it is based and it is apparent from the report that, historically, there have been problems with collecting quality data. As such, a crucial first step after the identification of possible areas for benchmarking would be to establish means of accurate and consistent data collection relating to the respective areas. The importance of this step is recognised by Bullivant (1996) in his review of benchmarking in the UK NHS.
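Once accurate and consistent data collection is in place, the benchmarking step itself amounts to comparing locally measured indicators against the externally set standards. A minimal sketch of that comparison follows; the indicator names, targets and figures are all hypothetical, chosen only to mirror the three responsibility areas above, and are not taken from the report.

```python
# Illustrative only: comparing locally collected indicators against
# externally set benchmarks and flagging shortfalls. All names and
# figures below are hypothetical.

benchmarks = {
    "pct_responses_within_target": 90.0,  # timeliness of initial response
    "clinical_audit_pass_rate":    95.0,  # standard of clinical care
    "dispatch_data_completeness":  99.0,  # effectiveness of communications
}

actual = {
    "pct_responses_within_target": 86.5,
    "clinical_audit_pass_rate":    96.2,
    "dispatch_data_completeness":  92.0,
}

for indicator, target in benchmarks.items():
    gap = actual[indicator] - target
    status = "meets benchmark" if gap >= 0 else "below benchmark"
    print(f"{indicator}: {actual[indicator]:.1f} vs {target:.1f} ({status})")
```

The value of such a table depends entirely on the reliability of the underlying data, which is precisely why consistent collection must precede any benchmarking exercise.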

Furthermore, in relation to communications, the report highlights the current governmental objective to apply computer aided dispatch systems to all of the State’s emergency service operations in order to improve various aspects of the service’s quality (s. 4.34 – 4.36). This point neatly illustrates how technology can be used to improve quality. However, there can, of course, be no guarantee. One need look no further than the London Ambulance Service’s well-publicised and disastrous attempts to computerise their dispatch system (Page et al 1993). Indeed, this point also serves to emphasise the earlier submission that quality is not always free; take, for example, the huge cost of installing a new computer system.

In relation to the standard of clinical care, the report identifies the importance of maintaining the required level of clinical expertise amongst ambulance officers. Yet, rather than merely maintaining a standard, continual improvement with a view to long-term development has been identified as an important factor in providing a quality service in a healthcare setting (Gummesson 2001). Indeed, using the example of the Shouldice Hospital in Canada, Gummesson also points to patient follow-ups as a vital aspect in the provision of a quality service.

Gummesson’s (2001) point serves to illustrate the dangers of relying on internal benchmarks alone as a means by which to measure quality; a point well made by Black et al (2001). Whilst internal benchmarking and performance measurements, many of which are provided in the report (see table 6B), can promote accountability to stakeholders (Kloot 1999), they fail to measure the expectations and perceptions of those very same people.

It is noticeable from the report that one of the potential uses of benchmarking is given as ‘making the community aware of the quality of service it can expect’ (s. 4.15). Indeed, as suggested by Roledo (2001), if service quality is dependent upon customer expectation then we ought to consider a more active approach to the management of customer expectation. It is strange, then, that whilst the report recognises the importance of customers’ expectations, it stops short of suggesting that they be measured. Expanding the ideas of Roth and Griffi (1994), Everett et al (1997) concluded that “The customer specifies quality, and his or her satisfaction is the basis for measuring quality performance”. As such, it may be argued that VRAS needs to adopt one of the previously discussed techniques for obtaining direct customer input, rather than relying on benchmarking alone, as would largely appear to be the case.
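The measurement of customer expectations alluded to above is commonly operationalised as an expectation–perception gap score, in the style of gap-based instruments such as SERVQUAL: each respondent rates both what they expected and what they perceived on the same scale, and quality is summarised as the mean difference. The sketch below is illustrative only; the ratings are invented and the method is a generic gap score, not one prescribed by the report.

```python
# Hedged sketch of an expectation-gap ("SERVQUAL-style") score.
# quality gap = perception - expectation, averaged over respondents.
# A negative mean indicates perceptions fall short of expectations.
# All ratings below are invented for illustration.

def mean_gap(expectations, perceptions):
    """Mean of per-respondent (perception - expectation) gaps."""
    gaps = [p - e for e, p in zip(expectations, perceptions)]
    return sum(gaps) / len(gaps)

expect   = [7, 6, 7, 5, 6]   # ratings on a 1-7 scale
perceive = [6, 6, 5, 6, 5]
print(mean_gap(expect, perceive))  # -0.6
```

Tracking such a score over time would give VRAS a customer-based quality measure to set alongside its internal benchmarks.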

On this point, the report does highlight VRAS’s commitment, at the time of the report, to obtain operational accreditation under quality standard ISO 9002. Accreditation will, if approved, ensure the continuing development of standard operating procedures which should, in turn, improve overall quality. Indeed, recent research by Dick et al (2002) led them to conclude that service firms that consider quality accreditation (ISO 9000) to be important “have an increased usage and emphasis on both the internal (conformance) and customer-based (exception-gap) quality measures”.

