Chapter X

Towards a QoS-Focused SaaS Evaluation Model

Xian Chen, Abhishek Srivastava and Paul Sorenson

Department of Computing Science

University of Alberta

 

X.1 Motivation

In the past decade, the growth of web service technologies and the emergence of service-oriented architectures (SOAs) have added tremendously to the increasing maturity of the Internet and the software industry. These advancements make it possible for software vendors to deliver effective software applications as web-based services using a new delivery model called Software-as-a-Service (SaaS). In simple terms, SaaS is a model of software deployment where an application is hosted as a service provided to customers across the Internet [1]. By eliminating the need to install and run the application on the customer's computer, SaaS alleviates the burden of software maintenance, ongoing operation, and client support for the customer. Conversely, customers relinquish control over software versions or changing requirements; moreover, costs to use the service become a continuous expense, rather than a single expense at time of purchase. SaaS applications are generally charged on a per-user basis and are shared by multiple independent customers [2]. Under SaaS, the service customer receives the benefits of the software, with clearly understandable costs, at a contractually defined service level [3].

While successful commercial SaaS applications like Salesforce.com and Google Apps are now deployed, tools and approaches to assist organizations in evaluating and planning for SaaS opportunities are not yet widely available. This chapter provides a model framework for evaluating SaaS applications based on quality of service characteristics, and forms the basis for a toolset to assist the IT planning process.

X.2 Service System Quality Management

In studying SaaS evaluation, our focus on quality management is motivated by two basic assumptions about the nature of service system delivery.

1.     Service systems operate most effectively when both the service customer and the service provider understand and actively engage in the co-creation of value [4].

2.     Service system improvement is best achieved when the major service quality factors are mutually agreed upon, tracked, managed, analyzed and acted upon by the service customer and service provider.

The goal of service quality management is to provide lower cost, better products and services, and higher customer satisfaction. Traditionally, if a service provider understands what a customer wants from a service (typically defined through detailed specifications based on the customer's requirements), manages the variables in the service delivery process that can lead to deviation from specifications, and delivers the service in accordance with the customer's stated requirements, the service system is properly managed with respect to service quality [5]. In practice, however, a dynamic approach must be used in managing service quality due to continuous changes in the cost of service delivery, customer requirements and the emergence of new technologies. When existing customer expectations are not met, a new expectation benchmark must be set and service re-evaluation undertaken. The need is growing for evaluation models to assess service quality on an ongoing basis and to improve and accelerate decision-making related to the adoption of software services in general and SaaS applications in particular, given their rapidly increasing adoption [6].

Unfortunately, most current quality management approaches for SaaS services focus on the perspective of service providers, and thus do not fully take into consideration the collaborative nature of the two basic assumptions given at the beginning of this section. Approaches such as SERVQUAL [7], the American Customer Satisfaction Index (ACSI) [8], and the Balanced Scorecard [9] incorporate the viewpoint of customers, but often not in combination with the provider's viewpoint. What is absent from the existing literature is an approach that adequately combines the perspectives of both provider and customer together with the nature of their ongoing business relationship. Therefore, at a general level we are interested in addressing the following research problems: (1) the exploration of an integrated model that takes into account the shared nature of service quality in SaaS systems, and (2) how best to track and improve service quality by applying the model.

X.3 SaaS Maturity Models

In the process of developing the foundation of our SaaS evaluation model we explored a number of related models for assessing service system delivery and management. They are characterized as Service Delivery Models and their approaches are summarized in section X.7 of this chapter. These models are relevant and complementary to SaaS evaluation; however, their scope is broader than SaaS systems and is primarily concentrated on service delivery from the perspective of the service provider. In this section we review the two main SaaS maturity models that have been proposed to date.

 

X.3.1 Microsoft SaaS Maturity Model

Microsoft introduced the first widely published SaaS maturity model in 2006 [10]. A four-level SaaS maturity model was proposed mainly to assess the maturity of single-packaged SaaS applications. According to the model description, SaaS applications can be classified by three key architectural attributes: configurability, multi-tenant efficiency, and scalability. Each level is distinguished from the previous one by the addition of one key attribute. A brief explanation of each level is as follows [10] (an illustrative encoding of the levels is sketched after the list):

•	Level 1: Ad-Hoc/Custom: Each customer has a customized version of the application and runs its own instance of the application on the servers hosted by the provider. Migrating a traditional non-networked or client-server application to this level typically requires the least development effort and cuts down operating costs primarily by consolidating server hardware and administration.

•	Level 2: Configurable: The second maturity level provides greater application flexibility through configurable metadata that enable customers to use separate instances of the same application code. This allows the provider to meet the different needs of each customer through detailed configuration options, while simplifying maintenance and updating of a common code base.

•	Level 3: Configurable, Multi-Tenant-Efficient: At the third maturity level, the provider adds multi-tenancy support to the second-level capabilities, enabling a single application instance to service all customers. This approach allows better use of the provider's server resources without any apparent difference to the customer.

•	Level 4: Scalable, Configurable, Multi-Tenant-Efficient: Better overall scalability for the provider's service delivery is the goal at the fourth level. This is typically achieved through a multitier architecture supporting a load-balanced farm of identical application instances, running on a variable number of servers. Effectively, a "cloud computing" [11][12] approach is adopted by the provider to support a set of application instances. The capacity of the provider's system can be increased or decreased dynamically to match demand by adding or removing servers, without requiring changes to the application software.
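To make the attribute-based level definitions concrete, the following minimal Python sketch encodes the three key attributes and derives the corresponding maturity level. The class and function names are our own illustration and are not part of the published model.

```python
# Hypothetical encoding of the Microsoft SaaS maturity model (section X.3.1).
# Attribute and function names are our own illustration.
from dataclasses import dataclass

@dataclass
class SaaSArchitecture:
    configurable: bool   # attribute added at Level 2
    multi_tenant: bool   # attribute added at Level 3
    scalable: bool       # attribute added at Level 4

def microsoft_maturity_level(arch: SaaSArchitecture) -> int:
    """Each level adds one key attribute to the previous one."""
    if arch.configurable and arch.multi_tenant and arch.scalable:
        return 4  # Scalable, Configurable, Multi-Tenant-Efficient
    if arch.configurable and arch.multi_tenant:
        return 3  # Configurable, Multi-Tenant-Efficient
    if arch.configurable:
        return 2  # Configurable (separate instances, shared code base)
    return 1      # Ad-Hoc/Custom (one customized instance per customer)

print(microsoft_maturity_level(SaaSArchitecture(True, True, False)))  # -> 3
```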

 

X.3.2 Forrester SaaS Maturity Model

Forrester's model, the other major SaaS maturity model, provides guidance on strategy transformations for software vendors working with service providers who are considering a SaaS business model. This model classifies the maturity of SaaS solutions on five levels, according to the way a SaaS system is delivered [13]:

•	Level 0: Outsourcing. In outsourcing, a service provider operates one application or suite of applications for a large customer organization. Typically an outsourcing provider is obligated under contract to the one customer and cannot directly leverage that customer's application for a second customer. Because of this restriction outsourcing does not qualify as SaaS, and thus this level is not considered a formal maturity level. It is included as level 0 because SaaS providers often launch their business operations through outsourcing arrangements with a few preferred customers.

•	Level 1: Manual ASP (Application Service Provider) Service. The model at this level mainly targets midsize companies. An ASP hosts packaged applications (e.g., SAP and PeopleSoft ERPs) for multiple customer organizations. Typically, the service provider allocates to each customer a dedicated server running that customer's instance of the application. This gives the provider the ability, as deemed necessary, to customize the installation in the same way as self-hosted applications.

•	Level 2: Industrial ASP Service. At this level, an ASP introduces advanced IT management software to provide an identical packaged application, with customer-specific configuration options, to many small-to-medium sized customer organizations. A key element of the industrial ASP service is that the core elements of the software package are the same for all customers, and therefore a significant amount of the operating costs can be shared amongst multiple customers.

•	Level 3: Single-app SaaS. From this level on, SaaS capabilities are built into the business applications. These include web-based user interface access to all services and the ability to serve a great number of customers with one scalable infrastructure. Single-application SaaS adoption focuses on small-to-medium size businesses. As with the industrial ASP service of level 2, the only way to customize the application is through configuration. Salesforce.com's customer relationship management (CRM) application initially entered the market at this level [13].

•	Level 4: Business-domain SaaS. At this level, the SaaS provider offers not only well-defined business applications but also a platform supporting additional business logic. This allows the single-app SaaS of level 3 to be augmented with third-party packaged SaaS solutions and optional customized extensions. The model can now satisfy some of the requirements of large enterprises by migrating a whole business domain, such as "customer care", to a SaaS solution.

•	Level 5: Dynamic Business Apps-as-a-Service. At this level, Forrester's model claims that a new Dynamic Business Application imperative, "design for people, build for change", is embraced. Advanced SaaS providers coming from level 4 will offer a comprehensive application as well as an integration platform on demand, and pre-populate the platform with business applications or business services. Customer-specific and even user-specific business applications can be composed dynamically at various levels. The resulting process agility should be attractive to everyone, including large enterprise customers.

There are similarities and some distinct differences between the two SaaS maturity models from Microsoft and Forrester. Both models describe a set of greater capabilities needed by the SaaS provider to manage common software architectures and infrastructure as the levels of maturity increase. Microsoft's model focuses on the increased capabilities of a SaaS deployment through the re-architecting of single application packages delivered on common infrastructure. These capabilities are embodied in three key attributes: configurability, multi-tenant efficiency, and scalability. Forrester's model takes an evolutionary approach that provides prescriptive guidance to software vendors and service providers in the transformation of enterprise-wide software. If we restrict our attention to single-application deployment of SaaS, levels 1 through 3 have significant similarities in the two models. The major difference at level 4 is the support for software across an entire business domain in Forrester's model. Level 5 of Forrester's model appears to have no counterpart in Microsoft's model. A scan of the SaaS literature indicates that there is likely no SaaS implementation in existence today that would be rated at Forrester's level 5.

An important observation about these SaaS maturity models is that neither focuses on quality of service. Without the ability to assess quality of service delivery, the decision makers (i.e., the customers and the providers) will have a difficult time planning and managing service improvements. In addition, these models largely ignore the perspective of the service customer, and only emphasize what the service provider can do. It is our strong belief, based on the two fundamental assumptions about service systems identified in section X.2, that it is necessary to incorporate the perspectives of both service provider and service customer in any SaaS evaluation model.

X.4 Quality in SaaS Business Relationship

In this section, we introduce the notion of quality as it applies to service delivery and then discuss how quality is often expressed or realized as part of an ongoing SaaS business relationship.

X.4.1 Quality Definitions

The definition of "quality" has been addressed and debated for a long time in a number of academic and industrial publications [14][15][16][17][18]. Of these, we have chosen to focus on the one developed by David Garvin [16], in which he identified five major perspectives on the definition of quality: transcendent, product-based, user-based, manufacturing-based and value-based. We have found that for software services it is difficult to separate product (the software system) from service (the deployment or actual "manufacturing" of the system as a service). For quality of service, we therefore consider only the following four perspectives:

•	Conformance quality: Equivalent to many aspects of a combination of Garvin's product-based and manufacturing-based perspectives, focusing on conformance to specifications. Typically the focus is internal and on determining that performance matches original design specifications, often expressed in service level agreements (SLAs). Approaches that can be applied to manage conformance quality include (1) QoS specification languages [19], in which the quality requirements, quality capabilities and quality agreements are expressed, and (2) service level standards such as IT Service CMM [20] and ITIL [21].

•	Gap quality: Equivalent to Garvin's user-based perspective, focusing on whether customer expectations are met or exceeded. This is the most pervasive definition of quality, particularly as applied to business management. Most approaches to gap quality use the Gaps Model of Service Quality [22], which measures the gaps explicitly by considering both customer perceptions and expectations. These approaches include SERVQUAL [7], ACSI [8] and TechQual+ [23].

•	Value quality: Equivalent to Garvin's value-based perspective, focusing on the direct benefits (value) to the customer. It is a universal measure for widely different types of objects, and can be an appropriate guideline for continuous quality improvement. Approaches to value quality introduce more business-oriented measurements, such as productivity, Return on Investment (ROI) and risk estimates, and provide greater insight into business goals.

•	Excellence quality: Equivalent to Garvin's transcendent perspective, focusing on recognition of excellence. It stresses the features and characteristics of quality, but these may change dramatically and rapidly. In IT services, excellence quality is marked by uncompromising standards and high performance, and can be used directly in promises and advertising. It is therefore usually externally defined and hard to relate to quality improvement.

Because of the difficulty in using excellence quality to identify quality improvement opportunities, we focus only on the first three definitions of quality in our work [24][25].

X.4.2 Quality Management in a SaaS Business Relationship

Basic to any SaaS deployment are business relationships between the provider organization and the various customer organizations to which the provider delivers its services. Two of these relationships, presented from a provider organization's view, are shown pictorially in Fig. 1. The relationships, labeled Conformance Quality and Gap Quality, are depicted as measures in the diagram. These are measures that should be managed by the SaaS provider as part of their business relationship with their customers. In most service arrangements, conformance quality is expressed as service levels agreed to with the customer. With SaaS, service levels are often advertised in advance as part of the provider's marketing strategy and finalized under contract when a service sales agreement is reached with the customer. Therefore, in SaaS, conformance quality aspects such as volume (transactions per minute), response time and availability of service are usually negotiated and agreed to up front between the production department (responsible for running service support) and the marketing and sales departments of the provider organization.
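As a concrete illustration of conformance-quality tracking, the following minimal Python sketch checks measured QoS values against SLA targets. The attribute names, thresholds and measurements are invented for the example; they are not drawn from any particular SaaS contract.

```python
# Minimal sketch of conformance-quality tracking against SLA targets.
# Attribute names, targets and measurements are hypothetical.

sla = {
    "availability_pct":     ("min", 99.5),   # service up >= 99.5% of the time
    "avg_response_ms":      ("max", 30.0),   # average response time <= 30 ms
    "transactions_per_min": ("min", 1000),   # throughput >= 1000 tpm
}

measured = {
    "availability_pct": 99.7,
    "avg_response_ms": 34.2,
    "transactions_per_min": 1180,
}

def conformance_violations(sla, measured):
    """Return (attribute, measured value, target) for every SLA breach."""
    violations = []
    for attr, (direction, target) in sla.items():
        value = measured[attr]
        conforms = value >= target if direction == "min" else value <= target
        if not conforms:
            violations.append((attr, value, target))
    return violations

print(conformance_violations(sla, measured))
# -> [('avg_response_ms', 34.2, 30.0)]
```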

Providers are also involved in gap quality measurements with customer organizations. Typically, quality concerns related to ease of use, response to failures, and user training are determined by the provider using survey tools involving the customers. This form of user input identifies gaps between what the customers are experiencing in using a service and what they would like to be experiencing. This feedback is critical if a provider wishes to improve their service.

Figure 1: Provider organization view of a SaaS business relationship

The view of SaaS business relationships from the customer's perspective is shown in Fig. 2, in which two relationships are depicted. The first, named Functional Needs, expresses the users' requirements for supporting their workplace activities in the customer organization. The Business Units of the customer organization usually consult with their users to determine whether these service requirements can be met through a service offering from one or more SaaS providers.

The second relationship, labeled Value Quality, captures the value the customer organization places on deploying a service using SaaS. Although there is no universally accepted definition of value quality, common approaches for measuring it include cost-benefit analysis [26], ROI analysis [27], risk management [28], or combinations of these approaches using a balanced scorecard [9].
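To make these value measures concrete, here is a minimal sketch of ROI and benefits-to-cost calculations for a hypothetical SaaS deployment. All monetary figures are invented for illustration.

```python
# Illustrative value-quality calculations for the customer view (Fig. 2).
# The monetary figures are invented for the example.

def roi(benefit: float, cost: float) -> float:
    """Return on investment: net benefit relative to cost."""
    return (benefit - cost) / cost

def benefits_to_cost(benefit: float, cost: float) -> float:
    """Benefits-to-cost ratio (the inverse of the usual cost/benefit ratio)."""
    return benefit / cost

annual_saving = 120_000.0  # funds saved by deploying the SaaS service
annual_fee    =  75_000.0  # contract cost paid to the SaaS provider

print(f"ROI: {roi(annual_saving, annual_fee):.2f}")               # -> ROI: 0.60
print(f"B/C: {benefits_to_cost(annual_saving, annual_fee):.2f}")  # -> B/C: 1.60
```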

Figure 2: Customer organization view of a SaaS business relationship

X.5 Co-Creation of Business Value in a Service Relationship

The discussion of value quality in the previous section was from the perspective of the customer organization. But one of the fundamental definitions from the emerging area of service science [4] is that a "… service system is a value co-production configuration of people, technology, other internal and external service systems and shared information." The question that arises is: how is the notion of co-creation of value in a SaaS offering supported in value quality measures?

Let us explore this question by considering the possible co-value situations that can exist between a service provider and service customer organization. These situations can be represented in Fig. 3 where we express the customer and provider values respectively on simple x-y axes, each axis ranging in scale from low to high value. In general, the value measures for the provider and the customer are dependent on the nature of the service offering. For the purpose of this discussion, let us assume simplistically that the customer value is determined primarily by ROI (Return on Investment) analysis and the provider value is determined by the total profit (income after all expenses) from providing the service. In the diagram we have characterized the five regions with names that reflect the relative maturity of the service offering [29]. When a service is first developed it is typically done as a limited offering (or research prototype) based on research of market opportunities and the innovative application of new or advanced technologies or processes. From the perspective of value quality, the service provider sees low value (little or no profit) and a customer also sees low value because the prototype service is limited in functionality with little commitment to sustainability because of the trial nature of its deployment.

Figure 3: Phases of service delivery based on co-value to the customer and provider

Assuming the service is well received for its initial functionality and responsiveness, and its user base increases, the value (as determined by ROI) will increase for the customer. During the early stages of growing the service from prototype to an initial release in the marketplace, the value to the provider (profit) remains low or at best increases slightly.

Once the service takes hold in a marketplace and large numbers of customers acquire the service, the value for the provider (profit) increases substantially in proportion to the number of customers. The value to the customer (ROI) is very dependent on the costs associated with the delivery of the service within a growing marketplace. If there is little or no competition for the provider, we move to a monopoly service situation, typically generating higher costs and therefore lower relative value for the customer (ROI). Alternatively, the marketplace could quickly yield a healthy set of service providers, which should lead to increased value for customers (ROI) because the cost of service should not rise substantially, if at all. This stage, labeled the mature service, represents the situation when the co-value of the service business relationship for providers and customers is at its peak (we refer to it as a "win-win" value situation).

Note that it is rare for a software service marketplace to remain in a monopoly situation for an extended period because the capital investment for new providers to develop competitive services is usually not extensive. Therefore, generally for SaaS, a monopoly service should quickly transition to a mature service situation.

A fifth stage can occur when service competition increases for the provider and marketplace adoption becomes so widespread that the service becomes commoditized. At this commodity service stage, the value to the provider (profit) can decrease significantly because of decreased profit margins on a per-customer basis. The value to the customer can also decrease at this stage because the commoditized service is no longer a strategic advantage for the customer organization, which may have its own set of competitors.

The transition from a commodity service to a research prototype is represented as a dotted line to show that often a new provider organization creates a new service innovation that impacts the commoditized marketplace. This new service will begin its own service maturation process that can displace the commodity service in that marketplace. An example of this is the rise of email services in the last decade to replace much of the standard mail services that had been commoditized.

Of course, not all service offerings follow this form of "life cycle". Many new services do not make it past the prototype stage or linger in the initial release stage without garnering significant market presence. Some services, given the nature of their potential marketplace, may never be commoditized. Ideally, both service provider and service customer continue to seek ways of maintaining a "win-win" business relationship in which new or added co-value is continually being created for a service offering. At the core of the SaaS QoS model that we present in the next section are the characteristics of the business relationships between service customer and service provider.

X.6 Specifications of a QoS-Focused SaaS Evaluation Model

In this section we present our initial version of a SaaS QoS evaluation model and illustrate its features using existing SaaS applications. The model prescribes the quality of service approaches for four service classes based on the business relationships between the service provider and the service customer: Ad-hoc, Defined, Managed and Strategic. The model is summarized in Table 1.

Table X.1: Maturity levels of business relationships in SaaS services. Each level is characterized by the service customer goals, the service provider goals, and the quality approaches of the business relationship.

Level 1 (Ad-hoc)
Service customer goals: Functionality needs achieved.
Service provider goals: Service delivery on an "as needed" basis.
Quality approaches: Some quality measures may be in place.

Level 2 (Defined)
Service customer goals: Functionality needs achieved, with reliability and other desirable quality requirements guaranteed.
Service provider goals: Service delivery on a regular (defined) basis with defined capability.
Quality approaches: Conformance quality measures (SLAs defined and tracked).

Level 3 (Managed)
Service customer goals: Goals of Level 2 plus agreement on monitoring of service quality assurance.
Service provider goals: Service delivered with configurable capability; shared responsibility to monitor and manage service quality factors.
Quality approaches: Conformance plus gap quality measures.

Level 4 (Strategic)
Service customer goals: Proper governance of the service to ensure value goals are defined and achieved, using approaches such as cost-benefit analysis, ROI analysis and risk management.
Service provider goals: Dynamic delivery with the shared goal, with the customer, of service improvement.
Quality approaches: Conformance, gap and value quality measures.

X.6.1 SaaS Maturity Levels

Ad-hoc Service

A SaaS service is called Ad-hoc if it is used by a customer on an as-needed basis in response to business requirements. The goal of the service customer is to ensure that the service meets the critical needs of its users. Typically few, if any, QoS attributes are tracked by the provider on behalf of the customer. Examples of Ad-hoc services are Amazon.com and Expedia.com when used widely in an organization to facilitate book and travel purchases respectively.

Defined Service

A SaaS service is called Defined if it is described in a contract or an agreement that outlines service usage and guarantees the service level capabilities, typically through Service Level Agreements (SLAs). The QoS concerns focus on measurable, performance-oriented factors such as availability and responsiveness. A good example of a Defined service is Google Apps [30] Enterprise Edition, which has a defined SLA focusing on availability. Another example is SAP's Business ByDesign [31], which provides SaaS capabilities for ERP-level applications (integrated accounting, supply chain, HR, CRM, etc.). SAP also provides an SLA focused on availability.

Managed Service

A SaaS service is called Managed if it is a Defined service with additional agreed-upon commitments by both the customer and the provider to share the responsibilities of managing the service. Examples of shared responsibilities include monitoring the service quality and refining the service to meet changing quality requirements. A good example of a Managed service is Salesforce.com's CRM (Customer Relationship Management) service. Salesforce.com provides customization and integration capabilities that allow customers to set up their own unique CRM service and to share customer-developed applications. Salesforce.com also supports tracking of service issues and commitments.

Strategic Service

A SaaS service is called Strategic if it is a Managed service in which both the customer and provider are able to identify the common, agreed-upon business value of deploying the service. Typically the decision to adopt a strategic service is based on business value analyses such as cost-benefit analysis, ROI (Return on Investment) analysis and/or risk analysis. We have not found a good example of a Strategic service among today's SaaS solutions, since we do not yet see the application of business value analyses in SaaS service management.

Fundamental to our model is the increasing role service quality measures play in the business relationship as this relationship moves from Ad-hoc to Strategic. In an Ad-hoc service, there is little or no emphasis on QoS measures. A Defined service includes conformance quality measures, a Managed service adds gap quality to conformance quality measures, and a Strategic service includes value quality measures as well as conformance and gap quality measures. The goal of both SaaS providers and customers is to increase the depth of their business relationship as the service offering moves from Ad-hoc to Strategic. One way to operationalize this progression is sketched below.
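In the following minimal sketch, the maturity level of the business relationship follows from which families of quality measures are in place. The encoding is our own illustration of Table X.1, not a normative algorithm.

```python
# One way to encode Table X.1: the maturity level of a SaaS business
# relationship follows from the quality measures in place.

def relationship_maturity(conformance: bool, gap: bool, value: bool) -> str:
    """Map the quality measures in use to a maturity level of Table X.1."""
    if conformance and gap and value:
        return "Level 4: Strategic"   # conformance + gap + value measures
    if conformance and gap:
        return "Level 3: Managed"     # conformance + gap measures
    if conformance:
        return "Level 2: Defined"     # conformance measures (SLAs) only
    return "Level 1: Ad-hoc"          # little or no emphasis on QoS measures

print(relationship_maturity(conformance=True, gap=True, value=False))
# -> Level 3: Managed
```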
X.6.2 QoS-Value Graphs – an Instrument for the QoS-Focused SaaS Evaluation Model

For our model to be used effectively in the planning of IT services, it must be more than just descriptive. In particular, instruments must be available to support the definition, tracking and analysis of the value quality for each QoS attribute that is agreed upon by the provider and customer. Let us consider the following example scenario to illustrate how one such instrument, the QoS-value graph, can assist in a key element of the model: the determination of co-value using QoS attributes.

Assume that an agreed-upon QoS attribute is average response time for a set of five important service components of a service offering. We can represent in a QoS-value graph (Fig. 4) the relative value of different average response times for the customer and the provider. In this graph the customer prescribes that response times of less than 15 milliseconds have the highest value. The customer value decreases in a linear fashion for average response times between 15 and 30 milliseconds, dropping to zero value for average response times greater than 30 milliseconds. For the provider, the value related to response time performance is primarily determined by their capability to meet response time demands with their delivered service. The value curve in Fig. 4 indicates that it is impossible for the provider to deliver an average response time of less than 10 milliseconds for the current service offering. Between 10 and 17 milliseconds, the provider value increases rapidly, representing a technology space that could be achieved if significant costs were invested in improving the current service system. Beyond 17 milliseconds, the provider value continues to increase in a linear fashion, representing decreasing response-time requirements for the SaaS provider.

Figure 4: QoS-value curve with QoS attribute of average response time

By using an instrument such as the QoS-value curve, provider and customer can share important information to assess co-value opportunities and arrive at an agreement over average response time commitments. In the case depicted in Fig. 4, one could imagine the provider and customer arriving at a decision to use 17 milliseconds as the basis for an ongoing service level agreement. The sketch below illustrates this co-value determination.
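The following sketch models the two value curves of Fig. 4 as simple piecewise-linear functions and searches for the response-time target that maximizes the combined (co-)value. The curve shapes follow the text above, but the exact slopes and the "maximize the sum of the two values" rule are our own simplifying assumptions.

```python
# Sketch of the QoS-value graph instrument of Fig. 4. Curve shapes follow
# the text; exact slopes and the co-value rule are simplifying assumptions.
import numpy as np

def customer_value(rt_ms):
    """Highest value below 15 ms, linear decline to zero at 30 ms."""
    return np.clip((30.0 - rt_ms) / (30.0 - 15.0), 0.0, 1.0)

def provider_value(rt_ms):
    """Zero below 10 ms (infeasible), steep rise up to 17 ms, gentle rise after."""
    return np.where(rt_ms < 10.0, 0.0,
           np.where(rt_ms <= 17.0, 0.7 * (rt_ms - 10.0) / 7.0,
                    np.minimum(1.0, 0.7 + 0.01 * (rt_ms - 17.0))))

rt = np.linspace(5.0, 40.0, 701)        # candidate response-time targets (ms)
co_value = customer_value(rt) + provider_value(rt)
best = rt[np.argmax(co_value)]
print(f"co-value is maximized near {best:.1f} ms")  # -> near 17.0 ms
```

Under these assumptions, the search lands at 17 milliseconds, matching the service level agreement basis suggested above.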

These curves can be used for all QoS attributes that are deemed most important in any business relationship involving a strategic service. As another example of the QoS-value curve instrument, consider the use of benefit-cost analysis, a value quality attribute, as part of a strategic service partnership. Fig. 5 represents the situation when the benefits-to-cost ratio is adopted as a QoS attribute to be defined and tracked. Note that we have decided in this example to invert the usual cost-to-benefits ratio into a benefits-to-cost ratio, because it is easier to conceive of customer or provider value increasing as the QoS attribute increases. For a SaaS service offering, the customer benefits are the funds saved by deploying the service and the costs are primarily the funds defined in the service contract with the SaaS provider. For the SaaS provider, the benefits would be primarily based on the funds received from the customer for delivering the service, and the costs would be the funds required to operate the service. The QoS-value curve shows that for the provider there is a narrow region (1.6-1.8) of the benefits-to-cost ratio in which the value increases significantly. This represents the situation in which the benefits outweigh the costs by a comfortable margin, enough to ensure that the service relationship yields real value for the provider.

For the customer, a benefits-to-cost ratio is of no value until it reaches slightly above 1. The customer value then increases somewhat until a ratio of 1.8, at which point it increases significantly in a linear fashion. Assuming a strategic service relationship is sought and, therefore, co-value creation is an overriding goal, provider and customer can share their QoS-value curves to help determine a viable cost range for the service offering, one that allows the provider to make a reasonable profit and the customer to garner significant value from the service. A sketch of such curves is given after Fig. 5.

 

Figure 5: QoS-value curve with QoS attribute of Benefits to Cost Ratio
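Continuing the sketch, the curves of Fig. 5 can be modeled the same way. The breakpoints follow the text (customer value near zero until the ratio exceeds 1, rising steeply after 1.8; provider value jumping inside the 1.6-1.8 region), while the specific slopes are invented.

```python
# Illustrative value curves for the benefits-to-cost (B/C) ratio of Fig. 5.
# Breakpoints follow the text; the specific slopes are invented.

def customer_bc_value(bc: float) -> float:
    if bc <= 1.0:
        return 0.0                           # no value until B/C exceeds 1
    if bc <= 1.8:
        return 0.3 * (bc - 1.0) / 0.8        # modest rise between 1.0 and 1.8
    return min(1.0, 0.3 + 0.7 * (bc - 1.8))  # steep linear rise beyond 1.8

def provider_bc_value(bc: float) -> float:
    if bc < 1.6:
        return 0.1                           # thin margins, little value
    if bc <= 1.8:
        return 0.1 + 0.8 * (bc - 1.6) / 0.2  # narrow region of steep increase
    return 0.9                               # comfortable margin reached

for bc in (1.0, 1.5, 1.7, 2.0):
    print(bc, round(customer_bc_value(bc), 2), round(provider_bc_value(bc), 2))
```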

Additional tools and capabilities for our QoS-focused SaaS evaluation model are being planned; these are outlined in the final section.

X.7 Related Work in Service Delivery and Management

In the past decade there has been growing interest in the definition of maturity models and specifications of best practices in the general area of IT service management and delivery. This work is relevant and complementary, but does not apply directly to our narrower focus on SaaS evaluation presented in this chapter. For completeness, we include a summary of this work in this section.

Niessink et al.'s IT Service CMM (IT Service Capability Maturity Model) [20] is a service maturity model that enables IT service providers to assess and improve their capabilities with respect to IT service delivery. The structure of the model is similar to that of CMU/SEI's Software CMM (Capability Maturity Model), with five maturity levels: Initial, Repeatable, Defined, Managed and Optimizing, yet the contents focus on the key process areas needed for provisioning mature IT services. The model also introduces suitable and practical assessment approaches to determine and improve the maturity of the organization. However, this approach aims only at the implementation of service processes within IT organizations, and largely ignores the important role of the service customer.

The Office of Government Commerce's (OGC) Information Technology Infrastructure Library (ITIL) [21] is a framework of best practices in information technology, primarily focusing on IT service strategy, design, transition, operation, and improvement. In the past decade, ITIL has been adopted worldwide as one of the most popular service level standards in IT organizations. Instead of using ordered levels and process areas, ITIL organizes the processes as areas of best practices and describes the details of process implementation and activities. The emphasis in ITIL is on the delivery of IT services in-house by the information technology department. ITIL provides some general guidance on sourcing strategies and externally delivered services.

The adoption of SOA solutions in IT requires more specific maturity models to assess SOA implementations and identify SOA business value. Sonic Software's SOA Maturity Model (SOA MM) [32] is one such model, defining maturity levels with key business impact within the organization. Inaganti and Aravamudan [33] extended the model to cover five aspects: scope of SOA adoption, SOA maturity levels, SOA expansion stages, return on SOA investment, and SOA cost effectiveness and feasibility. Other SOA maturity models specialized for different areas of IT services include IBM's Service Integration Maturity Model (SIMM) [34] and HP's SOA domain model [35].

X.8 Conclusion and Future Work

This chapter provides the basis for a QoS-focused SaaS evaluation model. The key contributions are the definition of a four-level SaaS system maturity model and the inclusion of the QoS-value graph instrument for use with this model. Important aspects of this work include the recognition that SaaS evaluation must take into account the generation of co-value by both provider and customer, and that additional tools are needed to assist both parties in assessing and improving service quality on an ongoing basis.

Further research is needed into tools that can be adapted to SaaS service offerings to automatically collect many of the QoS attributes agreed to as part of a provider/customer SaaS agreement. The evaluation model should also support regular reporting of QoS non-conformances and of trends in service support (both positive and negative). Effort is also needed to integrate our work on SaaS evaluation with the evaluation of other service offering approaches, including in-house services and other forms of external services such as outsourcing. Finally, we are investigating evaluation support for selecting the best (or currently most viable) SaaS offering among similar offerings by multiple providers. This work involves a weighted multi-QoS-attribute approach, illustrated below, that could potentially allow the service selection decision to be delayed until just before the service is needed.
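As a sketch of the weighted multi-QoS-attribute selection just described, the following fragment scores competing offerings by a weighted sum of normalized QoS attribute values. The providers, attributes, weights and scores are all invented for illustration.

```python
# Hypothetical weighted multi-QoS-attribute selection among SaaS offerings.
# Providers, attributes, weights and normalized scores are invented.

weights = {"availability": 0.5, "response_time": 0.3, "price": 0.2}

offerings = {   # normalized attribute scores in [0, 1], higher is better
    "ProviderA": {"availability": 0.95, "response_time": 0.70, "price": 0.60},
    "ProviderB": {"availability": 0.90, "response_time": 0.85, "price": 0.80},
}

def weighted_score(scores, weights):
    """Weighted sum of normalized QoS attribute scores."""
    return sum(weights[attr] * scores[attr] for attr in weights)

best = max(offerings, key=lambda name: weighted_score(offerings[name], weights))
print(best)  # -> ProviderB (0.865 vs. 0.805 for ProviderA)
```

Because the scoring is computed on demand from current attribute values, such a scheme would allow the selection decision to be deferred until just before the service is needed.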

Acknowledgements: Paul Sorenson would like to thank Norm Pass and Jim Spohrer of the IBM Almaden Research Center for several interesting discussions on co-value creation and service innovation that took place during a fall 2008 visit to Almaden. These discussions helped form the basis for our QoS-value graph instrument. We also wish to acknowledge the Natural Sciences and Engineering Research Council (NSERC) of Canada for funding support of this research.

References

[1] SaaS.com, Improving Human Productivity Through Software as a Service, www.SaaS.com.

[2] B.J. Lheureux, R.P. Desisto, and M. Maoz, "Evaluating Software-as-a-Service Providers: Questions to Ask Potential SaaS Providers," Gartner RAS Core Research Note, Apr. 2006.

[3] B. Waters, "Software as a service: A look at the customer benefits," Journal of Digital Asset Management, Vol. 1, No. 1, pp. 32–39, Jan. 2005.

[4] J. Spohrer, P. Maglio, J. Bailey, and D. Gruhl, "Steps Toward a Science of Service Systems," Computer, IEEE Computer Society, pp. 71–77, Jan. 2007.

[5] V.J. Peters, "Total service quality management," Managing Service Quality, Vol. 9, No. 1, pp. 6–12, 1999.

[6] M. Alvarez, "Gartner Predicts Great Growth in SaaS Adoption," Oct. 2008, www.atelier-us.com/e-business-and-it/article/gartner-predicts-great-growth-of-saas-adoption

[7] A. Parasuraman, V.A. Zeithaml, and L.L. Berry, "SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality," Journal of Retailing, Vol. 64, No. 1, pp. 12–40, 1988.

[8] C. Fornell, M.D. Johnson, E.W. Anderson, J. Cha, and B.E. Bryant, "The American Customer Satisfaction Index: nature, purpose, and findings," Journal of Marketing, Vol. 60, No. 4, pp. 7–18, Oct. 1996.

[9] R.S. Kaplan and D.P. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, Aug. 1996.

[10] F. Chong and G. Carraro, "Architecture Strategies for Catching the Long Tail," Microsoft Corporation, Software as a Service Architectural Guidance Series, Apr. 2006, http://msdn.microsoft.com/en-us/library/aa479069.aspx

[11] G. Gruman and E. Knorr, "What cloud computing really means," InfoWorld, Apr. 2008, http://www.infoworld.com/article/08/04/07/15FE-cloud-computing-reality_1.html

[12] P. Gaw, "What's the Difference Between Cloud Computing and SaaS?" Web 2.0 Journal, Jul. 2008, http://web2.sys-con.com/node/612033

[13] S. Ried, J.R. Rymer, and R. Iqbal, "Forrester's SaaS Maturity Model: Transforming Vendor Strategy While Managing Customer Expectations," Forrester Research, Aug. 2008.

[14] P. Hernon and D.A. Nitecki, "Service Quality: A Concept Not Fully Explored," Library Trends, Vol. 49, No. 4, pp. 687–708, Mar. 2001.

[15] C.A. Reeves and D.A. Bednar, "Defining quality: alternatives and implications," Academy of Management Review, Vol. 19, No. 3, pp. 419–445, Jul. 1994.

[16] D.A. Garvin, "What Does 'Product Quality' Really Mean?" Sloan Management Review, pp. 25–43, Fall 1984.

[17] A. Parasuraman, V.A. Zeithaml, and L.L. Berry, "A conceptual model of service quality and its implications for future research," Journal of Marketing, Vol. 49, No. 4, pp. 41–50, 1985.

[18] B.W. Tuchman, "The decline of quality," New York Times Magazine, pp. 38–41, 104, Nov. 1980.

[19] G. Dobson, "Quality of service in Service-Oriented Architectures," 2004, http://digs.sourceforge.net/papers/qos.pdf

[20] F. Niessink, V. Clerc, T. Tijdink, and H. van Vliet, "IT Service CMM, Version 1.0, Release Candidate 1," 2005, http://www.itservicecmm.org/

[21] Office of Government Commerce, Service Delivery, IT Infrastructure Library, The Stationery Office, 2001, http://www.itil-officialsite.com/home/home.asp

[22] V.A. Zeithaml, A. Parasuraman, and L.L. Berry, Delivering Quality Service: Balancing Customer Perceptions and Expectations, New York: The Free Press, 1990.

[23] T. Chester, F. Miller, and D.A. Trinkle, "Service Quality Assessments with Higher Education TechQual+," Educause Annual Conference, 2007.

[24] X. Chen and P.G. Sorenson, "Towards TQM in IT Services," Proc. of the 2007 Workshop on Automating Service Quality (held in conjunction with ASE 2007), Atlanta, pp. 42–47, Nov. 2007.

[25] X. Chen and P.G. Sorenson, "A QoS-based Service Acquisition Model for IS Services," Proc. of the 6th Workshop on Software Quality (held in conjunction with ICSE 2008), Leipzig, Germany, pp. 41–46, May 2008.

[26] Mind Tools, "Cost/Benefit Analysis: Evaluating Quantitatively Whether to Follow a Course of Action," http://www.mindtools.com/pages/article/newTED_08.htm

[27] CIO Council, "The Value of IT Investments: It's Not Just Return on Investment," http://www.cio.gov/documents/TheValueof_IT_Investments.pdf

[28] Vose Software, "Introduction to Risk Analysis," http://www.vosesoftware.com/

[29] Marketing Teacher, "The Product Life Cycle," http://www.marketingteacher.com/Lessons/lesson_plc.htm

[30] Google Corporation, "Google Apps Service Level Agreement," 2009, http://www.google.com/apps/intl/en/terms/sla.html

[31] SAP AG, "SAP Business ByDesign: The Most Complete and Adaptable On-Demand Business Solutions," 2008, http://www.sap.com/solutions/sme/businessbydesign/overview/index.epx

[32] Sonic Software Corporation, "A New Service-Oriented Architecture (SOA) Maturity Model," 2005, http://www.sonicsoftware.com/solutions/service_oriented_architecture/soa_maturity_model/index.ssp

[33] S. Inaganti and S. Aravamudan, "SOA Maturity Model," BPTrends, Apr. 2007.

[34] A. Arsanjani and K. Holley, "Increase Flexibility with the Service Integration Maturity Model (SIMM): Maturity, Adoption, and Transformation to SOA," IBM developerWorks, Sep. 2005, http://www.ibm.com/developerworks/webservices/library/ws-soa-simm/

[35] Hewlett-Packard, HP SOA Maturity Model, 2007, https://roianalyst.alinean.com/calculators/hp/hpsoa/HP_SOA_Maturity_Assessment.html