MAKOKHA, FRANKLINE

Project Title
A VENDOR NEUTRAL QUALITY OF SERVICE MONITORING MODEL FOR SOFTWARE AS A SERVICE CLOUD COMPUTING SOLUTIONS
Degree Name
DOCTOR OF PHILOSOPHY DEGREE IN COMPUTER SCIENCE
Project Summary

The high uptake of cloud computing services has been accompanied by an increase in the number of cloud service providers, who offer comparable solutions marketed at different prices and at distinct Quality of Service (QoS) levels. This presents a decision challenge to users of those cloud services, who have to select among or compare the available providers in terms of the performance of their cloud solutions. Although computational models for developing QoS measurement tools exist, they are not vendor agnostic, which hampers cross-vendor functionality comparison.

The decision challenge is compounded by the design of the current cloud QoS monitoring framework, in which monitoring results are stored on the cloud provider's platform for the user to query later. This raises trust issues over the results during service level agreement evaluation.

Further, the rise in the number of cloud service providers, each claiming to offer better services than its competitors, has reinforced the need for cloud service users to verify the QoS measurements produced by the different providers' tools.

Given the unavailability of vendor-neutral cloud QoS measurement tools, and because the available tools are vendor specific, cloud clients have no option but to depend on the tools offered by their own cloud service providers. Where a user engages multiple providers for the same services, with the aim of increasing redundancy, it becomes difficult to compare the QoS results obtained from the various providers, since the measurement tools cannot be ported across platforms.

To ease the decision challenge and enable cross-cloud performance comparison, various research efforts have produced candidate solutions, such as the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), Heterogeneous Similarity Metrics (HSM), the Event Based Multi Cloud Service Applications Framework, the Multiple-Cloud Monitoring platform, the MUSA framework and the PRECENSE framework.

While this body of research attempts to address cross-cloud performance comparison, its shortcoming is a reliance on existing vendor-specific tools, customized for a particular cloud provider's infrastructure and then spread across different cloud providers, or in some instances on customized agents installed on the various providers' platforms.

Another notable shortcoming of the solutions intended to address the service choice overload problem is that they rely on synthetically generated data, rather than on the actual data generated while the platform is in use, and on historical data from past clients.

This research addressed the existing gap by developing a cloud QoS monitoring framework from which a vendor-agnostic cloud QoS monitoring model can be developed. The focus was on Software as a Service (SaaS) cloud computing solutions. In developing the framework, the research focused on the location of the QoS monitoring tool, the intention of monitoring, and the mode of access to the cloud services.

The research established that, to realize a vendor-neutral model, the cloud QoS monitoring tools could be shifted from the cloud provider's platform to the client devices, with Internet browsers used as the mode of access to the services.
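
The measurement side of this shift can be illustrated with a short sketch. The snippet below samples a service's response time from the client using the browser's Fetch and Performance APIs; the endpoint URL, the HEAD-request probing strategy and the sample shape are illustrative assumptions, not the thesis implementation.

```javascript
// Minimal sketch of client-side QoS probing from a browser context.
// Inside a browser extension with host permissions, cross-origin
// requests like this are permitted.
async function probeService(url) {
  const start = performance.now();
  try {
    const res = await fetch(url, { method: 'HEAD', cache: 'no-store' });
    return { ts: Date.now(), responseMs: performance.now() - start, up: res.ok };
  } catch (e) {
    // A failed request is recorded as an availability outage sample.
    return { ts: Date.now(), responseMs: null, up: false };
  }
}

// Example use (hypothetical endpoint):
// probeService('https://example-saas.com/health').then(console.log);
```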

This culminated in the development of a vendor-neutral SaaS cloud QoS monitoring tool, prototyped as a browser extension and implemented using JavaScript, Node.js and an SQLite database. To confirm its applicability, the tool was deployed on the Chrome browser. The monitoring tool developed from the vendor-agnostic model was then used to monitor the QoS of five international SaaS cloud companies: Salesforce, Google, HubSpot, Shopify and Microsoft.
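
A persistence sketch under the same stack follows. The thesis names Node.js and SQLite, but the better-sqlite3 binding and this schema are illustrative choices, not the prototype's actual code.

```javascript
// Sketch of storing probe samples in SQLite from the Node.js side.
const Database = require('better-sqlite3');
const db = new Database('qos_samples.db');

db.exec(`CREATE TABLE IF NOT EXISTS samples (
  provider    TEXT,
  ts          INTEGER,   -- Unix epoch milliseconds
  response_ms REAL,      -- NULL when the probe failed
  up          INTEGER    -- 1 = service reachable, 0 = outage
)`);

const insert = db.prepare(
  'INSERT INTO samples (provider, ts, response_ms, up) VALUES (?, ?, ?, ?)'
);

function recordSample(provider, s) {
  insert.run(provider, s.ts, s.responseMs, s.up ? 1 : 0);
}
```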

The QoS parameters monitored by the vendor-neutral tool were service stability, service response time and service availability, which are the main quantitative parameters for cloud QoS performance. For validation purposes, results from the vendor-agnostic tool and the cloud service providers' integrated tools were compared using a case study approach. The vendor-agnostic tool proved more useful than the providers' integrated tools in terms of the number of QoS parameters it could monitor, and in that it can be used for performance comparison across different cloud platforms.

The vendor-agnostic cloud QoS monitoring solution can measure all three main QoS metrics, namely service availability, service stability and service response time, unlike the cloud providers' solutions, which each cover only a subset of the quantitative metrics: HubSpot, G Suite and Salesforce can measure service availability only, while Shopify can measure service response time and service availability only.
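
How the three metrics can be derived from probe samples is sketched below. Treating stability as the standard deviation of response times is an assumption consistent with the fluctuation figures reported later in this summary, not a definition taken verbatim from the thesis.

```javascript
// Compute the three quantitative QoS metrics from a list of samples
// shaped like { responseMs, up }: availability (fraction of successful
// probes), mean response time, and stability as response-time spread.
function summarizeQoS(samples) {
  const ok = samples.filter(s => s.up);
  const times = ok.map(s => s.responseMs);
  const mean = times.reduce((a, b) => a + b, 0) / times.length;
  const variance =
    times.reduce((a, t) => a + (t - mean) ** 2, 0) / (times.length - 1);
  return {
    availability: ok.length / samples.length, // e.g. 1.0 for 100%
    meanResponseMs: mean,                     // average service response time
    stabilityMs: Math.sqrt(variance)          // smaller spread = more stable
  };
}
```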

Due to its vendor neutrality, the tool is also useful to cloud service users for performance comparison across clouds, for providers offering similar services, before the user decides which cloud provider to procure services from on a long-term basis.

The tool was applied to the Google Docs and Microsoft 365 cloud services for a performance comparison between 6 October 2020 and 27 October 2020, under the same computing platform and Internet conditions. From this comparison, the average service response time for Google was 4.47 seconds, while for Microsoft it was 6.04 seconds. Both platforms had an availability of 100%, since at no time during the experiment did either platform report a failure leading to a service outage.

Although availability was 100% for both, the fluctuation in service response time was higher for Microsoft, at 5.966 seconds, than for Google, at 2.003 seconds, meaning the Google platform was more stable than the Microsoft platform. To evaluate the trustworthiness of the results reported by the different cloud computing platforms, this research developed a quantitative trust model for evaluating the trust of the various cloud providers. The model compares the results from the vendor-neutral model with those from the cloud providers' embedded monitoring tools, using the common metric monitored by both solutions, service availability, together with a new parameter introduced by this research, the confidence interval.
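
A sketch of how such a trust check could be expressed follows, assuming a normal-approximation 95% confidence interval around the vendor-neutral tool's mean availability; the thesis's exact interval construction is not reproduced here.

```javascript
// Trust check sketch: a provider's self-reported availability is deemed
// trustworthy if it falls within a confidence interval built from the
// vendor-neutral tool's own availability samples. The z = 1.96 normal
// approximation (95% CI) is an assumption for illustration.
function providerIsTrustworthy(toolAvailabilitySamples, reported, z = 1.96) {
  const n = toolAvailabilitySamples.length;
  const mean = toolAvailabilitySamples.reduce((a, b) => a + b, 0) / n;
  const sd = Math.sqrt(
    toolAvailabilitySamples.reduce((a, x) => a + (x - mean) ** 2, 0) / (n - 1)
  );
  const margin = (z * sd) / Math.sqrt(n); // half-width of the interval
  return reported >= mean - margin && reported <= mean + margin;
}
```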

From the trust evaluation conducted between 6 October 2020 and 27 October 2020, the two compared cloud providers, Google and Microsoft, were both found to be trustworthy, since the results they reported fell within the confidence interval of those reported by the vendor-neutral model. In terms of overall performance, Google performed better than Microsoft. Where a decision is to be made on whose services to procure, the user can factor these performance measures into the decision-making process. Future work from this research could extend the model to monitor Infrastructure as a Service and Platform as a Service cloud solutions. Additional work could also focus on other components common to all cloud providers on the client side, for example the operating system, where the monitoring capability could be installed as an operating system utility.