I.J. Information Engineering and Electronic Business, 2021, 4, 47-54
Published Online August 2021 in MECS (http://www.mecs-press.org/)
DOI: 10.5815/ijieeb.2021.04.04
Web Performance Analysis: An Empirical
Analysis of E-Commerce Sites in Bangladesh
Md. Tutul Hossain, Rakib Hassan, Mahfida Amjad
Department of Computer Science & Engineering, Stamford University Bangladesh, Dhaka, Bangladesh
E-mail: {tutulhossain.cse, tipusultan9t7a}@gmail.com, mahfidaamjad@stamforduniversity.edu.bd
Md. Abdur Rahman
Centre for Advanced Research in Sciences (CARS), University of Dhaka, Dhaka, Bangladesh
E-mail: mukul.arahman@gmail.com
Received: 20 April 2021; Accepted: 06 June 2021; Published: 08 August 2021
Abstract: Performance testing of e-commerce sites is important for planning improvements and delivering a better user experience, and it is usually performed with web performance testing tools available online. Several such tools let users scan their sites. This paper presents a web based application that collects and compares performance results automatically by applying the WebPageTest, PageSpeed Insights and GTmetrix tools. Nine parameters are considered for the comparison: Load Time, First Byte, Start Render, First Contentful Paint, Speed Index, Largest Contentful Paint, Cumulative Layout Shift, Total Blocking Time and Time to Interactive. The framework is developed with PHP, MySQL, CSS and HTML, and the user provides the URL of the site whose performance is to be tested. This paper presents the performance of ten e-commerce sites of Bangladesh. Among the three tools, WebPageTest and GTmetrix can collect reports for all of the parameters. The lowest average values reported are 1.62 seconds (site7), 3.25 seconds (site4) and 1.89 seconds (site7) for WebPageTest, PageSpeed Insights and GTmetrix respectively. The averages of the three tools are also measured: the minimum value is 0.03 seconds, for 'Total Blocking Time' on site7, and the maximum value is 17.78 seconds, for the 'Load Time' parameter recorded by site10.
Index Terms: Web performance testing tools, scalability, speed, stability, load time.
1. Introduction
E-commerce sites handle a large number of users concurrently. As the number of users increases day by day, the performance of these sites plays a significant role in keeping users satisfied and thus keeping the site in good shape. Performance testing is a non-functional requirement that checks system parameters such as responsiveness and stability under varied workloads. This testing is performed with different scanners available on online platforms. These tools measure quality attributes of the system such as scalability, reliability and resource usage.
Performance testing tools offer several advantages: (i) they validate the basic features of the system, (ii) they measure the speed, accuracy and stability of the system, (iii) they detect discrepancies and help resolve problems, and (iv) they improve optimization and load capability. Performance testing also examines a site's behavior on multiple devices.
There are several existing works related to website performance analysis. For example, paper [1] discussed performance testing as well as other testing methods to handle website development challenges. A comparison among three automated user acceptance testing tools for web applications is described in [2]. The work [3] presents a load based testing tool that allows the performance analysis of web applications in terms of scalability. The research [6] presented a model based performance testing tool that measures the performance of web applications and services using a measurement technique, whereas the work [7] described an approach that automatically analyzes the execution logs of a load test for performance problems. The automation testing tool Selenium WebDriver is described in [8]. The paper [9] investigates open source web service testing tools that lack parameters such as response time, number of bytes processed and throughput. The paper [10] presents a comparison between Selenium and other tools to find the best tool and, on the basis of the results, a case study exploring the performance of Selenium. Although several performance analysis studies exist in the literature and discuss different web scanner tools such as PageSpeed Insights, WebPageTest and GTmetrix, an automated web framework is still needed to collect web performance data using different scanners and compare the results, which removes the user's hassle of running different tools against the same website.
To this end, a web based framework is presented in this research to collect web performance records, analyze and compare them, and finally store them for future use. Three reputed web performance analyzers have been incorporated to collect and compare the e-commerce site data. This framework shows how a website behaves and responds in various situations, and thus helps developers test the speed and stability of their websites.
The framework takes the URLs as input and produces the test results of each website using three different tools: (i) WebPageTest, (ii) PageSpeed Insights and (iii) GTmetrix. The results are stored in the database for further use and analysis. The system also creates a graphical comparison among all the sites, based on the considered performance parameters such as Load Time, First Byte and Start Render, using the stored results. In addition, registered users can save the comparison results.
The proposed system contains two modules, a frontend and a backend, developed with PHP, MySQL, CSS and HTML. After registering, the user interacts with the frontend module to collect website performance records. A user can scan the web performance records of a web application by providing the respective website URL, and the framework scans the given sites automatically using the three tools mentioned above.
According to the result analysis of the ten web applications, the lowest and highest average performance scores reported by the WebPageTest tool are 1.62 (site7) and 10.99 (site8) respectively. The PageSpeed Insights tool records 3.25 (site4) and 10.12 (site2) seconds as its minimum and maximum averages. However, this tool reports 0.00 seconds for 'Cumulative Layout Shift' on site8, which indicates either that the tool could not detect any record or that the site consumes 0 seconds for this parameter. Site2 records 24.10 seconds for the 'Time to Interactive' parameter, so this site takes more time to become interactive than the other experimented sites. The third tool, GTmetrix, reports 1.89 (site7) and 9.56 (site8) seconds as the lowest and highest average performance respectively.
Finally, the averages of these three tools are measured: the lowest value is 0.03 seconds, for 'Total Blocking Time' on site7, and the highest value is 17.78 seconds, for the 'Load Time' parameter recorded by site10. The three tools' lowest and highest average performance is 2.34 and 9.53 seconds for site7 and site8 respectively.
The main contribution of this performance testing framework is to collect performance data automatically and to generate graphs based on the collected data. In addition, a user can compare and save the results of two or more sites at a time with a single click.
The rest of this paper is organized as follows. Section 2 discusses the related work, Section 3 describes the empirical data collection, Section 4 illustrates the data analysis, and Section 5 concludes the paper with future research directions.
2. Related Works
There are some existing works related to website performance analysis. Different types of testing, such as functional testing, usability testing, performance testing, security testing, interface testing and compatibility testing, are discussed in [1] to help handle challenges during website development. A comparison among three automated user acceptance testing tools for web applications is described in [2]. The evaluation is based on the usability criteria of the tools in the context of acceptance test driven development. The three tools are TestComplete, Selenium WebDriver and Watir WebDriver, and they undergo usability testing against criteria that include efficiency, effectiveness, satisfaction and error. From the evaluation, Watir WebDriver is found to be the best testing tool based on the usability criteria of user acceptance testing.
The work [3] presents a web application load based testing toolset that allows the performance analysis of web applications in terms of scalability. It is based on a workload characterization generated from information extracted from log files. The workload is generated using customer behavior model graphs and is used to evaluate the web application under test; it is representative of the real traffic that the web application has to serve. In [6], the authors present a model based performance testing tool that measures the performance of web applications and services using a measurement technique. The tool uses models, defined as probabilistic timed automata, to generate workload in real time and measures different performance indicators. The paper describes how load is generated from the models and the features of the tool.
An approach that automatically analyzes the execution logs of a load test for performance problems is described in [7]. The authors first derive the system's performance baseline from previous runs and then perform an in-depth performance comparison against the derived baseline. Case studies show that their approach produces few false alarms, with a precision of 77%. The automation testing tool Selenium WebDriver is described in [8]. The paper [9] investigates open source web service testing tools that lack parameters such as response time, number of bytes processed and throughput, which are then added through modification of the tool.
The paper [10] presents a comparison between Selenium and other tools to find the best tool and, on the basis of the results, a case study exploring the performance of Selenium. An approach used to analyze web application services and report test results is depicted in [11]. The approach uses a genetic algorithm to generate virtual users that act as workloads. The generated workloads are applied to the system under test to measure various important performance indicators, and the generated test reports help testers measure the performance of web application services under different conditions.
Paper [12] describes a set of software tools to support the testing of web based applications. The toolset covers application model extraction, test execution automation and test design automation. In addition, a graph based application model is presented to model the behavior of web based applications; with this graphical representation, several traditional software testing techniques are extended to test web based applications.
The paper [13] proposes modeling infrastructures as a general solution to performance modeling problems and describes a particular infrastructure the authors developed: EMOD, a tool for modeling the performance of e-commerce sites. It shows how EMOD can be used to model a particular site and to validate its predictions. An automatic conformance testing tool with timing constraints, based on a formal Timed Extended Finite State Machine specification of a web services composition, is implemented in [14] through an online testing algorithm; this algorithm combines test execution and debugging to generate and simultaneously execute the test cases. The paper [15] describes the tool WSDLTest for automatic testing of web services; the tool can be used to test web services for which WSDL 1.1 or WSDL 2.0 documents are available.
The privacy policies and security issues of e-commerce sites are examined in [17] to analyze users' personal data and security vulnerabilities; this survey study leverages penetration testing tools to collect data. Vulnerability issues of e-commerce sites are investigated in [18] by applying both technical and non-technical strategies; the study finds that vulnerabilities exist in most of the e-commerce sites, which makes it unsafe for customers to entrust them with their private data. The research [19] demonstrated usability and accessibility issues of Nigerian bank websites; the study applied both manual and automated approaches to evaluate accessibility and usability and to check the websites against the standards defined by the Web Accessibility Initiative.
3. Empirical Data Collection
E-commerce sites handle a large number of users concurrently. As the number of users increases day by day, the performance of these sites plays a significant role in keeping users satisfied and thus keeping the site in good shape. For this purpose, web application performance data collection has been automated in the proposed web based framework, in which three reputed web performance analyzers are incorporated to collect and compare the e-commerce site data. The details of the proposed e-commerce website performance comparison framework are presented in the following sections and depicted in Fig. 1.
Fig. 1. Overview of the proposed method
Fig. 1 illustrates the overview of the proposed system, where a registered user can test the performance of any web application by providing the URLs of the sites (url1, url2, ..., url10). The framework takes the URLs as input and produces the test results of each website as R1, R2, ..., R10. The system collects the performance records for each given URL using three different tools: (i) WebPageTest, (ii) PageSpeed Insights and (iii) GTmetrix. The results are stored in the database for further use and analysis. The developed web application creates a graphical comparison among all the sites, based on the considered performance parameters such as Load Time, First Byte and Start Render, using the stored results. The system also gives registered users the option to save the comparison results.
3.1. Data Scanning Tool Selection
The developed framework incorporates three web performance analyzers to collect website performance data. These tools form the foundation of this web framework and are selected based on their availability, functional performance, stability, etc. An overview of the experimented tools is given below.
WebPageTest: one of the most popular open source web performance testing tools. A user can test a site from different locations and with different browsers using this scanner.
PageSpeed Insights: a reliable web performance analyzer that accurately scans the user experience of a given site. For a web page, this tool generates performance results for both mobile and desktop devices. It is also a free tool (a sketch of querying it programmatically is given after this list).
GTmetrix: used to analyze the performance of a web page with a single click for different locations, browsers, connection speeds and more. Basic usage of this tool for testing and monitoring web page performance is free.
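The paper does not state how these scanners are invoked (web interface or API). As one possible programmatic route, the sketch below queries the public PageSpeed Insights v5 REST API from PHP and converts the millisecond audit values into seconds, the unit used in Tables 1 to 4. It is only an illustrative sketch, written under the assumption that allow_url_fopen is enabled, and is not presented as the authors' implementation.

<?php
// Illustrative sketch (not the authors' code): fetch Lighthouse metrics for one URL
// from the public PageSpeed Insights v5 API and map them to the paper's parameters.
function fetchPageSpeedMetrics(string $url): array {
    $endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url='
              . urlencode($url) . '&strategy=desktop';
    $raw  = file_get_contents($endpoint);        // plain HTTP GET; curl would work as well
    $json = json_decode($raw, true);
    $audits = $json['lighthouseResult']['audits'] ?? [];

    // Lighthouse reports the timing audits in milliseconds; convert to seconds
    // so the values are comparable with Tables 1-4.
    $seconds = fn(string $id) => isset($audits[$id]['numericValue'])
        ? round($audits[$id]['numericValue'] / 1000, 2) : null;

    return [
        'first_contentful_paint'   => $seconds('first-contentful-paint'),
        'speed_index'              => $seconds('speed-index'),
        'largest_contentful_paint' => $seconds('largest-contentful-paint'),
        'total_blocking_time'      => $seconds('total-blocking-time'),
        'time_to_interactive'      => $seconds('interactive'),
        // Cumulative Layout Shift is a unitless score in Lighthouse, so it is taken as-is.
        'cumulative_layout_shift'  => $audits['cumulative-layout-shift']['numericValue'] ?? null,
        // Load Time, First Byte and Start Render are not part of the Lighthouse audits,
        // which matches the missing PageSpeed Insights entries in Table 2.
    ];
}

// Example usage: $metrics = fetchPageSpeedMetrics('https://www.example.com');

WebPageTest and GTmetrix also expose REST APIs, so the same pattern can be repeated for the other two scanners, which additionally provide the Load Time, First Byte and Start Render records.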
3.2. Proposed Framework Development
The system contains two modules, a frontend and a backend, both developed with PHP, MySQL, CSS and HTML. The user interacts with the frontend module to collect website performance records after registering with their details; the registration requires a username, user id, email, password, security question and security answer.
After registering with the framework, a user can scan the web performance records of web applications by providing the respective website URLs. The proposed framework scans the given sites automatically using the three considered tools: (i) WebPageTest, (ii) PageSpeed Insights and (iii) GTmetrix.
The control of the proposed system remains with the admin user. Upon completion of performance testing, the system can compare and save the test results for a particular website. In addition, the application generates a PDF file containing a detailed report for further use.
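The paper states that results are stored in MySQL and later compared; the sketch below shows one hypothetical way to persist per-tool, per-parameter records with PDO. The table and column names are assumptions made for this example and are not taken from the paper.

<?php
// Hypothetical storage sketch: one row per (user, site, tool, parameter) measurement.
// $pdo is assumed to be an existing PDO connection to the framework's MySQL database.
$pdo->exec("
    CREATE TABLE IF NOT EXISTS scan_results (
        id         INT AUTO_INCREMENT PRIMARY KEY,
        user_id    INT NOT NULL,
        site_url   VARCHAR(255) NOT NULL,
        tool       ENUM('webpagetest','pagespeed','gtmetrix') NOT NULL,
        parameter  VARCHAR(64) NOT NULL,          -- e.g. 'load_time', 'first_byte'
        value      DECIMAL(8,3) NULL,             -- NULL when a tool reports no record
        scanned_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )");

$stmt = $pdo->prepare(
    "INSERT INTO scan_results (user_id, site_url, tool, parameter, value)
     VALUES (?, ?, ?, ?, ?)");
foreach ($metrics as $parameter => $value) {      // $metrics as returned by a scanner wrapper
    $stmt->execute([$userId, $siteUrl, 'pagespeed', $parameter, $value]);
}

Storing one row per measurement keeps the comparison and graphing queries simple: averaging a parameter across tools or across sites becomes a straightforward GROUP BY over this table.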
4. Findings and Data Analysis
The collected data has been analyzed and compared to identify the significant problems in the e-commerce sites. These results can help mitigate those problems in the sites' next version releases. For this empirical analysis, the top ten e-commerce sites of Bangladesh have been considered, and their data have been scanned using the web based framework. All of the results discussed here were collected and analyzed using the proposed framework.
4.1. Environment Setup
The following software and hardware have been used for this empirical data collection. HTML, CSS, JavaScript, PHP, MySQL, Fpdf17, XAMPP and the Windows OS constitute the software requirements, while a workstation with an Intel Core i5 processor, a 50 GB hard disk and 4 GB of RAM is used to run the web application.
4.2. Data Collection
Data have been collected for nine parameters and are presented in Table 1, Table 2 and Table 3 for the WebPageTest, PageSpeed Insights and GTmetrix tools respectively; the average results of these tools are shown in Table 4. The table headings Site1, Site2, Site3, etc. indicate e-commerce site-1, site-2, site-3, etc., and the performance parameters, namely (i) Load Time, (ii) First Byte, (iii) Start Render, (iv) First Contentful Paint, (v) Speed Index, (vi) Largest Contentful Paint, (vii) Cumulative Layout Shift, (viii) Total Blocking Time and (ix) Time to Interactive, are given in the rows. The values are expressed in seconds, to two decimal places.
For collecting the performance results with the proposed framework, the top e-commerce sites of Bangladesh were selected according to a recent survey by a Bangladeshi agency that was published online. The names of the sites are not mentioned in this paper in order to preserve their privacy.
Table 1. Performance Results for WebPage Test
Sl.  Parameters                 Site1   Site2   Site4   Site5   Site6   Site7   Site8   Site9   Site10
1    Load Time                   9.47   20.58    8.93   16.05    9.89    3.85   14.81   13.64   20.05
2    First Byte                  1.34    2.15    1.08    1.92    1.17    0.95   12.38    1.13    2.14
3    Start Render                2.30    3.00    1.90    4.00    2.50    1.40   14.00    2.30    4.60
4    First Contentful Paint      2.27    2.99    1.93    4.04    2.45    1.41   13.97    2.29    4.63
5    Speed Index                 3.52   14.29    2.96    7.74    3.96    1.66   14.06    8.58   12.39
6    Largest Contentful Paint    6.64    3.89    4.23    7.41    5.97    1.56   14.24    7.56   15.64
7    Cumulative Layout Shift     0.06    0.06    0.05    0.30    0.14    0.36    0.62    0.14    0.15
8    Total Blocking Time         0.14    1.03    0.00    0.87    0.55    0.00    0.00    1.23    0.31
9    Time to Interactive         7.41   16.73    3.82   15.70    8.85    3.38   14.81   11.46   12.87
     Average                     3.68    7.12    2.77    6.45    3.95    1.62   10.99    5.37    8.09
As shown in Table 1, the WebPageTest tool scanned all nine parameters and reports values for each of them. The 'Load Time' parameter shows a maximum value of 20.58 for site2, which indicates that this site needs more load time than the others, while site7 records the minimum load time of 3.85. For the 'First Byte' parameter, the lowest and highest values are 0.95 (site7) and 12.38 (site8) respectively. Considering the average over all parameters, the best and worst records are shown by site7 (1.62) and site8 (10.99) respectively. The tool reports a 'Total Blocking Time' of 0.00 for site4, site7 and site8.
Table 2. Performance Results for PageSpeed Insight
Sl.  Parameters                 Site1   Site2   Site3   Site4   Site5   Site6   Site7   Site8   Site9   Site10
1    Load Time                      -       -       -       -       -       -       -       -       -       -
2    First Byte                     -       -       -       -       -       -       -       -       -       -
3    Start Render                   -       -       -       -       -       -       -       -       -       -
4    First Contentful Paint      2.60    3.40    1.90    3.30    3.60    3.90    2.40    5.10    3.00    6.10
5    Speed Index                 9.30   12.90    6.10    4.10   10.20   11.90   13.70   14.40   16.20    8.90
6    Largest Contentful Paint   14.90    9.00    2.90    4.10    8.80   17.30    2.80    5.40   14.80    7.60
7    Cumulative Layout Shift     0.44    0.25    0.07    0.50    0.05    0.01    0.49    0.00    0.88    0.05
8    Total Blocking Time         1.32   11.05    2.84    0.22    2.08    1.55    0.07    0.45    2.18    0.16
9    Time to Interactive        16.60   24.10   13.70    7.30   19.30   16.90    5.60    8.00   14.80   21.00
     Average                     7.53   10.12    4.59    3.25    7.34    8.60    4.18    5.56    8.64    7.30
According to Table 2, the PageSpeed Insights scanner collects records for (i) First Contentful Paint, (ii) Speed Index, (iii) Largest Contentful Paint, (iv) Cumulative Layout Shift, (v) Total Blocking Time and (vi) Time to Interactive; however, it is unable to detect any records for the remaining three parameters. Among the scanned results, site8 shows 0.00 for 'Cumulative Layout Shift', which indicates either that the tool could not detect any record or that the site consumes 0 seconds for this parameter. Site2 records 24.10 for the 'Time to Interactive' parameter, so this site takes more time to become interactive than the other experimented sites. The minimum and maximum average values are 3.25 (site4) and 10.12 (site2) respectively.
Table 3. Performance Results for Gtmetrix
Sl.  Parameters                 Site1   Site2   Site4   Site5   Site6   Site7   Site8   Site9   Site10
1    Load Time                   7.90   13.20    7.00   10.80    8.70    4.00   13.10   13.40   15.50
2    First Byte                  1.20    2.00    0.10    1.30    1.50    1.30   11.30    1.30    1.20
3    Start Render                1.50    2.10    1.70    2.80    2.70    1.70   12.20    2.30    2.90
4    First Contentful Paint      1.50    2.20    1.70    2.80    2.70    1.70   12.20    2.30    2.90
5    Speed Index                 2.40    6.50    2.60    6.30    4.10    3.80   12.40    5.90    7.40
6    Largest Contentful Paint    2.90   12.80    3.00    7.10    5.30    2.00   12.30   13.40    9.70
7    Cumulative Layout Shift     0.09    0.07    0.11    0.50    0.01    0.48    0.57    0.61    0.25
8    Total Blocking Time         0.11    0.43    0.08    1.10    0.56    0.02    0.00    0.88    0.44
9    Time to Interactive         2.00    7.10    3.90    9.70    3.50    2.00   12.20    9.90    8.00
     Average                     2.18    5.16    2.24    4.71    3.23    1.89    9.56    5.55    5.37
As stated in Table 3, site3 and site8 show 0.00 seconds for 'Cumulative Layout Shift' and 'Total Blocking Time' respectively. The tool records the highest and lowest 'Load Time' as 15.50 and 4.00 seconds for site10 and site7 respectively. Even though site8 shows 0.00 seconds for 'Total Blocking Time', it records the maximum time for the (i) First Byte, (ii) Start Render, (iii) First Contentful Paint, (iv) Speed Index, (v) Largest Contentful Paint and (vi) Time to Interactive parameters. The lowest and highest average performance values are 1.89 (site7) and 9.56 (site8) seconds respectively.
Table 4. Average of WebPage Test, PageSpeed Insight and Gtmetrix
Sl.  Parameters                 Site1   Site2   Site3   Site4   Site5   Site6   Site7   Site8   Site9   Site10
1    Load Time                   8.69   16.89    8.73    7.97   13.43    9.30    3.93   13.96   13.52   17.78
2    First Byte                  1.27    2.06    1.34    0.59    1.61    1.34    1.13   11.84   1.215    1.67
3    Start Render                1.90    2.55    2.35    1.80    3.40    2.60    1.55   13.10    2.30    3.75
4    First Contentful Paint      2.12    2.86    2.21    2.31    3.48    3.02    1.84   10.42    2.53    4.54
5    Speed Index                 5.07   11.23    3.78    3.22    8.08    6.65    6.39   13.62   10.23    9.56
6    Largest Contentful Paint    8.15    8.56    2.78    3.78    7.77    9.52    2.12   10.65   11.92   10.98
7    Cumulative Layout Shift     0.20    0.13    0.04    0.22    0.28    0.05    0.44   0.397    0.54    0.15
8    Total Blocking Time         0.52    4.17    1.41    0.10    1.35    0.89    0.03    0.15    1.43    0.30
9    Time to Interactive         8.67   15.98    8.53    5.01   14.90    9.75    3.66   11.67   12.05   13.96
     Average                     4.07    7.16    3.46    2.78    6.03    4.80    2.34    9.53    6.19    6.97
The measured average performance records for each parameter are given in Table 4; the averages are calculated from the scanning output of the three tools. According to this table, the lowest value is 0.03, for 'Total Blocking Time' on site7, and the highest value is 17.78 seconds, recorded by site10 for the 'Load Time' parameter. The three tools' lowest and highest average performance is 2.34 and 9.53 seconds for site7 and site8 respectively.
Besides the analysis of each individual tool's results, the proposed framework generates the average results of Table 4 to compare each parameter value across the ten sites. As the PageSpeed Insights tool could not collect data for the 'Load Time', 'First Byte' and 'Start Render' parameters, the averages of these parameters are calculated using the records of the WebPageTest and GTmetrix tools only, as sketched below. Using Table 4, a user can easily identify the major performance differences among the sites and take the necessary action.
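As a concrete illustration of the averaging rule just described, the sketch below averages each parameter only over the tools that actually reported a value; missing records (the '-' entries in Table 2) are skipped rather than treated as zero. The function and variable names are illustrative only, not taken from the paper.

<?php
// Sketch of per-parameter averaging across tools, skipping parameters a tool did not report.
// $byTool maps tool name => [parameter => value or null] for a single site.
function averageAcrossTools(array $byTool): array {
    $sums = [];
    $counts = [];
    foreach ($byTool as $tool => $params) {
        foreach ($params as $name => $value) {
            if ($value === null) {
                continue;                          // e.g. Load Time under PageSpeed Insights
            }
            $sums[$name]   = ($sums[$name] ?? 0) + $value;
            $counts[$name] = ($counts[$name] ?? 0) + 1;
        }
    }
    $averages = [];
    foreach ($sums as $name => $sum) {
        $averages[$name] = round($sum / $counts[$name], 2);
    }
    return $averages;
}

// Worked check against Table 4: Site1's Load Time is 9.47 (WebPageTest) and 7.90 (GTmetrix)
// with no PageSpeed Insights record, so the average is (9.47 + 7.90) / 2 = 8.69, as reported.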
5. Conclusion
This paper presents a web based framework that automatically performs performance analysis for any given website. It can show developers how a website behaves and responds in various situations, including the speed and stability of the tested websites. To use the system, a user registers and provides the website URL for performance analysis of the given web application. The framework uses the WebPageTest, PageSpeed Insights and GTmetrix web performance tools to scan the given website URL. Performance records are collected for nine parameters: Load Time, First Byte, Start Render, First Contentful Paint, Speed Index, Largest Contentful Paint, Cumulative Layout Shift, Total Blocking Time and Time to Interactive. Both the individual scanner records and the average of the three scanners are generated by the developed framework in order to analyze the performance status of the sites. According to the average results, site7 scored the lowest 'Total Blocking Time' among the ten considered sites, while site10 scored 17.78 seconds for Load Time, which indicates that this site takes more loading time than the others. At present, this web based application works only on desktop computer systems; developing a mobile version and increasing the test limits are directions for future research.
References
[1] Shakti Kundu, "Web Testing: Tool, Challenges and Methods", IJCSI International Journal of Computer Science Issues, Vol. 9, Issue 2, No. 3, March 2012.
[2] Sherolwendy Anak Sualim, Noraniah Mohd Yassin, and Radziah Mohamad, "Comparative Evaluation of Automated User Acceptance Testing Tool for Web Based Application", International Journal of Software Engineering and Technology (IJSET), Vol. 2, No. 2, 2016.
[3] G. Ruffo, R. Schifanella, M. Sereno and R. Politi, "WALTy: A User Behavior Tailored Tool for Evaluating Web Application Performance", Proceedings of the Third IEEE International Symposium on Network Computing and Applications (NCA'04).
[4] Mahnaz Shams, Diwakar Krishnamurthy, and Behrouz Far, "A Model-Based Approach for Testing the Performance of Web Applications", Proceedings of the Third International Workshop on Software Quality Assurance (SOQUA'06).
[5] Shikha Raina and Arun Prakash Agarwal, "An Automated Tool for Regression Testing in Web Applications", ACM SIGSOFT Software Engineering Notes, Volume 38, Issue 4.
[6] Fredrik Abbors, Tanwir Ahmad, Dragoş Truşcan, and Ivan Porres, "MBPeT: A Model-Based Performance Testing Tool", 4th International Conference on Advances in System Testing and Validation Lifecycle, November 2012.
[7] Zhen Ming Jiang, Ahmed E. Hassan, Gilbert Hamann and Parminder Flora, "Automated Performance Analysis of Load Tests", 2009 IEEE International Conference on Software Maintenance, 30 October 2009.
[8] Renu Patil and Rohini Temkar, "Intelligent Testing Tool: Selenium Web Driver", International Research Journal of Engineering and Technology (IRJET), Volume 04, Issue 06, June 2017.
[9] Tanuj Wala and Aman Kumar Sharma, "Improvised Software Testing Tool", International Journal of Computer Science and Mobile Computing, Vol. 3, Issue 9, September 2014, pp. 573-581.
[10] Hafsah Mahmood and Mehreen Sirshar, "A Case Study of Web Based Application by Analyzing Performance of a Testing Tool", International Journal of Education and Management Engineering (IJEME), pp. 51-58, July 2017.
[11] B. Shyaamini and M. Senthilkumar, "A Novel Approach for Performance Testing On Web Application Services", International Journal of Applied Engineering Research, Volume 10, Number 18, 2015.
[12] Ji-Tzay Yang, Jiun-Long Huang, and Feng-Jian Wang, "A Tool Set to Support Web Application Testing", International Computer Symposium (ICS), October 1998.
[13] Jonathan C. Hardwick, Efstathios Papaefstathiou, and David Guimbellot, "Modeling the Performance of E-Commerce Sites", Proceedings of the 27th International Conference of the Computer Measurement Group, January 2001.
[14] Tien-Dung Cao, Patrick Felix, and Richard Castanet, "WSOTF: An Automatic Testing Tool for Web Services Composition", 2010 Fifth International Conference on Internet and Web Applications and Services, IEEE, 2010.
[15] Ilona Bluemke, Michał Kurek, and Małgorzata Purwin, Proceedings of the 2014 Federated Conference on Computer Science and Information Systems, Vol. 2, pp. 1553-1558.
[16] Mahfida Amjad, Md. Tutul Hossain, Rakib Hassan, Md. Abdur Rahman, "Web Application Performance Analysis of E-Commerce Sites in Bangladesh: An Empirical Study", International Journal of Information Engineering and Electronic Business (IJIEEB), Vol. 13, No. 2, pp. 47-54, 2021. DOI: 10.5815/ijieeb.2021.02.04.
[17] Issah Baako, Sayibu Umar, Prosper Gidisu, "Privacy and Security Concerns in Electronic Commerce Websites in Ghana: A Survey Study", International Journal of Computer Network and Information Security (IJCNIS), Vol. 11, No. 10, pp. 19-25, 2019. DOI: 10.5815/ijcnis.2019.10.03.
[18] Issah Baako, Sayibu Umar, "An Integrated Vulnerability Assessment of Electronic Commerce Websites", International Journal of Information Engineering and Electronic Business (IJIEEB), Vol. 12, No. 5, pp. 24-32, 2020. DOI: 10.5815/ijieeb.2020.05.03.
[19] Ishaq O. Oyefolahan, Aishat A. Sule, Solomon A. Adepoju, Faiza Babakano, "Keeping with the Global Trends: An Evaluation of Accessibility and Usability of Nigerian Banks Websites", International Journal of Information Engineering and Electronic Business (IJIEEB), Vol. 11, No. 2, pp. 44-53, 2019. DOI: 10.5815/ijieeb.2019.02.06.
Authors’ Profiles
Md. Tutul Hossain was born in 1997 in Narail, Dhaka, Bangladesh. He completed his B.Sc. in Computer Science and Engineering from Stamford University Bangladesh in 2020. Currently he is working as a WordPress Developer at Buy Perfume in Bangladesh (BPIB).
Rakib Hassan was born in 1997 in Dhaka, Bangladesh. He completed his B.Sc. degree at Stamford University.
Mahfida Amjad was born in 1985 in Dhaka, Bangladesh. She completed her Master's degree in Information Technology at the Institute of Information Technology, University of Dhaka, in 2009, and her B.Sc. in Computer Science & Engineering at Manarat International University in 2007. She is a faculty member of the Computer Science and Engineering (CSE) Department of Stamford University Bangladesh and has devoted herself to the teaching profession since 2012. Her research areas are wireless communication networks, human-computer interaction and software engineering. She has published a number of research papers in various international journals and conferences.
Md. Abdur Rahman received his B.Sc. in Information Technology from Visva Bharati University, India, in 2004. He completed his Post Graduate Diploma and Master's in Information Technology from the University of Dhaka, Bangladesh, in 2008 and 2009 respectively. He is a Senior Computer Scientist at the Centre for Advanced Research in Sciences, University of Dhaka. His major research interests include text analytics and the application of machine and deep learning in software engineering and natural language processing. He has published a number of research papers in various international journals and conferences.
How to cite this paper: Md. Tutul Hossain, Rakib Hassan, Mahfida Amjad, Md. Abdur Rahman, "Web Performance Analysis: An Empirical Analysis of E-Commerce Sites in Bangladesh", International Journal of Information Engineering and Electronic Business (IJIEEB), Vol. 13, No. 4, pp. 47-54, 2021. DOI: 10.5815/ijieeb.2021.04.04