Thesis (PDF available)

The crossover point between keyword rich website text and spamdexing.

Abstract and Figures

With over a billion Internet users surfing the Web daily in search of information, buying, selling and accessing social networks, marketers focus intensively on developing websites that appeal to both searchers and search engines. Millions of webpages are submitted to search engines for indexing each day. The success of a search engine lies in its ability to provide accurate search results. Search engine algorithms constantly evaluate websites and webpages that could violate their respective policies, and for this reason some websites and webpages are subsequently blacklisted from their indices. Websites are increasingly being utilised as marketing tools, which results in major competition amongst websites. Website developers strive to develop websites of high quality, which are unique and content rich, as this assists them in obtaining a high ranking from search engines. By focusing on websites of a high standard, website developers utilise search engine optimisation (SEO) strategies to earn a high search engine ranking. From time to time SEO practitioners abuse SEO techniques in order to trick the search engine algorithms, but the algorithms are programmed to identify and flag these techniques as spamdexing. Search engines do not clearly explain how they interpret keyword stuffing (one form of spamdexing) in a webpage. They regard spamdexing in many different ways and do not provide enough detail to clarify what crawlers take into consideration when determining the spamdexing status of a website. Furthermore, search engines differ in the way that they interpret spamdexing, yet offer no clear quantitative evidence for the crossover point from keyword-dense website text to spamdexing. Scholars have expressed differing views on spamdexing, characterised by different keyword-density measurements in the body text of a webpage. This raised several fundamental questions that form the basis of this research.
This research used triangulation to determine how scholars, search engines and SEO practitioners interpret spamdexing. Five websites with varying keyword densities were designed and submitted to Google, Yahoo! and Bing. The experiment was run in two phases and the results were recorded. During both phases almost all of the webpages, including the one with a 97.3% keyword density, were indexed. This enabled the research to conclusively set aside concerns about keyword stuffing, blacklisting and any form of penalisation. Designers are urged instead to concentrate on usability and on sound principles of website building. The research explored the fundamental contribution of keywords to webpage indexing and visibility. Keywords contribute to website ranking and indexing whether or not their density is held to an optimum level. However, the focus should be on how the end user interprets the displayed content, rather than on how the search engine reacts to it. Furthermore, spamdexing is likely to scare away potential clients and end users instead of attracting them, which is why the time spent on spamdexing should rather be used to produce quality content.
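The keyword-density measure at the heart of the study can be illustrated with a short sketch. The definition used here, occurrences of the keyword divided by the total number of words, times 100, is one common formulation; as the abstract notes, scholars measure keyword density in different ways, so treat this as an illustrative assumption rather than the thesis's exact metric.

```python
import re

def keyword_density(body_text: str, keyword: str) -> float:
    """Return the density of a single-word keyword as a percentage.

    Density here = (occurrences of keyword / total words) * 100.
    This is one common definition; others weight multi-word
    phrases or exclude stop words.
    """
    words = re.findall(r"[A-Za-z0-9']+", body_text.lower())
    if not words:
        return 0.0
    count = words.count(keyword.lower())
    return 100.0 * count / len(words)

# A crude example of keyword stuffing: "cheap" is 3 of 8 words.
page = "cheap flights cheap flights book cheap flights today"
print(round(keyword_density(page, "cheap"), 1))  # prints 37.5
```

A page approaching the 97.3% density mentioned above would consist almost entirely of repetitions of the target keyword.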
... On-page links are developed in a CMS-based website system by exploiting the web features the site already provides. Techniques for exploiting these features include choosing a domain name that matches the website's topic, title tags, heading tags, meta tags, providing a description [8], assigning a title attribute to every anchor text and image, and internal linking. Off-page links are built through back-linking, i.e. links from other blogs or websites to one's own blog or website. ...
Article
Full-text available
A common problem faced by SMEs (UMKM) is the limited scale and reach of the marketing and sale of their products. Competition between similar products, whether local or coming from outside, is a further challenge, because marketing and sales are still conducted conventionally and individually. This study implements an e-commerce system model for SME products in a region or district using a participatory cyber-cluster empowerment model, in which web linking is developed with Search Engine Optimisation (SEO) and a Content Management System (CMS). The goal is for the developed website to easily reach top positions on search engine results pages and to keep its content and ranking up to date. The benefit of this research is to extend the marketing and sale of SME products to the global market and to make the web address easy to find, as it frequently appears in top positions in search engines such as Google. The output of this research is a CMS-based SME product website, optimised with internal and external link models so that it consistently appears in the top range of search results. The research method is action research, with a structured waterfall model of system development; the web application itself was developed with a prototyping model, according to user needs. Keywords: Cyber-Cluster, SEO, CMS, SME
Research
Full-text available
Today the World Wide Web (WWW) has become one of the most important sources for information retrieval. Researchers discovered that the so-called "deep web", the part of the Internet that is not located by search engines, is 400 to 550 times larger than the visible web (BrightPlanet, 2001). It is therefore extremely important for website owners to increase the visibility of their websites as much as possible. This leads to the question of which parts of websites are actually detected by modern web crawlers, such as those used by the major search engines. Looking at web pages, it is obvious that a website consisting only of text would not be attractive and in most cases would not draw a high number of users. In fact, online presences consist not only of lines of code and text but also of many pictures and, increasingly, videos. Because of the importance of videos and pictures on websites and the relevance of search engine optimization, the role that the use of images and videos plays in achieving top search engine rankings is discussed below.
Conference Paper
Full-text available
The primary objective of this research project was to investigate and report on the importance of finding synergy between the concept of website visibility and website usability. A website needs to attract visitors and the website designer needs to ensure the website is visible to search engines. Websites that are not ranked highly by search engines are less likely to be visited. It is common practice for Internet users to not click through pages and pages of search results, so where a website ranks in a search is essential for directing more traffic toward the website. Users tend to examine only the first page of search results and once they find a good match for their search, they normally do not look further down the results list. Most search engines display only 10 to 20 of the most relevant results on the first page. Search Engine Marketing (SEM) is often used to describe acts associated with researching, submitting and positioning a website within search engines to achieve maximum exposure for a website. SEM includes search engine optimisation (SEO), paid listings and other search-engine related services and functions aimed at increasing exposure and traffic to a website. SEO is the process of improving the contents of a web page so as to offer a richer harvest to a search engine crawler. The higher a website ranks in the results of a search, the greater the chance that that website will be visited by a user. SEO helps to ensure that a website is accessible to search engines and improves the chances that the website will be found by the search engines. Website usability focuses on what happens after users arrive on a website. Usability professionals observe what users do once they arrive on a certain page. Do the users take the desired call to action? Do the users understand what page they are viewing? Is the website easy to navigate? 
Website usability, therefore, can be defined as a website's user-friendliness and the extent to which users can use the information on the website to meet a specific objective. The methods employed in this project were, firstly, a review of the literature and, secondly, an inspection of South African-based websites to determine the level of visibility and usability elements implemented. The results indicate that very few South African-based websites have both visibility and usability elements implemented on the same web page.
Conference Paper
Full-text available
The principal objective of this research project was to determine the extent to which e-commerce websites make use of metadata to enhance website visibility to search engines. The methods employed were, firstly, to identify a number of e-commerce websites and, secondly, to inspect, record and analyse the relevant meta tags used in their coding. A subset of 200 e-commerce websites was compiled through extensive Internet searching using standard search engines and portals. Each site was evaluated to confirm its qualification as an e-commerce site, and its three visibility-related HTML meta tags were inspected. The results show that a reasonable percentage of e-commerce web pages make use of the three meta tags in question, but none make use of Dublin Core. An average score of 66.83% was earned overall. The primary conclusion is that meta tag usage initially appears reasonably high, considering figures of 60% and above. However, if the percentages of non-users are extrapolated across the size of the WWW, while assuming that not all web pages are e-commerce based, a large number of web pages are not availing themselves of one of the most basic visibility features.
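The kind of meta tag inspection described above can be sketched with Python's standard-library HTML parser. The three tag names checked here (description, keywords, robots) are an assumption for illustration; this excerpt does not list which three tags the study inspected.

```python
from html.parser import HTMLParser

class MetaTagInspector(HTMLParser):
    """Collect selected <meta name="..."> tags from a page.

    The tag names in WANTED are illustrative; a real audit would
    use whichever visibility-related tags the study defines.
    """
    WANTED = {"description", "keywords", "robots"}

    def __init__(self):
        super().__init__()
        self.found = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        name = (attr_map.get("name") or "").lower()
        if name in self.WANTED:
            self.found[name] = attr_map.get("content", "")

# Hypothetical page fragment for demonstration.
page = """<html><head>
<meta name="description" content="An e-commerce bookshop">
<meta name="keywords" content="books, shop, online">
</head><body></body></html>"""

inspector = MetaTagInspector()
inspector.feed(page)
print(inspector.found)
# prints {'description': 'An e-commerce bookshop', 'keywords': 'books, shop, online'}
```

Scoring a site as in the study would then reduce to counting how many of the wanted tags appear in `inspector.found`.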
Article
Full-text available
This article introduces and examines the concept of academic search engine optimization (ASEO). Based on three recently conducted studies, guidelines are provided on how to optimise scholarly literature for academic search engines in general, and for Google Scholar in particular. In addition, the risk of researchers illegitimately 'over-optimising' their articles is briefly discussed. The ability to search a specific website for the page you are looking for is a very useful feature. However, searching can be complicated, and providing a good search experience can require knowledge of multiple programming languages. This article demonstrates a simple search engine, including a sample application you can run on your own site; the sample application is also a good introduction to the Python programming language.
Book
Full-text available
The quest to achieve high website rankings in search engine results is a prominent subject for both academics and website owners/coders. Website Visibility marries academic research results to the world of the information practitioner and contains a focused look at the elements which contribute to website visibility, providing support for the application of each element with relevant research. A series of real-world case studies with tested examples of research on website visibility elements and their effect on rankings are reviewed. Written by a well-respected academic and practitioner in the field of search engines. Provides practical and real-world guidance for real-world situations. Based on actual research in the field, which is often used to confirm or refute beliefs in the industry.
Article
Websites can significantly increase their business volume through search engine optimization (SEO) and paid advertising. SEO involves modifying web pages to enhance their visibility in search engine results. Websites also need to adopt new strategies and tools to increase their search performance and volume of business. They can supply information that search engine robots use to describe certain aspects of a web page, such as its title, major headings, and specially emphasised text. Selecting and highlighting a heading can increase the visibility of a web page and help enhance business prospects. Websites can also improve their business prospects by establishing contact with the webmasters of related websites and exploring link opportunities.
Article
This editorial preface describes the need for new search engine technology, the two generations of search engine technologies in use, and the need for a third-generation search engine. We discuss the limitations of today's search engines, the difficulty of determining the relevancy of search results, and the need for a context-sensitive ranking system, and take a closer look at the future of search engines based on the open source model.