AUTHOR:
Shai Aharony
PUBLISHED:
16 May 2017
LAST UPDATED:
11 Nov 2024

Long Term Hidden Text Experiment

Introduction

Traditionally, web developers and SEOs have had a fickle relationship. Both are integral to achieving the client’s aims, yet neither wants to sacrifice their own goals in a compromise with the other.

The web designer has always been concerned with making the site aesthetically pleasing for the user, while the SEO has always given Google’s crawlers the higher priority. Necessity bred compromise, and methods that aim to bridge the two have been established and are used on many of the websites we see today.

One of the issues where compromise was needed was on-page content. On the one hand, text was always seen as a necessity by any SEO company tasked with ranking pages for specific key phrases; on the other, it was seen as a hindrance from the web designer’s perspective, something to be reduced to the bare minimum so as not to spoil the visual experience of the site in question.

The compromise came in the form of hidden text. Not the white-on-white spammy type often used for keyword stuffing by black hatters, but the more respectable and far more common “read more” type, where a short excerpt is visible and a “read more” link must be clicked before the full text expands. Done purely for cosmetic reasons, this is normally achieved using CSS or JavaScript; further back, a textarea was also a common way of encapsulating text in a restricted area.
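By way of illustration only (this is not markup from the experiment), a minimal sketch of that “read more” pattern might look like the following, with the full text hidden by CSS and revealed by a small piece of JavaScript:

```html
<!-- Hypothetical "read more" pattern: a short excerpt is visible,
     the rest of the copy is hidden with CSS until the link is clicked. -->
<style>
  #full-text { display: none; }           /* hidden by default */
  #full-text.expanded { display: block; } /* shown after the click */
</style>

<p>This is the short excerpt the visitor sees straight away.</p>

<div id="full-text">
  <p>The rest of the on-page copy lives here, out of sight until requested.</p>
</div>

<a href="#" id="read-more">Read more</a>

<script>
  // Reveal the hidden block when "Read more" is clicked.
  document.getElementById('read-more').addEventListener('click', function (e) {
    e.preventDefault();
    document.getElementById('full-text').classList.add('expanded');
    this.style.display = 'none'; // hide the link once the text is revealed
  });
</script>
```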

Six months ago, Reboot Marketing had a conversation with a web developer that many readers here will find familiar. We wanted some of the hidden text on the page to be made visible, while they were adamant it should stay hidden, adding that they didn’t believe it would make any difference as “Google reads the code anyway”.

In 2014, Google introduced the Fetch and Render tool, which gave us an important indication of the direction Google was taking. It seemed Google was no longer content with just reading the code of a page; it actually rendered it so that it could “see” the page as a human would. This led to an obvious question: if Google can “see” that text is hidden, why would it give it as much weight? The problem was that this argument (just like the one we had six months ago) was one of opinions. There were no hard facts or studies to settle it, so we decided to create one.

Hypothesis:

Over the last few years, it has been widely reported that Google is devoting considerable resources to seeing the page as a user would. Rendering pages, executing JavaScript and understanding CSS are all part of the direction Google is clearly taking. It is therefore hypothesised that more weight will be given to a key phrase or text that is visible to the user than to text that sits outside the user’s immediate view.

Method:

It’s important to understand that running a controlled experiment on an algorithm this complex is nearly, if not completely, impossible. Thousands of variables are at play, and it’s our job to minimise those variables to the best of our abilities. In doing so, we have to restrict the questions we can answer to their simplest form; only then can we hope to get meaningful answers.

For the purpose of this experiment, we invented a new phrase and ensured it was not recognised by Google prior to the experiment. The phrase is [andefabribiles], a fictional name given to a new type of bacteria.

A search for that phrase on Google prior to starting the experiment returned no results.

We decided that we were most interested in how Google’s ranking algorithms behave in these four scenarios:

1. Homepage text completely visible
2. Homepage text hidden in a textarea, with only the first paragraph visible
3. Homepage text hidden via CSS, with only the first paragraph visible
4. Homepage text hidden via JavaScript, with only the first paragraph visible

We chose these scenarios because they are the most common ways web designers deal with space and design constraints. It’s important to note that we avoided placing the key phrase [andefabribiles] in the first paragraph. This was done on purpose so that, on the hidden-text pages, the key phrase was not initially visible.
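The post does not include the actual markup of the test pages, but the three hiding mechanisms described above are commonly implemented along the lines of the sketch below (an assumed, typical implementation rather than the experiment’s own code):

```html
<!-- Three common ways of keeping everything after the first paragraph
     out of the visitor's immediate view (illustrative only). -->

<!-- 1. Textarea: the remaining copy sits inside a read-only textarea. -->
<p>First paragraph, always visible.</p>
<textarea readonly rows="4" cols="60">
Second paragraph onwards, including the key phrase, scrollable but boxed away.
</textarea>

<!-- 2. CSS: the remaining copy is in the HTML, but display:none hides it. -->
<p>First paragraph, always visible.</p>
<div style="display: none;">
  <p>Second paragraph onwards, present in the source but never rendered.</p>
</div>

<!-- 3. JavaScript: the remaining copy is hidden by a script as the page loads. -->
<p>First paragraph, always visible.</p>
<div id="rest-of-copy">
  <p>Second paragraph onwards.</p>
</div>
<script>
  // Hide the block as soon as the script runs.
  document.getElementById('rest-of-copy').style.display = 'none';
</script>
```

Note that in the textarea case the copy is still technically rendered on screen, just confined to a scrollable box, whereas in the CSS and JavaScript cases it is never painted at all.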

How We Did It:

We purchased 20 new domains, ensuring that each domain name returned no results in Google and had no previous history of registration. On each domain, we built a simple yet slightly different website to ensure a minimal footprint. We then divided the group of 20 into four to test each of the cases above in this manner:

Fully visible:
topodlemba.co.uk
ulmitatiti.co.uk
vernowelln.co.uk
wamesbudio.co.uk
yoddoweent.co.uk

Javascript Hidden:
jelellessl.co.uk
kafeaspost.co.uk
kiaterasiv.co.uk
kifertasid.co.uk
lagapsylli.co.uk

CSS Hidden:
mozymation.co.uk
odystrache.co.uk
oriernisti.co.uk
rellogidoc.co.uk
squissenti.co.uk

Textarea:
acquarbour.co.uk
batringies.co.uk
cheurebiti.co.uk
doncernuti.co.uk
jehostrasa.co.uk

Each of the 20 sites contained a description of approximately 400 words. The content was very comparable in structure and length across all of the sites. To reduce variables to a minimum, we ensured that keyword positions were in very similar locations across all of the content, and that on every site the keyword was mentioned exactly three times. In all cases, the key phrase appeared for the first time in the second paragraph, so in the three hidden-text groups, where only the first paragraph is visible, the keyword was not visible by default.

1. All domains had all robots blocked until the content was ready and published (see the sketch after this list). This ensured that the domains were not crawled by Google at substantially different times.

2. All sites had non-duplicated but similarly structured title tags. Again, care was taken to ensure the key phrase was mentioned consistently at the end of the title tag across all sites.

3. Once the sites were ready, we proceeded in a zigzag sequence, removing the robots restrictions and fetching each site through a unique Google Webmaster Tools account, location and IP. We ensured the first site to be indexed was a JavaScript-hidden one, as JavaScript has traditionally been seen as incompatible with indexing/SEO, then alternated between groups until all 20 sites were indexed. We only moved on to the next site once the previous one had been indexed and was showing in the Google index for both the site: and cache: commands.

4. We searched for the key phrase and recorded the results with screenshots.

5. We monitored the progress of the rankings over the following months.
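The post does not specify exactly how the robots were blocked in step 1, but the usual mechanism is a blanket robots.txt disallow that is lifted once the content is ready. A minimal sketch, assuming that approach:

```
# Hypothetical robots.txt while each test site was being built
# (the exact blocking mechanism is not specified in the post).
User-agent: *
Disallow: /

# Once the content was ready, the restriction would be lifted,
# for example by emptying the Disallow rule:
# User-agent: *
# Disallow:
```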

Monitored Words:

andefabribiles

Precautions Taken:

1. We minimised all related search activities and site visits.

2. We ensured all activity was carried out via incognito/private mode.

3. We did not click through Google search results whilst monitoring.

4. All sites were released in a zigzag fashion to ensure an even spread of any variables.

5. All sites were released from new IP addresses and locations, and each IP had its own Search Console account to fetch from.

6. Apart from myself and one other person within the company, no one else knew the actual domains or test phrases. This ensured there was no contamination of the data by unprotected clicks and views.

The Results:

As you can imagine, ranking data for 20 sites over a six-month period, amounting to just under 3,300 data points, was quite a challenge to visualise. We wanted to include all of the data while allowing the reader to select different website groups in one place, to easily demonstrate the relationship between those groups.

Having 20 separate lines across six months would simply be an overload of data, so we decided to average out the rankings for each group, condensing every five lines into one.

We have also included separate graphs for Bing and Yahoo as a reference. We think you will find the contrast between those two search engines and Google very interesting.

Outlier Results: For an unknown reason, one site reacted very strangely throughout the experiment, so we removed that result from the graph. The removal of that one site out of 20 did not affect the overall result. You can see the full data set, including the anomaly, in the file download at the end of this post and in the final detailed graph, which includes it.

Interactive Graphs – You can select each group along the top of the graph, and zoom in on any particular date range with more resolution by dragging across the relevant range in the horizontal date bar at the bottom.

The graph below clearly shows what seems to be a preference by Google for visible-text sites, but surprisingly it also weighs textarea content as if it were visible text. It’s interesting to note that sites that used JavaScript and CSS to hide text were treated almost equally, and both ranked substantially lower than the visible-text and textarea sites.

[Interactive graph: average Google rankings for each group]

The graph below shows the same data, but from Bing.

[Interactive graph: average Bing rankings for each group]

The graph below shows the same data, but from Yahoo.

[Interactive graph: average Yahoo rankings for each group]

It’s very interesting to note the chaos demonstrated by these two engines in comparison to Google, which would indicate that they have no weighting preference for visible text and would also hint at the superiority of Google’s algorithms.

For those interested in the full data, I have included the Excel sheet here for you to download.

And for those wanting to see the behaviour of each individual site rather than the average of the group:

[Interactive graph: individual site rankings]

Conclusion:

The experiment clearly demonstrates Google’s preference for visible text: its algorithms give noticeably less weight to text hidden via CSS and JavaScript. The size of this handicap is difficult to measure without further experimentation, but it’s clear that having the text visible on your site helps substantially with ranking for those terms.

What’s more surprising is the way Google treats text in a textarea as if it were fully visible. As the use of a textarea to display text in web design is almost non-existent these days, this is a bit of a moot point, but interesting nonetheless. If you have any theories about this behaviour, please feel free to leave them in the comments below.

Special thanks go to our freelance data visualisation expert Kristian Vybiral, who took the time to get the graphs exactly how we wanted them, and to our whole SEO company and digital PR agency team for their help in getting all of our SEO experiments live and sharing them with their networks.