Archive for the ‘Build Your Website’ Category
Rich snippets are all the rage these days. Ever since Google started
enhancing their search results with these extra tidbits of information,
everyone is rushing to update their web sites with the metadata to
enable them. So what is the benefit of having a “rich” search result for
your site? Good question. Other than giving the search engine user a
little extra detail, there’s also a subtle psychological factor that
kicks in. Someone might be more inclined to click on a search engine
result that has a 5-star rating and a friendly face than one that
doesn’t. Plus, they’re just plain cool. Who doesn’t want to add bling to
their search results? But this only scratches the surface. There’s much,
much more to them than that.
Instant information aggregation: It’s only a matter of semantics
Rich Snippets, as Google calls them, are actually semantic markup. The
idea of marking up some sort of document with meta information for the
benefit of machines is not a new idea. Semantic markup is as old as
information technology itself. For example, a Word document contains
metadata about its author, and a digital photo contains metadata about
the camera it was taken with. You might, for instance, store your
digital snapshots in a photo archiving program which uses this semantic
data to filter your photos by date taken, lens type, flash used, etc.
So, in essence, metadata is data about data.
It should be clear, then, how this “data about data” can be extremely
useful to search engines. It can provide a search engine the ability to
derive a semantic meaning from a document’s meta
information rather than having to rely purely on the abstract, human
understandable concepts within the text of the document. Searches can
become less about keywords in text documents and more about
relationships between semantic data types.
To illustrate this point further, consider the following search: Find
all restaurants with a 3.5 star or better rating on the Las Vegas strip
that specialize in Italian OR Mexican cuisine AND are open after 11 PM
on Sunday nights AND do NOT require reservations. On the
semantic web, rather than a list of links to restaurant web sites that
may or may not match your given criteria, you might get a list of
“restaurant result objects” that DO match exactly
that criteria and never even have to visit the restaurant’s web site.
This is where the real power of semantic data lies: instant information aggregation.
This “semantic web”, too, is not a new idea. In fact, Tim Berners-Lee
himself envisioned the world wide web as a kind of “Semantic Network
Model” and even the earliest HTML specifications included the concept of
meta tags, which you are undoubtedly familiar with. Later iterations,
such as XHTML, took this idea a step further. Most notable is the RDFa
specification, which has been around for quite some time.
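As a sketch of what this markup looks like, here is a hypothetical restaurant review marked up with RDFa using Google’s data-vocabulary.org Review vocabulary (the business name, reviewer, and rating are all made up for illustration):

```html
<!-- Hypothetical review snippet marked up with RDFa -->
<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Review">
  <span property="v:itemreviewed">Luigi's Pizzeria</span>,
  reviewed by <span property="v:reviewer">John Doe</span>,
  rated <span property="v:rating">4.5</span> out of 5.
</div>
```

A crawler that understands the vocabulary can pull the reviewer and rating straight out of the attributes, without having to parse the surrounding prose.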
GoDaddy might not be as familiar a name as Google to ordinary internet users, but most webmasters have, of course, heard of it. GoDaddy is currently one of the leaders in the web hosting industry, providing a variety of related services such as website hosting, domain registration, dedicated servers, email plans, etc. Although GoDaddy doesn’t yet dominate the market, doing so might very well be on their mind.
It has recently been reported that Google and GoDaddy have entered into a form of partnership around GoDaddy’s “WebSite Tonight” feature. This service is a powerful tool that allows users to create a website fairly quickly using one of the available pre-designed templates, making it look almost “professionally designed”.
Google’s contribution to WebSite Tonight is a set of add-ons, widgets, and tools that might be useful to a website owner and/or visitor. These include a customizable search bar, Google Webmaster Tools, SEO-checking tools, and more. Submitting a website to Google is also made easier, helping webmasters appear in the listings of the world’s leading search engine quickly. Some tools will be available during the website-building process; others are incorporated into the website’s control panel.
It has only been several months since Google announced and completed the “New Adsense” – a redesign of the familiar GUI that added several features to impress users. And here it is – they are already adding more attributes to the popular money-making service.
According to a recent report, there are now more things you can do in your Google Adsense account, such as creating and editing channels in Adsense for Games and Adsense for Video, blocking specific products by name, and viewing reports by page, not only by unit.
There have also been some renames – the HTML ad type is now “Rich Media”, and Dynamic Images are called “Animated Images”. All of those (as well as Text, Image, and Flash) are included in the performance reports as “Ad types”. In addition, “Ad Requests” is the term now used instead of “Unit Impressions”, counting each time a request to show an ad is sent from the website to the Google service.
Google hopes these updates will be beneficial to Adsense users, making the popular “monetize your website” option preferable over affiliate marketing, client-specific banners, and other possibilities.
We have been thinking about how to stop spam bots without using a CAPTCHA. Most CAPTCHAs work to a certain degree, but in general you don’t want to make it any more inconvenient for real people to get through your web forms. Well, Bob nailed it: just add a CSS class with display:none to an extra form field. Robots will fill out this field and real people will not.
So kill those super annoying CAPTCHAs… personally, I cannot read half of them anyway.
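A minimal sketch of Bob’s trick (the class and field names here are made up):

```html
<style>
  /* Hidden from humans; bots parsing the raw HTML still "see" the field */
  .trap { display: none; }
</style>
<form action="/contact" method="post">
  <input type="text" name="name" />
  <!-- Honeypot: real visitors leave this blank, bots tend to fill it in -->
  <input type="text" name="website" class="trap" />
  <input type="submit" value="Send" />
</form>
```

On the server side, simply discard any submission where the hidden “website” field comes back non-empty.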
You might have noticed that, starting mid-December, Google is labeling certain websites with a “this site may be compromised” notice that appears in the search results under the website’s link. According to Google’s Matt Cutts, this is actually done to help webmasters by noting that their website has probably been hacked. The procedure of banning sites from the Google search index and notifying the owners via Google Webmaster Tools has proved “too slow”, as not many site owners check their Webmaster Tools notifications on a regular basis.
As a hacked website does not usually present an immediate threat to the visitor (if malware is detected, Google Search will show the more aggressive “This site may be harmful to your computer” message), the “this site may be compromised” notice is aimed mainly at owners who constantly monitor their website’s appearance in Google search, urging them to pay immediate attention to the problem.
Recently there was some discussion about Google’s new search engine results page, which offers the ability to see a preview of the websites within the results. Here is one of the original posts; it appears that Google is now testing, or has released, this in the United States results.
In reality this is not a new idea; they are basically borrowing it from Search Me.
I do like the idea of letting users view the quality of a website before clicking. I wonder if they plan on adding this to PPC? It would change the whole game.
You’ve probably heard that site speed is now one of the more than 200 factors that Google is using to rank search engine results. The reactions range from, “Everybody panic!” to “This will make it easier for the big sites to stomp the smaller ones,” to “Well it’s about time.” I actually don’t think that most smaller sites are going to suffer because of this change. Sure, the big guys can afford to have their sites hosted on faster, dedicated servers, but some of the worst sites when it comes to speed are sites of big cheeses, particularly those who sell expensive things. You can see in the screen shot the yawn-inducing graphic you have to sit through before you can actually do anything on one such site (a luxury watchmaker). If anything, it will be the sites that are electronic monuments to big egos that are going to suffer most. There are a lot of users like me who see that “Loading, please wait” widget as the perfect reason to click the “Back” button.
If you read Matt Cutts’ blog post for April 9, you’ll learn the reasoning behind Google’s decision, and why Cutts doesn’t think it’s going to be that big a deal. Here’s a recap.
Why Matt Cutts Doesn’t Think It’s Going to Be That Big a Deal
- Your site will depend much more on factors like reputation, relevance, and quality of content, and compared to these factors, site speed will be a relatively small factor in your ranking.
- Less than 1% of queries will change now that site speed is incorporated into the ranking algorithm. In terms of search results, there shouldn’t be a noticeable difference because the average SERP only shows about 10 results.
- Google actually launched this feature a few weeks ago and few people even noticed.
- Google has a whole mini-site dedicated to speeding up your site with plenty of resources and teaching videos about how to do it.
- Small sites are often quicker to respond to this kind of change on the theory that it’s easier to turn around a tugboat than the Titanic.
- And finally, most websites can be made faster with fairly simple fixes, which improves conversion rates and ROI.
Why Some People Say, ‘Oh Yes It Will Be’
Actually, a lot of the hand-wringing over this comes from not knowing precisely how things like “site speed” are measured. Google isn’t terribly forthcoming about how site speed is figured and weighted, so there are still questions surrounding this new ranking factor, such as:
- Is site speed the time it takes the entire page to load, or is site speed how much time per KB loaded? (Actually, code.google.com pushes their open source tool called Page Speed, which you’ll read more about below. Page Speed evaluates speed as page load time: the time elapsed between the time a user “requests” a page and the time the page is fully rendered by their web browser.)
- How will non-commercial, “passion”-based sites ever make it onto page 1 if they can’t afford a fast server or a web designer with mad skills?
- Will this give the big guys more stomping power since they can afford faster hosting and SEO services?
- Will overseas websites be penalized since their servers are in other parts of the world and may load slowly in the U.S.?
- Will this spell the end of small web hosting companies as sites rush to the big hosting companies with the fastest servers?
- Is this a form of “double jeopardy” where sites that already get less traffic and fewer back links due to their slowness are penalized more?
- Google Analytics code can measurably slow down load time – will webmasters be indirectly penalized for using Google Analytics?
What Should Webmasters Do?
Google wants you to speed up your site. They did some experiments where they deliberately slowed search results page loads to see how users would respond. They found out that slowing down a page by 100 to 400 milliseconds produces 0.2 to 0.6% fewer searches. Not only that, searches dropped even more over a period of weeks, and, even if the page loads returned to normal, it took users a couple of weeks to return to their normal search habits!
And sure, Google wants you to do lots of searches because the more you search, the more money they make. Therefore Google likes sites to load quickly so you won’t become frustrated and stop searching.
The first thing webmasters should do is to use some of the official tools for measuring site speed that Google offers. Go to Google’s Webmaster Central blog post on the topic of site speed (also see screen shot) and try out some of the speed measuring tools they suggest, including the Google-approved Firefox / Firebug add-on called Page Speed, the Yahoo! tool called YSlow, and WebPagetest to show your page’s load stats and generate a checklist for optimization. Google’s Webmaster Tools, under Labs, then Site Performance will show you how fast your website loads to users around the world.
Tips for Speeding Up Your Site:
- Reducing size of responses, cached pages, and downloads (minimizing payload)
- Reducing upload size (minimizing request overhead)
- Reducing the number of serial request / response cycles (minimizing round-trip times to the server)
- Improving your layout (optimizing browser rendering)
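A couple of these tips, sketched in HTML (the file names are illustrative):

```html
<head>
  <!-- One combined, minified stylesheet instead of several small ones
       cuts both payload size and the number of round trips -->
  <link rel="stylesheet" href="/css/site.min.css" />
</head>
<body>
  <!-- Page content renders first... -->
  <!-- ...then scripts load at the bottom of the body (or with the HTML5
       async/defer attributes) so they don't block rendering -->
  <script src="/js/site.min.js"></script>
</body>
```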
It seems we’re always hammering home the importance of link building, external links, and inbound links, and that sometimes makes us minimize the importance of the links available right there on your own pages. After all, you have complete control over the pages on your site, and if they have matured to where they have page rank, then that’s even better. There’s a lot you can do with respect to how your pages pass rank, which influences how search engines view the content on your pages.
SEO and high ranking for competitive terms actually has a lot to do with your internal link structure, though you might never know it for the choruses of “link building or die!” (which, yes, we’ve been guilty of as well). A new site that’s designed well, that’s themed and structured topically around a specific handful of keywords, has a better chance of rising to the top of the SERPs than an older site that doesn’t take this structure into account with respect to internal links, content, and naming conventions of filenames.
Optimizing Internal Linking
Since you have control over your internal pages, you might as well make the most of your on-site SEO opportunities. There’s a lot you can do to increase the relevance of all the pages on your site.
Start by making all your links absolute and getting rid of any secondary keywords that are irrelevant. As your pages mature, you want to make sure that they have names in the format of http://www.yourwebsite.com/pagename.html. That causes your pages to boost each other in the SERPs, and ensures that if your content is copied, the links will point back to your pages, giving you another back link (hooray!).
Put a limit on your outbound links at about 10. This helps you keep your pages focused. The fewer the outbound links, the more link juice the page has to transmit to its own keywords. This should become evident when you start building external links and see how quickly the pages float to the top of the SERPs.
Optimize your anchor text by making sure your main keyword phrase shows up at least once on the page and in the title. If you assign each link wisely, you can nail down a handful of keywords that you want to show up for exactly when people do searches. But don’t optimize any given page for more than three keywords. And if your page is getting much over 750 words, try changing it into two pages with another keyword variation. This can get you double-listed in the SERPs.
Keep in mind that contextual links inside your content should go high up on the page – above the fold if possible. Links that show up higher on the page carry more influence in search engines when compared with footer links. And if you’re using contextual links within your site, make sure you use the main keywords for the page that you want to rank with. In other words, make sure the anchor text on page x uses the keywords for page y that you want to rank for. This improves the quality of the internal back link with time, and in the meantime keeps your relevance high.
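For example, a contextual link on page x pointing at a hypothetical page y that targets “handmade leather wallets” might look like this (URL and keyword phrase are made up):

```html
<!-- Absolute URL, and the anchor text is page y's main keyword phrase -->
<p>See our guide to
  <a href="http://www.yourwebsite.com/handmade-leather-wallets.html">handmade
  leather wallets</a> for care and cleaning tips.</p>
```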
Other Things to Consider
It’s also important to remember that keyword stuffing isn’t good. It’s one of those cases where less is more. Keep it to the basics of once in the title, again in the description, and a few times on the page. And once in your h1 tags.
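For a hypothetical page targeting the phrase “handmade leather wallets”, that basic placement looks like this:

```html
<head>
  <!-- Keyword once in the title... -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- ...once in the description... -->
  <meta name="description" content="Hand-stitched leather wallets, made to order." />
</head>
<body>
  <!-- ...and once in the h1; a few natural mentions in the copy are enough -->
  <h1>Handmade Leather Wallets</h1>
</body>
```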
You know how when you break a toe, the doctor will often “buddy tape” the broken toe to its neighbor to provide support as it heals? Well, you can buddy tape your pages by letting four or five of your highest-ranking pages concentrate their link mojo on your newest page, the one that may be slightly wobbly and needs to build up its strength. This is a good idea whenever you launch a new page.
Sometimes you need to do some housecleaning as well: get rid of off-topic pages by doing a 301 redirect to another page (or your home page) that’s been indexed. Some people will buy a new domain name made up mostly of their keywords, then redirect the old site to the new one. This is a little controversial, and can be painful in the short term, because it will take a few weeks for your rankings to get back up to where they were. However, the rankings should come back stronger in the long run, assuming you’ve done your due diligence with optimizing.
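On an Apache server, such a redirect is a one-liner in .htaccess (the page name and domain here are made up):

```apache
# Permanently (301) redirect a retired off-topic page to the home page
Redirect 301 /old-off-topic-page.html http://www.yourwebsite.com/
```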
This is a sort of risky move, and if you’re put off by the idea, you could instead make sure you have a blog that’s listed in blog directories and start updating it regularly to increase how often the blog is crawled. This should eventually lift the tide for your whole site from all the spidering going on.
If you take care of these on-page optimization techniques, then think how powerful your off-page optimization will be!
The great news is that web hosting is a very competitive market, and you have plenty of choices. There is stiff competition for the opportunity to host your website, so make sure that your needs are met. If one company doesn’t meet your needs, there are plenty of other fish in the sea. Here are the things you should consider and compare before signing on with a web hosting company.