There’s no spam in Facebook, right?

Something rather depressing landed in my inbox today. I subscribe to the mailing list over at Startups.com. They send through offers every few days, and some of them are pretty cool. Then there’s this:

[Screenshot: the Startups.com email offering Facebook likes for sale]

It seems it’s now possible to pay to get Facebook likes. Presumably the company offers the people giving the likes money or some other sweetener to encourage them to hit that blue button of power.

This strikes me as very much the same thing as black hat SEO: tricking the system to seem popular and sell more stuff. It’s a shame, really: we’re planning on putting Like buttons on CloudTrawl.com when it goes live, but with people buying likes, doesn’t that cheapen the whole thing just a little?

Sharing reports with customers

I’d like to tell you a little about what we’re working on right now. In the past we’ve had quite a few requests from web consultants who’d like to be able to share exported reports from DeepTrawl with their own branding attached.

That’s been a priority for CloudTrawl from the beginning. If you’re a consultant working in web design, we think CloudTrawl is exactly the kind of thing you’ll want to use and share with your clients, because:

– CloudTrawl proves the site you’ve created is consistently available & functioning well
– It allows both them and you to rest easy knowing if there’s a problem you’ll be alerted
– CloudTrawl is an awesome value-add service you can provide & shows you care about their site

So how are we going to make this work? Perhaps by allowing you to export a PDF containing a report and manually email it to your client? Nope, that’s so last century.

Surely the best way would be to allow them to log into CloudTrawl directly, see reports themselves and optionally change settings so they can get alerts for things like downtime and broken links.

That’s exactly what we’re doing. We’re implementing a feature called site sharing. Your CloudTrawl account could contain tens or hundreds of sites, all being constantly monitored. You can choose to share any one of these with anyone. If they don’t already have a CloudTrawl account we’ll automatically send them an email inviting them to create one for free; they’ll then be able to see and interact with the reports and settings you’ve shared with them.

As an added bonus, when that user logs in they’ll see your branding.

That’s some serious added value for your clients. For a low monthly fee you’ll be able to add all the sites under your care, share their reports with your customers and prove you care about their site. Feel the love!

Where can I see how Google ranks my site in the USA?

Just a very quick tip we think is worth sharing.

When you’re outside of the USA and you go to Google.com, quite often you’ll find that you’re redirected to one of Google’s country-specific sites. This presents a problem for webmasters. If we’re outside of the 50 states, how can we tell where our site is ranking in the world’s biggest economy?

Well thankfully the answer is really simple. All you need to do is use this address for Google:

http://www.google.com/ncr

This will always take you to the US version and you can see where your site is ranking for specific keywords as if you were sitting in the USA.

Book recommendation: The Lean Startup

Many of us are quite familiar with the concept of testing ideas. Not sure which Buy Now page design will work better? Use an A/B test and find out. Not sure if a feature of your site is clear and easy to use? Get some users in front of it and watch how they use it.

Services like Google Web Optimizer and Google Analytics experiments make this so simple it seems crazy not to test any idea with real people and get real figures on what works and what doesn’t.

This is the type of thinking encouraged by The Lean Startup, an influential new book by Eric Ries. He systematically dismantles the reasons for using what he calls “vanity metrics”, e.g. how many new visitors your site is getting per month. The reason he doesn’t like this type of thinking? Because if you’re making changes to your site it’s way too easy to imagine those changes are the reason for the increase in visitors. In fact it may be down to word of mouth, and your graphs would keep going up even if you did nothing.

It may appear that this book is for hardened software developers and their CEOs, not for web designers and site owners, but we’d argue there’s a lot to be learned here for both camps. A web site is a big interactive thing, and site owners can easily fall into business traps just like software developers.

Now a word of warning: in our opinion the book became a little repetitive, but even if you only read the first few chapters it could get your brain buzzing, and soon enough you may find your thinking rewired and your way of viewing your work greatly enhanced. You can get the book here.

Uptime check frequency – why does it matter?

Since we started planning our uptime monitoring service we’ve wanted to offer something different – not just a me-too service but a game changer. In short, we wanted to do uptime checking better than anyone else.

One of the big differentiators is how often a monitoring service checks your site. Is it once per hour? Once every 5 minutes? The industry consensus seems to be that once per minute is adequate for everyone. That’s an assumption we wanted to challenge. At first glance, checking every minute may seem fine. When you receive a downtime alert SMS, does it really matter that it came perhaps 50 seconds after your site went down? We say yes, and here’s why.


The Slashdot effect

This is a pretty common issue when running a site. You want lots of visitors; millions would be nice. You’ve paid for a server or hosting service which can deal with your normal amount of traffic, but then a massive spike comes along from a popular link. Now your site should be serving huge numbers of requests. But it doesn’t; it falls over under the weight of the traffic.

If you know the site has gone down you may be able to quickly add capacity or deal with the influx by replacing some of the big images and keep it online. In that situation 50 seconds is too long to wait. You could easily have lost several thousand viewers. If you’re selling stuff how many sales will that lose you? We’d bet quite a lot.

Even for your regular traffic 50 seconds could mean losing important viewers. Not good.

Another time you need to know right away is when you’re doing updates to your site. Perhaps a piece of hardware is being changed out. Perhaps the server settings are being tweaked. In that situation isn’t it best to know immediately if there’s a problem? Especially since you’re probably sitting right in front of your computer ready to fix any issues.

History matters

The next reason for more frequent checks is viewing the uptime history of your site. The more often the checks, the better the history.

[Image: uptime history chart]

When viewing charts like the one above it’s important to know the figures are accurate. Is your service provider really maintaining the uptime you expect? Is it time to switch? Better data allows you to make better decisions.

So for these and many less dramatic reasons we decided that 1-minute checks simply aren’t good enough. We’re developing our uptime monitoring to check every 30 seconds. That’s more frequent than anything we’ve seen anyone offer – in fact, twice as frequent as the existing market leaders.


Let’s talk a little about confirmations

There’s another thing to think about when trying to get meaningful downtime alerts as fast as possible: false positives and how we deal with them. A lot of existing services will only send you an alert once more than one of their monitoring stations has seen your site as down. This is because monitoring stations themselves aren’t infallible. The network could be flaky near one of the stations, and that’s why it sees your site as down. So it’s a good idea to get another station to confirm before alerting you.

The problem with a lot of services we’ve seen is that they’ll do this in their own sweet time. A station will check your site, see it’s down and then wait at least another minute for another one to confirm it.

We made an architectural decision that when a station sees your site is down it will immediately ask another station to check it. That way you get alerts right away, and we make sure there aren’t any false positives.
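To make that concrete, here’s a minimal sketch of the check-then-confirm flow in JavaScript. It isn’t CloudTrawl’s actual code: the peer-station URL, its query parameter and its response shape are all made up for illustration.

// Minimal sketch of "down? ask a second station immediately" (Node 18+).
// PEER_STATION and its { up: true|false } response are hypothetical.
const TARGET_URL = 'https://www.example.com/';
const PEER_STATION = 'https://peer.example.com/check';
const CHECK_INTERVAL_MS = 30 * 1000; // one check every 30 seconds

async function isUp(url) {
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(10000) });
    return res.ok;
  } catch {
    return false; // timeouts and network errors count as down
  }
}

async function checkOnce() {
  if (await isUp(TARGET_URL)) return; // all good, wait for the next pass

  // Looks down from here: don't wait a whole extra cycle,
  // ask another station to confirm right away.
  const res = await fetch(PEER_STATION + '?url=' + encodeURIComponent(TARGET_URL));
  const { up } = await res.json();

  if (!up) {
    console.log('ALERT: ' + TARGET_URL + ' confirmed down by a second station');
    // this is where the SMS / email alert would go out
  } else {
    console.log('False positive: the second station saw the site as up');
  }
}

setInterval(() => checkOnce().catch(console.error), CHECK_INTERVAL_MS);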

One last thing: Realtime

One final observation we made was that other services give you no feedback about what’s happening until something goes wrong. This unnerved us. You can sit and look at some services’ web interfaces and have no clue anything is happening at all. For peace of mind we added the Realtime view.

[Screenshot: the Realtime view]

This constantly shows what CloudTrawl is doing. You can actually see the countdown until the next check, where it’ll come from and the results of the last check from every worldwide monitoring station.

To sum up, we:

– Check your site every 30 seconds (twice as often as our competition)
– Perform immediate confirmations (no false positives, no delays)
– Show a realtime view, letting you see exactly what CloudTrawl is doing and exactly what the state of your site is at any time

These three things are why I’m personally very proud of what we’re doing with our uptime checking, and why I genuinely believe there is no service out there which can beat us.

Want cool charts for your site?

With the release of CloudTrawl drawing closer we’ve been concentrating on polishing the user interface. We’ve been working hard to make it feel like something webmasters intuitively already know how to use. To achieve this we’ve taken some inspiration from services such as Google Analytics. Their interface gets one thing really, really right: charts.

Analytics charts look awesome and are really easy to use. Based on that inspiration we’ve come up with our own charting system which we believe is just as cool and intuitive:

[Screenshot: an interactive CloudTrawl chart]

The fully interactive chart above took only two days to implement. Now we can re-use it over and over to show lots of different kinds of data. It isn’t Flash, it isn’t a static image, and it has rollovers and all the other neat stuff you’d expect.

So how did we get something so feature-rich up and running so fast? Easy: Google Charts.

Like a lot of the stuff Google does, this is so easy to use it makes you want to cry. We’re a Java shop, so we used the GWT API, which allowed us to create the extra controls for viewing data between two dates.

But if all you need is a simple chart with some copy-and-paste HTML, this is really easy. Google’s Quick Start guide has some script which you can copy, paste and edit to show your first chart with your own data in a couple of minutes.
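To give you a feel for it, here’s roughly what that copy-and-paste version looks like. This sketch uses the current Google Charts loader with made-up data, so the exact script in the Quick Start guide may differ slightly:

<html>
  <head>
    <!-- Load the Google Charts loader, then draw a line chart into #chart_div -->
    <script src="https://www.gstatic.com/charts/loader.js"></script>
    <script>
      google.charts.load('current', { packages: ['corechart'] });
      google.charts.setOnLoadCallback(drawChart);

      function drawChart() {
        // Swap these rows for your own data
        var data = google.visualization.arrayToDataTable([
          ['Month', 'Visitors'],
          ['Jan', 1200],
          ['Feb', 1900],
          ['Mar', 1600]
        ]);

        var chart = new google.visualization.LineChart(document.getElementById('chart_div'));
        chart.draw(data, { title: 'Visitors per month' });
      }
    </script>
  </head>
  <body>
    <div id="chart_div" style="width: 600px; height: 300px;"></div>
  </body>
</html>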

A shot of their chart gallery is below to give you an idea of some of the cool stuff you can use to jazz up your site:

[Screenshot: the Google Charts gallery]

Many data centers run far too cold

I remember a few years ago I was working in a data center which was so cold we needed to wear sweaters and gloves just to work.

Recently we’ve heard a lot about hosting providers moving to colder countries just to save on the expense and environmental impact of keeping servers cool while avoiding downtime due to machines running too hot.

I just read an excellent Wired article which concludes that many machine rooms run far too cold anyway. Many of us could save a lot of money, CO2 & frozen hands just by dialling up the temperature. It’s a highly recommended read for anyone involved in operating a data center:

World’s Data Centers Refuse to Exit Ice Age

Regression tests are awesome things

So I’m sitting here waiting for the regression tests for a new CloudTrawl component to run. If you’re not in the know, a regression test is a bit of code which tests some other code – kind of like a checklist to stop a programmer making dumb mistakes.
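If you’ve never seen one, here’s a toy example in JavaScript – the slugify function and its checks are invented just for illustration, but the shape is typical:

// A toy regression test: a bit of code that checks some other code.
const assert = require('assert');

// slugify() is a made-up function, here just so there's something to test.
function slugify(title) {
  return title.toLowerCase().trim().replace(/\s+/g, '-');
}

// If a future change breaks slugify(), these checks fail immediately
// and the dumb mistake never reaches the live site.
assert.strictEqual(slugify('Hello World'), 'hello-world');
assert.strictEqual(slugify('  Uptime  Matters '), 'uptime-matters');
console.log('All regression tests passed');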

While my mind wandered I thought –

“Why on earth don’t regular web sites have regression tests?!”

And then it hit me: that’s what we do. Seriously, DeepTrawl & CloudTrawl are the regression test. Awesome. Job done. Thought over.

Link checking JavaScript links

Having worked on DeepTrawl for a long time, I’ve seen a few questions come up again and again. Some of these are equally applicable to CloudTrawl (since it also does link checking) and I’ll try to address some of them in this blog.

One of the really common support questions is “does your link checker work with JavaScript links?” The answer is always no, and it’s likely neither product ever will support them. Here’s why.

JavaScript is a tricky beast. In the simplest case it might seem easy for our software to follow a JavaScript link. Perhaps one looking like this:

<a href="#" onclick="window.open('somedoc.html', 'newWindow')">Click Me</a>

The problem comes in that onclick attribute. You really could have anything in there. For instance, there could be JavaScript to:

– Ask the user which of several pages they’d like

– Take account of their session (are they logged in?) and act accordingly

– Get some information from the server

The examples above can stretch out into infinity, because JavaScript code can do anything you want; that’s the power of using a programming language in a web page.

So the answer might appear to be that we should build in a JavaScript engine and have our products do exactly what the browser would have done when it encountered that link. The problem really comes when the script has some interaction with the user. There’s no way to know what the user would have done, so there’s really no way to know where that link should lead. The code might be asking the user to click one of two buttons, but it could just as easily be asking them to sign up for a service (entering a new username, address, etc.) and giving them some unique content based on what they entered.

If we tried to implement this we’d get into a spiral of problem solving, and at every step there would be things we couldn’t solve perfectly. We’d have to fudge it and take guesses. I don’t think guesses are what anyone wants from a link checker. We want certainty.

When I first started thinking about this problem I decided to take a look at what some of our competitors were doing. Some claim to be able to follow JavaScript links, so I tested them out with some real examples and found results I didn’t like.

They would cope with very simple examples like the one above, perhaps even slightly more complex ones, like this:

<a href="#" onclick="window.open('somedoc' + '.html', 'newWindow')">Click Me</a>

(Note that here I’m building somedoc.html by adding together two strings).

But if things got a little more complicated they just refused to check the link. That isn’t what I want for our customers. I’m not saying our competitors are intentionally misleading in their marketing – there are a lot of very simple JavaScript links on the web – but I think people would expect all types of links to be handled, and I just don’t believe we could ever live up to that promise.

Since we can’t make a good guarantee, I didn’t want to make one at all. False negatives are really bad; our products saying they’ll find an error and then failing to find it is the kind of issue that keeps me awake at night.

Usually when people ask, I tell them it’s a good idea to always have a regular <a> link with an href attribute to match every JavaScript link, even if the ‘copy’ is hidden away at the bottom of the page. This means DeepTrawl, and now CloudTrawl, will be able to scan the site for broken links. But there’s another really important reason to make sure all your links can be found somewhere in the page in regular, old-fashioned <a> tags.

Google and other search engines don’t guarantee to follow JavaScript links. A lot of the time they discover the content in a site the same way our technology does, by starting at the first page and following all the links. If a link is in JavaScript the search engines may not follow it. So, they may not see your pages and won’t be listing them in the search results. Ouch. Better get putting in those regular <a> tags!
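In practice the fix can be as simple as one of these two patterns (the file names here are just placeholders):

<!-- Option 1: keep the JavaScript behaviour, but give the link a real
     href for link checkers and search engines to follow. -->
<a href="somedoc.html" onclick="window.open(this.href, 'newWindow'); return false;">Click Me</a>

<!-- Option 2: leave the JavaScript link alone and add a plain copy of
     the link elsewhere in the page, e.g. tucked away in the footer. -->
<a href="#" onclick="window.open('somedoc.html', 'newWindow')">Click Me</a>
<!-- ...rest of the page... -->
<a href="somedoc.html">Some doc</a>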


Legacy Internet Explorer vs. 99.999

Right now we’re hearing a lot about individuals being asked (begged?) to move away from Internet Explorer 6. As many readers may know, even Microsoft is getting in on the act.

This is important for home users. Security is a big problem for all web users, and combining a lack of security awareness with a browser which won’t be patched at all in coming years is a really, really nasty mix.

There’s a reason why many of these users haven’t already upgraded: if they’re really happy with IE6, that probably means they’re not into the latest and greatest web apps. They spend their time doing email and browsing Amazon & eBay. Productivity boosts possibly aren’t their bag.

But here’s the thing. There’s a very large group of people who really would benefit from using the latest web technologies: pretty much everyone who works in an office.

Google Docs is basic but awesome for collaboration. Lucid Charts freaking rocks. I challenge anyone to watch their demo and tell me Visio is more compelling. More and more web apps are appearing which push not just what a web browser can do but what a productive professional can do.

Even better, I.T. departments don’t need to get involved for users to start playing with these things and proving their value. Maybe for free, maybe by using their department’s expense account, users themselves can start using these apps and test their value without I.T. needing to invest time or budget. I would imagine that I.T. departments would be thrilled by this.

In the I.T. world there’s long been talk about the five nines. This means systems should have a guaranteed uptime of 99.999%, which amounts to no more than about five minutes of downtime per year. This is an excellent goal (which, of course, as an uptime monitoring service we applaud).
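Where does that five-minute figure come from? A quick back-of-the-envelope check (in JavaScript, but the arithmetic is the point):

// 99.999% uptime leaves 0.001% of the year for downtime.
const minutesPerYear = 365 * 24 * 60;                  // 525,600
const allowedDowntime = minutesPerYear * (1 - 0.99999);
console.log(allowedDowntime.toFixed(1) + ' minutes');  // prints roughly 5.3 minutes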

Seeking this value means I.T. has its head screwed on when buying systems; they’re serving the organization well by keeping its employees productive.

So here’s the elephant in the room: the next improvement in productivity may not come from internal systems. It may very well come from applications chosen by end users and hosted in the cloud.

That makes it seem really, really crazy that many I.T. departments hang on to historic browsers which won’t work well with these new productivity aids.

We understand the reasons. IE6 at least has an understood security profile. Probably more importantly, many organizations have internal applications which were written specifically for IE6 and absolutely will not work with anything else.

Since Microsoft plans to stop supporting it, organizations are going to have to move beyond IE6, and they’re doing so right now. At the same time they may need to upgrade or dump their creaking IE6-compatible applications. That will hurt. Users will lose data. Hair will be torn out.

I believe many applications used by organizations in the future will be chosen by the users and won’t have much to do with the I.T. department. But there will still be many applications which will be developed for or sold to organizations from the top down. These will cost a lot of money, probably $100k+.

So here’s my plea to larger organizations: when considering buying any new software system, please, please make sure it’s standards compliant. It should work perfectly in every modern browser. Its HTML should validate. It shouldn’t use Flash or anything else which keeps you locked into technologies which may disappear.

This way, the next time I.T. wants to upgrade browsers company-wide they’ll be able to do it without fear of breaking everything, and users can use the latest and greatest innovative web apps without being held back by an ageing browser no one in the company wants to keep.

The commitment to 99.999% uptime is awesome; let’s make sure we can keep a similar commitment to keeping workers as productive as they want to be.