Category Archives: Internet

IE7, Beta 1

Quite some time ago, Microsoft announced renewed interest in updating their flagship browser, Internet Explorer. It is a widely known fact that Internet Explorer has some serious deficiencies in terms of compliance with web standards. The deficiencies I speak of are a major pain in the arse for most web developers who choose to develop sites according to the web standards.

On the Friday just passed, the IE7 team announced some new features which will definitely be included in the next release of IE. In case you don’t feel like clicking the link, the couple that have been confirmed are:

  • support for the per-pixel alpha channel in PNG images (this can be used for transparency)
  • addressing major inconsistency problems in the CSS engine, including fixes for the peekaboo & guillotine bugs.

I really did have some serious reservations about what the new IE7 would bring to the table. To be honest, I thought it was going to be more of a security release with very minor improvements to the rendering engine. If this release ends up fixing some or most of the major issues that web developers face battling IE and the standards, it will be a glorious day.

Go Microsoft, I mean, go Team IE!
Did I just say that aloud?

Apple OS X Tiger & Safari 1.3

Apple have recently announced that the new version of OS X, code-named Tiger, is about to be released. Among the 200+ cool features being added to the system this time, the most significant one I have anything to do with (since I don’t own a Mac, though I did recently purchase an iPod Mini) is Safari.

Safari 1.3 boasts a huge feature set, improved performance, thousands of bug fixes; the list is long.

This really is making the gap between every other browser and Internet Explorer more and more evident. Essentially, the top five browsers outside of IE now have full support for the standards; to me this is just a wonderful thing. In the near future, maybe we’ll be without CSS hacks for the other quality browsers and maybe, just maybe, it’ll push Microsoft to join the game.

D. Keith Robinson

During the week D. Keith Robinson mentioned he was taking his personal site, Asterisk, in a new direction, accompanied by a new design. He was going to wait for the CSS Reboot, but he just had to get it out of his system.

The downside at the moment is that his older content, of which there were just gallons, is temporarily gone; no doubt the new location will be announced shortly.

I must say the new design is a fresh change for Keith, and I think he should feel happy and liberated that the repeated patterns are now gone. I wasn’t averse to them, but after a while they grew a little tiresome.

In my opinion, it is a simple and elegant new design; great job Keith.

Rel=”nofollow” Follow Up

The rel=”nofollow” attribute is certainly coming into effect already, with quite a few prominent weblogs implementing it themselves, whether by hand, by installing a patch/update, or by using a plugin.

In my previous comment about it, I mentioned that I felt it isn’t the search engines’ job to filter out spam, and that it should rest on the owner of the site to make sure their particular backyard on the internet is mowed.

With that in mind, we clearly need to come up with some alternative methods to combat spam. There are a few options which would slow down most spammers, though not all of them; let’s investigate a few.

The first is mandatory registration on your site to leave a comment. The problem with forced registration is that it doesn’t lend itself to someone following a link to your site and leaving a quick comment. Signing up on every site is just a pain in the arse, you know it and so do I, so for the moment that isn’t an option.

Secondly, I think forced comment moderation is an option. However, if you have an active blog, the inherent workload for the owner is quite high. There is also the downside that people leaving comments on your site can’t see them, or participate with other users, until you approve their comments. Not ideal; we’ll leave it for the moment.

Third, and this one isn’t all that likely: allow anyone to post comments to your site and have them go live immediately, but examine them automatically for spam content as they are posted. This is fine, except where spammers leave a non-spam-like comment with a link; at that point it gets posted and they get their reward. We could take it further and parse their input, pull down the text of the page they are linking to and scan the HTML for illegal keywords (in the same line of thinking as Squid might if it were proxying content). A rough sketch of that idea follows below.
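To make that a little more concrete, here is a minimal Python sketch of the keyword check; the blacklist, size limit and timeout are all assumptions of mine, and a real implementation would want far better heuristics.

```python
# Rough sketch only: fetch the page a commenter links to and scan its visible
# text for blacklisted keywords before letting the comment stand.
from urllib.request import urlopen
from html.parser import HTMLParser

BLACKLIST = {"casino", "viagra", "payday"}  # hypothetical keyword list


class TextExtractor(HTMLParser):
    """Collect the visible text of a page, ignoring the markup."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def link_looks_spammy(url, max_bytes=50_000):
    """Pull down the linked page and check its text for blacklisted words."""
    try:
        html = urlopen(url, timeout=5).read(max_bytes).decode("utf-8", "replace")
    except Exception:
        return True  # an unreachable or broken link is treated with suspicion
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks).lower()
    return any(word in text for word in BLACKLIST)
```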

Fourth, and this is really a category of tactics: user interrogation when they post. For instance, they go to post and, before they do, they have to enter a string that is blurred within an image (this has been done before). What about a random but easily answered question? This line of thinking, I think, would make it much harder for spammers to automate their attacks, especially if the challenge were random. A small sketch of the question idea is below.
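For the random question variant, something along these lines would do; the question pool and the idea of stashing the expected answer server-side in the visitor’s session are my own assumptions, not any particular package’s behaviour.

```python
# Rough sketch only: attach a random, easily answered question to the comment form.
import random

QUESTIONS = [  # hypothetical question pool
    ("What colour is the sky on a clear day?", "blue"),
    ("How many legs does a cat have?", "4"),
    ("Type the word 'human' backwards.", "namuh"),
]


def issue_challenge():
    """Pick a question to render with the form; the answer is kept server-side."""
    return random.choice(QUESTIONS)


def challenge_passed(expected_answer, submitted_answer):
    """Compare the visitor's answer with the one stored in their session."""
    return submitted_answer.strip().lower() == expected_answer.lower()
```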

Fifth, change the way we accept comments. For instance, most spammers will pick a particular type of blogging software and attack it because it is simple. Look at MT: when you submit a comment with that software, the feedback is always posted to comments.cgi or the like. If I were a spammer, that would make my life very simple. Make it more complex: let’s make the submission URL synthetic, so they can’t hardcode it. Let’s link the synthetic URL to the visitor’s session id and make it available for only x minutes at a time. Check that the referrer for the submission is in fact your own site and that the HTTP header information is all there and intact. Something along these lines is sketched below.
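Here is a rough Python sketch of what I mean by a synthetic, short-lived submission URL; the secret key, the ten-minute window and the example.org domain are placeholders I have made up for illustration, not anything MT or any other package actually does.

```python
# Rough sketch only: a submission token tied to the visitor's session id,
# valid for a short window, plus a basic referrer check.
import hashlib
import hmac
import time

SECRET = b"change-me"                  # hypothetical server-side secret
WINDOW_SECONDS = 10 * 60               # token is valid for roughly ten minutes
OWN_SITE = "http://www.example.org/"   # placeholder for your own domain


def make_submit_token(session_id, now=None):
    """Build the token the comment form embeds in its synthetic action URL."""
    bucket = int((now or time.time()) // WINDOW_SECONDS)
    msg = f"{session_id}:{bucket}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()


def submission_allowed(session_id, token, referrer, now=None):
    """Reject posts with a stale or forged token, or a referrer from elsewhere."""
    if not referrer.startswith(OWN_SITE):
        return False
    bucket = int((now or time.time()) // WINDOW_SECONDS)
    for b in (bucket, bucket - 1):  # allow the previous window as a grace period
        expected = hmac.new(SECRET, f"{session_id}:{b}".encode(), hashlib.sha256).hexdigest()
        if hmac.compare_digest(token, expected):
            return True
    return False
```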

At this point, I haven’t thought the fifth item right through; however, I feel there might actually be some merit in it. What about a combination of all of them, varying from submission to submission, just to keep them guessing a little?

What ideas have crossed your mind about it?