Syncing your Firefox: Google Browser Sync

How could I have missed this for so long? Google Browser Sync keeps Firefox’s bookmarks, sessions, cookies, tabs and even passwords (if you really want to) in sync. As I constantly switch between multiple computers, this is really a nice thing to have. You need a Google account, but as I already use Google Reader and Calendar, I have one anyway. In order to hide your data from Google (as if this mattered, given the amount they are already collecting about us), you can protect the information with a password (PIN).

I’m currently trying it out and so far I am very pleased with it. One more reason to stick with Firefox 🙂

Open Street Map

Today, the German news site Golem.de covered OpenStreetMap in a rather long article. OpenStreetMap is a project I have been watching for some time now, as I think the availability of free maps is very important. I’d like to participate in the project, but unfortunately I have little time and no adequate GPS receiver.

Would anyone in the area of Linz, Austria be interested in a little mapping party one day? The data in this area seems to be rather sparse 🙂 Anyway, I think this project, or any similar project, is definitely important and should be supported by all means. Does anybody know someone at an Austrian government body who could contribute data to such a project?

The Storm Worm

I want to point out a very interesting article by Bruce Schneier about the Storm worm. Illegality aside, the techniques used by this worm are very, very advanced and very interesting from a development and network/load-balancing point of view. Anyone interested in development, network administration, or security should read the article.

The worm has grown into a real epidemic by continuously adapting, changing its code, its code signature, etc. It has managed to infect this huge number of computers because the resulting botnet is hardly ever used; it stays in a dormant stealth mode. Most users are not even aware they are infected, because the worm tries to avoid detection by not using too many resources and therefore hardly attracts the attention of system administrators. Bruce Schneier points out that maybe we should be worried about what’s coming in “Phase II”, once the gigantic botnet is brought into action.

To avoid detection, the worm and the botnet operators apply several advanced load-balancing and stealth techniques, most notably a DNS technique called “fast flux”, which very effectively blurs the traces back to the real operators.
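Just to sketch the idea (the host name and addresses below are made up, taken from the reserved documentation ranges, and real fast-flux setups typically rotate the NS records as well): a lookup of a fast-flux hostname returns a handful of A records with a very short TTL, all pointing at infected home PCs that merely relay the traffic to the real servers. Repeat the same query a few minutes later and you get a completely different set:

; first lookup
flux-example.example.   300   IN   A   192.0.2.23
flux-example.example.   300   IN   A   198.51.100.7
flux-example.example.   300   IN   A   203.0.113.94

; the same query, a few minutes later
flux-example.example.   300   IN   A   198.51.100.201
flux-example.example.   300   IN   A   192.0.2.118
flux-example.example.   300   IN   A   203.0.113.5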

As I said, it is a very interesting read. I recommend you also follow some of the outbound links.

Sharing and Synchronizing Data Across Multiple Computers

I have several computers: one at the office (Windows Vista), one at home (Gentoo Linux), and one notebook (Windows XP). On most of them I want to share a common set of files, including letters and other documents, but also Miranda IM. This time I am going to tell you how I keep my shared data in sync using Unison, PuTTY, and OpenSSH, with a dedicated server as the central hub.

(Note: this is a rather advisory-level HOWTO, not a step-by-step, command-by-command tutorial. It might give you some ideas nevertheless.)
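To give a rough idea of the core of the setup (host name and paths here are just examples, and on the Windows machines Unison has to be pointed at PuTTY’s plink via the sshcmd preference, possibly through a small wrapper): each client has a Unison profile that pairs a local directory with the same directory on the central server, reached over SSH.

# ~/.unison/shared.prf -- example profile on the Linux box
# local replica on this machine
root = /home/me/shared
# remote replica on the central hub, reached over OpenSSH
root = ssh://me@hub.example.org//home/me/shared

# accept non-conflicting changes without asking
auto = true
# never sync temporary clutter
ignore = Name *.tmp
ignore = Name Thumbs.db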

Continue reading “Sharing and Synchronizing Data Across Multiple Computers”

Google Shared Items

Inspired by erik (once more), I started to share some interesting items from Google Reader. I have also included the shared items in the sidebar of this blog (this feature requires JavaScript to be turned on in your browser). As I read a mix of English and German feeds, there can sometimes be German items too. (In fact, as I post this, there is only one shared item and it is German…)

You can browse the Shared Items Page or even subscribe to the RSS feed.

Fighting SPAM in phpBB – Part 2: First Impression

So the previously mentioned MOD for preventing the posting of URLs in phpBB2+ has now been deployed in the tag2find forum for one week. What can I say? ZERO SPAM postings within this period. I had hoped it would reduce the problem a little; I didn’t expect it to eliminate SPAM entirely. I just hope it is not preventing “ordinary” users from posting.

The next step in this experiment will be to disable the CAPTCHA image for anonymous posting and keep it only for signing up. I am really looking forward to seeing how this works out. The CAPTCHA has kept a lot of posters from posting, so I’d be very happy if I could disable it for posting.

Fighting SPAM in phpBB

At tag2find, we are using phpBB2+ for our forum. Unfortunately, this forum is continuously being spammed by bots, despite the active CAPTCHA. Even at the strongest CAPTCHA setting, SPAM postings were still coming through, while the CAPTCHA drove off a lot of potential forum posters who could not get past it.

Therefore I wanted to implement the approach I took for our blog: disable the possibility to submit postings containing links. Unfortunately, phpBB does not allow this out of the box. After searching for a while, I found a promising MOD, which I have now added to the forum. It does not directly prevent posting links; it is more or less a regex-based blacklist of words that must not be used by users who have not been registered for a certain number of days and have not yet made a certain number of posts. The supplied regular expressions aim at preventing the posting of links, but I had to modify them, as they also match “.net”, which we must allow: our application is built on the Microsoft .NET Framework, so this term legitimately turns up in posts.
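Just to illustrate what this boils down to (these are not the MOD’s original expressions, only a simplified sketch), the blacklist ends up matching the usual ways of writing a link, with “.net” deliberately missing from the list of top-level domains so that posts about the .NET Framework still get through:

https?://\S+
www\.\S+
\S+\.(com|org|info|biz|ru|cn)(/\S*)?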

Let’s see if this measure will actually change the amount of SPAM being posted to the forum.

Simple Way of Fighting WordPress SPAM

As I am one of the developers of tag2find, I also write on the tag2find developer blog from time to time. This blog is a WordPress blog, and one of our main problems there is fighting SPAM: we get literally dozens of SPAM comments a day. To limit the amount of SPAM visible on the blog itself, I found a very simple solution which, up to now, has produced hardly any false positives: if a comment contains more than zero links, it is held in the moderation queue.

WordPress offers this possibility out of the box (the “hold a comment in the queue if it contains x or more links” setting), but by default comments are only held if they contain more than 2 links. I tried limiting it to one, but this still missed too many SPAM attempts. Therefore I have now set it to zero. This works remarkably well: no SPAM postings anymore, and we had just one or two false positives, which is not so bad, as those comments are not deleted but just held for moderation.

I know this is a very low-tech approach and puts some work on the maintainer of the blog, but it works almost by the very nature of SPAM, which most of the time wants to deliver links to pages in order to influence Google PageRank and/or lure people onto a website.

RSS Reading Online: Google Reader

I have been using JetBrains’ Omea Reader for quite some time and I was very happy with it. While this worked quite well as long as I was working on just one PC, I soon ran into trouble when I switched from the notebook to a dedicated desktop PC at home and a dedicated desktop PC at the office, while still keeping the notebook for the time in between. Omea Reader was no longer an option, as I run Linux at home and Windows in its various flavors at the office and on my notebook.

I tried several Java-based applications and tried to keep their databases in sync between the PCs, but this soon became tedious and started to annoy me. So I decided to switch to an online alternative.

My choice soon fell on Google Reader, as I already had a Google account. Despite the privacy issue of giving my reading habits away to big G, I am really happy with this reader. It features everything I need and is intuitive to use. I am most pleased that there seem to be very knowledgeable people at Google, as they also provide a keyboard interface for navigating the feeds. This is something I really appreciate, as I favor the keyboard over the mouse.

So to anyone who needs to keep their RSS feeds in sync between various PCs: I can only recommend giving Google Reader a try. I know there are others and I know Google Reader is not particularly new, but I have only just tried it out (as I only now had the need for a service like this).

Google Reader is also one of the first applications to utilize Google Gears for offline functionality. This is particularly interesting for me as a notebook user without a wireless broadband connection available all the time. Up to now I have had no time to test it, but I will give it a try soon and will write about my experiences then.

Note: I had this article prepared for a long time, but I forgot to publish it… This article by erik just reminded me to do so.

phpMyAdmin with mod_fcgid

I am currently migrating my server configuration away from mod_php towards mod_fcgid (the successor of mod_fastcgi), as this allows me to use different system users for executing scripts in different directories. I use this to give every hosted virtual domain its own system user, which should (in theory) prevent one buggy application from taking over all the other hosted domains as well.
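The relevant part of such a virtual host looks roughly like this (only a sketch: the user, group, and paths are made up, and older mod_fcgid releases spell the wrapper directive FCGIWrapper instead of FcgidWrapper):

<VirtualHost *:80>
    ServerName www.client-domain.example
    DocumentRoot /var/www/client-domain/htdocs

    # execute this vhost's scripts under its own system user via suexec
    SuexecUserGroup clientuser clientgroup

    <Directory /var/www/client-domain/htdocs>
        Options +ExecCGI
        AddHandler fcgid-script .php
        # small wrapper script that execs php-cgi for this vhost
        FcgidWrapper /var/www/client-domain/php-wrapper .php
    </Directory>
</VirtualHost>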

I did, however, face one problem: I could not get phpMyAdmin working, and that was a requirement of one of my clients. phpMyAdmin kept popping up the authentication dialog over and over again when using HTTP Basic Authentication.

After searching for some time, I noticed that when PHP runs in CGI mode, the authentication data is not passed on to the script by default. A FAQ entry in the phpMyAdmin documentation provided the solution: a RewriteRule is needed for the directory containing phpMyAdmin:

RewriteEngine On
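# re-export the Authorization header so that PHP running as a (Fast)CGI sees the Basic Auth credentials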
RewriteRule .* - [E=REMOTE_USER:%{HTTP:Authorization},L]

Suddenly phpMyAdmin worked 😉