Asterisk – the Future of Telephony

  on June 4th, 2007

Finally, after over a year of dabbling, I have had success creating my own Private Branch Exchange (PBX) at home! A what? It's a telephone system not unlike one you would find at a medium to large company. Yes, it's got extensions, it's got voicemail boxes, it's got (potentially) least cost routing and it's got interactive menus! Press 1 to continue reading this blog or press 2 to commit ritual suicide! Yes – callers to my home phone number will soon be in for phone menu hell!

Asterisk is the key to this new little project. Those who know me will not be surprised to hear this is an Open Source application that adheres to Open Standards and runs under Linux. It really is a remarkable piece of software and is, I believe, fairly dramatically changing the face of corporate telephony systems. It's what is termed a 'disruptive technology' – a technology that destroys the status quo in its niche and really levels the playing field for everyone, especially newcomers. Until very recently your only option for an Interactive Voice Response (IVR) system, or even a semi-competent office PBX, would be an expensive closed system – tens of thousands of Pounds plus annual support contracts. Now practically anyone can create such a system for virtually nothing. All you need is a computer running Linux plus an Internet and/or telephone network connection.

In my case I bought a slightly exotic telephone adapter called a Sipura SPA-3000. I had hell getting it to talk to Asterisk. This wasn't Asterisk's fault, nor that of the Sipura. The real problem is that the Sipura was primarily designed to work alone, rather than behind a complex PBX like Asterisk.

Voice over IP (VoIP), IP being Internet Protocol, is a phrase you will hear more of in the future (and may very well have come across already). You've heard of Skype, I expect. That's a form of VoIP. Skype (rhymes with type, NOT typie) has created its own proprietary protocol and is very closed source and rightly criticised for it – you cannot interoperate with it, you cannot bug fix it, you cannot port it to other platforms (Windows, Mac OS X and Linux (though only x86 architecture) are supported) and, possibly most importantly these days, you cannot inspect the code to see what it might be doing to your system. All in all, Skype should be avoided for both trust and conscience reasons. But I digress.

Asterisk supports the open VoIP standards that increasingly everyone except Skype uses, namely SIP, IAX2 and H.323. My Sipura, as the name suggests, supports the most popular of those protocols – SIP. It has a socket for a phone line (in my case going to a Telewest/Virgin Media outlet), a telephone handset socket (going to my DECT basestation) and a normal Ethernet network port, plus a power connector. It presents the two phone-type sockets as SIP devices, i.e. they can connect to a SIP service such as Sipgate, VoIP.co.uk, VoXaLot, etc. In other words you can use a Sipura to give your normal phone access to, potentially, free phone calls over the Internet. The Sipura can be configured to route your outbound calls either over the Internet or through the normal phone network depending on the number you are trying to call. It's clever stuff and only the tip of the iceberg of what this device can do. Sadly I wanted none of that; I just wanted a plain mindless SIP device that would connect to my Asterisk box. That was the tricky bit. Nonetheless I got there, eventually! I should say a big thank you to Fay of the East Grinstead Linux Users Group and the chaps at Runtime Consultancy Services – seeing Asterisk in action really brought the concepts home; I was really floundering before that!
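For the curious, the Asterisk end of that boils down to little more than a couple of SIP peer definitions. A minimal sketch of the sort of thing involved is below – the names, secret and context labels are placeholders rather than my actual configuration:

    ; /etc/asterisk/sip.conf (illustrative sketch only)
    [sipura-pstn]                ; Line 1 of the Sipura, the FXO side facing the Telewest line
    type=friend
    secret=changeme
    host=dynamic
    context=incoming-pstn        ; calls arriving from the phone network land in this context

    [sipura-phone]               ; Line 2, the FXS side with the DECT basestation plugged in
    type=friend
    secret=changeme
    host=dynamic
    context=internal             ; calls dialled from the handset start here

The hard part, of course, was persuading the Sipura to register both of its lines against entries like those in the first place.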

I should also point out trixbox, which is basically a preconfigured Linux installation (based on the great CentOS – a Linux distribution I use all the time at work, in fact I'm using it right now over this lunchtime) with various Asterisk packages already added, ready to go. There is a VMware image there along with a bootable CD image for installation onto a real machine. I'm using the VMware virtual machine image, with the plan of building up my Asterisk configuration there and then copying the config files across to my existing home server (it has a maximum of 512MB of RAM, so running VMware on that headless machine is impractical but it should be able to run the Asterisk packages natively).
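When the time comes, copying that configuration across should amount to little more than pulling a handful of files out of /etc/asterisk on the virtual machine, along these lines (the hostname is a placeholder and I'll no doubt have to tweak things once they land on the real server):

    scp root@trixbox:/etc/asterisk/{sip.conf,extensions.conf,voicemail.conf} /etc/asterisk/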

In addition I should say that there are far easier ways to interface Asterisk to the normal phone network. There are several PCI cards available, most notably from Digium (the people who first developed and open-sourced Asterisk – they realised they could make more money selling commodity hardware by opening up Asterisk than they'd ever make by selling bespoke PBX systems). You can get really cheap cards for about £30 on eBay. I wanted something a bit more reliable. Digium's decent cards were coming out at about £160 with one FXO (Foreign Exchange Office – i.e. BT, Telewest, Bell, etc.) and one FXS (Foreign Exchange Subscriber – i.e. phone handset) socket. The Sipura had excellent reviews, didn't take up a PCI slot (my main server only accepts half-height cards) and was somewhat cheaper at about £90. A Digium card would, however, just have worked out of the box!

That was all a load of nonsense to most people, I am sure. Basically, since the breakthrough late last night of both making and receiving Telewest calls via Asterisk (with the Sipura in between) I am buzzing with the opportunities to tinker around with Asterisk. All that is really left is getting CallerID to come through from the Sipura to Asterisk. Based on the CallerID I will either let the call through directly (falling back to voicemail, which will be immediately emailed to me, or possibly redirecting to my office or mobile when I'm away, using traditional VoIP as clearly my Telewest line will be busy) or push the caller into some form of IVR menu trap – I'll let those who I deem worthy know the secret way to get out of it! A worrying Text2Speech message may be included to scare off telesales people… And so the fun begins!
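In dialplan terms the plan sketches out something like the following – placeholder numbers and context names only, nothing I have actually deployed yet:

    ; /etc/asterisk/extensions.conf (sketch of the intended call flow)
    [incoming-pstn]
    exten => s,1,GotoIf($["${CALLERID(num)}" = "01234567890"]?trusted,s,1)  ; a known caller...
    exten => s,2,Goto(ivr-trap,s,1)              ; ...everyone else falls into the menu

    [trusted]
    exten => s,1,Dial(SIP/sipura-phone,20)       ; ring the house handset for 20 seconds
    exten => s,2,VoiceMail(u100)                 ; no answer: mailbox 100, emailed out via voicemail.conf
    exten => s,3,Hangup()

    [ivr-trap]
    exten => s,1,Background(main-menu)           ; play the menu prompt and wait for a digit
    exten => 1,1,Goto(trusted,s,1)               ; the 'secret' digit lets worthy callers escape
    exten => t,1,Hangup()                        ; timeout: goodbye, telesales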

 


 

BBC On-Demand

  on February 5th, 2007

The BBC recently put up a Consultation for On-Demand Services on their web page. What does that mean, I don’t hear any of you asking. Basically they are talking about delivering their content over the Internet. There are several worrying aspects to their proposals.

Before I get into the details I should further explain the idea a bit. Those of us in the UK pay a Licence Fee that funds the BBC. Every home with a television must pay this (currently about £130 a year) or risk nasty things such as a fine or imprisonment. That’s not the issue, though. It is more the status of the BBC. They are a ‘Public Corporation’, incorporated under a Royal Charter. Above all they are a Public Service Broadcaster. There are no adverts and the vast majority of the funding comes direct from the Licence Fee (additional amounts come from the BBC selling programmes abroad, DVD/CD/Book/Magazine/etc. sales and some government money to subsidise OAP Licence Fees and the BBC World Service). So in a very real sense the BBC is owned by the people.

For some time now the BBC have had a wonderful Internet presence. That is absolutely within their “inform, educate and entertain” remit, their News site is particularly notable (and respected by people well beyond these British shores). Now they want to take it a step further. Again, for some time (he says vaguely) the BBC have broadcast most of their radio stations over the Internet in, roughly, realtime. More recently they’ve introduced the ‘Listen Again’ service so you can catch up with radio programmes you’ve missed from the past seven days. Now they are talking about doing the same with TV programmes.

So, what’s the problem? Yes, that does sound great but how do these questions sound from the consultation?

  • How important is it that the proposed seven-day catch-up service over the internet is available to consumers who are not using Microsoft software?
  • The BBC Trust has proposed setting a limit of 30 days as the amount of time that programmes can be stored on a computer before being viewed. As this is a nascent market, there is currently no clear standard on the length of the storage window. On balance, the Trust thinks 30 days is the right length of time. How long do you think consumers should be able to store BBC programmes on their computers before viewing them?
  • The BBC Trust concluded there was fine balance between public value and market impact in deciding whether to allow the BBC to offer audio downloads of classical music. While such downloads could help introduce new listeners to classical music, they could also deter purchases of commercial recordings. What is your view on whether – and to what extent – the BBC should be allowed to offer radio broadcasts of classical music as audio downloads over the internet?

Hmm, I don’t like some of that – and nor should you.

Let's take that first one – How important is it […] to consumers who are not using Microsoft software? Obviously I'm a Linux user but that isn't the point. Are they seriously proposing that they force the public to use one company's software – expensive, closed source (and foreign) software? How can that possibly be in the public interest? As a public broadcaster they absolutely must ensure as many Licence Fee payers as possible can use their services. They should not be propping up a convicted monopolist such as Microsoft. Imagine if they had instead said 'consumers who are not using Apple Mac OS X'. What an uproar that would have created. Many more people would have complained about them singling out one company and one group of users. Just because the majority of machines come preloaded with Microsoft software (clearly not Apple Macs) that doesn't make it right. All Licence Fee payers have paid for these programmes. All deserve access to them.

Next up, the 30 day limit. Why should I, as a person that funds the BBC, have my access to material I've already paid for limited to 30 days? All Licence Fee payers should have unlimited access to the BBC back catalogue – big fat full stop. We should not be subject to DRM (Digital Rights Management) to limit our access to that material. There is no reason why we shouldn't be able to download a programme and keep it forever. In practice I would hope that we won't ever need to save such content as it will be available at all times. Nonetheless it belongs to the people and if we feel the need to burn it off to DVD so be it – we paid to have that content created in the first place. Yes, there is a danger that this content will fall into the hands of those who don't pay the Licence Fee (this is already happening with bittorrent and foreign fans of things such as Doctor Who) but the potential damage of that is minimal. In theory foreign TV stations might see a drop in viewing figures and may eventually stop buying BBC programmes. It's a danger but, from what I can discover, that would only amount to, at most, a loss of about £150M in an annual BBC budget of over £4 billion. Small fry and not nearly enough to justify limiting the British public's access to our own programmes.

There's also the question of DVDs and the like. Well, in the next decade or two, those will be dead either way. You could make an argument for special features selling the DVDs/CDs but then you get into the murky area of those having to be funded entirely from the commercial arm rather than the BBC proper (otherwise they will be paid for by the Licence Fee again). As I said, though, the days of physical media like DVDs and CDs are numbered. Books and magazines may continue for a while but even those will one day be downloaded to a reader tablet of some kind. Don't let the BBC go down the road of forcing us to pay for content twice over.

As for the last point, 'fine balance between public value and market impact', that's a tricky one. Or is it? Let's not forget the mantra – we've already paid for the BBC content. Yes, if they make classical music available free to every Licence Fee payer that could well cut into sales of commercially funded classical music. This sort of music is a special case, I suppose, as anyone with an orchestra can put out a CD of the same music. The BBC has some excellent orchestras (see my Prom in the Park blog entry for more details) but who pays for them? Us! Commercial competitors will have to differentiate themselves in some way. That is no reason to restrict us to our own paid-for content!

Rant over. I urge all you Brits to fill in the BBC On-Demand Consultation questionnaire. According to its unique charter, the BBC is "free from both political and commercial influence and answers only to its viewers and listeners". It's your BBC too.

 


 

Random Spam

  on August 25th, 2006

It never stops, does it? 07:45 to 20:45 at the office yesterday, then on my way home I stopped in to help someone install and set up a Freeview box, then home and off to bed without supper! Not a typical day but nor does that sort of non-stop action seem too unusual either. Why is it there is always more to do than time to do it? Never time to get bored these days, that's for sure.

What else has been occupying those few stolen moments recently, then? Spam. Damn spam. A.K.A. Unsolicited Commercial E-mail (UCE – note the correct spelling of e-mail, not that I tend to use it, it's email to me!) You see I've just managed, after several years, to enable the Staggering Stories emails. Yes, all those emails on the Contact Us page haven't worked since day one. Whoops. I've been meaning to do something about that for quite a while but see the previous paragraph as to why it's taken so long!

For the first two or three years of Staggering Stories' existence the email addresses were right there in plain ASCII text as part of the Contact Us page. Not surprisingly, spammers quickly got the idea of harvesting email addresses from web sites, and if Google can find our site to index it so can they. What must be a year or so back I decided that this wasn't a great idea and now, if you look at the page source, you'll notice that instead of hard coded email addresses there are script calls such as <script type="text/javascript">getaddress("adam");</script> Anyone with Javascript enabled in their browser (and that is just about everyone these days) will see the address as before but robots (also known as spiders, the programs that index the web) most likely don't interpret the script. Or so I hope. It may be too late, though…
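For anyone wondering what that getaddress() call actually does, the idea is roughly this (a sketch of the technique, not the real script on the site):

    // Sketch of a getaddress()-style obfuscator: the full address never appears
    // in the page source, so harvesters that don't run Javascript never see it.
    function getaddress(name)
    {
        var domain = "staggeringstories" + "." + "net";  // assembled from pieces
        var address = name + "@" + domain;
        document.write('<a href="mailto:' + address + '">' + address + '</a>');
    }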

In the last 5 days I've had over 350 junk emails to a Staggering Stories address and that's after some fairly aggressive anti-spam measures I've implemented in the last few days. My first effort was Greylisting, in the form of Postgrey. This works on the theory that most spam these days comes in from zombies. Yes, zombies are sending out most of the spam. Really! Of course, when I say 'zombie' I don't really mean the animated corpses who crave human brains. Instead they tend to be Windows machines that have been security compromised, almost certainly with the owner being completely ignorant of the fact, and are being backdoor controlled by a spammer. The spammers have thousands of such machines at their command, which form what is known as a botnet. The botnet is then commanded to send out spam, using the bandwidth of the compromised systems. Suddenly the spammers have a huge resource of thousands of machines sending out thousands of spam emails an hour each – all for free. These botnets are also employed to take down Internet sites by focusing all that bandwidth power on one poor site in a big sustained assault. Of course, to the owners of the individual zombie machines all they might notice is a slowdown of their computer and Internet link as their machines do the botnet's bidding in the background.

Generally zombie machines don't use their ISP's email servers to send the email and you will never see this spam appearing in your Outbox or Sent Items. Instead the machines connect directly to the destination email server (in Staggering Stories' case my own Bytemark Virtual Machine), effectively pretending to be another email server that is passing the message on. This is where Greylisting comes in (and notice the odd use of the English spelling, as it should be, rather than the more likely American English). A real email server works within a well specified (in theory) protocol. If an email server isn't able to handle an incoming message at that time it will tell the sending email server and the sender will queue it up to resend later. Zombie PCs don't normally follow the protocol, they fire and forget – if it fails to get through first time they won't queue and resend. Why bother when you've got a list of 50,000 email addresses to spam? Of course Greylisting does cause a delay in the email being received but once it has successfully had a resend from a machine it then allows future emails from that one through first time.
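On the Postfix side, wiring Postgrey in is only a line or two in main.cf – something like this sketch (the port shown is the Postgrey default on Debian-style installs; yours may differ):

    # /etc/postfix/main.cf (sketch): hand unknown senders to the Postgrey policy daemon
    smtpd_recipient_restrictions =
        permit_mynetworks,
        reject_unauth_destination,
        check_policy_service inet:127.0.0.1:10023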

Did it work? Not noticeably, no. I’m not sure why yet. Perhaps zombie software has improved to do resends. Perhaps my Greylisting isn’t working at all. I’m not sure yet how to test those theories so it was on to another measure: Real-time Blackhole Listing.

A Real-time Blackhole List (also known as a DNS-based Blackhole List) is the somewhat controversial idea of checking the IP of every machine connecting to your email server and outright banning them if they appear on said list. So, we've got a list of bad IP addresses and we're going to ignore email from any of those. Doesn't sound too controversial. That is, until you think how that list was built and maintained. There are at least a dozen such lists available, the majority free, and they vary greatly in how they obtain and maintain their lists. There are conservative lists and very aggressive lists. Some just use email honeypots, email addresses they seed on the web, newsgroups or elsewhere, that are never used for real emails, and any IP that sends an email to one of these is automatically banned (perhaps after a number of transgressions, perhaps only one). Then there are lists that contain known 'dynamic IP addresses' (almost, but not quite, exclusively used by home users). Many people take the not entirely unreasonable view that emails should never be coming directly from people's home IP addresses, as legitimate users always relay it through their ISP. There are other schemes too, such as running email content analysis (much as your email client will do) to guess whether an email is spam – too many of those from your IP and you're out too. Many lists combine such techniques.

Some even allow you to appeal, though not all. That's where it really gets nasty. If you find yourself on such a list, rightly or wrongly, it can be nearly impossible to get yourself removed. Some lists are very clandestine, in an attempt to avoid the wrath of the spammers (and let's not kid about that, the spammers are increasingly linked with the Russian Mafia – there's big money in spam). So, can I trust the quality of these lists? What the heck – I've got to try something! In the end I chose about a dozen different filter lists, based on nothing more than an Internet recommendation, and crossed my fingers.
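Plugging those lists into Postfix is a similar one-line-per-list affair in main.cf – the list names below are just examples, not necessarily the dozen I actually picked:

    # /etc/postfix/main.cf (sketch): reject connections from IPs on the chosen blocklists
    smtpd_recipient_restrictions =
        permit_mynetworks,
        reject_unauth_destination,
        reject_rbl_client sbl-xbl.spamhaus.org,
        reject_rbl_client list.dsbl.org,
        check_policy_service inet:127.0.0.1:10023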

Did it work? Not noticeably, no. I’m not sure why yet. I am beginning to get the feeling my email server, Postfix, isn’t checking with any of these filter lists, let alone performing the Greylisting. More investigation is required…
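The obvious first steps in that investigation are to ask Postfix what restrictions it thinks it is running with and then to watch the mail log for greylisting or blocklist rejections:

    postconf smtpd_recipient_restrictions   # what Postfix is actually configured with
    tail -f /var/log/mail.log               # (or maillog) look for 'greylisted' and 'blocked using' lines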

 


 

A Tale of WWWoe

  on August 16th, 2006

Some people have noticed that the Staggering Stories Forum has been rather, er, absent for the past few days. Why is it absent, I hear you ask. No, nothing I’ve done (for once). It is my long time host for the Staggering Stories site, PlusNet. They’ve accidentally deleted the forum. Again.

This will actually be the second time I've had to recreate the forum after they've wiped my data. Last year, on the 14th of April, they had some kind of disk failure and lost all their customers' CGI sites. It was supposed to be routine maintenance. Before it was due to happen they stated they would back up all the data as a precautionary measure. Apparently it was during this backup that they suffered 'multiple disk failures'. No, they didn't have earlier backups to fall back on. It was all gone. They referred us all to their terms and conditions and washed their hands of it. Your data, you rebuild it. For a company that hosts other people's data for them that was a pretty shoddy response. I didn't have a backup of the forum, foolishly thinking I could trust them with my data, and spent a frantic weekend recreating it from scratch. The only mercy was that their MySQL database server, where all the forum posts were stored, wasn't affected. I just had to recreate the program and theme…

I am probably missing an incident or two that didn't affect me but the next one I became aware of was in late November 2005. A script designed to check their CGI platform accidentally resulted in 'a small number of sites removed in error'. Fortunately that time I wasn't hit. Again they washed their hands of it and left the poor souls to rebuild again. They did, however, promise to address the issue to ensure this didn't happen again.

Then rolls around the 9th of July 2006 and PlusNet famously loses over 700GB (yes, Gigabytes!) of their customers' emails. It transpires that one of their engineers accidentally reformatted a live disk array rather than the backup pack, ending up with two clean and shiny disk array packs! This cock-up was major headline news for technology news sites and it even made the BBC. Thanks to the publicity they at least tried to recover the data this time by sending the drives off to a professional data recovery firm but, alas, it was doomed.

After such a high profile loss of data you'd think they'd learn their lesson, wouldn't you? Sadly not. On the 11th of August 2006, sometime between 07:54 and 12:20, they did it again. That pesky CGI maintenance script turned on PlusNet's customers once more! It started deleting everyone's CGI directories. Their engineers noticed pretty quickly this time and killed it but not before it had eaten its way through a good number of sites starting with A and B (mine being ajpurcell, sadly). So, there goes the Staggering Stories Forum, again. They claim that this time they did have backups. Unfortunately they had an error in that backup script too and some of those backups had failed, mine seemingly included. I say 'seemingly' because they haven't actually told me yet. In fact my Customer Service Ticket has gone completely unanswered for four and a half days, not so much as a note saying 'we're investigating'. Nothing. They claim that only about 10 customers' sites have failed to be restored. Not too many for them to inform us personally, is it?

I’m now in the process of moving both the Staggering Stories and my Adam Purcell domains to a new host – Bytemark. They offer more storage (4GB over 250MB!) and even a full Virtual Machine where I control everything. I install and configure the OS, the web server, the database, everything. They provide backup space on another machine but it is my responsibility to use it. They keep the machine physically up and running and with an Internet connection and leave me to manage the rest. No more relying on PlusNet clowns with their squirty flowers and hilarious scripts.
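Since the backups are now down to me, the plan is a simple nightly job pushing the important bits across to that backup space – something along these lines, with placeholder paths and hostname (and assuming MySQL credentials are stashed in ~/.my.cnf):

    # nightly backup sketch: copy a database dump and the web root to the backup host
    mysqldump --all-databases | gzip > /root/mysql-backup.sql.gz
    rsync -az /root/mysql-backup.sql.gz backup.example.com:staggering-backup/
    rsync -az /var/www/ backup.example.com:staggering-backup/www/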

That being the case, you may notice a bit of downtime with those domains as they are transferred across to Bytemark. There's nothing much to be done about that. For now the restored Staggering Stories Forum is temporarily hosted at forum.freelancepeacekeepingagency.co.uk. It will shortly move to forum.staggeringstories.net but I'll sing that from the rooftops when it happens!

On an unrelated and lighter note I just have to mention, with great pleasure, that Skepticality has returned! It's a podcast, and they describe themselves thus: 'Our podcast is here to bring you relevant, under reported current events, as well as in-depth discussions from a scientific, critical, skeptical, and humorous point of view. In our travels we will tackle the beasts of pseudoscience; the paranormal, supernatural, ufo / alien encounters, mis-understood history, astronomy, space, and overwrought legends – urban or otherwise. Welcome to Skepticality, truth in podcasting.'

Edit (17 Aug 2006): I see that The Register now has a news story on PlusNet’s latest problems, particularly the CGI deletions.

 


 

They drink it in the Congo (well, More4 does)

  on August 10th, 2006

Last night the British TV channel More4 ran a short piece on Ubuntu and its mastermind, Mark Shuttleworth. They also have a news blog entry on it.

This is, of course, excellent. The biggest thing holding back Linux now is not technical issues but ignorance and apathy. Most computer users have never heard of Linux or even really understand what an Operating System is. Those that do generally accept the Microsoft tax, insecurity and vendor lock-in with an 'it does everything I need' attitude. Of course it doesn't do everything they need – a plain Windows install doesn't come with office applications, such as a word processor, and you generally have to mess about with a dozen different drivers from either a manufacturer CD or, worse, somewhere out on the Internet. Install a Linux machine and you get all this, for free, from the moment you first login (and a lot more decent software besides).

To solve these problems we need to get the word out there: Linux is open for business (in every sense), so give it a go. Now, the news report in question doesn't actually mention Linux at all. I'm not sure if that is a good thing or not. Obviously we want people to know that Ubuntu is Linux but do they actually care, especially at first introduction? Probably not. We don't want to confuse people with 'Ubuntu is a distribution of Linux', not until they are ready.

As for Shuttleworth, he appears to be rapidly becoming a figurehead of the Linux movement. Having met him twice now, at both the LUGRadio Live events, it is very clear that he isn't looking to create an Ubuntu monopoly in the Linux world. He is very community driven and a very savvy businessman – you don't get £400 million in your pocket without some business acumen. There have been a few Linux distros in the past that attempted to package up a nice user friendly OS in a similar way to Ubuntu but made the mistake of making their innovations proprietary. They have all failed to gain community and customer acceptance. One or two are still around and now trying to change their ways and release more liberal versions of their software but it is too late for them. Shuttleworth realised that first you need the buy-in of the established Linux community and to do that you have to work on their terms – without Linux, Ubuntu is nothing but without Ubuntu, Linux will continue. Shuttleworth comes across as one of us; he's a very approachable and knowledgeable man and can talk technical with the best of us. Perhaps deep down he really is one of us or maybe it is just a carefully calculated image. I don't think it matters. What Linux needs is someone like him – the public face of Linux. Others have tried to be, or been hailed as, that figure but none has worked – RMS appears to be a frothing-at-the-mouth extremist and even Linus Torvalds himself lacks the charm to really represent the community he helped build. Shuttleworth offers great promise for us, I think.

Ubuntu has also taught a lot of us what it really means to be accessible to the average user. For too long Linux has been a plaything for geeks. It has been 'almost ready' for prime time for probably half a decade or so. All it lacked was the impetus to take it that last few percent of the way. The average end user shouldn't need to go into a command line for anything, not ever. It will always be there for those of us who understand the power of command line shells, scripting languages, editing configuration files directly, etc. Your average Windows XP user will never have used the Command Prompt program, nor should they in Linux. That's where Ubuntu and all those feeding work into Ubuntu are taking us (and there are a lot of people not directly involved in Ubuntu who are helping to bring about this new age of Linux). It will be an interesting few years.