Sunday, April 30, 2006

Advice on buying an inexpensive computer

People keep asking me about their pending computer purchases. Should they buy the cheapest machine they can or pay a little more to get something faster? Should they go with a name brand? Should they build it themselves? Should they switch to a Mac? How much memory? How much disk? Media center? What about Linux?

Q: Should I switch to a Mac? Some questions are easier to answer than others. First off, don't switch to a different operating system unless, at the very least, the person trying to get you to make the switch offers you free 24/7 support. Don't expect switching to solve all your computer problems. Switching from Windows to Mac is a bit like changing from a Model T Ford to an Austin 7: there are significant differences between the machines but both are pretty primitive contraptions. Any benefit to be had from changing from one to the other is likely to be lost in having to re-learn what you used to know. A: Not unless you are already married to the person proposing the switch.

Q: Should I switch to Linux? If Windows is a Model T Ford, Linux is a DIY kit car. Suggestions to the effect that Linux is ready for use by non-computer-literate seniors are pure anorakism. If you want to become a programmer or a systems manager then getting experience with UNIX is worthwhile. If you want to do anything else you will find it quicker, easier and better to use a commercial O/S. A: If you have to ask, the answer is definitely no.

Q: Should I build the machine from parts? A few years ago I used to do this, or rather I would buy from a shop that would put together the parts I selected for a small fee. At the time the name brand manufacturers charged a hefty premium for server class machines. Some name brands still do but there are plenty that do not. Building your own system may still make sense if you are building a very high spec machine with the very latest parts. But doing so is much harder than it was a couple of years back. Modern CPUs run really HOT. Unless you live in Antarctica or are prepared to spend quite a bit of time getting the cooling right, you are likely to end up with a machine that has a habit of freezing unexpectedly. A: Definitely not something to consider unless you plan to spend a minimum of $1,500 on the system.

Q: Laptop or Desktop? Laptops have obvious portability benefits but they also have drawbacks. First, they cost quite a bit more and they wear out much faster. Getting more than three years out of a laptop used daily is very good going. Plan on replacement in two unless you don't use the machine very much or don't travel very often. Laptops are the one exception I make to the golden rule of never buying extended warranties. Ultra-compact machines are particularly fragile. Another drawback to laptops is that your upgrade options are limited. If you plan on playing video games you may find that many games simply don't run on your laptop. A: Laptops are great, but consider the drawbacks.

Q: What CPU should I get? At this point it really does not matter unless you are planning to play seriously cutting edge games or do a lot of video editing. Video editing is the one mainstream application that remains seriously CPU challenged. This is likely to remain the case for several years. High definition and new compression algorithms are both CPU hungry. A: Don't be too worried unless you plan to do professional video editing in which case consider getting extra machines rather than a bigger processor for your one machine.

Q: What memory should I get? All modern operating systems are memory hungry. All will perform much better with 1Gb of memory than 256Mb. I would not consider paying for a processor upgrade until I had at least 1Gb. If you are buying a desktop the cost of a memory upgrade may drop in the future. For laptops this tends to be hit and miss. You may be able to get a cheap upgrade but only if there is a current model that shares the same SIMM format. A: Try for at least 512Mb on a laptop, 1Gb on a desktop. 2Gb is not unreasonable.

Q: How much disk? Buying hard drive space is all a matter of getting the lowest price per Gb. That usually means buying one or two sizes smaller than the current maximum capacity. Today you can get a 500Gb disk at a premium price; a 250Gb drive costs less than half as much. But there isn't much difference in price between 100Gb, 150Gb, 200Gb and 250Gb; prices don't really ramp up until you go past 300Gb. What is much more important than disk capacity is reliability. RAID disk mirroring is cheap and a life saver. With RAID 1 you have two disk drives running in parallel, each holding a full copy of all your data. If one dies your data is still safe on the other. A: Go for the cheapest cost per Gb. Don't worry too much though, because adding extra disk later is pretty inexpensive.
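
As a back-of-the-envelope illustration of the cost-per-Gb comparison, here is a tiny Python sketch. The prices are made up for the example; plug in whatever the shops are quoting this week.

    # Hypothetical prices -- substitute current quotes before deciding.
    drive_prices = {100: 60.0, 250: 95.0, 500: 380.0}  # capacity in Gb -> price in dollars

    for capacity in sorted(drive_prices):
        price = drive_prices[capacity]
        print("%dGb at $%.0f = $%.2f per Gb" % (capacity, price, price / capacity))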

Saturday, April 29, 2006

Collar of the week

The DoJ has brought charges against a security specialist for doing unauthorized penetration testing.

Eric McCarty, 25, is alleged to have probed the USC on-line application system. After discovering an SQL injection vulnerability he reported it to SecurityFocus rather than the university admins, leading to a series of press reports.
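
For anyone wondering what an SQL injection vulnerability actually looks like, here is a minimal Python sketch. The table, field names and data are invented for the example and have nothing to do with USC's system.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE applicants (username TEXT, ssn TEXT)")
    conn.execute("INSERT INTO applicants VALUES ('alice', '123-45-6789')")

    user_input = "' OR '1'='1"  # attacker-controlled value typed into a form field

    # Vulnerable: the input is spliced straight into the SQL text, so the
    # WHERE clause becomes always-true and every row in the table comes back.
    query = "SELECT * FROM applicants WHERE username = '" + user_input + "'"
    print(conn.execute(query).fetchall())

    # Safe: a parameterized query treats the input as data, not as SQL.
    print(conn.execute("SELECT * FROM applicants WHERE username = ?",
                       (user_input,)).fetchall())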

I have been expecting something of the sort to happen for a while: a security specialist makes an unauthorized probe of a system, the owner complains to the police, and the prosecution makes a federal case of it.

As always an indictment is not a conviction and the press release may not state the full facts. This would not be the first time someone got into trouble for unauthorized penetration testing but it may be the first that attracts serious attention.

Discourse.net: Tell Us How You Really Feel

Froomkin blogs on the snippy response from academics on blogging.

I think the snippy commenters miss the point. Blogs are too new for anyone to expect them to be challenging the academic literature. But law is not like physics: theory and experiment interact, the result of an experiment (i.e. a trial) changes from day to day, and the theory (i.e. the statutes) changes as well.

So law is not a static subject. Event data matters: if you are going to be a leader in the field you need to keep up with the latest cases, judgements and laws. Someone who provides the authoritative feed of current information in any field is going to be considered a leader in that field.

In the traditional scholarship model the academic jealously guards their ideas to stop someone else stealing them before the big publication that nobody will read.

I don't write many papers for academic journals but there are hundreds of articles published each year on my work. That is the old model of scholarship, the one that existed before computer-generated indexes made it possible to use publication rate as a cheap indicator of academic performance. Citation indexes don't do the job either: write something really stupid, get it published, and hundreds will cite you in their refutations.

Having the top blog in a field is going to be much harder to fake than citations.

Friday, April 28, 2006

W3C Workshop on Usability and Transparency of Web Authentication - Summary Report

The summary report of the W3C Workshop on Usability and Transparency of Web Authentication is online. Well worth reading.

Amazon.co.uk: Doctor Who - Radio Control Dalek

I spent the past week in London at Infosec, so opportunities for blogging were limited. The hotel was a fair walk from the venue, so I took a cab on the first day of the show. The cabbie asked what show was on at Olympia. "Information Security", I replied. "Well that's the wrong show, in'it?", the cabbie objected, "they should have people security".

We had six Daleks as give-away prizes on the stand. People were much more interested in winning a Dalek than the PlayStation Portables we gave away at the casino night. So for the benefit of UK readers who were disappointed not to win one, here is a link where you can buy one of your very own. UK delivery only unfortunately. It may become available in the US now that the SciFi channel is showing Dr Who here.

Friday, April 21, 2006

More Bruce fodder

Turns out that the Atlanta airport scare was due to faulty procedures.

This is being blamed on a 'computer glitch' but the real fault here is the procedures. If you have a testing procedure you have to have a process in place for dealing with a false alarm caused by operators correctly reacting to the test.

Monday, April 17, 2006

Origami PCs begin to arrive

eo has launched its first ultra mobile PC.

It's not actually available until the last week of April, the price is twice the target price, and they don't give any details on the capabilities of the docking station.

In particular, before I bought one of these I would want to know whether the docking station can drive an external monitor and, if so, how this is done. I would want to know how big the power supply brick is as well. I would want to know about the cost of spare batteries.

Talking of power supply bricks, I hope these people get a clue and build the power supply into the docking station/cradle. I find it really irritating when I need to carry around three boxes with two cords to do one job. I know the economics of the separate power brick, but when a product is marketed as ultra-portable, ergonomics should trump economics.

Saturday, April 15, 2006

Using a Thawte Email Certificate with Outlook Express v1.0

'How to encrypt your email' turned into why encrypting your email is harder than it should be. So let's try again.

Step 1: Is your email client compatible?

The first thing is that you have to have a stand-alone email client. I am not aware of any Web Mail offering with S/MIME support to date.

Most modern email clients support S/MIME. Even PGP Inc. supports S/MIME. If you use Outlook, Outlook Express, Lotus Notes, Thunderbird or Opera your existing mail client supports S/MIME. The same is true for most Apple email clients. Even Eudora, after being more or less the lone holdout for ten years, has native S/MIME support in its latest version (7).

Step 1a: Why not PGP?

PGP is a fine security product. OpenPGP is an open standard based on the PGP protocol. The only problem is that relatively few email clients provide native support for PGP. So an explanation of how to use PGP or the open source GPG would also have to explain how to install it. If you can do all that you probably don't need these instructions.

Step 2: Public Key Cryptography 101

To use S/MIME effectively it is best to know a little about what is going on under the covers. In particular you need to know about public keys and digital signatures.

Let's start with digital signatures. Imagine that you wanted a way of proving that you sent a particular email message, one that nobody else could forge. Since an email message is simply information, the only way to prove that you sent it is to add some extra information to the message; we call that extra information a digital signature. If we are going to meet our anti-forgery requirement the information you add has to be special: it has to be information that anyone can check but only you can create.

The way we do this is with a type of cryptography called public key cryptography. In public key cryptography we use two keys. One key is kept secret, the other can be made public. The secret key is used for the operations that only you should be able to do, that is creating signatures and decrypting messages. The public key can be used by anyone to encrypt a message to be sent to you or to check your signature on a message.
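
For the curious, here is a minimal sketch of that round trip using the Python 'cryptography' package. It illustrates the concept; it is not literally what your email client does, and the message text is obviously invented.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # The key pair: keep the private key secret, hand out the public key.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"I really did send this message"

    # Only the holder of the private key can create the signature...
    signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

    # ...but anyone holding the public key can check it.  verify() raises an
    # exception if either the message or the signature has been tampered with.
    public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
    print("signature verified")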

Step 2a: Certificates

Public key cryptography allows us to know with certainty who sent a message and know with certainty that only the intended recipient can read the message provided that the sender and receiver know each other's public key. This is the hard part of the problem. This is the problem I have spent fifteen years working on.

The best answer to date is to use a digital certificate. A digital certificate is a statement that says something like 'The key for Alice is 183828....'. Who signs the certificate... well, that's a long story. Suffice it to say that the question of who signs certificates is loaded with several books' worth of political, technical and legal issues.
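
If you want to see what one of these statements looks like in practice, the same Python package can decode a certificate. The file name is a placeholder; use any certificate you have saved in PEM format.

    from cryptography import x509

    # 'alice.pem' is a placeholder for a PEM-encoded certificate file.
    with open("alice.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    print(cert.subject)          # who the certificate says the key belongs to
    print(cert.issuer)           # who signed that statement
    print(cert.not_valid_after)  # when the statement expires
    print(cert.public_key())     # the public key being bound to the name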

Step 3: Getting a certificate

There are several places you can get certificates. Thawte has the advantage of being free.

[Disclaimer: I work for VeriSign which owns Thawte, I am not speaking for either]

To get a certificate click on the button that says 'Join' and simply follow the instructions.

First you have to set up an account. There are several screens of forms to fill in, all pretty straightforward. Then they send an email message to verify the email address for the account.

Next you can apply for one or more certificates. This involves another set of forms. These are pretty straightforward because you have already filled in most of the information. The main oddity is that you have to respond to a second email to prove that you own your email address.

Once the application is done it may take some time before your certificate is ready. Mine was ready ten minutes later. Again follow the instructions and your certificate is ready to use.

Step 4: Configure your email client.

If you are using Outlook you now need to open the options dialog, select the security options tab and select the options to sign and encrypt email with your newly created certificate.

You can now sign and encrypt mails by selecting the message options tab and selecting 'sign mail' or 'encrypt mail'.

To send an encrypted mail you need to have the certificate of the person you want to send email to. Signing your email means that your certificate will be attached to every signed mail message you send. So the recipient can keep the certificate in their address book along with the rest of your contact info. The email client will then pull it up whenever an encrypted email is to be sent.
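
For readers who like to see what is going on behind the buttons, here is a rough sketch of an S/MIME detached signature using the pkcs7 module of the Python 'cryptography' package (recent versions only). The certificate and key file names are placeholders for the ones you got from Thawte.

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.serialization import pkcs7

    # Placeholder file names -- substitute your own certificate and private key.
    with open("mycert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    with open("mykey.pem", "rb") as f:
        key = serialization.load_pem_private_key(f.read(), password=None)

    body = b"Hello Alice,\r\n\r\nThis message is signed but not encrypted.\r\n"

    # A detached signature leaves the message text readable and sends the
    # signature (and your certificate) alongside it -- what 'sign mail' does.
    signed = (
        pkcs7.PKCS7SignatureBuilder()
        .set_data(body)
        .add_signer(cert, key, hashes.SHA256())
        .sign(serialization.Encoding.SMIME, [pkcs7.PKCS7Options.DetachedSignature])
    )
    print(signed.decode())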

Step 5: Publish your certificate

Probably the best way to do this at the moment is simply to publish the certificate on a Web site.

Step 6: This should be easier

I agree, and I am working on it. There is a ridiculous number of moving parts the user is expected to be aware of. On a usability email list recently someone posted the 'Top ten things users need to know about SSL'; it had 19 entries, and SSL is meant to be simpler than S/MIME.

This was put together in something of a hurry, I will try to redo it with some explanatory pictures, screen cams and so on.

How to encrypt your email

John Aravosis asks how to make sure that the government is not looking at his email.

The short answer here is to get a free email certificate from Thawte and install it in your existing email client. Outlook, Outlook Express, Thunderbird, Lotus Notes, Opera, pretty much every widely used email client supports S/MIME encryption out of the box. No need to buy anything more.

The long answer is that even the short answer is much longer than it should be. A digital certificate is a tool for managing an encryption key, in this case the public encryption key that other people can use to send you encrypted email. Your email client uses the private half of the key pair to decrypt the encrypted emails people send to you.

Managing a digital certificate takes quite a bit of time and effort. Much of this is unavoidable, powerful tools tend to require some care and attention. Regardless of the email encryption standard you use you will need to spend some time thinking about how you publish your certificate containing your public key and manage your private key.

It is not enough to have obtained a certificate; people need a way of finding it when they want to send you an encrypted email. In theory this is meant to happen through the magic of directories, but no Internet-wide directory has been established to date. This means that the best way of publishing your certificate is probably to use it to sign all your outgoing email messages, which causes a copy to be attached automatically to every message you send.

This is already sounding much too complex. And for most purposes it is possible to make encryption dramatically easier to use. It is quite possible to have an email encryption system so simple that the people using it don't need to know it's there. I gave a paper on this topic at this year's NIST PKI workshop (paper, PowerPoint).

The problem here is that John is concerned specifically about interception by a government. The system I have proposed is merely designed to deal with 99% of likely threats. When you are potentially faced with a government adversary the task becomes dramatically harder.

Fortunately for John, S/MIME and PGP are both designed to meet this level of attack. Both use encryption algorithms such as 3DES, AES and RSA 2048 that are believed to be well beyond the cryptanalytic capabilities of any government. Both allow Web of Trust style trust semantics. Both provide security for the justifiably paranoid.

The problem is that this is not the level of security most people need and most people are not prepared to invest the time and effort required to use these systems properly.

Next a review of applying for a Thawte cert.

Thursday, April 13, 2006

Do digital signatures create unintended contracts?

According to a recent UK court judgment, Mehta v J Pereira Fernandes SA [2006] EWHC 813 (Ch) (07 April 2006), almost certainly not in any common law jurisdiction.

The term 'digital signature' has always been a liability. The legal system has a very clear idea of what a signature is. A digital signature is something else entirely.

A digital signature provides a nearly unforgeable proof that the party that created it knew the private signing key corresponding to the public key used to verify the signature. If the private key is only known to one particular party that amounts to nearly irrefutable proof that the message was authenticated by that party.

People often ask me if this point has ever been tested in a court of law. To my knowledge it has not, and I don't expect this to happen any time in the near future either. Cryptography offers a degree of certainty that is far, far greater than any other form of evidence courts are asked to deal with.

Cryptography offers certainty but cryptography is only one part of a security system. A lawyer wanting to raise a question about the authenticity of the signature would be much better advised to attempt to raise doubt about the methods used to generate and protect the private key and the infrastructure used to identify the key holder than to attempt to dispute the security of the signature algorithm. Even these arguments are fairly weak; the risks of losing control of a private key are negligible in comparison to the difficulties raised by autograph signatures: Is the signature genuine? Was the document altered after the signature was made?

The real concern when using digital signatures is intention. Does the security protocol generate a signature that the user did not intend to create? Is it possible for a user to repudiate a signature that was intended to create a binding contract?

Most of the scenarios raised in crypto mailing lists are really about intention even if the apparent topic is key security. If a company really wants to use a digital signature as a method of signing contracts they are going to make pretty certain that they protect their private key.

This particular judgment is important because it helps answer the question of whether using a protocol such as DKIM might have the unintended side effect of causing an unintended contract offer or acceptance to be made. It is clear that it does not. It also helps us answer the other part of the question: is it possible for a signer to repudiate a digital signature intended to be an offer or acceptance of a contract? Again the answer appears to be no.

The one part of the judgment that should be cause for some concern is the conclusion that an email can contain an offer or an acceptance of a contract even if there is no digital signature or other form of authentication whatsoever. That means that it is entirely possible for an attacker to create a forged email and then claim that it is a contract.

The best way to guard against this possibility is to sign every email message with a transparent, infrastructure level signature scheme such as DKIM.
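
As a sketch of what that looks like at the infrastructure level, here is DKIM signing with the third-party 'dkimpy' Python package. The selector, domain and key file are invented for the example; in real life the matching public key has to be published in DNS.

    import dkim  # third-party 'dkimpy' package

    message = (
        b"From: alice@example.com\r\n"
        b"To: bob@example.org\r\n"
        b"Subject: Meeting\r\n"
        b"\r\n"
        b"See you at ten.\r\n"
    )

    # 'dkim.key' is a placeholder for the RSA private key; the public half
    # would be published at selector1._domainkey.example.com in DNS.
    with open("dkim.key", "rb") as f:
        private_key = f.read()

    header = dkim.sign(message, b"selector1", b"example.com", private_key,
                       include_headers=[b"From", b"To", b"Subject"])

    signed_message = header + message  # prepend the DKIM-Signature header

    # Verification does a DNS lookup for the public key, so it only returns
    # True once the real DNS record is in place.
    print(dkim.verify(signed_message))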

Monday, April 10, 2006

Schneier on Security: KittenAuth

Two of the comments in the thread on the latest Turing test nonsense gave me an idea. Comment 1: CAPTCHAs are the way cryptographers get hackers to solve hard AI problems. Comment 2: So what if the poster is a robot, they might have something interesting to say.

So the ideal CAPTCHA would use an AI-complete problem, so that if there was a robot on the other end it would at least be capable of interesting conversation.

The downside to this approach is that every CAPTCHA is subject to a man-in-the-middle attack. There is no way for posters on this blog to know what the CAPTCHA test is being used for. It could easily be relaying challenges from other blogs and then using the answers to spam them. The more frequently CAPTCHAs are used and the lower the threshold for acceptability becomes, the easier this type of recycling becomes.
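
Here is a toy sketch of the relay, entirely simulated in one Python file; the names and flow are mine, purely for illustration.

    # The target blog's CAPTCHA state: challenge id -> expected answer.
    captcha_answers = {"challenge-42": "kitten"}

    def target_blog_issue_challenge():
        # The target blog hands out a challenge with its comment form.
        return "challenge-42"

    def target_blog_accept_comment(challenge, answer, body):
        # The blog only checks that the answer matches the challenge.
        return captcha_answers.get(challenge) == answer

    def human_on_attacker_site_solves(challenge):
        # A human visiting the *attacker's* site solves the challenge,
        # believing it protects whatever content they came for.
        return captcha_answers[challenge]  # stands in for a human getting it right

    # The relay: fetch a challenge from the victim, farm it out to a human
    # somewhere else, then replay the answer together with the spam.
    challenge = target_blog_issue_challenge()
    answer = human_on_attacker_site_solves(challenge)
    print(target_blog_accept_comment(challenge, answer, "Buy cheap pills"))  # True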

Bob on auto exposure

I am still not sure what Bob is trying to get at here. The picture he shows as an example of 'automatic' exposure is actually a simulation. I don't quite know how the D50 (and by extension the F100) would cope in the situation he demonstrates. Certainly the N90 would probably overexpose without compensation. But compensation is one of those things you just get into the habit of doing.

Provided that the negative is not overexposed or underexposed to start with it is still possible to post process to create any print that you might want.

I still believe that telling people to turn auto exposure off is not a good idea even if it does improve the pictures they take. The world has too many intrusive photographers as it is who think nothing of interrupting every event to dither over taking a photograph.

Friday, April 07, 2006

Beyond software usability: The case for luxury

Over the past couple of years I have been increasingly drawn into the ongoing debate on the usability of security. I now think that the pursuit of usability is a mistake.

The problem with the term 'usability' is that it reduces the problem to strict utilitarian goals: people find computers difficult to use, so let's design computer software that makes them easier to use. The nadir of this approach was the vastly irritating dancing paper clip 'Clippy' that Microsoft decided to inflict on users of Microsoft Office.

I don't just want a user experience that meets my minimum criteria for acceptance. I want a system that tries to exceed expectations. I want more than mere utility, I want luxury.

UNIX provides utility. UNIX with a modern GUI interface can even be said to make an attempt at usability. But short of a complete ground-up rewrite nobody could ever mistake UNIX for being luxurious.

Each time I plug a device into my Windows XP machine a little balloon appears to tell me that a new device has been detected. Presumably the designers thought this might be helpful. If I happen to plug a USB 2.0 device into a USB 1.0 port it warns me that this will cause the device to be slow. Presumably because the designers thought I really wanted to be nagged into buying a new computer.

What the designers of XP do not appear to have thought about is how to turn the stupid balloon help notifications off for good or how to set the task bar so that when it hides automatically it stays hidden until asked for. There is actually an obscure registry setting that can achieve this but playing hunt the registry key is hardly my idea of luxury.

A luxurious user interface does not constantly nag for attention. It values my time and takes care not to disturb me without a good reason.

Is luxury possible in a software system? In the 1980s the inventors of Space Invaders and Pacman proved that it was. A video game is a user interface so good that people will pay money to use it. Pacman was so good at this that it caused a national change shortage in Japan.

I don't want every computer program to turn into a video game. In fact I find that one of the more irritating properties of audio and video player software is the apparent need to support sixteen different skins, none of which bears a close resemblance to the look and feel of the platform.

I don't want to be patronized by the computer either. In an attempt to make the computer easy to use Microsoft added Clippy the backseat driver. This mistake would never have been made if the objective had been defined as luxury.

Thursday, April 06, 2006

Windows on Mac

I like Mac hardware but not OSX. The very idea that people might not like OSX is anathema to Mac purists. It shouldn't be. Having used three windowing systems - XUI, X-Windows and Windows - I have certain expectations for how things will work. OSX tries to nanny me into doing everything the Mac way. I don't like that, nor do I see much point in running an O/S that has less application support. The one-button mouse was chosen after a study showed that complete novices could learn to use a machine faster with one button than with two. Twenty years later most of us are not novices any more.

What I do like about Apple hardware is the way it is put together. The only PC maker that comes close to Apple is Sony. Unfortunately Sony have serious build quality problems in my experience and their warranty only covers parts. Every one of our three Vaio laptops has broken before it should. Sony wanted $300 to fix a broken power connector, they use a non-standard part and don't sell spares through third parties.

Apple is guilty of some of the same problems but not quite to the same degree. Also, they do hardware better than Sony does.

Only one small problem stops me getting a Powerbook, the mouse only has one button...

Tuesday, April 04, 2006

Declaration of usability

We, the users, are fed up. Computers are too hard to use and efforts to make them easier often make the problem worse. While we are mightily impressed that if cars had improved at the same rate as computers we would all be driving machines that cost a dollar, go at a zillion miles an hour and are capable of going to the moon and back on one tank of fuel, progress in the area of what we used to call ‘user-friendly’ computing has been less than stellar.

The cars of the 1900s were slow and unreliable but driving one didn’t require a degree in nuclear physics. One could become an expert in every aspect of the earliest motor cars in a few weeks of intensive training. The need for the driver to continuously adjust the fuel/air mixture during a trip was eliminated before the mass production era. Synchromesh gears eliminated the need to double declutch in the 1950s (?). While the car has not matched the computer in cost, speed or fuel economy improvements, the improvements in safety, reliability and luxury should put the computer industry to shame.

For the past twenty years the ‘user-friendly’ qualities of graphical user interfaces have been touted. In particular the ‘desktop’ metaphor developed at Xerox PARC, popularized by Apple and later adopted by Microsoft, has succeeded brilliantly in its originally intended purpose of helping first-time users overcome their fear of computers.

The mere act of adding a GUI does not make an application usable any more than adding a go-faster stripe and a spoiler makes a car go fast. A user interface that is optimized for encouraging the naive user to learn how to use a computer is not the best user interface for daily use.

5th Annual PKI R&D Workshop #1

The keynote is Angela Sasse, who is giving a literature survey of the field so far. Interesting anecdote I had not heard: a bank changed the signage on its ATMs to reflect its new colour scheme after a merger. This led to a series of complaints from customers who had scratched their PINs onto the old acrylic.

The list of things that the user is expected to learn to use SSL has 19 entries. The list of administrator issues has only 13... It is easy to see who buys certificates. Lots of Glasbergen cartoons, I could post one here but there are many more on his site.

Now we are onto questions. The opening question included an assertion that cars did not get easier to use until very recently, so I had to point out that we no longer spend time adjusting the carb during a drive. Bill Burr is trying to argue that a large part of the problem is that people are just not used to computers and that the problem will go away in time. Angela disagrees. I think Bill is right to the extent that people will develop parts of the vocabulary over time, but that is not an excuse for not making things simple.

Now David Chadwick on the fact that X.509 and RFC 3280 have different semantics for name constraints. I think that this is really a consequence of the static trust semantics of X.509. If you want to manage trust on a large scale and make fine grained decisions you are going to need quite a bit of flexibility. X.509 gives only a limited pattern matching capability. Inevitably the semantics of the pattern matching get pulled towards the immediate needs of the person writing the draft.

Monday, April 03, 2006

Nielsen on Web Hype 2.0

Finally a 2.0 story we can all agree on. Web Hype 2.0!

Nielsen does not go into the technology level at all but still finds plenty of irrelevant hype. During the dotCom boom the standard modus operandi of venture capital was to have the companies they invested in spend most of their time and resources chasing the latest technology fads rather than making intelligent technology choices. Which is better, Ruby on Rails or ASP? If you already have your site 90% implemented in LISP or PHP and you are not suffering an immediate problem as a result, the answer is to stick with what you have.

I think that his comments on Bloggers vs the mainstream media and Wikipedia vs Britannica are less on target. Even though the commercial relevance of the stories may not be very strong their social importance is vast.

Nielsen's argument that newspapers face a bigger threat from the loss of classified advertising than from bloggers is probably correct. In the end the loss of classified advertising is going to cause most of the local print media to fold entirely. If Google opened up a version of Sidewalk it would be game over for any local newspaper in its coverage area.

In Clayton Christensen's terminology the local newspapers face a disruptive technology change. The local newspaper does not add enough value to the content it produces to justify separate publication of a newspaper in every small town in the country. The national newspapers in the US face a rather different threat; there the issue is very much credibility. The mainstream media in the US has been set on a disaster course since before the O.J. Simpson trial persuaded CNN to get out of the news business and promote soap operas instead.

The problem isn't bias, it's the triumph of punditry and opinion over actual news reporting. The problem with punditry is that the people who want to watch raw partisan punditry are much more likely to want to watch a fight where the scales are tipped so that their side is certain to win. A news station cannot deliver a show like Hannity and Colmes without losing the non-partisan audience. Fox News knows that it doesn't have many liberals watching.

The CNN management made the mistake of thinking that the distinguishing feature of Fox was the direction of its bias. They shifted to the right in an attempt to hold on to their right wing audience. A more logical response to Fox would have been to chase the opposite end of the spectrum. The best response would have been to do some real political journalism. The Abramoff, Noe and MZM scandals were all potential gold mines.

Bob Blakely on Leica vs Nikon

Bob argues that too much technology in a camera makes us lazy. He has abandoned his F100 for a Leica III.

I also have a Nikon N90s which I virtually never use, but that's because I have largely abandoned film for digital. I now use a D50, which is practically identical to the N100 in its interface.

The needs of landscape photography and portraiture are very different. Unless you are being paid to have your picture taken the photographer has no business interrupting people. The biggest advantage of digital in turning bad photographers into mediocre ones is that they no longer spend ten minutes posing people and waiting for the conditions to become perfect, which of course they never will; the longer you make people wait, the more bored they become.

I don't think the automation in the camera makes a great deal of difference. Once you have decided what you want to take a picture of, composed the shot and focused on the subject of interest, there are only two real choices left to make on a camera: aperture and shutter speed. And the choice of one strongly constrains the other, particularly in digital photography where the only objective in making the shot is to capture the desired depth of field and preserve the dynamic range.

The flash gun on the other hand has more options and flexibility than can be imagined. You can point it in six different ways for a start, then there is the issue of front or rear curtain sync, exposure and so on.