I find it extraordinary that the HCI community should feel so defensive about its worthiness of support. You know I barely consider myself part of the HCI scene (I don't, for example, publish in what I would regard as recognisably the area, though you and probably Nigel Birch might disagree), yet to me it's blindingly obvious why HCI research should be funded.
To my mind you are supporting some fundamental fallacies about funding research and about the nature of both research and computing science, and these show up in your discussion. Questions of funding anything are essentially economic, and questions of funding anything from the public purse are political. While there are those who would advocate economics as a science, even a magazine like "The Economist" thinks that the economic theories we've had foisted on us to date have done more harm than good (and essentially offer no worthwhile insights). And anyone who thinks that politics is the realm of the rational, rather than the rationalisation of prejudices, is in for disappointment (I give you Margaret Thatcher and Adolf Hitler as examples of the same thing here: both were popular with their publics because they instinctively homed in on the popular prejudices of their times).
In regard to the nature of research, I am wholly unconvinced. Your argument seems to be one of trying to find underlying theories, and your examples are all about serendipitous discoveries. These are essentially the stuff of scientific research, and I don't think Computer 'Science', let alone HCI, is in that territory at all. There are many who would say that 'Computer Science' is an engineering discipline and that calling it science at all is a dangerous misnomer. The people who want to call it a science are essentially people from a mathematical background who feel a natural affinity for science (which uses mathematics a lot as its natural language), yet ignore the fact that mathematics isn't a science either. (Try Popper's view of science as being about falsifiable questions: if you can't in principle falsify it by experiment, it ain't science.) Discoveries in mathematics are not at all like the scientific discoveries you refer to, but are in fact much more like intellectual breakthroughs in engineering (e.g. the proof of Fermat's last theorem). Both mathematics and engineering deal with human artefacts, and 'computer science' is a whole study built up around one particular kind of artefact: the digital computer. If you build computers there's no doubt you're an engineer, but if you try to understand software people think you're doing something else entirely. But there's no difference: if I can do it in software I can do it in hardware; the software is just a (huge) convenience (think of a Turing machine, or of putting your favourite algorithm onto dedicated logic).
Ultimately science (and mathematics, and engineering, Uncle Tom Cobley and all) is a cultural activity, and should be compared with opera, or the movies for that matter. What value do we put on these things? Usually we let history judge: 90% of everything is rubbish, and what survives is, by definition, worth keeping.
From this viewpoint HCI sorta fits the 'engineering' model for Computer Science, but it also has aspects which are recognisably those of the psychologist. HCI also has lessons for the process we used to call systems analysis, and could take us into aspects of design which art schools would claim as their own. I am sure there are others, but I am quoting my own particular hobby-horses. Most people would recognise these concerns as being relevant to computer science. What the HCI community tends to do is to try to suppress the computer science aspects of the subject, so its members find themselves floating about, not really attached to anything, wondering why they should exist at all (a typical reaction when you've blown most of your body away). There are quite a lot of good grungy teccy things to do: how to handle event-driven systems, or do sexy graphics in real time, or convince people that this synthetic image is real, or discover what it is that users really want, or need, or user interface metrics (or any kinds of metrics really)... plenty to keep us all going doing things which need to be done. If you don't think they need to be done, you should hear what even the best people in the digital media industry have to say about the horrible interfaces to their work.
The sorts of argument you have been trying to sustain are just the sorts of argument people come up with when there isn't enough money in the system. British culture is essentially anti-science (scientists are 'anoraks' who don't get paid very much — and here I am talking about the popular view of science, which includes engineers and mathematicians and other relatives of Tom Cobley) and anti-intellectual ("too clever by half") at heart, so it doesn't understand that the wealth of nations depends on such things. Had we in the UK supported science properly in the early post-war period (and dealt with all the other things which systematically prevent this country from exploiting what science base it has), we wouldn't be having this debate. The adequate funding of science (etc.) would have given us German levels of prosperity and German levels of science funding. Nobody would be questioning the value of HCI research, because we could afford to support all manner of cultural activities, including generous 'science' funding, and people would approve because the returns on the relatively few successes would make the whole exercise worthwhile. Because we missed the boat in the 1950s, and effectively compounded the felony thereafter, we have less and less to spend on research, and find that we can afford only an unrealistically small level of failure in such circumstances. The sums don't add up, but the politicians want us to fight among ourselves for the scraps, not to start wondering why the meal is so inadequate.
One pernicious argument advocated against doing any research anywhere is that the products of research can be picked up by anyone, so there is no benefit to the people who do it. Ignoring IPR issues for the moment, this argument ignores the concept of sector advantage, where a particular line of work in a particular place leads to that place being the only place where a whole variety of activities make any sense. I quote Hollywood and Silicon Valley as particularly fine examples of this, but there are other examples all over the world. You have to start somewhere, sometime, and if you survive long enough you can probably build that sector advantage. Once you have it, it's hard for others to compete. Think of the science park around Cambridge.
Culture is a funny thing. It is very hard to change; studies have shown that local cultures survive for centuries. Scottish culture, particularly in regard to attitudes to so much of what we have been addressing here, is quite different from English culture, which has its own history. The reasons are irrelevant; the fact is that locally there is a (modest) cultural sector advantage in science and technology, and three hundred years is not nearly enough to make any significant dent in that. Maybe we might see some improvement in the science vote relative to England come devolution, and then all this angst would be consigned to history, where it belongs. The same argument suggests that nothing much will change in the English science vote, so they'll still be arguing these issues, and wasting their time, until the cows come home.
I note that UK spending on science has gone down over the last five years, while that of all our international competitors is going up.