2010-01-29

The proper keyboard

The other night I got into a discussion with a nerd friend of mine about what might be the best keyboard ever. We agreed that it should give auditory and tactile feedback. It ought to need little force, but also enough to make sure that miskeying is unlikely. She even agreed to a point with my idea that the classical grand piano's touch could have something to do with a good keyboard touch, but I think I sort of lost her about there with the consequences. Thus, I'll dump the whole of my reasoning here.

A proper keyboard needs to have both tactile and auditory feedback. Those two modalities work well together when you're inputting information that will be interpreted by your visual sense -- the segregation of the two modalities heightens both at the same time. Your tactile sense helps you keep track of the keys, your auditory sense provides exceedingly time-accurate feedback at the same time, and all of that is separate from your visual sense, which serves to check and guide the overall picture of what was said. Given that our hearing is accurate down to microseconds, it probably makes sense that the primary auditory feedback should be purely mechanical as well, since the roundabout through a computer would introduce a delay and a temporal uncertainty that is far worse.

Thus the keyboard should be quiet, but it should nevertheless sound/click.

Then, one of the worst things a keyboard can do is to require lifting your fingers off it. To any touchtypist, that sort of thing seems like a swamp that swallows your fingers. Thankfully that sort of thing is rare today.

Third, it's also rather tiresome to have to push down the keys against a viscous, dissipative load, plus a spring force. The spring force negates the previous need to explicitly lift your fingers off, but at the same time it necessitates quite a lot of force on your part for every downward keypress. And since laptops derive the balancing force from a nonlinear and viscously damped rubber pad, any work you expend on the key in either direction is rapidly dissipated. This is a workable balance, but not an enjoyable one, and it's what laptop keyboards pretty much always do.

Fourth, older keyboards, and the better quality current full-width keyboards, do something quite different. They dissipate the end-of-travel energy via a plastic collision at either end, on average they exhibit linear spring tension, and they have a little bit of deliberate hysteresis in the middle, where they click. Here, the design parameter was traditionally where the click happens with respect to the common typing motion. Here we're finally narrowing down to the proper ergonomics.

But then, fifth, even the company my nerd friend pointed to hasn't really gone beyond this paradigm. They haven't really considered whether the traditional design is the best one; they've just gone with the best that we had in the past.

So the final, sixth, and longest point of mine is about keyboard idealism. Bear with me here, because this is gonna be wild.

Clearly we can't get all of those nice aspects in a mechanical keyboard at the same time, unless we resort to very convoluted mechanical designs. Those will then break in no time. So, let's ditch the mechanical design everywhere we can. The plastic, mechanical stops will have to be there, of course, both going up and down. But the rest? Not likely.

Other than that, every key could contain a lightweight, very tightly wound moving coil, fed with a very high voltage (from a buck-boost power supply), set in resin, and moving against both a fixed neodymium magnet at the bottom and a similar fixed coil further out. With enough current (plus regeneration, plus a big enough reserve capacitor), and especially a high enough scanning frequency and proper signal processing, this sort of thing could mimic any and all touchfeels via algorithm. All of the so-called "moving parts", i.e. the coils, could be set in resin and totally isolated with respect to each other, for automatic military-grade assurance of functioning. Pour a bottle of Coke on it and it just keeps working, without any stickiness.
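
To make the "touchfeels via algorithm" part concrete, here's a minimal Python sketch of what a per-key control law might look like: a commanded actuator force as a function of key position, meant to be evaluated at a high scan rate. Every number in it (travel, force levels, scan rate) is an invented placeholder, not a real design.

    # A minimal sketch of per-key haptic force rendering, assuming a voice-coil
    # actuator whose force is commanded from key position at a high scan rate.
    # All numbers are illustrative assumptions.

    def commanded_force(x_mm):
        """Map key position (0 mm = rest, 4 mm = bottomed out) to force in newtons."""
        if x_mm < 1.5:
            return 0.30 + 0.20 * x_mm            # light, nearly linear preload
        elif x_mm < 2.0:
            return 0.60 - 0.50 * (x_mm - 1.5)    # force drops sharply: the tactile "click"
        else:
            return 0.35 + 0.15 * (x_mm - 2.0)    # gentle ramp toward the bottom stop

    SCAN_HZ = 20_000                             # assumed per-key position sampling rate
    for step in range(9):
        x = step * 0.5                           # sample positions 0.0 .. 4.0 mm
        print(f"x = {x:.1f} mm -> F = {commanded_force(x):.2f} N")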

Then, it's not just about force or feedback when you work a keyboard to its fullest. Here, my favourite example is the grand piano. Pianists are able to expend whole watts of power in steady state on their keyboard. We touch typists are not, so we get tired, and our output drops off as our hands tire. What's the reason? Obviously it's because our power output isn't properly impedance matched to the keyboard, and the current keyboard is such a thoroughly dissipative thing. So why not make the above electromagnetic coupling work both ways, so that every keypress leads to electrical input, and the circuit then recouples that energy right back into the typist's fingers to bounce them off the keyboard? That way we could go with "staccato typing" with diminished loss of energy, the way pianists do.
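
As a back-of-the-envelope illustration of the regeneration idea, here's a sketch of per-keystroke energy bookkeeping. The key travel, finger force and recovery efficiency are assumptions pulled out of thin air; only the arithmetic is meant to be right.

    # A minimal sketch of the regeneration idea: energy spent on the downstroke is
    # partly recovered electrically and returned as an upward "bounce" on release.
    # Travel, force and efficiency are assumed values.

    TRAVEL_M = 0.004      # 4 mm of key travel
    FORCE_N  = 0.6        # average finger force over the stroke
    RECOVERY = 0.7        # assumed round-trip electrical recovery efficiency

    work_in   = FORCE_N * TRAVEL_M       # joules spent per downstroke
    work_back = RECOVERY * work_in       # joules actively pushed back into the finger

    print(f"per keystroke: {work_in*1000:.2f} mJ in, {work_back*1000:.2f} mJ returned")
    print(f"net cost per keystroke: {(work_in - work_back)*1000:.2f} mJ")
    print(f"net power at 5 keys/s: {(work_in - work_back)*5*1000:.2f} mW")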

Old school electronic circuits could not have done that. They would have incurred heavy losses because of low voltages, too-weak magnetic fields, and resonant circuits designed for only a single resonant frequency/speed of typing. They would have been massive and dissipative as hell, as well. But today, we could easily implement all of this using a movable coil, a fixed coil, and, if we wanted to save energy, a fixed high-flux permanent magnet per key, with high-frequency active control and the right power circuits for each. No, it wouldn't be cheap. But yes, it could be military grade in more than one sense, and even in civil society that sort of thing could save insane amounts of money on carpal tunnel syndrome treatment, lost productivity due to programmers struggling with their keyboards, and so on.
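
As a quick plausibility check (not a design), the force on a coil in a magnet gap is F = B · L · I; with loosely assumed numbers, a small per-key voice coil lands comfortably in keypress-force territory.

    # Rough plausibility check of per-key voice-coil force, F = B * L * I.
    # All values are assumptions.

    B_TESLA   = 1.0      # flux density in the magnet gap (strong NdFeB gap)
    TURNS     = 200
    TURN_LEN  = 0.025    # metres of wire per turn sitting in the gap
    CURRENT_A = 0.2      # drive current

    wire_in_gap = TURNS * TURN_LEN
    force_newton = B_TESLA * wire_in_gap * CURRENT_A
    print(f"{force_newton:.1f} N of actuator force")   # ~1 N, roughly a keypress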

2010-01-25

Physical steganography

Suppose you want to bug a room and you know somebody's going to sweep it. It's well known that pretty much all radio frequency emitters can be detected using a careful RF analysis. Basically, if it's emitting energy, it doesn't much matter under real scrutiny how you try to divide that energy. It's radiant energy, and as such it can be intercepted and detected.

Unless, that is, it can't be told apart from what the sweeper is expecting already, or alternatively is so utterly different from what se's expecting that se doesn't even have the equipment to efficiently detect it.

From there, I can come up with two examples already. First, powerline hum. If all you put out is something very close to the extremely strong hum always present in any modern building, you're not going to get caught. And even if people start to pay attention to this sort of thing, you can always lower your transmission rate, driving the effort of detection upwards by as much.
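
As a toy illustration of the hum idea, here's a sketch that encodes a very slow bit stream as a tiny phase nudge on a 50 Hz tone. The bit rate and modulation depth are arbitrary assumptions; the point is only that the emission stays indistinguishable from ordinary hum.

    # A minimal sketch of hiding a very low-rate signal inside powerline hum,
    # assuming 50 Hz mains and a tiny phase offset per bit. Rates and depths
    # are illustrative.

    import numpy as np

    FS      = 2000            # sample rate, Hz
    HUM_HZ  = 50.0
    BIT_S   = 2.0             # one bit every two seconds -> 0.5 bit/s
    PHASE_D = 0.02            # +/- 0.02 rad phase nudge, well inside normal jitter

    def encode(bits):
        t = np.arange(int(FS * BIT_S * len(bits))) / FS
        phase = np.repeat([PHASE_D if b else -PHASE_D for b in bits], int(FS * BIT_S))
        return np.sin(2 * np.pi * HUM_HZ * t + phase)

    signal = encode([1, 0, 1, 1])
    print(signal.shape)       # 16000 samples that look like plain hum on an analyser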

And second, since that leads to a diminishing channel, you can play the age-old game of finding new modes of conveying the information out. One such means that I haven't seen detected yet is coherent polarization modulation. Say, you place your transmission right on top of some prominent microwave emission line of a compound that is widely present, and put out a low-level, constant-envelope additive signal. Only your signal encodes your stuff via varying polarization, not amplitude or frequency. Just try and find the equipment which can deal with this sort of stuff at low levels...
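
To make the constant-envelope part explicit, here's a toy Jones-vector sketch: each bit nudges the polarization angle by a couple of degrees while the radiated power stays exactly constant. The angles and symbol choices are arbitrary assumptions.

    # A minimal sketch of constant-envelope polarization modulation using Jones
    # vectors: each symbol nudges the polarization angle, never the amplitude.

    import numpy as np

    def jones(angle_rad):
        """Linear polarization at the given angle; unit amplitude by construction."""
        return np.array([np.cos(angle_rad), np.sin(angle_rad)])

    BASE = np.deg2rad(45)          # nominal polarization of the cover emission
    STEP = np.deg2rad(2)           # +/- 2 degrees encodes one bit

    for bit in [1, 0, 0, 1]:
        v = jones(BASE + (STEP if bit else -STEP))
        print(bit, v, "power:", np.round(v @ v, 6))   # power stays exactly 1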

2010-01-05

High-rise fire fighting

I was just reading about a famous firefighting operation in Los Angeles. As always, a couple of ideas came to mind. Rather than waste them, I thought I'd write them down -- pretty much why I blog at all. Much of the problem of firefighting boils down to water, so that's what's guiding me here as well.

First, high-rise buildings present the precise same gravity-bound predicament to emergency personnel that they do to the architect, or the lazy person. Going up is unduly hard. One important aspect of this is standpipe usage. It's hard labour to get the water up to the fire, and even more importantly, you lose pressure on every floor going up. You have to have all sorts of engineering safeguards so that the upper and the lower floors both receive even adequate service in time of need. And in an emergency that manages to break the standpipe, all hell breaks loose. You have about zero pressure for the floors above the break, and considerably less than normal even for the floors directly beneath it, because there the only pressure left comes from the column of water between them and the break point (plus friction, viscosity and upward momentum, which should mostly be neglected, because in the "normal" situation any such thing would hinder service to the upper floors, which are no less valuable).
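
For scale, the static head loss alone is rho · g · h, before any friction losses; a quick calculation shows just how much of a disadvantage the upper floors start with. The heights are illustrative.

    # Static pressure lost to elevation: dP = rho * g * h.

    RHO = 1000.0      # kg/m^3, water
    G   = 9.81        # m/s^2

    for height_m in (10, 50, 100, 200):
        dp_kpa = RHO * G * height_m / 1000
        print(f"{height_m:3d} m of rise costs {dp_kpa:7.0f} kPa (~{dp_kpa/100:.1f} bar)")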

So my first thought... One way to combat this would be to include low-pressure shutoffs on every floor, which would cut off the flow just below the break. Difficult to do in a manner that doesn't rely unduly on fallible technology or impede pressure/flow under "normal" conditions, I know, but worth thinking about, especially in very high-rise buildings where the probability and the span of potentially inaccessible floors compound. The solution would probably take the form of mechanical devices designed to activate slowly and gradually in the presence of significant flow combined with low pressure, and which furthermore draw their power from the flow velocity alone. That sort of thing is eminently constructible, and at least in time it could be refined to a level where it also "works itself out", so that jamming becomes unlikely even in extended use. As for illicit use of such devices, they would obviously sit in more or less plain sight, any control of them would be rapidly overridden by a fire squad, and so any tampering, and anybody hanging about the switch, would be detected pretty much the first or second time around during an operation.
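
Here's a sketch of the intended trigger condition only (not of the mechanical realization): the device closes when it sees sustained high flow together with collapsed pressure, the signature of a break above it. The thresholds and timing are invented.

    # Minimal sketch of the shutoff logic: close on sustained high flow plus low
    # pressure. All thresholds are assumptions.

    FLOW_HIGH_LPS    = 30.0    # litres/second considered "break-like"
    PRESSURE_LOW_KPA = 150.0   # residual pressure considered "too low"
    HOLD_S           = 10.0    # condition must persist this long before closing

    def valve_should_close(samples):
        """samples: list of (flow_lps, pressure_kpa, dt_s) readings."""
        suspicious = 0.0
        for flow, pressure, dt in samples:
            if flow > FLOW_HIGH_LPS and pressure < PRESSURE_LOW_KPA:
                suspicious += dt
            else:
                suspicious = 0.0          # must be sustained, not momentary
        return suspicious >= HOLD_S

    # Normal firefighting draw: high flow but healthy pressure -> stays open.
    print(valve_should_close([(35, 400, 5)] * 4))       # False
    # Break above: high flow and collapsed pressure -> closes.
    print(valve_should_close([(40, 80, 5)] * 4))        # True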

Second, I wonder whether the water and the inherent power its rapid, pressured flow carries with it could be used to extra advantage, by varying how not just the water, but the power carried by it, is actually used. I mean, power is Power; why waste the huge power that is in the flowing water so that it dissipates in splashes and heat? Why not do something useful with it?

From an alternative viewpoint, this boils down to the basic thermochemical fundamentals of sustained combustion, or the lack of it. What it takes for a fire to sustain itself is something combustible -- usually a mixture of an oxidizer (oxygen, anybody?) and an otherwise energy-rich material that is held back from spontaneous decomposition by a molecular energy threshold (quite often an organic material, but purified metals and the like serve more than well) -- plus enough temperature that the average molecular kinetic energy transcends the chemical reaction threshold, giving rise to an exothermic reaction at a wide enough scale, and then positive feedback (usually limited by the amount of the most common oxidizer, air, available).
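
The "kinetic energy versus reaction threshold" bit is essentially the Arrhenius rate law, k = A · exp(-Ea / (R·T)); a toy calculation with placeholder values shows how brutally the rate falls off when you cool the fuel. The activation energy and prefactor below are generic, not tied to any specific material.

    # Arrhenius law with assumed placeholder parameters.

    import math

    A_PREF = 1.0e9        # 1/s, assumed prefactor
    EA     = 1.5e5        # J/mol, assumed activation energy
    R_GAS  = 8.314        # J/(mol*K)

    for temp_k in (300, 600, 900, 1200):
        k = A_PREF * math.exp(-EA / (R_GAS * temp_k))
        print(f"T = {temp_k:4d} K -> relative rate {k:.3e}")
    # Cooling the fuel drops the rate by many orders of magnitude, which is what
    # "take the temperature leg away" amounts to.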

Ergo, a fire. Which tends to spread. And it tends to consume all of those organic and/or otherwise energetically unstable, purified materials we hold as economically valuable. We don't like uncontrolled fire much, then, so we fight it.

From those basics we see that fighting a spreading fire means taking one of the necessary elements away. Suffocate the fucker for lack of an oxidizer. Atomize and isolate it to limit the positive feedback/"contagion". Cool it down to recontain the original combustible, so that it is energetically favourable for it to stay in shape and not burn. And in the last case especially, make sure that it doesn't remain hot enough to restart the reaction spontaneously, no matter what temporary extinguishing measures you've taken before that. (Unless, of course, those measures aren't that temporary, which is why we also have foam, powder, zero-ventilation upon Halon release, and so on. But that is another story.)

As a general rule, you must not only stop the reaction, but keep the reacting system in a state that somehow stops it from reacting again before it has reached its "normal", "reaction-blockaded" state. I believe every firefighter knows this, both theoretically and practically, and is exceedingly good at it. So what I'm talking about is not what firefighters do, can do or should do. It's about the technology at their disposal.

Here I'd like to take the example of the (very badly named and kitsch-sounding) IFEX 3000. The idea is to carry compressed air for power, and little water. You use the air to atomize a small amount of water and to propel it very rapidly right into the "heart" of the fire (i.e. the surface of the base combustible). The stated idea of the invention is to maximize the usage of water by driving it into the fire by force, maximizing the proportion of it that is eventually vaporized completely (100%, drawing away the maximum amount of heat water can carry per volume, which is a lot, and which is why we primarily fight fire with water in the first place), and also doing this rapidly (the rationale for this isn't explained too well, but it's sound: it's about dividing the fire and bringing large surface areas of it back into chemical stability at once; the spatial backspread is then so slow that multiple blasts can bring even large fires back under control; every firefighter does this when swinging the hose around over a centralized, expanding cold zone, and is encouraged to do so in training instead of just "spraying around"). This innovation does work, and is quite an idea as such... but only up to a point.
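
For reference, here's the arithmetic behind "vaporized water draws away the maximum amount of heat": sensible heat up to the boiling point plus latent heat of vaporization, per litre. Standard physical constants; a 20 °C starting temperature is assumed.

    # Heat absorbed per litre of water that is fully vaporized.

    C_WATER   = 4.19e3    # J/(kg*K), specific heat of liquid water
    L_VAPOR   = 2.26e6    # J/kg, latent heat of vaporization
    MASS_KG   = 1.0       # one litre of water
    DT_KELVIN = 80.0      # 20 C -> 100 C

    sensible = C_WATER * MASS_KG * DT_KELVIN
    latent   = L_VAPOR * MASS_KG
    print(f"heating to boiling: {sensible/1e3:.0f} kJ")   # ~335 kJ
    print(f"vaporizing:         {latent/1e3:.0f} kJ")     # ~2260 kJ
    print(f"total per litre:    {(sensible + latent)/1e3:.0f} kJ")
    # Most of the cooling comes from the vaporization, hence the push for fine mist.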

There are at least two problems. In the case of the fire I looked at, one principal source of nastiness was burning aluminium, along with an updraft over the outer surface of a high-rise which enabled rapid vertical spread.

Aluminium, that stuff reacts with water when hot. So if you feed in small amounts of high-surface-area water mist, the fire will not go out; even if you were able to crowd out oxygen altogether, in effect you would still be supplying enough oxidizer to the fire to keep it going. In that case, if water is all you have (and with aluminium you'd have to use something like pure nitrogen to actually suffocate it, which is not feasible in practical firefighting work, especially at that scale), your only hope is to cool it down enough that it stops reacting/burning.
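
To put a number on the aluminium-plus-water problem: hot aluminium strips the oxygen from water, 2 Al + 3 H2O -> Al2O3 + 3 H2, strongly exothermically, and it hands you hydrogen gas on top. Approximate textbook formation enthalpies give the scale.

    # Reaction enthalpy of hot aluminium with steam, from approximate
    # standard-state formation enthalpies (textbook values).

    DHF_AL2O3 = -1676.0   # kJ/mol
    DHF_H2O_G = -242.0    # kJ/mol, steam

    dh_reaction = DHF_AL2O3 - 3 * DHF_H2O_G      # per 2 mol of Al
    print(f"reaction enthalpy: {dh_reaction:.0f} kJ per 2 mol Al "
          f"({dh_reaction/2:.0f} kJ per mol Al), plus hydrogen gas as a bonus hazard")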

And that's also going to be a challenge to the fine-mist, little-water camp, because although aluminium has a limited heat capacity by any metallurgical standard, it also conducts heat deceptively well, so you can cool it down at the surface and the bulk will still have plenty of internal heat flowing back out to reignite that surface, if it has burned long enough beforehand. So that sort of thing clearly calls for conventional, quite heavy water cooling at the surface, well in excess of what you'd use with other sorts of fire, and also dogged persistence -- which is squarely what IFEX cannot give you with the limited amount of water it carries.

(By the way, I suspect that is why they always demonstrate the system on volatile gas/liquid fires. There it's highly efficient, because the total heat capacity of the coolant needed is limited. You only have to extinguish the gases in the flame to bring the fire under control. But were you to put a couple of coal bricks, or a red-hot cube of iron, above the liquid surface, or heaven forbid an active flare into the mix, it just wouldn't work.)

So, is there a middle ground? A practical one? I think there is. The main point of IFEX is a) rapid delivery and b) high surface area of c) water, which is really, really good at c1) carrying away heat and c2) once vaporized, usually extinguishing the source of oxidizer (atmospheric oxygen) as well.

You don't need compressed air to atomize your water. You only need energy. And every firefighter knows there's plenty of that right in the pressure hose; with enough pressure, and men to hold the hose down, plaster walls and the like are gone in no time. Water it has too, quite evidently. The only thing missing is the capability of efficiently forming mist instead of spray or a run. So why not build long, turbulent (even cavitating), extremely high inlet pressure, asymmetrical, vortex-inducing, sideflow-turbine-like nozzles for firefighting use? That sort of thing (a rough numbers sketch follows the list below) would:
  • be powered only by the water pressure differential to ambient, making extra supplies unnecessary,
  • possibly reduce flow under current standpipes, but could instead be optimized for heat dissipation, and would obviously remain under the firefighter's control, for expert effect,
  • yield many of the benefits of the IFEX design, without tripping any patents,
  • have no more moving parts than today's standard nozzle, though perhaps some extra length and weight (perhaps they would also only be used when burning metals are involved, or something, so that my idea is an addition to, not a replacement of, the well-tested arsenal of today), so that they could be easy to use and also robust,
  • unlike IFEX-type systems, ensure continual replenishing, so that even high-heat-capacity targets could be fought continually until they truly give up,
  • unlike conventional systems, minimize drainage, which is a hazard not only to firefighters in heavy action (water clutters up the scene, and any water spilled is also a step away from water security in tight spots, exposing firefighters to unnecessary personal risk), but also to the property and the lives that are being protected, and finally,
  • be a logical step beyond the flashover/hot-ceiling spray techniques that are already commonplace, providing a physically well-founded reason to prefer spray over splash even in ordinary scenarios where high heat capacity of the combustible is not an issue.
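
Here's the promised rough arithmetic: the droplet surface area per litre scales as 6/d, and the surface energy needed to create all that new area is negligible next to the pressure energy already sitting in the hose. The droplet diameters below are just examples.

    # Surface area and surface energy of one litre of water broken into droplets
    # of diameter d. Droplet sizes are illustrative.

    import math

    SIGMA  = 0.072        # N/m, surface tension of water
    VOLUME = 1e-3         # m^3, one litre

    for d_mm in (2.0, 0.5, 0.1):
        d = d_mm / 1000
        n_drops = VOLUME / (math.pi * d**3 / 6)
        area    = n_drops * math.pi * d**2       # equals 6 * VOLUME / d
        energy  = SIGMA * area
        print(f"d = {d_mm:4.1f} mm: {area:6.1f} m^2 of surface, {energy:5.2f} J to create it")
    # For comparison, one litre delivered across a 10 bar differential carries
    # about 1000 J, so the nozzle has pressure energy to spare for atomization.
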
Thirdly, and perhaps most fundamentally, I wonder whether it's really wise to regulate high-rise buildings' fire safety on even the same *criteria* as that of lower ones. I mean, we've evolved through the stage where fires were typical, and the typical fire spread over narrow streets, wooden houses, and so on. But a skyscraper, that's something new, and something that we've already tried to engineer as fire-proof by our earlier standards. As such, its failure modes in a fire are bound to be completely different from your typical fire.

Sure, this intuition comes from a lesson already learnt; namely the fire I'm now looking at. But it is also informed by the systemic perspective on things, much magnified by the failure of e.g. the WTC towers -- which were also very, *very* well engineered, and in fact eventually withstood much more than they were even designed to. So what I'm asking here is, did we forget some essential element in our social science, engineering science, economic science, chemical/combustion science, the actuarial one, whatever, about the distinct character of high-rises? That is, that they're *high*; as in "gravitationally challenged" in so many ways?

I know that the intuitive aspect of this has been thoroughly grasped within the rescue community. The eventual, surprisingly fast extinguishment of it by the LAFD bears witness to this. But still, do we really, *really* have any hard science underlying our responses to high rise stuff? Something that explains what we do there, and at the same time suggests verifiably better ways to improve our response?

I don't really think so. And that distresses me, as someone who'd very much like to acquaint themselves with at least one of these marvellous products of the human mind and spirit.

2010-01-01

Equilibrium/gun-kata

Ever since I first bumped into the movie Equilibrium, I have deeply liked it. Aesthetically, narratively, and for its dystopian social commentary. The critics didn't; they wrote it off as just one more dark, run-of-the-mill dystopia, saved only by the sleek outfits, the overblown commentary, and, of course, a hefty dose of nouveau martial arts with guns.

Once more I don't know what this movie was supposed to tell us. Interpretation of just about any artistic work is bound to be a fickle, unsure and highly subjective process. Given my own eccentric leanings in aesthetics, I'd even say that the whole business is completely subjective -- what the critics say is good rarely coincides with what I deem to be so. But still...

Art still has its internal invariances: the internal symmetry that pervades the work, in one form or another. When that symmetry involves societal issues, it still challenges us. In art, you can bring on any symmetry you like, and that symmetry is going to be readable across the board when done right. No matter whether you like it or not.

In this case, I think the relevant symmetry is the tradeoff between emotion and order. The tradeoff between a neatly functioning, deterministic, safe society, and one where individuality makes things complex, chaotic and, in the rhetoric of the film, "too human and emotional". That tradeoff and dynamic is pressed beyond all recognition, perhaps making this flick a bit too overbearing for an analyst who goes at it from the usual, neutral starting point.

But if you *really* look at it, the so-called dystopia depicted could in fact be a kind of utopia as well. Indeed, there is no war there. No love triangles, no aggression, nothing personally or impersonally threatening. Sure, there is the control mechanism that keeps things that way, and it gives rise to a certain dramatic dystopian imagery, like maladroits continually being removed by force from the community and incinerated. But the real philosophical touchstone is still the basic, personal existence of the majority, and the feelings of the protagonist as a (pivotal) part of it. If you think about it, that sort of existence is alien, but conceivable, and not necessarily all that bad.

This ambivalence is what makes Equilibrium so compelling to me. It also doesn't hurt the work that there is an amount of high-caliber action and martial arts going on: that could equally well be interpreted as an artistic means to get the real point across. Thoughtful people could, or should, read it as such as well. Then they would laud the work as something that encapsulates deeper philosophical contrapositions in a form which actually captivates the common audience. I aim at a win-win interpretation, and I think I have one here, despite the critical consensus.

Finally, and simply personally, the visual style and composition in this film simply rocks. It would have been enough if they just invented gun-kata for it. But the cinematography goes *so* far beyond that. The sets are amazingly genre-consistent and particular, the consistent high contrast draws my eye, there is a definite visual reference to the monochromatic era, perspective is being used rather liberally to give a sense of space to the scenes, and the handling of photographic time takes on a quality I've *never* seen before or since (i.e. it hints at the kind of bullet-time Matrix used, but it completely stays away from the special effects used there, instead opting for lighting, perspective and well-placed visual cuts).

That all is the hallmark of an inspired photographer at work. The audio track is genre-specific as well, which makes it rather muted, choppy, and mostly non-dialogue. Just like the visual scenery: a sequence of impressionistic sketches. Very much true to the cyberpunk form of telling futuristic stories.

As I already said, I really, *really* like the result. Perhaps you now understand why, and also why I think this particular movie is in my estimation under-appreciated.