Welcome to Orcmid's Lair, the playground for family connections, pastimes, and scholarly vocation -- the collected professional and recreational work of Dennis E. Hamilton
The nfoCentrale blogs, including Orcmid’s Lair, were published through Blogger via FTP transfer to my web sites. That service is ending.
Then there will be silence as Blogger is unhooked, although the pages will remain.
No new posts or comments will work until I update the web site to use its own blog engine. Once that migration is completed, posting will resume here, with details about what to know about the transition and any breakage that remains to be repaired.
Meanwhile, if you are curious to watch how this works out, check on Spanner Wingnut’s Muddleware Lab. It may be in various stages of disrepair, but that blog will come under new custodianship first.
Labels: web site construction
Prophets in Their Own Lands
Back in February, I posted “Document Security Theater: When the Key is More Valuable than the Lock.” I was objecting to a technique, now being immortalized in open-document formats such as ODF and OOXML, whereby a hashed copy of a password is stored in the document such that it can easily be retrieved and used to attack the password itself. As explained there, the value of the password is not in being used to overcome the protection of the document against alteration – that is easy to do without ever bothering to know the password. The value of the password is that it is a memorable secret of the password holder and it needs to be protected (i.e., disguised) because it is also used for a variety of valuable purposes.
The failure to achieve a separation of concerns is probably a tip-off here. Either way, the exposure of hashed copies of passwords is not a new issue. There are expert reports available that identify the flaw. Attacks on passwords whose hashed copies are known have been popular since the first widespread Internet worm was released against unprotected systems. For example, the Unix /etc/passwd file, with its hashed copies of passwords, was commonly readable by all users, and certainly everywhere once a root password was compromised. That users had the same passwords on different systems made leap-frog attacks from system to system particularly promising. It is like watching an elaborate arrangement of dominoes fall.
Encouraging Gullible Conduct
My argument then was that it is folly to increase the complexity of hash coding and believe that the password is thereby protected against discovery by a determined attacker. The defect in reasoning is in the assumption that the remedy to attackable hashed password copies is to use a “stronger” hashing technique. It does not make a memorable password stronger, and there is effectively a (disguised) copy of the password in plain sight. Having the copy and knowing the hashing technique allows that still-weak password to be attacked about as easily as it ever could be.
Systems that use password hashing as a way of not keeping passwords around in plaintext also arrange to secure the hashed copies against discovery. Once the hashed copies are known, discovery of the password becomes child’s play, especially for memorable passwords that the password holder reuses as a matter of convenience.
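The attack can be sketched in a few lines of Python. Everything here is hypothetical (the password, the tiny dictionary, the use of SHA-256): the point is that once the hashed copy sits in plain sight, recovering a memorable password is a dictionary lookup, and swapping in a “stronger” hash only changes one function name.

```python
import hashlib

# Hypothetical: a document stores an unsalted hash of its protection
# password where anyone can read it.
stored_hash = hashlib.sha256(b"sunshine").hexdigest()

# A (tiny) attacker's dictionary of memorable passwords.
dictionary = ["letmein", "password", "sunshine", "qwerty"]

# Recovery is a simple lookup: hash each candidate and compare.
# A "stronger" hash only changes which function is called here.
recovered = next(
    (word for word in dictionary
     if hashlib.sha256(word.encode()).hexdigest() == stored_hash),
    None,
)
print(recovered)  # → sunshine
```

Real attackers use dictionaries of millions of entries and precomputed tables, but the mechanics are exactly this.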
We’ve all learned by now that convenience trumps security, right? My objection is against willfully pandering to that conduct. You can imagine my dismay when my efforts to end that perpetration in the ODF specification were rebuffed by this argument:
“The justification for stronger algorithms than SHA1 is that many users use the same passwords for multiple tasks. So, it is worth to protect the key. Since we explicitly added the [SHA256 and stronger hashing methods] attributes to ODF 1.2 on request, we should not revert this.”
That is precisely the reason we should “revert” that so far draft-only provision of ODF 1.2.
Reality Will Not Be Fooled
Last week, there was an announcement that some servers at Apache.org had been attacked and compromised. I saw notices such as ZDNet’s “Apache.org hit by targeted XSS attack, passwords compromised” and PCWorld’s (via Yahoo) “Apache Project Server Hacked, Passwords Compromised.” I didn’t read the articles, since it was about an all-too-common sort of break-in. What I didn’t appreciate was that the attackers stole lists of user names and their hash-coded passwords.
What finally caught my undivided attention was the 2010-04-13 James Clark tweet, “Ouch. Hashed copy of password compromised for all users of Apache hosted JIRA, Bugzilla.”
The notice at the Apache Foundation cannot be clearer: “If you are a user of the Apache hosted JIRA, Bugzilla, or Confluence, a hashed copy of your password has been compromised.” And, of course, if we are putting hashed copies of passwords in plain sight, it doesn’t need a hacked JIRA, Bugzilla, or Confluence configuration to get it. Even scarier is this observation: “JIRA and Confluence both use a SHA-512 hash, but without a random salt. We believe the risk to simple passwords based on dictionary words is quite high, and most users should rotate their passwords.”
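For contrast, here is a sketch of how systems that must store hashed password copies defend them: a random per-password salt plus an iterated hash. PBKDF2 is one standard construction for this; the parameters below are illustrative, not any particular system’s. The random salt is exactly what the Apache notice says JIRA and Confluence lacked.

```python
import hashlib
import os

def hash_password(password: bytes) -> tuple[bytes, bytes]:
    """Return (salt, digest): a fresh random salt plus a slow, salted hash."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha512", password, salt, 100_000)
    return salt, digest

def verify(password: bytes, salt: bytes, digest: bytes) -> bool:
    # Recompute with the stored salt and iteration count, then compare.
    return hashlib.pbkdf2_hmac("sha512", password, salt, 100_000) == digest

salt, digest = hash_password(b"sunshine")
print(verify(b"sunshine", salt, digest))  # → True
print(verify(b"letmein", salt, digest))   # → False
```

The salt defeats precomputed dictionary tables, and the iteration count makes each individual guess expensive. Neither makes a weak password strong (which is the point of this post), but both raise the attacker’s cost far above a bare SHA-512 of the password.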
What more do we need to know?
It is time to stop putting lipstick on what we know to be a pig.
I believe that this situation, for documents, arose through an over-constrained problem. We’ve been blinded into thinking that the safety of keys used for conveniently removing document protections is improved by strengthening the hashing for copies of those keys. All this does is encourage folks to be careless in the choice of passwords for this mundane purpose. We must find a way off that slippery spiral.
The intriguing problem is how to preserve the convenience of protection removal for document authors without subjecting their convenient, memorable password to discovery by attacking the plain-sight hashed copy. Is there a way out of the current awful practice? And if so, what do we do to overcome perpetuation of the flawed approach that is already in place?
[update 2010-04-17T19:09Z I broke up the first paragraph because it did not flow well. This allowed me to embellish the situation with more unpleasant historical facts. It is appalling to see how many years it’s been known that disclosure of hashed copies of passwords is a practically-attackable vulnerability.]
In today’s Techflash Research post, Todd Bishop wonders why it is that Seattle is only #14 as a “Mac Metropolis.” (The catchy term is used in the report summary that Bishop links, and it is hard to resist repeating even if that is not what the report is about.)
First off, the Apple Market Ranker analysis that Experian Simmons summarizes is about owners of Apple products, not Macs. The basic question is, if you scratch a resident of one of the 206 Designated Market Areas (DMAs – don’t you just love being sliced and diced by market analysts?) in the United States, how likely is it that they will own or use an Apple Product: an iPod, iPhone, or Macintosh computer.
OK Ed, Let’s Wow ‘Em with the Numbers
The most impressive number that I see is that fully 21.6% of all adults nationwide own or use one of these products. I don’t know about you, but even if the iPod dominates the “ground truth” behind this statistical estimate, I am impressed. I’m sure that Apple stockholders smile and rub their hands in glee over what the iPad launch may do to these figures. For Apple executive management, on the other hand, I would consider this a cause for concern with regard to the prospect of market saturation. The iPad would be urgently welcome as well as a potential market broadener.
In the San Francisco – Oakland – San Jose DMA, the gravity well of Silicon Valley (the red giant) and Apple (the blue dwarf), the figure is 32.3%. This is transformed into the wonderful statement that the adult residents of this DMA are 49% more likely than the average adult American to own or use at least one of these products. Well, sure 32.3/21.6 = 1.495 so we see where that more-dramatic figure pops out. If this were an election we’d say that Silicon Valley leads the nation by 10.7 points, but I guess that is not so sexy. I’m not sure what any of this numerical magic tells us, but let’s play along with the idea that it provides something useful for people who worry about life in the DMAs and how we might discretely dispose of our incomes.
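The report’s arithmetic is easy to reproduce, using the figures as quoted above:

```python
# Figures from the Experian Simmons summary quoted above.
national = 21.6   # % of U.S. adults who own or use an Apple product
bay_area = 32.3   # % in the San Francisco - Oakland - San Jose DMA

lift = (bay_area / national - 1) * 100   # "more likely than the average adult"
points = bay_area - national             # plain percentage-point lead
print(int(lift), round(points, 1))       # → 49 10.7
```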
The analysis continues through the top 10 DMAs (4 being in California) by this measure of Apple friendliness, with Boston (the Semiconductor East) at a close second and with Las Vegas at 27.9% as number 10. Starting with number 3, San Diego, we are told the populations of these DMAs (and the full report lists them all). For example, #4 New York weighs in with 30.4% of nearly 16 million adults and an observation about the observable presence of the iPhone. In contrast, the 8th and 9th ranked DMAs have adult populations of less than a million each.
Those Modestly Successful Puget Sound Folks
If you go through the registration to download the full report and find out about the Seattle-Tacoma DMA, you’ll see that we are #14 with a population of 3.6 million adults. A mere 27.4% are estimated to be Apple users (26.9% above the national average, if that fills you with regional pride). The largest DMA of those closest to the national average (and why not proud of it?) is #57, St. Louis Missouri, with 2.4 million adults and 21.5% estimated Apple lovers. For perspective, I note that 152 of the 206 DMAs come in below the national average for Apple Love.
Looking at the map that is provided in the full report, the Seattle-Tacoma DMA is at the heart of an Apple Love territory that spans the Vancouver BC – Portland Pacific-to-Cascades corridor, along with the I-90 wedge to Spokane. We’re among friends.
Keeping Steve Ballmer Awake Nights
It’s not clear to me what this tells us about existing markets and market opportunities. It would be useful to know what proportion of those same populations own or use any device of the kinds that Apple sells, and for how many of those none (and all) of them are made by Apple. Obviously, economic conditions, educational achievement, and infrastructure in a DMA also matter. There might even be a market differentiation between liberal (the “rest of us”?) and (economically-)conservative communities.
When the unpenetrated market consists of the owners of your competitors’ products, life becomes more difficult to the extent that people do not readily churn their discretionary possessions and favored brands. Still, the king of the mountain always has to sleep with an uneasy crown. If you are a pretender to the throne, I suppose having to fear disruptive forces other than your own is a condition to look forward to.
And a Little Reality Seasoning
The ZDNet report on personal-computer sales just reached my inbox: “Gartner: Apple sells 1.4 million Macs in US; captures 8% market share.”
To explain how these numbers are so widely different from that wonderful 21.6% of adults, nationwide, it is important to understand that market share is not about what folks own or use, but what was sold. The market’s 100% is all of the sales in a particular timeframe. Because sales of personal computers in 1Q2010 are 20% better in units sold than in 1Q2009, the market is spoken of as having increased by that much. Notice that the statement is not about the revenue or the profit from those sales, which might sort out quite differently.
From this perspective, Apple sold 34% more Macintosh computers, moving from 7.2% of the units sold to 8.0% of the units sold in the most-recent quarter. HP and Dell still dominate with over 50% between them but their unit sales did not grow as much as the market, which grew about 20%. The sleeplessness at HP and Dell is of a different quality than what has Apple bounding out of bed every morning. (Whether they made up for it in cash rather than volume, we won’t know from the Gartner analysis.)
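Those figures hang together, as a quick check shows: Apple’s unit growth is implied by the market growth and the share change, and the small discrepancy from the reported 34% comes from the shares being rounded.

```python
market_growth = 0.20   # total PC units sold, 1Q2010 vs. 1Q2009
share_before = 7.2     # Apple's share of units in 1Q2009 (%)
share_after = 8.0      # Apple's share of units in 1Q2010 (%)

# Apple's unit sales scale with both the whole market and its share of it.
apple_growth = (1 + market_growth) * share_after / share_before - 1
print(round(apple_growth * 100))  # → 33
```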
To estimate the fuzz in all of this, the ZDNet article also reports an IDC finding that Apple grew its sales but lost market share against the total market (which grew more). There is not enough information to know which are oranges and which are, uh, apples, among these comparisons. It could be that Apple lost market share worldwide, since Macintosh penetration is apparently not so hot outside the United States while HP and Dell maintain their positions globally. [Update 2010-04-19T01:20Z It’s worse than that. According to the IDC Analysis, Apple doesn’t even show in the top five world-wide, and their growth in the US was below the 18.4% of the total market. An 8.3% growth in Apple computer shipments left them down from 7.0% to 6.4% of the market. IDC describes its report as counting shipments and determines market share from that.]
Perhaps the oddest reporting of these latest figures for personal-computer sales is the underplayed fact that Toshiba sales grew faster than Apple’s, taking away 4th place. Acer did even better, strengthening its 3rd-place position as well. These two can be credited with capturing most of the market growth between them. Although this phenomenon is noted in the ZDNet article, Apple gets the headline and the lede. Interesting, aye?
[Update 2010-04-16T22:46Z Something led me back for a second look, adding a paragraph about the far-superior Toshiba and Acer performance as of 1Q2010. The Tablet derby through to the end of 2010 is going to be fascinating.]
… You visit a site, create a comment, and
… You attempt to register at a site, and
… They prefill a form with your user name or e-mail address
… They will take an OpenId
… They insist on inviting your automatic Disqus logon if the cookie is spotted
… You can’t find your password and you seek their help
You may notice that I have stopped using Technorati tags, since they seem to have no effect whatsoever and I haven’t figured out how to have them make a difference with any alternative source of tags. I should figure out del.icio.us, I suppose, except in that case I should first figure out why my del.icio.us feed has stopped.
I also use categories, well no … I use Blogger Labels which are sort of like categories except it is hard to find out what they are and place a current list and links on my sidebar. Blogger backlinks and Blogger labels remind me of the propensity of some Microsoft developer types to do-it-their-way when there is already an established practice out there. Yes, developers just want to have fun. But inflicting their NIH syndrome on the rest of us is not OK. Go do that in the privacy of your own home, please.
For the labels, I think I will periodically post a message that simply goes into every category I have used (Windows Live Writer knows what they are), so I can remind myself not to make up more and maybe even prune the list where I tend to always use multiple labels in combination.
Aren’t you happy that I have spiffed up this blog to the point that it serves as an invitation to my regular blogging on whatever strikes my fancy in the moment? Just wait, there are five more blogs and I have a great deal of pent-up blogging from my 18 months nose-down in document-standards work.
In Orcmid's Lair: February Frights Redux: Unification for Creative Destruction, I commented that I am in a death match between the decline of my web-development machine and May 1, 2010, when Blogger ceases publishing via FTP to my own domain and hosting service.
The laptop is now on life support and, so far, has not entered a vegetative state. But it can't sit up and stand on its own any longer.
Meanwhile, I have been working to unify my Blogger templates around one single "classic" layout. That has been interesting.
Testing Without Ruining the Blog
For dressing-up the sidebar and tidying up some aspects of the blog posts, I was able to confirm template changes using the template-preview provisions of Blogger.
Straightening Out the Archive Structure
It became trickier when I decided that each blog's archives should be in a separate folder. Orcmid's Lair wasn't done that way. Its archive pages were at the same level in the blog folder as the main page and some supporting items. Fortunately, I discovered that the archive-list pull-down would automatically change to reflect the new location once I said archives should go into a separate sub-folder. Then all I had to do was move the existing archive pages to the sub-folder to make the pull-down be true.
I now must remember to republish those few pages that have an older version of the pull-down. In this case, the blog is a little bit broken, but easily fixed.
Being Conditional About Comments
The next tricky business was creating more cases that were conditional on which page was being generated.
In the past, I had full comments show up everywhere there is a copy of the related post. I decided to simplify and have comment detail only on the individual post pages. The version of an article on the main blog page, and in archives, provides a count and a link, but no comment content.
This became tricky to test because the template preview mechanism only shows what happens to the main page. To see what happens to posts, I must install the template and create a post or repost to see the effect on the individual post itself.
I can cause a repost, usually, by adding a comment to the post. If necessary, I can also change the conditionality to see what everything looks like on the main page, but that is not a complete verification.
And Then There’s Backlinking
Even trickier was seeing how support for links would work. Blogger has a feature called backlinking that will report about other blogs that link to this one.
I don’t think it is exactly a track-back mechanism. I'm also not sure how it works for blogs that are published via FTP.
To test whether backlinking is operating, despite Blogger indicating that my blog is backlink-enabled, I need to create a blog post that links to another of mine, and see what happens. That is the provocation for this particular post.
Also, I am using the BlogThis! pop-up that is provided if the "Create a Link" link is followed from one of my blog posts. This seems to be one way for Blogger to notice that a link to a Blogger-generated post is being made. Once this post is up, I can also recall it into Windows Live Writer and see whether I could have done it from there too.
Well, BlogThis! does an awful formatting job. I recalled the post into Windows Live Writer to touch it up as well as see if there is any special indication that this post is linking to another. I don’t see anything.
I’ll repost now and then see if I have to publish from Blogger itself to have backlinking be noticed.
Oh, and By the Way
While I have been rooting around in tweaking the individual blog items and how comments and backlinks appear, I noticed another problem. The permalinks on comments don’t work. I have attempted to use an alternative way for creating the backlink, but there is something not happening. I will have to look at the source-code of the generated HTML pages to figure this one out.
Technorati Tags: open standards, Microsoft, OOXML Format, ISO/IEC JTC1 SC34 WG4, IS 29500, ECMA 376, OOXML strict, OOXML transitional
I was startled to see the level of passion in Alex Brown’s 2010-03-31 post, Microsoft Fails the Standards Test. Alex has two concerns: (1) dwindling OOXML standards-maintenance attention and resources; and (2) Microsoft silence with regard to implemented support for the strict level of IS 29500 and any retirement of the transitional level as the only level supported in Microsoft implementations of OOXML.
Perhaps the most level-headed analysis is the “Wow” from Andy Updegrove in his 2010-04-01 post, Alex Brown: “Without action, the entire OOXML Project is now surely headed for failure.”
For me, the most peculiar aspect of the reactions I see is not that Alex has the concerns he announced, but that others treat his expression of concern for the future as a declaration of the actual present. Furthermore, these observers who proudly pontificate that there is no action, there will be no action, and there was never going to be any action, excitedly congratulate Alex on having awakened from being hoodwinked. It is as if nothing has happened since 1998 and the book is closed on Microsoft forever.
Expectations Against Observable Reality
I want to look at just one part of this situation: expectations around IS 29500 implementation in Microsoft products. The desire to have Microsoft abandon to-be-deprecated transitional provisions of IS 29500 in favor of producing only documents in the strict IS 29500 format is tied into that expectation.
I am eminently qualified to address this topic. I have no information on what Microsoft is actually doing to incorporate support for IS 29500 in its products. I have no idea what strategy Microsoft has, if any, with regard to the retirement of support for IS 29500 compliant transitional documents. Microsoft doesn’t tell me anything about product efforts and I am happy to keep it that way. Microsoft doesn’t seek my advice on the matter either. So I am perfectly positioned to speculate, with my standing as a standards, interoperability, and architectural armchair astronaut unblemished.
What I am going to report is my observation of the simple state of affairs and how difficult it is to erase the past and jump to implementations that only produce what are called strict IS 29500 documents. Expecting that to have been achieved in two years is about as unobservant as belief that all Microsoft needed to have done was adopt ODF as its native format in the first place.
Can I Has