Thursday, July 9, 2009

Why Steam isn't for me (and shouldn't be for you either)

There are two things missing from digital downloads:

1. The physicality of it. The smell, the tactile sense of realness, of being something and not nothing.

2. The trust that we can still use our purchase in a year, two years, or even in 5 or 10!

The first one cannot be mitigated, I don't think. If you don't have something physical for your expense, there is simply no way to replicate that. It's like reading a book from a PDF, a web site, or a Kindle, versus having a book in your lap that has that book smell, that weighs actual pounds or ounces, that you touch. (We have more senses than just eyes and ears, and we have evolved a great deal of our brain and nervous system to connect to those senses, so when they're missing, we're missing a significant part of our connection to the thing.)

The second one is all about retailers and distributors realizing that they're shooting themselves in the foot over imaginary losses. The fact of the matter is that most folks fall into one of a few camps: those who have made a conscious choice to support the people whose services and products they use; those who lack the wisdom, the experience, or the moral consciousness to make that choice, and steal whatever they wish; and those who simply cannot afford it for legitimate reasons - i.e. real poverty, circumstances beyond their control that keep them from having the basic means to afford such things. But the people in those last two camps would not be paying for the product regardless of DRM, so they are no more an argument in favor of DRM than those who are simply thieves at heart.

Making the things we purchase hard to use, and punishing those who do the right thing by giving them a less reliable product with a lousier experience, is stupid in the extreme. It will take time for our species to really grok that, and then to publish in a way that adds value for the paying customer instead of removing it.

Steam is amongst the worst offenders (although there are worse, D2D being an obvious example). Steam has many fans because it is easy to use, looks good, has an excellent breadth of content, and is centralized.

However, Steam is a form of DRM: it requires a phone-home to authorize your use of the things you've 'purchased' through it, every time you wish to use them.

This, to me, is a deal-breaker. So long as I am able, I will always be willing to buy my software, games, music, etc. But I will never buy DRM-laden, "pay-for-rent" software, games, and music. I want to buy my own copy outright, just like I always have from a brick-and-mortar store - a copy that is mine within fair-use law, until the day it is so old and incompatible that I cannot find anything that plays that type of media anymore (and that had better be a good long time).

DRM is as morally objectionable to me as piracy is to those who've labored and endeavored to create something wonderful, only to have someone steal it and abuse their sweat and tears. It's hard to make something as wonderful as a good game, a good piece of utility software, or an engaging album, and anyone with even a modicum of consciousness should be able to recognize that and be willing to pony up a reasonable amount of cash for the thing they're so enjoying or getting use out of.

It takes a really low and mean individual to believe that they are offering me a reasonable exchange of goods when they "sell" me something that is entirely controlled by them, and includes a self-destruct code and built-in obsolescence (i.e. when the authorizing server goes defunct, so does my DRM laden product).

It's not an honest sale of something (even a virtual something) when DRM is involved. It's a dishonest cloaking of "for rent" under the guise of "for sale"... and we all sense it at some level, and we feel uncomfortable about "buying" that. This practice of selling DRM-laden products is wrong. Just as wrong as stealing.

Thursday, June 25, 2009

Never use DLLs for code

Unfortunately, this advice won't apply to some developers. Folks who are writing ActiveX controls are inherently forced to do exactly this: create a DLL which contains code.

However, for those of us who are not trying to create reusable binaries for an environment that requires ActiveX - which I suspect is still quite a significant audience - DLLs should simply never be used for anything other than localized resources.

The reason is simple: DLLs create DLL hell.

There are multiple layers to the issue of DLL hell - various unfortunate outcomes and scenarios, each of which should be enough on its own to dissuade you from using DLLs in the first place. Taken together, it blows my mind that anyone still thinks it's a good idea to build them and use them, ever (unless they're absolutely required by the technology in question).

The only reason they exist is to create a common library that can be drawn upon by many clients at runtime, thus reducing the memory footprint of all of the running applications that rely upon that common code: only one copy need be loaded by the OS, and then mapped into the address space of each client that was linked against it.

But the thing is, that was a reasonable need in 1990, when Windows 3.0 was released. Memory was indeed very limited, and even the delay of loading a separate copy of the DLL from disk for each process was a significant performance hit for the OS. DOS compilers and linkers had been using similar techniques for a few years to squeeze extra usage out of the very limited address space afforded to the 16-bit programs of the day - Borland's Turbo Pascal, for example, was able to map various units into a single shared memory section, swapping out which unit was loaded into that section dynamically, under the control of the parent software. This was a clever way to make larger programs than previously possible, and Microsoft's creation of a built-in model to support that in Windows 3.0 made perfect sense.

However, it's 2009. Nearly two decades later. Memory is dirt cheap. Machines regularly come with 4GB or more. Every single process on the machine can easily afford its own copy of every library it needs to function, and still leave vast amounts of RAM for data, disk cache, and every other function the machine needs to operate well.

And over the years it has been proven time and time again that software which relies on external DLLs for code is far more fragile than software which does not. Microsoft has even had to retool the OS to explicitly account for the myriad issues surrounding DLLs (side-by-side assemblies), and for the reality that software can't easily share common DLLs in practice because of subtle (and sometimes not-so-subtle) incompatibilities between versions of a common DLL. One DLL uses one STL library and expects a map<int, int> to have a certain size and structure, while another DLL was built with slightly different libraries or compiler or linker options, resulting in a map<int, int> that is not binary compatible with the first. Suddenly, moving data between the main program and one or the other of these DLLs can lead to slicing, invalid memory access, or data corruption at the lowest possible levels, which inevitably produces bad data or (if you're lucky) crashed applications.

The motivation to use DLLs makes no sense anymore. It has no place in a modern computer. It's a nice theory, a fine vision, but one which in practice is a pile of paper cuts that adds up to a lacerated face... the lacerated face of your customer, bleeding all over and wondering why in the nine hells they ever bought your software!

Simply create static libraries, and link against those. A static library still gives you a simple way to create and maintain common code among multiple projects. It gives you the advantage of writing common code once and debugging it to the benefit of all client programs. And when you fix or update some aspect of your library, you're updating or fixing every client program that relies upon it. But you're not creating a scenario where testing your software on one system is totally arbitrary and unrelated to how it might perform on another system due to different versions of a common DLL. How can you feel confident that your software is debugged if you don't even know whether it's the same software running on two different machines, because whole chunks of your code may differ between the two depending on what other software was loaded on one versus the other? And how can you rest easy knowing that, depending on what other software your customer installs after yours, your software may stop working at any time - suddenly using a new version of a common DLL that you haven't tested against, which may well exhibit new and difficult-to-diagnose bugs?

Again, the motivation for DLLs is long, long antiquated. It's a hangover from an age long gone. Only the most anal-retentive among us would think it has any place in today's world of software design and distribution.

DLLs are still great for dynamically loading resources - dialogs, fonts, etc. - at runtime. But they have absolutely no business loading code at runtime. Computers are already complicated enough, and already rely upon too many variables - the patch level of the OS itself, the state of the registry, and so on - without needlessly adding additional moving parts in the form of core code that varies over time, or with the machine your app is installed on.

Do yourself and your customers a favor: insulate yourself from DLL Hell. Simply never use DLLs for code that you write, and always ship compatible versions of third-party DLLs in your application's executable folder, so that your software will choose those over any other installation's copy of the DLL, and will continue to run as correctly a year from now as it did when it was first installed.

Thursday, June 11, 2009

Question: "Do you approve or disapprove of the job President Obama is doing in office?"

Anyone would have been so much better than Bush that it's hard to speak out against Obama. In many ways, he's got my approval.

But fundamentally he's more "more of the same" than he is of "change."

The stimulus is going to be an anchor on our economy for at least 20 years, and will become a battle cry of Bush-like people, the neocons and other even more wrong-headed folks, to allow them to tear down the valuable social services and protections for our natural resources, and to further prop up the old-money in this nation.

His continuation of our imperialistic policies in Iraq and Afghanistan only continues to drive the young into Al-Qaeda and similar fundamentalist, reactionary, radical organizations, which further erodes our economy and our political clout on the world stage.

He's continuing Bush Jr.'s legacy of covering up our extremely questionable and anti-democratic actions at home and abroad, suppressing the release of evidence of governmental wrongdoing, and protecting complicit companies from prosecution.

He's got a tough job, there's no doubt. But principles of freedom, democracy, and accountability should not be negotiable!

Studies have repeatedly shown that throwing money at problems doesn't work. Ungodly rates of failure come from throwing cash willy-nilly at things, especially governmental monies, which have only the most tenuous connection to performance and market viability.

Companies founded with government grant money almost always fail, from what I understand. The role of government money should be in pure research and in social services, NOT in keeping for-profit companies financially stable. The two do not mix in anything you might call 'successful'.

The banks and fat cats involved in the mortgage debacle and the ensuing stock market crash needed to fail. The market *needed* to correct itself. Those folks should have lost their shirts, pants, fancy boats, planes, and 3rd and 4th homes. The government should be helping out those who got screwed through no fault of their own: folks with a solid history of being productive people, living at the edge of being able to afford their homes. Loan guarantees, even collecting the bad debts. But the process of throwing money at the problem has, more often than not, saved the very folks who most deserved to lose everything, sparing them from learning any lessons from their wanton greed and gluttony. They've Learned Nothing.

And because they've learned nothing, and because we, the schmuck taxpayers (all of us Americans, and most of the globe), have paid the bill and will continue to pay this debt down for the next 50 years, we've doomed not only ourselves to a horrendous burden for the sake of the wealthiest half-percent of the world, but we've further ensured that the lesson they take away is: "Be too big to fail, and everyone will bail you out, ensuring your reckless gluttony."

American politics is rife with unbridled greed and self-interest at the expense of the greater whole. Despite current appearances, there have been times in our history, and in the history of the world, when politicians were at least embarrassed by their obviously unethical and selfish actions. Times when the masses had recently risen up against those in power and humbled them, reminding them that their position of power is predicated upon the people's willingness to go along with the social contract.

That hasn't happened in entirely too long in this nation, or in most of the so-called 1st-world countries. Politicians have become openly self-serving, openly flaunting their wealth and power and daring anyone to do anything about it. And the sad thing is, we generally don't. We just shrug our shoulders and figure it's business as usual, and the world is corrupt and always will be.

But change... real change... will come to America, and to Europe. China is rising fast. India is becoming more and more a player. And the damage we are doing to the planet, to our very ability to survive on this rock in space, is rapidly moving towards mass-self-extinction.

The question to my mind is whether we will do as we've always done - shirk our collective responsibilities and ignore the problem for as long as possible (which is often too long), and end up a lot of rotting corpses on an overripe planet - or whether we will take the lessons of community and the greater good to heart and start acting like we give a damn.

Obama is a good man... in much the same way Bush was a good man... they both honestly believe that they are helping the world, and both have strong ideas about the best way to do that. But Bush Jr. was laughably naive, probably the dumbest individual to hold the office of President of the United States in our entire history; and in the name of loyalty he handed most of the reins over to Dick Cheney, a cynical, power-mongering neocon who worked unabashedly for his own personal and ideological self-interests, and fuck everyone else, in this nation or abroad.

I prefer Obama's form of goodness. I think it's closer to honesty, and further from the rampant self-delusion of Bush Jr.

But in the end I strongly fear that the road to hell is being paved with a 1 Trillion dollar money-printing binge which will falsely inflate the economy and allow us all to continue to ignore the looming threats to our very existence.

Global warming - or more accurately, the exponentially accelerating consumption and exploitation of the world's finite resources, polluting the atmosphere to the point that we cannot sustain our very existence - is leading to massive, global, cataclysmic extinctions of the human race and of most higher life-forms on this planet, effectively resetting the clock to the dinosaurs' demise. We are the comet, the super-volcano, the gamma-ray burst that ends life as we know it on this planet. And it is our very predisposition to selfishness and greed that is rapidly becoming our undoing.

Economic ruin - which is really just another way of expressing the above: rampant self-interest, consumption of everything the planet has to offer - is real, and is here. We are trying to live in a way that is predicated on the assumption that there are always more basic materials to acquire, in order to make more for less. This is untrue. It's been untrue forever, but we're rapidly approaching a sort of cliff or wall: where there simply is no more oil, where there is not enough arable soil to produce the food, where we cannot distribute the food we can grow because we no longer have the gasoline to run our engines, where we simply cannot feed ourselves. And yet we persist in talking about the economy as though it's disconnected from the things we do, from the health of this world. We've long been guilty of pretending that our economy, and its exploitative health, doesn't directly harm those in so-called 3rd-world countries (economic slaves, by any other name). But it doesn't just harm many of our fellow human beings... it's harming the very land we stand upon, the air we breathe, the water we drink. It's killing us.

I am very glad to be alive. It's an amazing thing to be conscious, to be aware, and to think about how to solve these problems. I am deeply grateful to live this life.

I only wish more of us were willing to put our own narrow, selfish, small-minded, pathetic self-interests aside and think beyond ourselves - to include the entire planet in our sense of "self".

There is so much publicity these days about "think global, act local". But really, just think global. Act global. Without many, many voices... and voices with teeth... we're just going to do more of the same.

Democrats and Republicans are far more alike than different. They use a lot of rhetoric to spin the tale that they're opposites, but the truth is they're fraternal twins. Both work for their own self-interests first; then for whatever keeps them in power; then for whatever looks helpful to the world but is in fact a thin veneer that serves to keep them in power; and so on, ad nauseam.

American hegemony is the real name of the game to the Democrats and the Republicans. Both remain entrenched
in a view of the world that hasn't been relevant since before WWI. We're not a nation competing with other nations. That's just another illusion that galvanizes the plebes to get behind those in power and assure them of their ability to continue to live in luxury, unquestioned and obedient.

We are one people. One world. One collection of interrelated species of plants, animals, everything that lives and contributes to our existence on this blue-green marble. And without some balance, some dance of interdependent sustainable coexistence we're on a one-way ride to mass extermination, ourselves at the front of the roller-coaster to hell.

When I see on-the-ground, true actions that are in alignment with the higher principle of "we're all in it together" - not just lip-service to these ideals, nor even treating them as ideals at all, but as necessary survival mechanisms - then Obama, or whomever is willing to go the distance, will get my full approval.

We're smart enough to recognize the elephant in the room... but so far we seem too lethargic and small-minded to fix it.

I hope that our demise is slower than I think it will be, but fast enough that it will genuinely scare every man, woman, and child upon this planet into thinking of something beyond their next meal, next pleasure, next "me me me" moment, so that we can truly move on from being a world dominated by the emotional equivalent of two year-olds, into something more humane.

Will you work with me? Where do you see things going?

Monday, June 1, 2009

Whatever happened to simple?

Have you noticed a trend in the world of Programming, Computer Software, etc., towards making things more complicated than they really are?

I can understand this trend in terms of my experience as a programmer. When I was young, I thought I could pretty much make anything work, and that the time it would take was always fairly short. "I can make that!" accompanied by "That's not too hard - maybe a few hours of programming!" And time and again, my estimates were way off; things took much longer than I had anticipated, and often bloomed in ways I had not foreseen, into complexities and code-necessities that required time and thought to handle.

Add to that the typical response of non-programmers to every feature - "Build that! And have it done yesterday!" - and one quickly concludes that things are harder than they seem, more complex than they ever appear at first blush. One develops a self-preservation mechanism: default to more complex, more difficult assumptions, so that you have the space, and set expectations that might actually match the end result, keeping yourself out of the cross-hairs of "you're late with feature X" criticism from management.

But on the other hand, I am more and more seeing examples of folks who are either over-compensating for these factors, or who are simply coming at everything from a hyper-conservative position, making it very hard to see the simple in things.

When I began my career, text-based GUIs were just becoming available - programs such as Borland's fantastic Turbo series, the PC Tools suite, and Copy II PC. Much of the standard fare of today's GUIs found its genesis (at least in my experience) in these software suites. The edit box, the drop-down menu, the combo box, the radio button, check box, push button, list box, etc., all began their life, to my eyes, here. And they were generally very well done. They were consistent and relatively intuitive. These software suites were by and large simple to use. And it was fairly common to see some new innovation, some slight twist on a theme, that made using them even easier and more powerful, such as begin-typing-to-jump-to-first-match in a list.

But it seems to me that much of this "try it and find out" mentality has been replaced with a "form a committee to study this issue and run focus groups to determine an interface's value" approach. This sort of hyper-conservative approach stifles innovation and, to my eyes at least, leads to a dumbed-down, least-common-denominator approach to user interfaces.

I don't know about you, but my experience of "design by committee" is that of a sub-par result - a sort of group-stupidity applied to what should be fairly trivial and obvious decisions. It often invokes the "what color should we paint the shed" sort of argument, where there shouldn't be this level of discussion and focus on something so trivial.

Essentially, the preconceived notion that no decision is trivial - that there's no such thing as a simple feature - is a dangerous slippery slope in and of itself. And here is sort of the seminal argument for why a simple feature isn't: how many Microsoft employees does it take to change a lightbulb?

And although I respect that, in fact, most things are more complex than they appear at first blush... it hurts my brain to see such overwhelming odds set up against any sort of trivial and useful innovation like type-ahead search - or, perhaps more compellingly, against creating a security model at Microsoft that isn't... well... broken.

Some things are simple. And they need to be left to a single developer's common sense to implement. It's great to at least review such things on a local level - maybe within the group to which that programmer is assigned. But having every single minute bit of trivia examined as though type-ahead and security UI deserve the same depth of focus and importance is... crazy. That way lies stagnation and poor designs.

Two examples come to my mind when I read such blogs: first, Microsoft Vista's "shut down" button, and second, Vista and Windows 7's "security UI". Someplace I read a Microsoft blogger's scathing post on the process behind the Start menu's shut-down button - how it went to multiple committees, was changed over and over during the various stages of Vista's design, and ultimately was left with an unfocused, incoherent design. Clearly a victim of Parkinson's Law of Triviality.

But it pains me to think that the many intelligent and sometimes brilliant people at Microsoft have such a poor grasp of what is meant by "ordinary users". Ordinary users do not feel threatened by whether something is grouped or ungrouped on the task bar! They don't even, for the most part, have any idea how to use the task bar. Raymond's example is laughable - it sounds absolutely nothing like any ordinary user I've ever heard. And yet these are the self-same folks who believe that they understand what ordinary people perceive and how they think about computers.

But when you actually sit down and use a computer, be it a Vista machine or a Windows 7 machine, and are constantly bombarded with the "This action requires Administrator permissions" dialog - where you have to type the password for the admin account over and over to accomplish even day-to-day tasks, with no option to perma-authorize anything - you come to the inescapable conclusion that the folks at Microsoft behind this UX (User Experience), as they like to call it, must all be on some serious drugs or, more likely, utterly self-delusional. It's so far from a functional, usable interface as to make me want to shame them in front of the entire class. I'd give them an "F". Hell, I'd post it on the school cafeteria wall for all to see. I'd hold a mandatory school rally to highlight the vast failure that is their security, and I'd want to interview the many folks involved in creating this albatross, to try to come to grips with how such an elephant comes to reside in plain view without anyone commenting on how incredibly broken it is - so that we can learn from their mistakes and never, ever, ever commit a stupidity this colossal again.

Or do you find that you side with the "there is no such thing as a simple feature" folks? Do you find Microsoft's UI changes in Vista and Windows 7 to be genuinely better and more usable than their overall UI in XP? Would you prefer to see every decision go through committee? Or do you see a better way?

Friday, May 29, 2009

Game Review: Empire: Total War

Like most users (not critics), I'd give this about a 7. It's very well made, with wonderful craftsmanship and attention to detail, and the developers' love for the genre shines through.

But it has one major flaw: it's BORING. Unlike the earlier masterpieces in this series - Shogun, Rome, and Medieval - Empire is just not that interesting to play. I blame the nature of warfare during this time period. The units are simply all too similar to one another. They all have guns; they all stand there like total idiots and shoot and get shot. And the difference between a rookie unit and a seasoned infantry division is just too meager in terms of gameplay. So they all pretty much just stand there and shoot at each other... very little by way of tactics that I can make out.

Cannons are terribly unsatisfying. They're horribly inaccurate, and they don't do much to intimidate or demoralize the opponent - which, as I understand this historical period, was their primary effect. That makes them seem quite pointless in the game.

Overall, the focus shifts away from the excellent tactical battles and into what has always been TW's weak suit: the strategic game. And although there have been considerable advancements in the strategic game, it still comes off as more of the same old, same old. It doesn't really feel like you're... well, I'm not sure what you're supposed to be in game terms, but whatever that is, it's vague and ill-defined, and doesn't feel very powerful or interesting. Just mind-numbing and repetitive.

I would be remiss if I didn't at least mention the new naval real-time combat. It's certainly interesting, and absolutely gorgeous! Just watching your men man the topsails is a sight to behold - the light reflecting off the best ocean water rendered and simulated in any game to date, in my personal experience of playing 3D games on a computer. Truly scrumptious.

And of the three main aspects of Empire: TW, the naval combat is probably the most interesting. It's challenging, and so gains some fun aspects. Ultimately, though, naval battles are a relatively small part of the game overall, and they don't rescue you from the other two aspects - strategic play and land battles - which are just too tedious for my tastes.

And this from a guy who LOVES TW: Shogun, TW: Rome, and TW: Medieval 2.

What was your experience of the game, and had you played any of the earlier games in the series?

ADDENDUM: One thing I should have mentioned is how disappointed I was to find that, having schlepped over to a brick-and-mortar store (GameStop) to buy a real, independent, fully available and owned copy of the game, I returned home only to discover that the thing I'd spent money on was really just a bunch of wasted, empty plastic with a software authorization code for Steam! Not only did I not get my own copy of the game, but I had to spend 4 hours waiting for the stupid thing to download from Steam!!!

Sometime, I'll dedicate an entire blog post to how much Steam sucks. Yes, it works for many folks, but no, it doesn't work for me, nor for a lot of other folks. To each his own - but in this context, as is becoming all too common, I don't get to have my own copy, or any say in the matter. They even went so far as to trick me into buying one thing that was really another, which I didn't want! That's W-R-O-N-G. And between this latest edition being a lackluster effort by The Creative Assembly, and the (to me, anyway) onerousness of exclusively shackling themselves to Steam, it will likely be a long, long time before I throw any money their way again.

Broken Metaphors, Broken Promises

So I decided I'd give Microsoft's Windows 7 a fresh try, doing a full install from scratch and choosing only their desired defaults, in an attempt to use it "the way Microsoft intended." I figured that perhaps I was getting a bit stuck in my ways, that perhaps some of my ways of doing things are outdated or outmoded, and that I should try doing things as though I were learning to use Microsoft's Windows for the first time, letting go of the past. Use a fresh pair of eyes.

I got the system up and running quite easily. Microsoft makes installation a painless affair these days, which is much appreciated. It's a straightforward and simple interface for choosing which drive to install on, and deleting any existing partitions is quite simple, as is recognizing which drive is which when more than one is plugged in. These are huge time savers, and give me the sense that what I am doing is easy and simple, which keeps the anxiety down to an absolute minimum. Perhaps this is one of those truths that is under-appreciated: a simple interface that gives you the freedom to be in control directly reduces the anxiety that you may make a devastating mistake (like obliterating the wrong partition or installing on the wrong drive). Overall, the install is a lovely thing.

Once the system was up and running, I had a very easy time with device drivers. As in, it came equipped out of the box with everything my systems needed. (That's right, systems, plural: I installed to an older Core 2 Duo E6600 on an Asus mobo, as well as to my newer i7 920 on an MSI X58 Pro mobo. In both cases, all necessary devices were found and installed without any additional web searches or appeals to a nonexistent floppy... the drivers were simply there, out of the box, like I would always want them to be.)

So in terms of installation, Windows 7 makes an excellent showing for itself. There are some shortcomings, mainly missing features I have wanted since Windows NT: namely, the ability to tell the installer to put different parts of the OS on different partitions or different drives. For example, I have often wanted to set where the users' folders should go. On the OS drive? Nope, I would much prefer them on a larger drive, so that I can spend a small fortune on the boot drive but keep pretty much everything else elsewhere. Same thing for Program Files. I would love to be able to place these on another volume, so again, the OS would be on a super fast but smaller boot drive (say an SSD, though in the past it would likely have been a 10K RPM drive), while everything else would be placed on other drives, and not necessarily all on the same one. For example, how about C:\ is the OS, D:\ is Program Files, and E:\ is the users' folders. The pagefile is another thing that should be configurable at installation, though this is more minor these days given that it can be reconfigured after installation.
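To be fair, Windows 7 does bury one piece of this in its unattended-setup answer file: as I understand it, the ProfilesDirectory setting of the Microsoft-Windows-Shell-Setup component can relocate the Users folder at install time (there is, as far as I know, no supported equivalent for relocating Program Files). A fragment, with an illustrative path:

```xml
<!-- Fragment of an unattend.xml answer file (oobeSystem pass). -->
<!-- E:\Users is an illustrative path; adjust to taste. -->
<component name="Microsoft-Windows-Shell-Setup"
           processorArchitecture="amd64"
           publicKeyToken="31bf3856ad364e35"
           language="neutral" versionScope="nonSxS">
  <FolderLocations>
    <!-- Relocate all user profiles off the boot drive -->
    <ProfilesDirectory>E:\Users</ProfilesDirectory>
  </FolderLocations>
</component>
```

But an answer file is hardly the simple install-time option an ordinary user could ever be expected to find, let alone author.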

Since I was doing this installation with the intention to "do it as Microsoft intended", when I was presented with the "type of network" I chose "Home Network" rather than skipping this phase. Once things were up and running, I connected the two machines to the same homegroup... Microsoft's newest spin and interface for sharing files & folders on a private LAN. This process was quite easy, and again I applaud the programmers and designers involved in making this a fairly obvious and simple configuration step.

I began to get excited that perhaps, at long last, Microsoft had built an abstraction that would actually allow ordinary human beings to easily set which files & folders should be shared, and have some confidence that they were actually limited to those folks they intended - to wit, those in their local homegroup. Poking around the interface in Windows quickly showed me that I could share any of my libraries with a homegroup, and that I could easily add additional folders to a library! Voila! I had a good set of abstractions for both my local machine and for accessing those same resources across the network: just place them in a shared library, and I'd be good!

So I added E:\Music, where I've got the 55GB of music I've ripped from my CD collection, to my Music library. I was able to easily set the library to save new music to the E:\Music folder, rather than my Users\...\My Music folder! Good, this is shaping up to be a useful feature! Now all I needed to do was see if I could access this stuff easily from the second PC, the only other machine in my homegroup at the moment...

And this is where Microsoft's abstractions and interfaces fell flat on their faces. I wish this were a new experience, an anomaly, a fly in the ointment. But frankly, I've been a sometimes fan-boy and sometimes critic of Microsoft since Windows 3.0 (I started out as a major fan-boy, dead set against the lame OS/2 interfaces, and the even lamer GEM and other similar "GUI/multitasking" environments of the time).

When I logged in to my other PC (the E6600-based machine), I was able to see my main PC instantly! Good! This itself is perhaps a huge improvement over Vista. Plainly stated: Microsoft's file-sharing code has been abysmal since they killed NetBIOS / NetBEUI in Vista (really since XP, where they deprecated it, which meant that unless you manually installed it and configured things right, XP acted just like Vista: it failed to see most of the machines on the network, and whenever it did happen to see a machine, it would often have long, inexplicable delays when navigating to a given network share or machine).

I then opened the Music share, expecting to find the 55GB collection of music hosted on my primary PC, only to see a library / share that didn't contain the folder & files I intended it to! It just wasn't there! Huh?!

So like any halfway sensible PC user, I began experimenting. I went back to my PC, and made sure that the necessary folder was in fact a part of the library. Yes, yes it was. Okay, so make sure that the library is shared. Hmm... this was not as self-evident. Right-clicking on the library gave me a list of choices: make it private, share with homegroup read-only, or share with homegroup read+write. But there is no indication of whether it is currently shared. I realized that I could open up the control panel and dig down in there until I found the homegroup configuration applet, which does have a check box indicating that the Music library was shared. But from Windows Explorer, I had no direct way to confirm this. The fact that the library was visible to my other PC also implied that it was in fact shared... but still - no way to verify this directly, on the object itself (remember object-oriented interfaces..?).

Okay, so no way to verify it, so I'll just select "share with homegroup read+write" again. Check this on the other PC: nope, still no additional music folder within that library across the network.

Go back to my primary PC. Let's explicitly issue the share-with-homegroup command on the Music folder within the Music library, to try to get Windows 7 to acknowledge that "yes, everything in this shared library should, um, yes, be shared." Go back to my other PC and check... nope. No Music subfolder. (Another note to the UI designers at MS: there isn't any obvious feedback when you share something from Windows 7 that it is... well, shared. I tried issuing the command to share this folder multiple times... because I wasn't sure I had clicked it right the first time, given that NOTHING WHATSOEVER APPEARED TO HAPPEN the first time... or the second.)

And then I noticed something on the remote PC: my main PC now had an additional share listed under it: Music, with an icon like the one older versions of Windows used to indicate a shared folder (a folder connected to a BNC-style, pipe-like network), in addition to the shared Music library. Ugh! So now I appeared to have shared the Music folder in question, but under its own share, not as part of the Music library on my principal PC! Yuck!

After a few more attempts to unshare that share, and to explicitly uncheck and recheck the shared status of the library after adding Music to it, I eventually gave up. Based on this, I assume that Microsoft just didn't allow for someone to share anything through a library beyond the folders that were originally there (which, I assume, are physically located under \Users\ - which I noticed is actually a share that Windows 7 automatically creates, though it doesn't appear as one when I browse the PC from my other PC).

And so we end up with a broken metaphor: A library is a Microsoft concept for agglomerating a set of folders on my PC under a common heading to make it easy to store associated material under a single navigable start point. And a library can be shared. But a library can't share those things I add to it, making it... broken.

I'm a sophisticated, dyed-in-the-wool, hard-as-nails software developer and computer user of more than 25 years now. And I can look at all of this, find the limits of what it can do for me, shake my head and curse Microsoft's ineptitude, and get what good there is out of this situation. I can even begin to plumb where they probably went wrong (in intrinsically sharing Users\ but failing to fully support agglomerated folders transparently), and this kinda-sorta makes some sense to me. It's stupid, but I get why it required additional effort and thinking on the designers' parts to get this element right, why it didn't "just work" as one would expect.

But how do I explain this to anyone? When my wife wants to share something... how does she come to terms with it? When my in-laws get stuck on this, what is my answer to them? I'm sure many of you have experienced that moment when they look at you like you're lying to them, like you're intentionally giving them a hard time, that somehow you, as representative of all things computer, have conspired to make this thing much more difficult and unfriendly than anyone in their right mind would ever think possible! That you must be in on some sadistic secret, intended to hurt "ordinary people." And to be honest, I sometimes think I am. Or at least, I look at Microsoft and I think they surely are! I mean, how much harder could it possibly have been to make this obvious usage scenario work intuitively, correctly, all the time, without myriad bizarre-o rules and short-comings? And why even bother to have such a half-baked feature as this? Besides causing sadness, frustration, and pain in their unfortunate and unwitting victims, what good does such a miscarriage of UI-design yield?

This whole sordid episode reminds me of so many other Microsoft creations and inventions. The "compatibility folders" for Vista and above. Their so-called "security" in Vista and above. The many poorly conceived and implemented abstractions for the file system over the years, which leave users only more and more hopelessly unable to find their stuff, and comprehend what they're doing when they save a file. Or how about their convoluted and often useless system for managing file type associations?

The library metaphor is a solid one. But a half-assed implementation is a death-blow to its usefulness and ultimately to its adopt-ability by the average user.

It seems to me that Microsoft is forever trying to make things better for end-users by going in absolutely, utterly, completely the opposite direction that they should: instead of simplifying things to make them comprehensible, they add additional abstraction and complexity, further obscuring and making utterly incomprehensible what should be a very simple thing indeed. How, I ask Microsoft's designers and engineers, is added complexity ever, in a million epochs, going to increase comprehensibility?

Or do you see things differently?

Sunday, May 17, 2009

Windows Games Explorer

Starting with Vista, Microsoft has given us the rudiments of a way to organize our games on our PCs. A place where all of our games can be easily accessed. Great! Perhaps I'll be able to ensure that my machine is up to snuff for my games, or a way to help me troubleshoot the endless stream of problems that arise when trying to game on a PC?

Well, sort of. Like everything Microsoft, it seems, their "Games Explorer" is a nice idea wrapped in a turd of an implementation.

I can add my own game links to it... though I have to do the legwork to find most games, because the GE fails more often than it succeeds at finding the games on my machine when I install them. And when I do add a game, chances are rather high that the wrong icon will be associated with it (see my rant on Program Files (x86)). And of course Microsoft didn't see fit to give me a way to edit my entries, say to fix the icon or to set startup options for my game (e.g. a command-line option; many games do support them, for power gamers or to avoid technical problems on some machines).

So, I have this place that is supposed to be a central launching point for all of my games, but it doesn't actually find my games, and when I manually add them it generally does the wrong thing, without recourse to correct its shoddy implementation.

Okay, well, maybe those shortcomings will be addressed in the next version of Windows... Windows 7. No. I'm running the Windows 7 RC and I can tell you without a doubt that it still fails to detect most game installations, fails to provide basic interfaces to fix problems, fails to associate the right icon with the game in the first place, and provides no opportunities to customize the shortcut's behavior.

Hmm - well, maybe Microsoft would have thought to provide a central clearing house for ensuring that your video card is up to snuff, that the drivers are up to date, that Direct X is installed and up to date, etc, to help you be sure that your rig is configured properly to run the game your mouse is hovering over.

A reasonable set of features, a modest set of possibilities. So, almost by definition then, Microsoft saves itself from success and fails yet again.

No, there isn't any link or tool to see what version(s) of DirectX your machine is running. There are no diagnostics of any kind. What version of video driver do I have installed? Can't tell that from here. What version of DirectX does this game require? Nope, can't tell that from here either. There is a goofy number associated with each game that supposedly refers to the minimum system requirements needed to play it... the same sort of vague, poorly defined marketing bullshit one finds on the side of the game box, so vacuous as to be useless to anyone but the most clueless consumer - say, a 12-year-old's mom trying to buy a game for him. To everyone else, these things are pointless.

But for the real stuff - the issues that get you messed up regularly when trying to play a game: Video drivers, Sound Card drivers, OS-Compatibility modes, and Direct X versions, there is absolutely no support. Zip. Zilch. Nada. Go fish. Sucks to be you.

Friday, May 15, 2009

Windows UAC: Why it is wrong-headed

The problem with the entire UAC approach to security as implemented in Vista and Windows 7 is that it is little more than a way for Microsoft to punt on its responsibilities and claim with a straight face that Windows is not at fault.

As seen in the Windows 7 blog:
There has been no report of a way for malware to make it onto a PC without consent.
Sigh. Great, so your approach is working perfectly. You've gone from a system where malware installs itself while the user is doing things, to one where you bombard the user with generic, indistinguishable, relentless queries as to whether it's okay to do the thing they just clicked on - and hidden in that avalanche of requests is one that is actually from a piece of malicious software, which gets approved as a matter of course. But it's not Microsoft's fault anymore. Brilliant. But completely shallow, self-serving, point-missing, blame-the-victim mentality at its best (worst?)!

Many, many pundits criticized MS for its onerous UAC when it first came out. As they reasonably should have. And many of them made this exact point: if you bombard the end-user with questions they don't understand, and give equal weight to things that are ordinary and things that might actually be of concern, then they're not going to take you seriously, and they're not going to spend time evaluating every single query before answering.

I mean, gosh, how many of you are familiar with "The boy who cried wolf" at Microsoft? Anyone? Anyone?

So here we are, looking towards the next version of UAC, and someone is actually foolish enough to make the exact claim that everyone denounced as stupid and self-serving for Microsoft to make: "Not our fault, you pressed okay."

Any security design that requires constant authorization for even mundane tasks is going to create the boy-who-cried-wolf problem. It's going to cause users' eyes to glaze over. It's going to cause knee-jerk acceptance of the "OK" button every time that interface is presented. IT IS NOT, at its most basic level, A SOLUTION. It's merely a way to push the blame onto the end-user. It's ... a ... marketing ... tool.

And most end-users aren't even stupid enough to confuse the two. Microsoft is only fooling themselves. When Vista's UAC first came out, everyone immediately glommed onto the fact that this is not a secure system, nor a useful answer or approach or remedy to the underlying, first-principles cause of the problem: malicious software getting installed or otherwise allowed access to our computers.

In their defense, I certainly grant that this is a difficult and complex issue. However, I don't actually think it's beyond the resources at Microsoft to solve in a way that is genuinely useful and usable by its customers and business users.

I don't think most of this is that difficult for most users to understand: run programs at only the level of access that they actually need, and no more.

If there were a SIMPLE way to grant some programs more access than the default (which would be set to a low, harmless level), then users could grok this. They could easily distinguish that their anti-virus software needs full access to their machines, but the latest toolbar from Yahoo does not!
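To be concrete, here is an illustrative sketch of the model I mean - plain Python, not any real Windows API, with invented program names: every program defaults to a harmless access level, and the only prompts the user ever sees are the few explicit grants they consciously make.

```python
# Illustrative sketch of a default-deny access model (not a real Windows API).
# Levels: 0 = sandboxed default, 1 = read user files, 2 = full machine access.
DEFAULT_LEVEL = 0

# The only "authorizations" are the explicit grants the user has made:
granted = {
    "antivirus.exe": 2,    # the user consciously gave full access, once
    "photo_editor.exe": 1,
}

def access_level(program: str) -> int:
    """Every program runs at the low default unless explicitly promoted."""
    return granted.get(program, DEFAULT_LEVEL)

def may(program: str, required_level: int) -> bool:
    """Silently allow or deny; no dialog storm, no crying wolf."""
    return access_level(program) >= required_level

# The anti-virus was granted full access; the toolbar never asked, so it
# stays in the sandbox - no prompt required for either outcome.
assert may("antivirus.exe", 2)
assert not may("yahoo_toolbar.exe", 1)
```

The point of the sketch is that the decision happens once, at grant time, instead of on every single click.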

But since Microsoft's UAC is too fucking stupid to distinguish between the user clicking on an applet in the control panel, and a bit of software trying to install itself from a web site or removable media, then there is no way that end-users are going to be able to distinguish between when a security authorization is actually meaningful or routine.

Microsoft utterly dropped the ball here. They did nothing to improve the actual situation. All they did was blame the end-user and create a stupid self-justifying system to prove themselves un-culpable.

It's a shitty way to handle this situation, and I for one am neither amused nor tricked into believing their bullshit.

Microsoft's UAC is the biggest "fuck you" to its customers that they've probably ever done. And until they take a deep breath, and admit the truth, there is little chance of actual forward progress or rational conversation on the issue to be had.

PDC2008 Design Principles for Windows 7... some comments and reflections

For those of you interested, you can see the actual presentation here: http://channel9.msdn.com/pdc2008/PC22/

Over and over while listening to this thing I keep thinking: wow, do they actually listen to themselves? Do they realize how ironic they're being?

At this particular moment, I'm listening to the presenter describe the idea that computers "should meet your expectations". Okay, we're on the same page here. Yes, computers should meet the expectations, the common assumptions, the intuitively obvious minimum behaviors that we would expect of them. Insert a movie: watch the movie. Insert a game: play the game. But the irony here is something like this. The example he uses is "you put your movie DVD into your laptop. What do you expect to happen?" Well, it depends on how savvy a Windows user we are.

If we're a total novice, and have never tried this before, then we might think that it should do the most obvious thing: play the movie.

But if we're even just a little bit savvy, just a little bit experienced with Windows, then we immediately think: who knows?! Maybe it's going to pop up a cluttered list of possible things to do, with icons of the software associated with those actions (icons which are more about marketing their corporations than about giving one a solid sense of what they actually do or mean). The most obvious choice (play the movie) might or might not actually be present, depending on what software we've installed on this laptop, what horrible software came preloaded by whatever godforsaken laptop company we bought this thing from, and whether Windows itself is even currently functioning properly for playing a movie - not to mention what crappy for-PC software there might be on this DVD from the movie company (including the possibility of malware or a root-kit). But really? Who knows what it will do. This is a Windows PC.

But all of those things - the real list of what a laptop is actually likely to do - read like some insane, horrible nightmare. Yet they're the likely outcome. And again, if this were not a laptop but something far simpler, like a DVD player, then it would actually play the DVD as expected, would not become infected with malware or a root-kit, and wouldn't be too damn dumb to figure out what to do with a movie DVD!

But will Microsoft ever listen to its own rhetoric? Or do you like the way it works now?

Anti-Pattern: Windows Registry

So Microsoft, back in Windows 3.x days, discovered that lots of software, when it installed or ran, needed someplace to store its persistent configuration data, and every bit of software had implemented its own way to do this - usually a unique flat file of some sort, often stored in the root folder, or sometimes in C:\Windows, or sometimes in the application's own installation folder, C:\MyProg.

So, I'm sure they thought: hey, we can provide an OS-level solution that will help developers by giving them a higher-level API, so that they don't have to waste time reinventing the wheel and writing their own "read configuration data" and "save configuration data" functions for every piece of software that comes out for Windows. At the same time, we can clean up the file system a little and help organize our customers' computers by placing all of this data into one hive of configuration data. Heck, we'll need one ourselves, for all of our (Microsoft's) software as well as for the OS itself, so we'll kill like a thousand birds with one stone and everyone will be the better for it.

It's a well-intentioned, good-in-theory idea. One that, at the time, I too thought was a "good thing"(tm).

However, over time it became increasingly obvious what the downsides and limitations of such an approach are. First, there is the issue of fixing data when a configuration setting is bad / broken / wrong. This happens to software on occasion: say it recalls its last-used window position from run to run, but you've had to downgrade your monitor while your good one is out for repair, and as a consequence the window for this application is off the side of the screen and cannot be accessed. So you tell your customers: hey, just delete this setting from your configuration data, and everything will be fine! The application will initialize to defaults again, and all is well. But... the user needs to make this change in the very same registry that holds all of the variables that allow Windows to boot and run correctly! How often have users inadvertently changed the wrong setting, or otherwise left their machines broken or unbootable, due to mis-edits in the registry!

It's just a bad idea to put application data (user data) in the same space that the operating system depends upon for its very ability to function!

And when networking and user profiles were grafted onto the system, further problems cropped up: some of the registry hive now had to be stored as user data and some as system data, and the registry editor and APIs now have to navigate multiple hives behind the scenes to get it all to come out right - and it didn't always.

And I can imagine the fun it must have been at Microsoft when they realized that they needed to add security descriptors for every single hive address in the registry in order to have any ability to control access to operating-system-protected parts of the system from the application user-data portions. And how often over the years have registry security settings interfered with otherwise perfectly functional software? I know, as a software user as well as a developer, that many, many, many times over the years our customers' problems have resolved to: you deleted an account from your Windows PC, and now some portion of your registry is utterly inaccessible to you or anyone on that computer, despite logging in as Administrator!

These are all nightmarish consequences of Microsoft's insistence on the registry as the right way to store and maintain application settings. But to my mind, there is an even more basic and obvious reason that the registry is a bad idea: portability.

If I as a user want to move my settings, that I have invested real time and effort into configuring for myself, from one machine to another - say from office to home, or from an older machine to a new purchase, I often run smack into the walls put up by the registry.

I cannot simply copy my application's installation folder from one machine to another. I cannot navigate to a configuration file and copy it to the new machine. No, I have to somehow extract various settings from different parts of the registry and reassemble them on the target machine.

And how do I know what parts of the registry I'll need? I don't. Nobody does. Even software developers don't know all of the registry settings that their own applications use. Oftentimes their installer, often written by another group or department, will add a bunch of settings on top of the many the application itself needs. It's a common truth that many software packages fail to remove all of their entries from the registry when they're uninstalled. Perhaps a new registry entry was added between the time the installer or uninstaller was written and the various patches applied to the main application by the time it was uninstalled, rendering the uninstaller's log of which registry entries to remove long out of date and incorrect.

If the application's data had instead simply been stored in the application's folder, then uninstallation could be 100% accurate every time by simply deleting the folder - which is intuitive and obvious to any computer user, but which seems to be well beyond the grasp of the designers at Microsoft.

And when one looks at the problem overall, one has to wonder: why was there an insistence on the registry? And why has it persisted all of this time? It seems to me that the dirty little secret is that many software publishers love it precisely because it does break their software when users try to copy it to a new machine! Free copy-protection, and they can blame Microsoft and claim that they aren't responsible at all. It's a Windows issue.

Well, one doesn't actually have to continue to use the Registry, and it really is more menace than aid. It's time we all admit it is a failure, and put our configuration data in a per-user and per-machine folder, in a user-editable flat file, so that we empower our customers to have clear and precise control over our software, and so that they have an easy mechanism for disaster recovery, system migration, and backups - one that doesn't require specialized knowledge and tools for extracting and restoring this information to/from the highly error-prone Windows registry.
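A minimal sketch of what I mean, using nothing fancier than a per-user folder and an INI file ("MyApp" and the setting names are invented for illustration):

```python
# Sketch: per-user, user-editable flat-file settings instead of registry hives.
# "MyApp" and the window-position keys are illustrative.
import configparser
import os
import tempfile

def save_settings(settings: dict, path: str) -> None:
    parser = configparser.ConfigParser()
    parser["window"] = settings
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        parser.write(f)

def load_settings(path: str, defaults: dict) -> dict:
    # A missing (or deleted) file simply means "use defaults": the user
    # fixes a bad off-screen window position by deleting one obvious file.
    parser = configparser.ConfigParser()
    if not parser.read(path):
        return dict(defaults)
    return dict(parser["window"])

# On Windows this would live under %APPDATA%\MyApp; a temp dir stands in here.
path = os.path.join(tempfile.mkdtemp(), "MyApp", "settings.ini")
save_settings({"x": "100", "y": "200"}, path)
print(load_settings(path, {"x": "0", "y": "0"}))  # {'x': '100', 'y': '200'}
```

Migration to a new machine is then a folder copy, a broken setting is one file the user can open in Notepad, and uninstall is deleting a folder - no hives, no security descriptors, no orphaned keys.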

Anti-UI pattern: Copying files & folders in Windows

This is another one that drives me nuts in Windows, and has been bass-ackwards now since... well, since Windows was first conceived.

DOS had options to make this work right. DOS. Venerable, ancient, command-line DOS could do this, but Windows File Manager, and later, Windows Explorer still cannot:

Copy a set of files & folders and gracefully handle problems during the copy without requiring the user to start over from the beginning, or WORSE, leaving things in a messed up half finished state when the operation fails for whatever reason (out of disk space being an obvious and common one).

WTF? I'm listening to a PDC2008 video from MS that gives an overview of their "design process", that discusses the whys behind many of their decisions, and the driving principles of their UI department for Windows 7. And many of these design principles are laudable. Simplicity. Fewer steps to do obvious and necessary things. Less clutter and distraction. Yes, these are wonderful! But do they step back and actually look at their own OS, and evaluate perhaps the most obvious and necessary functionality of Windows (or any OS): file management???

Apparently not, otherwise they'd have long ago discovered that their OS doesn't have the fundamental interface / tool-set to give users the ability to reliably copy or move files from one place to another with confidence that the operation will actually succeed.

Why is that? What could possibly be going through the many minds at Microsoft that leads them to ignore this fundamental requirement of a modern PC?! Windows 7 "Gestures"... while being nifty and full of novelty, don't actually represent a requirement of today's PC. Maybe they're a sales bullet point requirement from MS's marketing dept. But they are NOT a core requirement of a PC.

Copying and Moving files reliably, however, IS.

So why is it that when I use Windows' vaunted Explorer to copy a folder containing many files, and something goes wrong partway through, it aborts the entire operation and fails either to clean up after itself (roll back) or to offer me the ability to remedy the failure and continue the remainder of the operation?! The latter is preferable, but the former would be acceptable. Another alternative would be to allow me to pick up & resume an operation that was partially finished (though this is probably the least intuitive option, and the hardest to imagine a UI for that would make sense to most people).

I have had situations where a copy or move of my files (which can take hours) fails at the last 2%, and the entire thing must be started over. One cannot fix the offending condition (say, delete a few more files on the target volume in order to free up some space for the remaining few files in the incomplete copy). One cannot choose to cleanly roll back the operation, leaving things as they were before one started. This is especially important if I was, for example, MOVING a few thousand files, because it then abandons the two file systems in a mixed state.

What is especially egregious about this interface is that when you go to copy things, Windows takes a small century to go through and think about the operation you're about to perform... yet, when it finally decides to begin the operation, it apparently hasn't used that thinking time to actually do any reality checks - simple things like "does the target volume have enough space for the requested operation to succeed?"

Now, mind you, that's not an especially reliable check anyway. It's a possible up-front check, which isn't the worst idea to go through with, but the reality of the matter is that the target volume may well change its available space while the copy or move is being performed, perhaps due to another copy process, or to being a shared volume in use by other clients, or for any of a large range of other reasons. So whatever checks it can do up front are only an initial reality check, hardly any assurance of success, and shouldn't be relied upon by the copy algorithm as assurance that it doesn't have to actually handle problems on the fly.
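None of this is rocket science. Here is a rough sketch in Python of a copy loop that treats per-file failures as recoverable instead of fatal (the error policy is deliberately simplified; a real Explorer would wrap a UI around it):

```python
# Sketch: a copy that survives per-file failures instead of aborting wholesale.
# On error it records the failure and carries on; the caller can then retry
# just the failures, or roll back everything copied so far.
import os
import shutil

def resilient_copytree(src: str, dst: str):
    copied, failed = [], []
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s, d = os.path.join(root, name), os.path.join(target_dir, name)
            try:
                shutil.copy2(s, d)  # one file at a time; one failure != total failure
                copied.append(d)
            except OSError as err:
                failed.append((s, err))
    return copied, failed

def roll_back(copied):
    # The "leave things as they were" option Explorer never offers.
    for path in reversed(copied):
        try:
            os.remove(path)
        except OSError:
            pass
```

Retrying is then just re-running the loop over the failed entries, and `shutil.disk_usage(dst)` gives the up-front free-space sanity check - which, as noted, is only a sanity check, not a guarantee.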

Waaaay back in DOS one could purchase the excellent PC Tools or Norton Utilities, which would give you a text GUI for copying, moving, and other file-management operations. These competing packages both offered very smart copy, move, and error recovery. And yet today, more than a decade later, Windows has yet to provide such basic and essential file-management support.

As long as we're on it, I might as well at least touch upon the stupidities and inadequacies of Windows Explorer's security-management interfaces. They're clumsy, hard to use, impossible to understand with anything less than a certificate in Windows security administration, and they do the wrong thing in most circumstances.

Say I upgraded my computer and chose to do a clean install of Windows onto a new hard drive, but I want access to all of my old files, so I keep the old hard drive around for both backup and access purposes. After completing my fresh install onto my new drive, I attach the old drive, boot into Windows, and try to access my old files. BONK! Can't read most of the folders on my old volume, due to security descriptors. Okay, so I just need to change the security to give myself access to my own files. It's something like 20-40 clicks to accomplish this, along with a long, long-ass wait. I have to take ownership, add security for myself, and remove any useless security from the files, and these operations are exposed through a thoroughly unfriendly interface that often refuses to do what I ask, despite my being the Administrator on the machine to which this drive is attached. And woe is you if you actually used MS's encryption abilities on any of your files... they're goners unless you're still able to boot into the old OS to access them.

So why is this? Why are these things incredibly head-up-ass retarded in their convoluted, frustrating, and generally unhelpful implementation?

Microsoft is fond of thinking of itself as the company that cares about its customers, that takes pains to learn how folks use its software and to actually improve its users' experience of its products, and yet it consistently misses even the most basic usability issues with its software, be it Word, Excel, Windows, or otherwise.

Microsoft, when can we have a marginally functional file system management tool for normal usage patterns, such as... wait for it... copying and moving files around? I know, I know, this is a TALL order, one way up in the stratosphere of DOS-level complexity. But I would think even you guys could manage this after a full decade to think about it, run usability tests, design by committee, or whatever else you needed to do to come up with a viable solution.

Wednesday, May 13, 2009

Program Files (x86)

Okay, all of you Microsoft apologists out there - explain to me this: Why in tarnation does it make any sense at all to change the base pathname for legacy applications, instead of for new (x64) applications???

If MS had chosen:

C:\Program Files\
C:\Program Files (x64)\

That would have made somewhat more sense. But, why bother to divide 32bit and 64 bit applications at all? What was wrong with: C:\Program Files\ for both?

As software developers, we would have considered it our responsibility to differentiate if we came out with a 64 bit product. It seems like an obvious thing... we write new software, we write a new installer for it, and we have it go to a different folder than our 32 bit software... or maybe we choose to place both the 32 bit and 64 bit versions in the same folder. Also a fairly obvious choice, and one we could have handled in our installer: if installing on x64, choose the 64 bit version of the exe and dlls and so on; if installing under a 32 bit OS, choose the 32 bit versions of those files. I know for a fact that the 8 person company I work for could handle this with aplomb. I am pretty sure that even your average software development house, hell, even most corporate in-house developer teams, could have handled such an obvious and simple issue.
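The installer-side logic really is that trivial. A sketch in Python (the file names are made up for illustration - a real installer would use its own packaging tool, but the decision itself is one line):

```python
import platform

def pick_payload():
    """Choose which build of a hypothetical app the installer lays down."""
    # A 64-bit OS can run either build; a 32-bit OS only the 32-bit one.
    is_64bit_os = platform.machine().lower() in ("amd64", "x86_64", "arm64", "aarch64")
    return "myapp_x64.exe" if is_64bit_os else "myapp_x86.exe"
```

That is the entire "problem" that the extra system folder supposedly solves for us.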

So why the extra folder, and why rename the one that is likely to cause problems with legacy code?

If you haven't figured it out yet, then welcome to the group. I don't think there is a rational explanation. I have thought long and hard on this, and the only thing I can come up with is "group think" - design by committee. It's a stupid, stupid choice, one that causes problems, and it reflects the general pattern of disorganization at Microsoft. It was always horrendously shallow to choose an arbitrary folder like "Program Files" to shove... literally, everything you install on your machine into. Why not Games, Productivity Apps, Tools & Utilities, and so on, actually organizing one's file system by some rational design?

Putting everything one owns into the same drawer is by definition not a system of organization at all. One must actually differentiate between things and group them by some level of conscious choice for there to be intelligence and organization in the system. Microsoft just doesn't seem to grasp the concept, and never has.

It's a stupid shame - and one that will doubtless cause many users grief and frustration as software that used to work stops working, due to the broken paths. Here's an obvious example: in Vista and in Windows 7 we have the new "Games Explorer", where you can access all of the games on your PC from one convenient place. However, in many circumstances Windows shows the wrong icon for your game! You can take what appears to be a well-formed shortcut from your start menu and copy it into your Games Explorer, only to have Games Explorer change the icon and substitute something retarded for it. At first this seems bizarre (in a way that only Windows is - commonly getting even the most basic things wrong). But then you think: "I can fix that - just change the associated icon!" Hmm... well, Microsoft fails again by not providing a "properties" setting on many of the shortcuts - especially, it would seem, on the ones you add by hand!!!! Okay, well, that's bizarre (in a very Windows way - where failure and stupidity seem to be "Job #1"). But maybe there was something wrong with the original shortcut, even though it appears correct in the Start Menu. You edit the start menu, get properties on the shortcut you need, and lo and behold, the "change icon" dialog complains that it cannot find "%Program Files%/blah/blah/some.ico"!!! Seriously?! You've got to be kidding me, Microsoft! Your idiotic decision to rename Program Files to Program Files (x86) now means that all of my 32 bit software's shortcuts will always be wrong?! Wow. Just... wow.

I've also seen software that creates the old legacy paths for various things because the developer didn't realize that there was a dynamic way to get the paths from the OS, combined with MS's penchant for changing the rules every single time they release an OS (oh yeah, it makes perfect sense to change things arbitrarily from one bad system to another without actually making any gains in the process... that's surely someone's definition of "progress"... it just isn't any rational being's definition).
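That "dynamic way" is simply to ask the OS rather than hard-coding C:\Program Files anywhere. A minimal sketch of the idea (the environment variables are the real ones Windows sets; the fallback default is just for illustration on systems where they're absent):

```python
import os

def program_files(for_32bit_app=False):
    # On 64-bit Windows, "ProgramFiles(x86)" names the 32-bit folder and
    # "ProgramFiles" the native one; asking the OS survives renames like
    # the Program Files (x86) fiasco. Never hard-code either path.
    var = "ProgramFiles(x86)" if for_32bit_app else "ProgramFiles"
    return os.environ.get(var) or r"C:\Program Files"
```

Had everyone's shortcuts and installers gone through a lookup like this, MS's rename would have been a non-event.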


Or do you see the sense in Program Files (x86)?

Sunday, May 10, 2009

Wherefore art thou, BIOS flash (which works easily)

PCs have been around a while now. They're in our lives commonly, and the BIOS has been there since the very first IBM PC, giving the computer a way to start up.

In all this time we've seen a lot of improvements in the BIOS. Support for dynamically allocated resources (aka Plug n' Play), support for larger hard drives, and various new devices on the motherboard.

But in all these years we have yet to see a system that allows us to upgrade or replace our BIOS with ease. Yes, there are niche hardware offerings that have this - a socketed BIOS or a second copy of the BIOS on the mobo. And more recently one can load their BIOS from a USB thumb drive! Well, that's almost perfect. The ability to start up the PC, reload the BIOS, and move on.

If only it worked worth a goddamn!

Today, I need to update the BIOS on my wife's PC. She's got a Gigabyte GA-M68SM-S2. Supposedly a "Gamer's Choice" series of mobos. Since this particular board lacks the features to handle a slightly wonky hard drive, I thought I would update the BIOS to see if the problem it's having with the primary hard drive might have been resolved in the BIOS's latest release. (The technical info on their web site doesn't indicate anything about this, but like most people, I'm used to companies documenting their products either poorly or not at all.)

I find that the BIOS for this mobo does include a utility that should allow me to update the BIOS directly from the boot-up screen and a USB thumb drive. Excellent, I think! Finally, the motherboard manufacturers are doing something about this obvious need... and only about 5 years after they should have. Ah well, at least it's available today, when I need it.

So, with thumb drive in hand, loaded with the latest BIOS downloaded from Gigabyte's web site, I boot my wife's PC and enter the Q-Flash utility to do this thing. But... there's a glitch. Q-Flash doesn't recognize the one flash drive I happen to have at home. Maybe it would recognize one of the ones from my office. But I'm home today, and my wife is threatening certain parts of my anatomy if she has to wait another day for her machine!

But what's to be done? Gigabyte doesn't supply a Vista x64 based BIOS update utility that I can use, and since it can't read the 1GB thumb drive, I seem to be thoroughly SOL.

Looking on Gigabyte's web site, I see that they (terribly helpfully) support the floppy drive as well. Yawn... yup, that's an option. Every machine I know of surely does have a floppy drive, yessiree. WTF?? What dumb ass makes a Q-Flash utility that can't read a very, very common and compatible Cruzer 1GB thumb drive? Why the goddamn hell isn't there a reasonable way to upgrade this machine's BIOS? "Boot to DOS," says Gigabyte's online help. What DOS??? Using what device that your BIOS can boot from that some instance of DOS could even boot from???

I swear, computers are 100x more difficult to use than they ought to be, and I cannot for the life of me imagine why the f@#! that is. I am a programmer, and I write software most every day of my life. If our company wrote products that were half as useless and poorly made as what Microsoft and the motherboard manufacturers regularly excrete, we'd be out of business in a heartbeat. What the blasted hell do their programmers do all day? For the past 19 years??

We need PCs to have a bootable maintenance mode. A robust, easy-to-use, core-level OS that allows us to do basic maintenance on our machines, not dependent upon 1980s-era technology or a bootleg copy of DOS that nobody has anymore, and that can't boot from any of the devices available on a modern machine anyway.

What do you think should be done to make maintaining PC hardware less burdensome? What about other hardware than the mobo? How should one upgrade their RAID adapter's BIOS?

Politically Correct

WARNING: The following may be deemed inappropriate for some young readers by some parents. Read only at your own responsibility.

It seems to be all the rage in this country to make sure that everything we say is non-offensive to everyone everywhere at all times.

But why should I curb my freedom of speech just because you may find it offensive? Who made you ruler of the universe? Why are your ears and feelings so precious that they can't handle hearing something you dislike? Gimme a fucking break. Grow the fuck up and toughen up a little, you who are offended and so full of your dumb ass self that you think it's someone else's problem that your feelings are impinged upon.

So you know someone who's clinically retarded. Who actually is what that word is supposed to technically mean. So the fuck what?! So you're offended by curse words, such as fuck, shit, asshole, etc. So?

It is common to hear folks described as an Asshole. I have an Asshole. Should I be offended? Should I think that the usage of the word actually refers to my physical person? Or to anyone in particular's actual anus? Are you really that fucking retarded? If so, this blog is clearly not for you. I don't have any intention of pandering to those who feel that everyone should make sure their feelings never get hurt.

Here's a clue for you: I don't give a flying fuck about your feelings. Furthermore, nobody but YOU is responsible for your feelings. It's not my responsibility.

This whole dumb-ass world is full of folks who seem to think that their feelings are more valuable than anyone's right to say something meaningful and actually try to have a conversation about important things. And I understand why some folks may bow to that - extend their message to more ears by trying to remove anything offensive from the message.

But my own conclusions to-date on this issue are that those who are so caught up in the wording of the message that they get offended by references that have nothing to do with what they're being offended by (or claim to be - such as the idea that using "Fucking retard" somehow actually relates to people with a birth defect), are too fucking dumb to have a meaningful conversation with in the first place, and are not going to understand the content of the actual message regardless of how sanitized you make it.

So - if you're offended over things that aren't actually directed at you or anything real - take your precious ass elsewhere. You're obviously not someone I wish to have a conversation with, and it sounds like the feeling is mutual.

If I or anyone actually uses hateful language directed at a group of people, or an individual, then I can actually understand why they may be offended. And there do need to be limits to such hate-speech. Especially where the anger is directed at a person or group's identity, rather than on specific issues.

To me it seems too common in this nation to confuse this idea. A person yelling death to Jews is using hate-speech. It isn't a rational conversation. It isn't a reasoned argument. Its sole purpose is to generate anger towards a group of people.

Contrast that against someone who is insulting the Jewish tradition of avoiding non-kosher foods. That is NOT hate speech, even if it is offensive to some people. Belief in a bunch of multiple thousand year old scriptures dictating culinary limits is certainly a concrete idea, and one that can (and perhaps should) be ridiculed. It doesn't mean all Jewish people are idiots, but that a specific practice or idea is ludicrous.

Please note that the use of kosher foods and Judaism is a purely arbitrary choice. Any Christian, Muslim, Animistic, or other faith could easily have been chosen as my example, as every one of them proscribes certain behaviors based upon archaic, arbitrary rules ascribed with magical power to lend them credence.

So why is it that we all feel so compelled these days to say nothing that could possibly be interpreted by some group as offensive? Do you feel compelled to hold your tongue?

"My" Documents

What are we? 3-year olds? Really? My folders need to include the word "My" in front of every one of them?

It was a phenomenally retarded choice on MS's part to originally name these folders with a useless, infantile "My" in front of every one of them. They finally corrected their stupidity in Windows Vista, but now they're regressing back to their toddler identity and requiring a "My" "My" "My" in front of everything in Windows 7?

Maybe they think users are so stupid that they can't tell that the folder "Documents" under "[their-name]" is, in fact, their documents? Perhaps MS should visit everyone's house and relabel everything in their homes with a prefix of "My" on it so we don't get confused about who owns what. Every time I read "My" Pictures, "My" Downloads, "My" Videos, etc., I can't help but imagine a 2-yr old throwing a tantrum on the floor kicking and screaming, snot pouring out into the rug, repeating MS's mantra: "Mine!" "Mine!" "Mine!"

Microsoft, please grow up and treat your customers as something a tad more mature than two-year-olds.

Or do you like "My Documents"? What does it do for you?

Friday, May 8, 2009

Windows 7...

So... MS made a great product some years ago. An OS that pleased almost everyone, that was fairly simple to use, had great capabilities, could run business software, interconnect with printers, scanners, and fax machines, connect to the internet, supported vast hard drives, was quick and responsive to the user, and could be easily tweaked to your personal needs. It was the pinnacle of an age of OS technology... no, I'm not talking about Windows 7, or even XP. I'm talking about Windows 98. The ultimate hybrid of 32 bit processing and a DOS-mode kernel, bringing computing on a PC to near perfection. Sure, there were still glitches here and there; nothing is perfect. But in many ways I contend that Windows 98 was superior to Vista and the upcoming Windows 7 from the "what does it do for me" and "ease of use" points of view.

Windows XP was another high point in MS's OSes. It had the enhanced stability of the NT core combined with the vastly superior user interface of Windows 98, bringing to the masses a truly 32 bit OS with no legacy DOS compromises - one that was at once easy to use, and hugely configurable and extensible by software developers, while offering robust support for the latest 32 bit chips of the day. It did away with the limitations of fixed USER, KERNEL, and GDI heap sizes in Windows 98 once and for all. Gone was the need to regularly reboot the machine in order to get back missing resources that had slowly leaked away into a no-man's-land of purgatory, awaiting the next reboot to live again. NTFS was the new de facto file system, bringing with it massive improvements in file storage efficiency over the aging, stretched-nearly-to-breaking FAT32 filesystem, and giving us an underlying transaction-oriented filesystem with much room to grow and adapt to future hardware capabilities. And much like Windows 95, XP was very well received... in time.

It is easy to forget now the many voices of distrust and dislike of XP at the time. But XP was initially criticized on many levels: for breaking compatibility with Win 9x games and software, and for foisting upon users the more error-prone and complex world of security attributes on the registry and filesystem, and user accounts that could easily prevent various software from functioning properly, or that could lock you out of your computer entirely, were you so foolish as to forget the only account password for your machine. But in time, most folks saw that XP's relaxation - and in some cases complete removal - of the restrictions of the old DOS-kernel-based 9x OSes was a worthy trade-up, and the naysayers' voices died down in a general atmosphere of contentment with XP. After all, it had taken Microsoft many years to bring to the masses the vision of a truly new OS, written from the ground up as a 32 bit, fully preemptive, micro-kernel-esque system. An OS that was inherently stable, extensible, robust in every way, and secure enough internally to rate a C2 level from the government. Modern computing was finally here! No more crashing the machine when a single buggy application experienced a failure. No longer would one application be able to grab access to another's memory, or mess with any other component's state and cause harm to your PC. The dream appeared real, and for the most part, functional.

But having a single OS that both businesses and consumers use proved a fantastically huge target for the unscrupulous amongst us. Virus writers, malware writers, phishers, net bots, and so on found a fantastic wealth of users who didn't know how to protect themselves from the explosion of connectivity of that time, and so Microsoft got something of a tarnished reputation as the guys who make the OS that is susceptible to every virus, malware, and bot-net ever conceived.

From there, MS, motivated to avoid losing its dominance in both the consumer market and the business market, has made more and more strident attempts to quash this reputation.

Sadly, this is what has inexorably led to Vista and Windows 7. Not that this is all that Vista and Windows 7 are about. There is a great deal of focus and good design behind C# and the .NET platform as a whole. There are some excellent improvements to the video device driver model, as well as to the sound subsystem. And intertwined with all of these other improvements has been the largely unheralded creation of x64-based versions of XP and Vista, with solid device driver support for the lion's share of devices in the Vista x64 edition. All of those improvements are just that: improvements. They don't take anything away from the end user, they don't make it harder to use one's computer; they only create an enhanced set of capabilities for a computer running Vista / Windows 7, especially x64.

However, all of these advancements are overshadowed by the ineptitude of MS's security model. It seems to me that many parallels could be drawn between MS's wrong-headed approach to security and this nation's wrong-headed approach to foreign diplomacy under the Bush administration. Both sought to improve the security of their "citizens", and both created an atmosphere in which the evildoers and the enforcers thrive at the expense of those citizens. Our rights as owners of our own computers are fewer today than they were when XP was introduced, which itself restricted some of our former rights under 98.

The intent is understandable. We all want to have our computers virus- and malware-free. We want our data to be off-limits to those who aren't authorized to view it. We want our computers to remain functional under attack - to prove stronger than the viruses and malware they're subjected to as we browse the wild west of the Internet. That's a laudable goal.

However, MS has built walls the likes of which can only be seen in Israel and in formerly partitioned Germany. The very idea of such walls is contrary to freedom, to the free flow of ideas, and to empowering citizens to accomplish what they want with their own computers. It takes control out of the hands of owners (users), and puts it in the hands of overseers (Microsoft). Which, from some obviously Machiavellian perspective, fits Microsoft's overall business goals perfectly: they gain control of everyone's PCs, making people all the more dependent upon Microsoft and its affiliates for relief from problems that arise, and it helps move the entire notion of a computer away from an end-user ownership model towards a rental model where Microsoft remains firmly in control. Chances are good that these thoughts have occurred to Steve Ballmer or other luminaries at Microsoft. I'm equally sure that they're not the primary goal or driving force behind MS. There is no conspiracy, just motivating factors. And giving Microsoft more control, and users and businesses even more reason to need MS in order to function well, can only be seen as a good thing by those at MS who wish to get as much as they can out of our wallets (which is by definition the primary goal of any corporation's shareholders, and therefore the primary goal of the corporation's senior management).

When it comes right down to it, Vista and Windows 7's penchant for creating an ever more complicated environment in which to operate as an end user is fundamentally a bad idea. It necessitates middlemen who become the high priests of computers: specialized individuals who have the aptitude and training to maintain end users' and businesses' computers in the face of the growing complexity of doing so. It encourages users to become ever more complacent, ever more inclined to put themselves in the hands of an expert, with no hope of taking control of their machines, of having any real understanding of them or ability to make them work for them and adapt to their needs, rather than the other way around.

We are more and more forced to become slaves to our computers, forced to figure out how to make them function for what we need to accomplish within the ever-more-restricted environments that they present to us, all in the name of "security."

But I ask you - what security? Real security, measurable with actual results? Or the illusion of security - the sense that it is hard for a virus to succeed because it is hard for us to succeed at using our own machines? Is there any concrete evidence that a machine running Windows Vista or Windows 7 is any less susceptible to current viruses, malware, and net bots than the same machine running Windows 98 or XP? I doubt it. I very much doubt it.

Now, I'm sure that, given MS's marketing dept, a reasonable-sounding case can be made that MS has made gains in these areas. However, ask yourself: "Do I feel more confident about my computer now than I did 5 years ago?" Or, "When a virus strikes, am I more or less confident that I can rid my machine of it and restore it to full operation, without losing my files and configuration settings, than I was 5 years ago?"

I suspect that, like me, you don't feel more confident now on either count. That, in truth, the complexity that makes our machines harder to understand and harder to control works in favor of the virus writers (who can specialize in this technology) and against ourselves, who are by definition not going to specialize in every nuance of the ever-changing complexities of MS's security measures.

The fact of the matter, from the most obvious and simple standpoint, is that obscurity and complexity aid the bad guys, and transparency and simplicity aid the common person. If everyone knows the rules, then everyone can figure out how to fix things - or at least a much, much larger pool of folks can - which means that viruses are that much more easily removed and rendered useless. They're also easier to detect: since the system as a whole is easier to understand, it's easier to see what it's supposed to do, and easier to see when it's functioning incorrectly.

At the absolute minimum, I would be content if MS would take its ill-thought-out war on simplicity over to the business-only side of the fence. I think it's still a disservice to its business customers, for much the same reason that I think it's a foolish policy in general, anywhere and anytime, but at least there is some real motivation on the business side... primarily in the form of "I don't care if MS bilks bajillions of dollars from a stupid, sheep-like business community" - but I do care if they debilitate every man, woman, and child in their ill-considered quest for world domination and perfect security.

Home users don't need, nor want, for the most part, a security model that gets in the way of actually accomplishing productive uses for their costly computers. I want my PC to auto-detect other computers on my LAN. I want to share files with my family easily. I want to be able to print to any printer on my LAN without needing a masters in the bizarre underworld of MS "security." My wife wishes to install and play games as painlessly as possible. My daughter wants to get onto Facebook or MySpace and hang out with her friends. She doesn't understand, nor should she be required to understand, or see, or ever have to deal with, a retarded "are you sure you want to install blah-blah-blah?"

If software wants to install onto a computer, a simple signature verification should be performed, and a user-friendly, comprehensible dialog asking whether you trust vendor X should be shown. If you trust that vendor, the software is installed and given enough rights to perform its duties. If you don't, it isn't - end of story. And if it doesn't have proper credentials, then it simply shouldn't be allowed at all, except by a super-user / administrator. I, as owner of the machine, should have the right to shoot myself in the foot. It's my computer, and hence my prerogative.
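A sketch of how simple that decision table could be - this is an illustration of the proposed policy, not any real Windows mechanism, and a real implementation would use public-key signatures rather than the bare SHA-256 digests used here for brevity:

```python
import hashlib

# vendor name -> SHA-256 of the package that vendor published (illustrative)
TRUSTED_VENDORS = {}

def install_decision(package_bytes, vendor, is_administrator):
    """Return 'install', 'ask-user', or 'deny' under the proposed policy."""
    digest = hashlib.sha256(package_bytes).hexdigest()
    if TRUSTED_VENDORS.get(vendor) == digest:
        return "install"    # verified package from a trusted vendor
    if is_administrator:
        return "ask-user"   # the owner may shoot themselves in the foot
    return "deny"           # unverified and unprivileged: end of story
```

Three outcomes, one comprehensible question, and the owner retains the final say.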

But perhaps now we're wandering into a cultural stupidity that needs to be resolved in America before some of these idiocies of technology can be properly addressed. The idea of personal responsibility.

Currently, we seem to be leaning ever so precipitously over the edge of a cliff of an "it's someone else's responsibility at all times" attitude in this nation. An attitude that is incredibly at odds with those of our parents, and their parents, and on back to the settlers and immigrants who founded this nation. If I override the warnings and install something that ultimately is harmful to me... that IS MY OWN DAMN FAULT. It isn't MS's for failing to warn me adequately. It isn't my government's for failing to protect me adequately. It isn't my parents' for failing to teach me adequately. IT IS MY OWN DAMN FAULT. I am responsible for my actions. Me and me alone. And if I do something retarded... well, I suffer the consequences. Not sue someone. "I did something stupid: so pay me lots of money, because you should have made it impossible for me to be an idiot." -- who the fuck thought that this was a reasonable approach to life, and why do we as a nation seem to be wedded to this intellectually, morally, and spiritually vacuous and ultimately self-defeating idea?

Proving gross negligence should be the burden of the plaintiff. And if the negligence isn't gross, obvious, systematic, and otherwise blatantly out of control, then the fault is not the company's, not the government's, but the individual's. You chose to buy the cup of coffee. If you're too fucking dumb to know it's really hot, and that hot things are owie when spilled, then you're too fucking stupid to live on this planet, and fuck you already.

And I don't want to live in a world filled with limitations on what I can do with the things I buy because the manufacturer is terrified that I'll sue (and win) over their failure to stop me from being a fucking idiot. Idiots have removed themselves from the gene pool from time immemorial by virtue of their own stupidity. There is no reason to think that this is a bad thing. There is no reason that I can think of that society should endeavor to protect such idiots from themselves, or waste any resources doing so (though there is a good reason to protect the larger society from such idiots).

So, in a nutshell, Windows 7 is Vista with a few minor visual baubles, plus the myriad patches necessary to make Vista stable, and a continuation of the misguided policies at Microsoft that further alienate the American public and solidify its reputation as an error-prone and user-unfriendly OS and company in general.

Apple gets no love from me, but that's a rant for another day.

And that's my take on Windows 7. God help us all.