iOS7 Problems
On launch day for iOS7, I warned my whole family in an email not to upgrade until the release had some time to prove itself. This is why. Apparently it is a mess, at least for some users.
The sort-of-funny part is that they defend themselves by saying "at least it is not as bad as a Microsoft OS launch." We certainly have latched onto a new form of accountability in the Obama age: "Don't criticize me, because I am not as bad as the other guy."
By the way, as someone who has been royally pissed off at Microsoft many times in the past: Microsoft has to accommodate thousands of hardware configurations and a much more loosely controlled development community. There are fewer excuses for Apple, which develops for a single hardware platform that it totally controls.
Also, there is one other difference -- when I am unhappy with a Microsoft OS, as I was with Vista, I can simply roll back to the previous version (in that case XP). Apple does not give users any way to roll back their iPhone OS.
Arrian:
Also, Microsoft has to deal with the fact that anyone can produce and sell software that runs on their OS. Apple doesn't let anything go into the App Store without being certified first. They not only control the hardware, but they're acting as a gatekeeper for the software as well.
October 18, 2013, 9:00 am
norse:
Thank you Warren for actually getting this! ("Microsoft has to accommodate thousands of hardware configurations") The tendency of Apple folks to point to MSFT as a comparison while completely ignoring that we're solving a very different, much bigger problem has always infuriated me. And, having just shipped a major OS revision in under a year (you have to work in OSes to appreciate what kind of heavy lifting that is), I feel that pointing to MSFT as a negative example is completely counterfactual.
October 18, 2013, 9:04 am
Craig Howard:
I guess it's not been to Apple's usual standard, but I've not experienced a single problem on my ancient iPhone 4. In fact -- to my surprise -- I like it.
October 18, 2013, 4:45 pm
JW:
"at least it is not as bad as Microsoft OS launch."
Huh. I can't recall a Windows OS update virtually deleting entire hard drives, which OS X iterations have done twice. Every OS X iteration is riddled with problems. The last service pack where MS had a truly significant problem was in Windows NT 4. The worst anyone can accuse MS of is Windows ME. :::shudder::: Vista wasn't great, but it wasn't horrible either.
Windows' biggest problem for years was malware, and that was due to 1) having the largest installed base by several orders of magnitude and 2) installing the default user as the admin. Stupid, stupid, stupid. It finally corrected #2 (and some would argue #1) with Vista.
Unlike Microsoft, Apple controls the software *and* the hardware. It has no excuse for these problems, except its usual poor QA.
October 18, 2013, 9:23 pm
obloodyhell:
}}} Vista wasn't great, but it wasn't horrible either.
It was slow, and kind of clunky, but it didn't have major flaws like deleting hard drives, either.
No, the only truly wretched OS that M$ has produced was F*** ME (I'll not use the term "Win" in connection with it, not being of Democrat mind and body. I have a connection to reality).
Even W8 isn't BAD, it's just a foolish attempt to produce a unified interface for both WIMP and Touch interfaces. I'll accept the notion that it might be possible, but W8 sure as hell ain't it.
No, the real problem with Windows is that it isn't really an OS, it's a BIOS with an enhanced GUI for an interface. It doesn't do some of the most basic things an actual OS does, which is a large part of what allows it to be taken over.
I mean, sorry, no one but Microsoft and their delegated agents should be able to produce an INSTALL routine, according to Microsoft specs. No, they can't stop anyone from rolling their own, but it would be utterly stupid to use anything but the official one(s). And the OS *ought* to actually be able to deny that -- that it can't do so is another sign of their incompetent design. And it's long past time that Windows actually USED the full security capabilities of the Intel processors. I gather W7 and its ilk are getting there, but not yet completely. VMWare shouldn't have any reason to even exist, yet it does.
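A minimal sketch of what that would mean -- entirely hypothetical names and types, nothing like a real Windows API:

    #include <string>

    // Hypothetical stand-in for an OS-level policy check -- illustrative only.
    struct Signature {
        bool valid;                // the binary's signature checks out
        bool chains_to_os_vendor;  // signed against the official install framework
    };

    // Stub: a real implementation would parse an Authenticode-style
    // signature block and validate its certificate chain.
    Signature read_embedded_signature(const std::string& installer_path) {
        (void)installer_path;
        return {false, false};
    }

    // The OS refuses to execute any installer that isn't signed
    // against the vendor's official install machinery.
    bool may_run_installer(const std::string& installer_path) {
        Signature sig = read_embedded_signature(installer_path);
        return sig.valid && sig.chains_to_os_vendor;
    }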
Microsoft programmers' other major incompetence is their love for Giant Balls O' Goo. The registry, the file system itself, and their automatic behavior of creating one giant monolithic hard drive are prime examples of this tendency. It's easier to program, but it just BEGS for catastrophe when the ball o' goo gets corrupted.
Back in the early days of computing, the Apple ][ had a distributed file management system. The sector locations of files were kept with the file, not in a central repository. The OS merely kept a central linkage to the file sector blocks. The power of this was that, even if something corrupted that central linkage, there were tools that could go out and find many of the uncorrupted files just by looking for the (fairly consistent looking) sector chains. You were likely to get some false positives, but any file which was entirely outside the zone of corruption could usually be retrieved intact.
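A minimal sketch of the idea (invented structures, not the actual Apple ][ DOS layout):

    #include <cstdint>
    #include <vector>

    // Each sector carries its own link to the next -- the chain travels
    // with the file instead of living in one central table.
    struct Sector {
        uint16_t next;          // the file's next sector; 0 ends the chain
        uint8_t  payload[254];  // the file data itself
    };

    // Recovery after the central catalog is corrupted: walk the whole disk
    // and collect chain heads -- sectors no other sector points to. Files
    // lying entirely outside the damaged region can still be reassembled.
    std::vector<uint16_t> find_chain_heads(const std::vector<Sector>& disk) {
        std::vector<bool> pointed_to(disk.size(), false);
        for (const Sector& s : disk)
            if (s.next != 0 && s.next < disk.size())
                pointed_to[s.next] = true;
        std::vector<uint16_t> heads;  // candidates; expect some false positives
        for (std::size_t i = 1; i < disk.size(); ++i)  // 0 is the terminator sentinel
            if (!pointed_to[i])
                heads.push_back(static_cast<uint16_t>(i));
        return heads;
    }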
The same problems exist with the registry as a ball o' goo. If the "settings" for a program were kept WITH the program's files, then it seems blatantly obvious that even a corruption of the registry info could readily be fixed by a search and re-assembly of those files. A corrupted file would not affect the entire OS, and it would be possible to RELIABLY re-install the OS on top of a corrupted one and re-create the entire system largely intact.

But with everything in the registry, you're just flat-out screwed. Something messes up the OS, and you have to re-install the whole mess somehow, either by dropping back to the install CD/DVD or by reverting to an earlier image of the OS partition (which, again, undermines the whole sensible idea of breaking your storage into OS/Programs/Data so as to limit the spread of corrupted content).
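As a sketch, with an invented convention (a settings.ini kept beside each program's own files), rebuilding the central index becomes a simple directory walk:

    #include <filesystem>
    #include <map>
    #include <string>

    namespace fs = std::filesystem;

    // Hypothetical: if every program kept its settings beside its own
    // files, a corrupted central index could be rebuilt by re-scanning.
    std::map<std::string, fs::path> rebuild_settings_index(const fs::path& programs_root) {
        std::map<std::string, fs::path> index;
        for (const auto& entry : fs::directory_iterator(programs_root)) {
            if (!entry.is_directory()) continue;
            fs::path settings = entry.path() / "settings.ini";  // invented convention
            if (fs::exists(settings))
                index[entry.path().filename().string()] = settings;
        }
        return index;
    }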
}}} Windows' biggest problem for years was malware
No, a huge chunk of that was flat-out incompetence in design.
If you removed IE and Outlook from your computer, and used Firefox and Thunderbird for browsing and e-mail, you eliminated 75% of your potential risk of malware. At LEAST...
And that was not just because of differences in numbers, either. Netscape 2, released March 1996, had security features that IE3, released August 1996, lacked -- and we're not talking subtle ones, we're talking blatant ones that anyone THINKING about potential security holes would have thought of. I remember, when I heard about an exploit using one of them to take over IE3, thinking "WTF?" And it's a guarantee that Microsoft got a copy of the Netscape beta and could see, early in their own development cycle, what features it had, and could have added them to IE3 even IF they didn't think of them on their own.
October 18, 2013, 9:41 pm
obloodyhell:
Apple is a walking zombie. They learned NOTHING WHATSOEVER from their experience with the Macintosh in the 80s and 90s. Android is already outselling them in smartphones by a factor of 2x to 3x, and Apple's hold on the tablet market is steadily softening:
Apple's iPad Market Share [Quarterly!] Slips Farther Below 50%
That's their Q1 2013 share, but it's not gotten better for them:
Research firm Canalys has for the first time found that non-iOS tablets have surpassed the iPad in global market share. While Apple still holds the biggest market share by a single manufacturer with 42.7%, tablets from Samsung, Amazon, Lenovo, Acer and others have combined to pass the company in total market share with a combined 57.3% of the tablet market.
It's the Mac Experience all over again, and Apple Just Doesn't Get It.
And this time, unlike the late 1990s, if Steve Jobs comes back to save them, it'll be a sign of some serious shit coming down, and that Apple's corporate situation isn't that significant any more anyway...
I just recently purchased a brand-new Samsung 7" Galaxy Tab 3 for US$190, and they're down another $10 since... There are cheap knockoff 7" tablets out there for as little as $70. Even the 8" GT3 costs only $280 -- by contrast, the equivalent iPad, the mini, @7.9", is $340 from Amazon.
Why spend that much more for a single inch of screen? I can see going for a 10" tablet, but 7-vs-8 hardly seems worth an extra $160 -- nearly double the price (yes, there are other small differences).
So it's no surprise Android is eating Apple's lunch.
You'll note that Apple no longer runs their "There's an app for that" commercials, because, well, if Apple has an app for it, Android probably has six.
And there's a reason for this, too -- Apple charges you $100 a year to list anything on its store, while Google charges a one-time $25, and you don't have to jump through Apple hoops to even GET listed.
Apple hasn't updated their base development language since the Mac became popular in the 80s; it still uses stuff based on Jobs' NeXT computer, which used Objective c, an early attempt to turn 'c' into an object-based language that is utterly CLUNKY by comparison to either c++ or even c#... By contrast, most Android development is Java based. Most schools TEACH Java, so anyone coming out of college is already halfway there.
And trust me, well-written "c" is fairly straightforward and obvious in what it does -- even if you don't know programming, you can look at it and make some pretty good guesses as to what it's got going on. Even simple Objective c is hard to read for an experienced programmer, since it's filled with Apple specific jargon as well as an obscure language design:
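A sketch of the kind of thing I mean -- a bog-standard UITableView data-source method, the sort a beginner hits on day one (real UIKit types; the @"Cell" identifier is arbitrary):

    - (UITableViewCell *)tableView:(UITableView *)tableView
             cellForRowAtIndexPath:(NSIndexPath *)indexPath {
        // Reuse a cell if one is available, otherwise build one.
        UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell"];
        if (cell == nil) {
            cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                          reuseIdentifier:@"Cell"];
        }
        cell.textLabel.text = [NSString stringWithFormat:@"Row %ld", (long)indexPath.row];
        return cell;
    }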
Here's some c++, for contrast (no, I grant they aren't directly equivalent in level of function):
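(A sketch -- Database, Transaction, BankAccount, and InsufficientFundsException are illustrative types, not a real library:)

    // Transfer funds between two accounts inside a single transaction.
    void AccountManager::transfer(Database* db, unsigned long fromNumber,
                                  unsigned long toNumber, unsigned int amount) {
        Transaction* t = db->beginTransaction();
        BankAccount* fromAccount = db->loadAccount(fromNumber);
        if (fromAccount->balance() < amount) {
            throw InsufficientFundsException("There aren't enough funds in the source account");
        }
        BankAccount* toAccount = db->loadAccount(toNumber);
        toAccount->deposit(amount);
        fromAccount->debit(amount);
        db->updateAccount(toAccount);
        db->updateAccount(fromAccount);
        t->commit();
    }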
Even if the reader isn't a programmer, they can probably make some guesses as to what parts of the 'c++' code do what. I defy anyone who isn't a programmer to do the same with Objective c code.
Also, the overall model used by Apple, a message-responder design, is not one commonly taught in schools. I'll grant it has some benefits and elegance, but as you can see above, at the least it needs an entire facelift to improve its readability and ease of coding. It needs to get away from Objective c, first and foremost, to something much more modern, like Python or even Ruby (Rails, I gather, uses a similar paradigm).
It's 2013. A development environment should not be using something developed in the mid 1980s as its core defining paradigm.
These things all together really do create huge issues with developers picking up the skills needed to develop for the iPad/iPhone:
1) Huge upfront fee by comparison to distribute
2) Large upfront fee to become a "registered developer" (in addition to the distro expenses)
3) Need permission to distribute, which can be refused without any justification
4) Use of clunky programming paradigm lacking any semblance of easy readability or elegance.
So you'll understand what I mean when I note that the companies around here interested in developing for Apple are finding that people who can code for the iPad/iPhone are far harder to find than people who can code for Android.
... and that will KILL the company even faster than the sales differential. When Apple not only falls behind in app development, but gets STUCK in a limbo where new things come out for Android long before anyone ports them to Apple, that's going to eliminate sales until the iPs become nothing more than a niche market, just as the Macintosh had become by 1998.
October 18, 2013, 10:37 pm
JW:
My response wasn't a defense of Microsoft's coding processes so much as poking holes in the Mac iMythos of being the One True Platform which has no problems.
Apple does some things much better than MS at the desktop level, but its business model relegated it to minority status in terms of installed base, the bulk of which were smug, intolerable shits for whom platform choice was a statement of self-worth. Mobile device computing has changed the market share, but not the iMythos. The PC business model, which the Wintel dynasty was a very large part of, brought (imperfect, but good enough) computing to the masses at a reasonable price point. That's something the Apple elitists and the lickspittle tech press constantly overlook.
October 19, 2013, 8:22 am
T:
"aren't directly equivalent" is being generous. Here is the same c++ function written in objective C
- (void) transferAmount: (unsigned int) amount
      fromAccountNumber: (unsigned long) fromNumber
        toAccountNumber: (unsigned long) toNumber
       withDbConnection: (Database *) db {
    Transaction* t = [db beginTransaction];
    BankAccount* fromAccount = [db loadAccount: fromNumber];
    if( fromAccount.balance < amount ){
        @throw [InsufficientFundsException exceptionWithMessage:@"There aren't enough funds in the source account"];
    }
    BankAccount* toAccount = [db loadAccount: toNumber];
    [toAccount deposit: amount];
    [fromAccount debit: amount];
    [db updateAccount: toAccount];
    [db updateAccount: fromAccount];
    [t commit];
}
I think that's just as readable as your c++ code. Also, in your list of things "wrong" with the iOS development model, the costs for items 1 and 2 are the same thing.
Now, to be on topic: the iOS 7 release has been one of the worst releases for Apple in a long time. The whole thing clearly could have used another couple of months of "little things" testing. Certainly one of the traits Apple has been known for is getting the "little things" right, and this iOS release has many little things wrong with it that add up to a lousy experience, on top of the new visuals (which will always garner ill will).
October 19, 2013, 10:36 am
ErikTheRed:
Apple's standards have certainly slipped without the screaming iron fist of Steve Jobs at the helm. That being said, I "boldly" installed iOS 7 the day it came out (since I'm the person everyone I know calls when they have a problem) and... nothing bad happened. The only stupid issue I've had to deal with is the one related to scaling and cropping background images, which is annoying but far from the end of the world. I've only had one or two minor issues from the large group of people I support.
I strongly suspect that many issues that have come up are those that would have happened anyway - especially stuff like WiFi configuration. A certain percentage of people always have tech issues whether due to bad hardware, software, documentation (or lack thereof), or just end-user dumbness (it happens plenty). If these issues occur right after an OS release (regardless of vendor), it's automatically associated with the new release. Some of it is, and some of it isn't.
October 19, 2013, 12:22 pm
obloodyhell:
}}} The costs for items 1 and 2 are the same thing.
And yet both are substantially greater than the same bar for Android users. I rest that part of my case.
}}} I think that's equally as readable as your c++ code
I would disagree for the simple reason that the message-receiver model used by Oc is ass-backwards from the one used by c, which is structured much more like English. I can't speak for other languages, but for any of the Latin-derived ones, standard c or c++ and even c# are generally going to resemble English far better.
That's not a direct criticism of the model, it's got some naturally elegant underpinnings at the deeper end -- but it makes the "ramp up" to developing for Apple much higher and tougher.
That Oc code I listed is something a **beginner** bumps into and has to deal with. THAT is why I used it with the caveat. There's nothing equivalent to it for Windows, by comparison. You don't have to get down to that level of clunky coding until far, far later in the dev learning cycle.
I will grant this is a debatable element of things, mind you.
The other problem is that Xcode is a much more primitive dev environment than, for example, Visual Studio, partly because some of the things the Apple design team has done are abject cretinisms (programmatically setting a control to a common color is ridiculously obtuse for such a common action, for example. It's downright trivial in VS, even for colors you assemble from RGB values, much less about 60-odd predefined color constants).

I haven't seen the equivalent dev tools for Android, but since Java is one of the primary dev environments for Android, I'd suspect those tools are either much better or well along the way to becoming much better, since many can be ports of Windows tools. If they're still overcomplex the way Xcode/Cocoa/Oc are, they won't be for long -- whereas Apple has had a good 6-7 years on these tools in particular, and a couple DECADES in general (since the same tools are at least partly, if not completely, relevant to Mac development). Those tools should be far further along and much more developer friendly than they are. They should recognize at least one or two more advanced programming languages, like Python or Ruby, instead of entirely relegating all coding to a language which has never had prominence outside of Apple and has never been commonly taught at most universities.
But that's Apple's way, and one of its main flaws as a company -- "There's only one way to do anything". And this is pretty relevant when you consider all the stuff that they've been missing out on as a result of that. The scroll wheel, for example, is such a massive improvement in functionality it's not funny. And that stupid button they added to do something "similar" to it is still comparative garbage.
October 20, 2013, 5:03 pm
obloodyhell:
Agreed. And FWIW, I think Microsoft is in trouble, too. Their death-knell will be sounded whenever an Android desktop becomes available. I also predict that Windows 9 is likely to be the last "significant" version of Windows. After that, Android will eat their market share up, too. Google is not Netscape. They aren't going to sit by and let Microsoft beat them over the head with a "can't we all just get along?" business policy approach; they'll set out to eat Microsoft's lunch.
October 21, 2013, 3:29 am
T:
And yet both are substantially greater than the same bar for Android users. I rest that part of my case.
Oh sure, no argument there, but to list them out as two separate items implies that they are two separate costs when they are one and the same. The $99 you pay to become a registered developer also covers distribution, and vice versa.
I would disagree for the simple reason that the message-receiver model used by Oc is ass-backwards from the one used by c, which is structured much more like English. I can't speak for other languages, but for any of the Latin-derived ones, standard c or c++ and even c# are generally going to resemble English far better.
I really have to disagree. The syntax is different but the order remains the same. Function calls in C++, Java etc are object.verb(subject, subject, subject), and in objective-c are [object verb: subject with:subject using:subject]. In fact, to be perfectly honest, I think objective c (with the caveat that the functions have to be well written) is far closer to English than c++/java. For example, for the function we're discussing, the call would be written as follows:
In c++ / java:
accountManager.transfer(dbConnection, fromAccountNumber, toAccountNumber, amount);
As with my above caveat, this requires both a well-written function (for instance, naming it "doIt()" would be worthless) and well-written variable names, as accountManager.transfer(db, fnum, tnum, a); (which is sadly all too common) gives no hint as to what is going on.
By comparison here is objective-c:
[accountManager transferAmount: amount fromAccountNumber: fromNumber toAccountNumber: toNumber withDbConnection: dbConnection];
Again, the caveat is that the function needs to be well written, but on the other hand the variables do not. Replacing the variable names with a, fnum, tnum and db still retains most of the readability and information.
None of this is to say that Objective-C is perfect. I spend all day writing java code, and it has plenty of features that I wish were available in Objective-C, but conversely there are plenty of features (like the message passing syntax, synthesized getters and setters, or the entire delegate model for UI) that I wish java had. But as I said, I strongly disagree with your assertion that either language is particularly easy to parse without first having learned some of the syntax.
The other problem is that Xcode is a much more primitive dev environment than, for example, Visual Studio
You will get no argument from me here. VS is far and away one of the best full IDEs I've ever used and both XCode and Eclipse could learn a lot from it.
programmatically setting a control to a common color is ridiculously obtuse for such a common action, for example. It's downright trivial in VS, even for colors you assemble from RGB values, much less about 60-odd predefined color constants
I have to ask: has it been some years since you last looked at Cocoa development? Because frankly, I don't see what's obtuse about:
[control setBackgroundColor:[NSColor redColor]];
or even
[control setBackgroundColor:[NSColor colorWithDeviceRed: 0.2 green: 0.4 blue: 0.23 alpha: 1]];
Again, this isn't to say that Objective-C doesn't have its obtuse spots (seriously: string concatenation, and manual memory management pre-ARC (and even now with regard to strong references)), but all languages do (seriously: java generics are screwy, and god help you if you want to do any reflection work), and, as you point out, it is a language from the 80s, and it holds up surprisingly well.
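For instance, the concatenation gripe in miniature (a rough sketch; these are real Foundation calls):

    // Objective-C has no "+" for strings -- it's either nested appends...
    NSString *who = @"world";
    NSString *greeting = [@"Hello, " stringByAppendingString:who];
    // ...or a format string for anything past two pieces:
    NSString *line = [NSString stringWithFormat:@"%@! You have %d messages.", greeting, 3];
    // Java equivalent: String line = "Hello, " + who + "! You have " + 3 + " messages.";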
October 21, 2013, 11:11 am
obloodyhell:
}}} The $99 you pay to become a registered developer also covers distribution, and vice versa.
Not to dispute, but I thought there was a separate cost for each thing you released? Since Apple gives its imprimatur to everything released, I don't see them not charging you for each review... or at least for every separate product they allow. That was the impression I got while reading the documentation, but no, I'd not be surprised if it was wrong.
That brings up another issue I'm not fond of, which is the notion that you have to get their permission to market anything. You could spend your time developing something and have them tell you, "No, sorry, we've decided we don't like your product."
There was some app that managed to get past their rejection mechanisms; all it did was provide cheesecake Sports-Illustrated-Swimsuit-style backgrounds for the phone -- I believe it was up for all of three days before they banned it. So even if you make it past their censors, your app can still be pulled from the market at any time.
You might think that's a narrow, limited case, but I see it as arbitrary bureaucratic interference in the free market. Given Apple's Cali-based liberal culture, for example, I could see someone creating a gun-owner's digest, a gun-model pricing database, or a gun maintenance guide, and having Apple decide they didn't like it and refuse to allow them to be sold -- even though all of those have every legitimate reason to exist and a clear market of legal individuals.
So, as a developer, that would be one more reason not to take the time and money to create something for the iPs, or at least not to do so until AFTER you've already used an Android version to establish a market and provide a cash flow.
}}} Replacing the variable names with a, fnum, tnum and db still retains most of the readability and information.
THERE, yes. But what about other references?
tnum = (AcctNumber) current;
doesn't tell you anything about how that's going to be used. Better to use the name toAcctNumber (Apple also appears to hate obvious abbreviations, apparently expecting all future programmers to fail the TOEFL), and then it'll be obvious what it is no matter where you ref it and in what context.
}}} [accountManager transferAmount: amount fromAccountNumber: fromNumber toAccountNumber: toNumber withDbConnection: dbConnection];
The increased verbosity alone is more than enough to make the thing harder to read, I assert. You don't need the duplication of "fromAccountNumber: fromNumber" to get across the action being performed. It makes the entire command too long to take in with a casual glance; you have to actually read it to see what it's doing. This has to do with perceptual limits that vary from person to person.
E.g. How many dots: .... - vs - How many dots: .................... ? Your eye/brain can absorb only a certain amount without breaking it up.
This is a particular problem with method names:
- (void)tableView:(UITableView *)tableView
accessoryButtonTappedForRowWithIndexPath:(NSIndexPath *)indexPath
and here's another:
- (void)tableView:(UITableView *)tableView
commitEditingStyle:(UITableViewCellEditingStyle)editingStyle
forRowAtIndexPath:(NSIndexPath *)indexPath
and a third:
- (void)tableView:(UITableView *)tableView
moveRowAtIndexPath:(NSIndexPath *)fromIndexPath
toIndexPath:(NSIndexPath *)toIndexPath
And these are TYPICAL method declarations, not atypical ones.
The excess verbosity adds up and makes it impossible to scan quickly for the thing you're looking for. You virtually HAVE to use "find" if there are more than 4-5 functions in the ".m".
Then add to that all the damned pointer cast verbiage. Casts should be needed occasionally, not every single variable reference on every occasion. It's visual clutter, and in Objective-c it's not a minor issue.
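Typical of what I mean -- three ordinary lines, every one needing a cast (MyCell and MyItem stand in for your own classes):

    MyCell *cell = (MyCell *)[tableView dequeueReusableCellWithIdentifier:@"Cell"];
    UILabel *label = (UILabel *)[cell viewWithTag:1];                   // viewWithTag: returns a plain UIView *
    MyItem *item = (MyItem *)[self.items objectAtIndex:indexPath.row];  // NSArray hands back id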
I'm guessing you learn to block that crap out with experience, but... again, a giant, pointlessly large hill to navigate in learning to program due to a particularly weak development tool.
}}} [control setBackgroundColor:[NSColor redColor]]
Well, first off, that didn't work. I haven't had time to figure out what magic handwave was needed to make it work.
Here's the same for C#:
countDown.BackColor = Color.Azure;
"countDown" is the control name Notice the lack of "set" and "ground" -- both are excess verbiage adding to the amount you have to take in with your eyes, and not needed if you have half a brain -- and the only reason you need "Color." in this instance is because you're referring to a pre-defined enum type that has about 60-odd colors already pre-defined for you, and c# has amended the enum design so that you always have to precede it with the type identifier.
You could, instead (if you were using it a lot), just set
static readonly Color Azure = Color.Azure; // "given the fact that it's obviously a color reference"...
(c#'s "const" only accepts compile-time constants like numbers and strings, so a Color needs "readonly" instead)
and then it reduces to the remarkably short:
countDown.BackColor = Azure;
It's been reduced to its most basic elements and gets the notion across as succinctly as reasonably possible.
Would anyone reading this, even those not knowing programming, have a hard time guessing what this does? I'd lay odds anyone who has ever programmed anything more complicated than "Hello, World!" can figure out in an instant what is going on. They'd have to actually READ the Oc version to get it.
Q.E.D., you don't need twice as many characters to convey the action being performed.
If this were a lone example, I'd not be making an issue of it, but, as you can see from the above, Apple likes to add all sorts of excess verbiage and requires all manner of unneeded casts to get across even simple activities. I can see some advanced-level benefits in some elements of their design, but am willing to bet that, if I'd had 15-20-odd years of dev time on these tools, I could've CRAPPED something better than the current iteration of XCode/Cocoa/Objective-c.
}}} it is a language from the 80s, and it holds up surprisingly well.
I'll grant that... the real argument lies in whether or not it should still be getting used at all. There's been a lot of development done in how to construct event-driven object-oriented languages in the last 20+ years, and it does not implement any of those developments well. ;-)
October 22, 2013, 6:41 am
obloodyhell:
By the way, here's a perfect example of the kind of ridiculously excess verbosity I'm talking about --
NSMutableString *num = [NSMutableString stringWithString:[[NSNumber numberWithInt:index] stringValue]];
What does this do? It declares a string and converts an integer to its string representation, in a non-"fixed" ("mutable") string form.
C#:
string myString = myInt.ToString();
LESS than half the characters. :-/ I can READ the C# and take the whole thing in at one glance. The idiot ObC call I have to actually PARSE to see what it's doing.
Partly because C# handles the string concept without requiring the user to deal with the issue of fixed vs. dynamic strings -- C# strings are immutable under the hood, and the compiler does the bookkeeping (StringBuilder covers the genuinely mutable case). If you want the variable itself to be unchangeable too, you mark the field "readonly" ONCE when you CREATE it (C#'s "const" only takes compile-time constants, so a method-call result needs "readonly"):
readonly string myString; // assigned once, e.g. myString = myInt.ToString(); in the constructor
and then you treat it EXACTLY the same from then on -- it doesn't require special overhead methods adding 7 characters to every method call, or an additional specific method call to convert between const and dynamic strings -- and even if it did, it would probably be nothing other than a cast or a simple method call such as the above "ToString()".
November 3, 2013, 1:23 pm
Chris:
"No, the real problem with Windows is that it isn't really an OS, it's a BIOS with an enhanced GUI for an interface."
No, that's not what Windows is at all. If you want something like that -- and it works REALLY well for diagnostics, disk recovery, or (a now largely deprecated use) running a demo on a clean system -- you want to look at the Commodore Amiga systems, like the Amiga 1200. It had a GUI -- Workbench -- that was upgradeable and burned onto a PROM chip. If you needed to run diagnostics or had some other troubleshooting demand, you could boot into this firmware, run the GUI from PROM, troubleshoot and fix the system, and then reboot to disk.
You can do similar things with Apple's EFI, and with Sun boxes and their EFI/BootROMs.
These are useful and decent things to have in a system; don't conflate Windows with this.
------
"I mean, sorry, no one but Microsoft and their delegated agents should be able to produce an INSTALL routine, according to Microsoft specs. No, they can't stop anyone from rolling their own, but it would be utterly stupid to use anything but the official one(s) And the OS *ought* to actually be able to deny that -- that it can't do so is another sign of their incompetent design. And it's long past time that Windows actually USED the full security capabilities of the Intel processors."
Sounds like you're talking about leveraging Trusted Computing (TC) with a cert-chain to assure that Developer X really is Developer X. There are two problems with this: (a) you have to wait for Microsoft et al. to get around to issuing a certificate, and (b) this imposes an additional cost that small software shops must shoulder. The certificate costs $500/year? Fine, if you're Adobe, it's a tiny line-item. Yet, if you are a small one-man software operation, this could buy hardware, so you are faced with the choice of shorting users through not having better/more hardware, or shorting users by presenting them with a scary THIS SOFTWARE MIGHT BE DANGEROUS pop-up. And, of course, the fear with TC is always that, one day, Microsoft might decide to revoke keys or stop programs from running altogether. It is significantly more "free" to permit an operating system to run code that was compiled for it, rather than compiled AND deemed trustworthy by a central authority who might have a financial motive to discourage marketplace competition.
------
"Microsoft programmers' other major incompetence is their love for Giant Balls O' Goo. The registry, the file system itself, and their automatic behavior of creating one giant monolithic hard drive is a prime example of this tendency."
Ah, the registry hive. I agree with you here. Remember the post-Windows 95 world and the cottage industry for books and software that promised to protect you from registry errors, and coach you through successful recovery of the registry should a key/branch become corrupted? Good times.
------
"VMWare shouldn't have any reason to even exist, yet it does."
I have no idea what you're trying to say here, and must conclude that I am fundamentally misunderstanding you. VMWare does nothing new. It takes hardware and provisions it for virtual machines to use, and it provides an intermediary between VM devices and physical devices. So what? We've been successfully doing this since at least the old Amoeba project (1983-1996) days.
These days, VMWare makes doing this easy, and it is the de facto VM package of choice (if you're not a UNIX shop running OpenBSD jails, say). I think that's because of two main advantages it has over other packages: (a) it runs as "just an application" on a host OS, so we get to pretend it is "just another application" for an administrator to support, rather than a whole new animal (untrue, but that's okay); and (b) it doesn't require a dedicated (new, and a new budget line-item) box.
I'd rather run a Xen box (though I can't speak to its performance since Citrix (I think?) took them over) and provision from bare iron, but this requires yet another physical machine.
The truth is that there are a great many reasons for a product like VMWare to exist, and multiple products have existed for ages on the home/SOHO/IT computing platforms for this purpose, and similar purposes.
November 5, 2013, 3:29 am