It's here! It's here! All the local news headlines you need to know about, delivered straight to your e-mail box and from there to your little grey brain.
Learn more about it here.
Sign up at the handy link below.
CLICK HERE to get on board with your very own MISCmedia MAIL subscription!
Bill AND Melinda Gates as potential Vice Presidents? Ridiculous. Among our other topics today: our pal Kelly Lyles and her art-van; HALA changes; a high school requesting “pledges” of attendance only from Af-Am students; a game company responding to allegations of “enabling” gambling; and an International District institution threatened.
We at #MISCmediaMAIL believe the Northwest autumn isn’t to be endured or survived but savored. We additionally sort out alleged conservative local-media bias; changes at SIFF; not-really-recyclable bags; ethnic emoticons; and a candle that supposedly smells like a “new Mac.”
In your blustery-day MISCmedia MAIL: Two Wash. enviro-stars get honored; how Bill Gates could’ve made more money but didn’t; trees on a condo tower?; no City-owned broadband this year.
[photo: Ballmer at We Day in KeyArena, 3/27/13; via washingtonpost.com]
In its 38 years of existence, Microsoft has had only two CEOs.
That's about to change.
Steve Ballmer’s calling it quits, effective sometime next year.
Fret not for the big guy with the big voice and the big body language. He’ll get a retirement-severance package bigger than the economies of several Third World states.
It’s what will happen to the empire of code and copyrights in Redmond (with tendrils worldwide) that’s at stake.
•
As all good schoolchildren know, Microsoft began in the primordial-ooze era before personal computers, when tiny startup companies made build-it-yourself electronics kits which, when assembled, could perform some of the functions of “real” computers (except, you know, for the function of performing real, practical work).
Bill Gates and Paul Allen made a stripped-down version of the BASIC programming language; then (more importantly) they established the notion that software should be paid for. They backed up this concept with copyrights and patents and lawyers.
With Ballmer at their side, Gates and Allen bought an operating system, re-sold it to IBM, and kept the right to also sell versions of it to other computer makers. MS would let hardware makers battle it out among themselves while it controlled the “platform” their products all ran on.
This led to the DOS near-monopoly, which segued into the Windows near-monopoly.
It also led to Office and Internet Explorer.
It led to SQL Server, and to other high-end business software and related services.
Most everything Microsoft makes money from can be traced directly back to its early DOS-era dominance.
The company’s tried to get into other things. But those other things have had mixed results.
Remember MSN.com’s “online shows” concept? (The last survivor of those sites, Slate, is now among the Washington Post Co. properties not being sold to Jeff Bezos.)
Remember WebTV, HD-DVD, Mediaroom, Bob, Clippy, Hotmail, Actimates toys, the Encarta CD-ROM encyclopedia, Sidewalk.com, and Zune?
It’s probably easier to remember the Surface RT tablet device, the one the company recently wrote off to the tune of $900 million.
The company’s most successful new consumer-product line, the Xbox game platform, is built (at least marketing-wise) on Windows’ gaming clientele. And even this realm has had its duds (Xbox One, anyone?).
The jury’s still out on Windows Phone. Is there room for a third smartphone platform?
Microsoft could afford all these failures. Yes, even the Surface RT.
It could afford to keep an unsuccessful project going long enough to learn every little thing about why it failed.
And it could keep a successful project going long enough to watch its trajectory as the times, and the industry, pass it by.
So: Is today’s tech-universe passing Microsoft by?
Some analysts and pundits are making that claim.
They say the age of the one-size-fits-all personal computer has peaked.
There aren’t enough reasons for people or companies to keep replacing them as fast as they used to.
Especially with tablets and smartphones, and their hordes of specialty-function “apps” that make everything-for-everybody software like Office seem like lumbering beasts of prey.
So what should the next MS-boss do?
For one thing, he or she (and how come no women have been named as potentials?) could dump the notorious employee “stack ranking” system, which forces a set percentage of workers in each unit to be labeled as inferior no matter what. It’s horrid for morale and for productivity, and does nothing to improve products or services. If Ballmer really deserves to be called the “worst CEO ever” (he’s not, not by a long shot), it would be over this.
Next: Windows and Office still have many lucrative years left in them. That means there’ll be enough cash on hand to re-steer the company.
But to steer it where?
I say, away from Windows as the “one ring to rule them all.”
Even before phones and tablets, Windows had become an unwieldy thing, needing to perform the same functions (or at least most of them) on umpteen different hardware architectures, from sub-laptops to server arrays; for use by everyone from sophomores and shopkeepers to hospitals and factories.
Word and Excel have similarly undergone years of mounting “feature bloat,” hindering their everyday use for all but the most complex tasks. (Both are also based on a printed-page visual metaphor that’s increasingly obsolete as more people do everything on screens.)
What people increasingly need are simple ways to do specific things (preparing specific kinds of texts or crunching specific kinds of numbers, say), and to bounce the resulting documents around between different machines (their own and other people’s).
Think modular.
Think “apps,” to use the modern parlance.
The New MS could supply a basic ecosystem for modular software, which could be supplemented by developers large and small working in file formats (but not underlying code structures) compatible across different devices running different OSes at different screen sizes.
There’s plenty of space in that for all kinds of software puzzle pieces and building blocks. And for developers and template-scripters to build them.
And there’s no reason (other than entrenched corporate culture) why a lot of those builders couldn’t be at Microsoft.
Think even more “micro,” even more “soft.”
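To make that idea concrete, here’s a minimal, purely hypothetical sketch (in TypeScript; the “misc-doc” format name and every field in it are invented for illustration, not anything Microsoft has actually proposed) of what a shared, device-agnostic file format for such modular apps might look like. The file, not any one program’s code, becomes the common standard.

```typescript
// Hypothetical sketch: a device-agnostic "document" format that any
// small, single-purpose app could read or write, regardless of OS or
// screen size. All names here are invented for illustration.

interface ModularDoc {
  format: "misc-doc";   // format identifier, not tied to any one app
  version: number;      // lets future apps read older files
  title: string;
  blocks: Block[];      // content is a list of typed pieces
}

type Block =
  | { kind: "text"; body: string }
  | { kind: "table"; rows: string[][] };

// Serialize to plain JSON so the file, not the app, carries the standard.
function save(doc: ModularDoc): string {
  return JSON.stringify(doc, null, 2);
}

function load(raw: string): ModularDoc {
  const doc = JSON.parse(raw) as ModularDoc;
  if (doc.format !== "misc-doc") {
    throw new Error("not a misc-doc file");
  }
  return doc;
}

// Example: a tiny note produced by one app, readable by any other.
const note: ModularDoc = {
  format: "misc-doc",
  version: 1,
  title: "Grocery list",
  blocks: [{ kind: "text", body: "eggs, coffee, kale" }],
};

console.log(load(save(note)).title); // "Grocery list"
```

Any number of small, single-purpose apps, on any OS and any screen size, could read and write files like this one; that’s the point of competing on the format rather than on the platform.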
[image via geekwire.com]
[image via comicsbronzeage.com]
Just Sayin’ Dept: Here’s something that hasn’t been publicized much in the World’s Fair 50th anniversary celebrations.
[image via wallyhood.org]
My adventure in Bellingham this past Sunday was cold but lovely. I’ll post a full writeup about it a little later on.
And I’ve got another presentation coming up this Saturday, right here in Seattle! It’s at 2 p.m. at the Klondike Gold Rush National Historic Park, 319 2nd Ave. S. in pontificous Pioneer Square. (That’s right across from Zeitgeist Coffee.) This one concerns my ’06 book Vanishing Seattle, and perhaps all the things that have vanished around here since then. Be there or be frostbitten.
Now, to catch up with a little randomness:
[photo: Denny Hall, the UW campus's oldest building]
Steve Jobs had essentially retired from Apple Inc.’s day-to-day management back in January. On Wednesday he simply made this move official.
Thus ends the second (third, if you count the NeXT/early Pixar years) era of Jobs’s involvement in, and leadership of, the digital gizmo industry.
I will leave it to others more laser-focused on that industry to give the big picture of Jobs’s work and legacy. But here are a few notes on it.
Jobs and Steve Wozniak did not, by themselves, “invent the personal computer.” Many individuals and companies had seen what the early mainframes could potentially do in the hands of smaller-than-corporate users. The early “hacker culture” was a tribe of programmers in corporate, institutional, and particularly collegiate computing centers who snuck in personal projects whenever and wherever they could get processor time.
As the first microprocessor chips came on the market, several outfits came up with primitive programmable computer-like devices built around them, initially offering them in kit form. One of those kit computers was Jobs and Wozniak’s Apple (retroactively renamed the Apple I).
That begat the pre-assembled (but still user-expandable) Apple II. It came out around the same time as Commodore and Radio Shack’s similar offerings. But unlike those two companies, Apple had the two Steves’ nerd street cred. This carefully crafted brand image, that Apple was the microcomputer made by and for “real” computer enthusiasts, helped the company outlast the Eagles, Osbornes, Kaypros, Colecos, and Tandons.
Then the IBM PC came along—and with it MS-DOS, and the PC clones, and eventually Windows.
In response, Jobs and co. made the Apple III (a failure).
Then the Lisa (a failure, but with that vital Xerox-borrowed graphic interface).
Then came the original Macintosh.
A heavily stripped-down scion of the Lisa, it was originally capable of not much besides enthralling and inspiring tens of thousands into seeing in “computers” a potential beyond the mere manipulation of text and data.
The Mac slowly began to fulfill this potential as it gained more memory, more software, and more peripherals, particularly the Apple laser printer that made “desktop publishing” a thing.
But Jobs would be gone by then. Driven out by his own associates, he left behind a company neither he nor anyone else could effectively run.
Jobs created the NeXT computer (a failure, but the machine on which Tim Berners-Lee created the World Wide Web), and bought Pixar (where my ol’ high school pal Brad Bird would direct The Incredibles and Ratatouille).
The Mac lived, but didn’t thrive, in the niche markets of schools and graphic design. But even there, the Windows platform, with its multiple hardware vendors under Microsoft’s OS control, threatened to finally smother its only remaining rival.
Back came Jobs, in a sequence of maneuvers even more complicated than those that had gotten him out of the company.
Out went the Newton, the Pippin, the rainbow logo hues. In came the candy-colored iMac and OS X.
And in came a new business model, that of “digital media.”
There had been a number of computer audio and video formats, many of them Windows-only. For the Mac to survive, Apple had to have its own audio and video formats, and they had to become “industry standards” by being ported to Windows.
Thus, iTunes.
And, from there, the iTunes Store, the iPod, the iPhone, the iPad, and an Apple that was less a computer company and more a media-player-making and media-selling company. The world’s “biggest” company, by stock value, for a few moments last week.
Jobs turned a strategy to survive into a means to thrive.
Along the way he helped to “disrupt” (to use a favorite Wired magazine cliché) the music, video, TV, cell-phone, casual gaming, book publishing, and other industries.
We have all been affected by Jobs, his products, and the design and business creations devised under his helm.
He’s backing away for health reasons. But we’ve all been the subjects of his own experiments, his treatments for “conditions” the world didn’t know it had.
The post-Jobs Apple is led by operations chief Tim Cook, whom Gawker is already calling “the most powerful gay man in America.” That’s based on speculation and rumor. Cook hasn’t actually outed himself, keeping his private life private.
“…Making simple products is way more difficult than making complicated products…. Simple is more complicated, simple is elegant, simple is harder.”
…just now discovered that Bill Gates has bodyguards. Nothing new, folks.
Whenever I’ve seen Gates in public (the first time was circa 1988 at the Crowne Plaza), he’s always surrounded himself with an all-male perimeter crew. I always figured some or all of them were security guys dressed as yes-men and toadies. Whenever he had to walk through or past crowds of strangers, Gates kept an emotional/psychic distance from the civilians by talking nonstop to his dudes, delivering unanswered rants about the flaws in other companies’ technologies.
…now knighthood for Bill Gates. Liz II’s standards seem to be slipping in her dotage.
…about a third of the way down this linked page, that Bill Gates’s highly publicized anti-AIDS crusade’s really a prop-up for the big drug companies, and for the intellectual-property regulations that protect their monopoly (and his):
“Gates knows darn well that ‘intellectual property rights’ laws… are under attack by Nelson Mandela and front-line doctors trying to get cut-rate drugs to the 23 million Africans sick with the AIDS virus…. He’s spending an itsy-bitsy part of his monopoly profits (the $6 billion spent by Gates’s foundation is less than 2% of his net worth) to buy some drugs for a fraction of the dying. The bully billionaire’s ‘philanthropic’ organization is working paw-in-claw with the big pharmaceutical companies in support of the blockade on cheap drug shipments…. Gates says his plan is to reach one million people with medicine by the end of the decade. Another way to read it: He’s locking in a trade system that will effectively block the delivery of medicine to over 20 million.”
…I’ve been reading a lot of talk lately about the principle of “OPM,” or “Other People’s Money.” Nobody knows this better than Bill Gates, who’s been steadily rakin’ in the OS/Windows/Office software royalties for years while the PC hardware makers’ fortunes ebb-‘n’-flow. (Anybody remember Acer, Micron, Packard Bell, or Eagle PCs?) Now his MS minions are promoting a new computer hardware format, code-named “Athens.” The Athens machine’s chock-full of MS Windows-only technologies, making it either useless or cumbersome as a potential Linux box. If adopted by enough manufacturers (and their end users), Athens will send more factory-installed-software monies Bill’s way, while leaving the manufacturers themselves to compete on razor-thin profit margins to sell boxes with roughly the same features.