I joined Google [earlier]…as an Engineering Director. This was, as I understand it, soon after an event where Larry either suggested or tried to fire all of the managers, believing they didn’t do much that was productive. (I’d say it was apocryphal but it did get written up in a Doc that had a bunch of Google lore, so it got enough oversight that it was probably at least somewhat accurate.)
At that time people were hammering on the doors trying to get in and some reasonably large subset, carefully vetted with stringent “smart tests,” were being let in. The official mantra was, “hire the smartest people and they’ll figure out the right thing to do.” People were generally allowed to sign up for any project that interested them (there was a database where engineers could literally add their name to a project that interested them) and there was quite a bit of encouragement for people to relocate to remote offices. Someone (not Eric, I think it probably was Sergey) proposed opening offices anyplace there were smart people so that we could vacuum them up. Almost anything would be considered as a new project unless it was considered to be “not ambitious enough.” The food was fabulous. Recruiters, reportedly, told people they could work on “anything they wanted to.” There were microkitchens stocked with fabulous treats every 500′ and the toilets were fancy Japanese…uh…auto-cleaning and drying types.
And… infrastructure projects and unglamorous projects went wanting for people to work on them. They had a half-day meeting to review file system projects because…it turns out that many, many top computer scientists evidently dream of writing their own file systems. The level of entitlement displayed around things like which treats were provided at the microkitchens was…intense. (Later, there was a tragicomic story of when they changed bus schedules so that people couldn’t exploit the kitchens by getting meals for themselves [and family…seen that with my own eyes!] “to go” and take them home with them on the Google Bus — someone actually complained in a company meeting that the new schedules…meant they couldn’t get their meals to go. And they changed the bus schedule back, even though their intent was to reduce the abuse of the free food.)
Now, most of all that came from two sources not exclusively related to the question at hand:
Google (largely Larry, I think) was fearless about trying new things. There was a general notion that we were so smart we could figure out a new, better way to do anything. That was really awesome. I’d say, overall, that it mostly didn’t pan out…but it did once in a while, and it may well be that just thinking that way made working there so much fun that it did make an atmosphere where, overall, great things happened.
Google was awash in money and happy to spray it all over its employees. Also awesome, but not something you can generalize for all businesses. Amazon, of course, took a very different tack. (It’s pretty painful to hear the stories in The Everything Store or similar books about the relatively Spartan conditions Amazon maintained. I was the site lead for the Google [xxxx] office for a while and we hired a fair number of Amazon refugees. They were really happy to be in Google, generally…not necessarily to either of our benefit.)
For your Christmas edification, the fat, bloated, and incompetent corporation that owns your PC has delivered a sanctimonious PC sermonette which ought to make a lot of viewers want to go out and throw up in the street.
Charles O’Rear, the professional photographer who took the photograph titled Bliss, which he sold to Microsoft to be used as the world-famous wallpaper for Windows XP, explains where he took the photo, what camera and film he used, and tells us: No, it was not Photoshopped.
Jason Stewart (a self-described Apple addict) does not have much good to say about the current (incredibly expensive) MacBook Pro.
As Dan Ackerman at CNet noted, the Retina Mac “feels like a rest stop on the road to somewhere else,” a place where we truly get thin, light and beautiful. Already, the Samsung Series 9 is smaller and lighter. And many of the rest of the radical changes are more marketing hype than features. The asymmetrical fan blades that were going to revolutionize quiet laptop cooling? If you try real hard, you might hear a trivial difference. What about the “All Flash Architecture”? In other words, whereas before you had a choice between a fast, but ridiculously expensive SSD drive, or a cheaper, larger capacity conventional hard drive, now you can only have the SSD drive. Only Apple could successfully market that limitation as a revolutionary feature.
Of course, if you need to connect an Ethernet cable you better shell out extra for a dongle and hope you can find it when you need it. Need to watch a movie on disc or load a program or content the old fashioned way? Apple has just the extra accessory to sell you for that, too, since it is no longer included.
The new MacBook Pro with Retina display is a nice computer. The screen is innovative at a cost of both dollars and features. Whether it’s worth the substantial premium (more than $4,000, fully loaded) is a personal decision. Apple has never been accused of catering to the wish lists of the masses, and this is no exception. It has staked its claim on a new display standard and if that means trade-offs, take it or leave it.
Meanwhile, Harry McCracken contends that the Mac world and the PC world are already very different and may soon become even more so.
When I sat down to review Apple’s new Retina-display MacBook Pro, I instinctively wanted to compare it with similar Windows laptops. I wanted to discuss how the specs stacked up and whether the price seemed fair. I hoped to contrast its industrial design with those of its closest counterparts.
Then it dawned on me: there are no similar Windows laptops. …
[W]hile Apple remains the most influential computer maker in the business, the rest of the industry has chosen to ignore some of its design innovations. When it started sealing up its portables a few years ago — eliminating the ability to easily swap in batteries, RAM and hard drives — I thought that other hardware makers might follow along. For the most part, they haven’t. …
[I]f Microsoft has its way, PCs and Macs are about to get more different than they’ve been in decades. For all of the interesting things about the new MacBook Pro, it’s a straightforward notebook computer based on a form factor that’s been around for 30 years. Apple seems to be content to let Macs be Macs, while the iPad goes places that computing devices never have before.
With Windows 8, however, Microsoft is trying to reinvent the PC from scratch. The Metro interface has little to do with the basic concepts that Windows 7 and OS X share, and it’s conceivable that a bunch of long-standing form factors that have never quite worked, such as touchscreen PCs and laptops that convert into tablets, will finally take off. If they do, and Apple doesn’t push the Mac in the same direction, the average Windows PC could end up having very little in common with any Mac.
Tom Scocca writes the epitaph for Redmond’s increasingly annoying ultimate piece of bloatware.
Nowadays, I get [a] feeling of dread when I open an email to see a Microsoft Word document attached. Time and effort are about to be wasted cleaning up someone’s archaic habits. A Word file is the story-fax of the early 21st century: cumbersome, inefficient, and a relic of obsolete assumptions about technology. It’s time to give up on Word. …
[Word has] become an overbearing boss, one who specializes in make-work. Part of this is Microsoft’s more-is-more approach to adding capabilities, and leaving all of them in the “on” position. Around the first time Clippy launched himself, uninvited, between me and something I was trying to write, I found myself wishing Word had a simple, built-in button for “cut it out and never again do that thing you just did.” It’s possible that the current version of Word does have one; I have no idea where among the layers of menus and toolbars it might be. All I really know how to do up there anymore is to go in and disable AutoCorrect, so that the program will type what I’ve typed, rather than what some software engineer thinks it should think I’m trying to type.
Word’s stylistic preferences range from the irritating — the superscript “th” on ordinal numbers, the eagerness to forcibly indent any numbered list it detects — to the outright wrong. Microsoft’s inability to teach a computer to use an apostrophe correctly, through its comically misnamed “smart quotes” feature, has spread from the virtual world into the real one, till professional ballplayers take the field with amateur punctuation on their hats.
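Scocca’s apostrophe complaint points at a real, well-understood failure mode: quote-curling features typically guess a quote’s direction from what comes before it, so a leading apostrophe in an elision like ’09 or ’til, which follows a space, gets curled the wrong way. A minimal sketch of that whitespace heuristic (a hypothetical `naive_smart_quotes` for illustration, not Word’s actual code):

```python
def naive_smart_quotes(text: str) -> str:
    """Curl straight single quotes using the common whitespace heuristic:
    a quote at the start of the text or after whitespace is assumed to
    OPEN a quotation; anything else is treated as a closing quote or
    apostrophe."""
    out = []
    for i, ch in enumerate(text):
        if ch == "'":
            if i == 0 or text[i - 1].isspace():
                out.append("\u2018")  # ‘ left/opening single quote
            else:
                out.append("\u2019")  # ’ right single quote / apostrophe
        else:
            out.append(ch)
    return "".join(out)

# Mid-word apostrophes come out fine:
naive_smart_quotes("it's fine")    # it’s fine
# But a leading apostrophe in an elision follows a space, so the
# heuristic emits an opening quote instead of an apostrophe:
naive_smart_quotes("back in '09")  # back in ‘09 — should be ’09
```

The ballcap case is exactly this second branch: lettering like ’47 or ’Stros starts with an elision, the software sees a word boundary, and an opening quote ends up on the hat.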
Even so, people can live with typos in their input. (Witness the boom in paraphasic email Sent From My iPhone.) What makes Word unbearable is the output. Like the fax machine, Word was designed to put things on paper. It was a tool of the desktop-publishing revolution, allowing ordinary computer users to make professional (or at least approximately professional) document layouts and to print them out. That’s great if you’re making a lot of church bulletins or lost-dog fliers. Keep on using Word. (Maybe keep better track of your dog, though.)
For most people now, though, publishing means putting things on the Web. Desktop publishing has given way to laptop or smartphone publishing. And Microsoft Word is an atrocious tool for Web writing. Its document-formatting mission means that every piece of text it creates is thickly wrapped in metadata, layer on layer of invisible, unnecessary instructions about how the words should look on paper.
Charlie Brooker, at the Guardian, knows that Windows sucks, but explains that he still hates Macs and Mac users more.
Recently I sat in a room trying to write something on a Sony Vaio PC laptop which seemed to be running a special slow-motion edition of Windows Vista specifically designed to infuriate human beings as much as possible. Trying to get it to do anything was like issuing instructions to a depressed employee over a sluggish satellite feed. When I clicked on an application it spent a small eternity contemplating the philosophical implications of opening it, begrudgingly complying with my request several months later. It drove me up the wall. I called it a bastard and worse. At one point I punched a table. …
I know Windows is awful. Everyone knows Windows is awful. Windows is like the faint smell of piss in a subway: it’s there, and there’s nothing you can do about it. OK, OK: I know other operating systems are available. But their advocates seem even creepier, snootier and more insistent than Mac owners. The harder they try to convince me, the more I’m repelled. To them, I’m a sheep. And they’re right. I’m a helpless, stupid, lazy sheep. I’m also a masochist. And that’s why I continue to use Windows – horrible Windows – even though I hate every second of it. It’s grim, it’s slow, everything’s badly designed and nothing really works properly: using Windows is like living in a communist bloc nation circa 1981. And I wouldn’t change it for the world, because I’m an abject bloody idiot and I hate myself, and this is what I deserve: to be sentenced to Windows for life.
That’s why Windows works for me. But I’d never recommend it to anybody else, ever. This puts me in line with roughly everybody else in the world. No one has ever earnestly turned to a fellow human being and said, “Hey, have you considered Windows?” Not in the real world at any rate.
Until now. Microsoft, hellbent on tackling the conspicuous lack of word-of-mouth recommendation, is encouraging people – real people – to host “Windows 7 launch parties” to celebrate the 22 October release of, er, Windows 7. The idea is that you invite a group of friends – your real friends – to your home – your real home – and entertain them with a series of Windows 7 tutorials.
Win 7 Launch Party video: A very serious contender for lamest (interminable at 6:14) video ever made.