Monday, January 31, 2011

Why Cryptographers Get All the Girls

I'm very excited to bring you the very first guest post on Carpe Daemon! Benjamin Templeton is an ace cyber warrior embedded with a splinter cell in [CENSORED]. He is an avid urban hang glider and can make ten minute brownies in under eight minutes. He's also single, ladies. I'll get out of his way and let him do his thing.

- 44 Maagnum


Let me preface this discussion with a question: if you were to rank the varieties of computer scientist by sexiness, how far up would “Secret Code Breaker” be? If the answer is “very high”, then read on. If not, you can read on anyway; you might still find this interesting.

Consider the following hypothetical scenario:

The date is January 2002. The United States is engaged in Afghanistan and on the brink of entering Iraq. There is a belief that the Iraqi government possesses weapons of mass destruction, but no definitive proof. To gather more information, the US government sends its top secret agents into the country. These covert operatives use their skills in espionage to intercept an internet communication between top government officials which they believe to contain irrefutable proof. Before they leave, they inspect the intercepted transmission to confirm the legitimacy of their discovery. They are disappointed to find only a nonsensical series of seemingly random characters. But, with confidence in the investigation that led them to the data, they declare, "It must be encrypted! We'll get this back to the States, and our tech guys will break the code."

Now, let's consider the conclusion to this scenario in two separate worlds. First, the world of Jack Bauer: pop culture espionage and cryptography. The agents bring the data back, and the socially awkward but endearing tech guy (or, disproportionately often, tech girl) coolly and confidently tells them, "Don't worry, they might have some good computer geeks over there, but I'm the best there is." He or she boots up his or her workstation, hunches over a bit, and gets to work. Some very intent typing occurs, a few mutters of professional admiration for the adversary, something about an algorithm, and then a triumphant proclamation that the code is broken. The day is saved. Anybody who has watched James Bond or an episode of 24 knows all about this.



Next, take the actual real-life world. The agents bring the data back, and the socially awkward and not-at-all endearing tech (almost certainly) guy tells them simply "Sorry, if this is all you’ve got, we're out of luck."

Once upon a time, encrypting data was basically a process where two people agreed on a secret system of changing data so that plain language appeared incomprehensible. There are a lot of interesting aspects of this process (secret-key encryption), but in modern communications, secret-key methods have taken a back seat. This is because in the 1970s the concept of public-key encryption came about and revolutionized secure communications. Consider the earlier scenario. If the government officials want to use a secret key, they have to somehow exchange that key. But, of course, that key could be intercepted. The logistical problems are clear. But what if the officials could send messages that only the intended recipient could read, without ever having to meet or exchange a secret beforehand? This is the promise and reality of public-key encryption.

Public key encryption is almost universally implemented with the RSA technique, named after its inventors Rivest, Shamir, and Adleman. I won’t go into many technical details, but I’ll give a hugely simplified explanation below:

Encryption and decryption are performed by applying a function to the data (a function which makes meaningful data appear completely, mathematically random), and then applying the reverse process to undo the encryption. Both functions require a key of some sort; the decryption key must match the encryption key, or the data will remain scrambled. Consider the example of the Caesar Cipher: the encryption function shifts the letters in one direction, the decryption function shifts them back the other direction, and the key (for both directions) is the shift distance.
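
To make that concrete, here's a tiny Python sketch of a Caesar Cipher (purely illustrative; the function names and the example shift of 3 are my own choices, not anything standard):

    def caesar_encrypt(plaintext, shift):
        # Shift each letter forward by 'shift' positions, wrapping around the alphabet.
        result = []
        for ch in plaintext.upper():
            if ch.isalpha():
                result.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
            else:
                result.append(ch)
        return ''.join(result)

    def caesar_decrypt(ciphertext, shift):
        # Decryption is just the same function run with the opposite shift.
        return caesar_encrypt(ciphertext, -shift)

    print(caesar_encrypt("ATTACK AT DAWN", 3))   # DWWDFN DW GDZQ
    print(caesar_decrypt("DWWDFN DW GDZQ", 3))   # ATTACK AT DAWN

Notice that anybody who knows the key (the shift) can run the process in both directions, which is exactly why this is a secret-key scheme.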

RSA is based on the premise of a one-way function: a function that is easy to compute in one direction (multiplication) and very hard to reverse (factorization). The scheme uses a decryption function based on two prime numbers and an encryption function based on their product. Basically, to receive messages, somebody picks two secret prime numbers (the private key) and multiplies them together. They then publish the product as the public key. With the product, anybody can encrypt messages using the encryption function. However, only the recipient can perform the decryption, because only they know the private key. The genius of RSA is that its inventors came up with a pair of (en/de)cryption functions such that the public key can be generated easily from the private key (multiplication), but the private key cannot feasibly be extracted from the public key (factorization). This rests on the computational difficulty of factoring, which I won't go into here, but you can read about further in this Wikipedia article. Suffice it to say that the consensus is that factoring very large numbers quickly is essentially impossible.
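
For the curious, here's a toy sketch of that key setup in Python, using comically small primes (a standard textbook example; real RSA uses primes hundreds of digits long plus padding schemes I'm ignoring entirely, and the modular-inverse call below assumes Python 3.8 or later):

    # Toy RSA with tiny primes -- illustrative only, and wildly insecure.
    p, q = 61, 53            # the secret primes (the private key material)
    n = p * q                # 3233, the public modulus (easy direction: multiplication)
    phi = (p - 1) * (q - 1)  # 3120, only computable if you know p and q
    e = 17                   # public encryption exponent, shared with everyone
    d = pow(e, -1, phi)      # 2753, private decryption exponent (modular inverse, Python 3.8+)

    def encrypt(m):          # anyone can do this with just the public key (n, e)
        return pow(m, e, n)

    def decrypt(c):          # only someone who knows d (and hence p and q) can do this
        return pow(c, d, n)

    ciphertext = encrypt(65)
    print(ciphertext)           # 2790
    print(decrypt(ciphertext))  # 65

An eavesdropper sees n = 3233 and e = 17, and with numbers this small could factor n in an instant; with primes hundreds of digits long, that factoring step is the part nobody knows how to do in any reasonable amount of time.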

So now go back to our Iraqi officials. The first official (let's call her Alice) picks her large primes for the private key and multiplies them to generate the public key, which she makes available to anybody who wants it. The second official (Bob) writes a message (plaintext) about the status of the nuclear weaponization project and encrypts it with Alice's public key. Bob then sends the encrypted message (ciphertext) to Alice. Unfortunately for Alice, the American spies have tapped the wires, so they get the message too. They head home, confident in their techies' abilities. But back in the States, they discover the problem: there is no way to get the plaintext without Alice's private key, and since Alice didn't tell the private key to anybody, it is (in practical terms) impossible to retrieve. This has nothing to do with the cleverness of the code-breakers; it is simply a mathematical fact. And this is the reality of cryptography: it's not about the skill of one programmer against that of another, but about irrefutable math.



This is not at all to say that cryptography is an obsolete field of study. Encryption schemes can be attacked, including RSA. There are known-plaintext attacks (the method by which the Enigma encryption of WWII was broken), ciphertext-only attacks, and many more strategies. In this age of computers, code-breaking still exists.

Real Cryptography


I guess if you were to say “What’s the point?” I would tell you that cryptographers aren’t the international men and women of mystery that they’re made out to be. They’re engineers and mathematicians, like the rest of us techies, and we’re tired of them getting all the hot dates.

Friday, January 28, 2011

The Real Tony Stark

How many people can say the loose cannon genius billionaire from the Iron Man series was actually modeled on them? The correct answer is one--his name is Elon Musk.



Robert Downey Jr.'s Tony Stark from the recent Iron Man series is the outsized personality behind Stark Industries. As easily distracted as he is brilliant, he finds focusing difficult with attractive women in the room. Stark Industries' technology is unrivaled and reflects its founder's egomaniacal superiority complex and overdriven masculinity. How could Elon Musk ever stack up against such a larger-than-life movie character? Founding a billion dollar company before 30 helps. Taking those winnings and anteing up again certainly doesn't hurt--especially if the next two acts are an electric car company and a private space exploration company. Let me explain.

Musk was born in South Africa and emigrated to Canada, partly to avoid compulsory service in the military that oppressed black South Africans. Eventually he ended up in the United States. (See comments from yesterday.) He sold his first piece of software at age 12 and has had a seemingly uninterrupted string of hits ever since.

By now everyone has hopefully heard about a little company called PayPal. PayPal was born out of the merger of X.com and Confinity. Musk co-founded X.com, and after the merger he was the largest PayPal shareholder. He owned 11.7% of PayPal when eBay acquired it in 2002 for $1.5 billion--just three years after the founding of X.com... a little math... that's a cool $175.5 million in the bank. Musk took that money and retired to a quiet private island in the Caribbean and was never heard from again. Wait. I meant to say he could have retired to a private island but instead started Tesla Motors.

Musk is perhaps best known for his role at Tesla. Musk took over as CEO in 2008 after a series of technical gaffes left Tesla in a freefall toward oblivion. The company's first car was set to go on sale for $109,000 even though the cost to produce it had climbed to $140,000--not a profitable proposition. After an additional investment of $20 million, he set to work turning the company around. Musk orchestrated coups such as the purchase of an old Toyota factory worth $1 billion for a paltry $42 million and a successful IPO that infused the company with $238 million. He almost single-handedly put the company on solid footing. Musk also negotiated crucial partnerships with established automakers such as Daimler and Toyota. Most of these deals entailed licensing Tesla's battery pack and drivetrain technology for use in cars manufactured by the other companies, and they gave Tesla enough breathing room to continue developing the revolutionary Tesla Roadster.

Tesla Roadster


Accusations that the Roadster is a frivolous luxury for the ultra rich don't change the fact that the car is a brilliant piece of engineering. The car accelerates from 0-60 mph in 3.7 seconds, compared with 3.14 seconds for the Ferrari Enzo. The car's electric motors produce full torque even at 0 rpm, an advantage electric drivetrains enjoy over their gasoline counterparts. The car travels 245 miles on a single charge. However, Musk's grand plan doesn't stop at producing a billionaire's plaything. He intends to parlay the engineering savvy gained on the Roadster into an affordable consumer electric vehicle. The progression starts with Tesla's production of the Model S, priced at a downright affordable $49,000. After PayPal and Tesla, Musk probably called it quits, right? Enter SpaceX.

Tesla Model S


As if Musk needed to burnish his Tony Stark image, in 2002 he founded SpaceX with $100 million of his personal fortune. His goal was the production of reusable launch vehicles. In 2010, SpaceX employed 1100 people and became the first private company to launch a space vehicle into orbit and successfully recover it.



I haven't convinced you that Elon Musk is Tony Stark? Granted, Musk doesn't rely on an arc reactor for sustenance. His suits, though expensive, are not armored or weaponized. This side of a Marvel comic book, we're just going to have to accept the fact that we can't have a real life Tony Stark. I don't know about you, but I'll let Elon Musk stand in.

Thursday, January 27, 2011

A Technologist's Take on the State of the Union

Admittedly, Obama's speech two nights ago wasn't an Apple WWDC keynote, so I'm going to have to stretch a little bit to make the headline of the day fit the theme here. In my opinion, Obama talked about at least two topics that are important to the technology world--immigration and government transparency. Government policy has an impact on high tech companies through immigration because many of the best programmers, entrepreneurs, and visionaries come from other countries. As far as government transparency goes, technology has a profound effect on the accessibility of information.



In order to see the effect of immigration on technology, one need look no further than Silicon Valley. An uncanny number of the most influential technology companies were either founded by immigrants or featured immigrants in very prominent roles. I'm not claiming that the following anecdotes are conclusive, but here are some companies that you may have heard of. Microsoft was founded by two Americans, but one of its early architects, Charles Simonyi, the creator of Microsoft Word, is an immigrant. Hotmail was founded by Sabeer Bhatia, an immigrant. PayPal was founded by Peter Thiel, Max Levchin, Elon Musk, Luke Nosek, and Ken Howery. The first four named are all immigrants to this country. Thiel went on to be one of the earliest and largest investors in Facebook. Those involved with PayPal during its founding have had a disproportionate impact on the software industry since. Colloquially, the group has come to be known as the PayPal mafia, and the software industry would be undeniably different today without these individuals.

One last anecdote. I had dinner last night with a friend from high school who went to Stanford. He told me that one of his international college friends has to travel to Canada for three days this week before starting work at McKinsey because of visa restrictions. What logical reasoning could possibly support the requirement that a Stanford graduate on his way to work for one of the most prestigious consulting companies in the world jump through hoops in order to work in this country?

Obama did a great job of expressing exactly this sentiment. He said, "There are hundreds of thousands of students excelling in our schools who are not American citizens. Some are the children of undocumented workers... and others come here from abroad to study... But as soon as they obtain advanced degrees, we send them back home to compete against us. It makes no sense." I couldn't agree more. Immigration is a complicated issue, and I don't know enough about it to provide all the answers, but the case of highly educated immigrants seems like it should be a no brainer.

Secondly, Obama touched on using technology to increase the efficiency and transparency of government. Obama said, "Now, we’ve made great strides over the last two years in using technology and getting rid of waste. Veterans can now download their electronic medical records with a click of the mouse." This is an excellent example of using technology to make government more efficient. It reminds me of something New York City did to make non-sensitive government information available to software developers.

For the past two years, New York City has run a competition called NYC Big Apps. The city released massive databases of public data a few years ago. The challenge for developers is to make a piece of software that uses that data to make the city more "transparent, accessible, and accountable." Past entrants have included augmented reality applications that help users find the nearest subway and an application that helps parents compare schools. Outsourcing the task of making this data accessible to private developers is the only way that these applications will ever be created.

Obama went on to propose a website that would allow people to see how tax dollars are spent and which lobbyists senators are meeting with. Technology has always been a powerful force opposing cover-ups and corruption (Wikileaks, anyone?). This is precisely why the internet is severely restricted in places like China and North Korea. Putting information about government spending online will give voters more information, allowing them to make better decisions. The ubiquity of the internet makes it an obvious choice for distributing this information.

I've already run on too long, but I'll summarize. Obama's comments on technology (as scant as they may have been) were encouraging--almost as encouraging as his comments on education. If he can make progress on the issue of immigration, he could have a huge impact on technology companies, and if he wants to come through on his promise to make the government more transparent, the internet is certainly the way to do it. Now everybody go read an article about the state of the union that might actually tell you something useful about what Obama had to say.

Monday, January 24, 2011

The iPhone 4 on Verizon: Let the Games Begin

Some called it the most agonizing wait since Apollo 13 lost radio contact on reentry. Others likened it to waiting for your parents to wake up on Christmas morning. Still others claimed the anticipation was more painful than having the last Harry Potter split up into two movies. While I agree, I think this last group just goes too far. Regardless, I'm here to break the news, for the first time in any major publication, that the iPhone 4 is coming to Verizon on February 10, 2011.

I wouldn't hold it against you if you were skeptical. After all, we've heard this same rumor thousands of times before. (Wait, everyone doesn't obsessively poll the technology rumor blogs? I'm confused.) Suffice it to say, rumors of a Verizon iPhone started almost the day the iPhone was announced for AT&T. I assure you this time the rumor is true. Verizon held a press event two weeks ago at which the announcement was made, and Apple has saturated the airwaves with its ads ever since.



What does this mean for the current or prospective iPhone owner? The most important change is choice. Heretofore (yup, said it), the iPhone has only been available on AT&T's network. When Apple originally announced the iPhone would be available exclusively on AT&T for a certain period, few thought that period would last almost four years. In that span, AT&T's reputation for reliability, especially in areas of high population density like New York City, has deteriorated. Because of this, AT&T has been the butt of innumerable jokes, and Verizon has gained a reputation as the more reliable of the two major carriers. Speculators wonder how Verizon's network will react to the inevitable influx of data hungry iPhone users.

Other than carrier quality, the Verizon iPhone is almost identical to its AT&T counterpart. Perhaps the most notable difference is that the Verizon phone cannot handle simultaneous voice and data traffic--you can't browse the web while talking on the phone. Another notable difference is the AT&T version's inability to act as a WiFi hotspot for other devices. Finally, a revision to the antenna that allows the phone to operate on Verizon's network caused a few of the buttons to shift slightly, which doesn't affect usability but does require some cases to be redesigned.



You may be wondering what this brave new world will look like after seemingly impossible things like choice for iPhone users and iPhones for Verizon users become a reality. I've spent an unhealthy portion of the last two weeks wondering exactly the same thing. As tempting as it may be, I'm not backing down from the bold prediction I made earlier. Many people claim that Android phones have only done so well because they are the only smartphone option available on Verizon. I think that the latest Android phones match up very well with the iPhone in terms of utility and surpass it in flexibility. AT&T doesn't offer many of the leading Android phones, so the iPhone has never faced competition from the best Android has to offer.

Perhaps the biggest advantage that Android phones on Verizon enjoy is 4G data capability. Verizon is currently advertising its 4G network heavily and claims that it will be 10 times as fast as 3G. Several 4G Android phones have been announced for Verizon though none have yet been released. When the iPhone debuts on Verizon, it will not be able to use the 4G network. This will likely be remedied in the near future, but initially the iPhone will have to overcome this disadvantage.

Sure, the Verizon iPhone will enjoy massive popularity when it comes out in February. Sure, it will sell millions of units. However, I think that Android will continue to be the leader in mobile operating system market share--possibly after a brief dip into second place.

I wish I could say the wait is finally over, but seeing as it's only January 24th, it feels like the wait has just begun.

Friday, January 21, 2011

The Wonderful Wizard of Woz

Known variously as the Woz, iWoz, and the other Steve, Steve Wozniak is perhaps one of the most gifted and important technical minds of the past century. A friend of mine, after reading Wozniak's interview in Founders at Work, said to me in an email, "I finished with the impression that he was a truly unique individual destined to bring the personal computer to the world." I couldn't agree more whole-heartedly.

Wozniak attended the University of California, Berkeley in the late 1960s. Berkeley was one of a handful of hotbeds for computer science at the time. This was incredibly early in the evolution of the personal computer, and only the most hard-core nerds had any hope of making progress. The computer that set the computer hobbyist community on fire, the Altair 8800, was still half a decade away, and even that computer only had a few blinking lights to offer the user.



Bill Gates was 13 years old and more than a decade away from shipping MS-DOS.

After withdrawing from Berkeley, Wozniak went to work for Hewlett-Packard. He worked on the company's scientific calculator, one of the most advanced computing devices of the day. Then Steve met Steve. A partnership was born that spawned not only one of the most iconic computers of all time but one of the most iconic companies as well.

The pair began attending meetings of the Homebrew Computer Club. At the time, the name of the club was almost redundant, seeing as the first mass-produced personal computers were still years in the future. An engineer in the purest sense of the word, Woz cared much more about creating bleeding-edge technology than starting a business. In Founders at Work, Woz is quoted as saying, "We hadn't decided to start a company. Because companies weren't my thing, technology was. I'd make Xeroxes of all my schematics and pass them out and I thought 'I'll get known by doing this stuff.'" Motivated not by profit but by intrinsic curiosity and a desire to make a name for himself, Woz created computers years ahead of anything available at the time. All this in his spare time while working full time at HP.

Eventually, the consummate entrepreneur, Jobs, convinced Wozniak that they should create a company to design and sell his computers. Their first product, the Apple I, was little more than a printed circuit board, but it quickly gave way to the Apple II, an improved version of the computer Wozniak showed off at the Homebrew Computer Club.



The Apple II was designed entirely by Wozniak. Not only did he solder the chips together himself, he wrote the software that ran on the computer. In his interview in Founders at Work he describes how he wrote a programming language for a system that didn't have one: "Normally you type a computer program into a computer. What I did was I handwrote it on the left side of the pages in what's called machine language. And then I looked at a little card and I translated my program into ones and zeros on the other side." You heard that correctly. He wrote an entire programming language in ones and zeros. Today, programs called assemblers and compilers take care of this mind-numbing task. In his day, he had to do it himself.
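
To give a flavor of what that hand-translation involves, here's a rough Python sketch of "assembling" a few instructions for the 6502 (the Apple II's processor) into their byte values. The opcode values are real 6502 opcodes, but the tiny program and the lookup-table approach are just my illustration of the by-hand process, not anything from Woz's actual work:

    # A few real 6502 opcodes; Woz did this lookup by hand from a reference card.
    OPCODES = {"LDA #": 0xA9, "STA abs": 0x8D, "JMP abs": 0x4C}

    program = [
        ("LDA #", [0x01]),          # load the value 1 into the accumulator
        ("STA abs", [0x00, 0x02]),  # store it at address $0200 (low byte first)
        ("JMP abs", [0x00, 0x03]),  # jump to address $0300
    ]

    machine_code = []
    for mnemonic, operands in program:
        machine_code.append(OPCODES[mnemonic])
        machine_code.extend(operands)

    # The "ones and zeros" Woz wrote out on the other side of the page:
    print(" ".join(f"{byte:08b}" for byte in machine_code))

Doing that for an entire programming language, by hand, without a single slip, is the kind of feat that separates Woz from the rest of us.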

That is the type of technical virtuosity that makes someone "destined to bring the personal computer to the world." The Apple II was an incredible success. It is the computer that made computing palatable for the masses. Today, computers are inescapable, so take a second to appreciate the man who started it all--the Wonderful Wizard of Woz.

Only in Silicon Valley

Thursday, January 20, 2011

Comparing Apples and Androids

Yesterday I gave you a taste of Apple's seemingly unstoppable ascent during the past three years. I concluded the post with an attempt at a cliff-hanger and a promise to my booming readership that all would be resolved in today's post. Don't say I never did anything for you.

First, a bit of background for your edification. Steve Jobs unveiled the first iPhone in January of 2007. Since that time, the phone/music player/fashion necessity/utterly touchable gadget has enjoyed an unimpeded rise to world domination... Maybe not world domination, but certainly domination in the smartphone market. Topping out at around 24% market share, the iPhone was by far the most popular smartphone, and iOS was similarly dominant in the mobile operating system sphere.

This brings up a crucial distinction that is often overlooked. While hardware makers battle to promote their handsets, another, more important battle rages over mobile operating systems. The operating system is important because it determines what applications can be developed for and run on a particular phone. With so much parity between new phones from HTC, Motorola, and Apple, apps play a big role in differentiating the product in the consumer's eyes. What would the iPhone be without the hundreds of thousands of apps that run on it?

After a few years of unopposed dominance... enter Android.
Android mascot
Android is the open source mobile operating system developed by Google and other members of the Open Handset Alliance. "Open source" is a loaded term with a lot of subtle distinctions, but in this case it more or less means that the operating system is free for handset manufacturers and cell carriers to put on their phones. Top of the line operating system at an unbeatable price? Sounds good to me. Apparently a lot of handset manufacturers agree with me. Not only were more than 50 different Android devices unveiled in 2010, but over 300,000 individual Android devices are activated every day.

Why would Google spend money developing Android only to give it away for free? In order to answer this, we have to look at how Google makes money. Certainly you've noticed the text ads in the right-hand column when you perform a Google search. The vast majority of Google's revenue comes from users clicking on those ads. Certainly nobody actually clicks on those, right? The graph below begs to differ.


Which brings us to Google's rationale for giving Android away for free. Google conveniently made its own search engine the default for the Android system. More searches means more $$$.

Android home screen prominently displaying the Google search box.
So far, Google's strategy has been remarkably successful. Andy Rubin, the Android team lead, has confirmed that the Android strategy is profitable for Google. Additionally, Android has surpassed iOS as the most popular mobile operating system. Android's app store still lags behind Apple's but continues its rapid growth.

You can see that a showdown between titans is inevitable in the mobile space--almost certainly the most important space in tech right now. Which is why acting Apple CEO Tim Cook's recent comments regarding Android are so provocative.

Earlier this week Engadget reported that Tim Cook described Windows tablets as "big and heavy and expensive." Of the next generation of Android tablets, he said "today they're vapor." As for Android, Cook said, "In net we think our integrated approach is better, rather than making the end user a systems integrator."

These are bold words from a company facing a rival seemingly bent on world domination (Google) and a platform with an insatiable appetite for market share (Android). Not that he cares, but I've got a few words in response to Cook's analysis of Apple's competition.

First of all, the next generation of Android tablets is right around the corner. And this time, they'll be running the newest version of Android (dubbed Honeycomb) which has been optimized for a tablet-sized screen.

More importantly, I take issue with Cook's claim that the "integrated" approach will continue to prevail. Granted, Apple has enjoyed enormous success by developing both the hardware and the software for its products under one roof. However, I think this is unsustainable. To me, Apple is saying that it can single-handedly take on the majority of the rest of the technology industry. Google's software has proven functional and incredibly popular. Handsets from HTC and Samsung are approaching the iPhone's level of performance. Combining the two creates a more than credible challenge to the iPhone and its ecosystem. In order for Apple's integrated approach to continue to succeed, the company will have to outrun an entire industry that has been notified of the importance of mobile--the same industry that has closed most of the gap in the past two years.

If I were one to make bold predictions (and I am) I'd say that Android will continue to gobble market share. Consumers will continue to take advantage of the wide choice of hardware and carrier service available under the Android system. Developers will continue to augment the Android Marketplace until it reaches parity with the App Store. Sure, the iPhone will continue to be a popular platform and a popular product, but Apple's sole reign on top of the mobile world is coming to an end.

Wednesday, January 19, 2011

An Apple a Day

I don't want to insult anyone's intelligence here, but unless you've been living under a rock for the past three years then you know that Apple has been killing it lately. I'm willing to bet that many of you own more than one of their sleek, shiny products. In fact, I'm going to go out on a limb and say that most of you will start reading this post on your iPhone on the way home from class, continue reading on your iPad during dinner, and finish the post on your MacBook before starting homework (while listening to your iPod). It doesn't take you three hours to read 1000 words? Ok, maybe it's a contrived example, but you get the point. You personally might not own four Apple devices, but the guy in the picture probably does.

How else do you explain this? If you didn't follow the link, you missed out on every investor's dream come true. For those of you who did follow the link, yes, you read the stock chart correctly. AAPL is up 300% in the past two years. Let that sink in.

As impressive as that graph is, it completely misses the point. Apple has become a cultural phenomenon. Civilization has been afflicted with an earbud epidemic. Those white cords you see dangling from ears on the subway all disappear into a jacket or pants pocket for a rendezvous with an iPhone or iPod... you caught me... maybe it's a Zune. Apple's not shy about touting their mind share, either. They market their latest iPhone, the iPhone 4, under the slogan, "This changes everything. Again." As much as you might want to resist that kind of hubris, it's hard to argue with results. Take a minute to appreciate the unbroken string of smash hits. iPod. iPhone. iPad (the jury was out on this one, that is, until they sold a cagillion of them). Heck, throw the MacBook Air in there if you want. If you're part of the Steve Jobs cult, you're not alone. Starting with his overhaul of the entire music industry when the iPod debuted, he's been nothing short of clairvoyant--which makes this week's news all the more troubling for Apple fans.


Earlier this week, Apple announced that Jobs would be taking a medical leave. Jobs is a pancreatic cancer survivor and the recipient of a liver transplant. This is his second medical leave in as many years. Apple's stock responded to the Monday announcement with a sharp dip of 6.5%. Perhaps more than any CEO short of Mark Zuckerberg, Jobs is identified with the company he leads. His absence is worrisome for Apple, to say the least. However, if I had to pick one thing in the world to assuage investor fears, it would be a 78% surge in profit. There's nothing like hitting a $6 billion (with a 'B') homerun to help support AAPL.

How are we going to go on without Jobs convincing us that the browsing experience on the iPad is surreal or that our iPhones don't have antenna problems? Will iPods even work without Jobs sprinkling magic dust before they ship from Cupertino? I claim to have answers to some questions, but those might be impossible ones. In the interim, COO Tim Cook will run the day-to-day operations. You might think that he has big shoes to fill and that he should just lay low for a while. Quite the opposite. He wasn't shy during his first few days on the job, having already launched his opening salvo at Android. Check back tomorrow for more about the battle of the titans shaping up in the mobile world.

Sunday, January 16, 2011

Crowning Chrome

I've got to think that the internet browser is the most commonly used piece of software on a personal computer... and if it's not, then it's certainly headed there. Heck, Google has an operating system debuting that's nothing but a browser. Given this fact, I find it incredible how little thought most people put into selecting a browser. I've made a habit of asking my friends what browser they're using when I see them on their computer. I usually get a funny look, but this is nothing new, so I don't let it deter me. Their reaction probably comes from the fact that most people don't notice the difference when comparing one browser to another. This happens with a lot of software, and often the choice really doesn't matter.

Let's take one example where I believe choice does matter--text editors. Most people use word processors now, but text editors are still important. Wikipedia lists WELL over 100 different text editors. All text editors perform the essential task of writing ASCII to the disk, so what's the big deal? All the pseudo-religious emacs and vi fans out there just had a collective aneurysm. What spawns this seemingly irrational devotion? I'll give you a hint... if it's not the big things, then it must be the little things. (Or the medium things, I guess... bear with me, and let's assume that it's the little things.) The fact is that the little things make a huge difference when you're talking about a piece of software you'll likely spend years using throughout your lifetime. Hopefully I've convinced you that the seemingly insignificant distinctions I'm about to draw actually add up to a meaningfully improved experience.

Like text editors, tons of web browsers exist. Internet Explorer is in its 8th major iteration. (If you care to look at the actual URL for that link, you'll begin to understand why IE isn't my favorite browser.) Mozilla Firefox, an outgrowth of the Netscape browser that started it all, is a popular open source alternative. Safari is Apple's horse in the race. Some upstart browsers like Rockmelt are trying to gain a foothold by offering an original twist on the traditional browser. In the case of Rockmelt, the twist is tight integration with social media. What's that?... I left one out?

Chrome just rocks. As hard as it is to believe, I didn't write this post to break the news to my reader base (makes it sound existent) that there's more than one text editor out there. I wanted to write about Chrome. Here are a couple of things that make Chrome an engine of productivity and enjoyment:

1. Minimalist Design
Chrome gets out of the way and lets you get on with whatever you're trying to accomplish on the internet (presumably not staring at a huge header filled with seldom-used buttons). Tab bar. Search bar. Bookmark bar. That's it. I use these three with almost equal frequency, i.e., HIGH frequency. The point here is that Chrome shows you what you need to see and doesn't waste space showing you anything else.

2. One Search Box to Rule Them All
Instead of relying on the user to choose either the address box or the search box, Chrome simply makes the decision itself. Type something into the one-stop shop at the top, and Chrome decides whether you're typing a URL or a search term. Its decision is almost never wrong because it's not hard to tell the difference between the two.
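
I have no idea what Chrome's actual logic looks like, but a naive version of the decision is easy to sketch in Python (this is purely a hypothetical heuristic, not Chrome's code):

    def looks_like_url(text):
        # Hypothetical rule of thumb: no spaces, and either a scheme or a dot.
        text = text.strip()
        if " " in text:
            return False
        return text.startswith(("http://", "https://")) or "." in text

    for query in ["www.wikipedia.com", "why cryptographers get all the girls"]:
        action = "navigate" if looks_like_url(query) else "search"
        print(query, "->", action)

Chrome presumably does something far smarter, but even a crude rule like this gets the common cases right, which is the point: the user shouldn't have to think about it.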

3. Search Engine Shortcuts
Remember what I said about small things making a big difference? Hopefully that sunk in because if it didn't, I'm going to get laughed off the page here. Chrome has a feature that allows the user to search a site directly from the one search box. An example will clarify. Rather than typing in "www.wikipedia.com", going to the site, and typing in the search, you can just type "wiki" and then your search term. Chrome takes you directly to the page thereby saving you the intermediate stop on the Wikipedia homepage and the hassle of typing in your search term once you get there.

4. Speed
(Not the white powdery kind that you snort through a tube and that makes you scream, but the fast, powerful kind that sucks bits through the tubes onto your screen.) I'm not sure of the technical details, but Chrome loads sections of a web page in parallel. For modern web pages, this makes a huge difference. If you believe Google, Chrome is faster than a potato cannon. In my experience, Chrome isn't that fast, but it IS noticeably faster than Internet Explorer and Firefox.

Those are just a few of the reasons that I use Chrome and why you should too. If you end up switching and like it, leave me a glowing comment. If you switch and wish you hadn't, your comment will be briefly considered and then swiftly dismissed.

Saturday, January 15, 2011

Rants, Reviews, and Random Thoughts

This is it! My brand new shiny blog. If you're reading this, you're most likely a close friend who is a victim of coercion or a wayward internet surfer who has stumbled upon this compendium of rants, reviews, and random thoughts about the tech world. First, I have to credit a few sources of inspiration. The first credit is shared jointly by Joel Spolsky and Paul Graham. These two are my tech blog idols, and I can't recommend them more highly. Following their advice simultaneously will have you dropping out, starting the coolest software company on Earth, and becoming a zillionaire before 30... which only makes me wonder how they read my mind so accurately. At the risk of sending his ego into the stratosphere (probably too late here), I also have to credit Peter Mertens. Peter bravely took the plunge into the blogosphere as part of his studies abroad in Athens, but as an aspiring sports writer, I believe he plans to continue blogging. If you are interested in reading well-informed sports commentary that is undoubtedly angrier and more cynical than anything I could produce, then check out his blog here.

So what could I possibly write about? What does this newly minted computer science undergrad have to contribute to the world at large? In all probability--very little. That's not going to stop me from trying though. Seeing as my interests mostly lie in the realm of software and technology, you're probably going to be hearing a lot about that. However, I'm reserving the right to rant about whatever I damn well please.

Composing a first blog entry purely of boring meta-commentary seems a bit cliche to me, so I'm going to dive into my first entry proper. The past few weeks I've been reading Jessica Livingston's Silicon Valley classic, Founders at Work, in between studying for exams and a trip home for Christmas. Luckily, the book is composed of a series of interviews the author conducted with founding members of several prominent tech startups from the past 15 years or so (coincidentally, Spolsky is one of the interviewees). This allows me to read an interview every couple of days without completely forgetting what was going on the last time I read. The book is unique in its genre in that it tells the stories of startups in the words of the founders themselves. By reading several of these interviews, one can start to recognize patterns, which I think is the most valuable aspect of the book.

I recently finished reading Livingston's interview with Joe Kraus, the cofounder of the search engine Excite. At one point Excite was the fourth largest website in the world. The interview went a long way toward settling the debate over whether to found a company with friends or with business partners. Kraus started the business with a group of friends and a little bit of money from his parents. That's it. No idea. No technology. No company, really. He tells the story of sitting down with five of his friends and brainstorming ideas for a business all the while fully committed to doing something.

As an inexperienced entrepreneur, I'm not sure that I have any insights of my own to contribute, but I feel completely justified in throwing out my biased, underinformed opinion. It's my blog, so I get to do just that. I don't know how to put this diplomatically... I think starting a company with a group of friends would be best. Any corroboration I can get from experts or "experts" is more than welcome. Kraus's main point is that dark days will come at some point or another. During these times, having something more valuable and enduring than money holding the gang together is necessary. I don't know if this really is true, but in my limited experience it certainly seems to be.

Two of the three small "businesses" that I've started have ended in conflict over divvying up the money. In one case I was best friends with my business partner, and I don't think it's a coincidence that this scenario ended much more amicably and with both parties feeling more satisfied.

The moral of the story is that I'm going to go ahead with my plans for starting a tech company with the most qualified people I can find. If they're my friends then it's all the better. Thanks for reading if you've gotten this far, and check back for more soon!