Sunday, September 23, 2012

Jelly Bean on the GNex

By some strange miracle, the Android Jelly Bean update was released for the Verizon Galaxy Nexus today.  For the last two months I never felt the usual (and completely shameful) anxiety I get when waiting for an Android update to be approved by Verizon.  Rather, I was entirely relaxed, perhaps because Ice Cream Sandwich is still good enough on its own.  However, once I heard the news, I went nuts.  All afternoon I tried to force the update to push using various tricks I read about online.  A terrible idea, mind you, and a waste of time, so I bit the bullet and attempted Plan B: installing the update manually.

I used to do this all the time with my old Droid 1.  Someone in the community would grab the update file, which you loaded onto the SD card.  From the bootloader, you could then install the update and reboot.  It was simple, easy, and you didn't need to mess around with rooting the device.

But that was back in the Wild West days of Android, when the bootloader was wide open.  These days, most devices lock their bootloaders, though some, like the Galaxy Nexus, protect theirs with the equivalent of a screen door.  That is to say, it is, by design, incredibly easy to unlock.

At least, that's what everyone in the community claimed.  Yet when I first looked for instructions on how to do it, I came away more confused than before.  Everyone recommended downloading a variety of user-made tools that would unlock and root the device for you.  As I mentioned in my last post, I'm not a big fan of running software like this without knowing what it is doing.  For me, this isn't an option.  Personal choice aside, however, I failed to understand how unlocking the GNex could be considered easy if it required community enthusiasts to come up with the tools to do it.

I knew there was another way.  There had to be.  Why else would people say it was designed to be unlocked?  I confirmed my suspicion fairly quickly.  At least one tutorial on the subject referred to an alternate, more difficult method of unlocking, one which involved using the Android SDK.  Suddenly it all made sense.

Here's the scoop, so far as I can tell.  If you have the Android SDK, you have all the tools needed to unlock the bootloader on the GNex.  With one command, you can reboot it into the bootloader, and with another you can issue the unlock command.  Done and done.  Installing the manual update file for Jelly Bean does require some outside help, in the form of a custom recovery tool like ClockworkMod Recovery.  Thankfully, ClockworkMod is a well known and reputable piece of software, so I wasn't afraid to use it.  Moreover, once I ran it, I could immediately tell what its purpose was.  It looks just like the bootloader/recovery program from the OG Droid, the one I used to use.  My only issue with ClockworkMod was getting it running.  From what I can tell, you can flash it onto the phone using the SDK tools, but it seemed to go away once the phone restarted.  If that's the case, then I'm even more pleased, since it means I can load it up for one-time use when the need arises.
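For reference, here's roughly what the SDK route looks like, using the adb and fastboot tools that ship with it.  Treat this as a sketch rather than a recipe: the recovery image filename below is just a stand-in for whatever ClockworkMod build you downloaded for your device, and unlocking wipes the phone.

    # Reboot the phone into its bootloader (fastboot mode)
    adb reboot bootloader

    # Issue the unlock command (this factory resets the device)
    fastboot oem unlock

    # Boot the ClockworkMod image without permanently flashing it
    # (replace the filename with the recovery image you downloaded)
    fastboot boot recovery-clockwork.img

If my hunch about ClockworkMod disappearing is right, it's because fastboot boot runs the image directly without writing it to the recovery partition, so the stock recovery is back after a reboot.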

When all was said and done, I had the official Jelly Bean update on my phone, and I learned a valuable lesson.  What most Android enthusiasts consider "hard mode" is actually very easy if you're the kind of person who 1) isn't afraid to install the SDK, 2) knows how to do it, 3) has already done it for actual development use, and 4) understands what it does.

Moreover, I discovered that while the actual developers in the Android hacking community are far smarter than I, the guys who are obsessed with unlocking/rooting/flashing ROMs are not necessarily so.  Guess I have to trust my instincts a little more.

Saturday, September 08, 2012

Rolling Your Own (Source Code)


When I was first learning how to use Linux, package managers were a godsend.  I still remember the first time I saw someone use apt to download some needed software.  Here was an OS that was free as in beer, free as in speech, and backed by multiple mirrors of hundreds of software packages that would instantly configure themselves on your system.  For a moment I felt like I was living in the future.  In reality, I was living in a world where the dominance of Windows blinded me to the fact that the cutting edge of computing existed elsewhere, and it was awesome.  On a more practical note, the ease of use of Debian packages made it much easier to set up a working Linux environment.  Dare I say that if it weren't for package managers, I would never have considered Linux to be anything more than a curiosity.

Nowadays, I don't use packages that much.  As the years went by, I found myself being forced to build various pieces of software from source.  At first I simply pasted commands into a shell, but eventually I grew to understand what those commands meant, until I finally got to the point where I could grab a tarball and install it without any guidance.
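These days the routine is close to muscle memory.  For a typical autotools-based project it looks something like this (the package name is just a placeholder):

    # Unpack the source tarball and step inside
    tar xzf foo-1.2.tar.gz
    cd foo-1.2

    # Configure, compile, and install
    ./configure
    make
    sudo make install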

Then something else happened.  Building from source went from being an option to being my preference.  And it becomes more true with every passing week.

My current view is that package managers are indispensable in at least two situations.  The first is if you're a newbie, or if you use Linux for basic computing tasks.  The second is if you are running a server in an enterprise environment, where your system needs to stay up to date and simply cannot break.  If you fall into either of these camps, I'd say you're crazy not to rely on packages as much as possible.

For developers and power users, however, I think the negatives of package managers become more severe.  Namely, they take control and transparency away from your system, leading to problems that can waste as much of your time as a finicky source bundle.  For instance, what if you want a certain feature to be enabled, or disabled?  With a precompiled package, you have no say.  By the same token, what if your distro's package repository is slow to update to the latest version of a programming language?  These concerns don't affect everyone, but when they do, it can be incredibly frustrating.
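To make that concrete: with a source build, the feature set is decided when you run configure.  The flags below are made up for illustration, but most projects expose something in this spirit:

    # See which features this particular project lets you toggle
    ./configure --help

    # Turn features on or off at build time (hypothetical flag names)
    ./configure --enable-ssl --disable-gtk-frontend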

In regards to transparency, consider Synaptic Package Manager, which doesn't tell you what gets installed, or where it gets installed to, until the package is actually on your system.  This is counterintuitive to me.  The package is installed without asking you for a destination.  That means it must know where it is supposed to go.  Why can't I see that ahead of time?
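For what it's worth, the command line will give up that information before anything touches your system, even if Synaptic won't.  Something along these lines, with a hypothetical package name:

    # Simulate the install to see which packages would be pulled in
    apt-get install -s somepackage

    # Download the .deb without installing it...
    apt-get download somepackage

    # ...then list exactly which files it would place, and where
    dpkg -c somepackage_*.deb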

Furthermore, I can't shake the feeling that the dependency lists for some packages could be leaner than they are.  Far too many times have I installed what looked to be a simple package, only to find that it has to bring twenty friends along, some of which look completely unrelated.  Back when I (admittedly foolishly) tried to run Ubuntu on five gigs of disk space, I'd fill it up in a blink, until I went in and uninstalled a half dozen kernel images and a list of libraries as long as my arm.  This problem plays into a more general attitude among developers that disk space is trivial (for more examples, see Maven, Ivy, rubygems, and other programming-language package managers).  Even when it is, I don't want to fill my HDD with more data than necessary, nor do I want to spend time pruning it months from now.
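The kernel-image cleanup, at least, doesn't have to be guesswork.  Roughly (the version string is just an example):

    # List every kernel image the package manager has accumulated
    dpkg -l 'linux-image-*'

    # Remove an old one by name
    sudo apt-get remove linux-image-3.0.0-12-generic

    # Sweep up dependencies that nothing needs anymore
    sudo apt-get autoremove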

For all these reasons, I've grown to rely on building from source almost exclusively.  It lets me know exactly what's on my PC, and where it is located.  For example, I find it easier to keep track of binaries if I install them to /opt, rather than spreading them out between /opt, /usr, and /usr/local.  It also lets me install multiple versions of software and switch between them as needed.  To be fair, there are pitfalls to this approach.  Some things don't build nicely, especially on OS X.  Thankfully, most of my pitfalls have managed to double as learning experiences, and I hope that the lessons learned will make future installations smoother.
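In practice that means giving each build its own prefix and switching between them with a symlink.  A sketch, with made-up names and version numbers:

    # Install each version into its own directory under /opt
    ./configure --prefix=/opt/foo-2.0
    make && sudo make install

    # Point a stable symlink at whichever version is current
    sudo ln -sfn /opt/foo-2.0 /opt/foo

    # Put the symlinked bin directory on PATH once, in your shell profile
    export PATH=/opt/foo/bin:$PATH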

But there's one thing about building from source which bothers me more than anything else.  I'm starting to get fanatical about it.  When I tried to get Ruby set up last month, I found myself increasingly frustrated with the community's insistence on using MacPorts or Homebrew to grab needed pieces.  I felt allergic to the voodoo that the various Ruby management systems (such as RVM) practiced behind the scenes.  I even chafed at Google's insistence on making the pre-compiled version of Go a .pkg installer, which places everything in /usr/local whether you like it or not.  The way I saw it, the people who have an interest in installing Ruby don't fall into either of the two camps that benefit from package managers.  Abstraction is good in programming, but abstraction at this level can leave someone working with a tool they don't really understand.  If it ever breaks, they'll find themselves at a loss as to what to do.  As I mentioned before in regards to Ruby, convention over configuration can only go so far.  Sooner or later, you need to look under the hood.  Once you get over the initial hurdles, and your system is suddenly working exactly the way you want it, the feeling is nothing short of triumphant.

Again, I think my own stance is too strong.  It perceives the world as too black and white, and it doesn't account for any number of edge cases.  Ultimately, it doesn't really matter whether you use packages or roll your own, as long as you can do the stuff you want to do.  It's not worth getting worked up over these kinds of debates.  Not when there are stupid projects to undertake, like building Linux From Scratch.