Back to It…. Saturday, Jul 21 2012 

Yes. I was a very naughty boy. I started a blog, got overwhelmed with other stuff and stopped updating it. No more. I’m back to blog like a madman. I apologize to all of you whose posts were left in moderation limbo.

No Fences Thursday, Apr 22 2010 

I don’t believe in
fences or fiends.
I don’t own an
iPad or iPod.
I don’t
tweet or text.
I don’t use
Microsoft or Mac OS.
***
I believe in
freedom and friends.
I own a
computer and Creative-Micro-Zen.
I
call and converse.
I use
Lucid Lynx.
I do because it’s
better.

Should the Ubuntu Release Cycle Be Cut Back? Saturday, Mar 27 2010 

The end of March is near, and in about a month, we will be seeing Lucid Lynx, or Ubuntu version 10.04.

After finally cutting my last machine over to Karmic a few weeks ago (my X60 tablet), I’m wondering if Ubuntu is developing new versions too fast.  I know some may ask how development can be bad or too fast.  The answer is simple: stability (or lack thereof).

The main reason why I waited to cut over my last machine was that I was worried about having to tweak it to get my tablet functions working.  It took me about an hour to work through, and I got it pretty close to perfect.  (I can’t get one feature that worked in Jaunty, auto-rotation in tablet mode, to work in Karmic.)  The other reason why I waited was that, aside from a couple of issues, my machine was working fine, and I didn’t feel the need to upgrade.  After all, regression sucks.

But I did finally bite the bullet because I noticed my personal T61p at home was running much cooler and more efficiently after I cut it over to Karmic.  I also had a problem with that bone-jarring alert beep occurring on shutdown or when I over-deleted/backspaced (entering one too many of either) in menu dialogues with Jackalope on my X60.  The sound was so annoying I became like a Pavlovian dog; after a few days of jumping out of my seat (because of the alert) after beginning the shutdown cycle, I learned to turn off my external speakers before doing so.

So what’s my assessment of Karmic?  Actually, on my personal machines, I have had no major issues with it.  However, after my last cut-over, my feelings are mixed.  I have one issue with Karmic and dual-head mode.  Karmic seems to have better monitor detection capabilities and allows me to extend my desktop over two monitors.  The only problem is that if Compiz is running, X routinely freezes up.  After many months of routinely using my X60 (I plug into an X6 multi-base in my office), I’ve had to learn to do things differently.  I have to disable Compiz before plugging into my base.  Not a big deal; however, after many bouts of forgetfulness, I’ve needed to restart at least two dozen times.
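For anyone hitting the same freeze, the pre-dock ritual can be scripted.  This is just a sketch from my own use, assuming a GNOME 2-era Ubuntu where Metacity and Compiz can replace each other at runtime; the little function only decides which window manager to hand the desktop to.

```shell
# pick_wm prints which window manager to switch to: plain Metacity
# before docking (so Compiz can't freeze X in dual-head mode), and
# Compiz again after undocking to get the effects back.
pick_wm() {
  case "$1" in
    dock)   echo "metacity" ;;
    undock) echo "compiz"   ;;
  esac
}

# To actually switch (requires a running X session):
#   "$(pick_wm dock)" --replace &
```

Bound to a panel launcher, this would have saved me most of those two dozen restarts.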

So, for the first time in my computing life, I have decided to run alpha software to see what Lucid Lynx has to offer.  Since I was planning on fresh installs anyway, I decided to upgrade to the 64-bit version instead.  I started off slow.  I put Lucid Alpha 1 on my other Thinkpad (a T60 with integrated graphics).  Amazingly, things worked quite well with little tweaking.  It is alpha software so there were a few glitches here and there, but very minor.  Since it worked so well, I decided to install Lucid on my T61p at home (this one uses NVIDIA graphics); the Alpha 3 was already available, so I installed it instead.  Aside from some issues with the current NVIDIA drivers, the experience has been very positive as well.  I began thinking, Wouldn’t it have been better to hold off on Karmic and wait for Lucid instead?

I also began thinking about Ubuntu’s quick release cycle.  Do they need to release new versions every six months, or should they give themselves more time in between versions to try and release something more refined, say, once every year?  I’m not a programmer and haven’t the faintest idea what six calendar months equates to in computer software development time, but after using every version of Ubuntu since Feisty Fawn (not to mention several other distros in between), I find myself wondering if it is too much.  Couldn’t the upgrade cycle be cut back to allow for partial upgrades of the current version instead?  For example, instead of going to a new version of the entire OS, couldn’t new versions of apps like Open Office or Firefox be moved into the repos instead?  (I know, things we can, as Linux end users, do anyway.)

Why not give the Ubuntu developers time to work out the kinks and get things more stable?  Don’t get me wrong; this isn’t some off-the-cuff rant because I am angry or disappointed with Ubuntu.  In fact, this line of questioning is quite the opposite.  Like every good citizen of every country, contrary to what many conservatives in this country believe (they call it unpatriotic), we have a responsibility to ask questions and critique the status quo to see if there is room for improvement.  The only way something stays at the top is if it is willing to adapt and change.

I know.  You must be saying, isn’t that what the six-month cycle ensures?  Well, you’re right.  It does ensure a new version every six months, and seemingly, a change in the status quo.  The problem is of course time and quality.  Look at Microsoft as a prime example.  Although their release cycle is very slow and not a good model, how many times have they released a sub-standard product because of profit margins?  Vista was a joke because it was poorly designed, too ambitious (in terms of driver development and third-party interest), and on a timeline that it could not keep up with.  So what did MS users get?  A product that no one wanted to use or move to (like Windows Me).  People were clinging to XP or jumping ship to Mac and Linux.  I had never seen so many new Macs in my life until Vista was released.  (Which may not be a good thing either…but that’s another post for another time….)

Well, Lucid is still in alpha, and we won’t know how good it is until it is finally released in April.  So far, aside from moving the minimize, maximize, and close icons from the right side to the left, a la Apple (something that can be easily moved back), it has been very positive and promising.  The load times seem to be faster–although nowhere near ten seconds–so far I’ve been averaging about 30-35 seconds to a working desktop, and I do like the new themes and splash screen they’ve installed.  I also like the fact that the apps seem to have finally caught up; all of my favorites–Firefox, Thunderbird, Kompozer, etc.–are there in their newest versions (sans the Gimp…it is in the repos and can be easily installed).

I look forward to when Lucid is finally released in a few weeks, and regardless, I will be a happy Linux user.  But while using whatever version I finally decide upon come that time, I will still be pondering whether Ubuntu could benefit from a slightly less ambitious release cycle.  Cutting edge is good…bleeding edge…not so much….

eBook Pricing Post-iPad Tuesday, Feb 16 2010 

A colleague sent me an article from the New York Times last week discussing the likelihood of eBook prices going up.  Ever since the fabled iPad was rumored, I wondered what would happen to the eBook market.  I was greedily hoping the announcement would bring prices down like it did for music when Apple first opened the iTunes store.  (Of course, we all know what happened to iTunes prices once Apple established a strong customer base.)

Considering much of the clamor for the iPad was lauding its ability to be a multimedia device and e-reader in one, many believed the device would do to eBooks what the iPod did to music.  In other words, make book buying a cheaper and more convenient online experience.  After all, who buys CDs these days?

The interesting thing here is that Apple, in Jobs’ attempt to gain a sense of exclusivity, capitulated to publishers’ desire to set their own prices for their book catalogs on the iPad.  The deal quickly had ramifications outside of Cupertino, and Amazon soon agreed to Macmillan’s demand to raise prices of bestsellers from $9.99 to up to $14.99 a download.

Since Apple was a second (well actually more like a third or fourth) comer, they couldn’t strong-arm publishers to give them special pricing to entice customers.  Amazon had that racket nailed down already.  So instead of strong-arming the publishers, they did the opposite.  They allowed publishers to dictate the prices to them in an attempt to garner enough support to build a viable and attractive catalog for their iBooks store.

So how is Jobs going to justify higher device and media costs?  Wow factor.

If you saw Jobs’ announcement of the iPad, you noticed that much of what he focused on was the interface.  Like the Nook, you can peruse the bookstore by seeing cover shots of books in the library.  Apple, of course, upped the ante a bit by putting the books on a virtual bookshelf that leads to a secret entrance into the bookstore.  I gotta admit, cool looking.  But would it make me go out and replace my Nook with an iPad?

He also showed how a finger swipe turns the page and how the other functions work–bookmarking and the like.  Again, all interface.  I wonder how the reading experience is outside in sunlight?  Or, how long you can read before your eyes get fatigued?

Of course, he also talked about embedded videos and other mixed-media elements the iPad would be able to support because it is, well, a computer.  Sounds interesting, and it is a feature that could possibly change the way people write books (like graphic novels).  But what kinds of videos are going to be embedded in books initially?  Commercials, maybe?

So let’s review the reasons why the iPad is good for eBooks: a cooler interface, video ads, and, oh yeah, higher book prices.  And don’t forget you do have to buy the device first.  That will set you back anywhere from $500-$700, and if you want to be able to use your device away from WiFi hotspots, another $30 for use of AT&T’s already overtaxed (thanks, iPhone) 3G network.  How could anyone pass up this deal?

Well, obviously, I’m not convinced that the iPad will be as well received as many think.  I’m skeptical that there is a need for such a device.

Do you know why I love my Nook?  Because it’s convenient to carry and use, and it does its job very well.  Quite frankly, I wouldn’t want to hold a 10-inch device to read my paper while I have my lunch.  I want instant-on and an easy-on-the-eyes reading experience.  If I want to browse the web and check email, I can power up any of my PCs to do that; they all run Ubuntu, so they load rather quickly.  Besides, they all have the conveniences that make surfing the Net fun and enjoyable–a physical keyboard, stereo speakers, etc.  Okay, I admit, being able to navigate to any place on a page by pointing with my finger is a cool feature.  However, the iPad, as I see it (granted, I haven’t seen one in person), lacks one major factor that makes e-readers cool (and a viable replacement for printed books): the ability to mimic the reading experience of an actual book.  Books don’t interrupt your reading with an email alert or try to distract you from the words with embedded videos, and they won’t even allow you to surf the Net.

In terms of book pricing, don’t get me wrong; I am all for writers getting paid for their craft.  After all, without them, there would be no books to speak of.  I believe that everyone should get compensated (if they so choose) for their time and effort should they provide a valuable service or product.  However, what I don’t agree with is how publishers use the poor author as an excuse to raise prices.  If the publishers are so concerned about the author, why not take a smaller cut and give more to the author?

Why raise prices anyway?  One astute person who was interviewed in the aforementioned Times article suggested that since there are no printing, shipping, and storage costs with eBooks, it seemed absurd that publishers could justify raising prices.  Seems logical; however, the counter-argument the publishers make is that printing and handling of physical books is only a small portion of the overall publishing cost.

Unfortunately, I don’t know enough about the publishing industry to say if this is a valid argument.  All I know is that the old lesson of economics will come into play here–supply and demand.  I’m certain the publishers who inked deals with Apple are banking on the iBooks store making eBooks as ubiquitous as music bought from the iTunes store (and hence driving up demand).  The problem I see is that people are reading less today than before.  The eBook, I felt, would be one way to get people reading again.  Unfortunately, I’m concerned that the new price-point is going to alienate would-be readers.  Instead of reading, they’ll spend the $15 on the DVD version of The Kite Runner instead of Khaled Hosseini’s (much better, in my opinion) written narrative.

I don’t want to come off as some maniacal anti-Apple/Jobs Linux fanboy, but we have to realize that Apple is rebuilding what Microsoft was forced to break down–a restrictive market.  At least Gates didn’t require you to buy a Microsoft computer to run his software.  Jobs got his groove back and is willing to take on all comers.  But at what price?  You can’t use Flash on the iPad because Jobs feels Adobe is lazy.  Whether he’s right isn’t the issue.  It is the fact that you, as the end-user, CANNOT decide this on your own.

For me, the cost is more direct; I’ll have to pay more for eBooks thanks to him.

The Nook with Ubuntu Cont. Friday, Feb 5 2010 

Lee Siegel of the New York Observer likens Steve Jobs’ presentation of the iPad to Moses presenting the Ten Commandments to the Jews.  He says:

On Mount Sinai, Moses received the Ten Commandments written on twin “tablets,” then climbed back down into the desert wilderness and explained the new law to the Jewish people. Clutching his own “tablet,” Steve Jobs orchestrated his appearance before the world press last Wednesday along Mosaic lines, presenting his device as if it heralded the dawn of a new age. Not only that, but the hyperventilating media went on and on about Mr. Jobs’ talent for “shrouding” his announcements of new gadgets in secrecy. The Tablet. The Shroud. Who says that technology is any less mystifying than religion?

So now that Steve Jobs has brought the iPad down from the mount, the world can begin turning again.  I’m a little concerned about how the iPad will affect prices for eBooks.  Hopefully, all of Jobs’ back-room dealings with publishers don’t drive the prices up too much.  We’ll have to see.  I did see a recent article suggesting that Amazon will soon be raising their prices because of new deals with publishers who are asking to set their own prices (it seems the iBooks store has given them more leverage, and they’re beginning to assert it).  Regardless of what happens, I’m not giving up my Nook just yet.

I wanted to follow up and talk about my month-long experience using the Nook.  So far, I love it!  I haven’t found anything that has shaken my confidence in the device.  Just so that people don’t think I’m some crazed Barnes and Noble shareholder or hardcore fanboy, I will admit that I had a couple of incidents of freezing.  I did hard restarts both times, and everything worked fine afterwards.  The freezes occurred when I was opening a new eBook.  While it was formatting, it seemed to freeze.  Of course, maybe I was just being impatient.

Regarding some earlier complaints that the OS was slow and buggy: I’m not sure if the new version of the software fixed many of those issues or not; however, I’m pretty satisfied with the speed and response of the device.  It’s not setting any new speed records; however, in terms of what it needs to do, it seems just fine.  When books or documents are first opened, the Nook does need to format them to whatever parameters the reader sets, so that may take a few seconds to do.  But come on, how fast does it need to be?  The act of reading should be an enjoyable and relaxing activity.  The few seconds the Nook takes to format a book gives me an opportunity to sip my coffee.  Just because it’s an electronic device doesn’t mean the cold numbers of diagnostic tests should determine its worth.  The quality of the experience is more important when assessing a device like this.  Take NASCAR, for example: the fastest car in qualifying isn’t always the best car in the race because of numerous factors–like handling, how the car reacts in traffic, etc.  Same here.  I’m not editing or ripping videos–speed isn’t that big of an issue…unless, of course, it’s freezing.

I also wanted to clarify something I mentioned in my earlier post.  I stated that although I was able to copy some PDF files to my Nook, they did not render properly when I opened them.  I am happy to say that I can retract that statement.  I reopened my PDF documents and changed the font to “extra small,” and guess what?  It rendered perfectly–exactly as it looks when printed.

I also promised to check the audio player, so I converted some of my OGG and WAV music files to Mp3s and transferred them to my Nook.  The files played without a hitch.  The nice thing is that you can play the files while you’re reading.  Although I wouldn’t suggest people do any hardcore reading while listening to music, it is nice to be able to play some background music while perusing the newspaper.
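For the curious, the conversion itself is easy to batch from a terminal.  This is a hedged sketch of what I mean, assuming ffmpeg is installed from the Ubuntu repos (sudo apt-get install ffmpeg); the 192k bitrate is my own choice, not anything the Nook requires.

```shell
# mp3name maps any audio file name to its MP3 counterpart,
# e.g. song.ogg -> song.mp3, track.wav -> track.mp3.
mp3name() { printf '%s\n' "${1%.*}.mp3"; }

# convert_all runs ffmpeg over every file given on the command line,
# writing the MP3 next to the original.
convert_all() {
  for f in "$@"; do
    ffmpeg -i "$f" -ab 192k "$(mp3name "$f")"
  done
}

# usage: convert_all ~/Music/*.ogg ~/Music/*.wav
```

After that, the MP3s just get dragged onto the Nook like any other file.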

In terms of material available, so far, I’m happy.  I like that I can download a lot of free content.  Many of the classics are available as downloads through B&N’s bookstore.  I also bought a few books and am currently doing a 14-day trial with the San Jose Mercury News.  I wanted to get a NY Times subscription, but decided that $10 a month is a little too pricey.  I can look at the Times online for free.  The SJMN subscription price, on the other hand, is quite reasonable at $5 a month.  I do like the paper because they are based in Silicon Valley, so they have a lot to say about the tech industry.  The Bay Area is also much closer to me geographically, so many of the articles have a little more relevance too.  I will more than likely keep the subscription even after the trial period ends.

Lastly, and most importantly, synchronization and managing my book/music library has worked very well.  As I said before, Ubuntu and my Nook work fine together.  Since Ubuntu mounts the Nook and the installed SD card as mass storage devices, moving files is easy and painless–drag and drop between file browser windows.  So far, I haven’t had any issues working with my Nook in Ubuntu–I wonder if Moses’ iPad can claim that.
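Because the Nook is plain mass storage, the same drag-and-drop works from a terminal too.  A sketch from my setup; the mount point and folder name below are assumptions from my machine, so check what Ubuntu actually mounts for your device (the file browser or the mount command will tell you).

```shell
# nook_copy copies files onto the Nook, refusing to run if the
# device isn't mounted where we expect it to be.
nook_copy() {
  dest="$1"; shift
  [ -d "$dest" ] || { echo "Nook not mounted at $dest" >&2; return 1; }
  cp -v -- "$@" "$dest"
}

# usage (mount point is an assumption -- adjust to yours):
#   nook_copy "/media/NOOK/my documents" ~/essays/*.pdf
```

The guard clause matters: without it, cp would happily create a local directory named after the mount point when the Nook isn’t plugged in.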

Unicomp Endurapro: Quite a Clacker Sunday, Jan 17 2010 

As promised, I wanted to share with all of you my experiences with my new toy.  Part of the reason why I was posting so little in the last few weeks was because my Lenovo multimedia keyboard began failing.  I first noticed the problem when I was trying to input an address around the Christmas holiday.  Every time I hit a key in the second row of the number pad, the number or the symbol would not register, a problem I could easily work around for the short-term.

But I also noticed, and I thought this was quite odd, that my semicolon and right arrow keys were also malfunctioning.  Now these routinely bothered me.  The semicolon is a fairly common punctuation mark in the English language, and, although it isn’t as common, its key mate, the colon, also gets some regular play on my keyboards.  And I use the arrow keys quite often to navigate documents and webpages.  So in order to use either key, I would have to take my hand away from my keyboard tray and reach up to the top of the writing surface of my desk to hit the corresponding key on my Thinkpad.  (Yes, I know I could use the keyboard on my laptop; however, my TP sits pretty high on my desk and, because of that, isn’t very comfortable to type on.  Besides, I’m a creature of habit; I like to do things a certain way.)

The funny thing was that I was thinking of replacing my Lenovo keyboard anyway.  Although there were many things I liked about it, I didn’t like the fact that I couldn’t make all of the multimedia keys work with Ubuntu.  I kept trying to remap the keys; however, I wasn’t able to.  What’s the use of having a keyboard with email, internet, and app shortcut keys if you can’t use them?  Luckily, the volume and the page forward and backward buttons worked fine.  I got the idea of replacing my Lenovo keyboard when I saw a rebadged Logitech keyboard on the ZA Reason website.  (By the way, the ZA people seem very nice.  I spoke to someone there, and she was very polite, patient, and helpful.)  That particular keyboard, the Logitech Internet 350, didn’t require special software to make the multimedia keys work with Ubuntu–plug-and-play right out of the box.  The ZA people also added a nice touch–Ubuntu logos and keys.  Yes, that’s right, no Windows keys to mar your FOSS aesthetics.  But once I calculated what it would cost to send the keyboard to me here in Hawaii, forget it.  The shipping was almost the same price as the keyboard itself.

So instead, I went to the local Walmart (we don’t have many choices when it comes to tech purchases on this island) and bought a Logitech keyboard there–not the 350, mind you.  I got a really good price for it and was initially satisfied.  However, when I got home, I realized that I had forgotten to take one thing into consideration.  My mouse has always been connected to a USB hub on my keyboard since the day I got it.  So I was faced with a couple of problems.  First, I use a Logitech travel mouse, and the cord is fairly short.  It would have been impossible to reach the USB port on my TP from the keyboard tray.  That meant I would have to place my mouse on the desk surface separate from the keyboard, where it would be uncomfortable to use.  The second, and more important, problem was that I only have 3 USB ports on my TP T61p; I didn’t want to dedicate two-thirds of my USB ports just to facilitate the use of a keyboard and mouse.  Strike one.

So I was left with a couple of choices: 1. Buy a USB hub to connect the keyboard and mouse to a USB port without removing either from the keyboard tray, which would leave the other ports open for peripherals, or 2. Get another keyboard that had a built-in hub, or that was wireless and came with its own mouse.  So I returned my initial purchase and bought another Logitech board from the local K-Mart.  This one came with a wireless keyboard and mouse.  I spent about three times as much, but hey, I needed it.  When I got home, I found that the wireless dongle had two cables coming out of it–a USB connection (for the keyboard) and a PS/2 connection (for the mouse).  PS/2?  Who has PS/2 ports anymore?  (Well, my old Dell laptop still has a PS/2 port, so I guess I still do.)  Strike two.

I returned that keyboard to K-Mart and decided that I would need to order a keyboard from the Net instead, which was okay considering places like Amazon usually have options for free shipping.  But it wasn’t as easy as I thought it would be.  I began searching my options and kept coming up with 60 dollar keyboards that would cost me 80-90 bucks by the time it got to me.  Strike three.  If I was going to pay a premium price, I wanted a premium product.

So I decided to research.  As with many of these searches, I began with the ubiquitous Google.  I got many hits for “best keyboard”; however, the most common names that kept coming up were Das Keyboard and Unicomp (or Model-M).  So after researching both manufacturers’ sites, I thought I would go with the Das.  Their board had a modern look.  It also included a built-in USB hub, multimedia keys, Linux support, and it used old-school mechanical switch technology for key input.  Yep!  Click, click, clack….  No quiet, mushy rubber dome stuff here.  But before I made my purchase, I did a few more searches and found that some people were claiming that their Dases required a lot of power and were rendering other USB ports on their machines unusable when the boards were plugged in.  Now, I don’t know about you, but I like having USB ports, and I don’t want them being tied up by a keyboard.  So I went back to Unicomp to review their lineup.

Now, to understand who Unicomp is, you have to know that they bought the design for the OG IBM Model-M keyboards from Lexmark, which was spun off from IBM.  If any of you remember the Model-Ms, they were those keyboards that were big, heavy, and had those really long cords with the coiled ends.  They were solid and well-built.  Granted, they don’t quite have the same flair and styling of modern boards, but they’re built like tanks.  In fact, the original Model-Ms, those with the four or five function keys on the left side, had a metal plate at the bottom which prevented flexing.  Metal plate=tank!

Unicomp’s lineup sticks to the roots of the Model-Ms, so they look like throwbacks.  They use buckling-spring switches, so they have that terse feel of those old boards.  They have added in some of the newer IBM keyboard designs, so they have models with built-in stick mice and smaller space-saving designs too.  What I was most impressed by was that Unicomp offered the ability to order some of their keyboards in Linux-friendly layouts–I’m guessing for programmers.  They also allow you to customize their keyboards.  I even found out that for you pomaceous fruit users, you can request, although it isn’t listed as an option on their website, Command keys to take the place of those offensive Windows keys.  There were a couple of problems with the Unicomp lineup though.  First, their keyboards don’t include built-in USB hubs (some models that use PS/2 connections have an additional port on the side of the keyboard).  The second problem: it would be costly to ship the keyboard from Kentucky to Hawaii.

But because I was getting sick of a dying keyboard and didn’t want to settle for a cheap rubber dome one again, I bit the bullet and bought a Unicomp.  The Endurapro with the built-in pointing stick and mouse buttons, to be exact.  Initially, I thought I could forgo an external mouse altogether, but realized that I do need the mouse when I edit photos or images.  That meant I was back at square one.  I had an awesome keyboard coming, but I was going to have to tie up two USB ports and would have to place my mouse where it would be uncomfortable to use.  I knew I had to get a hub or something that would allow me to keep everything where it was, and to keep my ports open for other peripherals.  Luckily, I was able to buy a cheap port replicator that had powered USB ports.

When my keyboard finally arrived, I was very happy to find that it worked well with the port replicator.  The mouse also plugged in fine and was recognized.  Another nice perk with the replicator was that I was able to plug my external speakers into it instead of the jack on my laptop.  That meant fewer cables cluttering my desk.

Another nice touch was that I was able to get a couple of blank keys for my Endura.  So guess what?  No more waving flags on my keyboard!  I was amazed how easy it was to change the keys.  I used a dull butter knife with some paper towel wrapped around it to pop the original keys off.  I slid the knife between the bottom of the key and the top of the lower casing.  I then gently twisted the knife and jiggled it to get under the key.  Without too much force, the key popped right off.  I took the blank key and centered it over the spring inside and pushed it back on.  Beautiful!

As for performance, the keyboard is great.  It has a very nice feel.  Firm key presses are needed to get the characters to register, and you get clear audio feedback.  Some will find the noise a little irritating, but I’m sure it’s something you’d get used to after a while.  If you have sensitive people around you, tell them to wear ear plugs.  The sound has a kind of machine-gun-like quality.  Because there isn’t a metal plate like the OG Model-Ms to dissipate the sound of the key presses, the Endura sounds a little hollow.  I think the sound reverberates inside the plastic casing of the keyboard, hence that hollow sound.  Nonetheless, the keyboard is awesome.  If I had known I was going to use a port replicator from the start, I probably would have ordered the Space Saver instead of the Endura.  It’s kind of cool to have the pointing stick there on the keyboard should I need it; however, I find that I will accidentally hit it when I type.  No big deal as long as it doesn’t break.

I’m a happy camper.  I have a great keyboard and no more Windows keys, and in actuality it didn’t cost me that much more than a modern wireless keyboard/mouse combo would.  I know.  I don’t have all the multimedia keys anymore, but heck, I’ll learn the keyboard shortcuts instead.  Either way, I have a great setup once more!  Time to blog!

The Nook E-Book Reader with Ubuntu Saturday, Jan 16 2010 

Hello blogosphere.  It feels good to be back.  I got caught up in a new semester and all the work that goes along with it and haven’t been able to write.  I have good news though; my Nook e-book reader arrived yesterday.  I was quite surprised.  It wasn’t even supposed to ship until the 15th.  Imagine my happiness when it arrived at my door instead.

I had been neglecting my family recently because of semester preparations, so I couldn’t open the package right away.  Instead, I put it on my desk and waited a few hours.  When they went for a walk, I tore in.  It was kind of anticlimactic when I did.  When I got the shipping box open, I was happy to find everything inside.  It comes in really pretty packaging; however, that same packaging, like the kind they love to use these days (think toys…they come in that heat-sealed plastic with wire twist ties and plastic zip ties), is horrendous.  It took me almost 20 minutes to cajole my Nook out of its package.  First, you have to fight through the hermetically sealed plastic, then the lexan-like box, and finally the plastic tray the Nook is cradled in.  It really is pretty; however, it’s overkill considering what it takes to get the Nook out.  Once I did though….

The Nook is beautiful.  It has a sleek look, rounded corners, beveled bezel with a continuous smooth look (even where the page advance buttons are), and don’t forget the color touch screen at the bottom.  Once I put it into the green Italian leather cover I bought…forget about it!

First things first, I chose the Nook over the Kindle for a few reasons I discussed here.  To save you the trip, I’ll briefly review them.  First, I liked that the Nook allows users to expand the memory by installing micro-SD cards.  (You can go up to 16GB.)  I assumed the Nook engineers were smart enough to put the card slot in a highly accessible place, like at the bottom of the unit or on the side.  Unfortunately, I quickly found out, after I ordered mine, that the card slot is actually located under the back cover where the battery is.  I was greatly disappointed by this because I was planning to save documents onto SD cards and then plug them into the Nook to read at my convenience (mostly for student essays and other written work–after all, the Nook supports PDFs natively–see reason two).  But putting the card slot under the back cover put that plan to rest.  I am not going to remove the back cover every few days to get my SD card out.  Ultimately, the back cover isn’t designed to be taken off regularly and takes some effort to do so.

The second reason I chose the Nook was that it supported PDF, ePub, and MP3 formats natively.  Unlike the Kindle, you don’t need third-party software to convert files to make them render or play on the Nook.  This was important because I did not want to be forced to buy all my books exclusively from B&N.  I like having choices.  In fact, and this leads into my last reason for choosing the Nook, my initial plan was to read and reread many of the classics (and other texts) that can be found on Google Books and in the public domain free of charge–a service that comes with the Nook.  I guess choosing Android gave them access to Google’s collection.  Don’t get me wrong though; I’m not against buying books and probably will purchase my share.  However, I do plan to read what’s free most often.

The Nook has gotten mixed reviews so far.  Many find the touch-screen interface laggy and slow.  I’m not much of a touch-screen kind of guy, so I’m almost certain the lagginess won’t bother me too much.  As long as I can use the interface, download books, and read them, I’ll be happy.  So far, so good.  I have already downloaded five books and am about 70 pages into one of my favorites, Don Quixote.

The one thing I was most concerned about up until this morning was the Nook’s ability to connect to and be managed by Ubuntu.  Because my original plan to swap SD cards regularly was dashed by the placement of the card slot, I had to bank on the Nook working with my Linux boxes.  B&N has software downloads for MS and Mac that allow both systems to download, render, and manage eBooks from B&N.  Unfortunately, they don’t offer similar software for Linux users.  I was a little concerned that the Nook would need such software to allow management from a computer terminal.  Luckily though, I can confirm that I connected my Nook to my Thinkpad this morning and Ubuntu recognized it as a mass storage device.  A file manager dialog opened as soon as I plugged in the USB cable.  I could see the individual folders on my Nook.  I was able to move two PDF files from my hard drive to my Nook without a hitch.  One file had a lot of pictures, and the other was basically a text file I created in OpenOffice and converted to PDF.

The first file opened; however, the formatting was not correct.  The captions under the pictures moved, and the alignment of the photos changed.  The pictures rendered, and they looked like photos in a black and white newspaper–nothing spectacular, but acceptable.  The second file also opened without a problem; however, the formatting changed a bit too.  Text that was center aligned in the original file moved to the left, and the numerals in the header did not render at all.  Although many would be disappointed by this, I wasn’t.  The main reason I would be opening a PDF on my Nook would be to read a student essay during times I was not sitting at a terminal.  My plan was never to do a lot of editing and commenting while on my Nook.  When I grade student essays, I like to read them at least a couple of times.  Being able to convert them to PDFs and render them on my Nook will allow me to read them while I wait for my kids or wife in the car or while I commute to work in the morning.  I will be able to do a lot of my first reads during these times, thus allowing me to do the more involved readings sooner.
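For anyone curious, the convert-and-copy routine can even be scripted.  This is just a sketch of my own workflow, not an official B&N procedure: it assumes an office suite whose soffice binary supports headless conversion (older OpenOffice.org installs may need unoconv instead), and the Nook mount point and "my documents" folder name are assumptions you should verify against what your file manager shows when the device is plugged in.

```shell
# Batch-convert a folder of essays to PDF, then copy them onto
# the Nook, which Ubuntu mounts as plain mass storage.
SRC=~/essays
for f in "$SRC"/*.odt; do
    out="${f%.odt}.pdf"                       # essay.odt -> essay.pdf
    soffice --headless --convert-to pdf --outdir "$SRC" "$f"
    cp "$out" "/media/NOOK/my documents/"     # mount point is an assumption
done
```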

I did not move any music files, but when I do, I’ll let you know how it goes.  Unfortunately, many of my favorite music files are in OGG format, so I’ll have to convert them to MP3 first before they have any hope of working on my Nook.
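When I get around to the conversion, something like the following should do it.  This is a hedged sketch: it assumes an ffmpeg build compiled with libmp3lame support (not all builds include it), and the bitrate is just my guess at a reasonable setting.

```shell
# Convert OGG files to MP3 so the Nook can play them
# (the Nook supports MP3 natively, not OGG).
for f in ~/Music/*.ogg; do
    # song.ogg -> song.mp3, encoded at 192 kbps with LAME
    ffmpeg -i "$f" -acodec libmp3lame -ab 192k "${f%.ogg}.mp3"
done
```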

I feel fairly confident that I’ll be happy with my purchase even after the fabled iTablet is released in the next couple of months. My electronic arsenal will still be free of panes of glass and forbidden fruits!

By the way, I received another cool product recently; in fact, I’m using it right now.  I’ll be writing about my impressions of it soon.  (I’ll give you a hint; my Lenovo multimedia keyboard began failing about four weeks ago, and I went old school!)  Happy new year!

HP Mini-Note 110: Installing Ubuntu Karmic & Wireless Issues Tuesday, Dec 29 2009 

I have been monitoring the traffic on my blog and noticed an increase in interest in my post about helping my student install Ubuntu Karmic on her HP Mini-Note 110.  Apparently there are quite a few people who are experiencing problems installing Karmic on their Mini-Notes.  One issue that seems to be common is the Broadcom wireless card not working properly.  (Actually, the problem is getting the right driver installed.)  I don’t own a Mini-Note and have worked on only one, so my experience is limited.  Regardless, I have decided to share my sole experience in hopes that it can help any of you who are experiencing problems.

I want to be clear.  I am not a Linux/Ubuntu expert, and what I share here is based on my acquired knowledge (which isn’t a lot) and experience.  I do own a Dell Mini-9 which uses the same Broadcom card as the HP Mini-Note (as far as I know) and so have had to troubleshoot similar issues.  I don’t want to sound ominous; however, please use my suggestions at your own risk.

First of all, let me give you a list of hardware and software I used for the install.

  1. A 2 GB Kingston DataTraveler USB thumb drive,
  2. An ISO image of Ubuntu Karmic (9.10)–I used the RC version of Karmic, which may be the cause of the wireless issues,
  3. Unetbootin installed on a box–you will need this to make the bootable thumb drive,
  4. An active LAN connection,
  5. and the A/C adapter for the Mini-Note.

Step 1: Make a bootable thumb drive using Unetbootin.

  1. First, download the latest ISO image of Ubuntu onto your computer.  (You can copy it anywhere you want; I usually copy directly to my desktop.)
  2. Second, install Unetbootin.  If you’re using Ubuntu, you can find the package in the repos (System > Administration > Synaptic Package Manager; search for Unetbootin, mark it for installation, and then install).  Or, from the terminal, type: sudo apt-get install unetbootin.  If you happen to be using Windows, shame on you.  Nah, just kidding.  Install the package from this website.  I haven’t done this for a while, but I believe you either run the program as it downloads or launch the .exe file after you download it.
  3. Once you install Unetbootin, place your thumb drive into an available USB slot and launch the program.  In Ubuntu, from the terminal, type: sudo unetbootin, or use the GUI by going to Applications > System Tools > Unetbootin.  The app should launch, and you will see the window below.

    Unetbootin

    Choose the second option: Diskimage.  In the dropdown box, choose ISO image, since that is what you have.  Then click on the browse button (the square box at the end with the ellipsis inside it) and find your ISO image.  Ensure that the Type at the bottom is set to USB Drive.  Also, ensure that the correct drive is selected (it should say something like /dev/sdb1).  If you have more than one thumb drive plugged in, make sure that you choose the correct one.  To make things simpler, I would suggest unmounting and removing all other unnecessary thumb drives.  Click OK and wait.  The install shouldn’t take too long.  Once it is done, follow any instructions and remove the thumb drive.
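If you’d rather skip the GUI entirely, Unetbootin also accepts its settings on the command line.  Treat this as a sketch only: the option names below are from Unetbootin’s command-line mode as I understand it, so verify them against your version’s help output, and the ISO path and /dev/sdb1 target are assumptions you must replace with your own (double-check the device with mount or sudo fdisk -l so you don’t write to the wrong drive).

```shell
# Non-interactive equivalent of the GUI steps above.
# ISO path and target partition are assumptions -- change them.
sudo unetbootin method=diskimage \
    isofile="$HOME/Desktop/ubuntu-9.10-desktop-i386.iso" \
    installtype=USB \
    targetdrive=/dev/sdb1 \
    autoinstall=yes
```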

Step 2: Installing Ubuntu onto HP Mini-Note

  1. Plug your newly created Live thumb drive into a USB slot on your Mini-Note 110.  Power it on.  As soon as the BIOS screen appears, press F10 to change the boot order.  Using the arrow keys, choose to boot from your USB drive.  Choose the default image, and Ubuntu should load.  If you want to play around before you install Ubuntu permanently, you can run a live session.  (I did have some problems with the Broadcom card, so I wasn’t able to get very far during the live session.  As soon as I launched Firefox, the system froze and/or crashed the X session.)
  2. From here you can double-click the install icon on the Desktop.  Make sure you have plenty of battery left or plug in your A/C adapter; the install can take a long time.  You will be prompted to answer some questions.  All are pretty self-explanatory.  However, when you are asked how you want to partition your hard drive, you should spend some time determining how you will be using your netbook.  You can install Ubuntu on the entire drive, or you can install it side-by-side with your original OS.  In many cases, the pre-installed OS will be Win XP.  If you need XP for any reason, you may want to choose the side-by-side install.  This will partition your hard drive evenly between XP and Ubuntu.  You can actually manually partition your hard drive and give one partition more or less space.  It takes a little more work, but you can customize as you see fit.  Since I don’t need XP, I would just install Ubuntu on the entire drive.  (Although, I would consider setting up a Home partition.  See this helpful blog for more information on the best method for partitioning your drive.)

Step 3: Installing the Broadcom STA Proprietary Driver

  1. After you have successfully installed Ubuntu, you will be prompted to restart your computer.  Make sure to remove the thumb drive when prompted.
  2. Next, once your system has rebooted, connect your live wired connection to the Ethernet port on the Mini-Note.  All I did to get the Broadcom driver to install properly was run an update (System > Administration > Update Manager).  There will be quite a few updates to install, so this step may take a while.
  3. After you run the update, you will undoubtedly be prompted to perform a restart.  Do so.
  4. Once the system restarts, log in, and then choose System > Administration > Hardware Drivers.  You should see something similar to the following.  Yours should show the Broadcom STA driver, however.  In the following example, you see the NVIDIA driver listed because it is the one I use on my trusty Thinkpad.

    Proprietary Hardware Drivers

    If you don’t see the driver listed, follow the suggestions from the above mentioned blog.  Although the author’s blog deals with Ubuntu on the Dell Mini 9, the steps for installing the Broadcom driver should work on the Mini-Note too.

  5. You should be home free from here.  Disconnect your wired LAN connection and test your wireless card.  You should be able to connect to all open and encrypted networks.  (My Dell Mini can connect to WEP, WPA and WPA2 encrypted networks.)
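For those who prefer the terminal, the steps above can be approximated from the command line.  A caveat: bcmwl-kernel-source is, as far as I know, the package the Hardware Drivers tool pulls in for the Broadcom STA driver on Karmic, but verify the name on your own machine (apt-cache search broadcom) before relying on it.

```shell
# Terminal equivalent of steps 2-4 above.
sudo apt-get update
sudo apt-get install bcmwl-kernel-source   # Broadcom STA (wl) driver, assumed package name

# After rebooting, confirm the wl module is loaded and that a
# wireless interface shows up:
lsmod | grep wl
iwconfig
```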

Good luck!  I hope this helped…

*By the way, have you ever wondered what the numbering convention is for Ubuntu?  The first number tells you the year, and the second the month it was released.  Since Canonical has promised to release new versions every 6 months, you will usually see version numbers ending in .04 and .10…since the official cycle began in October 2004 (although there has been at least one version that went beyond the 6-month cycle–6.06).  So Lucid Lynx, which will be released in April of 2010, will be numbered 10.04 (or 2010.April).  It will also be designated LTS…more on that another time…
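The year.month scheme above is simple enough to sanity-check with a line of shell (the function name here is just for illustration):

```shell
# Derive an Ubuntu-style version number from a release date.
# Ubuntu drops the century and doesn't zero-pad the year, but
# does zero-pad the month.
version() { printf '%d.%02d' $(($1 % 100)) "$2"; }

version 2010 4    # Lucid Lynx  -> 10.04
echo
version 2004 10   # first release -> 4.10
echo
```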

Saints Lose (Part II) Tuesday, Dec 29 2009 

When I saw the Saints/Bucs score Sunday morning, I was relieved–Saints 17, Tampa 0.  My first thought was that the loss to the Cowboys paid off.  The Saints have been struggling at the beginning of the last few games, and consequently, have fallen considerably behind.  With Drew Brees commanding that potent offense, they have been able to mount comebacks in most of those games.

But this week, they were able to put up 17 first half points, and I felt very confident about the outcome.  What did I need to be worried about?

But lo and behold, when I went to ESPN.com to see the carnage the Saints unleashed on the Bucs, I saw this headline instead: “Bucs Stun Saints in OT.”  They lost.  Again.  To another mediocre team.

I don’t like to disparage other teams–tends to come back to haunt me–but the Bucs are not a very good team this year.  How could the Saints lose?  That’s a rhetorical question of course, and I know the answer…

I don’t feel as devastated as I did last week.  I kind of accepted that despite what all the NFL experts were saying for the last few weeks, the Saints’ opponents had exposed some of their weaknesses.  I think the game against the Dolphins was the first that indicated there were some issues on defense.  Had it not been for a quick-strike offense, I think the Saints would have had a couple more losses this year (the Redskins’ game for sure).

Sean Payton may want to consider shoring up the D a little before the playoffs.  I’m not too sure who is available, but they need to get more pressure up the middle.  The secondary has been getting a lot of blame for the Saints’ porous D, but I think the problem is deeper than poor secondary play.  QBs seem to be able to roam the pocket against the Saints’ D, and running backs have been having career games.  The D did get a few sacks this week, but in general the pressure isn’t consistent enough.

But it’s not just the defense that needs to make adjustments.  There are a couple of key areas where the Saints’ offense needs to improve: first, the running game, and second, third-down conversions.

The lack of a running game has definitely hurt the overall offense.  Drew Brees had been looking godlike all year long but, since the first Falcons game, has been pressing a bit.  He hasn’t looked as efficient, and I believe he’s trying to make up for the struggling running game.  I’m not taking anything away from the running backs; I think Heath Evans, Pierre Thomas, and Mike Bell have done a particularly good job this year.  (And it is unfortunate that they have all gotten hurt at some point.)  However, in recent games, they haven’t been able to assert themselves quite as well as in the beginning of the season.  I think opposing D-lines are hedging a bit.  Have you noticed how many batted balls Brees has thrown recently?  Have you noticed that he’s given up more INTs?  Although the NFL has become a pass-happy league, you still need a running threat to keep defenses honest.

And these issues lead into my next point: third-down conversions.  Because opposing Ds have been able to play the pass, the Saints have begun to struggle with third-down conversions.  In fact, I have noticed an increase in 3-and-outs.  Again, a running game that would keep defenses home would help them considerably.

Now, I’ll admit: I’m not working from any kind of careful statistical analysis and am basing my opinions on what I personally witnessed.  I haven’t studied the statistics and, quite frankly, hope I’m completely wrong.  I hope that without a whole lot of tinkering, the Saints will march right into South Florida and take the Superbowl with a dominating performance.

The only thing I do know is that I’m nervous.  I’d feel much better if the Saints handily beat the Panthers this week.  And not in a shootout where both teams put up 300+ yards passing and score over 20 points.  But a game in which the Saints dominate on offense, both on the ground and through the air, and on D by holding the Panthers under 10 points.

The Saints really need to get going and regain some momentum going into the playoffs.  I know they are capable!  Go Saints….

How Ubuntu (Linux) Changed My Life Thursday, Dec 24 2009 

Ubuntu changed my life!  I’m smarter, cooler, and more ethical.

Sound too good to be true?  Well, it isn’t.

I began using Linux about 2 years ago.  Before that, I was a long time Windows user.  (And before that, an Apple user.)  Until my Linux days, I was a pretty typical PC user.  I would word-process documents, play a few games, and later, check email and surf the net.  I really didn’t care about how things worked; I just wanted them to work.  And hence, my Windows days were nothing pioneering or interesting.

In general, I would consider myself to be a pretty inquisitive person. I have always been interested in how things worked, but for some reason, probably because I saw them as too complex, I did not take an interest in knowing how computers worked.  The movie War Games probably had something to do with this too.  I didn’t need or want the FBI knocking down my door.

But I did take a few computing classes growing up, so I knew some of the basics.  But by the time I was in high school, I was more interested in getting my essays word-processed than understanding computer programming.  I figured the computer geeks could worry about that stuff.  Anyway, by that time, my dreams of being a computer scientist were already dashed by complicated math.

It wasn’t until I was in college that I realized how important technology, specifically the personal computer, had become. Aside from the obvious advantages of having a word-processor and other applications that made the production and editing of documents much easier, technology was linking disciplines and essentially changing them. The expectations of what people could produce quickly grew as documents became more sophisticated in appearance.

But even though I realized this importance, I accepted that technology had passed me by, and I would be destined to be a casual end user.  So I did the safe thing: I used Windows.  I knew the interface, it did the job most of the time, and it was available in all of the computer labs on campus.  There were a few times when I considered going back to Apple, but once I figured out what it would cost, I decided to stick with the PC platform.  Besides, I had already begun to dabble in open source applications, so I knew there was plenty of free software out there for Windows users.

It wasn’t until that fateful day I came across a Laptop magazine reference to Ubuntu Linux and its growing popularity.  I began to do some research on Linux and Ubuntu.  I talked with the head of our IT department on campus and asked him what he thought about Linux.  Of course, many of his servers were already running Red Hat, so he suggested I give it a try, but warned me to expect some hitches.

So I did.  My first distro was Ubuntu Feisty Fawn (7.04) and Gutsy soon after that.  I then tried Open SUSE (10.3, I think) and eventually SUSE SLED 10, which was preinstalled on a new Thinkpad I bought.  After some testing and trial in the classroom and work environment, I decided to go with Ubuntu.  I really liked SUSE; however, I didn’t feel comfortable with the Novell and Microsoft “agreement.”  I figured if I was going open source, I would go with a cleaner and unencumbered distro.  (See, I told you I became more ethical!)

So how did Linux, Ubuntu in my case, make me smarter?  I began to tinker with my computer again.  I had to figure out what I needed it to do, and how to get it to do it.  The one nice thing about Ubuntu is that its default install is very usable “out of the box.”  I think most casual users would need to change very little.

For me, I was pretty apprehensive to change anything at first.  But as time went on, I gained more confidence, and soon was making some small changes.  In fact, some of them weren’t voluntary.  I’ll admit; what my campus IT guru told me came to fruition.  There were some hitches along the way.  I had to edit configuration files and learn how to use the command line to effectively solve some problems.  But all of these hitches and changes forced me to get more intimate with my computer.  When I used Windows, everything was veiled behind a GUI (and I suppose in many ways Ubuntu is guilty of this too), and so I didn’t need to understand what I was doing when I made changes; rather, I just needed to know how to execute them. Easier, right? Quicker, right? Well, not always. How do you fix something you don’t understand? And believe me, there was plenty that vexed me about Windows.

But if we take James Taylor’s advice and “[enjoy] the passage of time,” I think we could and would learn so much more.  I understand why many would find what I’m suggesting abhorrent.  Why bother trying to understand your computer when you have better things to do?

But I’m not talking about major time commitments here. I don’t think it’s necessary to understand every intricacy of the OS. And as a matter of fact, I want to suggest quite the opposite.  I think we should invest just enough time so we can understand the larger concepts of computing. These concepts will help us better understand how to solve problems that arise.  And I’m not talking about Linux problems, Windows problems, or Mac problems exclusively.  But in general.  If you understand how WebDav works, then the chances of you setting it up properly increases immensely, regardless of the OS that you’re using.  It’s like cooking.  If you understand the basic concept of stewing, you can probably produce a pretty good beef stew no matter whose kitchen you’re cooking in as long as the basic supplies and comparable ingredients are available.

How has Linux made me cooler?  Simple.  It is not the same old expected stuff.  The interface is different and in many ways more intuitive.  People are amazed by Compiz and the 3-D effects that I use.  And besides, when everybody is panicking about the latest major security hole in Internet Explorer, I can smugly smile and say, “I use Linux.”

Why am I more ethical?  Easy.  I don’t support the guys who are trying to control the way we interact with technology.  I stay away from products that are platform restricted. Also, I really embrace the FOSS philosophy because it levels the playing field.  I remember in the past, you had to buy a commercial word processing program to produce decent documents.  Today, thanks to a slew of community supported office apps, this is no longer the case.

It’s not that I’m necessarily against people getting compensated for their work, and, in fact, think every FOSS end user has some responsibility to “pay” for what they use.  The payment doesn’t necessarily need to be in monetary form though.  One could pay by donating time to test updates or new versions of apps. Or by being a part of a community. Maybe report bugs or issues that come up in daily usage. Or they could take an active role in promoting what they use. I’m sure there are numerous ways that end users could help open source developers. (Developers, if you’re reading this, go ahead and suggest some.)

So how has Ubuntu changed my life?  I’m taller, skinnier, and better looking.  Okay, maybe not.  I do feel that I have become closer to my PC though.  I understand it and the concepts that make my computing experience better and more successful.  And I know not to blame technology, but instead, the ones who wield it. Am I cooler? I don’t know, but using Linux has a sort of geek chic that suits me.  And in terms of ethics, I can sleep a little better at night knowing I haven’t made Bill and Steve any richer.  So has Linux changed me? Yes, I think so. And to the distro that figures out a way to make me taller, skinnier, and better looking, I promise to be your biggest fanboy ever!

Happy Holidays!
