Saturday, October 1, 2011

Review of Metasploit: A Penetration Tester's Guide

     Metasploit:  A Penetration Tester's Guide (MAPTG), by David Kennedy (@Dave_Rel1k), Jim O'Gorman (@_Elwood_), Devon Kearns (@dookie2000ca), and Mati Aharoni (@backtracklinux), is probably the foremost resource one can obtain for learning the basics of the Metasploit framework.  The book is for those who are a little behind the curve and haven't used Metasploit yet.  I feel the book does a great job of delivering on what it promises: a foundational knowledge of the ins and outs of this great framework.  By the time you finish you should understand how to use the framework; you most likely won't understand all of it, but the book gives you great direction on how to figure the rest out through the many utilities within the tool.

     Let me start this review off by stating my personal opinion on how you should use this book:  build a test penetration lab and follow the book's instructions as you go along.  Take advantage of the appendices!  Appendix A will tell you how to set up your test environments, both your attack machines and your victim machines.  Appendix B is your cheat sheet and quick reference for the numerous commands you'll be using.  I suggest starting there and familiarizing yourself with it before you begin.  This is not what I did, but in retrospect I really wish I had.

     The book doesn't waste time: after going over the basics of what occurs in a standard penetration test, the authors state that this book is not the best source for understanding everything that can happen in a pen test, and they refer you to the Penetration Testing Execution Standard if you're looking for deeper insight on that subject.  Next the book covers the basics of Metasploit so you can get around the console with better familiarity, along with the options you have if you want more information.  These chapters are small and cover the essentials of what you'll need to know to get through the book if you have no prior knowledge of the tool.  The following chapters, which I will not cover in depth, go step-by-step through a basic pen test outline.  They start with information gathering and go all the way through creating your own exploits and automating your process with scripts within the framework.

     Every chapter covers its subject very well.  They're concise and to the point, which I enjoy a whole lot.  Most of the chapters also include examples of how to run the tools and what the output should look like (which is why I suggest you set up a lab environment and run the commands as you read them).  At times I wished the chapters were a bit more in-depth, especially the chapter on creating exploits; however, that is probably outside the scope of this particular book.  I especially enjoyed the chapters on creating exploits and on the power of the Social-Engineer Toolkit.  The final chapter uniquely summarizes what was learned in the book by explaining how to simulate a penetration test, and if completed properly it will have you exploiting your vulnerable test lab in no time.

     I would highly recommend this book to anyone looking to get into Metasploit.  It gives you a great base for learning the tool, and if nothing else it spawns even more of a desire to learn (I know it did for me).  I started this book with very basic knowledge of the Metasploit framework, and after some testing and the guidance of this book I feel a lot more comfortable using the amazing power behind Metasploit.  You can pick this book up for about $28 on Amazon, which is an amazing value!  If you have even a passing interest in penetration testing, I suggest you pick up this book and read it.

Monday, September 5, 2011

Road to CCE, Pt. 3: Review of Digital Forensics with Open Source Tools

     "Digital Forensics with Open Source Tools" (DFwOST), by Cory Altheide and Harlan Carvey, is, I feel, an excellent resource for a beginning forensics student.  I am so happy that I decided to pick up this book; it has proven to be one of the best resources I now have.  The book reads extremely well, as the information it contains is concise and to the point.  DFwOST is certainly a value, and I can see myself returning to it in the coming months.

     As far as the content of the book is concerned, the authors provide a wealth of knowledge covering the basics of digital forensics.  The opening chapter goes over what open source is and how it relates to the book.  The next chapter discusses the differences in choosing a host operating system (mainly Windows vs. Linux).  Chapters 3 through 8 analyze varying topics in digital forensics, like file system analysis, points of analysis for various operating systems, Internet artifacts, and file analysis.  These chapters hold a lot of information relating to the many points of interest in digital forensics, and while discussing the topics the authors provide the reader with examples of analysis using popular open source projects.  The final chapter then offers the reader insight into how to use the various discussed tools more efficiently, as well as the pros and cons of graphical user interfaces versus command-line interfaces.

     Overall, I feel this is one of the best resources for learning about digital forensics because it pairs great information with practical knowledge of how to use it.  It's easy to follow along with the reading while testing these tools in your own lab.  The authors often provide easy-to-follow installation instructions, which can be valuable when dealing with some open source projects.  If you're looking to get into forensics, or even just to learn about current open source projects in the forensics world, I recommend you go out and pick up this book.  It helped me take the knowledge I've gained from the other books I've discussed on this blog and transform it into practical knowledge, since it's easy to get access to these tools and test them yourself without spending money (a plus for any college student).

Thursday, July 28, 2011

Password Auditing (or, How I Learned to Crack Passwords)

Password auditing can be a fun little project for an IT security department.  Microsoft systems currently store their passwords in the SAM portion of the registry as NTLM hashes.  As I will show, NTLM hashes are fairly easy to crack when good password policy is not followed.

The methodology I will cover uses rainbow tables for cracking passwords.  Rainbow tables are stored tables of pre-computed hashes that are compared against the hashes you have in order to find the password much faster.  This method is great when you have a set of good tables.  When you don't, there are still brute-force attacks, which can be fairly lengthy if you don't have access to a nice cloud or local cluster.
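The precomputation idea can be sketched in a few lines of Python.  To be clear, this is a plain precomputed-hash dictionary rather than a true rainbow table (real rainbow tables use hash/reduction chains to trade lookup time for disk space), and it uses MD5 as a stand-in because Python's hashlib doesn't reliably ship the MD4 primitive NTLM is built on:

```python
import hashlib

def build_table(candidates):
    # Precompute hash -> plaintext once, up front (MD5 stands in for NTLM).
    return {hashlib.md5(pw.encode()).hexdigest(): pw for pw in candidates}

def crack(target_hash, table):
    # Attack time is now a dictionary lookup instead of hashing every guess.
    return table.get(target_hash.lower())

table = build_table(["password", "letmein", "hunter2"])
print(crack(hashlib.md5(b"letmein").hexdigest(), table))  # -> letmein
```

The trade-off is exactly the one described above: the table is expensive to build and store, but once you have it, every hash it covers falls almost instantly.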

I utilized three different programs to complete my password auditing testing:  ophcrack, rcracki_mt, and rcrack.


     Ophcrack is a great little utility.  You can download a LiveCD to boot from disc (it will automatically load your NTLM hashes) or you can install the application on a test machine (what I did).  The methodology is pretty much the same either way.  First off, there are many tables for Ophcrack available from their website.

This chart shows the various tables available.  Note that there is only one free table; however, their tables are fairly priced if you have the money.  The free Vista tables are very well put together and will do for basic auditing and searching for really easy passwords.  I was able to get the Vista Special tables for about $100, I believe.  Not a bad deal considering it got a good majority of the passwords I found in my testing (unfortunately).

The interface is fairly straightforward.  The "Load" option will allow you to load the various formats of hashes (LM or NTLM).  To get the hashes from my machine I used Password Dump v7.1.  Simply run pwdump7 -d <output location> and the application will do the work and create a text file similar to this:

Clark:1000:NO PASSWORD*********************:259745CB123A52AA2E693AAACCA2DB52:::
Barry:1000:NO PASSWORD*********************:2D20D252A479F485CDF5E171D93985BF:::
Bruce:1000:NO PASSWORD*********************:21E6C83723EB7BC2CFED883DA412B804:::
Oliver:1000:NO PASSWORD*********************:8D793BF7E73DAA43A28D04BD4BA1FC05:::
Peter:1000:NO PASSWORD*********************:E803ABFAF249575CAF1529465E243B3E:::
Matt:1000:NO PASSWORD*********************:6B4B376436A5664FEACAC52301155951:::
Logan:1000:NO PASSWORD*********************:71BAD9C6FD984ADD32187A1DDF360F85:::
Scott:1000:NO PASSWORD*********************:67A252C097F568BEC274AF4CC1462DC0:::
Xavier:1000:NO PASSWORD*********************:A086E310475F2B8DFD9E2F7265BD16C8:::
Steve:1000:NO PASSWORD*********************:6EF391B2282F1DA56379EDB11BBB034F:::
Clint:1000:NO PASSWORD*********************:87F65D137998A4CE59EA65B114A0F831:::
Eric:1000:NO PASSWORD*********************:F773C5DB7DDEBEFA4B0DAE7EE8C50AEA:::

This was the text file I used for cracking.  No, I didn't have all these accounts set up on a Windows machine; I wanted to test the application, so I created this file by copying the format of the pwdump7 output and generating the hashes with Cain & Abel v4.9.4's hash generator.
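If you end up scripting against output like the listing above, each pwdump7-style record is just colon-separated fields: user, RID, LM hash, NTLM hash, followed by trailing colons.  A quick illustrative parser:

```python
def parse_pwdump_line(line):
    # user:RID:LM hash:NTLM hash::: -- the trailing colons become empty fields.
    fields = line.strip().split(":")
    return {"user": fields[0], "rid": fields[1],
            "lm": fields[2], "ntlm": fields[3]}

rec = parse_pwdump_line(
    "Clark:1000:NO PASSWORD*********************:"
    "259745CB123A52AA2E693AAACCA2DB52:::")
print(rec["user"], rec["ntlm"])  # -> Clark 259745CB123A52AA2E693AAACCA2DB52
```

This makes it easy to feed just the NTLM column into whatever cracker or audit report you're building.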

     Once you have the hashes loaded, you can click the "Crack" button and you will see something like this:

From here you can simply save the results into a simple CSV file.  

Voila!  You will have your passwords cracked in roughly 12 minutes (at least on my dual-core i5 laptop with 8 GB of DDR3 in this scenario).


     rcracki_mt is a decent application available from its project website, where you'll also notice a plethora of rainbow tables covering everything from MD5 to LM and NTLM password hashes.  I personally chose to go with the NTLM rainbow tables covering 7-character passwords with all possible character variations (roughly 140 GB worth of tables).

     rcracki_mt is a command-line tool with fairly simple usage.  Here is my usage example:

This command breaks down as such:
     rcracki_mt -h <hash> (-l <password list>) -t <number of threads to utilize> <directory to your tables>

rcracki_mt was able to find a password made entirely of special characters in about 5 minutes of cracking with 2 threads.  Not bad; however, rcracki_mt's tables are separated into a ton of files that all have to load, which didn't seem super efficient to me.  Still, that's just my personal opinion, and for free this is a GREAT option.
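If you find yourself running rcracki_mt over many hashes, a tiny wrapper saves typing.  This is a hypothetical helper that only assembles the argument list; the flag names come straight from the usage breakdown above, and the hash and table directory here are made-up examples:

```python
import subprocess

def rcracki_command(ntlm_hash, threads, tables_dir):
    # rcracki_mt -h <hash> -t <number of threads> <directory to your tables>
    return ["rcracki_mt", "-h", ntlm_hash, "-t", str(threads), tables_dir]

cmd = rcracki_command("6B4B376436A5664FEACAC52301155951", 2, "./ntlm_tables")
print(" ".join(cmd))
# To actually launch it: subprocess.run(cmd, check=True)
```

Building the command as a list (rather than one shell string) avoids quoting problems if a path contains spaces.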


      rcrack is similar to rcracki_mt; however, it uses plain .rt tables rather than the indexed versions utilized by rcracki_mt.  rcrack comes as both a GUI and a CLI tool.  I chose the GUI version because I was using this in a demonstration, and pictures are easier for meetings.  With rcrack you also get tools to generate your own rainbow tables.  I opted to create my own table set to be able to crack any numerical password up to 10 digits.  It took roughly two days to generate the 4 tables I made, totaling around 200 MB.  I then created a simple text file with various numerical passwords; again, I generated the hashes using Cain & Abel's hash generator.

     This is what the RainbowCrack GUI will look like once you select File | Load Passwords from List...  Once you have your hashes loaded, select "Rainbow Tables" and point it to either a single file or a directory.  Now to begin cracking:

In this instance it took merely 2 minutes to find all 5 passwords.

It is also possible to convert the tables mentioned above into a format usable by RainbowCrack (rcrack).  However, I have not felt the need to test this, as I find rcracki_mt decent enough to use at the moment.

These are the methods I have used to begin auditing passwords in order to assess the need for a stronger password policy.  Hopefully you can use them to implement auditing if you need to.

+++Hopefully this goes without saying, but use this information ethically!  I did this for research and work, not to harm anyone.

Thursday, June 30, 2011

Road to CCE, Pt. II: Review of Windows Forensic Analysis 2E

“The key to forensic analysis isn’t pushing the button on an application user interface.  After all, as I’ve said time and time again, the age of Nintendo forensics is over!  The key to forensic analysis is understanding what artifacts are available to you and having a logical, reasoned, and comprehensive plan or process for collecting and interpreting data.”  These are the words of Harlan Carvey, the author of Windows Forensic Analysis (as well as other great titles).  Reading that quote in the final part of Carvey’s book really summed it all up for me.  Truly understanding what information is available to us through thorough examination, and not relying upon a tool, was the underlying message I took from this book.  That message is spot on to me, and it is the reason I picked up this book: I did not want to become another ‘button monkey’ who has to rely on a program to perform an investigation.  Tools are great when you have the background knowledge necessary to understand what the application is doing behind the scenes and the value of the information it provides.  That being said, this book is FANTASTIC; with recommendations from Eric Huber, Rob Lee, and Richard Bejtlich, that probably didn’t need to be said.
The first part of this book (chapters 1, 2, and 3) covers the importance of live response to an incident.  It gives you examples of important places to look for and gather crucial data, then shows how to analyze the data collected.  Chapter 3 delves into the truly fascinating world of memory forensics and why this portion of analysis should not be overlooked (memory holds a ton of information).  The next portion of the book goes into the various files that can be used in an investigation.  The fourth chapter dives into the deep pool of information that the registry of a Windows system holds.  The fifth chapter covers the other various files that can be obtained, such as event logs, browser history, and the numerous other log files available on systems.  These chapters are very technical and provide a vast wealth of knowledge.  The next portion of the text goes over executable files and rootkits, covering the interesting ways in which a program operates and can then be altered.  The final portions of the book ‘tie it all together’ with great examples and ways to perform an investigation on the cheap (particularly interesting to me as a student).
This book is an excellent source of information if you’re interested in learning more about what a Windows computer has to offer your investigation.  I will definitely be keeping this book around for all the great information it provides.  Carvey not only provides a treasure of information, he includes data within the text so you can get a good look at the sort of information you’re going to want.  To go along with all the miscellaneous data sources, you’re given suggestions and recommendations on tools that can help you obtain and analyze that data.  On top of this, there are tips, notes, and warnings that apply to the topic at hand and help put the material into better context.
To sum up, this book is a must for anyone interested in the topic; it reads like a dream for such a technically heavy text.

That's yet another book completed on my list of texts to finish before moving to a more intensive, hands-on approach to learning forensics.  I will be finishing one last book (to which Harlan Carvey is a contributing author, along with Cory Altheide) before applying everything.  I will be reading Digital Forensics with Open Source Tools next, for obvious reasons; with hands-on work up next, why not use open source tools?  I am a bit nervous about the few "not for beginners" comments in the Amazon reviews, but I always like a challenge.  Hopefully it will also be as good a companion to The Sleuth Kit as some reviews say.  Look for a review in a few weeks!

Monday, June 6, 2011

B-Sides Detroit Security Conference

B-Sides Detroit was this weekend.  I woke up early and headed out to Detroit with a couple of buddies for a day at the con.  We got there bright and early, and I got really excited when I saw this:

The conference was held in a semi-warehouse/studio apartment building, and it was actually really cool.  Considering this conference is FREE, it wasn't a big deal, and it actually added a nice bit of personality to the con.  I should mention that for free you get the speakers, a super awesome t-shirt (see below), and lunch (at least the day I attended).

The quality of the speakers was kind of hit or miss for me.  All of them had relevant information; however, some of it was kind of basic, while other talks were more technical (which piques my interest a bit more).  Again, this con was free, and there were a lot of great, enthusiastic people around to chat and socialize with, even on the conference's USTREAM.  The event was very low-key and didn't press any vendor products on you, which is always a plus.  To help pay for the event they had a nicely priced snack bar with some excellent beer to choose from.

Overall, I thought my B-Sides experience was amazing!  I would recommend anyone check one out, as they're hosted all over the country and growing in popularity.  The environment promotes and encourages new ideas as enthusiastic security professionals gather to talk about a subject they hold a deep passion for.  This con had a personality all its own, and you can't help but like it.


A little background on my experience, to give some perspective on my review.  As you can guess from reading this blog, I am an undergrad student of 'Information Assurance' here in Michigan.  I have come to love the security field over the last year or two.  The only other conference I've attended was the Minnesota High Technology Crime Investigative Association conference a few months back.  I enjoyed that experience quite a bit, but as it was law enforcement centric, it lacked the passion I found in Detroit.

Friday, May 27, 2011

Road to CCE, Pt. I: Review of Handbook of Digital Forensics

                “Handbook of Digital Forensics and Investigation”, by Eoghan Casey, is a fantastic read.  I had recently completed Brian Carrier’s “File System Forensic Analysis” (also an amazing book) and was looking for something less in-depth and more of a general digital forensics book.  Luckily, I got a recommendation for this book, as well as a few others, from Eric Huber over at the ‘A Fistful of Dongles’ blog; you can read his review of the book on Amazon.  I really enjoyed this book, and it was exactly what I was looking for: an overview of the wide variety of topics that encompass digital forensics.  Casey has assembled a great text from a wide variety of contributing authors and put it into a volume that will take you through common topics in digital forensics, covering anything from data gathering to embedded systems analysis.

            Casey begins with an introduction that includes a brief overview of what the book will cover and the basics of a forensic examination.  Chapter 2 covers the importance of tried and tested methodologies and some of the complications that arise in digital forensics when gathering and maintaining forensically sound evidence.  The electronic discovery chapter includes an overview of what e-discovery is and what it includes, and gives you a great look at how large an e-discovery effort can be, in both examination and cost.  Chapter 4 discusses the use of forensics in incident response and the importance of preserving files rather than falling back on the “wipe and re-install” mentality.

            Part 2 of the book breaks away into the nitty-gritty technical end of forensics, at least as much as it can as an intro-level book.  Chapter 5 covers the Windows operating systems and what you’re going to want to look for in an investigation: a sort of areas-of-interest rundown that explains what those files are.  The next chapters do the same for the other main operating systems, both UNIX/Linux and Macintosh.  Chapter 8, for me, was really interesting; it goes over the methodologies used in an embedded system investigation, with things like chip-off techniques.  The next chapter discusses network investigations and shows you the kinds of things you can discover by analyzing network traffic.  The final chapter points out areas of interest to be found on mobile devices.

            This book is great in that it points out key areas you should be looking at in an investigation.  I particularly loved the “From the Case Files:” sections that give you a real-world example of where you would use the knowledge you’re reading about.  Even more, I LOVED the “Practitioner’s Tip” sections.  For me, as a student, getting these little tips from experienced forensicators is invaluable.  Those tips were the best part of the book for me, like the cherry on top of a sundae.  I felt the book was very well done.  My one criticism was the amount of output listings; sometimes they seemed a bit long, though I appreciate what the authors were trying to do.  Other than that, I would recommend this book to anyone looking to get into the digital forensics field, like myself.


I chose this book to begin my road towards gaining my Certified Computer Examiner certification.  I hope to complete the certification by the end of the year.  Right now my plan includes reading the following:
  • Handbook of Digital Forensics and Investigation by Eoghan Casey (Completed)
  • File System Forensic Analysis by Brian Carrier (Completed)
  • Windows Forensic Analysis by Harlan Carvey (In-Progress)
  • Digital Forensics by Cory Altheide and Harlan Carvey
Upon finishing these works, I plan on working on practical application.  So look for more on these books and possibly some examples from my practice sessions.  Any recommendations and comments are welcome.  Enjoy Memorial Day weekend!

Sunday, April 3, 2011

Jump Lists in Windows 7 and Possible Forensic Implementations

As many of you who would be interested in this post may know, many important things in a forensic investigation can be found within the registry.  One location that holds a ton of good information about how a user is utilizing their PC is the NTUSER.DAT file.  To gain access to this information, we first need to acquire the file from either C:\Documents and Settings\<USER NAME> (for Windows XP) or C:\Users\<USER NAME> (for Vista/Win7).  You will not see this file natively, so you’ll want a forensic program such as FTK Imager to acquire it (using FTK Imager: right-click the file and export it to your desired location).  Once the file has been acquired, you can run it through a program called RegRipper by simply adding the path to the NTUSER.DAT file you acquired, selecting an output, and selecting the proper NTUSER.DAT plugin.  This will output a great text file with a wealth of information an examiner can use in an investigation.
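As a quick sketch, the per-version profile paths (note the XP folder is properly "Documents and Settings") can be captured in a small, purely illustrative helper; ntpath keeps Windows-style separators even if you run this on another OS:

```python
import ntpath

def ntuser_path(user, windows_xp=False):
    # NTUSER.DAT sits at the root of each user's profile directory.
    base = r"C:\Documents and Settings" if windows_xp else r"C:\Users"
    return ntpath.join(base, user, "NTUSER.DAT")

print(ntuser_path("Clark"))                   # C:\Users\Clark\NTUSER.DAT
print(ntuser_path("Clark", windows_xp=True))  # XP-style profile path
```

Handy if you're batch-exporting NTUSER.DAT files for several users before feeding them to RegRipper.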

One great piece of information you can access with this analysis is the recent documents a logged-in user has accessed.  Using RegRipper on my Windows 7 machine, I was able to come up with this information on files I’ve recently accessed:

This is a great piece of information to have because it can tie a user to accessing a specific file.

                Windows 7 adds some great new features; this post is going to focus on Jump Lists.  What are Jump Lists?  In Windows 7 you may have noticed those neat additions to your right-click menu, like recent history and, in a few instances, application options.  These are Jump Lists: application-specific tasks and recent items added to a program’s right-click menu.  I provide a few examples of these Jump Lists below:


Now, you’ll notice that the Google Chrome jump list offers more options than the one for Explorer.  It is important to note that only some applications make use of this feature.  Jump Lists are great for quickly accessing your recent and frequently used items for several applications.  One might assume that this information is pulled from the NTUSER.DAT file, which contains recent document information.  This is not the case; examination of the PC reveals jump list information at C:\Users\<USER NAME>\AppData\Roaming\Microsoft\Windows\Recent Items:

This redundancy provides examiners with something quite vital.  For instance, if a system has CCleaner installed, CCleaner is set to delete Recent Documents by default.  So, a quick run of CCleaner and all of this information is wiped (including the secondary location at C:\Users\<USER NAME>\AppData\Roaming\Microsoft\Windows\Recent Items), something quite possible in a case where malicious content is being stored on a PC.  However, the secondary location can still provide an examiner with something recoverable: a link (LNK) file.  Upon examination of the path, some LNK files are going to be recoverable within the Recent Items folder.  This may make it possible to associate a user with a particular file.

Recent document information can also be found at:  C:\Users\<USER NAME>\AppData\Roaming\Microsoft\Windows\Recent\AutomaticDestinations
Examining this location with FTK Imager we can see something like this:

As seen above, examining these files at the hex level, we can see the location of a file I recently opened.  In my case, I had recently opened an NTUSER registry rip from RegRipper.  This is a great bit of information to retain; again, it can help show that a user was accessing certain files.
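One more detail worth knowing: the files in the AutomaticDestinations folder are named <AppID>.automaticDestinations-ms, where the 16-hex-digit AppID identifies the application that owns the jump list.  A small sketch for pulling the AppID out of a file name (the example AppID is one commonly attributed to Windows Explorer, so treat it as illustrative):

```python
import re

def jump_list_app_id(filename):
    # Matches names like 1b4dd67f29cb1962.automaticDestinations-ms
    m = re.fullmatch(r"([0-9a-f]{16})\.automaticDestinations-ms", filename)
    return m.group(1) if m else None

print(jump_list_app_id("1b4dd67f29cb1962.automaticDestinations-ms"))
```

Mapping recovered AppIDs back to applications (via published AppID lists) lets an examiner say which program's jump list a given file belonged to.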

For more information on Windows 7 forensics, take a look at this paper on the topic by Piotrek Smulikowski.  You'll find more info on Jump Lists on page 23.

Tuesday, February 8, 2011

Review of "Social Engineering: The Art of Human Hacking"

“Social Engineering:  The Art of Human Hacking”, by Chris Hadnagy, is arguably the best book of its kind.  Mitnick’s books were great reads and highly interesting; however, Chris went above and beyond telling a story by applying an amazing framework around the stories.  The framework brings something of a scientific grounding to the multitude of techniques used by social engineers, allowing a whole new understanding of the inner workings of the topic.  The book is possibly the best insight one can gain into the dangers security environments face from these masters of manipulation.

However, the book takes more of an objective look at the subject and can be appreciated by a wide variety of audiences, not just the security officers of a company.  Hadnagy’s discussion of communication, persuasion, and influence techniques and tactics can be enjoyed by anyone.  I found myself enjoying this book not so much as a security student trying to learn about prevention and mitigation of a threat, but more as a learning experience in how I can use the topics discussed to improve my own communication ability.  I was able to use techniques from this book in a recent presentation and found that my effectiveness was greatly improved just by focusing on small details and choosing my wording better.  Chapters four, five, and six are great for this; I found learning about micro-expressions and Neuro-Linguistic Programming particularly gripping.

If you are looking at this book to improve your security awareness and learn how to protect yourself from these types of attacks, it delivers on those points as well.  The book provides several examples that let you analyze how these attacks are put together and show you what to look out for.  It also surveys a wide variety of tools social engineers can use in their attacks, providing an overview of each tool and how it would be used.  An entire chapter is dedicated to protecting against these threats, with a plethora of tips Hadnagy has learned through his vast experience.

I would recommend this book to anyone; as I mentioned, it can appeal to a wide audience and is a great purchase.

Interested in this topic?  Check out:

A website the author of the book is a part of, along with many other great folks.  The newsletter is great, as is the podcast.

The website of Dave Kennedy, creator of the Social-Engineer Toolkit.  Interesting blog.

Friday, January 7, 2011

Minimum Word Requirements Ruin Quality

This topic has come up a few times recently for me, it being the beginning of a new semester and all.  When an instructor asks students to meet a certain word requirement, in my opinion it will most likely ruin the results.  For instance, I just completed about 1,000 words for a proposal for a final project.  In case that seems odd, I will repeat it: I wrote 1,000 words describing a paper they want to reach 2,500 to 3,000 words.  I wrote a description of what I wanted to do for my final, looked down at the good old Microsoft Word word counter, and saw a whopping 260-some-odd words!  Hmmm... what the hell to do?  Well, you read through it, then try to rewrite some things and extend them a bit.  A couple hours of creative writing later, I'm at about 400 words and at my wits' end on how to add more without writing the paper itself; after all, this is just the proposal.

I absolutely hate this feeling: you've written enough to get your point across with sufficient background, examples, and sources; however, because that all-important A requires so many words, you have to become redundant.  Now here come the arguments: well, if you're that far behind the requirement, you can't possibly have enough information.  To that I say, 'not true.'  With the typical college paper, we're simply talking about 1,000 words equaling anything from three to eight pages, depending.  Why do I need to add so much content when I can concisely express the same message with better clarity in less?  This is not a book, and I'm not writing anything on an expert level, yet.

So, to get that A, what does one do to meet these ridiculous minimum word requirements?  You start repeating yourself and restating important points.  Basically, you create fluff that you don't really need and that doesn't reinforce anything, other than chipping away at my fleeting confidence that my paper is still poignant and written with quality.  I understand that minimums are in place to prevent lazy people from writing 300 words on what should really be 1,500, but the result is a stupid requirement passed along to those who try and can write with proficiency.

The solution?  Don't enforce a damn minimum; switch to a recommendation.  Read the paper and assess quality.  Why should grades be reduced purely based on a number?

Tuesday, January 4, 2011

Is Google Better Than Christmas?

This afternoon, on a very boring day during break, I answered the doorbell to the best guest: a UPS truck driver!  I was stumped; I had not ordered anything.  I opened the box bewildered.  I lifted the flap to find the back of another box, and all I noticed was the word "lithium"; the thought 'laptop' immediately struck my mind.  Then, after a few moments, I remembered back a few weeks to when I applied for the new Google notebook.  I got the box open and, sure enough, there it was!  The beautiful CR-48 Google notebook.

What proceeded were the longest 20 minutes of my life, it seemed.  See, I live in Michigan, which in December is quite freezing.  I had to let this thing warm up or I might break it before I ever really touched it.  I finally got to turn it on, spent the next few minutes reading agreements and setting this bad boy up, let it run its updates, and finally got to play.

Now, I'm an avid Chrome web browser user and love it.  I wasn't quite sure what to expect from Chrome OS.  Really, it's essentially the same thing.  So, I decided to take some time and browse around the web market.  By the way, if you haven't already, set up the browser sync capability within the Chrome browser (I already had this set up for when I go to class).  Anyway, the web market had some pretty interesting things, such as SlideRocket (can't wait for my next presentation!), Vyew, TweetDeck, and SpringPad.  These were really all I left installed after a few hours of playing around with random apps.

For a work in progress, I am a big fan of the OS.  I think it's come a long way since I tried the Chromium OS live USB.  It feels a lot more natural on the hardware provided by Google in its CR-48.  As for hardware, the computer doesn't have much: a small solid-state drive, a couple gigs of RAM, and a single-core Intel Atom processor.  But you don't really need much when working with a cloud-based operating system.  Plus, with modest hardware you get some sweet battery life, around 8 hours on a full charge.  Performance was off and on and really seemed dependent on the wireless signal strength.

Aesthetically, I love the look and feel of the device.  The keyboard is awesome, and the replacement of the function keys and the caps lock key is barely noticeable to me, if not a huge improvement.  I love the thing personally; the one real big downside for me was the touchpad.  The two-finger tap required for right-click doesn't work well for me (maybe I'm just not used to it).  Still, for me the pros greatly outweigh the cons.  Overall, it's a stupendous device, and I'd like to thank Google for giving me the best Christmas gift I've gotten in a long time.

Google is better than Santa Claus!

*This was written on December 21, 2010