Sunday, January 23, 2011

Recover iPhone contacts from a raw backup

Just as I got my new phone (T-Mobile myTouch 4G -- love it!), my 23-month-old iPhone completely refused to charge from either the wall or the computer. So how do I get my contacts off?

I have a full mirror of my iPhone (3G) filesystem, created using rsync+ssh from within the jailbroken phone. Backing up over wifi is way cooler than through a tethered cable, and it was also my only option: the data connector died after 14 months, so I could charge from a wall adapter but not from any USB host. That crippled setup worked long enough for me to escape my AT&T contract.
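For the curious, the backup looked roughly like this -- a hedged sketch, assuming rsync and openssh are installed on the jailbroken phone (e.g. from Cydia); the hostname, user, and destination path are placeholders, not my exact invocation:

# run on the phone over wifi, pushing the filesystem to a box on the local network
# ('backuphost' and the destination path are illustrative only)
rsync -az -e ssh --one-file-system / me@backuphost:/backups/iphone-3g/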

Useful tidbits:

  1. Contacts are stored in AddressBook.sqlitedb.
  2. The file lives at /private/var/mobile/Library/AddressBook/AddressBook.sqlitedb.
  3. There is a second, bare-schema database in /private/var/root/Library/AddressBook/.
  4. The database is in sqlite3 format.
  5. Person entries are stored in the ABPerson table.
  6. Phone number/email/etc. entries are stored in the ABMultiValue table.
We can open this file in sqlite3 and export a usable comma-separated file without any external tools. Person entries live in ABPerson and phone/email entries live in ABMultiValue, so we join the two tables in the CSV output.

The following snippet copies the database to /tmp, opens it in sqlite3, and exports to contacts.csv.

# copy the db to /tmp, then open it
cp /private/var/mobile/Library/AddressBook/AddressBook.sqlitedb /tmp
cd /tmp
sqlite3 AddressBook.sqlitedb
sqlite> .mode csv
sqlite> .output contacts.csv
sqlite> select p.ROWID, p.First, p.Last, mv.identifier, mv.value, mv.record_id from ABPerson p join ABMultiValue mv on (p.ROWID = mv.record_id);
sqlite> .quit

The file locations and names were surprisingly hard to find. On the bright side, I didn't need to decode any plist files.

There are more interesting fields in ABPerson and ABMultiValue; feel free to update the select to grab them. The schema dump and an example extended query follow below.

sqlite> .tables
ABGroup                   ABPersonMultiValueDeletes
ABGroupChanges            ABPersonSearchKey
ABGroupMembers            ABPhoneLastFour
ABMultiValue              ABRecent
ABMultiValueEntry         ABStore
ABMultiValueEntryKey      FirstSortSectionCount
ABMultiValueLabel         LastSortSectionCount
ABPerson                  _SqliteDatabaseProperties
ABPersonChanges
sqlite> .schema ABPerson
CREATE TABLE ABPerson (ROWID INTEGER PRIMARY KEY AUTOINCREMENT, First TEXT, Last TEXT, Middle TEXT, FirstPhonetic TEXT, MiddlePhonetic TEXT, LastPhonetic TEXT, Organization TEXT, Department TEXT, Note TEXT, Kind INTEGER, Birthday TEXT, JobTitle TEXT, Nickname TEXT, Prefix TEXT, Suffix TEXT, FirstSort TEXT, LastSort TEXT, CreationDate INTEGER, ModificationDate INTEGER, CompositeNameFallback TEXT, ExternalIdentifier TEXT, StoreID INTEGER, DisplayName TEXT, ExternalRepresentation BLOB, FirstSortSection TEXT, LastSortSection TEXT, FirstSortLanguageIndex INTEGER DEFAULT 2147483647, LastSortLanguageIndex INTEGER DEFAULT 2147483647);
sqlite> .schema ABMultiValue
CREATE TABLE ABMultiValue (UID INTEGER PRIMARY KEY, record_id INTEGER, property INTEGER, identifier INTEGER, label INTEGER, value TEXT);
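For example, here is a sketch of an extended query based on the ABPerson schema above, also pulling Organization and Nickname. (I haven't run this exact query against my dump; contacts_plus.csv is just an arbitrary output name.)

sqlite> .mode csv
sqlite> .output contacts_plus.csv
sqlite> select p.ROWID, p.First, p.Last, p.Organization, p.Nickname, mv.identifier, mv.value from ABPerson p join ABMultiValue mv on (p.ROWID = mv.record_id);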
"DBIX::Class::Deployment handler is awesome" is an article on using DBIx::Class::DeploymentHandler (along with SQL::Abstract ) to automatically produce database version upgrade and downgrade scripts from DBIX::Class schema documents and schema layout diagrams.

Awesome. This is why I follow the Perl Iron Man blogging feed -- great stuff in there!

Day one with R and Head First Data Analysis

Awesome. I installed R (r-project) about 10 minutes ago, and I just created my first scatter plot! This is a long way from my days with p-fit and n-fit.

I'm reading Head First Data Analysis, published by the fine folks at O'Reilly, and enjoying it. Going in, I always expect the asides, cartoons, and irreverent colloquial tone of a Head First book to be off-putting, but it really does flow nicely. I look forward to comparing it to my other new O'Reilly book, Data Analysis with Open Source Tools (released in November 2010).

On page 291, we see this "Ready Bake Code" to pull a CSV from the book's website, load it into R, and draw a scatter plot of a subset of the data.

# pull the example CSV straight from the Head First Labs site
employees <- read.csv( "http://www.headfirstlabs.com/books/hfda/hfda_ch10_employees.csv", header=TRUE )
# peek at the first 30 rows
head( employees, n=30 )
# scatter plot of requested vs. received, restricted to rows where negotiated is TRUE
plot( employees$requested[employees$negotiated==TRUE], employees$received[employees$negotiated==TRUE] )

Boom: I have a scatter plot of the subset of employees where the negotiated field is TRUE, comparing requested against received.

I did a full install onto my Ubuntu laptop by adding the official r-project aptitude repository, which gave me a slightly newer version than what was available in the default Ubuntu 10.10 (Maverick) repositories. CRAN asks you to pick a mirror manually; I chose my local UCLA mirror.
# Create /etc/apt/sources.list.d/r.list
deb http://cran.stat.ucla.edu/bin/linux/ubuntu maverick/
# add the signing key (optional, but preferred)
gpg --keyserver subkeys.pgp.net --recv-key E2A11821
gpg -a --export E2A11821 | sudo apt-key add -
# update aptitude
sudo aptitude update
# install r
sudo aptitude install r-base
# launch R (not 'r' -- that's a shell built-in)
R

Tuesday, January 11, 2011

LA Tech Events -- getting busy again.

After the hibernation month of December, it seems like tech events are coming out of the woodwork here in January!

Tonight (2011-01-11) is CloudCamp LA, an un-conference on all things "Cloud," hosted at MorphLabs in El Segundo. More than 200 people are pre-registered! All the in-person tickets are gone, but there are still 30 registrations left to watch the video stream from home (as of 11:11am on 1/11/11).

Tonight also features a round of lightning talks at ScaleLA: the Los Angeles High Scalability Meetup (formerly the Hadoop Meetup), hosted by the wonderful folks at Mahalo in Santa Monica.

Tomorrow is the Thousand Oaks Perl Mongers. TO is a bit of a drive from down here, but I try to make it every couple of months to visit my ValueClick peeps. Not gonna happen this month, though.

Friday is TEDxCaltech: Feynman's Vision (50 years later). Tilly and I will be volunteering. I'm still unsure if I can make the volunteer dinner on Thursday night, where volunteers mingle with presenters.

Next Tuesday, Stephen Hawking returns for a presentation at Caltech. One wonders how much longer he'll be out in public. Caltech alumni can register for a ticket lottery (deadline is noon on 1/13); everyone else can show up and wait in line.

Wednesday the 19th brings back Los Angeles Perl Mongers; it feels like forever since our November meeting. January finds us visiting our friends at Rent.com -- thanks for hosting! My presentation is still TBD, but I hope to have the directions and presentations squared away this week. Thanks for coming!

Thursday the 20th is another wonderful Mindshare, back in the comfortable digs of The Independent theater downtown -- now with a complimentary pre-event happy hour! Their schedule is also TBD; good to know I'm not alone on that front.

Just over the horizon, February brings SCALE -- the Southern California Linux Expo, Feb 25-27, 2011. Make your plans now!

L.A. Nerdnite also took December off, as our beloved venue, the Air Conditioned Supper Club, closed (or took on new management). Look for an announcement of a hip new venue soon. Who's up for a Hollywood BYOB picnic experience?

SCALE presentation proposals: denied.

Sigh. Neither of my modern Perl SCALE proposals was accepted -- dev track proposals for hands-on demonstrations of using Hadoop Streaming with big data and of quickly building web applications with Dancer. I hope we get a Perl Mongers booth/table.

I'm glad to hear there were so many presentation proposals. Sounds like we'll have some great talks!

Dear Speaker,

The SCALE committee has reviewed your proposal(s). Unfortunately, your proposal, while excellent, was not accepted. SCALE again had many high quality submissions, so we could only accept a small fraction of those submitted (47 out of 160 submissions).

We thank you for your interest in SCALE and we appreciate your submittal! We hope you'll participate in future SCALE events. The latest updates for the conference are available at http://www.socallinuxexpo.org

Monday, January 3, 2011

New Year, New Releases

I opened my CPAN mail today and received a lovely email from a user of one of my CPAN modules, Hadoop::Streaming. Reading a nice comment was a wonderful way to start the first Monday of the new year. Included with the praise was a bug report -- double plus good!

You have absolutely no idea (or perhaps you do) how happy I was to see that there is a hadoop streaming module for perl. So I thank you for making this available! I wonder if you are still working on it or have plans to continue working on it? Are there many users to your knowledge? Finally, I've tried to run the example code myself under perl 5.12.2 and receive bareword errors when running the mapper locally.

---- Frank S Fejes III

Looking at the package and the error output in his email, I realized that my hastily pushed-out synopsis example had never been code reviewed -- it wouldn't compile, because I wasn't quoting the arguments to the Moose keyword 'with'.
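The shape of the bug, as a reconstructed sketch rather than the exact synopsis -- 'with' is an ordinary imported function, so its arguments need to be quoted strings (the WordCount package name and the body of map() are made up for illustration):

package WordCount::Mapper;
use Moose;

# broken: under strict, the unquoted role name triggers a bareword error
# with Hadoop::Streaming::Mapper;

# fixed: quote the role name
with 'Hadoop::Streaming::Mapper';

sub map {
    my ( $self, $line ) = @_;
    $self->emit( $_ => 1 ) for split /\s+/, $line;
}

1;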

It was a small matter to fix and a breeze to push to CPAN via the magic of Dist::Zilla. Thanks, Ricardo!

  1. Clone the code from the github repo: git clone git@github.com:spazm/hadoop-streaming-frontend.git
  2. Edit lib/Hadoop/Streaming.pm to fix the SYNOPSIS pod
  3. Add a comment to the Changes file
  4. Commit the change locally and push back to github: git commit && git push origin master
  5. Run the magic Dist::Zilla command, dzil release, which took care of (see the dist.ini sketch after this list):
    1. checking for modified files not checked in to git,
    2. running pod weaver,
    3. running tests,
    4. updating the release version in Changes file,
    5. git commit Changes,
    6. git push origin master,
    7. git branch,
    8. tar+gz release,
    9. push release to CPAN
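For reference, a hypothetical dist.ini sketch that wires up roughly those steps -- these plugin names exist in the Dist::Zilla ecosystem, but this is not the module's actual config:

; hypothetical dist.ini, not Hadoop::Streaming's real one
name = Hadoop-Streaming

[@Basic]        ; gather files, run tests, build the tar.gz, upload to CPAN
[PodWeaver]     ; run Pod::Weaver over the pod
[NextRelease]   ; fill in the new version and date in the Changes file
[@Git]          ; check for uncommitted files, commit Changes, tag, push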

While checking my CPAN mail, I also found a CPAN Testers fail notice for Net::HTTP::Factual, which is built on Net::HTTP::Spore. Spore v0.3 came out and changed the spec format again from v0.2, which was in turn different from v0.1.

I tweaked my Factual .spec to work with Spore v0.2 or v0.3 and pushed it up to CPAN -- same magic Dist::Zilla command.

Freshly available on CPAN: