Wednesday, December 22, 2010

Included in the End-of-Year message from Caltech President Chameau is this quote from Einstein, delivered in December 1930, on his first visit to Caltech.

"To you all, my American friends, I wish a happy 1931. You are well entitled to look confidently to the future, because you harmoniously combine the joy of life, the joy of work, and a carefree, enterprising spirit which pervades your very being, and seems to make the day's work like blessed play to a child."
---- Albert Einstein, December 1930

May the spirit of joy pervade your life and your work in the coming year! Let us all look confidently and move boldly into the future. Happy Solstice! Merry Christmas! Happy New Year!

Sunday, December 19, 2010

ORM -- database abstractions

I've just learned of DBIx::Class::CDBICompat, a Class::DBI compatibility layer for DBIx::Class. Awesome.

DESCRIPTION

DBIx::Class features a fully featured compatibility layer with Class::DBI and some common plugins to ease transition for existing CDBI users.

This is not a wrapper or subclass of DBIx::Class but rather a series of plugins. The result being that even though you're using the Class::DBI emulation layer you are still getting DBIx::Class objects. You can use all DBIx::Class features and methods via CDBICompat. This allows you to take advantage of DBIx::Class features without having to rewrite your CDBI code.

I have inherited an app based on Class::DBI, but I'm not terribly familiar with Class::DBI. So far, the SQL snippet approach is annoying. There are lurking bugs (in our code, not Class::DBI), but mocking and fixing with CDBI is proving to be a pain. It's good to know that if I rewrite the whole DB layer in DBIx::Class over the holidays, I'll be able to shim some of the old code onto it through a compatibility layer.
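
For future reference, the emulation layer can be pulled in as the base class of each table class while still handing back DBIx::Class row objects. Here's a sketch of what I have in mind -- class, table, and column names are invented, I haven't run this against our schema, and the exact base class/component arrangement is worth checking against the CDBICompat docs:

    package Legacy::DB::Widget;    # hypothetical table class
    use strict;
    use warnings;
    use base qw( DBIx::Class::CDBICompat );

    __PACKAGE__->connection('dbi:SQLite:dbname=legacy.db');
    __PACKAGE__->table('widget');
    # Class::DBI-style column declaration, but the rows are DBIx::Class objects
    __PACKAGE__->columns( All => qw( id name color ) );

    1;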

Related topics:

DBIx::Class vs Class::DBI vs Rose::DB::Object vs Fey::ORM by fRew
http://blog.afoolishmanifesto.com/archives/822
PerlMonks discussion
Class::DBI vs DBIx::Class vs Rose::DB::Object
http://www.perlmonks.org/?node_id=700283
DBIx::DataModel
http://search.cpan.org/perldoc?DBIx::DataModel
"DBIxDM is very interesting and Laurent Dami helps the DBIC team maintain SQL::Abstract – but he also doesn’t manage to market it to save his life so very few people have heard of it." -- Matt Trout
Rose::DB -- hand coded to be fastest of the 4?
Marlon suggested looking more deeply into the DB benchmarks.
"You really should run the benchmarks of Rose::DB vs DBIC. Your database layer is usually your slowest in your entire application, so that’s important. RDBO is also the ORM we use at CBSSPORTS.com because speed is important at this level. I’ll be interested in looking at what the new revision of DBIC with Moose does in that arena. On a side note, I agree that the generative queries are excellent on DBIC."

_Marlon_

2006 discussion of Rose vs DBIx, by the Rose Author.
http://osdir.com/ml/lang.perl.modules.dbi.rose-db-object/2006-06/msg00021.html

Perl 2011: Where are we now?

Piers Cawley wrote an excellent forward-looking piece, The Perl Future, in January 2009. As we approach the two-year mark, how have we fared? He talks about perl 6, perl 5.10.0 (aka "perl5 version 10"), Perl Enlightenment & the rise of Moose, and "on frameworks and the future."

Where are we now?

Perl 5.12 is out, as scheduled, on time -- two years of work representing 750,000 lines of changes across 3000 files from 200 authors. Deprecated features of perl 4 and 5 are finally marked as such. More Unicode improvements. Fixes for the Y2038 bug (is epoch time 64-bit now?). Pluggable keywords and syntax. A new release schedule means stable releases come out in the spring, followed by a .1 fix release, then monthly releases (on the 20th) for new bug fixes.
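
For a one-liner's worth of flavor, here is a tiny sketch of two of the new toys (illustrative only, not taken from the release notes; requires perl 5.12 or later):

    use 5.012;                  # implies strict and enables the 5.12 feature bundle (say, etc.)

    package My::Widget 1.20;    # "package NAME VERSION" syntax is new in 5.12

    say __PACKAGE__, " is at version ", My::Widget->VERSION;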

Perl 6 released a real, honest-to-goodness release candidate. Rakudo Star, "a useable Perl 6 release," came out in June, aimed at "Perl 6 early adopters." Rakudo Star has seen monthly updates, most recently Rakudo Star 2010.11, released in November 2010. Rakudo Perl is a specific implementation of the Perl 6 language; the Rakudo Star 2010.11 release includes "release #35 of the Rakudo Perl 6 compiler, version 2.10.1 of the Parrot Virtual Machine, and various modules, documentation, and other resources collected from the Perl 6 community."

A year-and-a-half of the Perl Iron Man blogging project has seen a flurry of posts from nearly 250 perl bloggers! We've seen advocacy, snippets, whining, and community. I've seen a lot more Japanese language perl posts -- folks happy to use perl, python and ruby and pull the best from each.

I now find it strange and unsettling to meet self-proclaimed perl programmers who don't use Moose. If you haven't played with it (and it does feel like playing; it's liberatingly fun), go do so now. I'll wait.
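
If you want a taste before clicking away, here is about the smallest useful Moose class I can write (plain Moose, nothing exotic; the class itself is made up):

    package Point;
    use Moose;

    has 'x' => ( is => 'rw', isa => 'Int', default => 0 );
    has 'y' => ( is => 'rw', isa => 'Int', default => 0 );

    sub to_string {
        my $self = shift;
        return sprintf '(%d, %d)', $self->x, $self->y;
    }

    __PACKAGE__->meta->make_immutable;

    package main;
    my $p = Point->new( x => 3, y => 4 );
    print $p->to_string, "\n";    # prints (3, 4)
    $p->x('apple');               # dies: type constraint violation

Declarative attributes, free accessors, and type checking, with none of the blessed-hashref boilerplate -- that's the liberating part.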

I don't know about you, but I just switched from one startup using perl to another startup using perl. Awesome perl folks are hard to find; they're mostly already busy doing work they love. Why are we using perl? Because perl works, it scales with developer time, and perl is beautiful.

Piers mentioned frameworks -- yes, individual frameworks are important, but the vast armada of options available on CPAN as a whole provides an immense multiplier on developer productivity. It's so massive it's easy to overlook -- doesn't everyone have a massive, distributed, user-written body of code with excellent testing methodology available at the touch of a button?

Merry Christmas to all!

[...snip...]
However, if you look at the good parts (O'Reilly haven't announced "Perl: The Good Parts", but it's a book that's crying out to be written), there's a really nice language in there. Arguably there's at least two. There's the language of the one-liner, the quick throwaway program written to achieve some sysadmin related task, and there's the more 'refined' language you use when you're writing something that is going to end up being maintained.

I think it's this split personality that can put people off the language. They see the line noise of the one liner school of programming, the games of Code Golf (originally called Perl golf, the idea spread), the obfuscated Perl contests, the terrible code that got written by cowboys and people who didn't know any better in the dotcom bubble (you can achieve a surprising amount with terrible Perl code, but you will hit the wall when you try and change it) and they think that's all there is.

But there is another Perl. It's a language that runs The Internet Movie Database, Slashdot, Booking.com, Vox.com, LiveJournal and HiveMinder. It's a language which enables people to write and maintain massive code-bases over years, supporting developers with excellent testing and documentation. It's a language you should be considering for your next project. It's also something of a blue sky research project - at least, that's how some people see Perl 6.

----http://www.h-online.com/open/features/Healthcheck-Perl-The-Perl-Future-746527.html

Wednesday, December 15, 2010

A story of one man's journey to Vim Nirvana (Vimvana). For those of you stuck on Monday, keep trying; you'll make it.
I was watching a violinist bow intensely and I had this thought: I probably have as many brain cells devoted to my text editor as he does to playing his chosen instrument. Is it outlandish to imagine that an MRI of his brain during a difficult solo wouldn’t look much different than mine while manipulating code in vim?

Consider, if you will, the following montage from one vimmer’s journey.

---- http://kevinw.github.com/2010/12/15/this-is-your-brain-on-vim/

Tuesday, December 14, 2010

Perl Advent Calendars

It's Advent Calendar time in the perl ecosystem! Start each day with a delicious treat of knowledge.

I've found a half dozen English-language perl advent calendars, starting with the original Perl Advent Calendar. For extra fun I've included another half dozen Japanese-language calendars -- I can still read the perl; it's just the prose that is lost in translation.

Perl Mongers Perl Advent calendar
http://perladvent.pm.org/2010/
Catalyst Advent Calendar -- The Catalyst Web Framework
http://www.catalystframework.org/calendar/
Perl Dancer -- the Dancer mini web framework
http://advent.perldancer.org/2010
Ricardo's 2010 advent calendar -- a month of RJBS
http://advent.rjbs.manxome.org/2010/
UWE's advent calendar - a cpan module every day.
http://www.perl-uwe.com/
Perl 6
http://perl6advent.wordpress.com/
Last Year's Plack calendar
http://advent.plackperl.org/
For the adventurous: Japanese Perl Advent Calendars, 8 different tracks!
http://perl-users.jp/articles/advent-calendar/2010/
Hacker Track
Casual Track
English Track
Acme Track
Win32 Track
Meta Advent Calendar Track
Symbolic Programming Track
perl 6

One bonus list, for the sysadmin in your life:

SysAdvent - The Sysadmin Advent Calendar.
http://sysadvent.blogspot.com/

Monday, December 13, 2010

Vimana : cpan module to automate vim plugin installation

VIMANA! The Vim script manager. A cpan module for downloading and installing vim plugins! It works with .vim files, archive files (zip, rar), and vimball formats. By c9s / cornelius / Yo-An Lin. caveat: the "installed" command only recognizes plugins installed via vimana.

Cornelius's perl hacks on vim presentation has been on slideshare for two years. It covers "why you should improve your editor skills" -- ranging from "stop moving around with the arrow keys" to advanced commands and plugins. He's written quite a few vim plugins and quite a lot of cpan modules. Props!

Vimana Example:

% cpan Vimana
% vimana search nerd
nerd-tree-&-ack     [utility]      Adding search capability to NERD_Tree with ack
the-nerd-tree       [utility]      A tree explorer plugin for navigating the filesystem
nerd-tree-project   [utility]      It tries to find out root project directory, browse project file with NERD_tree.
the-nerd-commenter  [utility]      A plugin that allows for easy commenting of code for many filetypes.
findinnerdtree      [utility]      Expands NERDTree to file in current buffer

% vimana info the-nerd-tree
#... shows the install instructions ...

% vimana install the-nerd-tree
Plugin will be installed to runtime path: /home/andrew/.vim
Package the-nerd-tree is not installed.
Downloading plugin
.
 - Makefile : Check if makefile exists. ...not ok
 - Meta : Check if 'META' or 'VIMMETA' file exists. support for VIM::Packager. ... - Rakefile : Check if rakefile exists. ...not ok
Package doesn't contain META,VIMMETA,VIMMETA.yml or Makefile file
Copying files...
/tmp/yohBObI3iy/ => /home/andrew/.vim
Updating helptags
Done

There are quite a few plugins mentioned in the presentation. I've listed the ones I'm interested in below. I'll be installing and reviewing them soon. Slide 120 begins a nice section on advanced movement keys. Slide 135 has a list of the variables controlling the perl syntax highlighter.

Perl folding variables:

" set :help folding for more information
:set foldmethod=syntax              " enable syntax based folding
let perl_include_pod=1              " fold POD documentation.
let perl_extended_vars=1            " for complex things like @{${"foo"}}
let perl_want_scope_in_variables=1  " for something like $pack::var1
let perl_fold=1                     " enable perl language based folding
let perl_fold_blocks=1              " enable folding based on {} blocks
---- slide 135

Vim Plugins:

Exciting vim plugins to check out from vim.org. Reviews and usage information to come in future posts. I'm excited to try Cornelius's updated OmniCompletion helpers.

perlprove.vim
http://www.vim.org/scripts/script.php?script_id=1319
How does this compare with efm_perl.pl, included with vim?
DBExt.vim
http://www.vim.org/scripts/script.php?script_id=356
Run database queries from inside vim. I skipped past this the first time around, but now I see that it will let you copy the query directly from your source language, apply that language's string mechanics to get the output string, and then prompt for bound variables. Interesting!
FuzzyFinder
http://www.vim.org/scripts/script.php?script_id=1984
Allows a handy shorthand mapping to search for files/buffers/etc.
the NERD tree
http://www.vim.org/scripts/script.php?script_id=1658
updated file listing explorer
You're already using this, right?
the NERD commenter
http://www.vim.org/scripts/script.php?script_id=1218
improved commenting, under current development.
taglist
http://www.vim.org/scripts/script.php?script_id=273
a ctags integration that shows tag information for the current file / source code browser
"The most downloaded and highest rated plugin on vim.org"
BufExplorer
http://www.vim.org/scripts/script.php?script_id=42
Buffer Explorer / Browser -- easily switch between buffers.
Git-Vim
https://github.com/motemen/git-vim
Git commands within vim.
HyperGit
http://www.vim.org/scripts/script.php?script_id=2954
a git plugin for vim ( with a git tree menu like NERDtree ), by c9s.
screenshot
Updates to vim omnicompletion for perl:
http://www.vim.org/scripts/script.php?script_id=2852
Demonstration video
https://github.com/c9s/perlomni.vim
AutoComplPop
http://www.vim.org/scripts/script.php?script_id=1879
Automatically opens popup menu for completions

TEDxCALTECH -- Friday, January 14, 2011

"Feynman's Vision -- The next 50 years"

TEDx, the independent TED event series, is coming to Caltech in January. TEDx events are inspired by TED and use the same plans and speaking formats. I'm surprised I haven't heard more buzz about this event. I wasn't able to get into TEDxUSC, the first of the TEDx events. I'll be volunteering for the event and hope to see you there!

Will we see coverage of Feynman's vision of the "race to the bottom" -- his challenge to his engineering and scientist peers to work together and compete to see how small we can go in nanotech? Will anyone from Professor Tai's micro-machine lab be speaking? I'm sure we'll see interesting wetware (bio/cs/eng) research, given the Institute's focus on Bio over the past decade.

You won't know until you go! (Or until you check the speakers tab)

Feynman's Vision -- The next 50 years.

TEDxCaltech is a community of people who are passionate about sharing "Feynman’s Vision: The Next 50 Years." If that sounds like something you want to be a part of, complete and submit the application below. Due to limited venue space, we cannot approve all applicants instantaneously. If you are approved, you will receive an email shortly either inviting you to register for the event, or letting you know that you are on the waiting list. The registration fee is $25 for Caltech students; $65 for Caltech faculty, staff, postdocs, alumni and JPL; and $85 for all others. The all-inclusive day will begin with breakfast and will be punctuated with generous breaks for food and conversation with fellow attendees. It promises to be an exciting and entertaining intellectual adventure—a time to unplug from the day-to-day routine. You won’t want to miss a minute!

Registration opens at 8:00 am, doors open at 9:30 am, talks conclude at 6:00 pm and will be followed by a reception. Parking is free.
-- http://www.tedxcaltech.com/apply

Thursday, December 9, 2010

perl, tags, and vim : effective code browsing.

Adding a tags file to vim makes for an effective code browser for perl.

I've just started a new job, so I have a large new repository of perl code and modules to familiarize myself with. I've taken this as an opportunity to refresh my tag-fu in vim. After creating a tag file with ctags (exuberant ctags), I can now jump around my whole perl repo from within my vim session.

The -t command-line flag to vim opens files by tag (module) name, e.g. vim -t My::Module::Name. Within a vim session, I jump to the definition of a function by hitting ctrl-] with the cursor over a usage of the function, even if that definition is in another file! Ctrl-t and I'm back where I started.

Today I found the qualified-tags option to ctags (--extra=+q), which adds a fully qualified tag for package methods, e.g. My::Package::method_1. This helps with long package names and the ctrl-] key. FTW!
I have this set of ctags flags aliased as "ctagit":


ctags -f tags --recurse --totals \
--exclude=blib --exclude=.svn \
--exclude=.git --exclude='*~' \
--extra=+q \
--languages=Perl \
--langmap=Perl:+.t

In my .vimrc file, I defined the tags search path as ./tags,tags,~/code/tags; this will look for a tags file in the directory of the loaded file, then the current directory, and then a hardcoded path in my code area.

" [ .vimrc file ] " set tag search path: directory of current file, current working directory, hard-path set tags=./tags,tags,~/code/tags

More info on using tags in vim is available in :help tags. I've found the following commands useful.

ctrl-]          jump to tag under the cursor
visual ctrl-]   jump to tag from visual mode
:tag tagname    jump to tag tagname
:stag tagname   split screen and open tagname
:tags           show the tag stack, '>' labels the current tag
ctrl-t          jump to [count] older entry in the tag stack
:[count]pop     jump to [count] older entry in the tag stack
:[count]tag     jump to [count] newer entry in the tag stack

Update:
To configure vim to treat ':' (colon) as part of the keyword to match Long::Module::Sub::Module package names, add it to the iskeyword setting. I have multiple perl filetype hooks stored in files in .vim/ftplugin/perl/. These filetype hooks are enabled with the filetype plugin on directive in my main .vimrc file.

" [ .vimrc file]
"enable loading the ftplugin directories
filetype plugin on
" [ .vim/ftplugin/perl/keyword_for_perl file]
" Append : to the list of keyword chars to allow completion on Module::Names
set iskeyword+=:

The same effect could be conjured directly in the .vimrc file via autocmd:

" append colon(:) to the iskeyword list for perl files, to enable Module::Name completion.
autocmd FileType perl set iskeyword+=:

My configuration files are available in the spazm/config git repository on github.

Monday, November 29, 2010

NoNaNoWriMo


Seems I have just said "no," to blogging in November.  I have not been off writing the great American novel.
This drops me back to  paperman status for perl ironman blog competition.  Time to start back up?

Friday, October 29, 2010

naptime

Siesta! Nap time! Plenty of studies show that people who nap are more productive and more creative. We all napped in kindergarten. Why is it that adult naps are dismissed by corporate and polite society? Rather, we feed the endless caffeine addiction that powers the modern factory mentality. Drucker shows us that knowledge workers are paid for their thought, insight and productivity not for their time.

It's time to let them nap and see if effectiveness improves.

"Knowledge worker productivity is the biggest of the 21st century management challenges. In the developed countries it is their first survival requirement. In no other way can the developed countries hope to maintain themselves, let alone to maintain their leadership and their standards of living."
---- Peter Drucker, Management Challenges for the 21st Century (1999)

"We know now that the source of wealth is something specifically human: knowledge. If we apply knowledge to tasks we already know how to do, we call it 'productivity'. If we apply knowledge to tasks that are new and different we call it 'innovation'. Only knowledge allows us to achieve these two goals."
---- Peter Drucker, Managing for the Future (1992)

Simba (my dog) is on his second nap of the day. I think I'll join him for this 2pm siesta and see how it affects the rest of my day and outlook. Night night!

Wednesday, October 27, 2010

Announce: Net::HTTP::Factual

Factual hosts community-editable data on a variety of topics, and hopes to be your one-stop shop for wiki-data. They provide an element of "truthiness" for rows, to show whether the data has been verified, plus some other interesting ideas.

They hosted a contest at Startup Weekend LA last week, with the top team winning Kindles for each member. I didn't win the contest[*], since I got pulled onto a different team and didn't finish my factual thingy (clearly I didn't define my factual thingy either). I did create a quick cpan module to wrap the Factual ReST interface, Net::HTTP::Factual. I built it using Spore.

All I needed to do was translate the API documentation into json for Spore to load. I also needed to work out the differences between the Spore v0.2 spec format and the v0.1 documentation. The demonstration spec files in the Spore github are in the v0.1 format. These are the problems of using bleeding edge alphaware -- Spore v0.1 came out Oct 12, v0.2 on Oct 14, and I was using them Oct 16.

* : I didn't win the contest, but it did remind me that I wanted a Kindle. So I went home and ordered one. It arrived two days ago, and I'm 80 pages into my first book. The screen is gorgeous. I'm reading a PDF of The Mythical Man-Month, and while the PDF handling isn't perfect, it works well -- it flows quite nicely reading from a rotated screen. Props!

LA.pm November meeting: Wed Nov 17 @Media Temple

Dearest Mongers,

See you in a few weeks at the November meeting, generously hosted by Media Temple. With our early October meeting, it feels like such a long time in between; I'm missing you all.

I hope you can make it!

What:     Los Angeles Perl Mongers meeting
Date:     Wednesday, November 17
Time:     7-9pm.
Location: Media Temple.

topics and directions will be posted at la.pm.org

--Andrew

your data is your program.

Much more often, strategic breakthrough will come from redoing the representation of the data or tables. This is where the heart of a program lies. Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowcharts; they'll be obvious.
--The Mythical Man-Month

In theory, I read The Mythical Man-Month almost 20 years ago for CS1. At the time, his historical examples seemed so dated, talking about monthly rental of kilobytes of RAM and OS/360 and PDP-8 and PDP-11 design. What could that teach me about cranking out modern C code?

As I reread it now, I realize how timeless his examples of software as an engineering discipline are. Problems we still haven't solved -- how to maintain the vision of a small number of minds over a project that requires many, many hands.

I wish I could have understood his words back then, but wisdom is not so easily gained.

Experience is a dear teacher, but fools will learn at no other.
-- Benjamin Franklin

Dost thov love life? Then do not sqvander time
[ for that is what life is made of ]
-- Benjamin Franklin, as inscribed in Blacker Hovse

Friday, October 8, 2010

GRT : Guttman Rosler Transform

The Guttman-Rosler Transform is a technique from Uri Guttman and Larry Rosler for improving sort speed in perl.

The Guttman-Rosler transform runs faster than the orcish or Schwartzian transforms by avoiding custom sort subroutines. This is accomplished via pre- and post-transformation of the list, which allows the native optimizations of perl's sort function to shine.

Sort::External has a special affinity for the GRT or Guttman-Rosler Transform, also known as the "packed-default" sort. The algorithm is examined at length in "A Fresh Look at Efficient Perl Sorting", a paper by Uri Guttman and Larry Rosler, located as of this writing at http://www.sysarch.com/perl/sort_paper.html.

For many applications, the GRT is the most efficient Perl sorting transform. This document explores how to use it to solve common coding problems in conjunction with either Perl's built-in sort() function or Sort::External.

-- Sort::External::Cookbook

Synopsis:
  1. prefix each element with a synthetic key (string or numeric).
  2. sort.
  3. remove the prefix key from each element.

Example:

    
    my %colors = (
        "Granny Smith"     => "green",
        "Golden Delicious" => "yellow",
        "Pink Lady"        => "pink",
    );

  #A) standard sort:
    my @sorted_by_color = sort { $colors{$a} cmp $colors{$b} } keys %colors;

  #B) GRT sort:
    # 1. Encode
    my @sortkeys;
    while ( my ( $variety, $color ) = each %colors ) {
        push @sortkeys, sprintf( "%-6s%-16s", $color, $variety );
    }
    # @sortkeys = (
    #     "green Granny Smith    ",
    #     "yellowGolden Delicious",
    #     "pink  Pink Lady       ",
    # );

    # 2. Sort
    my @sorted_by_color = sort @sortkeys;    # no sortsub!

    # 3. Decode
    @sorted_by_color = map { substr( $_, 6 ) } @sorted_by_color;
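
The cookbook example above uses fixed-width strings; for numeric keys the usual GRT idiom (my own sketch, not from the cookbook excerpt) is to pack the number into a fixed-width big-endian string, so that lexical order matches numeric order:

    my %age = ( alice => 31, bob => 7, carol => 105 );

    my @youngest_first =
        map  { substr( $_, 4 ) }                        # 3. strip the 4-byte key
        sort                                            # 2. plain lexical sort, no sortsub
        map  { pack( 'N', $age{$_} ) . $_ } keys %age;  # 1. prefix packed numeric key

    # @youngest_first is qw( bob alice carol )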

Links:

Research Paper from Uri Guttman and Larry Rosler:
http://www.sysarch.com/Perl/sort_paper.html
Sort::Maker on cpan:
http://search.cpan.org/perldoc?Sort::Maker
Sort::External::Cookbook
http://search.cpan.org/perldoc?Sort::External::Cookbook

Thursday, October 7, 2010

Aloha Rubicon!

From:    Andrew Grangaard
To:      Rubicon Project
Subject: Aloha Rubicon

Dearest Rubicon Coworkers,

I have moved on from the Rubicon Project, saddened that opening new doors requires closing old ones. A closing of two wonderful years at tRP -- challenges bested, bonds forged and friends earned.

I will miss you all and our shared sense of wonder and excitement at preparing to attack the sleeping google giant. I hope my two years of improvements to infrastructure and fundamentals will aid you in the days and years ahead.

Pacing, RTS, click tracking, proration, scheduling, scheduler v2, scheduler v3, aggregation, merging, hadoop, datacube, svn branch maintenance and svnmerge merging, TDD, unit testing, jira tickets and wiki pages, reproducibility, deployment, quality assurance ... I'll miss all my initiatives and projects (I won't miss you, aggregator-multithreaded.pl !!)

Thanks and adieu to my wonderful teammates from my two years on the core team -- Masao, Bill, Peter, Ian, Subash, Damien, David, Jonathan, Willis, Yushik -- and my core team leads: Mark Douglas, Damien, Mon-Chaio, Damien(x2), Mon-Chaio(x2), Karim, Duc(x2) and most recently Sam. Core work is challenging and difficult. I salute the dedication and devotion you have all shown to our team in battling the twin complexities of our problem domain.

James, thanks for working with me during my first week when I was pitching in to help the UI team. Subash, you were a wonderful first-week buddy. I hope I was able to help other newRubies the way you helped me. Jonathan, you've been a great desk neighbor and programming pair; even when frustrated you're still unassailably upbeat.

RI team, thanks for fighting to put me on your squad and then graciously loaning me back to core for the Okanagan push.

Mallory, our Den Mother and Cruise Director, you've done an impressive job of adapting the culture over the two-and-a-half population doublings we've seen in my two years.

Frank and Craig, thanks for always telling it like it is and welcoming my questions. I've never seen this level of internal transparency. I know you'll keep the ship on a steady course.

To my many friends and acquaintances on the other teams: you are family. Look me up any time. Life is so different without you all around everyday. Thanks for always bringing a smile to my face.

Thanks to the many of you that I leaned on over the past 13 months through my divorce. Especially to those who did not know the circumstances, yet supported me anyway because that's just what awesome people do. Your strength and support have been invaluable.


Aloha,
Andrew

PS. "woof :(" -- Simba

Reading is FUNdamental

I am currently reading three CS books. The differences in styles and approaches are striking, but they are all effective in teaching.

Structure and Interpretation of Computer Programs aka SICP.
Classic text on computer science and functional programming, using Scheme as the implementation language. Used in 6.001 at MIT.
Programming Collective Intelligence
Interesting algorithms for mining the huge datasets available in the web era. Uses python as the implementation language. Working through the first three chapters, the first thing I did was rewrite the examples in perl. Then I found I wasn't the only one to do so.
Modern Perl
chromatic has written a lovely book on Modern Perl, and shared the review copy for comments/editing. It was pretty sweet to email in some typos and receive a link back to an updated git repository with my changes applied.

Friday, October 1, 2010

LA.PM hits the road for October meeting

Los Angeles Perl Mongers meeting for October will be held downtown, hosted by Oversee.net. Randal Schwartz is gathering speakers and handling the Oversee details. We are meeting early too, on the second Wednesday. See you October 13th!

send email to the list if you are interested in carpooling from the westside.

la.pm.org

Wednesday, September 22, 2010

September Los Angeles Perl Mongers

Game on!

I'm too sick to be here, but I came into the office tonight to run the meeting. Mad props to Tommy for driving down from Westlake to present tonight. Aran is also sick, so he's bailing on presenting. His presentation "12 cpan modules in 12 penta minutes" may well be cursed.

Big crowd tonight, 20+.

Monday, September 20, 2010

convert to CPAN Testers 2.0 (CPANTS)

I woke up to find a bevy of "Mail Delivery Failure" messages in my inbox. Seems the cpan-test reports I emailed in bounced back because CPAN Testers 2.0 dropped support for incoming email reports in favor of http. I'm excited to hear about this http switch, as I hated not being able to send test reports from machines that lacked email configuration.
This message was created automatically by the mail system (ecelerity).

A message that you sent could not be delivered to one or more of its recipients. This is a permanent error. The following address(es) failed:

>>> cpan-testers@perl.org (after RCPT TO): 550 cpan-testers no longer accepts test submissions via email. Please convert to CPAN Testers 2.0 and the http submission method. Instructions are at:
   http://wiki.cpantesters.org/wiki/QuickHowToCT20.

Let's follow the wiki article: http://wiki.cpantesters.org/wiki/QuickHowToCT20 and get upgraded!

Upgrade steps for a current CPAN::Reporter user:

  1. Upgrade CPAN::Reporter, add Test::Reporter::Transport::Metabase module
    # check current version:
    % perl -MCPAN::Reporter -l -e 'print $CPAN::Reporter::VERSION'
    1.1711
    #upgrade
    % cpanm CPAN::Reporter
    ...
    Successfully installed CPAN::Reporter
    % cpanm Test::Reporter::Transport::Metabase
    ...
    Successfully installed Test-Reporter-Transport-Metabase-1.999008
  2. create a profile using 'metabase-profile', put it into location.
    % metabase-profile
    Enter full name: ...
    Enter email address: ...
    Enter password/secret: ...
    Writing profile to 'metabase_id.json'
    % mkdir ~/.cpantesters
    % mv metabase_id.json ~/.cpantesters
    % chmod 400 ~/.cpantesters/metabase_id.json
  3. upgrade my ~/.cpanreporter/config.ini file to add a transport line
    #add transport line to my ~/.cpanreporter/config.ini file
    % echo 'transport = Metabase uri https://metabase.cpantesters.org/api/v1 id_file ~/.cpantesters/metabase_id.json' >> ~/.cpanreporter/config.ini
  4. Test
    cpan Hadoop::Streaming
    ...
    CPAN::Reporter: Test result is 'pass', All tests successful.
    CPAN::Reporter: preparing a CPAN Testers report for Hadoop-Streaming-0.102520
    CPAN::Reporter: sending test report with 'pass' via Metabase
    ...
  5. Verify Test : Check metabase tail log for my entry.
    % wget --quiet http://metabase.cpantesters.org/tail/log.txt -O- | grep -i grangaard
    [2010-09-20T20:47:06Z] [Andrew Grangaard] [pass] [SPAZM/Hadoop-Streaming-0.102520.tar.gz] [i486-linux-gnu-thread-multi] [perl-v5.10.1] [3acd2e9e-c4f8-11df-b898-64160c3e84b1] [2010-09-20T20:47:06Z]
Dr. Frankenstein, It lives!

Tuesday, September 7, 2010

Hadoop::Streaming PAUSE registration submitted

Submitted a PAUSE (Perl Authors Upload SErver) request to register Hadoop::Streaming in the User Interface tree at CPAN. I wasn't really sure which top-level category to put it in, but settled on UI as it provides a simple adaptation of the Streaming interface of Hadoop.

Woo, my first registered module space. Update: oooh, brian d foy!

On Wed, Sep 08, 2010 at 06:50:34AM +0200, Perl Authors Upload Server wrote:
> 
> The next version of the Module List will list the following module:
> 
>   modid:       Hadoop::Streaming
>   DSLIP:       RdpOp
>   description: simple interface to Hadoop Streaming
>   userid:      SPAZM (Andrew Grangaard)
>   chapterid:   8 (User_Interfaces)
>   enteredby:   BDFOY (brian d foy)
>   enteredon:   Wed Sep  8 04:50:33 2010 GMT
> 
> The resulting entry will be:
> 
> Hadoop::
> ::Streaming       RdpOp simple interface to Hadoop Streaming         SPAZM

Monday, September 6, 2010

Hadoop::Streaming 0.102490 pushed to CPAN

I've pushed a new release of Hadoop::Streaming to CPAN. It should be available in a couple of hours, depending on how long it takes your CPAN mirror to do the mirror update dance.

The release includes expanded documentation in the base Hadoop::Streaming placeholder file. Also included is a Hadoop::Streaming::Combiner role, for creating combiners. Combiners are like reducers that run post-map, per-merge. One can reuse the reducer as the combiner, if the reducer produces the same key/value format on output as input.
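
As a reminder of what the mapper/reducer classes look like, here's a rough wordcount mapper sketch. The role and method names (map, emit, run) are from memory of the wordcount example in the test suite, so check the POD before copying:

    #!/usr/bin/env perl
    package WordCount::Mapper;
    use Moose;
    with 'Hadoop::Streaming::Mapper';

    sub map {
        my ( $self, $line ) = @_;
        $self->emit( $_ => 1 ) for split /\s+/, $line;
    }

    package main;
    WordCount::Mapper->run;

A combiner class consumes the Hadoop::Streaming::Combiner role in the same style; since the wordcount reducer emits the same key/value shape it consumes, the same counting logic can do double duty as the combiner.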

After writing my new documentation, tests and code, I tested it with dzil test. After passing tests, it's a simple one-step push to CPAN and github via dzil release. AWESOME! Dist::Zilla makes maintaining CPAN modules brilliantly easy.

Happy Labor Day!

Links

CPAN - Comprehensive Perl Archive Network
http://search.cpan.org
Hadoop::Streaming perl modules
http://search.cpan.org/perldoc?Hadoop::Streaming
Dist::Zilla
http://search.cpan.org/perldoc?Dist::Zilla

There's no Step Two!

[andrew@mini]% dzil release                                1 ~/src/hadoop-streaming-frontend
[DZ] beginning to build Hadoop-Streaming
[DZ] guessing dist's main_module is lib/Hadoop/Streaming.pm
[DZ] extracting distribution abstract from lib/Hadoop/Streaming.pm
[DZ] writing Hadoop-Streaming in Hadoop-Streaming-0.102490
[DZ] writing archive to Hadoop-Streaming-0.102490.tar.gz
[@Basic/TestRelease] Extracting /home/andrew/src/hadoop-streaming-frontend/Hadoop-Streaming-0.102490.tar.gz to .build/dVEDcaew44
Checking if your kit is complete...
Looks good
Writing Makefile for Hadoop::Streaming
cp lib/Hadoop/Streaming.pm blib/lib/Hadoop/Streaming.pm
cp lib/Hadoop/Streaming/Combiner.pm blib/lib/Hadoop/Streaming/Combiner.pm
cp lib/Hadoop/Streaming/Role/Emitter.pm blib/lib/Hadoop/Streaming/Role/Emitter.pm
cp lib/Hadoop/Streaming/Reducer/Input/ValuesIterator.pm blib/lib/Hadoop/Streaming/Reducer/Input/ValuesIterator.pm
cp lib/Hadoop/Streaming/Reducer.pm blib/lib/Hadoop/Streaming/Reducer.pm
cp lib/Hadoop/Streaming/Reducer/Input/Iterator.pm blib/lib/Hadoop/Streaming/Reducer/Input/Iterator.pm
cp lib/Hadoop/Streaming/Role/Iterator.pm blib/lib/Hadoop/Streaming/Role/Iterator.pm
cp lib/Hadoop/Streaming/Reducer/Input.pm blib/lib/Hadoop/Streaming/Reducer/Input.pm
cp lib/Hadoop/Streaming/Mapper.pm blib/lib/Hadoop/Streaming/Mapper.pm
Manifying blib/man3/Hadoop::Streaming::Combiner.3pm
Manifying blib/man3/Hadoop::Streaming.3pm
Manifying blib/man3/Hadoop::Streaming::Role::Emitter.3pm
Manifying blib/man3/Hadoop::Streaming::Reducer::Input::ValuesIterator.3pm
Manifying blib/man3/Hadoop::Streaming::Reducer::Input::Iterator.3pm
Manifying blib/man3/Hadoop::Streaming::Reducer.3pm
Manifying blib/man3/Hadoop::Streaming::Role::Iterator.3pm
Manifying blib/man3/Hadoop::Streaming::Reducer::Input.3pm
Manifying blib/man3/Hadoop::Streaming::Mapper.3pm
PERL_DL_NONLAZY=1 /usr/bin/perl "-MExtUtils::Command::MM" "-e" "test_harness(0, 'blib/lib', 'blib/arch')" t/*.t
t/00-load.t ....... ok
t/01-wordcount.t .. 8/? # perl path -> /usr/bin/perl
t/01-wordcount.t .. ok
t/02-analog.t ..... ok
All tests successful.
Files=3, Tests=19,  3 wallclock secs ( 0.05 usr  0.01 sys +  2.34 cusr  0.20 csys =  2.60 CPU) 
Result: PASS   

[@Basic/TestRelease] all's well; removing .build/dVEDcaew44
*** Preparing to upload Hadoop-Streaming-0.102490.tar.gz to CPAN ***

Do you want to continue the release process? [y/N]: y
[@Git/Check] branch master is in a clean state
[@Basic/UploadToCPAN] registering upload with PAUSE web server
[@Basic/UploadToCPAN] POSTing upload for Hadoop-Streaming-0.102490.tar.gz
[@Basic/UploadToCPAN] PAUSE add message sent ok [200]
[@Git/Commit] Committed Changes
[@Git/Tag] Tagged v0.102490
[@Git/Push] pushing to origin

Thursday, September 2, 2010

github + cpan = gitpan

Gitpan is a clone of all the modules on cpan in git form, nearly twenty-two thousand public repositories. This is not a place for development of modules. Instead it is a place to easily pull the current source for a module, make a patch, and send it to the maintainer, without having to find where she maintains her golden copy.

I read about gitpan a while ago, but when I wanted to find it last week, I couldn't find the correct search terms. [github cpan] produces a list that doesn't include gitpan on the first page, as it is crowded out by the many perl modules developed on github for release to cpan, and of course things like Net::GitHub and GitHub::Import, and an interesting discussion at perlmonks on (informal) perl naming conventions for github projects.

Now that I know the name, it is still hard to find information! From the FAQ section of the readme:

What is gitPAN?
---------------
gitPAN is a project to import the entire history of CPAN (known as BackPAN) into a set of git repositories, one per distribution.

Why is gitPAN?
--------------
CPAN (and thus BackPAN) is a pile of tarballs organized by author. It is difficult to get the complete history of a distribution, especially one that has changed authors or is released by multiple authors (for example, Moose). Because releases are regularly deleted from CPAN even sites like search.cpan.org provide an incomplete history. Having the complete history of each distribution in its own repository makes the full distribution history easy to access.

gitPAN also hopes to make patching CPAN modules easier. Ideally you simply clone the gitPAN repository and work. New releases can be pulled and merged from gitPAN.

gitPAN hopes to showcase using a repository as an archive format, rather than a pile of tarballs. A repository is far more useful than a pile of tarballs, and contrary to many people's expectations, the repository is turning out smaller.

Finally, gitPAN is being created in the hope that "if you build it they will come". Getting data out of CPAN in an automated fashion has traditionally been difficult.

Where is gitPAN?
----------------
The repositories are on github.com at http://github.com/gitpan (watch out, it may overload your browser).

Code, discussion, and issues can be had at http://github.com/schwern/gitpan.

[...]

How can I contact gitPAN?
-------------------------

Email:   schwern+gitpan@gmail.com
Web:     http://github.com/gitpan/
Dev:     http://github.com/schwern/gitpan
Issues:  http://github.com/schwern/gitpan/issues
Twitter: #gitpan

Links:

google search for [github cpan]
http://google.com/search?q=github+cpan
google search for [gitpan]
http://google.com/search?q=gitpan
gitpan at github -- 21,976 public repositories and counting!
http://github.com/gitpan
Schwern's announcement of gitpan on his use.perl blog.
http://use.perl.org/~schwern/journal/39972
discussion of gitpan and code:
http://github.com/schwern/gitpan
gitpan issues:
http://github.com/schwern/gitpan/issues
a page with 4 links at integra.net
http://gitpan.integra.net/

Thursday, August 26, 2010

LA.pm august meeting prep (exclusive behind-the-scenes look!)

"Uh oh, an email from Aran, one of my presenters for tonight," I thought as I opened it with terpidation.
Tommy said to me yesterday "So, you're presenting tomorrow?" And I was like "I am?" And he was like "Ya." And then I was like "Dude, I gotta make my presentation!" And he was like "Burn!" And I was like "Its cool, should be an easy one to prep for." And he was like "Cool." And I was like "Cool." And then I asked "Are you presenting?" And he was like "I was, but I flaked and now Andrew is forced to present." And I was like "Burn!". And he was like "Ya man." And I was like "Cool." And he was like "Cool."

See ya in a couple hours!
Aran, 2:28pm

"Cool," with Tommy leaving me a late night voicemail on Tuesday to postpone his presentation, I was not looking for another presentation slot to fill. Happy dance! Aran always does lovely, polished talks with cute slides. What was I thinking, not checking my email until 5pm anyways?

What's this, he's replied to his own message? "Ruh roh!" said the alarm bells in my head as I opened it to read his retraction and cancellation from 2:48pm. There goes my plan of finishing (writing) my presentation during Aran's talk. "Burn!", to use the parlance of our times.

Dear Perlers,

Yes Laziness is a Virtue[1] but so is Hubris. "Get up and present!" I say Impatiently.

kthxbai,
Andrew (your la.pm[2] host and default presenter)

P.S. Please try and pick more sensible defaults.

--me

Links

  1. http://threevirtues.com/
  2. http://la.pm.org/

Thursday, August 19, 2010

LA.pm august meeting: Wed Aug 25

Just a quick reminder: next los angeles perl mongers meeting is Wednesday, August 25

The september meeting will be on Sep 22.

la.pm.org

Thursday, August 12, 2010

perl iron man posts missing

I've noticed my posts not making it into the iron man blog aggregator. I haven't been able to ascertain why they are being excluded. I've emailed the organizers and attempted to sign up a second time. No dice.

Any ideas?

Thursday, August 5, 2010

perldoc Completion in zsh

I just saw a link to a bash perldoc completion script: http://github.com/ap/perldoc-complete. I was confused by his need to write it, and then realized that bash hadn't cribbed the awesome completion for perldoc that zsh has. I thought there was pretty good parity after bash improved its system to (nearly) match zsh; apparently I was incorrect.

Checking out the git source for zsh and running git log on _perldoc, I found that perldoc completion has existed since at least 2001, when it was moved from User/_perldoc to Unix/_perldoc. Roxor!

And people wonder why "I <3 Zsh!".

[agrangaard@willamette]% git status _perldoc
# On branch master
nothing to commit (working directory clean)
[agrangaard@willamette]% git log _perldoc
commit 3d215fd53ed47cbec57283509b2b3c36165303dd
Author: Peter Stephenson 
Date:   Wed Aug 2 22:20:45 2006 +0000

    22579: find .pod files in include path for perldoc

commit 0ba8ae87eac21281e0b17eb9cbb523d133067a4a
Author: Oliver Kiddle 
Date:   Wed Jun 8 12:45:24 2005 +0000

    21315: make completion functions give precendence to descriptions passed as
    parameters and cleanup descriptons in calling functions

commit c3b929c6340834dacf7888a96ce505325c3a85af
Author: Oliver Kiddle 
Date:   Fri Feb 13 18:42:03 2004 +0000

    19418: update completions to new versions

commit 63b336243fdf5e60058472fa456ed11e75280189
Author: Oliver Kiddle 
Date:   Wed Jan 21 13:53:28 2004 +0000

    19387: add (-.) glob qualifier to globs where only files are directly applicable

commit 88fc3c951f3758acffcecc1e1016d23ca31a3560
Author: Sven Wischnowsky 
Date:   Mon Apr 2 12:00:14 2001 +0000

    moved from Completion/User/_perldoc

Everything in its place.

While trawling the Ironman perl aggregation[1], I found this lovely snippet linking to App::MiseEnPlace[2].

It is an example of "using source control to manage a homedir" and serves as an example of an App::Cmd[3] application -- specifically MooseX::App::Cmd[4]. Tommy, this one is for you!

PS. It has a nice set of tests[6] as well!

"Finishing my project/file/repository management tool."[2]
-- http://genehack.org/2010/08/stuff_im_working_on_august_2010/ [5]

Links:

  1. http://ironman.enlightenedperl.org
  2. http://github.com/genehack/App-MiseEnPlace
  3. http://search.cpan.org/perldoc?App::Cmd
  4. http://search.cpan.org/perldoc?MooseX::App::Cmd
  5. http://genehack.org/2010/08/stuff_im_working_on_august_2010/
  6. http://github.com/genehack/App-MiseEnPlace/tree/master/t

Tuesday, August 3, 2010

Iron Man Badges have returned!

Woohoo! Iron man badges have returned.

My badge listing page shows all the images as the same. I'm in the system as both "First Last" and "FirstLast"; I wonder which is up-to-date? I haven't seen my posts in the iron man stream for a while.

AndrewGrangaard:

Andrew Grangaard:

Sign up now for the Perl Iron Man blogging Challenge!

RJBS presentation on App::Cmd

Writing Modular Commandline Apps with App::Cmd
RJBS looked into writing command line apps and didn't find many options. (Just one?!) What happened to TMTOWTDI? So he wrote App::Cmd. I've been looking at this for a few weeks and finally got a chance to dig into it today.

I'm happy to see that there is a ::Simple version for writing 'single command' commands, which seems a nice way to get started.
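
Here's roughly what that looks like, loosely following the App::Cmd::Simple docs -- the package name and options are invented, so treat it as a starting sketch rather than gospel:

    # greet.pl -- hypothetical single-command app
    package Greet;
    use strict;
    use warnings;
    use base qw( App::Cmd::Simple );

    sub opt_spec {
        return (
            [ 'name|n=s', 'who to greet', { default => 'world' } ],
            [ 'shout|s',  'greet loudly' ],
        );
    }

    sub execute {
        my ( $self, $opt, $args ) = @_;
        my $greeting = 'Hello, ' . $opt->name . '!';
        $greeting = uc $greeting if $opt->shout;
        print "$greeting\n";
    }

    package main;
    Greet->run;

Running "perl greet.pl --shout -n mongers" should print "HELLO, MONGERS!".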

I hope to get my example app written and pushed to cpan soon.

Thursday, July 29, 2010

App::cpanoutdated for keeping a local::lib CPAN install up-to-date

A new version of App::cpanoutdated was released yesterday which adds new -l and -L options to the bundled binary, cpan-outdated. These flags work the same as -l and -L in cpanminus, pointing to a local::lib-controlled directory.

--compare-changes shows the diff of the Changes file between releases. That's awesome.

This command will update all of the out-of-date modules in /apps/perl5, which is a local::lib-controlled directory on my system:

cpan-outdated -l /apps/perl5 | xargs cpanm -l /apps/perl5

--compare-changes example

cpan-outdated --compare-changes | head -20
S/SA/SARTAK/Any-Moose-0.13.tar.gz
0a1,23
> 0.13  Wed 19 May 2010
>   * Add load_first_existing_class (gfx)
> 
> 0.12  Fri 02 Apr 2010
>   * t/000-version.t for better diagnostics (tokuhirom)
>   * Slight performance improvements for is_class_loaded,
>       lazily loading Carp, etc (Sartak)
>   * Start some real documentation (Sartak)
>   * Document $ENV{ANY_MOOSE} (Sartak)
>     - fixes [rt.cpan.org #52339]
>   * Test that Moose is loaded, not CMOP (Sartak)
>     - fixes [rt.cpan.org #56093]
>   * Alias class_of and more functions (Sartak)
>     - requested by [rt.cpan.org #52275]

Usage:

cpan-outdated --help                
Usage:
        # print list of outdated modules
        % cpan-outdated

        # verbose
        % cpan-outdated --verbose

        # output changes diff
        % cpan-outdated --compare-changes

        # alternate mirrors
        % cpan-outdated --mirror file:///home/user/minicpan/

        # additional module path
        % cpan-outdated -I extlib/

        # install with cpan
        % cpan-outdated | xargs cpan -i

        # install with cpanm
        % cpan-outdated | xargs cpanm

Wednesday, July 28, 2010

July LA.pm meeting recap

Los Angeles Perl Mongers meeting for July 28th, 2010.

We had two speakers, Troy Will and Guy Shaw. This was Troy's first visit to LA.pm. Thank you both for presenting. Guy was a last-minute fill-in to cover when Aran took ill.

Troy talked about his weight-tracking project Getfit, written in perl. He also talked about GNU Stow, a perl-based tool for managing installed packages via symlink farms.

Guy talked about pe-cpp, a partial C-preprocessor evaluator for making simplified header files. pe-cpp produces "sliced" cpp header files and does partial evaluation of arithmetic expressions. It does not simplify data structures. This is a source-to-source transformation of C header files to create simpler C header files, compressed by collapsing the known options. Yacc/Bison style!

After a thorough analysis of the available projects in this space, he found he needed to write his own to work around the limitations of the others. It was originally written to assist C+asm integration, and it uses Sun's CTF for definitions of offsets and the like. "It's less scary than looking into h2ph," says Guy.

A good question: what happened to preprocessors? They seem to have fallen out of fashion; one camp views preprocessors as needed "only to cover up bad language design." Perhaps with Common Lisp and pure functional languages seeing a resurgence of interest, we will see more use of source-to-source transforms like this.

Convert::Binary::C may be useful to a similar audience, if one wants to use C data structure definitions within perl.

Tuesday, July 20, 2010

Hadoop::Streaming module updated and awesome

Version 0.101881 of my Hadoop::Streaming module is released on CPAN.

This is a bug fix release -- fixing bugs in my test suite! I had problems with errors from the smoke testers, even though it all "worked for me." To fix this I wrote my tests in a sane manner, so they weren't held together by "gravity and good luck." See Bug #59164 for the gory details. 

I'm happy and proud to report that version 0.101881 has 81 PASS and 0 FAIL results from cpants!

Now all I need is a well written tutorial on how to use it.

links
Bug #59164 for Hadoop-Streaming: Test failures on a variety of platforms
https://rt.cpan.org/Public/Bug/Display.html?id=59164
Hadoop::Streaming Module
Documentation
http://search.cpan.org/perldoc?Hadoop::Streaming
Module Overview:
http://search.cpan.org/search?q=Hadoop::Streaming
Test Reports:
http://www.cpantesters.org/show/Hadoop-Streaming.html#Hadoop-Streaming-0.101881
Open Bugs (0!):
https://rt.cpan.org/Public/Dist/Display.html?Name=Hadoop-Streaming
Git Repo:
http://github.com/spazm/hadoop-streaming-frontend

morning reading: Getopt::Long::Descriptive (GLD) and App::Cmd

"Getopt::Long::Descriptive - Getopt::Long, but simpler and more powerful"
Given a descriptive getopt specification, returns a usage summary and an object containing the set options.
I am a fan of this less is more approach of "simpler AND more powerful."
Getopt::Long::Descriptive
"App::Cmd - write command line apps with less suffering"
Simplify writing command line apps by breaking functionality into module(s) with a minimal runner script. Makes extending command line apps easy by adding additional sub-modules.
Uses Getopt::Long::Descriptive.
App::Cmd

Have you ever asked, "why do I have to document my command line options twice? Once in the usage message and once in the Getopt statement?" I know I was asking just last week at the Thousand Oaks Perl Mongers meeting. If you don't ask, is it because you consider your Getopt spec documentation? We discussed and compared various Getopt alternatives: Getopt::Long, Getopt::Euclid, Getopt::Clade, etc.

One interesting solution is Getopt::Euclid. Euclid parses specially formatted POD to create the Getopt configuration. This is an awesome idea, but it leaves me with two problems: 1) there is an extra step to create this parsing setup, and that means I can't test the exact code that my user will see; 2) it employs a source filter, which freaks me out. Perhaps the source filter has been upgraded to use one of the PPI parsing modules instead; that would be less scary than the black magic happening in source filters.

Getopt::Long::Descriptive (GLD) is built on Getopt::Long and takes a more verbose/descriptive input format. It returns a usage object and an object containing options and values. Unlike Euclid, it doesn't read or create perldoc/POD.

I think adding a podweaver+Getopt::Long::Descriptive plugin would be interesting. Creating a nicely documented POD section during the dzil build step of my Dist::Zilla based modules would be a big win. I just browsed the code for Getopt::Long::Descriptive::Usage, and it looks like adding a weaver transform would be possible, if I could figure out how to extract the usage code from the module during the build step. This may be easier to visualize once I start investigating App::Cmd.

Example
From the SYNOPSIS section:

use Getopt::Long::Descriptive;

my ($opt, $usage) = describe_options(
    'my-program %o <some-arg>',
    [ 'server|s=s', "the server to connect to"                  ],
    [ 'port|p=i',   "the port to connect to", { default => 79 } ],
    [],
    [ 'verbose|v',  "print extra stuff"            ],
    [ 'help',       "print usage message and exit" ],
);
print($usage->text), exit if $opt->help;
Client->connect( $opt->server, $opt->port );
print "Connected!\n" if $opt->verbose;

...and running my-program --help will produce:

  my-program [-psv] [long options...] <some-arg>
    -s --server     the server to connect to
    -p --port       the port to connect to
                  
    -v --verbose    print extra stuff
    --help          print usage message and exit

I'm quite fond of how server|s became -s --server, showing both the long and short forms of the arguments.

Go play with these modules and let me know if you start using them. Many thanks to their Authors for sharing their work on CPAN. You guys rock. I'm going to start using these modules and will report back when I have a better feel for them. This will likely become a future topic for the Los Angeles Perl Mongers (LAPM).

Links:

Getopt::Long::Descriptive
http://search.cpan.org/perldoc?Getopt::Long::Descriptive
App::Cmd
http://search.cpan.org/perldoc?App::Cmd
Getopt::Euclid
http://search.cpan.org/perldoc?Getopt::Euclid
Dist::Zilla
http://search.cpan.org/perldoc?Dist::Zilla
Pod::Weaver
http://search.cpan.org/perldoc?Pod::Weaver
PPI documentation
http://search.cpan.org/perldoc?PPI
Los Angeles Perl Mongers
la.pm.org
Thousand Oaks Perl Mongers
http://thousand-oaks-perl.org

Sunday, July 18, 2010

cpanminus with CPAN::Mini?

At the July Thousand Oaks Perl Mongers [1] open-discussion meeting, we discussed both CPAN::Mini [2],[3] and App::cpanminus [4],[5]. They both work well individually, but can they be combined?

I did an example install of CPAN::Mini[2] for the meeting. This was a learning experience for me, as I'd only briefly heard of it before and this was my first install. CPAN::Mini makes a minimal private mirror of CPAN[6], containing the current release version of every CPAN module! The idea is that you keep this mirror up-to-date, and anytime you need to install a module, you have it available. To keep the repository manageable, only the current version of each module is mirrored. Installation was super simple, the hardest part being polling the room for a preferred CPAN mirror. We settled on ASK's develooper.com.
% cpan CPAN::Mini
% mkdir -p $HOME/mirrors/cpan
% minicpan -L $HOME/mirrors/cpan -R http://cpan.develooper.com/

Tommy then discussed his four favorite modules-of-the-moment for perl5 development and usage, one of which was App::cpanminus[4]. I'd heard of this previously, but hadn't gotten around to testing it. It installs a binary, cpanm[5], that is used to install modules. It tries to just do the right thing: install the module and keep the output to a minimum. I especially liked the flag to install into a local::lib-maintained directory, which is useful in a sudo case where environment variables are annoying to pass.

I then tried to merge the two. First I installed App::cpanminus from my local CPAN::Mini mirror using cpan with the network unplugged. Success!

Then I tried to install a module with cpanm, and found that it was going to the network even though my repo was at a file:// url. Bummer!

I haven't gotten back to testing this, and a brief search found one comment from a user who reverts to cpan to use his CPAN::Mini mirror when he's away from the network. Does anyone have this working? Is it just part of the cpanminus magic that it looks online for the CPAN meta data?

My only reason to teach cpan.pm after that is to support CPAN::Mini and a local CPAN mirror. Thats the only reason I still use CPAN, I'm usually offline.
-- Pedro [7],[8]

Links

Thousand Oaks Perl Mongers
http://thousand-oaks-perl.org
CPAN::Mini
http://search.cpan.org/search?q=CPAN::Mini
minicpan
http://search.cpan.org/search?q=minicpan
App::cpanminus
http://search.cpan.org/search?q=App::cpanminus
cpanm
http://search.cpan.org/search?q=cpanm
C.P.A.N. -- the Comprehensive Perl Archive Network.
http://cpan.org
Pedro
http://www.simplicidade.org/
Modern perl comment
http://www.modernperlbooks.com/mt/2010/04/should-novices-prefer-cpanminus.html#comment-362

Saturday, July 17, 2010

ironman badges, I miss you

I miss my smiling ironman badge. It is a beautiful thing.

I know it's been months, but I still miss it. Am I the only one? It was so good at reminding me to write. And not just self referential ironman related posts.

mmm skyscraper I love you.
mmm skyscraper I love you.
mmm skyscraper I love you.
mmm skyscraper I love you.

thirty thousand feet above the earth. it's a beautiful thing.
and you're a beautiful thing.
thirty thousand feet above the earth. it's a beautiful thing.
everybody's a beautiful thing.

---- Underworld, Skyscraper, I love you

Lots of exciting happenings of late. I have two la.pm mongers write-ups in the todo list. Last week I hit two events: Wednesday was thousand-oaks.pm for an open rap session, and Tuesday was a great talk at the Hammer Museum at UCLA on what current internet use is doing to our brains, which left me rejuvenated and ready to spend some quality time focusing rather than skimming back and forth.

Thursday, July 8, 2010

DavMail gateway provides LDAP and WebDav interface to MS Exchange

Goal: Get calendar and ldap-lookup working in thunderbird to my corporate hosted exchange service.

Status: WORKING!

Tools:

DavMail
DavMail provides a gateway from open protocols to MS Exchange
davmail.sf.net
Lightning
Thunderbird plugin for calendaring, built on sunbird
Current version requires Thunderbird 3.1.
http://www.mozilla.org/projects/calendar/lightning/
Thunderbird
Email client.
Calendar client (with Lightning installed)
http://www.mozillamessaging.com/en-US/thunderbird/
Exchange Server
My office uses an exchange server hosted by AppRiver's Shoreline system.
http://www.appriver.com/exchange
Installation
  1. Install DavMail
    On my ubuntu system, I downloaded the deb from sourceforge and installed it manually with dpkg. I ran into a problem where I didn't have the proper swing libraries installed, and the partially installed DavMail package blocked the install of the swing libraries. In reality the fix was as simple as answering "Y" to "delete DavMail" and letting the swing install go through.
    1. #manually download deb from http://sourceforge.net/projects/davmail/files/
    2. sudo aptitude install sun-java-6 libswt-gtk-3.5-java
    3. sudo dpkg -i davmail_3.6.6-1032-1_all.deb

    The install integrated into the menu system, providing a menu entry to launch the app: Applications->Internet->DavMail.
  2. Upgrade/Install Thunderbird 3.1
    Thunderbird 3.1 is currently required for Lightning. I installed via the ubuntuzilla repository, as per their instructions. Caveat: only 32-bit builds are available.
    1. Create a new sources file /etc/apt/sources.list.d/ubuntuzilla.list containing:
      deb http://downloads.sourceforge.net/project/ubuntuzilla/mozilla/apt all main
    2. Then add the key, update and install:
      1. sudo apt-key adv --recv-keys --keyserver keyserver.ubuntu.com C1289A29
      2. sudo apt-get update
      3. sudo apt-get install thunderbird-mozilla-build
  3. Install Lightning into Thunderbird.
    This is most easily done directly through Thunderbird:
    1. Tools->Add-ons menu item.
    2. [Get Add-ons] button
    3. Type Lightning in the search box and press enter to search.
    4. push [Add to Thunderbird] button on the Lightning item.
    5. confirm install
    6. Restart Thunderbird
Configuration
The devil is in the details. Getting the correct configuration from our hosted exchange server took quite a bit of trial and error. The email system predates the current company, so I have at least three logins:
My primary email address
username @ rubiconproject.com
This is the address used to send and receive mail.
Same account at a different domain (the founder's vanity domain).
username @ addante.com
This is the address used to authenticate with the pop/imap servers and to access Outlook WebMail.
my username with internally assigned id number attached. No domain.
username_12345
Used internally. Will be used in our ldap setup.
  1. Launch and Configure DavMail
    1. Launch from the menus: Applications->Internet->DavMail, which puts a GNOME icon in the notification area by the clock.
    2. Open settings right-click DavMail icon->Settings...
    3. Set the OWA (Exchange) URL : https://exg3.exghost.com/Exchange
    4. Make note of the ports for CalDav (1080) and LDAP (1389), change if desired.
  2. Configure calendar in Thunderbird-lightning
    1. File->New->Calendar...
    2. choose [*] On the Network and [Next]
      1. choose [*] CalDav
      2. Location: http://localhost:1080/users/username@addante.com/calendar
      3. [Next]
    3. Configure the name, color and alarm settings as desired. Email should be your work email, aka username@rubiconproject.com. [Next]
    4. Provide login credentials, the same as if you were using the webmail tool; for me that is the username@addante.com account.
  3. Configure LDAP
    1. Thunderbird Edit->Preferences->Composition->Addressing
    2. [Edit Directories]
    3. [Add]
    4. Configuration settings. Base DN and Bind DN were tricky.
      • Name: proxy ldap
      • Hostname: localhost
      • BaseDN: ou=people
      • Port: 1389
      • Bind DN: username_12345
      • [OK]
    5. Test the connection (see also the Net::LDAP sketch after this list):
      1. select proxy ldap and press Enter
      2. [Offline] tab
      3. [Download Now].
      4. enter username@addante.com as username and correct password in the authentication box.
      5. if the download completes and "Replication Succeeded" appears, all is working.
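
For a quick sanity check of the LDAP proxy outside of Thunderbird, a few lines of Net::LDAP also work. This is only a sketch: the port and ou=people base match the configuration above, while the cn/mail attribute names, the search filter, and the 'secret' password are placeholders that may differ on your DavMail setup.

use strict;
use warnings;
use Net::LDAP;

# Connect to the DavMail LDAP proxy configured above.
my $ldap = Net::LDAP->new( 'localhost', port => 1389 )
    or die "can't connect to DavMail LDAP proxy: $@";

# Bind with the Bind DN from above ('secret' is a placeholder password).
my $mesg = $ldap->bind( 'username_12345', password => 'secret' );
die 'bind failed: ' . $mesg->error if $mesg->code;

# Look a coworker up by name (attribute names are assumptions).
$mesg = $ldap->search( base => 'ou=people', filter => '(cn=*smith*)' );
die 'search failed: ' . $mesg->error if $mesg->code;

printf "%s <%s>\n", $_->get_value('cn'), $_->get_value('mail') for $mesg->entries;

$ldap->unbind;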

And there you have it: working Exchange integration through open standards. Yay! It should only take 15 minutes or so to install now (after my hour of putzing around with it and three hours writing it up).

I now have LDAP lookup and CalDav alerts functioning. I haven't delved that far into the system, but it's a big step up from my previous interactions with Lightning. I'll see what happens when I get my next meeting invite and attempt to reply via Thunderbird.

Tuesday, July 6, 2010

Do you have CPAN::Reporter installed? Go do it!

Do you have CPAN::Reporter installed? Did you maybe not know what it is, what it does, and why it's the easiest way to help the perl+cpan community? CPAN::Reporter sends back test results every time you install a module from CPAN. This gives module authors something tangible showing that people are using their modules, and it automatically reports errors.
Now anyone with an up-to-date CPAN.pm can contribute to CPAN Testers -- no smoke server needed.
...
Becoming a CPAN Tester with CPAN::Reporter is easy. From the CPAN shell prompt:

cpan> install CPAN::Reporter
cpan> reload cpan
cpan> o conf init test_report
cpan> o conf commit

  --  http://use.perl.org/articles/06/11/08/1256207.shtml

My Hadoop::Streaming module has all sorts of crazy errors on esoteric platforms. I only know this because of the automated smoke tests run by CPAN Testers. It would be nice to have some reports from actual users mingled in with the automated runs.

In the latest release, pushed out over Independence Day weekend, I added monitoring of STDERR in my sub-command tests, so the test reports now capture error messages from the external apps the tests try to run. At least now I have a chance of figuring those errors out... not sure I understand them yet. Maybe they'll make more sense tomorrow. I'd like all my users to see output like this from cpan (the capture pattern is sketched after the output):


All tests successful.
Files=5, Tests=12, 2 wallclock secs ( 0.05 usr 0.00 sys + 2.04 cusr 0.20 csys = 2.29 CPU)
Result: PASS
(/usr/bin/make test exited with 0)
CPAN::Reporter: Test result is 'pass', All tests successful.
CPAN::Reporter: preparing a CPAN Testers report for Hadoop-Streaming-0.100270
CPAN::Reporter: sending test report with 'pass' to cpan-testers@perl.org
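
The STDERR monitoring itself is nothing exotic; the pattern looks roughly like this. This is a sketch using Capture::Tiny rather than the exact test code in the dist, and 'some-external-app' stands in for whatever the test shells out to.

use strict;
use warnings;
use Test::More;
use Capture::Tiny qw(capture);

# Run the external command, keeping stdout, stderr, and the system() return value.
my ( $stdout, $stderr, $sysret ) = capture {
    system( 'some-external-app', '--version' );
};

is( $sysret >> 8, 0, 'external app exited cleanly' )
    or diag "STDERR from external app:\n$stderr";

done_testing;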

Tuesday, June 29, 2010

next LA.PM meeting: Wed Jun 30.

Hiya Mongers! See ya tomorrow, Wednesday June 30th for our next perl mongers meeting. All the details are at the la.pm.org website. Presentations are still "to be determined."

Please let me know if you are coming and if you'd like to present!

Tentative presentation ideas:

  1. CPAN author tools: care and maintenance of your cpan module, the rt queue, cpants. Scenario: "my module works great for me but fails install tests on 90% of hosts, how to fix that." I wish that was completely hypothetical.
  2. Perl hadoop streaming examples follow-up. I didn't get to show much code for Hadoop::Streaming::Mapper last time around. I'd like to show snippets of the code I'm working on now for $work (cleansed appropriately)
  3. mega extreme programming to the max. "Pairing" with a room full of coders (instead of just a pair) trying to code onto the main display. Expanding App::PM::Announce for our nefarious purposes
What: Los Angeles Perl Mongers Meeting
When: 7-9pm
Date: Wednesday, June 30, 2010
Where: The Rubicon Project HQ - 1925 S. Bundy, 90025
Theme: Perl!
RSVP: Responses always appreciated.

Saturday, June 26, 2010

Iron Man Challenge -- what happened to the status images?

Does anyone out there know the status of the "IronMan Status Images"? They disappeared when the code rewrite went out. I miss seeing mine, but I miss seeing them on random blogs even more.

I don't have the tuits to rewrite that part of the system, but maybe I could if I knew where the blog meta info is kept, or whether the old csv list is still being updated with each blog's posts listed by date (unsorted). Maybe I will be writing it after all.

Thursday, June 17, 2010

hideous remakes and sequels

I listened to the Manager Tools podcast on keeping a "Delta File" this morning [Career Tools]. A Delta File is where you list the stuff that annoys you about your job and/or your manager, along with the promise: "I won't do that when I'm a manager. I'll fix it!" When you do become a manager, you can look back and get feedback from your previous self, lest you find yourself doing a scene-for-scene remake of your previous manager's actions.

Waiting for me on gale chat was a conversation on hideous remakes and sequels. Beautiful synchronicity! The pillaging of my childhood memories continues, thanks to the soulless profiteering freaks in Hollywood (to paraphrase Jason), and it provides plenty of fodder for my 'Hollywood Executive Delta File' to remind myself what not to do when I become a Hollywood executive. Remind me not to read slashfilm, lest I learn about these things sooner than I need to.

I do plan to go see A-Team soon, it looks pretty sweet. I mean, it's the friggin' A-Team!

The A-Team and The Karate Kid?
No, seriously, W - T - F? I mean... I just... why?
--jason

Just wait until Peter Berg's "Battleship" [1], Sony's "Risk" [2], Gore Verbinski's "Clue" [3], Steve Oedekerk's "Stretch Armstrong" [4], Etan Cohen and Kevin Lima's "Candyland" [5], and even Ridley Scott's "Monopoly" [6].

  1. Peter Berg's Battleship moved up a week
  2. Sony Pictures to make movie based on the board game RISK
  3. Gore Verbinski to direct CLUE
  4. Worst idea ever -- Steve Oedekerk to write Stretch Armstrong movie
  5. From the guy who wrote Tropic Thunder comes Candy-Land the movie
  6. Worst Idea Ever - Ridley Scott is directing Monopoly
--presto
You forgot Universal's Asteroids
--erich

Save us all.

Tuesday, June 15, 2010

g33k d1nner -- returns for one night only (tonight!)

g33k d1nner (Los Angeles Geek Dinner) is back for one last performance: tonight only. Come out to Canter's Deli in Midtown around 7:30 to meet and greet. Leave your sales pitches at home.

What: Dinner with your fellow geeks
When: Jun 15, 2010, 7:30pm
Where: Canters Deli [ Canter's Fairfax Restaurant ]
Address: 419 North Fairfax Avenue, Los Angeles, CA 90036
Who: People who self-identify as geeks
RSVP: at upcoming
Why: "It’s a chill scene. No uber networking, self-promotion or company promotion. Just be cool. We’re all friends you haven’t met yet."
How: Show up. Look for the like-minded peeps. Pay for your own food. Enjoy.

More info available on the geekdinner website [ g33kd1nner ]:

  1. What is Geek Dinner About
  2. Geek Dinner Redux

Friday, June 11, 2010

ubuntu 10.4 + perl 5.10.1 => personal config moved to .local/share/.cpan

The personal .cpan directory for perl 5.10.1 under ubuntu 10.4 has moved from $HOME/.cpan to $HOME/.local/share/.cpan. I was very confused to get messages about not being able to write to /root/.cpan/build when I knew I had that overridden in my personal configuration file.

Once I found this new location, I was able to delete the new .cpan directory and replace it with a symlink to my old ~/.cpan. Now I can get back to building modules as a non-privileged user into a local::lib based configuration.

Delete empty new .cpan directory and replace with a symlink to my old .cpan:
rm -rf $HOME/.local/share/.cpan && ln -s $HOME/.cpan $HOME/.local/share/

Tuesday, June 8, 2010

Thunderbird SSL Exceptions

Steps to manually add an exception for SSL handling in Thunderbird.
  1. Open Preferences:
    Edit Menu -> Preferences
  2. View Certificates:
    Advanced Tab -> Certificates Sub Tab -> [View Certificates] button
  3. Servers Tab :
    [Add Exception] button
  4. Add domain :
    Enter domain in Server Location text field
  5. Get Certificate :
    [Get certificate]
  6. View Certificate (optional) :
    [View Certificate] then [Close]
  7. Check permanently store exception :
    [X] Permanently store this exception
  8. Confirm Security exception :
    [Confirm Security Exception] button
  9. close certificates :
    [close] button
  10. close preferences :
    [close] button

The exception store for Thunderbird is separate from the one for Mozilla / Firefox, so if you've already added an exception there you'll need to repeat it in Thunderbird. I needed to do this because I get legitimate emails from a site that come to me from the internal company domain using our external company SSL certificate. Having to click through three pop-ups every time a Bamboo build status message arrived was a non-starter.

I wonder if there is a simpler perl interface to directly manipulate these configuration options? Perhaps using Mozilla::Prefs::Simple?

Friday, June 4, 2010

Ubuntu 10.4 upgrade and reset metacity bindings.

I've updated my laptop to Ubuntu 10.4. The upgrade went very smoothly, and I'm pretty happy with it. I needed to upgrade to thunderbird3, as thunderbird2 was having trouble sending mail to my work email server -- 50 seconds of single-threaded blocking on the AUTH calculation for each send made it impossible to work with.

The metacity window manager configuration has changed considerably: the minimize, maximize, and close buttons have been moved from the right to the left and reordered. Rather than adjust to these new settings, I modified the configuration to put them back where they were before. This is a good reminder that I should go back to my own window manager and config. I'm not sure what will be best for the laptop. I am very happy with ion, or at least with the ideas behind it, but it's kind of a pain to configure, and I still haven't duplicated my ion1 settings in ion3. Now that the developer has given up on it and switched to Windows (sigh), I'm not sure if I should move on.

How to change the metacity button layout: the gnome config key apps/metacity/general/button_layout is a string describing the button locations. Using the gconf-editor tool, navigate to apps/metacity/general and edit the button_layout value; changes take effect immediately. I changed mine to menu:maximize,minimize,close. Invalid button names are silently ignored, which future-proofs the setting against button names added in later releases.
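
For the terminal-inclined, the same change can be made without opening gconf-editor (a one-line sketch using my button order; swap in whatever layout you prefer):
% gconftool-2 --set /apps/metacity/general/button_layout --type string "menu:maximize,minimize,close"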

Thanks to lifehacker and how to geek for tips. Check out the howtogeek article if you need a walk through for gconf-editor.

Friday, May 28, 2010

The One Mile Solution! Get back on your bike.

An easy way to get back into biking/walking and save your soul (and our planet): pick a one (1) mile trip, once a week, and bike or walk it instead of driving. Your car is least efficient on those short routes, and you need the exercise anyway. Win-win! What a great idea. As you get the hang of it, you can add more trips by bike and extend to a few miles.

nearly half of all trips in the United States are three miles or less; more than a quarter are less than a mile.

--March/April issue of Sierra

Also, if you do the exercise and realize you have no destinations within 1 mile of home, why do you live there? move. seriously.

What if there was something you could do to improve your health and fitness, save money, reduce our dependence on foreign oil, improve air quality, and reduce your carbon footprint, all at the same time—would you do it?

Maybe that’s a bit of preaching to the choir here, but that’s the idea behind The 1-Mile Solution. As Andy Cline explains,

The idea is simple: Find your home on a map…Draw a circle with a 1-mile radius around your home. Try to replace one car trip per week within that circle by riding a bicycle or walking. At an easy riding pace you can travel one mile on a bicycle in about seven minutes. Walking takes about 20 minutes at an easy pace.


Read more: http://velonews.competitor.com/2008/12/news/legally-speaking-with-bob-mionske-the-1-mile-solution_86235#ixzz0p3YNvInf

The One Mile Solution at legally speaking

http://www.bicyclelaw.com/articles/a.cfm/legally-speaking-the-1-mile-solution

Wednesday, May 26, 2010

Volt DB

Another DB company from database guy Mike Stonebraker (Postgres, Ingres, etc). It sounds fast, encourages the use of stored procedures, and lets you write them in Java. Crazy. Anyone heard of this? Tested it? Using it?

Under the leadership of Postgres and Ingres co-founder, Mike Stonebraker, VoltDB has been developed as a next-generation, open-source DBMS that has been shown to process millions of transactions per second on inexpensive clusters of off-the-shelf servers. It has outperformed traditional OLTP database systems by a factor of 45 on a single server, and unlike NoSQL key-value stores, VoltDB can be accessed using SQL and ensures transactional data integrity (ACID). VoltDB is ideal for developers of ad serving, gaming, software as a service (SaaS), financial trading, on-line businesses and other systems with large, fast-growing transaction volumes because VoltDB scales out easily on low-cost servers while providing automatic high availability for 24x7 operation.

-- http://voltdb.com/voltdb-launches-next-generation-open-source-oltp-dbms

We had a lovely talk last month at LA Perl Mongers on Cassandra, a highly distributed, eventually consistent key-value store with just a touch of metadata. It is just one of the current crop of key-value stores discussed as "NoSQL" (what a terrible moniker, but there are interesting ideas in the distributed key-value space).

I'm excited to see a fast, ACID-compliant db. I do wonder what has been skipped to make it speedy. Can it persist to disk? Snapshot to disk? Or do I just have to keep enough live servers up to preserve the data?

What project can I test this out with?
Postgres and Ingres father Michael Stonebraker is answering NoSQL with a variant of his relational baby for web-scale data — and it breaks some of the rules he helped pioneer.

On Tuesday, Stonebraker’s VoltDB company is due to release its eponymous open-source OLTP in-memory database. It ditches just enough DBMS staples to be faster than NoSQL while staying on the right side of the critical ACID database compliance benchmark for atomicity, consistency, isolation and durability of data.

http://www.theregister.co.uk/2010/05/25/voltdb_cloud_database_nosql/


Another question: how do I interact with the DB? I don't see a DBD::Volt namespace yet, and the db *really* wants to be called via stored procedures. From the FAQ:
3.4. Why doesn't VoltDB support ODBC/JDBC?

The primary interaction with a VoltDB database is through stored procedures, and the stored procedure interface (callProcedure) is easy to understand and interpret for those who are familiar with procedure calls through ODBC.

More importantly, although it is possible to perform occasional ad hoc queries on a live VoltDB database, you cannot modify the schema or perform other arbitrary actions through interactive SQL statements. Therefore a generic database connector is not useful when dealing with VoltDB.

Some comparisons also from the FAQ:
2.2. How does VoltDB differ from MySQL used with memcached?

Memcached is a distributed in-memory cache. It provides none of the reliability or consistency of an ACID-compliant SQL database. Memcached is often used as a cache in front of MySQL to improve performance of frequent transactions. But this requires the client application to manage the hash algorithms for both memcached and MySQL, as well as all of the transactional consistency and reliability between the two systems and across the cluster.

VoltDB automates all of these functions with none of the penalties, while providing similar or better performance. In addition, caching can help improve read performance for products such as MySQL, but does not help scale write performance. VoltDB scales linearly for both read and write performance.

2.3. How does VoltDB differ from a Key-Value Store (such as Cassandra)?

Key-Value stores are a mechanism for storing arbitrary data (i.e. values) based on individual keys. Distributing Key-Value stores is simple, since there is only one key. However, there is no structure within the data store and no transactional reliability provided by the system.

VoltDB provides the ability to store either structured or unstructured data, while at the same time providing full transactional consistency, reliability, and standard data access syntax through ANSI SQL. VoltDB can even define a transaction that includes reads and writes across multiple keys. Finally, VoltDB provides comparable or better performance in terms of throughput.

Thursday, May 20, 2010

Perl Survey is Live!

Perl: solving your problems since 1987
--me

From: Kieren Diment

The Perl Survey 2010 is now live. Its purpose is to better understand the demographics and opinions of the Perl community. You can complete the survey at http://survey.perlfoundation.org - it should take about 10 to 15 minutes.

Once you've done that, please let your relevant friends and colleagues know about the survey so they can complete it as well. My aim is to get a response of over 1000 individuals, and to run the survey (lightly adapted) every two or three years so we can see how the community changes over time. The official announcement of the survey is here:

http://news.perlfoundation.org/2010/05/grant-update-the-perl-survey-1.html


My notes:
  1. This is a well put together survey that looks like it has a good methodology, not just a survey-monkey one-off. It'll take a good 15-20 minutes.
  2. captcha required (boo)
  3. The survey is a bit clever with jscript/html and prevented me from filling it out on my iphone browser.

Tuesday, May 11, 2010

Perl, alive and kicking

Odds are, you didn't read about the 5.12 release outside of Ryan Paul's overview on Ars Technica. You see, Perl not being dead and just continuing business as usual doesn't make for compelling news. Also, let's admit, the Perl community has a great deal of competence in producing software, but couldn't market its way out of a soaking wet paper bag.

--Jeff Hobbs, Director of Engineering, ActiveState.

Interesting read over at ostatic.com, a guest piece by Jeff Hobbs. This is a perennial discussion point in the perl community, and it has heated up quite a bit over the past year or two: how do we better market the awesomeness of perl? It's hard to evangelize when we're busy getting stuff done. The article provides a nice collection of the arguments for why perl is still relevant.

Disclosure: ActiveState produces a professionally packaged perl for windows, so they are of course a bit biased toward the pro-perl camp. (And they may be feeling some pressure from Vanilla perl now shipping a "normal" perl for windows.)

Thursday, May 6, 2010

Unit Testing our Mental Models

Charlie Munger spoke for three hours at the Wesco financial meeting yesterday. Charlie is a wise old man. I played hooky from work to attend my third yearly meeting. The 1500 of us in the room don't get much in the way of money-investment advice, but we do get lots of advice on investing time and wisdom. He is a big fan of mental models (of behavior) and checklists.

He speaks in frequent double-negatives. I believe this is a consequence of his mental patterns. Instead of "I do X," he'll say, "I avoid doing not-X." This is not insignificant. He has checklists in his head of behaviors to watch out for. These are his mental unit tests: he ticks through them to ensure he's avoiding these (self-)destructive traps.

These checklists are based on years of watching people succeed and make mistakes. Mistakes are easier to notice and analyze and turn into rules of things to avoid, but only if looked at full on without sugar coating. These are the edges of the human veil of rationality, places where we still think we are rational but we are not. This is where damage and failure happen.

He watches for systemic failures of incentives (incentives misaligned with behavior that is good for society) and for systemic failures of human rationality. Examples include selection bias, "I'm doing it so it's right", "I did well so I must have done the right thing", and "it's tied to what I do for work, so it is ok."

You can see a great example of this in his interview this week with CNN Money and last week with CNBC about Lehman Brothers. When asked why Lehman's Fuld was defective, he runs down a list of reasons, as if he's reading off a testing report. His first answer is so obvious to him that he doesn't expand on it here: if your system is set up to bring out the worst in people, you will get the worst in people, 100% of the time.

The system was all wrong. It wasn't that the people were so bad. The systems were all wrong. The systems brought out the worst in people instead of the best.


Q: What characteristic is it in Dick Fuld [...] that is defective?

A: Hard to name a defect he didn't have. [...] Yeah really.



http://money.cnn.com/video/news/2010/05/04/n_munger_lehman_fuld.cnnmoney/index.html

Dick Fuld would not be, for me, an exemplar of the best that can exist in investment banking.
[...]
Q: Worse than Bear Stearns?
A: Yes.
Q: Why?
A: Megalomania.
Envy Driven.
Poor Cognition.
Isolation.
And of course, it's an interesting example of corporate governance. 'Cause you didn't have to be very wise to see that the place had the wrong leaders and they had a board that did exactly nothing to fix their problem.

Looking at this interview with an adware author (thanks for the forward, J^K), I see that the interviewee has learned one of these models: the power of gradualism. He inadvertently shows that he is unaware of another: the tendency to dehumanize the other end of a business transaction to make it easier, emotionally, to take advantage of them. It doesn't sound like he's aware that he's marginalizing his targets: first they're less-savvy, then they're apathetic or ignorant. The first model helps him feel better about his reprehensible actions, while the second helped him take those actions in the first place. Which model, and its associated tests, will be more effective in your mental toolbox?

It was funny. It really showed me the power of gradualism. It’s hard to get people to do something bad all in one big jump, but if you can cut it up into small enough pieces, you can get people to do almost anything.

Most adware targets Internet Explorer (IE) users because obviously they’re the biggest share of the market. In addition, they tend to be the less-savvy chunk of the market. If you’re using IE, then either you don’t care or you don’t know about all the vulnerabilities that IE has.

You'll hear echoes of these sentiments from all of the bankers being interviewed on Capitol Hill. They are too close to the problem to be aware of their model deficiencies. The brief period where our anger can trigger change is slipping away. How shall we find the root problems and re-engineer the incentive system to keep them from recurring?