Friday, April 20, 2012

From fetchmail/procmail to Google Calendar

I spent many hours scouring Google before finally figuring out how to implement this easily. Calendar synchronization is very important to me; I want to be able to look at my phone, my tablet, or my browser and see my entire schedule (otherwise I'd never show up anywhere).

.procmailrc:
# Grab calendar invitations and push them to Google Calendar.
# The explicit lock file serialises the whole block, so munpack,
# calendar.sh and the clean-up never run concurrently for two messages.
:0 Wc: Mail/multipart.lock
* H ?? ^Content-Type: multipart/(alternative|mixed)
* B ?? ^Content-Type: text/calendar
{
        # Unpack the MIME parts into a temporary directory.
        :0 Wac
        | munpack -q -t -C ~/Mail/.munpack 2> /dev/null

        # Upload any extracted iCalendar files.
        :0 Wac
        | /home/oliver/bin/calendar.sh

        # Clean up the temporary directory.
        :0 Wc
        | rm -f ~/Mail/.munpack/*
}
This beautiful undocumented recipe is actually quite simple, but perhaps needs some explanation. From man procmailrc:

A line starting with ':' marks the beginning of a recipe.
It has the following format:

       :0 [flags] [ : [locallockfile] ]

       zero or more conditions (one per line)
       exactly one action line

W means wait until the program finishes and suppress any failure messages. a means the immediately preceding recipe must have completed successfully before procmail will execute this one. Finally, c means carbon copy: the recipe works on a copy of the message, so the original still proceeds through the rest of the recipes and gets delivered normally.

It might seem more appropriate to use f (this pipe is a filter) instead, but f would replace the message with the pipe's output, which is not what we want here since munpack and calendar.sh do not echo the message back. I am new to procmail configuration; c works for me.

The first condition matches messages whose Content-Type header is either multipart/alternative or multipart/mixed; the second checks the message body for a Content-Type of text/calendar (which is the data we want). If both conditions are true, munpack unpacks the MIME multipart message into separate files in a temporary directory, calendar.sh does the magic, and the final recipe cleans up the temporary directory.
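
If you want to test the extraction step by hand before hooking it into procmail, something along these lines works; it is only a sketch, and sample.eml is a placeholder for any saved calendar invitation.

#!/bin/bash
# Rough manual test of the munpack extraction step.

mkdir -p ~/Mail/.munpack

# Same flags as the procmail recipe: quiet, unpack text parts too,
# write everything into the temporary directory.
munpack -q -t -C ~/Mail/.munpack < sample.eml 2> /dev/null

# Show which extracted parts contain iCalendar data.
grep -l 'BEGIN:VCALENDAR' ~/Mail/.munpack/* 2> /dev/null

# Clean up, just like the last recipe in .procmailrc.
rm -f ~/Mail/.munpack/*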

calendar.sh:
#!/bin/bash

# Proxy that cadaver should use; the http:// prefix is stripped off
# below because cadaver's -p option wants only host:port.
http_proxy=http://proxy.example.com:8080

CALENDAR_ID='XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX@group.calendar.google.com'

for i in ~/Mail/.munpack/*; do
        # Only upload the parts that actually contain iCalendar data.
        if grep -q 'BEGIN:VCALENDAR' "${i}" 2> /dev/null; then
                # Drive cadaver's interactive prompt with expect: put the
                # file into the calendar's WebDAV events collection, then quit.
                expect -dc \
                "spawn /usr/bin/cadaver -p ${http_proxy#http://} \
                https://www.google.com/calendar/dav/${CALENDAR_ID}/events ; \
                expect \"dav:\" ; \
                send \"put ${i}\\r\" ; \
                expect \"dav:\" ; \
                send \"bye\\r\""
        fi
done
Backslash line continuations have been inserted into the script so that it doesn't break layouts on other pages; the shell strips them, so the script should still work if copied verbatim, although you may want to rejoin those lines.

I use expect and cadaver to upload the iCalendar file extracted by munpack to Google Calendar via its WebDAV interface. Finding the CALENDAR_ID is a little confusing: it is shown under the "Settings" option when you log in to Google Calendar in a browser. You must use the public Calendar Address, not the Private Address, but you do not need to share the calendar publicly; Google will request your login details over WebDAV. These are easiest to configure in .netrc; chmod 600 the file for some additional safety.
machine www.google.com
login john.doe
password example123
The login should not include @gmail.com, otherwise authentication will fail. Happy hacking!
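
To sanity-check the CALENDAR_ID and the .netrc credentials before wiring everything into procmail, a quick manual cadaver session helps. This is only a sketch: test.ics stands in for any small iCalendar file, and the -p option can be dropped if you are not behind a proxy.

#!/bin/bash
# Rough manual test of the WebDAV upload; credentials come from ~/.netrc.

CALENDAR_ID='XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX@group.calendar.google.com'

# At the dav: prompt type "put test.ics", then "bye".
/usr/bin/cadaver -p proxy.example.com:8080 \
        "https://www.google.com/calendar/dav/${CALENDAR_ID}/events"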

2 comments:

  1. Note that I put the Google Calendar rules as the very first rules in my procmail configuration, even before duplicate message elimination, just in case Microsoft Exchange does something insane, which would not be unprecedented.

    http://ramblings.narrabilis.com/filtering-duplicate-emails-with-procmail
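
    For reference, the classic duplicate-elimination recipe (essentially the example from the procmailex man page, untested in this particular setup) looks something like:

    # Drop any message whose Message-ID has already been seen,
    # keeping a small cache of recent Message-IDs.
    :0 Wh: msgid.lock
    | formail -D 8192 msgid.cache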

  2. Remember to check the updated post for Google Calendar sync:

    http://omcfadde.blogspot.com/2012/04/google-calendar-bug-fixes.html

    The .procmailrc in this post has some bugs in the locking mechanism which can cause fetchmail to hang in some rare cases.
