Python script that scrapes public and private Mailman archive pages, republishes them to local files, and generates an RSS feed of recent emails.
Modified MailmanArchiveScraper.py to accept an arbitrary number of configuration files as command-line arguments. Those configs need not reside in the project directory. Did not modify the GZ scraper or README. If no arguments are given, the default configuration is used from its present location, so behavior is unchanged for the current distribution's use case. This allows many lists to be scraped in a single run. Example: `python $pathTo/MailmanArchiveScraper.py ~/lists/*.cfg`
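A minimal sketch of how this argument handling could look. The `MailmanArchiveScraper` class, its constructor taking a config path, its `run()` method, and the `settings.cfg` default name are assumptions for illustration, not the project's actual code:

```python
#!/usr/bin/env python
# Sketch only: illustrates accepting any number of config files on the
# command line, falling back to the default config when none are given.
import os
import sys

DEFAULT_CONFIG = "settings.cfg"  # hypothetical default in the project directory


def main(argv):
    # Shell globs like ~/lists/*.cfg expand to multiple arguments here.
    configs = argv[1:] if len(argv) > 1 else [DEFAULT_CONFIG]

    for config_path in configs:
        if not os.path.isfile(config_path):
            print("Skipping missing config: %s" % config_path)
            continue
        # Hypothetical usage: one scraper instance per list configuration.
        scraper = MailmanArchiveScraper(config_path)
        scraper.run()


if __name__ == "__main__":
    main(sys.argv)
```

Looping over one scraper instance per config keeps each list's settings isolated while still allowing a single invocation to process many lists.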