Monitor, archive, go back in time. Website Watchman is an easy-to-use website archival utility.
– Monitor a whole website, part of a website or a single page
– Set up configs for multiple sites / pages
– Schedule hourly, daily, weekly or monthly scans
– Be alerted to any change: visible text, source code or the page’s resources
– View and be able to demonstrate what a page looked like on a particular date
– Be aware of every change to a competitor’s page / site
– Runs locally, not a cloud service. Own your own data.
– An archive is kept, including all changes to pages, images, stylesheets and JavaScript
– View a ‘living’ version of a historical page, not a screenshot
– Switch between versions of the page to compare them
– Export a historical page as an image or as a collection of all of its files
– Export the entire site, preserving all files as they were on a given date, or processed to make a browsable local copy of the site
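Change alerting of the kind described above can be pictured as fingerprinting everything the app watches: the visible text, the source code and each resource. A minimal sketch, assuming a hash-based comparison (the function name and approach are illustrative, not the app's actual implementation):

```python
import hashlib

def fingerprint(visible_text: str, source: str, resources: dict) -> str:
    """Digest of everything being watched: visible text, source code,
    and each resource's bytes. Any change alters the digest, which is
    enough to trigger an alert on the next scheduled scan."""
    h = hashlib.sha256()
    h.update(visible_text.encode())
    h.update(source.encode())
    for url in sorted(resources):     # stable order so the hash is deterministic
        h.update(url.encode())
        h.update(resources[url])
    return h.hexdigest()

before = fingerprint("Hello", "<p>Hello</p>", {"a.css": b"body{}"})
after = fingerprint("Hello!", "<p>Hello!</p>", {"a.css": b"body{}"})
```

Comparing `before` with the digest from the next scan reveals whether anything on the page changed, without storing a full copy just for the comparison.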
CSS and image files hosted externally (on a different domain to the starting URL) weren’t being archived, so pages for those sites weren’t displayed correctly in the archive browser. Now fixed.
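The fix amounts to archiving every resource a page references, whether or not its host matches the starting URL. A sketch of the idea, assuming a simple hostname comparison (function and names are hypothetical, not the app's code):

```python
from urllib.parse import urlparse

def resources_to_archive(page_resources, start_url):
    """Return (url, is_external) for every resource a page references.
    Externally hosted resources (CDNs etc.) are archived too, so the
    saved page renders correctly in the archive browser."""
    start_host = urlparse(start_url).hostname
    result = []
    for url in page_resources:
        is_external = urlparse(url).hostname != start_host
        result.append((url, is_external))   # archived either way
    return result

resources = resources_to_archive(
    ["https://cdn.other.com/style.css", "https://example.com/logo.png"],
    "https://example.com/",
)
```

Here both resources are kept; the flag only records where each one came from.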
Better at finding images that use lazy loading.
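Lazily loaded images often keep their real URL in a non-standard attribute such as `data-src` until a script swaps it in, so a crawler that only reads `src` misses them. A minimal sketch of that detection, assuming a few common attribute names (the exact attributes the app checks aren't documented here):

```python
from html.parser import HTMLParser

# Common lazy-loading conventions; real pages vary (assumption).
IMAGE_ATTRS = ("src", "data-src", "data-lazy-src", "data-original")

class ImageCollector(HTMLParser):
    """Collect image URLs from both eager and lazy-loaded <img> tags."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        found = dict(attrs)
        for attr in IMAGE_ATTRS:
            if found.get(attr):
                self.urls.append(found[attr])

parser = ImageCollector()
parser.feed('<img data-src="https://cdn.example.com/pic.jpg" loading="lazy">')
```

After feeding the markup, `parser.urls` contains the CDN URL even though the tag has no `src` attribute.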
Updates the selectable user-agent strings and adds more (in particular, Edge and some additional mobile browsers).
Updates Paddle’s licensing framework to the latest Big Sur/M1-compatible version.
Changes the default handling of http:// links on the same domain (when starting from an https:// URL). They are now treated as internal, which is probably what’s expected.
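The new default boils down to classifying a link by host rather than by scheme: an http:// link pointing at the same host as the https:// starting URL counts as internal. A sketch of that check (illustrative only, not the app's code):

```python
from urllib.parse import urlparse

def is_internal(link: str, start_url: str) -> bool:
    """Treat a link as internal when it points at the same host as the
    starting URL, even if the schemes differ (http vs https)."""
    start, target = urlparse(start_url), urlparse(link)
    if target.scheme not in ("http", "https"):
        return False
    return target.hostname == start.hostname
```

With this rule, `is_internal("http://example.com/page", "https://example.com/")` is true, so the page is crawled rather than skipped as external.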