Automatically runs the generate-site-icons script every 5 minutes. The
script was also updated to use checksum-based rsync instead of cp, so
that the generated file isn't replaced (and doesn't need to be
redownloaded by users) unless its contents actually change.
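
As a rough sketch of this setup (the script name comes from the change
itself, but the paths and crontab entry here are illustrative):

    # Run the spritesheet generation every 5 minutes.
    */5 * * * * /usr/local/bin/generate-site-icons

    # Inside the script: --checksum makes rsync compare file contents
    # instead of size/mtime, so the destination is only rewritten (and
    # users' caches only invalidated) when the file actually differs.
    rsync --checksum site-icons.png /var/www/static/site-icons.png
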
This adds a trigger on the scraper_results table that publishes a
rabbitmq message whenever a scrape finishes, as well as a consumer that
picks up these messages and uses the Embedly data to download (and
resize, if necessary) the favicons of any sites that are scraped. The
favicons are saved into the input folder for the site-icons-spriter,
which should be able to use them to generate spritesheets.
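
A minimal sketch of what such a consumer could look like, assuming pika
for the rabbitmq side, requests for HTTP, and Pillow for resizing; the
queue name, message fields, Embedly key, and output path are all
assumptions rather than the actual values:

    import io
    import json

    import pika
    import requests
    from PIL import Image

    SPRITER_INPUT_DIR = "/var/lib/site-icons-spriter/input"  # hypothetical path
    EMBEDLY_KEY = "..."  # hypothetical placeholder
    ICON_SIZE = (32, 32)

    def handle_scrape_result(channel, method, properties, body):
        """Download (and resize if necessary) the favicon for a scraped site."""
        message = json.loads(body)  # assumes the trigger includes url/domain

        # Ask Embedly for metadata about the page, including its favicon URL.
        response = requests.get(
            "https://api.embedly.com/1/extract",
            params={"key": EMBEDLY_KEY, "url": message["url"]},
            timeout=10,
        )
        favicon_url = response.json().get("favicon_url")
        if favicon_url:
            icon_bytes = requests.get(favicon_url, timeout=10).content
            icon = Image.open(io.BytesIO(icon_bytes))
            if icon.size != ICON_SIZE:
                icon = icon.resize(ICON_SIZE)

            # Drop it into the spriter's input folder, named by domain.
            icon.save(f"{SPRITER_INPUT_DIR}/{message['domain']}.png")

        channel.basic_ack(delivery_tag=method.delivery_tag)

    # Assumes the queue already exists (declared on the publishing side).
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.basic_consume(
        queue="scraper_results", on_message_callback=handle_scrape_result
    )
    channel.start_consuming()
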
This adds a consumer (in prod only) that uses Embedly's Extract API to
scrape the destination of every new link topic and stores some of the
returned data in the topic's content_metadata column.
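
For illustration, the Extract call and the pared-down data might look
something like this; the API key, the chosen fields, and the function
name are assumptions:

    import requests

    EMBEDLY_KEY = "..."  # hypothetical placeholder

    def fetch_content_metadata(url):
        """Scrape a link topic's destination with Embedly's Extract API."""
        response = requests.get(
            "https://api.embedly.com/1/extract",
            params={"key": EMBEDLY_KEY, "url": url},
            timeout=10,
        )
        data = response.json()

        # Keep only a subset of the response (hypothetical selection); the
        # result would be written to the topic's content_metadata column.
        return {
            key: data[key]
            for key in ("title", "description", "authors", "published")
            if data.get(key)
        }
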
Until now, Boussole (which watches the SCSS files for changes and
compiles them) has been running in the same venv as the main app, but it
was holding back the version of the click package. There's no real
reason it needs to be in the app venv, so this moves it into its own
venv, which also eliminates quite a few other packages that were only
being installed because of Boussole.
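
Something like the following would do it (the paths are illustrative):

    # Give Boussole its own venv so its pins don't constrain the app's.
    python3 -m venv /opt/venvs/boussole
    /opt/venvs/boussole/bin/pip install boussole

    # Watch and compile the SCSS files from the dedicated venv.
    /opt/venvs/boussole/bin/boussole watch
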
This detects mentions of users in comments, using the same pattern that
the markdown parser uses to generate user links. Mentioned users are
sent a notification, and mentions are added or deleted as needed when a
comment is edited. As part of this, rabbitmq messages are now generated
for comment creation and edits, and the mentions are handled by an async
consumer of those messages.
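
As a rough sketch of the detection and edit-handling logic; the regex is
an illustrative approximation of the markdown parser's user-link
pattern, not the exact one:

    import re

    # Approximation of the pattern the markdown parser turns into user links.
    MENTION_PATTERN = re.compile(r"@([A-Za-z0-9][A-Za-z0-9_-]*)")

    def find_mentioned_users(comment_text):
        """Return the deduplicated, lowercased usernames mentioned in a comment."""
        return {name.lower() for name in MENTION_PATTERN.findall(comment_text)}

    def diff_mentions(old_text, new_text):
        """On a comment edit, work out which mentions were added or removed."""
        old_mentions = find_mentioned_users(old_text)
        new_mentions = find_mentioned_users(new_text)
        return {
            "added": new_mentions - old_mentions,    # send new notifications
            "deleted": old_mentions - new_mentions,  # delete stale ones
        }

The async consumer of the comment-creation/edit messages would then call
something along these lines and create or delete notification rows
accordingly.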