Parker

Latest version: v0.9.6

0.9.0

----------------------------------------

- Completely refactored storage and added the ability to store consumed data
to Amazon's S3 service. This requires your AWS Access Key ID to be set up
as an environment variable.

- Added an 'unclassified' path prefix for sites configured without a
classification.

- Added a 'crawl_uri_filters' site configuration list which should contain
regular expressions matching any URIs you would like filtered out of the
crawl (see the configuration sketch after this list).

- Added a 'seconds_until_expire' site configuration value which sets the expiry
of the sets stored in Redis. We recommend setting this to roughly how long it
takes Parker to crawl your site, so that the sets expire shortly after the
crawl finishes and free their memory.
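
The options above are easiest to see together. Below is a hypothetical site
configuration sketch: only the 'classification', 'crawl_uri_filters' and
'seconds_until_expire' keys are taken from this changelog; the structure, the
other keys and all of the values are made up for illustration. S3 storage
presumably goes through boto (added to the requirements in 0.7.3), which reads
the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.

    # Hypothetical site configuration entry (Python dict).
    site = {
        "base_uri": "http://example.com",
        "classification": "news",   # omit and output is filed under 'unclassified'
        "crawl_uri_filters": [
            r"\.pdf$",              # drop direct links to PDF files
            r"/login",              # skip login pages
        ],
        # Roughly how long one full crawl of the site takes, so the Redis
        # sets expire shortly after the crawl finishes.
        "seconds_until_expire": 3600,
    }

    # S3 storage needs AWS credentials in the environment, e.g. the standard
    # boto variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.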

0.8.0

----------------------------------------

- Updated key-value functionality to allow a sub-selector to return
a list of values.

- Added the ability to set an expiry time in seconds on the RedisSet
objects, with a default of 5 days (illustrated below).

- Added an extra set to track URIs that are already on the crawl queue. This
should cut down on duplication, but may eat memory if there are multiple
possible URIs for the same page.
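
The Redis behaviour above can be pictured with plain redis-py calls. This is
an illustrative sketch only: Parker wraps the behaviour in its own RedisSet
objects, and the key name and URI below are made up.

    import redis

    r = redis.Redis()

    FIVE_DAYS = 5 * 24 * 60 * 60        # the default expiry described above

    # Hypothetical key for the set of URIs already on the crawl queue.
    queued = "parker:example.com:queued"

    # SADD returns 1 only for a newly seen member, so duplicates are cheap
    # to detect -- at the cost of keeping every queued URI in the set.
    if r.sadd(queued, "http://example.com/some/page") == 1:
        pass  # new URI: put it on the crawl queue

    # Expire the set so its memory is reclaimed after the crawl.
    r.expire(queued, FIVE_DAYS)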

0.7.3

----------------------------------------

- Patch to fix an issue where the consumer was overlooking media URIs that start
with / and are therefore relative to the base_uri configuration.

- Added boto to the requirements for future use.

0.7.2

----------------------------------------

- Patch to fix an issue where the crawler was overlooking URIs that start
with / and are therefore relative to the base_uri configuration.
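
Both this patch and the one in 0.7.3 are about resolving root-relative paths
against the site's base_uri. A minimal sketch of that resolution with Python's
standard urljoin (the values are made up):

    from urllib.parse import urljoin    # urlparse.urljoin on Python 2

    base_uri = "http://example.com"     # from the site configuration
    href = "/media/images/logo.png"     # root-relative URI the crawler used to skip

    print(urljoin(base_uri, href))      # http://example.com/media/images/logo.png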

0.7.1

----------------------------------------

- Patch to fix an issue where, if a classification is not present in the site
config, the output path includes "None".

0.7.0

----------------------------------------

- Reworked the client to allow for improved proxy failover should we need it,
and improved testing a little to back this up.

- Added tagging to the configuration. Tags are simply passed through to the
resulting JSON objects output by the model so that you can tag them with
whatever you want (see the sketch after this list).

- Added classification to the configuration. Again this is passed through, but
it is also used in the output file path from the consumer worker.
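
A rough sketch of how tags and classification could flow through to the
consumer's output. Only the pass-through behaviour is from this changelog; the
configuration structure, the JSON field names and the path layout are assumed.

    import json

    # Hypothetical site configuration carrying tags and a classification.
    site = {
        "base_uri": "http://example.com",
        "tags": ["example", "demo"],
        "classification": "news",
    }

    # The consumer passes both straight through to its JSON output and uses
    # the classification in the output file path ('unclassified' when none
    # is configured, as of 0.9.0).
    record = {
        "uri": "http://example.com/articles/1",
        "tags": site["tags"],
        "classification": site.get("classification", "unclassified"),
    }

    output_path = "%s/articles-1.json" % record["classification"]
    print(output_path)                  # news/articles-1.json
    print(json.dumps(record, indent=2))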
