sec-edgar-downloader

Latest version: v5.0.2


4.2.0

New

- The `httpx` package has been replaced by `requests` to enable an exponential backoff retry mechanism that helps alleviate the `403 Forbidden` errors some users have reported. A request to `sec.gov` will be retried at most 10 times (with exponential backoff applied to each retry) before failing.
- A random `User-Agent` string is now included in the headers of each `GET` and `POST` request to `sec.gov`, rather than being set once per session (see the sketch below this list).
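
The changelog doesn't show the implementation, but a mechanism like the one described in both bullets above can be sketched with `requests`, urllib3's `Retry`, and Faker. The endpoint and parameter values here are illustrative, not the package's actual internals:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry
from faker import Faker

fake = Faker()

# Retry a failed request up to 10 times, sleeping exponentially longer
# between attempts (backoff_factor controls the base delay)
retry = Retry(total=10, backoff_factor=1, status_forcelist=[403, 500])
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retry))

# A fresh random User-Agent per request, rather than one per session
response = session.get(
    "https://www.sec.gov/cgi-bin/browse-edgar",  # illustrative endpoint
    headers={"User-Agent": fake.user_agent()},
)
```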

4.1.0

New

- HTTP connections are now re-used when possible (using [`httpx.Client()`](https://www.python-httpx.org/advanced/#client-instances)) to improve download performance.
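
As an illustration of the connection re-use pattern (the URL list is a placeholder):

```python
import httpx

filing_urls = ["https://www.sec.gov/Archives/example.txt"]  # placeholder URLs

# A single Client keeps a connection pool alive across requests,
# skipping a fresh TCP/TLS handshake for every download
with httpx.Client() as client:
    for url in filing_urls:
        response = client.get(url)
```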

Fixed

- Requests are now retried at most 5 times if a request fails. This should resolve the `500 Server Error`s that some users have experienced when downloading a large number of filings.
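
The changelog doesn't describe the retry logic itself; a minimal hand-rolled equivalent with `httpx` could look like this (the function name and backoff choice are assumptions):

```python
import time
import httpx

def get_with_retries(client: httpx.Client, url: str, max_retries: int = 5) -> httpx.Response:
    # Illustrative only: retry transient 5xx responses a bounded number of times
    for attempt in range(max_retries):
        response = client.get(url)
        if response.status_code < 500:
            return response
        time.sleep(2 ** attempt)  # wait before the next attempt
    return response  # last response after exhausting retries
```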

Changed

- Replaced the internal `requests` package with [`httpx`](https://github.com/encode/httpx), a more modern and performant alternative.

4.0.3

Fixed

- Fixed a `403 Client Error` that could randomly occur when bulk downloading a large number of filings. This error was most likely caused by recent changes to SEC rate-limiting behavior. It has been fixed by including a random user-agent string, generated by the [Faker package](https://faker.readthedocs.io/en/stable/providers/faker.providers.user_agent.html), in the request headers.
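
For reference, Faker's user-agent provider works like this (a sketch, not the package's exact code):

```python
from faker import Faker

fake = Faker()

# Each call produces a different browser-like string, e.g.
# "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 ..."
print(fake.user_agent())
```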

4.0.2

Fixed

- Fixed a `RecursionError` that could occur when downloading older filings with the `download_details` flag set to `True`. Thanks to neilbartlett for reporting and fixing this bug!

4.0.1

Fixed

- Downloads will no longer halt prematurely if a filing document (full or detail) cannot be found (e.g. when the EDGAR Search API outputs incorrect download URLs). Now, the package will automatically catch such network errors, print a helpful warning message, and then proceed to download the remaining filings.
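
The behavior described above amounts to a catch-warn-continue loop; here is a sketch (the `download_filing` helper and the URL list are hypothetical, and this version of the package still used `requests`):

```python
import requests

def download_filing(url: str) -> None:
    ...  # hypothetical helper that fetches and saves a single filing

filing_urls = ["https://www.sec.gov/Archives/example.txt"]  # hypothetical

for url in filing_urls:
    try:
        download_filing(url)
    except requests.exceptions.RequestException as err:
        # Warn and move on instead of halting the entire download run
        print(f"Warning: could not download {url} ({err})")
```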

4.0.0

This is a major breaking release. Please see the [v4 migration guide](https://github.com/jadchaar/sec-edgar-downloader/blob/master/docs/v4_migration_guide.md) for information on how to upgrade and adapt your existing codebase to the new package version.

Added

- The [SEC Edgar Full-Text Search API](https://www.sec.gov/edgar/search/) is now used to fetch filing download URLs. This approach replaces the fragile scraping and ATOM RSS implementation used in previous versions of this package.
  - Note: this API only supports fetching filings filed after December 1, 2000.
- Added support for searching and downloading filings via a `query` kwarg:
```python
from sec_edgar_downloader import Downloader

dl = Downloader()

# Download all Apple proxy statements that contain the word "antitrust"
dl.get("DEF 14A", "AAPL", query="antitrust")
```

- Filing details, whose extensions vary based on filing type, are now downloaded in addition to the full submission `txt` file. See the migration guide for information on the revamped folder structure.
- Added the ability to download all available SEC filings. Please see the [README](https://github.com/jadchaar/sec-edgar-downloader/blob/master/README.rst) for a full list of all supported filings.
- `Path` objects can now be used to specify a `download_folder` when constructing a `Downloader` object (see the sketch after this list).
- Added type annotations throughout the codebase.
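
For example, the new `Path` support might be used like this (the `download_folder` kwarg name is taken from the bullet above; treat the exact constructor signature as an assumption):

```python
from pathlib import Path
from sec_edgar_downloader import Downloader

# A Path object can now stand in for a plain string path
dl = Downloader(download_folder=Path.cwd() / "filings")
dl.get("10-K", "AAPL", amount=1)
```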

Changed

- The current working directory is now used as the default download location. Custom paths can be specified when constructing `Downloader` objects.
- All arguments passed to `dl.get()` other than `filing` and `ticker_or_cik` must be used with a keyword:
```python
from sec_edgar_downloader import Downloader

dl = Downloader()
dl.get(
    "10-K",
    "AAPL",
    # All other arguments must be used with a keyword
    amount=1,
    after="2019-01-01",
    before="2021-01-01",
    include_amends=True,
    download_details=True,
    query="sample query",
)
```

- The `after_date`, `before_date`, and `num_filings_to_download` kwargs have been renamed to `after`, `before`, and `amount`, respectively.
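
Side by side, the renames look like this:

```python
# v3.x
dl.get("10-K", "AAPL", after_date="2019-01-01", before_date="2021-01-01", num_filings_to_download=1)

# v4.0.0
dl.get("10-K", "AAPL", after="2019-01-01", before="2021-01-01", amount=1)
```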
