Sentry

1.19.1

Various fixes & improvements

- Make Celery Beat auto monitoring support Celery 4 and 5 (#1989) by antonpirker

1.19.0

Various fixes & improvements

- **New:** [Celery Beat](https://docs.celeryq.dev/en/stable/userguide/periodic-tasks.html) auto monitoring (#1967) by antonpirker

The CeleryIntegration can now also monitor your Celery Beat scheduled tasks automatically using the new [Crons](https://blog.sentry.io/2023/01/04/cron-job-monitoring-beta-because-scheduled-jobs-fail-too/) feature of Sentry.

To learn more see our [Celery Beat Auto Discovery](https://docs.sentry.io/platforms/python/guides/celery/crons/) documentation.

Usage:

```python
from celery import Celery, signals
from celery.schedules import crontab

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration


app = Celery('tasks', broker='...')
app.conf.beat_schedule = {
    'set-in-beat-schedule': {
        'task': 'tasks.some_important_task',
        'schedule': crontab(...),
    },
}


@signals.celeryd_init.connect
def init_sentry(**kwargs):
    sentry_sdk.init(
        dsn='...',
        integrations=[CeleryIntegration(monitor_beat_tasks=True)],  # 👈 here
        environment="local.dev.grace",
        release="v1.0",
    )
```


This will auto-detect all scheduled tasks in your `beat_schedule` and monitor them with Sentry [Crons](https://blog.sentry.io/2023/01/04/cron-job-monitoring-beta-because-scheduled-jobs-fail-too/).

- **New:** [gRPC](https://grpc.io/) integration (#1911) by hossein-raeisi

The [gRPC](https://grpc.io/) integration instruments all incoming requests and all outgoing unary-unary and unary-stream gRPC requests made through grpcio channels.

To learn more see our [gRPC Integration](https://docs.sentry.io/platforms/python/configuration/integrations/grpc/) documentation.

On the server:

```python
import grpc
from sentry_sdk.integrations.grpc.server import ServerInterceptor


server = grpc.server(
    thread_pool=...,
    interceptors=[ServerInterceptor()],
)
```


On the client:

```python
import grpc
from sentry_sdk.integrations.grpc.client import ClientInterceptor


with grpc.insecure_channel("example.com:12345") as channel:
    channel = grpc.intercept_channel(channel, *[ClientInterceptor()])
```



- **New:** socket integration (#1911) by hossein-raeisi

Use this integration to create spans for DNS resolves (`socket.getaddrinfo()`) and connection creations (`socket.create_connection()`).

To learn more see our [Socket Integration](https://docs.sentry.io/platforms/python/configuration/integrations/socket/) documentation.

Usage:

```python
import sentry_sdk
from sentry_sdk.integrations.socket import SocketIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    integrations=[
        SocketIntegration(),
    ],
)
```


- Fix: Do not trim span descriptions (#1983) by antonpirker

1.18.0

Various fixes & improvements

- **New:** Implement `EventScrubber` (#1943) by sl0thentr0py

To learn more see our [Scrubbing Sensitive Data](https://docs.sentry.io/platforms/python/data-management/sensitive-data/#event-scrubber) documentation.

Add a new `EventScrubber` class that scrubs certain potentially sensitive interfaces with a `DEFAULT_DENYLIST`. The default scrubber is automatically run if `send_default_pii = False`:

```python
import sentry_sdk
from sentry_sdk.scrubber import EventScrubber

sentry_sdk.init(
    # ...
    send_default_pii=False,
    event_scrubber=EventScrubber(),  # this is set by default
)
```


You can also pass in a custom `denylist` to the `EventScrubber` class and filter additional fields that you want.

```python
import sentry_sdk
from sentry_sdk.scrubber import EventScrubber, DEFAULT_DENYLIST

# custom denylist
denylist = DEFAULT_DENYLIST + ["my_sensitive_var"]

sentry_sdk.init(
    # ...
    send_default_pii=False,
    event_scrubber=EventScrubber(denylist=denylist),
)
```


- **New:** Added new `functions_to_trace` option as a central way to set up performance instrumentation (#1960) by antonpirker

To learn more see our [Tracing Options](https://docs.sentry.io/platforms/python/configuration/options/#functions-to-trace) documentation.

An optional list of functions that should be set up for performance monitoring. For each function in the list, a span will be created when the function is executed.

```python
import sentry_sdk

functions_to_trace = [
    {"qualified_name": "tests.test_basics._hello_world_counter"},
    {"qualified_name": "time.sleep"},
    {"qualified_name": "collections.Counter.most_common"},
]

sentry_sdk.init(
    # ...
    traces_sample_rate=1.0,
    functions_to_trace=functions_to_trace,
)
```


- Updated denylist to include other widely used cookies/headers (#1972) by antonpirker
- Forward all `sentry-` baggage items (#1970) by cleptric
- Update OSS licensing (#1973) by antonpirker
- Profiling: Handle non frame types in profiler (#1965) by Zylphrex
- Tests: Bad arq dependency in tests (#1966) by Zylphrex
- Better naming (#1962) by antonpirker

1.17.0

Various fixes & improvements

- **New:** Monitor Celery Beat tasks with Sentry [Cron Monitoring](https://docs.sentry.io/product/crons/).

With this feature you can make sure that your Celery Beat tasks run at the right time and see whether they were successful or not.

> **Warning**
> Cron Monitoring is currently in beta. Beta features are still in-progress and may have bugs. We recognize the irony.
> If you have any questions or feedback, please email us at crons-feedback@sentry.io, reach out via Discord (#cronjobs), or open an issue.

Usage:

```python
# File: tasks.py

from celery import Celery, signals
from celery.schedules import crontab

import sentry_sdk
from sentry_sdk.crons import monitor
from sentry_sdk.integrations.celery import CeleryIntegration


# 1. Set up your Celery beat configuration

app = Celery('mytasks', broker='redis://localhost:6379/0')
app.conf.beat_schedule = {
    'set-in-beat-schedule': {
        'task': 'tasks.tell_the_world',
        'schedule': crontab(hour='10', minute='15'),
        'args': ("in beat_schedule set", ),
    },
}


# 2. Initialize Sentry either in the `celeryd_init` or the `beat_init` signal.

# @signals.celeryd_init.connect
@signals.beat_init.connect
def init_sentry(**kwargs):
    sentry_sdk.init(
        dsn='...',
        integrations=[CeleryIntegration()],
        environment="local.dev.grace",
        release="v1.0.7-a1",
    )


# 3. Link your Celery task to a Sentry Cron Monitor

@app.task
@monitor(monitor_slug='3b861d62-ff82-4aa0-9cd6-b2b6403bd0cf')
def tell_the_world(msg):
    print(msg)
```


- **New:** Add decorator for Sentry tracing (#1089) by ynouri

This allows you to use a decorator to set up custom performance instrumentation.

To learn more see [Custom Instrumentation](https://docs.sentry.io/platforms/python/performance/instrumentation/custom-instrumentation/).

Usage: Just add the new decorator to your function, and a span will be created for it:

```python
import sentry_sdk


@sentry_sdk.trace
def my_complex_function():
    # do stuff
    ...
```


- Make Django signals tracing optional (#1929) by antonpirker

See the [Django Guide](https://docs.sentry.io/platforms/python/guides/django) to learn more.
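
As a minimal sketch, this is how the new option can be turned off; the `signals_spans` flag on `DjangoIntegration` is assumed here, so check the Django guide for the exact name:

```python
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

sentry_sdk.init(
    dsn="...",
    # Assumed flag for disabling spans around Django signal receivers (#1929).
    integrations=[DjangoIntegration(signals_spans=False)],
    traces_sample_rate=1.0,
)
```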

- Deprecated `with_locals` in favor of `include_local_variables` (#1924) by antonpirker (see the sketch after this list)
- Added top level API to get current span (#1954) by antonpirker (see the sketch after this list)
- Profiling: Add profiler options to init (#1947) by Zylphrex
- Profiling: Set active thread id for quart (#1830) by Zylphrex
- Fix: Update `get_json` function call for werkzeug 2.1.0+ (#1939) by michielderoos
- Fix: Return the task's result (#1931) by antonpirker
- Fix: Rename MYPY to TYPE_CHECKING (#1934) by untitaker
- Fix: Fix type annotation for ignore_errors in sentry_sdk.init() (#1928) by tiangolo
- Tests: Start a real http server instead of mocking libs (#1938) by antonpirker
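
A minimal sketch of the renamed `include_local_variables` option and the new top-level current-span API mentioned above (the accessor is assumed to be `sentry_sdk.get_current_span()`, which returns `None` when no span is active):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="...",
    # Replaces the deprecated `with_locals` option (#1924).
    include_local_variables=False,
    traces_sample_rate=1.0,
)

with sentry_sdk.start_transaction(name="example"):
    # Assumed top-level accessor from #1954; returns the active span or None.
    span = sentry_sdk.get_current_span()
    if span is not None:
        span.set_tag("example", "value")
```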

1.16.0

Various fixes & improvements

- **New:** Add [arq](https://arq-docs.helpmanual.io/) Integration (#1872) by Zhenay

This integration creates performance spans when arq jobs are enqueued and when they are run.
It also captures errors in jobs and links them to the performance spans.

Usage:

```python
import asyncio

from httpx import AsyncClient
from arq import create_pool
from arq.connections import RedisSettings

import sentry_sdk
from sentry_sdk.integrations.arq import ArqIntegration
from sentry_sdk.tracing import TRANSACTION_SOURCE_COMPONENT

sentry_sdk.init(
    dsn="...",
    integrations=[ArqIntegration()],
)

async def download_content(ctx, url):
    session: AsyncClient = ctx['session']
    response = await session.get(url)
    print(f'{url}: {response.text:.80}...')
    return len(response.text)

async def startup(ctx):
    ctx['session'] = AsyncClient()

async def shutdown(ctx):
    await ctx['session'].aclose()

async def main():
    with sentry_sdk.start_transaction(name="testing_arq_tasks", source=TRANSACTION_SOURCE_COMPONENT):
        redis = await create_pool(RedisSettings())
        for url in ('https://facebook.com', 'https://microsoft.com', 'https://github.com', "asdf"):
            await redis.enqueue_job('download_content', url)

class WorkerSettings:
    functions = [download_content]
    on_startup = startup
    on_shutdown = shutdown

if __name__ == '__main__':
    asyncio.run(main())
```


- Update of [Falcon](https://falconframework.org/) Integration (#1733) by bartolootrit
- Adding [Cloud Resource Context](https://docs.sentry.io/platforms/python/configuration/integrations/cloudresourcecontext/) integration (#1882) by antonpirker
- Profiling: Use the transaction timestamps to anchor the profile (#1898) by Zylphrex
- Profiling: Add debug logs to profiling (#1883) by Zylphrex
- Profiling: Start profiler thread lazily (#1903) by Zylphrex
- Fixed checks for structured http data (#1905) by antonpirker
- Make `set_measurement` public api and remove experimental status (#1909) by sl0thentr0py (see the sketch after this list)
- Add `trace_propagation_targets` option (#1916) by antonpirker (see the sketch after this list)
- Add `enable_tracing` option to default `traces_sample_rate` to 1.0 (#1900) by sl0thentr0py (see the sketch after this list)
- Remove deprecated `tracestate` (#1907) by sl0thentr0py
- Sanitize URLs in Span description and breadcrumbs (#1876) by antonpirker
- Mechanism should default to true unless set explicitly (#1889) by sl0thentr0py
- Better setting of in-app in stack frames (#1894) by antonpirker
- Add workflow to test gevent (#1870) by Zylphrex
- Updated outdated HTTPX test matrix (#1917) by antonpirker
- Switch to MIT license (#1908) by cleptric
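
A minimal sketch of the `enable_tracing`, `trace_propagation_targets`, and now-public `set_measurement` APIs from the list above (the argument values are illustrative assumptions):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="...",
    # Enables tracing and defaults traces_sample_rate to 1.0 (#1900).
    enable_tracing=True,
    # Only attach tracing headers to outgoing requests matching these targets (#1916).
    trace_propagation_targets=["https://api.example.com"],
)

with sentry_sdk.start_transaction(name="checkout"):
    # set_measurement is now public API (#1909).
    sentry_sdk.set_measurement("cart_size", 3)
```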

1.15.0

Various fixes & improvements

- New: Add [Huey](https://huey.readthedocs.io/en/latest/) Integration (#1555) by Zhenay

This integration creates performance spans when Huey tasks are enqueued and when they are executed.

Usage:

Task definition in `demo.py`:

```python
import time

from huey import SqliteHuey, crontab

import sentry_sdk
from sentry_sdk.integrations.huey import HueyIntegration

sentry_sdk.init(
    dsn="...",
    integrations=[
        HueyIntegration(),
    ],
    traces_sample_rate=1.0,
)

huey = SqliteHuey(filename='/tmp/demo.db')


@huey.task()
def add_numbers(a, b):
    return a + b
```


Running the tasks in `run.py`:

```python
from demo import add_numbers

import sentry_sdk
from sentry_sdk.integrations.huey import HueyIntegration
from sentry_sdk.tracing import TRANSACTION_SOURCE_COMPONENT


def main():
    sentry_sdk.init(
        dsn="...",
        integrations=[
            HueyIntegration(),
        ],
        traces_sample_rate=1.0,
    )

    with sentry_sdk.start_transaction(name="testing_huey_tasks", source=TRANSACTION_SOURCE_COMPONENT):
        r = add_numbers(1, 2)


if __name__ == "__main__":
    main()
```


- Profiling: Do not send single sample profiles (#1879) by Zylphrex
- Profiling: Add additional test coverage for profiler (#1877) by Zylphrex
- Profiling: Always use builtin time.sleep (#1869) by Zylphrex
- Profiling: Default in_app decision to None (#1855) by Zylphrex
- Profiling: Remove use of threading.Event (#1864) by Zylphrex
- Profiling: Enable profiling on all transactions (#1797) by Zylphrex
- FastAPI: Fix check for Starlette in FastAPI integration (#1868) by antonpirker
- Flask: Do not overwrite default for username with email address in FlaskIntegration (#1873) by homeworkprod
- Tests: Add py3.11 to test-common (#1871) by Zylphrex
- Fix: Don't log whole event in before_send / event_processor drops (#1863) by sl0thentr0py
