VYPR
Medium severity · 6.4 · OSV Advisory · Published Oct 6, 2025 · Updated Apr 15, 2026

CVE-2025-61765

Description

python-socketio is a Python implementation of the Socket.IO realtime client and server. A remote code execution vulnerability in python-socketio versions prior to 5.14.0 allows attackers to execute arbitrary Python code through malicious pickle deserialization in multi-server deployments where the attacker has previously gained access to the message queue that the servers use for internal communication.

When Socket.IO servers are configured to use a message queue backend such as Redis for inter-server communication, messages exchanged between the servers are encoded with Python's pickle module. When a server receives one of these messages through the message queue, it assumes the message is trusted and immediately deserializes it with pickle.loads(). Having previously obtained access to the message queue, an attacker can send a python-socketio server a crafted pickle payload that executes arbitrary code during deserialization via Python's __reduce__ method. The attack runs code in the context of, and with the privileges of, the Socket.IO server process.

This vulnerability only affects deployments with a compromised message queue. Single-server systems that do not use a message queue, and multi-server systems with a secure message queue, are not vulnerable. In addition to following standard security practices in the deployment of the message queue, users of the python-socketio package can upgrade to version 5.14.0 or newer, which removes the use of pickle in favor of the much safer JSON encoding for inter-server messaging.
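The pickle mechanics described above can be illustrated with a minimal, harmless sketch. The Payload class and its eval-based payload are invented for this example; a real attacker would reduce to a destructive callable such as os.system.

```python
import pickle

class Payload:
    def __reduce__(self):
        # __reduce__ tells pickle how to reconstruct this object: call
        # eval("'attacker code ran'") at load time. Any callable and
        # arguments of the attacker's choosing can go here.
        return (eval, ("'attacker code ran'",))

blob = pickle.dumps(Payload())
# The mere act of deserializing the bytes executes the attacker's callable:
result = pickle.loads(blob)
print(result)  # attacker code ran
```

This is why a compromised message queue becomes full code execution: the receiving server never asks whether the bytes are a plain data structure before pickle runs the embedded constructor.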

Affected packages

Versions sourced from the GitHub Security Advisory.

Package                  Affected versions     Patched versions
python-socketio (PyPI)   >= 0.8.0, < 5.14.0    5.14.0

Patches

53f6be094257

Replace pickle with json (#1502)

https://github.com/miguelgrinberg/python-socketio · Miguel Grinberg · Sep 30, 2025 · via GHSA
11 files changed · +48 −64
  • docs/server.rst · +11 −11 · modified
    @@ -1096,17 +1096,17 @@ For a production deployment there are a few recommendations to keep your
     application secure.
     
     First of all, the message queue should never be listening on a public network
    -interface, to ensure that external clients never connect to it. The use of a
    -private network (VPC), where the communication between servers can happen
    -privately is highly recommended.
    -
    -In addition, all message queues support authentication and encryption.
    -Authentication ensures that only the Socket.IO servers and related processes
    -have access, while encryption prevents data to be collected by a third-party
    -listening on the network.
    -
    -Access credentials can be included in the connection URLs that are passed to the
    -client managers.
    +interface, to ensure that external clients never connect to it. For a single
    +node deployment, the queue should only listen on `localhost`. For a multi-node
    +system the use of a private network (VPC), where the communication between
    +servers can happen privately is highly recommended.
    +
    +In addition, all message queues support authentication and encryption, which
    +can strenthen the security of the deployment. Authentication ensures that only
    +the Socket.IO servers and related processes have access, while encryption
    +prevents data from being collected by a third-party that is listening on the
    +network. Access credentials can be included in the connection URLs that are
    +passed to the client managers.
     
     Horizontal Scaling
     ~~~~~~~~~~~~~~~~~~
    
  • src/socketio/async_aiopika_manager.py · +3 −3 · modified
    @@ -1,6 +1,6 @@
     import asyncio
    -import pickle
     
    +from engineio import json
     from .async_pubsub_manager import AsyncPubSubManager
     
     try:
    @@ -82,7 +82,7 @@ async def _publish(self, data):
                 try:
                     await self.publisher_exchange.publish(
                         aio_pika.Message(
    -                        body=pickle.dumps(data),
    +                        body=json.dumps(data),
                             delivery_mode=aio_pika.DeliveryMode.PERSISTENT
                         ), routing_key='*',
                     )
    @@ -113,7 +113,7 @@ async def _listen(self):
                         async with queue.iterator() as queue_iter:
                             async for message in queue_iter:
                                 async with message.process():
    -                                yield pickle.loads(message.body)
    +                                yield message.body
                                     retry_sleep = 1
                     except aio_pika.AMQPException:
                         self._get_logger().error(
    
  • src/socketio/async_pubsub_manager.py · +4 −11 · modified
    @@ -3,7 +3,6 @@
     import uuid
     
     from engineio import json
    -import pickle
     
     from .async_manager import AsyncManager
     
    @@ -202,16 +201,10 @@ async def _thread(self):
                         if isinstance(message, dict):
                             data = message
                         else:
    -                        if isinstance(message, bytes):  # pragma: no cover
    -                            try:
    -                                data = pickle.loads(message)
    -                            except:
    -                                pass
    -                        if data is None:
    -                            try:
    -                                data = json.loads(message)
    -                            except:
    -                                pass
    +                        try:
    +                            data = json.loads(message)
    +                        except:
    +                            pass
                         if data and 'method' in data:
                             self._get_logger().debug('pubsub message: {}'.format(
                                 data['method']))
    
  • src/socketio/async_redis_manager.py · +2 −2 · modified
    @@ -1,5 +1,4 @@
     import asyncio
    -import pickle
     from urllib.parse import urlparse
     
     try:  # pragma: no cover
    @@ -20,6 +19,7 @@
         valkey = None
         ValkeyError = None
     
    +from engineio import json
     from .async_pubsub_manager import AsyncPubSubManager
     from .redis_manager import parse_redis_sentinel_url
     
    @@ -108,7 +108,7 @@ async def _publish(self, data):
                     if not retry:
                         self._redis_connect()
                     return await self.redis.publish(
    -                    self.channel, pickle.dumps(data))
    +                    self.channel, json.dumps(data))
                 except error as exc:
                     if retry:
                         self._get_logger().error(
    
  • src/socketio/kafka_manager.py · +3 −3 · modified
    @@ -1,11 +1,11 @@
     import logging
    -import pickle
     
     try:
         import kafka
     except ImportError:
         kafka = None
     
    +from engineio import json
     from .pubsub_manager import PubSubManager
     
     logger = logging.getLogger('socketio')
    @@ -53,7 +53,7 @@ def __init__(self, url='kafka://localhost:9092', channel='socketio',
                                                 bootstrap_servers=self.kafka_urls)
     
         def _publish(self, data):
    -        self.producer.send(self.channel, value=pickle.dumps(data))
    +        self.producer.send(self.channel, value=json.dumps(data))
             self.producer.flush()
     
         def _kafka_listen(self):
    @@ -62,4 +62,4 @@ def _kafka_listen(self):
         def _listen(self):
             for message in self._kafka_listen():
                 if message.topic == self.channel:
    -                yield pickle.loads(message.value)
    +                yield message.value
    
  • src/socketio/kombu_manager.py · +2 −2 · modified
    @@ -1,4 +1,3 @@
    -import pickle
     import time
     import uuid
     
    @@ -7,6 +6,7 @@
     except ImportError:
         kombu = None
     
    +from engineio import json
     from .pubsub_manager import PubSubManager
     
     
    @@ -102,7 +102,7 @@ def _publish(self, data):
                 try:
                     producer_publish = self._producer_publish(
                         self.publisher_connection)
    -                producer_publish(pickle.dumps(data))
    +                producer_publish(json.dumps(data))
                     break
                 except (OSError, kombu.exceptions.KombuError):
                     if retry:
    
  • src/socketio/pubsub_manager.py · +4 −11 · modified
    @@ -2,7 +2,6 @@
     import uuid
     
     from engineio import json
    -import pickle
     
     from .manager import Manager
     
    @@ -196,16 +195,10 @@ def _thread(self):
                         if isinstance(message, dict):
                             data = message
                         else:
    -                        if isinstance(message, bytes):  # pragma: no cover
    -                            try:
    -                                data = pickle.loads(message)
    -                            except:
    -                                pass
    -                        if data is None:
    -                            try:
    -                                data = json.loads(message)
    -                            except:
    -                                pass
    +                        try:
    +                            data = json.loads(message)
    +                        except:
    +                            pass
                         if data and 'method' in data:
                             self._get_logger().debug('pubsub message: {}'.format(
                                 data['method']))
    
  • src/socketio/redis_manager.py · +2 −2 · modified
    @@ -1,5 +1,4 @@
     import logging
    -import pickle
     import time
     from urllib.parse import urlparse
     
    @@ -17,6 +16,7 @@
         valkey = None
         ValkeyError = None
     
    +from engineio import json
     from .pubsub_manager import PubSubManager
     
     logger = logging.getLogger('socketio')
    @@ -145,7 +145,7 @@ def _publish(self, data):
                 try:
                     if not retry:
                         self._redis_connect()
    -                return self.redis.publish(self.channel, pickle.dumps(data))
    +                return self.redis.publish(self.channel, json.dumps(data))
                 except error as exc:
                     if retry:
                         logger.error(
    
  • src/socketio/zmq_manager.py · +5 −5 · modified
    @@ -1,6 +1,6 @@
    -import pickle
     import re
     
    +from engineio import json
     from .pubsub_manager import PubSubManager
     
     
    @@ -75,14 +75,14 @@ def __init__(self, url='zmq+tcp://localhost:5555+5556',
             self.channel = channel
     
         def _publish(self, data):
    -        pickled_data = pickle.dumps(
    +        packed_data = json.dumps(
                 {
                     'type': 'message',
                     'channel': self.channel,
                     'data': data
                 }
    -        )
    -        return self.sink.send(pickled_data)
    +        ).encode()
    +        return self.sink.send(packed_data)
     
         def zmq_listen(self):
             while True:
    @@ -94,7 +94,7 @@ def _listen(self):
             for message in self.zmq_listen():
                 if isinstance(message, bytes):
                     try:
    -                    message = pickle.loads(message)
    +                    message = json.loads(message)
                     except Exception:
                         pass
                 if isinstance(message, dict) and \
    
  • tests/async/test_pubsub_manager.py · +6 −7 · modified
    @@ -1,5 +1,6 @@
     import asyncio
     import functools
    +import json
     from unittest import mock
     
     import pytest
    @@ -482,31 +483,29 @@ async def test_background_thread(self):
             host_id = self.pm.host_id
     
             async def messages():
    -            import pickle
    -
                 yield {'method': 'emit', 'value': 'foo', 'host_id': 'x'}
                 yield {'missing': 'method', 'host_id': 'x'}
                 yield '{"method": "callback", "value": "bar", "host_id": "x"}'
                 yield {'method': 'disconnect', 'sid': '123', 'namespace': '/foo',
                        'host_id': 'x'}
                 yield {'method': 'bogus', 'host_id': 'x'}
    -            yield pickle.dumps({'method': 'close_room', 'value': 'baz',
    -                                'host_id': 'x'})
    +            yield json.dumps({'method': 'close_room', 'value': 'baz',
    +                              'host_id': 'x'})
                 yield {'method': 'enter_room', 'sid': '123', 'namespace': '/foo',
                        'room': 'room', 'host_id': 'x'}
                 yield {'method': 'leave_room', 'sid': '123', 'namespace': '/foo',
                        'room': 'room', 'host_id': 'x'}
                 yield 'bad json'
    -            yield b'bad pickled'
    +            yield b'bad data'
     
                 # these should not publish anything on the queue, as they come from
                 # the same host
                 yield {'method': 'emit', 'value': 'foo', 'host_id': host_id}
                 yield {'method': 'callback', 'value': 'bar', 'host_id': host_id}
                 yield {'method': 'disconnect', 'sid': '123', 'namespace': '/foo',
                        'host_id': host_id}
    -            yield pickle.dumps({'method': 'close_room', 'value': 'baz',
    -                                'host_id': host_id})
    +            yield json.dumps({'method': 'close_room', 'value': 'baz',
    +                              'host_id': host_id})
     
             self.pm._listen = messages
             await self.pm._thread()
    
  • tests/common/test_pubsub_manager.py · +6 −7 · modified
    @@ -1,4 +1,5 @@
     import functools
    +import json
     import logging
     from unittest import mock
     
    @@ -465,31 +466,29 @@ def test_background_thread(self):
             host_id = self.pm.host_id
     
             def messages():
    -            import pickle
    -
                 yield {'method': 'emit', 'value': 'foo', 'host_id': 'x'}
                 yield {'missing': 'method', 'host_id': 'x'}
                 yield '{"method": "callback", "value": "bar", "host_id": "x"}'
                 yield {'method': 'disconnect', 'sid': '123', 'namespace': '/foo',
                        'host_id': 'x'}
                 yield {'method': 'bogus', 'host_id': 'x'}
    -            yield pickle.dumps({'method': 'close_room', 'value': 'baz',
    -                                'host_id': 'x'})
    +            yield json.dumps({'method': 'close_room', 'value': 'baz',
    +                              'host_id': 'x'})
                 yield {'method': 'enter_room', 'sid': '123', 'namespace': '/foo',
                        'room': 'room', 'host_id': 'x'}
                 yield {'method': 'leave_room', 'sid': '123', 'namespace': '/foo',
                        'room': 'room', 'host_id': 'x'}
                 yield 'bad json'
    -            yield b'bad pickled'
    +            yield b'bad data'
     
                 # these should not publish anything on the queue, as they come from
                 # the same host
                 yield {'method': 'emit', 'value': 'foo', 'host_id': host_id}
                 yield {'method': 'callback', 'value': 'bar', 'host_id': host_id}
                 yield {'method': 'disconnect', 'sid': '123', 'namespace': '/foo',
                        'host_id': host_id}
    -            yield pickle.dumps({'method': 'close_room', 'value': 'baz',
    -                                'host_id': host_id})
    +            yield json.dumps({'method': 'close_room', 'value': 'baz',
    +                              'host_id': host_id})
     
             self.pm._listen = mock.MagicMock(side_effect=messages)
             try:
    

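The listener-side change shared by pubsub_manager.py and async_pubsub_manager.py can be sketched in isolation as follows. The helper name decode_pubsub_message is illustrative, not part of the library; it mirrors the patched _thread() parsing, where dicts pass through, strings and bytes are parsed as JSON, and anything that fails to parse is dropped instead of being handed to pickle.loads().

```python
import json

def decode_pubsub_message(message):
    """Accept a queue message only if it is, or parses to, a dict
    carrying a 'method' key; reject everything else."""
    if isinstance(message, dict):
        data = message
    else:
        try:
            data = json.loads(message)  # handles str and bytes
        except (ValueError, TypeError):
            return None
    if isinstance(data, dict) and 'method' in data:
        return data
    return None

decode_pubsub_message({'method': 'emit', 'value': 'foo'})  # dict passthrough
decode_pubsub_message('{"method": "callback"}')            # JSON string
decode_pubsub_message(b'bad data')                         # None: rejected
```

Because JSON can only produce plain data (dicts, lists, strings, numbers), a hostile message can at worst be malformed and discarded; it can no longer trigger code execution during decoding.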