A question about Redis pub/sub: hoping an expert can come in and point me in the right direction

zrd634550666 2013-10-29 09:38:13
I don't understand this very well myself, so please forgive any unclear wording!
Here is the problem. I have four game servers; when a user logs in, they land on a different game server depending on their game account, and there is a single Redis storage server. Say player 1 is on server 1 and player 2 is on server 2; to let players 1 and 2 communicate with each other, I want to use Redis pub/sub as the message queue.
The question is: when players log in, should each player subscribe to their own channel (in which case I don't know what the limit on the number of Redis subscriptions is), or should all players subscribe to one shared channel and have each player filter the messages the server publishes?
Or have I simply misunderstood how this is supposed to work?
Hoping an expert can point me in the right direction!
Here is a short piece of code to illustrate the idea:
Publisher (send):

    import redis

    rc = redis.Redis(host='127.0.0.1')
    # Publishing only needs a plain connection; the publisher does not have to
    # subscribe to anything, and the payload should be a string (or bytes).
    rc.publish('foo', 'hello')
Subscriber (receive):

    import redis

    rc = redis.Redis(host='127.0.0.1')
    ps = rc.pubsub()
    ps.subscribe(['foo', 'bar'])      # listen on the 'foo' and 'bar' channels
    for item in ps.listen():          # blocks, yielding one dict per event
        if item['type'] == 'message':
            print item['data']
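To make the two options in the question concrete, here is a minimal sketch of the one-channel-per-player variant, assuming a 'player:<id>' naming scheme and helper names that are my own invention rather than anything from the thread. The shared-channel variant would instead publish everything to a single channel and have each game server filter by recipient id.

    import redis

    rc = redis.Redis(host='127.0.0.1')

    def channel_for(player_id):
        # Hypothetical naming scheme: one channel per player.
        return 'player:%s' % player_id

    def send_to_player(player_id, text):
        # Any game server can publish; only connections subscribed to this
        # player's channel receive the message.
        rc.publish(channel_for(player_id), text)

    def listen_for_player(player_id):
        # Runs on whichever game server currently hosts this player; blocks.
        ps = rc.pubsub()
        ps.subscribe([channel_for(player_id)])
        for item in ps.listen():
            if item['type'] == 'message':
                print item['channel'], item['data']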
zrd634550666 2013-11-16
Quoting #5 (panghuhu250): the pub/sub performance explanation from Salvatore, reproduced in full in the reply below.
Thanks for the patient, detailed help! I'll study it carefully and optimize accordingly!
panghuhu250 2013-11-02
Quoting #4 (jeky198306): "Hi, that link won't open for me; I'd like to read it too, thanks~~"
Here is the key part:

This is how Pub/Sub works performance wise in Redis. You have two things.

1) Subscribers to channels (SUBSCRIBE)
2) Subscribers to *patterns* (PSUBSCRIBE)
3) Publishers (PUBLISH)

You can consider the work of subscribing/unsubscribing as a constant time operation, O(1) for both subscribing and unsubscribing (actually PSUBSCRIBE does more work than this if you are subscribed already to many patterns with the *same* client).

All the complexity on the end is on the PUBLISH command, that performs an amount of work that is proportional to:

a) The number of clients receiving the message.
b) The number of clients subscribed to a pattern, even if they'll not match the message.

This means that if you have N clients subscribed to 100000 different channels, everything will be super fast.

If you have instead 10000 clients subscribed to the same channel, PUBLISH commands against this channel will be slow, and take maybe a few milliseconds (not sure about the actual time taken). Since we have to send the same message to everybody.

Also, if you have clients subscribed to 10000 *patterns* publish will be slower than usually, but the work to do for every existing pattern is smaller compared to the work that there is to do for every client *receiving* the message.

About memory, it is similar or smaller than the one used by a key, so you should not have problems to subscribe to millions of channels even in a small server.

Salvatore
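To make point 2) above concrete, this is what a pattern subscriber looks like in redis-py; the 'player:*' pattern is only an illustrative choice, not something from the thread. As the explanation says, every pattern subscription adds a little work to every PUBLISH, even for messages it will not match, so a handful of pattern subscribers (say, one per game server) stays cheap, while thousands of patterns would slow publishing down.

    import redis

    ps = redis.Redis(host='127.0.0.1').pubsub()
    ps.psubscribe(['player:*'])          # one pattern covers every player channel
    for item in ps.listen():
        if item['type'] == 'pmessage':   # pattern deliveries arrive as 'pmessage'
            print item['pattern'], item['channel'], item['data']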
jeky_zhang2013 2013-11-01
Quoting #3 (panghuhu250): "There is some discussion here: https://groups.google.com/forum/#!topic/redis-db/R09u__3Jzfk In the end you'll still have to test to determine whether there is a performance problem."
Hi, that link won't open for me; I'd like to learn about this too, thanks~~
panghuhu250 2013-10-30
I think the pub/sub mechanism is meant for many-to-many communication: the sender doesn't need to know how many receivers there are, and the receivers don't need to know how many senders there are. I would create a mailbox (a list or set) for each user: sending a message means appending it to the recipient's mailbox, and receiving means reading the contents of your own mailbox (and then clearing it).
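A minimal sketch of this mailbox idea, assuming one Redis list per user under a hypothetical 'mailbox:<player_id>' key (the key scheme and function names are mine, not from the reply). A list preserves arrival order, and a redis-py pipeline lets the read step fetch and clear the mailbox in a single MULTI/EXEC transaction; unlike pub/sub, messages sent while the recipient is offline simply wait in the list.

    import redis

    rc = redis.Redis(host='127.0.0.1')

    def mailbox_key(player_id):
        # Hypothetical key scheme: one list per player.
        return 'mailbox:%s' % player_id

    def send_message(to_player_id, text):
        # Append the message to the recipient's mailbox (kept in FIFO order).
        rc.rpush(mailbox_key(to_player_id), text)

    def fetch_messages(player_id):
        # Read the whole mailbox and clear it atomically (MULTI/EXEC).
        pipe = rc.pipeline()
        pipe.lrange(mailbox_key(player_id), 0, -1)
        pipe.delete(mailbox_key(player_id))
        messages, _ = pipe.execute()
        return messages

    send_message(2, 'hello from player 1')
    print fetch_messages(2)   # ['hello from player 1']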
panghuhu250 2013-10-30
There is some discussion here: https://groups.google.com/forum/#!topic/redis-db/R09u__3Jzfk In the end you'll still have to test to determine whether there is a performance problem.
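For such a test, a rough sketch like the one below could be a starting point: it opens a number of subscriber connections on one shared channel and times PUBLISH from the publisher's point of view. The channel name and the counts are placeholders; a real test should mirror the intended channel layout (one channel per player vs. one shared channel) and realistic message sizes.

    import time
    import redis

    N_SUBSCRIBERS = 100   # placeholder: clients subscribed to the same channel
    N_MESSAGES = 1000     # placeholder: number of publishes to time

    # Open N subscriber connections, all listening on one channel. They never
    # read their messages here, so keep N_MESSAGES small enough to stay under
    # Redis's pubsub client-output-buffer limit.
    subscribers = []
    for _ in range(N_SUBSCRIBERS):
        ps = redis.Redis(host='127.0.0.1').pubsub()
        ps.subscribe(['bench'])
        subscribers.append(ps)

    # redis-py waits for PUBLISH's integer reply, so the elapsed time reflects
    # the server-side fan-out work described in the quoted explanation.
    rc = redis.Redis(host='127.0.0.1')
    start = time.time()
    for i in range(N_MESSAGES):
        rc.publish('bench', 'hello %d' % i)
    elapsed = time.time() - start
    print 'average PUBLISH time: %.3f ms' % (elapsed / N_MESSAGES * 1000)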
zrd634550666 2013-10-30
Quoting #1 (panghuhu250): the mailbox suggestion above (use a per-user list or set as a mailbox).
Thanks, moderator, I now roughly understand the process! What I'm doing at the moment is: for every user who comes online I subscribe a mailbox for them (not a list or set, but a message channel subscribed through redis's pubsub() and subscribe()), and I start one thread per online user to listen on it. I'm pasting the code below; could you take a look at whether this will run into problems once there are a lot of players?

Connection part:

    class PlayerMQ(DynamicRedisHandler):
        """Per-player message queue handling."""
        _MQ = None

        def __init__(self, player_id):
            _MQ_client = self.__class__.get_client(self.__class__.get_client_config(player_id))
            self._MQ = _MQ_client.pubsub()
            self._MQ.subscribe([self.__class__.get_kvs_key(player_id)])

Listening part:

    def listen(self):
        # Plain method (not a property) so the loop runs inside the worker thread.
        for message in self.MQ._MQ.listen():
            print message

    @property
    def online(self):
        """Bring the player online."""
        self.MQ = PlayerMQ(self.id)
        t = threading.Thread(target=self.listen)
        t.setDaemon(True)   # daemon thread: it exits together with the process
        t.start()
        self.set("_online", 1)

Then, depending on what is needed, I look up the target player's mailbox and publish() to it.
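The part most likely to hurt at scale in the code above is one pubsub connection plus one thread per online player, since each costs a socket on both the game server and Redis. A possible alternative, sketched below under my own assumptions (the 'player:<id>' channel naming and the handler callback are hypothetical, not from the thread), is one pubsub connection per game-server process, subscribed to the channels of all locally hosted players and drained by a single daemon thread that dispatches by channel name. Per the quoted explanation, many channels with one subscriber each is the cheap case for PUBLISH, so this layout grows with the number of game servers rather than with the number of players.

    import threading
    import redis

    class LocalDispatcher(object):
        """One pubsub connection and one listener thread for a whole game server."""

        def __init__(self, local_player_ids, handler, host='127.0.0.1'):
            # handler(player_id, data) is a hypothetical callback into game logic.
            self._handler = handler
            # decode_responses=True so channel names come back as plain strings.
            self._ps = redis.Redis(host=host, decode_responses=True).pubsub()
            # All locally hosted players share this single connection.
            self._ps.subscribe(['player:%s' % pid for pid in local_player_ids])

        def start(self):
            # One daemon thread serves every local player instead of one per player.
            t = threading.Thread(target=self._loop)
            t.setDaemon(True)
            t.start()

        def _loop(self):
            for item in self._ps.listen():
                if item['type'] == 'message':
                    player_id = item['channel'].split(':', 1)[1]
                    self._handler(player_id, item['data'])

Players joining or leaving a server would still need subscribe/unsubscribe calls on the shared connection; that bookkeeping is left out here to keep the sketch small.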
