Conditional in a coroutine based on whether it was called again?

I am trying to translate this key-debouncing logic from JavaScript to Python:

    function handle_key(key) {
        if (this.state == null) { this.state = '' }
        this.state += key
        clearTimeout(this.timeout)
        this.timeout = setTimeout(() => { console.log(this.state) }, 500)
    }
    handle_key('a')
    handle_key('b')

The idea is that subsequent key presses extend the timeout. The JavaScript version prints: ab. I don't want…

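One way to express this in Python is to cancel a pending asyncio task and schedule a new one on each call. A minimal sketch, assuming an asyncio event loop is acceptable; the global state/timeout names mirror the JavaScript, and 0.5 s stands in for the 500 ms timeout:

    import asyncio

    state = ""
    timeout = None

    def handle_key(key):
        # Mirror the JavaScript: extend the buffer, reset the pending timeout.
        global state, timeout
        state += key
        if timeout is not None:
            timeout.cancel()                     # like clearTimeout
        timeout = asyncio.create_task(flush())   # like setTimeout

    async def flush():
        await asyncio.sleep(0.5)                 # 500 ms debounce window
        print(state)

    async def main():
        handle_key("a")
        handle_key("b")                          # cancels the first flush
        await asyncio.sleep(1)                   # let the surviving flush fire

    asyncio.run(main())                          # prints: ab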

How can I wrap a synchronous function in an async coroutine?

I'm using aiohttp to build an API server that sends TCP requests off to a separate server. The module that sends the TCP requests is synchronous and a black box for my purposes. So my problem is that these requests are blocking the entire API. I need a way to wrap the module requests in an asynchronous coroutine that won't block the rest of the API. So, just using sleep as a simple example, is there any way to wrap time-consuming synchronous code in a non-blocking coroutine, something like this:

    async def sleep_async(delay):
        # After calling sleep, loop…

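A common approach (sketched here with time.sleep standing in for the black-box TCP module, which is an assumption, not the poster's code) is to hand the blocking call to the loop's default thread-pool executor; on Python 3.9+, asyncio.to_thread is a shorthand for the same idea:

    import asyncio
    import time

    def blocking_request(delay):
        time.sleep(delay)                        # the blocking black-box call
        return "done after {}s".format(delay)

    async def sleep_async(delay):
        loop = asyncio.get_running_loop()
        # None -> use the loop's default ThreadPoolExecutor
        return await loop.run_in_executor(None, blocking_request, delay)

    async def main():
        # Both requests run concurrently; total time is ~2 s, not 4 s.
        results = await asyncio.gather(sleep_async(2), sleep_async(2))
        print(results)

    asyncio.run(main())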

Calling a coroutine from asyncio.Protocol.data_received

This is similar to Calling coroutines in asyncio.Protocol.data_received, but I think it warrants a new question. I have a simple server set up like this:

    loop.create_unix_server(lambda: protocol, path=serverSocket)

It works fine; if I do this

    def data_received(self, data):
        data = b'data reply'
        self.send(data)

my client gets the reply. But I can't get it to work with any sort of asyncio call. I tried all of the following, and none of them work:

    @asyncio.coroutine
    def go…

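Since data_received is a plain callback, it cannot await anything; the usual pattern is to schedule a coroutine on the running loop and return immediately. A minimal sketch, with the EchoProtocol class and the socket path as illustrative assumptions rather than the poster's code:

    import asyncio

    class EchoProtocol(asyncio.Protocol):
        def connection_made(self, transport):
            self.transport = transport

        def data_received(self, data):
            # Can't await here; hand the work to the event loop instead.
            asyncio.ensure_future(self.reply(data))

        async def reply(self, data):
            await asyncio.sleep(0)               # stand-in for real async work
            self.transport.write(b'data reply')

    async def main():
        loop = asyncio.get_running_loop()
        # hypothetical socket path for the sketch
        server = await loop.create_unix_server(EchoProtocol, path='/tmp/demo.sock')
        async with server:
            await server.serve_forever()

    asyncio.run(main())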

Understanding Python Concurrency with Asyncio

I was wondering how concurrency works in Python 3.6 with asyncio. My understanding is that when the interpreter executes an await statement, it leaves it there until the awaited operation is complete and then moves on to execute another coroutine task. But what I see in the code below is not like that: the program runs synchronously, executing the tasks one by one. What is wrong with my understanding and with the code?

    import asyncio
    import time

    async def myWorker(lock, i):
        print("Attempting to attain lock {}".format(i))
        # acquire lo…

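The usual resolution is that await by itself is sequential: awaiting coroutines one after another runs them one by one, and concurrency only appears when they are scheduled together, for example via asyncio.gather. A sketch, with the worker body simplified from the question:

    import asyncio

    async def myWorker(i):
        print("worker {} starting".format(i))
        await asyncio.sleep(1)                   # an await point the loop can exploit
        print("worker {} done".format(i))

    async def sequential():
        for i in range(3):
            await myWorker(i)                    # runs one by one: ~3 s total

    async def concurrent():
        await asyncio.gather(*(myWorker(i) for i in range(3)))  # ~1 s total

    asyncio.run(concurrent())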

Is there a way to manually switch on the asyncio event loop?

I want to use the event loop to monitor any data inserted into my asyncio.Queue (you can find its source code here: https://github.com/python/cpython/blob/3.6/Lib/asyncio/queues.py), but I run into some problems. Here is the code:

    import asyncio
    import threading

    async def recv(q):
        while True:
            msg = await q.get()
            print(msg)

    async def checking_task():
        while True:
            await asyncio.sleep(0.1)

    def loop_in_thread(…

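If the loop runs in its own thread, a plain q.put_nowait from another thread will not wake it, because asyncio.Queue is not thread-safe; the thread-safe entry points are loop.call_soon_threadsafe and asyncio.run_coroutine_threadsafe. A sketch under that assumption (and assuming Python 3.10+, where asyncio.Queue no longer binds to a loop at construction):

    import asyncio
    import threading
    import time

    async def recv(q):
        while True:
            msg = await q.get()
            print("got:", msg)

    def loop_in_thread(loop, q):
        asyncio.set_event_loop(loop)
        loop.create_task(recv(q))
        loop.run_forever()

    loop = asyncio.new_event_loop()
    q = asyncio.Queue()
    threading.Thread(target=loop_in_thread, args=(loop, q), daemon=True).start()

    # Hand the put over to the loop's own thread so it wakes up:
    loop.call_soon_threadsafe(q.put_nowait, "hello")
    time.sleep(0.5)                              # give the loop time to print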

Unsure how to cancel long-running coroutines

I'm developing a CLI that interacts with a web service. When run, it will try to establish communication with the service, send requests, receive and process replies, and then terminate. I'm using coroutines in various parts of my code and asyncio to drive them. What I'd like is to be able to perform all these steps and then have all coroutines cleanly terminate at the end (i.e., in a way that doesn't make asyncio complain). Unfortunately, I find asyncio harder to use and understand than asynchrony in other languages such as C#. I defined a class to handle all direct communication with the web service over a websocket connection:

    class CommunicationServic…

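For the clean-termination part, one common pattern is to cancel whatever is still pending and then await the cancellation, so asyncio has nothing left to complain about. A minimal sketch with the websocket details elided:

    import asyncio
    import contextlib

    async def background_job():
        while True:
            await asyncio.sleep(1)               # stand-in for websocket traffic

    async def main():
        task = asyncio.create_task(background_job())
        await asyncio.sleep(0.1)                 # the real CLI work would go here
        task.cancel()
        with contextlib.suppress(asyncio.CancelledError):
            await task                           # let cancellation finish cleanly

    asyncio.run(main())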

Asyncio coroutines

I thought I had grokked coroutines with David Beazley's very good presentation, but I can't fully reconcile it with the new syntax described in PEP 492. In the presentation, he explains how coroutines can be thought of as a pipeline that gets pushed to, as opposed to pulled from like generators. For example:

    # cofollow.py
    #
    # A simple example showing how to hook up a pipeline with
    # coroutines. To run this, you will need a log file.
    # Run the program logsim.py in the background to get a data
    # source.

    from co…

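The gap between the two models can be seen in a few lines: Beazley's coroutines are plain generators that data is pushed into with .send(), while PEP 492 async def coroutines are awaited by an event loop. A sketch in the spirit of the pipeline examples, not the original cofollow.py code:

    def printer():
        # Classic Beazley-style coroutine: a generator you push data into.
        while True:
            line = yield
            print("got:", line)

    p = printer()
    next(p)                # prime it to the first yield
    p.send("hello")        # data is pushed in
    p.send("world")

Under PEP 492, the pushing side is typically replaced by awaiting an asyncio.Queue, so data is pulled at await points rather than pushed in with .send().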

How to not await in a loop with asyncio?

Here is a toy example that downloads the home page from several websites using asyncio and aiohttp:

    import asyncio
    import aiohttp

    sites = [
        "http://google.com",
        "http://reddit.com",
        "http://wikipedia.com",
        "http://afpy.org",
        "http://httpbin.org",
        "http://stackoverflow.com",
        "http://reddit.com",
    ]

    async def main(sites):
        for site in sites:
            download(site)

    async def download(site):
        response…

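The catch in the excerpt is that download(site) without an await merely creates coroutine objects that never run; the common fix is to hand them all to asyncio.gather so they execute concurrently. A sketch with a shortened site list and a shared aiohttp session (the session is my addition, not shown in the excerpt):

    import asyncio
    import aiohttp

    sites = ["http://httpbin.org", "http://afpy.org"]

    async def download(session, site):
        async with session.get(site) as response:
            print(site, response.status)

    async def main():
        async with aiohttp.ClientSession() as session:
            # All downloads run concurrently instead of never running at all.
            await asyncio.gather(*(download(session, s) for s in sites))

    asyncio.run(main())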

Coroutine-based state machines

I have a tricky and interesting question for you. While working on I/O tasks such as protocol implementations over some transport layer in Twisted or Tornado, I found a recurring scenario or pattern. The pattern is generic rather than abstract. For example, when you are working with a MODEM-like device, you send it commands and receive the results. However, sometimes you need to react to the modem's response to a new command depending on the last command. For example, let the modem be M, let -> be a communication operator taking one argument (the message key), and let the server be S:

    1. S ->(a) M
    1.1 M ->…

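One way to model this is a generator-based state machine: the coroutine keeps the protocol state between messages, so the reaction to a message can depend on what came before. A toy sketch with made-up modem commands (DIAL, +++) that are illustrative, not taken from the question:

    def modem_protocol():
        online = False                           # state carried across messages
        reply = None
        while True:
            cmd = yield reply
            if online:
                if cmd == "+++":
                    online = False               # escape sequence leaves data mode
                    reply = "OK"
                else:
                    reply = "echo:" + cmd        # in data mode, payloads are echoed
            elif cmd == "DIAL":
                online = True
                reply = "CONNECT"
            else:
                reply = "OK"

    m = modem_protocol()
    next(m)                # prime the coroutine
    print(m.send("AT"))    # OK
    print(m.send("DIAL"))  # CONNECT
    print(m.send("hi"))    # echo:hi -- same kind of message, different state
    print(m.send("+++"))   # OK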

Python `with` context vs generators/coroutines/tasks

I want to experiment with using Python with blocks to apply modifiers to actions within that block. But I'm not sure if it's possible to do this sensibly in the presence of coroutines. For example, suppose I have a WithContext object that temporarily pushes onto a stack, like this:

    class WithContext:
        stack = []

        def __init__(self, val):
            self.val = val

        def __enter__(self):
            WithContext.stack.append(self.val)

        def __exit__(self, exc_type, exc_val, exc_tb):
            WithContext.stack.p…

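The hazard is that a single class-level stack is shared by every task, so interleaved coroutines can pop each other's values. One commonly suggested remedy is contextvars, which gives each asyncio task its own copy of the context. A sketch using a single ContextVar in place of the stack from the question:

    import asyncio
    import contextvars

    current = contextvars.ContextVar("current", default=None)

    class WithContext:
        def __init__(self, val):
            self.val = val

        def __enter__(self):
            self.token = current.set(self.val)   # task-local "push"

        def __exit__(self, exc_type, exc_val, exc_tb):
            current.reset(self.token)            # task-local "pop"

    async def worker(val):
        with WithContext(val):
            await asyncio.sleep(0.01)            # another task may run here
            assert current.get() == val          # still sees its own value

    async def main():
        await asyncio.gather(worker("a"), worker("b"))
        print("no interleaving corruption")

    asyncio.run(main())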