Can someone explain __all__ in Python?

I have been using Python more and more, and I keep seeing the variable __all__ set in different __init__.py files. Can someone explain what this does?

It's a list of public objects of that module, as interpreted by import *. It overrides the default of hiding everything that begins with an underscore. Linked to, but not explicitly mentioned here, is exactly when __all__ is used: it is a list of strings defining which symbols in a module will be exported when from <module> import * is used on the module. For example, the following code in foo.py explicitly exports the symbols bar and baz:
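
A minimal sketch of what such a foo.py could look like; only the names bar and baz come from the excerpt, waz is illustrative:

    # foo.py -- with __all__ defined, 'from foo import *' binds only bar and baz
    __all__ = ['bar', 'baz']

    waz = 5            # not exported: it is not listed in __all__

    bar = 10           # exported

    def baz():         # exported
        return 'baz'

Without __all__, import * would instead export every top-level name that does not start with an underscore.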

Calculating lunar/lunisolar holidays in python

Any calendar nuts here? I've been looking for information on how to calculate the holidays of the current year that occur irregularly in the Gregorian calendar. Typically this happens because the holiday is based on an older lunar calendar. I've googled ad nauseam and made some progress, but I'm not able to finish. If anyone has sample code in a modern language that describes their calculation, I'd be very grateful. I'd prefer Python or one of the C* languages. My progress so far: Done: Easter can be found with python-dateutil. Hanukkah and the other Hebrew calendar holidays can use p
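
Since the excerpt notes that Easter can be found with python-dateutil, here is a minimal sketch of that piece (assumes python-dateutil is installed; the years are arbitrary):

    from dateutil.easter import easter

    for year in (2023, 2024, 2025):
        # Gregorian date of Western Easter Sunday for that year
        print(year, easter(year))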

Python 3.5 async keyword

PEP 0492 adds the async keyword to Python 3.5. How does Python benefit from the use of this operator? The example given for a coroutine is

    async def read_data(db):
        data = await db.fetch('SELECT ...')

According to the docs, this achieves "suspend[ing] execution of read_data coroutine until db.fetch awaitable completes and returns the result data". Does this async keyword actually involve creating new threads, or perhaps use an existing reserved async thread? In the case where async does use a reserved thread, is it a single shared thread?

No, coroutines don't inv
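
To illustrate the "no threads involved" point, here is a small sketch; FakeDB and its fetch() coroutine are invented stand-ins for the db.fetch call in the example, and both prints show the same thread id because everything runs on the event loop's single thread:

    import asyncio
    import threading

    class FakeDB:                         # stand-in for the question's db object
        async def fetch(self, query):
            await asyncio.sleep(0.1)      # suspension point; the event loop keeps running
            return ['row1', 'row2']

    async def read_data(db):
        print('coroutine thread:', threading.get_ident())
        data = await db.fetch('SELECT ...')
        return data

    print('main thread:', threading.get_ident())
    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(read_data(FakeDB())))
    loop.close()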

Why doesn't await wait for asyncio.create_subprocess_exec?

I'm writing a coroutine to execute a shell command in Python, based on a tutorial. Here are the basics:

    import asyncio

    async def async_procedure():
        process = await asyncio.create_subprocess_exec('ping', '-c', '2', 'google.com')
        await process.wait()
        print('async procedure done.')

    loop = asyncio.get_event_loop()
    loop.run_until_complete(async_procedure())
    loop.close()

The code above works perfectly. It gives a result like this: P
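
As a variation on the sketch above, capturing the command's output works along these lines (a sketch using the standard asyncio.subprocess PIPE/communicate() pattern; the ping arguments are the ones from the question):

    import asyncio

    async def async_procedure():
        process = await asyncio.create_subprocess_exec(
            'ping', '-c', '2', 'google.com',
            stdout=asyncio.subprocess.PIPE)
        stdout, _ = await process.communicate()   # waits for exit and collects output
        print(stdout.decode())
        print('async procedure done.')

    loop = asyncio.get_event_loop()
    loop.run_until_complete(async_procedure())
    loop.close()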

Seemingly infinite recursion with generator based coroutines

The following is taken from David Beazley's slides on generators (here, for anybody interested). A Task class is defined which wraps a generator that yields futures. The Task class, in full (w/o error handling), follows:

    class Task:
        def __init__(self, gen):
            self._gen = gen

        def step(self, value=None):
            try:
                fut = self._gen.send(value)
                fut.add_done_callback(self._wakeup)
            except StopIteration as exc:
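
Here is a sketch of how such a Task wrapper is typically driven end to end; the _wakeup callback body, slow_add, and the driver lines are assumptions added for illustration rather than the slide's exact code:

    from concurrent.futures import ThreadPoolExecutor
    import time

    class Task:
        def __init__(self, gen):
            self._gen = gen

        def step(self, value=None):
            try:
                fut = self._gen.send(value)            # run until the next yielded future
                fut.add_done_callback(self._wakeup)    # resume when that future completes
            except StopIteration:
                pass

        def _wakeup(self, fut):                        # assumed callback: feed the result back in
            self.step(fut.result())

    pool = ThreadPoolExecutor(max_workers=1)

    def slow_add(x, y):
        time.sleep(0.1)
        return x + y

    def coro():
        result = yield pool.submit(slow_add, 1, 2)     # yield a Future and suspend
        print('got', result)

    Task(coro()).step()
    pool.shutdown(wait=True)    # wait for the worker thread and its callback to finish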

"async with" in Python 3.4

The Getting Started docs for aiohttp give the following client example:

    import asyncio
    import aiohttp

    async def fetch_page(session, url):
        with aiohttp.Timeout(10):
            async with session.get(url) as response:
                assert response.status == 200
                return await response.read()

    loop = asyncio.get_event_loop()
    with aiohttp.ClientSession(loop=loop) as session:
        content = loop.run_until_complete(
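
For context, async with (like async def and await) is new syntax in Python 3.5, so it is a SyntaxError on 3.4; there, coroutines are spelled with the asyncio.coroutine decorator and yield from. A general sketch of that older spelling, deliberately not aiohttp-specific since the 3.4-era aiohttp API may differ:

    import asyncio

    @asyncio.coroutine                    # decorator-based coroutine (removed in Python 3.11)
    def fetch_page_34(delay):
        yield from asyncio.sleep(delay)   # 3.4 spelling of 'await asyncio.sleep(delay)'
        return 'page content'

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(fetch_page_34(0.1)))
    loop.close()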

Generator-based coroutine versus native coroutine

I just read PEP 0492, talking about the new approach to coroutines, but the PEP failed to make me understand the difference between generator-based coroutines and native ones. Can someone tell me the difference (maybe with examples)? From what I understood, they use different keywords (yield/yield from versus await/async/yield). I understand that at the end of a native coroutine a yield is expected, but this is also true for the generator-based ones.

There is no functional difference. "Native coroutines" using the async and await keywords are just syntactic sugar for what was previously implemented with "generator-based coroutines". If you don't need
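
A small sketch of the two spellings side by side (the doubling coroutines are arbitrary examples; note that the asyncio.coroutine decorator was deprecated in Python 3.8 and removed in 3.11, so this targets the era the question is about):

    import asyncio

    @asyncio.coroutine
    def generator_based(x):
        yield from asyncio.sleep(0)    # generator-based: decorator plus yield from
        return x * 2

    async def native(x):
        await asyncio.sleep(0)         # native: async def plus await
        return x * 2

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(generator_based(2)))   # 4
    print(loop.run_until_complete(native(2)))            # 4
    loop.close()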

Are Python/ES6 Generators also Coroutines?

My understanding of the generators in Python and ECMAScript is that they are more capable than ordinary generators. For example, both allow for values to be passed back into the generator via next(), and they both allow yielding from another generator (yield from in Python and yield* in ES6), two things that aren't needed in ordinary generators. So, given this extended functionality, are the generators implemented in Python and ES6, for all intents and purposes, the same as coroutines? Is there any difference?

From PEP 380 on yield from: Python generators are a form of coroutine, but have the limitation that they can only yield to their immediate caller.
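
A minimal sketch of the two "coroutine-like" generator features the question mentions, sending a value back in and delegating to another generator (the strings are purely illustrative):

    def inner():
        received = yield 'ready'            # send() resumes the generator here
        yield 'inner got {}'.format(received)

    def outer():
        yield from inner()                  # delegation: send() is routed to inner()

    g = outer()
    print(next(g))           # 'ready'
    print(g.send('hello'))   # 'inner got hello'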

How to unify sender

As far as I understood the coroutine concept in Python, you can basically have two different modes of passing data (sorry, I couldn't come up with or find better terms for these):

Sender-based: Each coroutine consumes data from "outside" and sends it to a consumer, e.g.

    def coro(consumer):
        while True:
            item = yield
            consumer.send(process(item))

To build pipelines, one nests them from the outside in: producer(filter(sink())).

Receiver-based: Each coroutine consumes data from its arguments and hands it on to its consumer
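
A runnable sketch of the sender-based style described above; process(), printer(), and the sample values are illustrative stand-ins:

    def process(item):
        return item * 2

    def printer():                  # the sink at the end of the pipeline
        while True:
            item = yield
            print('sink got', item)

    def coro(consumer):
        while True:
            item = yield
            consumer.send(process(item))

    sink = printer()
    next(sink)                      # prime the sink
    pipe = coro(sink)
    next(pipe)                      # prime the middle stage
    for value in (1, 2, 3):
        pipe.send(value)            # push data in from "outside"

Running it prints "sink got 2", "sink got 4", "sink got 6".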

asyncio: how to pause a coroutine until send is called

Let's say that I have a bus that receives messages from somewhere. Every message has a target and a msg, and I want to implement a subscription mechanism so other coroutines can subscribe to a specific target:

    subscriptions = {}

    async def subscribe(target):
        subscriptions[target] = wait_for_messages()

    async def proc_msg(target, msg):
        subscriptions[target].send(msg)

    async def wait_for_messages():
        while True:
            asyncio.sleep(1)

    async def subscribe(target)
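
One common way to get the "pause until a message arrives" behaviour is to store an asyncio.Queue per target instead of calling send() on a coroutine; the following is a sketch under that assumption, with the 'sensor' target and the demo driver invented for illustration:

    import asyncio

    subscriptions = {}

    async def subscribe(target):
        queue = asyncio.Queue()
        subscriptions[target] = queue
        while True:
            msg = await queue.get()        # suspends here until proc_msg puts something
            print(target, 'received', msg)

    async def proc_msg(target, msg):
        await subscriptions[target].put(msg)

    async def main():
        task = asyncio.ensure_future(subscribe('sensor'))
        await asyncio.sleep(0)             # let subscribe() register its queue
        await proc_msg('sensor', 'hello')
        await asyncio.sleep(0.1)           # give the subscriber a chance to run
        task.cancel()
        try:
            await task
        except asyncio.CancelledError:
            pass

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()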
