Can you control the number of Amazon Elastic Load Balancer instances?

We are expecting huge surges of traffic for our application - ~1000 websocket keepalive connections opening per second for about 15 minutes. We want to route them through ELB, but we are worried that we might lose some traffic (we haven't run the tests yet) because Amazon won't provision enough ELB instances in time. My question is: can you force them to provision at least x ELB instances?

The docs suggest you can only get ~64k ports per ELB instance, and we need ~1M in total.

Maybe I am mistaken in thinking this is necessary, and 64k is enough because the ELB instance doesn't sit in the middle of the websocket connection once it is established (given our traffic surge characteristics). Clarification on this would be helpful too.

EDIT: The ELB instance does seem to sit in the middle of the connection. When I close the ELB listeners, the connection dies (my websocket server sees it as a client close).


  • You can't force or control ELB provisioning through the API; it scales automatically according to perceived traffic.
  • You can open a support ticket and request more capacity.
  • If you must handle unpredictable peaks, putting more than one ELB behind DNS round robin may help. If your workload is really scalable data ingestion, also consider services better suited to it, such as Amazon Kinesis.
  • If you use HTTP listeners, the ELB will "sit in the middle" of each connection. For websocket traffic you probably want TCP listeners (see the sketch after this list).
  • Mind that ELB closes idle connections automatically; consider sending keep-alives periodically and recovering from connection drops (also sketched below).
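
A minimal sketch of the last two points, assuming boto3 and the `websockets` Python package. The load balancer name, ports, zone, and URI are placeholders, not values from the question:

    # Sketch only: a classic ELB with a TCP listener via boto3, plus a
    # websocket client that pings periodically so the ELB idle timeout
    # (60 s by default) never drops the connection.
    import asyncio

    import boto3
    import websockets

    elb = boto3.client("elb")

    # TCP (layer-4) listener: the ELB forwards raw bytes instead of
    # parsing HTTP, which is what websocket traffic needs on classic ELB.
    elb.create_load_balancer(
        LoadBalancerName="ws-elb-example",      # placeholder name
        Listeners=[{
            "Protocol": "TCP",
            "LoadBalancerPort": 80,
            "InstanceProtocol": "TCP",
            "InstancePort": 8080,
        }],
        AvailabilityZones=["us-east-1a"],       # placeholder zone
    )

    async def keep_alive_client(uri: str) -> None:
        # Reconnect forever; ping more often than the idle timeout so
        # the connection is never considered idle by the ELB.
        while True:
            try:
                async with websockets.connect(uri) as ws:
                    while True:
                        await ws.ping()
                        await asyncio.sleep(30)
            except (websockets.ConnectionClosed, OSError):
                await asyncio.sleep(1)  # brief back-off, then reconnect

    # asyncio.run(keep_alive_client("ws://ws-elb-example.example.com/"))

The reconnect loop matters as much as the pings: even with keep-alives, the ELB scaling events described above can drop established connections, so clients should treat a close as routine and reconnect.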