  • teliumcustomer13

    What is the call handling limit for the unlimited version of HAAst? Does it depend only on hardware and Asterisk limitations? We plan to install a pair of Asterisk servers handling 400 simultaneous calls. Is it possible to use HAAst with them?

  • Telium Support Group

    The unlimited edition of HAAst does not impose any limit on the number of simultaneous calls. From a practical standpoint, HAAst is limited only by the capacity of your hardware and the design of Asterisk.

    We have HAAst “PBX” deployments with over 14,500 phone sets, 3,500+ simultaneous calls, 800 call setups per minute, etc., and HAAst operates perfectly in those environments. We also have HAAst “gateway” deployments (i.e. HAAst is bridging calls and offering services as part of a larger telephony service) with tens of thousands of simultaneous calls.

    Properly sizing your host to support a desired call volume is outside the scope of HAAst. You need to consider many factors such as transcoding (CPU heavy), recording (I/O heavy), bridging/conferencing, etc. As well, placing Asterisk inside a container or virtual machine imposes additional limits on CPU and I/O availability. If Telium provides a turnkey HA solution, we assume responsibility for scaling the solution to meet your needs; however, in general Telium does not provide hardware scaling assistance.
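    If you want a rough feel for where your own host tops out, one simple approach is to drive test traffic and record concurrent channels alongside system load. The sketch below is purely illustrative (it is not a Telium or HAAst tool) and assumes a Linux host where the standard `asterisk -rx` remote console is available to the user running it:

    ```python
    #!/usr/bin/env python3
    # Illustrative sizing probe (not part of HAAst): logs the number of active
    # Asterisk channels and the 1-minute load average every 10 seconds, so you
    # can watch how the host behaves as test call volume increases.
    import os
    import subprocess
    import time

    def active_channels() -> int:
        # "core show channels count" is a standard Asterisk CLI command whose
        # output includes a line such as "42 active channels".
        out = subprocess.run(
            ["asterisk", "-rx", "core show channels count"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            if "active channels" in line:
                return int(line.split()[0])
        return 0

    if __name__ == "__main__":
        while True:
            load1, _, _ = os.getloadavg()
            print(f"{time.strftime('%H:%M:%S')}  channels={active_channels()}  load1={load1:.2f}")
            time.sleep(10)
    ```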

    For companies requiring HA at call volumes beyond what a single host/VM/container can support, we recommend running multiple clusters. To keep the clusters in sync, we recommend adding our PBXsync product.

    If you are at the point of sizing hardware for purchase, we recommend adding 5% to CPU capacity and 500 MB to memory capacity for HAAst, above what is demanded by Asterisk (which, at the overall scale of these systems, is negligible).
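    As a purely illustrative example (the Asterisk figures below are made-up numbers, not a recommendation), applying that rule of thumb to a host you have already sized for Asterisk looks like this:

    ```python
    # Illustrative arithmetic for the sizing rule of thumb above; the Asterisk
    # requirements are made-up example figures, not measurements or advice.
    asterisk_cpu_cores = 8.0   # cores your own sizing exercise says Asterisk needs
    asterisk_memory_gb = 8.0   # memory (GB) your own sizing exercise says Asterisk needs

    total_cpu_cores = asterisk_cpu_cores * 1.05    # add 5% CPU capacity for HAAst
    total_memory_gb = asterisk_memory_gb + 0.5     # add 500 MB memory for HAAst

    print(f"Provision at least {total_cpu_cores:.1f} cores and {total_memory_gb:.1f} GB of RAM")
    ```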
