Scaling Content Delivery
You can scale Content Delivery by load balancing; by installing separate server roles on separate machines; by running the Content Deployer multi-threaded on one machine; by defining different publish behavior for staging sites or live sites; and by publishing different content to different servers based on BluePrinting.
- Content Delivery microservice load balancing
The optimal ratio of Presentation Servers to microservices depends on the specific load you place on your setup. Load testing may reveal that your Presentation Servers put too much stress on your Content Delivery microservices. If so, then depending on the type of microservice, you may be able to relieve the stress by running multiple microservice instances side by side and placing a load balancer in front of them.
- Content Delivery presentation environment load balancing
Implement load balancing by replicating your website and placing a load balancer in front of the replicas.
- Scaling out Content Delivery Server Roles to separate machines
Each of the Server Roles can be installed on a separate machine to improve Content Delivery performance. The Content Deployer in particular, which typically carries most of the performance burden, is best installed on a separate machine. In such a scaled-out scenario, if you install a standalone microservice on its own machine, ensure that its PowerShell installation script does not include a dependency on the Discovery Service.
- Scaling out the Content Deployer
You can make the Content Deployer scalable by storing temporary incoming content in a Redis database and using a JMS Content Deployer queue. All Content Deployer instances can then check the queue and pick up content from the database.
- Content Delivery search feature scaling
The Content Delivery search feature can be scaled out in various ways, depending on your type of environment.
- Using BluePrinting to publish different content to different machines
Use your BluePrint structure to decide which content to publish to which machine, for efficiency and better performance.
- Sharing an OpenSearch cluster across a staging and live website
If you host a staging website and a live website, each with its own Content Delivery environment, you might want the two to use the same OpenSearch cluster but different indexes for your search feature. To achieve this, modify your scripts so that the indexes in the two environments have different names.
- Sharing an Experience Optimization OpenSearch cluster across a staging and live website
If you host a staging website and a live website, each with its own Content Delivery environment, you might want the two to use the same OpenSearch cluster but different indexes for Experience Optimization. To achieve this, modify your scripts so that the indexes in the two environments have different names.
- Use of parameters in configuration files
Most Content Delivery configuration files let you set configuration values not as hardcoded literal strings but as parameters that are resolved at runtime from system settings. By using different system settings in different environments, you can reuse the same configuration files without modifying them.
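The load-balancing items above all rest on one idea: a single entry point distributes requests over replicated instances. As a minimal, product-neutral sketch of that idea (the instance URLs are hypothetical, and a real deployment would use a dedicated load balancer rather than application code):

```python
import itertools

class RoundRobinBalancer:
    """Hands out replicated service instances in turn, one per request."""

    def __init__(self, instances):
        self._cycle = itertools.cycle(instances)

    def next_instance(self):
        return next(self._cycle)

# Hypothetical replicated microservice instances behind the balancer.
balancer = RoundRobinBalancer([
    "http://content-1.example.com:8081",
    "http://content-2.example.com:8081",
])

# Four incoming requests alternate between the two instances.
picks = [balancer.next_instance() for _ in range(4)]
```

The same pattern applies whether the replicas are Content Delivery microservices or entire replicated websites; only what sits behind each URL changes.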
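The Content Deployer scale-out described above follows a generic queue-plus-store pattern: publishers write content to a shared store and push a key onto a shared queue, and any number of Deployer instances compete to pick work off that queue. The sketch below illustrates only the pattern, using an in-memory dict and FIFO queue as stand-ins for the Redis database and the JMS queue; the item IDs and payloads are made up:

```python
import queue

# Stand-ins for the real infrastructure: a dict for the Redis store of
# temporary incoming content, a FIFO queue for the JMS Content Deployer queue.
content_store = {}
deploy_queue = queue.Queue()

def publish(item_id, payload):
    """Publisher side: store the incoming content, then enqueue its key."""
    content_store[item_id] = payload
    deploy_queue.put(item_id)

def deployer_worker(deployed):
    """Any Deployer instance: take keys off the shared queue and deploy them."""
    while not deploy_queue.empty():
        item_id = deploy_queue.get()
        deployed.append((item_id, content_store.pop(item_id)))

publish("item-42", "<page>home</page>")
publish("item-43", "<page>news</page>")

deployed = []
deployer_worker(deployed)  # any number of instances could share this queue
```

Because the queue, not any single Deployer, owns the backlog, adding another Deployer instance is just another worker reading from the same queue.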
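Sharing an OpenSearch cluster between staging and live, as described above, comes down to deriving a distinct index name per environment. A minimal sketch of one way to do that (the environment variable name and the naming scheme are assumptions for illustration, not the product's actual settings):

```python
import os

def search_index_name(base="content"):
    """Derive an environment-specific index name, e.g. content_staging or content_live."""
    environment = os.environ.get("DEPLOY_ENV", "staging")
    return f"{base}_{environment}"

os.environ["DEPLOY_ENV"] = "live"
name = search_index_name()  # "content_live"
```

Both environments can then point at the same cluster while their search (or Experience Optimization) indexes never collide.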
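The parameter mechanism described in the last item can be pictured with a small sketch: a configuration value written as a placeholder and filled in at runtime from a system setting. This uses Python's `string.Template` purely to illustrate the idea; it is not the product's actual resolution code, and the `DB_HOST` setting and connection string are hypothetical:

```python
import os
from string import Template

# A configuration value written as a parameter rather than a hardcoded string.
config_template = Template("jdbc:sqlserver://${DB_HOST};databaseName=Broker")

# The same template works in every environment; only the system setting differs.
os.environ["DB_HOST"] = "staging-db.example.com"
connection_string = config_template.substitute(os.environ)
```

Promoting the configuration file from staging to live then requires no edits at all; only the environment's system settings change.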