Content
Content filtering (URL, mobile code) is covered in Chapter 4, "Network Security Technologies." This section briefly covers security considerations around caching and CDNs.
Caching
Caching in the context of this section refers to storing data from servers on an intermediary device to speed responses to data queries. Caching in forward proxy, transparent, or reverse proxy mode (each term is defined in the following sections) lowers bandwidth utilization and shortens response time for users. It is most commonly used to cache remote content for local users, but it can also be used to serve content from your own servers to remote users.
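To make the mechanism concrete, the following is a minimal Python sketch of the cache-hit/cache-miss logic an intermediary cache applies. The example URL, the in-memory dictionary, and the freshness window are illustrative only; a production cache also honors HTTP Cache-Control headers, expiry, and storage limits.

```python
# Minimal sketch of cache-hit/cache-miss logic on an intermediary device.
# Illustrative only: real caches honor Cache-Control headers and storage limits.
import time
import urllib.request

cache = {}      # url -> (fetch_time, body)
MAX_AGE = 300   # seconds an entry is considered fresh (illustrative value)

def get(url):
    entry = cache.get(url)
    if entry and time.time() - entry[0] < MAX_AGE:
        return entry[1]                      # cache hit: answer locally
    with urllib.request.urlopen(url) as resp:
        body = resp.read()                   # cache miss: fetch from the origin
    cache[url] = (time.time(), body)         # store for subsequent requests
    return body

if __name__ == "__main__":
    get("https://example.com/")              # first call goes to the origin
    get("https://example.com/")              # second call is served from the cache
```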
Security Considerations
The security issues around caching are minimal. The primary attack involves compromising the cache so that it distributes false information or discloses cached data the attacker would not ordinarily be able to access. Secondary attacks include setting up a rogue caching system or running flooding attacks against the production caching servers. Rogue device threat mitigation is covered in Chapter 5, "Device Hardening," and DoS attack mitigation is covered in Chapter 6, "General Design Considerations." Protecting the cache server involves hardening the device as discussed in Chapter 5 and, where possible, restricting at the network level which systems can communicate with the cache.
Forward Proxy Cache
Forward proxy cache deployments are identical to generic proxy server deployments as discussed in Chapter 7, "Network Security Platform Options and Best Deployment Practices." In a forward proxy cache, you are simply running a web cache on the same device as your proxy server. Caches can be deployed either behind the firewall (most common) or on a perimeter firewall interface.
Transparent Cache
In a transparent cache deployment, clients are not aware that any caching is taking place. Network devices (usually routers) redirect web queries to the cache using the Web Cache Communication Protocol (WCCP). The cache can then provide the content directly to the user, fetch the content and then provide it, or, in the case of noncacheable items, direct the user to download the information directly. Because transparent caches require a routing device to deliberately send queries to the cache, the chances of a rogue transparent cache sneaking onto the network are limited. In addition, the cache should be put on a dedicated router interface where it can be shielded from most direct attacks.
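The three outcomes just described can be sketched as follows. This is an illustrative Python outline only, with a deliberately simplified cacheability test; the redirection itself is performed by the router via WCCP and is not shown.

```python
# Sketch of the three outcomes a transparent cache chooses among for a
# redirected request: serve from cache, fetch then serve, or bypass for
# noncacheable items. The cacheability test is simplified for illustration.
NONCACHEABLE_METHODS = {"POST", "PUT", "DELETE"}

def handle_redirected_request(method, url, cache, fetch_from_origin):
    if method in NONCACHEABLE_METHODS or "?" in url:
        return fetch_from_origin(method, url)      # bypass: pass through uncached
    if url in cache:
        return cache[url]                          # serve directly from the cache
    body = fetch_from_origin(method, url)          # fetch on behalf of the client,
    cache[url] = body                              # then store and serve it
    return body

if __name__ == "__main__":
    demo_cache = {}
    fetch = lambda m, u: b"origin copy of " + u.encode()
    handle_redirected_request("GET", "http://example.com/logo.png", demo_cache, fetch)
    handle_redirected_request("GET", "http://example.com/logo.png", demo_cache, fetch)
```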
Reverse Proxy Cache
In reverse proxy cache deployments, the cache acts as a proxy for the server rather than the client. Reverse proxy caches can be deployed in transparent mode using WCCP or as standalone proxies, just like forward proxy caches.
The security issues around reverse proxy caches are the same as for any other cache, except that the impact of an attack is greater. By compromising a reverse proxy cache at an organization, an attacker can pass bogus data to users without directly compromising the servers. Be sure to provide the same security vigilance for your reverse proxy cache as you do for the servers themselves.
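One hedged illustration of that vigilance is periodically auditing cached objects against hashes computed from the origin servers so that tampered copies are detected. The function names and the hash-manifest format below are hypothetical; the sketch only shows the kind of check such an audit might perform.

```python
# Hypothetical audit: compare the hash of each cached object against a hash
# computed from the origin server, flagging any object that no longer matches.
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def audit_cache(cached_objects: dict, origin_hashes: dict) -> list:
    """Return the URLs whose cached copy no longer matches the origin."""
    return [url for url, body in cached_objects.items()
            if origin_hashes.get(url) != sha256(body)]

if __name__ == "__main__":
    origin = {"/index.html": sha256(b"<html>real page</html>")}
    cache = {"/index.html": b"<html>defaced page</html>"}
    print(audit_cache(cache, origin))   # ['/index.html'] -> flag for investigation
```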
Content Distribution and Routing
Content distribution and routing refers to a broad area of networking concerned with the efficient delivery of content to a diverse set of clients. You might have already used such a system when downloading a file or viewing streaming content on the web. Such systems generally work by creating several copies of a given piece of content in different geographic locations. The system determines your location on the network when you make a request and can therefore direct you to the copy of the content closest to you.
The security considerations around content distribution are less those of a specific application and more those of an entire network. Typically, in a content distribution network (CDN), you have several mechanisms, each of which needs some level of security:
- Original content location: The original source of the content must be secured just like any other server on your network. In addition, its method of transporting the content to the distribution points can require security. If you are concerned about the content being manipulated in transit, make sure you use a cryptographically secure mechanism such as HTTPS to distribute it. Content replication typically occurs over unicast or multicast; unicast can be secured with HTTPS (a minimal sketch of such a pull appears after this list), but multicast is more difficult to secure. Often, GRE plus IPsec can be used to create secure tunnels that carry the multicast replication traffic.
- Content distribution locations: The local servers have the same security considerations as the original content location; the difference is that in many cases you might not be able to keep as close an eye on them. In addition, if you use a content distribution service, you must rely on that service to secure your content and ensure that it is not changed; this is certainly something to investigate when you select a vendor. The content distribution locations can be protected by firewalls that limit access to specific protocols and, just like the central servers, they can run IDS or other security mechanisms.
- Decision-making entity: In a CDN, you have a system (it can take many forms) that directs users to a particular content location. Such systems generally involve DNS and some dedicated technology that makes the content location decision for each client. This device needs special attention because its compromise can lead to several undesirable outcomes. The easiest attack is simply taking the system out of service, which either overloads one content source or stops all access to the content. Worse, users could be directed to false versions of the content. The primary considerations for this device are a hardened configuration and secure management.
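As noted in the first bullet, unicast replication to the distribution points can be protected with HTTPS. The following is a minimal sketch of such a pull; the origin URL and file list are hypothetical, and certificate verification is what prevents manipulation in transit. The multicast case, where GRE plus IPsec tunnels are the usual answer, is not shown.

```python
# Hedged sketch: a distribution point pulling replicated content over HTTPS.
# The origin URL and object list are hypothetical; the default SSL context
# verifies the origin's certificate, protecting the transfer in transit.
import ssl
import urllib.request

ORIGIN = "https://origin.example.com"           # hypothetical original content location
FILES = ["/site/index.html", "/site/logo.png"]  # hypothetical objects to replicate

def replicate(destination_dir="."):
    context = ssl.create_default_context()      # certificate verification enabled
    for path in FILES:
        with urllib.request.urlopen(ORIGIN + path, context=context) as resp:
            data = resp.read()
        local = destination_dir + "/" + path.rsplit("/", 1)[-1]
        with open(local, "wb") as f:
            f.write(data)                        # store the object at the distribution point

# replicate("/var/www/cache")  # would pull the objects to a distribution point
```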
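The decision-making entity's job can be sketched as a simple mapping from the requesting client to the nearest content location. Production systems make this decision within DNS using geolocation data or active measurements; the region table and server names below are purely illustrative. Note that tampering with this mapping is exactly the attack described above: users silently directed to a false copy of the content.

```python
# Simplified sketch of the decision-making entity: map a requesting client to
# the "closest" content location. The prefixes and server names are illustrative.
import ipaddress

# Hypothetical mapping of client prefixes to regional content servers.
REGION_TABLE = {
    ipaddress.ip_network("10.1.0.0/16"): "cache-us-east.example.com",
    ipaddress.ip_network("10.2.0.0/16"): "cache-eu-west.example.com",
}
DEFAULT_SITE = "origin.example.com"   # fall back to the original content location

def pick_content_location(client_ip: str) -> str:
    addr = ipaddress.ip_address(client_ip)
    for prefix, site in REGION_TABLE.items():
        if addr in prefix:
            return site
    return DEFAULT_SITE

if __name__ == "__main__":
    print(pick_content_location("10.2.34.7"))   # cache-eu-west.example.com
    print(pick_content_location("192.0.2.10"))  # origin.example.com
```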