
General Overview 

High-performance distributed and replicated memory object caching system

Shared Cache is a high-performance, distributed and replicated caching system built for .NET applications running in server farms.

Shared Cache provides distributed and replicated cache topologies that minimize the load on your databases. The advantage is simple: you can scale your application out linearly with additional hardware alone, at no additional software cost.
Shared Cache is written in C# and is 100% managed code.

Why should you consider using Shared Cache?

There is no more efficient way to increase application performance than caching, which offloads the deeper layers of your stack.

Key Features

Fast and lightweight:

  • Fast - Extensive performance tests in the test suite keep SharedCache's performance consistent between releases.
  • Simple - The caching provider is simple and easy to use, making it possible to get up and running within minutes. See the usage sample for more details.
  • Minimal dependencies - NLog (http://www.nlog-project.org) is the only dependency.
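As a rough sketch of what basic usage looks like, the snippet below adds and retrieves an object through the distributed cache facade. The namespace, class, and method names here are assumptions modeled on the Shared Cache client API; treat the exact signatures as illustrative and verify them against the shipped usage sample.

```csharp
using System;
// Assumed client namespace of the Shared Cache assembly - verify against the usage sample.
using SharedCache.WinServiceCommon.Provider.Cache;

[Serializable] // cached objects are serialized to byte[] before being sent to a cache server
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class UsageDemo
{
    public static void Main()
    {
        var customer = new Customer { Id = 42, Name = "Alice" };

        // Store the object under a string key on the configured cache servers.
        IndexusDistributionCache.SharedCache.Add("customer:42", customer);

        // Retrieve it again; a cache miss yields null.
        var cached = IndexusDistributionCache.SharedCache.Get<Customer>("customer:42");
        Console.WriteLine(cached != null ? cached.Name : "cache miss");
    }
}
```

Because objects travel over the network as byte arrays, anything you cache must be serializable.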

Scalable

  • Large memory stores - Shared Cache installations use in-memory stores in the gigabyte range, and SharedCache is tuned for these sizes.
  • Tuned for scalability - handles concurrent load on multi-CPU servers.

Complete

  • Based on byte[] arrays
  • Supports cache-wide or element-based expiry policies
  • Provides the following cache eviction / purge / cleanup strategies: CACHEITEMPRIORITY, LRU, LFU, TIMEBASED, SIZE, LLF and HYBRID
  • Distributed caching topology
  • Replicated caching topology
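For example, element-based expiry is typically set when an item is added. The overload shown below, which passes an absolute expiration date, is an assumption modeled on common caching APIs; confirm the actual signature against the Shared Cache documentation.

```csharp
using System;
using SharedCache.WinServiceCommon.Provider.Cache; // assumed client namespace

public static class ExpiryDemo
{
    public static void Main()
    {
        // Cache a value that expires five minutes from now.
        // An Add overload taking an absolute DateTime expiration is assumed here.
        IndexusDistributionCache.SharedCache.Add(
            "session:abc123",
            "some session payload",
            DateTime.Now.AddMinutes(5));

        // Once the expiry time passes, a time-based eviction strategy
        // (e.g. TIMEBASED) removes the item and Get returns null.
        var value = IndexusDistributionCache.SharedCache.Get<string>("session:abc123");
        Console.WriteLine(value ?? "expired or missing");
    }
}
```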

High Quality

  • Fully documented - A core belief held by the project team is that a project needs good documentation to be useful.
  • Conservative commit policy - Quality is maintained through a restricted change process: changes are submitted, reviewed by the maintainer, and then either included or modified.
  • Responsiveness to serious bugs - The Shared Cache team is serious about quality. If one user is having a problem, others probably are too, or soon will be. The team runs Shared Cache in production themselves, and every effort is made to fix serious production problems as soon as possible.

Shared Cache Topologies

Shared Cache provides a rich set of topology scenarios so you can pick the option that suits your caching requirements:

  • Distributed caching - partitioned
  • Replicated caching
  • Single instance caching
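To illustrate the idea behind the partitioned (distributed) topology, the self-contained sketch below maps each key to one server by hashing, so every server holds a disjoint slice of the data and memory scales with the number of nodes. This is a generic illustration of key partitioning, not Shared Cache's actual hashing algorithm, and the server list is hypothetical.

```csharp
using System;

public static class PartitionDemo
{
    // Hypothetical server list - in Shared Cache the servers come from configuration.
    static readonly string[] Servers =
        { "10.0.0.1:48888", "10.0.0.2:48888", "10.0.0.3:48888" };

    // Map a key to one server. A stable hash modulo the server count is the
    // simplest partitioning scheme; real systems often use consistent hashing
    // so that adding or removing a server moves only a fraction of the keys.
    static string ServerFor(string key)
    {
        uint hash = 2166136261;                 // FNV-1a: stable across processes
        foreach (char c in key)
            hash = (hash ^ c) * 16777619;
        return Servers[hash % (uint)Servers.Length];
    }

    public static void Main()
    {
        foreach (var key in new[] { "customer:1", "customer:2", "order:17" })
            Console.WriteLine($"{key} -> {ServerFor(key)}");
    }
}
```

In the replicated topology, by contrast, every server holds a full copy of the cache, trading memory for read availability.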