Valkey is a fully open-source, in-memory data store backed by the Linux Foundation, offering microsecond-latency operations on rich data structures. See the Introduction for more.
Getting Valkey up and running is straightforward. See the Installation Guide for detailed instructions.
If you just want to try Valkey quickly, head over to Try-Valkey — an interactive playground where you can run any Valkey command right in your browser.
Once connected to Valkey, you can interact with it by issuing commands to store and retrieve data. Valkey behaves like a remote dictionary – you can think of it as a giant hash map on a server. Each piece of data is stored under a unique key, and you use commands to read or modify the values associated with those keys.
Let’s walk through some fundamental operations with two of the most commonly used data types in Valkey: strings and hashes.
127.0.0.1:6379> SET user:1000 "Alice"
OK
127.0.0.1:6379> GET user:1000
"Alice"
Here we use the SET command to save the value
"Alice" under the key user:1000, and
GET to fetch it back. Valkey keys are often namespaced with
prefixes (like user:) to group related items. You can store
any data serialized as a string – numbers, JSON, binary blobs, etc. – up
to 512 MB per value, though very large values are not recommended for
performance reasons.
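Since Valkey stores values as strings, application code typically serializes structured data before a SET and parses it after a GET. A small Python sketch of the key-prefix convention and a JSON round-trip (make_key is an illustrative helper, not part of any client library):

```python
import json

def make_key(*parts):
    """Join key segments with ':' in the common Valkey naming style.

    Illustrative only: Valkey itself treats keys as opaque strings;
    the ':' convention is just a widespread naming pattern.
    """
    return ":".join(str(p) for p in parts)

# Structured data is serialized to a string before a SET
# and parsed back after a GET.
profile = {"name": "Alice", "age": 30}
key = make_key("user", 1000)
value = json.dumps(profile)   # the string you would pass to SET
restored = json.loads(value)  # what you would do with GET's result

print(key)       # user:1000
print(restored)  # {'name': 'Alice', 'age': 30}
```

The same pattern works for any serialization format; JSON is simply a common, human-readable choice.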
127.0.0.1:6379> HSET user:1000 name "Alice" email "alice@example.com" age "30"
(integer) 3
127.0.0.1:6379> HGET user:1000 name
"Alice"
127.0.0.1:6379> HGETALL user:1000
1) "name"
2) "Alice"
3) "email"
4) "alice@example.com"
5) "age"
6) "30"
We add three fields to the hash stored at user:1000.
HGET retrieves a single field, and HGETALL
returns all fields and values. Hashes are memory-efficient for storing
structured data.
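Note that HSET takes alternating field/value arguments and HGETALL returns a flat list of alternating fields and values, so client code commonly converts between that flat form and a dict. A sketch of both directions (these helpers are illustrative; most client libraries do this conversion for you):

```python
def dict_to_hset_args(d):
    """Flatten {'name': 'Alice'} into ['name', 'Alice'], the shape HSET expects."""
    args = []
    for field, value in d.items():
        args.extend([field, str(value)])  # hash values are stored as strings
    return args

def hgetall_reply_to_dict(flat):
    """Pair up a flat HGETALL-style reply into a dict."""
    return dict(zip(flat[0::2], flat[1::2]))

user = {"name": "Alice", "email": "alice@example.com", "age": 30}
print(dict_to_hset_args(user))
# ['name', 'Alice', 'email', 'alice@example.com', 'age', '30']

reply = ["name", "Alice", "email", "alice@example.com", "age", "30"]
print(hgetall_reply_to_dict(reply))
```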
Valkey supports several other data types as well (for example,
LPUSH/LRANGE for lists,
SADD/SMEMBERS for sets). You can find a full
overview in the Valkey data types
documentation and the command reference.

One of the most popular ways to use Valkey is as a caching layer in front of a traditional database or expensive API. By caching results in Valkey, applications can serve repeated requests much faster and reduce load on back-end systems.
Scenario: Imagine a web application that needs to fetch user profile data from a database. Without caching, each page load would query the database, making the app slow under load. With Valkey, you can cache the user data after the first retrieval:
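One common way to implement this is the cache-aside pattern. Below is a minimal Python sketch; the cache dict stands in for a Valkey connection (a real app would call GET/SET on a client), and fetch_user_from_db is a hypothetical database query:

```python
db_queries = 0  # counts how often the "database" is actually hit

def fetch_user_from_db(user_id):
    """Hypothetical stand-in for an expensive database query."""
    global db_queries
    db_queries += 1
    return {"id": user_id, "name": "Alice"}

cache = {}  # stand-in for Valkey

def get_user(user_id):
    key = f"user:{user_id}:profile"
    if key in cache:                       # cache hit: skip the database entirely
        return cache[key]
    profile = fetch_user_from_db(user_id)  # cache miss: query the database...
    cache[key] = profile                   # ...and store the result for next time
    return profile

get_user(42)       # first call hits the database
get_user(42)       # second call is served from the cache
print(db_queries)  # 1
```

Only the first request pays the database cost; repeated requests are served from memory.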
On each request, the application first checks Valkey (under a key such as user:42:profile) to see if the data is already cached, querying the database only on a miss. In Valkey, setting a key with an expiration can be done in one command. For example, to cache a rendered page for 5 minutes (300 seconds):
127.0.0.1:6379> SET page:homepage "<html>...rendered content...</html>" EX 300
OK
The EX 300 option tells Valkey to automatically expire
(remove) the key after 300 seconds. Until it expires, any request for
page:homepage will be served the cached content from
memory. You can adjust TTLs based on how fresh the data needs to be.
Expiring keys ensures the cache doesn’t serve stale data
indefinitely.
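The expiration behavior can be sketched in a few lines of Python. TTLCache below is a toy stand-in with an explicit clock so the example runs instantly; Valkey performs this expiry server-side, and real clients just pass EX when calling SET:

```python
class TTLCache:
    """Toy model of EX-style expiration; not how Valkey is implemented."""

    def __init__(self):
        self.data = {}  # key -> (value, expires_at)

    def set(self, key, value, ex, now):
        # EX gives a relative TTL; store the absolute expiry time.
        self.data[key] = (value, now + ex)

    def get(self, key, now):
        if key not in self.data:
            return None
        value, expires_at = self.data[key]
        if now >= expires_at:  # past the TTL: treat the key as gone
            del self.data[key]
            return None
        return value

cache = TTLCache()
cache.set("page:homepage", "<html>...</html>", ex=300, now=0)
print(cache.get("page:homepage", now=299))  # still cached: <html>...</html>
print(cache.get("page:homepage", now=300))  # expired: None
```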
Valkey can cache nearly anything – from database query results and API responses to session tokens, rendered HTML, or even generated reports. Database caching is a classic use case, but it’s only the beginning. E-commerce platforms use Valkey to serve personalized recommendations instantly. Gaming companies rely on it for real-time leaderboards and matchmaking. Fintech systems trust Valkey to cache fraud detection signals and scoring results under heavy load. In AI and ML pipelines, Valkey accelerates inference by caching model outputs, storing precomputed embeddings, and managing access tokens across distributed systems. With sub-millisecond latency and the capacity to process hundreds of thousands of operations per second, Valkey is built to keep up — no matter how demanding the workload.
Best Practices:
- Namespace your keys with consistent prefixes (for example, user:42:settings). This makes it easier to manage related keys and avoid collisions.
- Configure a maxmemory limit and an eviction policy (such as Least Recently Used eviction) if you are using Valkey as a cache. This ensures Valkey evicts least-used entries when full, rather than failing once memory is exhausted.

Troubleshooting Common Issues:
- Connection problems: verify the host and port passed via -h and -p in valkey-cli, and that the port 6379 is open through any firewalls. You can always test connectivity with valkey-cli ping (expect a PONG).
- Missing data: if GET returns nothing, consider that the key might have expired or been evicted if you set a max memory policy. Use the TTL <key> command to check time-to-live, and ensure your application logic correctly stores the data. Also, confirm that you’re connecting to the same Valkey instance (and database number, if applicable) where the data was written.
- Performance issues: use the INFO command to get stats on memory, CPU, and command usage. For deeper analysis of latency spikes, Valkey provides a latency monitoring feature and a benchmarking tool (valkey-benchmark). Common causes of slowdowns include very large payloads or expensive commands blocking the server. If needed, consider distributing load via clustering or splitting data across multiple instances.
- Suspected memory faults: run valkey-server --test-memory to perform a memory test of your system.

For further diagnostics, see the official troubleshooting guide. We want to ensure that Valkey runs smoothly in your environment.
Now that you have Valkey running and understand the basics, you can explore more advanced topics and use cases:
Happy caching with Valkey! With its speed and flexibility, you now have a powerful tool for building fast, scalable applications. The next steps above will guide you as you deepen your Valkey knowledge and tackle more complex scenarios. Good luck on your Valkey journey!