I'm new to Redis, so this might be a basic question. I use the @Cacheable() and @CacheEvict() annotations. When the user gets updated and I then fetch the user by id, it returns the cached (outdated) data. Of course, if I were to use @CacheEvict() this wouldn't happen. However, I'm confused about @CacheEvict(), because the results are the same as if I don't use it -- so what's the point of using it? If there is a process that takes 3 seconds to finish, then using @CacheEvict() would also take 3 seconds.
Here is my UserServiceImpl.java class:
package com.example.demo.serviceImpl;

import com.example.demo.model.User;
import com.example.demo.repository.UserRepository;
import com.example.demo.service.UserService;
import lombok.AllArgsConstructor;
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.stereotype.Service;

import java.time.LocalDateTime;
import java.util.List;

@Service
@EnableCaching
@AllArgsConstructor
public class UserServiceImpl implements UserService {

    private UserRepository userRepository;

    @Override
    public User createUser(User user) {
        return userRepository.save(user);
    }

    @Override
    @CacheEvict(value = "users")
    public User findUser(String userId) {
        doLongRunningTask();
        return userRepository.findById(userId).orElseThrow();
    }

    @Override
    @Cacheable(value = "users")
    public List<User> findAll() {
        return (List<User>) userRepository.findAll();
    }

    @Override
    @CacheEvict(value = "users", key = "#user.id")
    public User updateUser(String userId, User user) {
        doLongRunningTask();
        user.setUpdatedAt(LocalDateTime.now());
        return userRepository.save(user);
    }

    @Override
    @CacheEvict(value = "users", key = "#userId")
    public void deleteUser(String userId) {
        userRepository.deleteById(userId);
    }

    private void doLongRunningTask() {
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
My RedisConfig.java class:
package com.example.demo.config;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.cache.RedisCacheConfiguration;
import org.springframework.data.redis.cache.RedisCacheManager;
import org.springframework.data.redis.connection.RedisStandaloneConfiguration;
import org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory;
import org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer;

import java.time.Duration;

import static org.springframework.data.redis.serializer.RedisSerializationContext.SerializationPair.fromSerializer;

@Configuration
public class RedisConfig {

    @Value("${redis.host}")
    private String redisHost;

    @Value("${redis.port}")
    private int redisPort;

    @Bean
    public LettuceConnectionFactory redisConnectionFactory() {
        RedisStandaloneConfiguration configuration = new RedisStandaloneConfiguration();
        configuration.setHostName(redisHost);
        configuration.setPort(redisPort);
        return new LettuceConnectionFactory(configuration);
    }

    @Bean
    public RedisCacheManager cacheManager() {
        RedisCacheConfiguration cacheConfig = myDefaultCacheConfig(Duration.ofMinutes(10)).disableCachingNullValues();
        return RedisCacheManager
                .builder(redisConnectionFactory())
                .cacheDefaults(cacheConfig)
                .withCacheConfiguration("users", myDefaultCacheConfig(Duration.ofMinutes(5)))
                .build();
    }

    private RedisCacheConfiguration myDefaultCacheConfig(Duration duration) {
        return RedisCacheConfiguration
                .defaultCacheConfig()
                .entryTtl(duration)
                .serializeValuesWith(fromSerializer(new GenericJackson2JsonRedisSerializer()));
    }
}
Fetching data for the first time takes 3 seconds. Fetching the same data again takes about 5 ms (this time it is pulled from Redis instead of Postgres). However, updating this user and fetching it again gives outdated data instead of the newly updated user, causing data inconsistencies.
UPDATE: this is my model/User.java model class:
package com.example.demo.model;

import lombok.*;
import org.springframework.data.annotation.Id;
import org.springframework.data.redis.core.RedisHash;

@Data
@Builder
@RedisHash("user")
public class User {

    @Id
    private String id;
    private String name;
    private Integer age;
}
I also have dto/UserDTO.java for converting the model into a REST request/response via the API:
package com.example.demo.dto;

import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;

import java.io.Serializable;

@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class UserDTO implements Serializable {

    @JsonProperty(value = "id")
    private String id;

    @JsonProperty(value = "name")
    private String name;

    @JsonProperty(value = "age")
    private Integer age;
}
Thanks to @Max Kozlov, this DTO class is now Serializable so that the Redis cache can work properly. The new RedisCacheConfig.java, following @Max Kozlov's answer, looks like this:
package com.example.demo.config;

import com.example.demo.handler.DefaultCacheErrorHandler;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.CachingConfigurer;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.interceptor.CacheErrorHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.redis.cache.RedisCacheConfiguration;
import org.springframework.data.redis.cache.RedisCacheManager;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.RedisStandaloneConfiguration;
import org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory;

import java.time.Duration;

@Configuration
@EnableCaching
public class RedisCacheConfig implements CachingConfigurer {

    @Value("${redis.host}")
    private String redisHost;

    @Value("${redis.port}")
    private int redisPort;

    @Bean
    public LettuceConnectionFactory redisConnectionFactory() {
        RedisStandaloneConfiguration configuration = new RedisStandaloneConfiguration();
        configuration.setHostName(redisHost);
        configuration.setPort(redisPort);
        return new LettuceConnectionFactory(configuration);
    }

    @Bean
    public RedisCacheConfiguration redisCacheConfiguration() {
        return RedisCacheConfiguration
                .defaultCacheConfig()
                .entryTtl(Duration.ofMinutes(15));
    }

    @Bean
    @Override
    public CacheErrorHandler errorHandler() {
        return new DefaultCacheErrorHandler();
    }

    @Bean("longLifeCacheManager")
    public CacheManager longLifeCacheManager() {
        RedisCacheConfiguration defaultConfiguration = RedisCacheConfiguration
                .defaultCacheConfig()
                .entryTtl(Duration.ofDays(90));
        return RedisCacheManager.RedisCacheManagerBuilder
                .fromConnectionFactory(redisConnectionFactory())
                .cacheDefaults(defaultConfiguration)
                .build();
    }

    @Primary
    @Bean("shortLifeCacheManager")
    public CacheManager shortLifeCacheManager() {
        RedisCacheConfiguration defaultConfiguration = RedisCacheConfiguration
                .defaultCacheConfig()
                .entryTtl(Duration.ofDays(1));
        return RedisCacheManager.RedisCacheManagerBuilder
                .fromConnectionFactory(redisConnectionFactory())
                .cacheDefaults(defaultConfiguration)
                .build();
    }
}
Your logic for the @Cacheable annotation is wrong, because you are caching the entire list of users without a specific key. In other words, you need to cache a specific user, for example by id. Right now the full list of users is cached under the users key, but what gets deleted is the entry under the users:id key. Therefore your cache entry is never actually evicted.
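To make the mismatch concrete, here is a minimal sketch using a plain HashMap to stand in for Redis. The key format mirrors Spring's default ("cacheName::key", where a no-argument @Cacheable method is keyed by SimpleKey.EMPTY, whose string form is "SimpleKey []"); the class name is my own:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CacheKeyMismatchDemo {
    public static void main(String[] args) {
        // Stand-in for Redis: Spring's RedisCache writes entries under "cacheName::key".
        Map<String, Object> redis = new HashMap<>();

        // @Cacheable on the no-argument findAll() is keyed by SimpleKey.EMPTY,
        // so the whole list lands under a single key:
        redis.put("users::SimpleKey []", List.of("user-1", "user-2"));

        // @CacheEvict(value = "users", key = "#user.id") in updateUser() deletes "users::<id>"...
        redis.remove("users::42");

        // ...which never existed, so the stale list entry survives and
        // findAll() keeps serving outdated data.
        System.out.println(redis.containsKey("users::SimpleKey []")); // prints "true"
    }
}
```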
For the cache to work, you need to rewrite your service class in this way.
@Service
@EnableCaching
@AllArgsConstructor
public class UserServiceImpl implements UserService {

    private UserRepository userRepository;

    @Override
    public User createUser(User user) {
        return userRepository.save(user);
    }

    @Override
    @Cacheable(value = "users", key = "#userId")
    public User findUser(String userId) {
        doLongRunningTask();
        return userRepository.findById(userId).orElseThrow();
    }

    @Override
    public List<User> findAll() {
        return (List<User>) userRepository.findAll();
    }

    @Override
    @CacheEvict(value = "users", key = "#user.id")
    public User updateUser(String userId, User user) {
        doLongRunningTask();
        user.setUpdatedAt(LocalDateTime.now());
        return userRepository.save(user);
    }

    @Override
    @CacheEvict(value = "users", key = "#userId")
    public void deleteUser(String userId) {
        userRepository.deleteById(userId);
    }
}
Here I moved the @Cacheable annotation from the findAll() method to the findUser(String userId) method and added the key = "#userId" attribute to it, so the entry is now @Cacheable(value = "users", key = "#userId").
In any case, if you want to cache data in Redis, avoid caching lists and apply this approach only to specific entities. Note that if you want to store entities in the cache, the entity class itself needs a serial version UID. Also be careful about serializing entities to JSON: it is highly discouraged, because relationships such as @ManyToOne and @OneToMany can trigger recursive calls through those relationships at serialization time.
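One way to sidestep that recursion, sketched below with names modeled on the question's User (the DTO class name is my own), is to cache a flat, Serializable DTO that holds only scalar fields, so serialization can never walk into entity relationships:

```java
import java.io.Serializable;

// Flat cache DTO: only scalar fields, no entity references, so neither JSON
// nor Java serialization can recurse through @ManyToOne/@OneToMany links.
public class UserCacheDto implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String id;
    private final String name;
    private final Integer age;

    public UserCacheDto(String id, String name, Integer age) {
        this.id = id;
        this.name = name;
        this.age = age;
    }

    public String getId() { return id; }
    public String getName() { return name; }
    public Integer getAge() { return age; }
}
```

In findUser(...), the service would map the loaded entity before returning, e.g. new UserCacheDto(user.getId(), user.getName(), user.getAge()), and cache that instead of the entity itself.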
Hope my answer helps you =)
UPDATE: the CacheConfiguration class for spring-boot 2.7.* versions.
@EnableCaching
@Configuration
public class CacheConfiguration extends CachingConfigurerSupport {

    @Bean
    public RedisCacheConfiguration redisCacheConfiguration() {
        return RedisCacheConfiguration.defaultCacheConfig()
                .entryTtl(Duration.ofMinutes(15));
    }

    @Bean
    @Override
    public CacheErrorHandler errorHandler() {
        return new DefaultCacheErrorHandler();
    }

    @Bean("longLifeCacheManager")
    public CacheManager longLifeCacheManager(RedisConnectionFactory redisConnectionFactory) {
        RedisCacheConfiguration defaultConfiguration = RedisCacheConfiguration.defaultCacheConfig()
                .entryTtl(Duration.ofDays(90));
        return RedisCacheManager.RedisCacheManagerBuilder
                .fromConnectionFactory(redisConnectionFactory)
                .cacheDefaults(defaultConfiguration)
                .build();
    }

    @Bean("shortLifeCacheManager")
    @Primary
    public CacheManager shortLifeCacheManager(RedisConnectionFactory redisConnectionFactory) {
        RedisCacheConfiguration defaultConfiguration = RedisCacheConfiguration.defaultCacheConfig()
                .entryTtl(Duration.ofDays(1));
        return RedisCacheManager.RedisCacheManagerBuilder
                .fromConnectionFactory(redisConnectionFactory)
                .cacheDefaults(defaultConfiguration)
                .build();
    }
}
and the DefaultCacheErrorHandler class for exception handling:
public class DefaultCacheErrorHandler extends SimpleCacheErrorHandler {

    private static final Logger LOG = LoggerFactory.getLogger(DefaultCacheErrorHandler.class);

    @Override
    public void handleCacheGetError(
            @NotNull RuntimeException exception,
            @NotNull Cache cache,
            @NotNull Object key
    ) {
        LOG.info("handleCacheGetError ~ {}: {} - {}", exception.getMessage(), cache.getName(), key);
    }

    @Override
    public void handleCachePutError(
            @NotNull RuntimeException exception,
            @NotNull Cache cache,
            @NotNull Object key,
            Object value
    ) {
        LOG.info("handleCachePutError ~ {}: {} - {}", exception.getMessage(), cache.getName(), key);
        super.handleCachePutError(exception, cache, key, value);
    }

    @Override
    public void handleCacheEvictError(
            @NotNull RuntimeException exception,
            @NotNull Cache cache,
            @NotNull Object key
    ) {
        LOG.info("handleCacheEvictError ~ {}: {} - {}", exception.getMessage(), cache.getName(), key);
        super.handleCacheEvictError(exception, cache, key);
    }

    @Override
    public void handleCacheClearError(
            @NotNull RuntimeException exception,
            @NotNull Cache cache
    ) {
        LOG.info("handleCacheClearError ~ {}: {}", exception.getMessage(), cache.getName());
        super.handleCacheClearError(exception, cache);
    }
}
In this case, the plain Java serializer is used. The object classes that need to be cached must implement the Serializable interface.
public class User implements Serializable {

    private Long id;
    private String name;

    // getters/setters
}
The config class is standard for enabling caching via Redis, except for one detail: the DefaultCacheErrorHandler. It is needed in order to reset the cache when you change the entity class and, accordingly, its serialVersionUID. This is usually required when you add or remove fields from a class. By default, for some reason, Spring does not delete the cache entry when a serialization error occurs; instead it throws an error, which forces you to manually delete the affected keys from Redis.
Additional Answers
The problem of serializing entities to JSON is not really a cache problem as such. It is an old problem that is easier to avoid than to solve. link In general, this approach points to bad architecture. For that reason, if you need to serialize data fetched from the database, the cleanest option is to create a DTO object and fill it with data from the entity. And since caching database data is such a common task, it is better to use standard Java serialization for it.
You also need to keep the speed of serialization and deserialization in mind. In tasks where the business logic itself is cheap, most of the time is often spent just serializing and deserializing JSON. However good the Jackson library is, this cost has to be remembered. That question is beyond the scope of this discussion, but there are many interesting answers on the topic on Stack Overflow.
You can also consider the amount of memory needed to store the cache. As recent trends in backend development show, JSON is increasingly being replaced by serialization approaches such as gRPC. In some cases this significantly reduces the amount of data sent over the network, which can also save a lot of memory for caching. Admittedly, I do not know how much better standard Java serialization is than JSON in this respect.
Summarizing: although data in Redis is easier for a person to read in the form of a JSON object, in my experience it is better to compromise here than to end up with a bad architecture.