Spring 4.1 to 4.2 migration: why does persistence no longer work? - spring-transactions

I was using Spring 4.1.0 with Hibernate 4.3.6 and everything was OK.
After migrating Spring to 4.2.8, persistence no longer works.
There is no exception and no trace: the entity manager's persist method is called, but nothing reaches the database.
It is as if the transaction manager were not working.
This is my persistence configuration:
@Configuration
@EnableTransactionManagement
public class PersistenceConfiguration {

    @Bean
    public BasicDataSource driverManagerDataSource() {
        final BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3306/xxx");
        dataSource.setUsername("root");
        dataSource.setPassword("root");
        dataSource.setValidationQuery("SELECT 1");
        dataSource.setDefaultAutoCommit(false);
        dataSource.setInitialSize(10);
        dataSource.setMaxActive(20);
        dataSource.setMaxIdle(10);
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean localContainerEntityManagerFactoryBean() {
        final LocalContainerEntityManagerFactoryBean localContainerEntityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        localContainerEntityManagerFactoryBean.setDataSource(driverManagerDataSource());
        localContainerEntityManagerFactoryBean.setPersistenceUnitName("xxxPersistenceUnitName");
        localContainerEntityManagerFactoryBean.setPackagesToScan("org.xxx.model");
        localContainerEntityManagerFactoryBean.setJpaVendorAdapter(new org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter());
        final HashMap<String, String> map = new HashMap<>();
        map.put("hibernate.dialect", "org.hibernate.dialect.MySQL5Dialect");
        map.put("hibernate.hbm2ddl.auto", "update");
        map.put("hibernate.show_sql", "false");
        map.put("hibernate.format_sql", "false");
        localContainerEntityManagerFactoryBean.setJpaPropertyMap(map);
        localContainerEntityManagerFactoryBean.setJpaDialect(new org.springframework.orm.jpa.vendor.HibernateJpaDialect());
        return localContainerEntityManagerFactoryBean;
    }

    @Bean
    public JpaTransactionManager transactionManager() {
        final JpaTransactionManager jpaTransactionManager = new JpaTransactionManager();
        jpaTransactionManager.setEntityManagerFactory(localContainerEntityManagerFactoryBean().getNativeEntityManagerFactory());
        return jpaTransactionManager;
    }
}
Dependency injection:
@Configuration
@Import({PersistenceConfiguration.class, UserConfiguration.class, SecurityConfiguration.class})
@ComponentScan(basePackages = "org.xxx")
@EnableWebMvc
public class XxxProjectConfiguration {

    private static Logger LOG = Logger.getLogger(XxxProjectConfiguration.class);

    @Autowired
    private Environment env;

    @PostConstruct
    public void initApp() {
        LOG.debug("Looking for Spring profiles...");
        if (env.getActiveProfiles().length == 0) {
            LOG.info("No Spring profile configured, running with default configuration.");
        } else {
            for (String profile : env.getActiveProfiles()) {
                LOG.info("Detected Spring profile: " + profile);
            }
        }
    }

    @Autowired
    private UserConfiguration userConfiguration;

    // DAO
    @Bean
    public RelationshipDAO relationshipDAO() {
        return new RelationshipDAOImpl();
    }

    @Bean
    public RelationshipStatusDAO relationshipStatusDAO() {
        return new RelationshipStatusDAOImpl();
    }

    @Bean
    public MessageDAO messageDAO() {
        return new MessageDAOImpl();
    }

    // Services
    @Bean
    public UserServiceImpl userService() {
        return new UserServiceImpl(userConfiguration.userDAO(), relationshipDAO(), relationshipStatusDAO(), messageDAO());
    }
}
And
@Configuration
@Import(PersistenceConfiguration.class)
public class UserConfiguration {

    @Bean
    public UserDAO userDAO() {
        return new UserDAOImpl();
    }
}
The service:
@Transactional(propagation = Propagation.SUPPORTS)
public class UserServiceImpl implements UserService, Serializable {

    private static final long serialVersionUID = 1L;

    private UserDAO userDAO;
    private RelationshipDAO relationshipDAO;
    private RelationshipStatusDAO relationshipStatusDAO;
    private MessageDAO messageDAO;

    public UserServiceImpl(final UserDAO userDAO, final RelationshipDAO relationshipDAO, final RelationshipStatusDAO relationshipStatusDAO, final MessageDAO messageDAO) {
        this.userDAO = userDAO;
        this.relationshipDAO = relationshipDAO;
        this.relationshipStatusDAO = relationshipStatusDAO;
        this.messageDAO = messageDAO;
    }

    @Override
    @Transactional(propagation = Propagation.REQUIRED, rollbackFor = UserServiceException.class)
    public RelationshipStatus wantsRelationship(final long fromUserId, final long toUserId) throws UserServiceException {
        try {
            final Relationship relationship = new Relationship(new Date());
            User fromUser = userDAO.get(fromUserId);
            User toUser = new User(toUserId);
            relationship.getUsers().add(fromUser);
            fromUser.getRelationships().add(relationship);
            relationship.getUsers().add(toUser);
            toUser.getRelationships().add(relationship);
            relationship.setWantsFromUserId(fromUserId);
            final Message message = new Message(fromUserId, "Hi ! My name is " + fromUser.getFirstName() + ", I want to meet you");
            relationship.getMessages().add(message);
            relationship.setStatus(new RelationshipStatus(Status.WANTS));
            relationshipDAO.persist(relationship);
            return relationship.getStatus();
        } catch (Exception e) {
            throw new UserServiceException(e);
        }
    }

    ...
}
I do not understand what is going on at all...

The missing piece is to let Spring inject the EntityManagerFactory into the transaction manager bean, instead of fetching it from the factory bean by hand:

@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
    JpaTransactionManager transactionManager = new JpaTransactionManager();
    transactionManager.setEntityManagerFactory(emf);
    return transactionManager;
}

With the original wiring, calling localContainerEntityManagerFactoryBean().getNativeEntityManagerFactory() handed the transaction manager the raw Hibernate factory, while the DAOs worked through the Spring-managed factory proxy, so the transactions being started and committed were apparently not the ones the persist calls ran in.
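To verify that transactions are actually opened and committed after this change, it helps to raise the log level of Spring's transaction infrastructure. A minimal sketch for a log4j.properties file, assuming Log4j is the logging backend (which the Logger usage above suggests):

log4j.logger.org.springframework.transaction=DEBUG
log4j.logger.org.springframework.orm.jpa=DEBUG

With this enabled, each call to a @Transactional service method should log entries along the lines of "Creating new transaction with name [...]" and "Committing JPA transaction".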

Related

SFTP @Poller not triggering; nothing happens

I am trying to set up a Spring Boot application that polls CSV files over SFTP. I do not see any activity happening in the Spring Boot application or on the FileZilla SFTP server, but if I change the same code to FTP then it works.
@Component
@EnableIntegration
public class IntegrationConfiguration {

    @Autowired
    FTPConfigProperties ftpConfigProperties;

    @Autowired
    private BeanFactory beanFactory;

    @Value("classpath:certificate.crt")
    Resource certficateFile;

    @Bean
    public SessionFactory<ChannelSftp.LsEntry> ftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
        factory.setHost("127.0.0.1");
        factory.setPort(990);
        factory.setUser("abhinav");
        factory.setPassword("nssdw");
        factory.setPrivateKey(certficateFile);
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<ChannelSftp.LsEntry>(factory, 100000);
    }

    @Bean
    public SftpInboundFileSynchronizer ftpInboundFileSynchronizer() {
        SftpInboundFileSynchronizer fileSynchronizer = new SftpInboundFileSynchronizer(ftpSessionFactory());
        fileSynchronizer.setDeleteRemoteFiles(false);
        fileSynchronizer.setRemoteDirectory("/");
        fileSynchronizer.setFilter(filter());
        fileSynchronizer.setPreserveTimestamp(true);
        fileSynchronizer.setBeanFactory(beanFactory);
        return fileSynchronizer;
    }

    // here the poller is configured
    @Bean
    @InboundChannelAdapter(channel = "fromSftpChannel", poller = @Poller(fixedDelay = "10000"))
    public MessageSource<File> ftpMessageSource() throws Exception {
        SftpInboundFileSynchronizingMessageSource source = new SftpInboundFileSynchronizingMessageSource(
                ftpInboundFileSynchronizer());
        source.setLocalDirectory(new File("ftp-inbound"));
        source.setAutoCreateLocalDirectory(true);
        source.setMaxFetchSize(1);
        source.setBeanFactory(beanFactory);
        source.setUseWatchService(true);
        return source;
    }

    public CompositeFileListFilter<ChannelSftp.LsEntry> filter() {
        CompositeFileListFilter<ChannelSftp.LsEntry> filter = new CompositeFileListFilter<ChannelSftp.LsEntry>();
        filter.addFilter(new SftpSimplePatternFileListFilter("*.csv"));
        filter.addFilter(acceptOnceFilter());
        filter.addFilter(new LastModifiedLsEntryFileListFilter());
        return filter;
    }

    @Bean
    public SftpPersistentAcceptOnceFileListFilter acceptOnceFilter() {
        SftpPersistentAcceptOnceFileListFilter filter = new SftpPersistentAcceptOnceFileListFilter(metadataStore(), "ftpPersistentAcceptOnce");
        filter.setFlushOnUpdate(true);
        return filter;
    }

    @Bean
    public ConcurrentMetadataStore metadataStore() {
        PropertiesPersistingMetadataStore propertiesPersistingMetadataStore = new PropertiesPersistingMetadataStore();
        propertiesPersistingMetadataStore.setBaseDirectory("./metastore");
        propertiesPersistingMetadataStore.setFileName("ftpStream.properties");
        return propertiesPersistingMetadataStore;
    }

    @Bean
    @ServiceActivator(inputChannel = "jobChannel", outputChannel = "nullChannel")
    protected JobLaunchingMessageHandler launcher(JobLauncher jobLauncher) {
        return new JobLaunchingMessageHandler(jobLauncher);
    }
}
Here is the next piece, where I trigger the Spring Batch job; from there it goes to the service activator:
@Component
public class FileToJobTransformer implements ApplicationContextAware {

    private ApplicationContext context;

    @Autowired
    private Job job;

    @Transformer(inputChannel = "fromSftpChannel", outputChannel = "jobChannel")
    public JobLaunchRequest transform(File aFile) throws Exception {
        String fileName = aFile.getName();
        JobParameters jobParameters = new JobParametersBuilder().addString(
                "input.file", aFile.getAbsolutePath()).toJobParameters();
        JobLaunchRequest request = new JobLaunchRequest(job, jobParameters);
        return request;
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        this.context = applicationContext;
    }
}
The custom filter code is as follows:
public class LastModifiedLsEntryFileListFilter implements FileListFilter<ChannelSftp.LsEntry> {

    private static final long DEFAULT_AGE = 60;
    private volatile long age = DEFAULT_AGE;

    public long getAge() {
        return this.age;
    }

    public void setAge(long age) {
        setAge(age, TimeUnit.SECONDS);
    }

    public void setAge(long age, TimeUnit unit) {
        this.age = unit.toSeconds(age);
    }

    @Override
    public List<ChannelSftp.LsEntry> filterFiles(ChannelSftp.LsEntry[] files) {
        System.out.println("files = [" + files.length + "]");
        List<ChannelSftp.LsEntry> list = new ArrayList<ChannelSftp.LsEntry>();
        long now = System.currentTimeMillis() / 1000;
        for (ChannelSftp.LsEntry file : files) {
            if (file.getAttrs().isDir()) {
                continue;
            }
            // accept only files whose last modification is at least 'age' seconds old
            int lastModifiedTime = file.getAttrs().getMTime();
            if (lastModifiedTime + this.age <= now) {
                list.add(file);
            }
        }
        Collections.reverse(list);
        // guard against an empty result before picking the first entry
        if (list.isEmpty()) {
            return Collections.emptyList();
        }
        ArrayList<ChannelSftp.LsEntry> oneElementList = new ArrayList<ChannelSftp.LsEntry>(1);
        oneElementList.add(list.get(0));
        return oneElementList;
    }
}

Spring KafkaEmbedded testing: listener is not consuming messages

I would like to unit test some Spring Kafka listeners. This works fine in production, but I have some problems with the unit tests. I defined the configuration completely with Spring configuration beans, but the listener is never called. Did I miss something?
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = {KafkaSpringBootTest.class})
@Configuration
@DirtiesContext
public class KafkaSpringBootTest {

    @ClassRule
    public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1);

    @BeforeClass
    public static void setup() {
        System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
        System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
        System.setProperty("spring.kafka.consumer.auto-offset-reset", "earliest");
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public DefaultKafkaConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public KafkaTransferListener kafkaTransferListener() {
        return new KafkaTransferListener();
    }

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    public void testSendMessage() throws InterruptedException {
        System.out.println("now sending");
        kafkaTemplate.send("test", "hello");
        Thread.sleep(5000);
    }
}

class KafkaTransferListener {

    @KafkaListener(topics = "test")
    public void listen(String test) {
        System.out.println("received message via kafka: " + test);
    }
}
Versions:
org.springframework.kafka:spring-kafka-test:2.0.0.RELEASE
org.apache.kafka:kafka_2.11:0.11.0.0
Thanks in advance
I don't see an @SpringBootApplication configuration, so there is nothing to ensure that Spring Kafka is auto-configured.
You use KafkaEmbedded and its properties to configure the ProducerConfig, but at the same time I don't see how you configure the ConsumerConfig. Essentially you should use the same properties from the embedded Kafka.
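If you do stay with manual configuration, spring-kafka-test already provides a helper that derives consumer properties from the embedded broker, which guarantees producer and consumer point at the same instance. A sketch (the group id and auto-commit flag are arbitrary choices):

Map<String, Object> consumerProps =
        KafkaTestUtils.consumerProps("testGroup", "true", embeddedKafka);
consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
DefaultKafkaConsumerFactory<String, String> consumerFactory =
        new DefaultKafkaConsumerFactory<>(consumerProps);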
With the Boot auto-configuration you really don't need all that kung fu. The simplest way to configure everything against the embedded Kafka is:
@BeforeClass
public static void setup() {
    System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
}
And you definitely must have an @SpringBootApplication class in the same package as this test. Everything else will be done by Boot.
See this sample on the matter.
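For illustration, a minimal sketch of such an entry point; the class name is hypothetical, and it also registers the listener so the test class no longer needs its own producer/consumer beans:

@SpringBootApplication
public class KafkaTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaTestApplication.class, args);
    }

    @Bean
    public KafkaTransferListener kafkaTransferListener() {
        return new KafkaTransferListener();
    }
}

With the system properties set in setup(), Boot then auto-configures the KafkaTemplate and the @KafkaListener container against the embedded broker.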

Mocking a method inside my test class

Android Studio 2.3
I have the following method I want to test inside my model class:
public class RecipeListModelImp implements RecipeListModelContract {

    private Subscription subscription;
    private RecipesAPI recipesAPI;
    private RecipeSchedulers recipeSchedulers;

    @Inject
    public RecipeListModelImp(@NonNull RecipesAPI recipesAPI, @NonNull RecipeSchedulers recipeSchedulers) {
        this.recipesAPI = Preconditions.checkNotNull(recipesAPI);
        this.recipeSchedulers = Preconditions.checkNotNull(recipeSchedulers);
    }

    @Override
    public void getRecipesFromAPI(final RecipeGetAllListener recipeGetAllListener) {
        subscription = recipesAPI.getAllRecipes()
                .subscribeOn(recipeSchedulers.getBackgroundScheduler())
                .observeOn(recipeSchedulers.getUIScheduler())
                .subscribe(new Subscriber<List<Recipe>>() {
                    @Override
                    public void onCompleted() {
                    }

                    @Override
                    public void onError(Throwable e) {
                        recipeGetAllListener.onRecipeGetAllFailure(e.getMessage());
                    }

                    @Override
                    public void onNext(List<Recipe> recipe) {
                        recipeGetAllListener.onRecipeGetAllSuccess(recipe);
                    }
                });
    }

    @Override
    public void shutdown() {
        if (subscription != null && !subscription.isUnsubscribed()) {
            subscription.unsubscribe();
        }
    }
}
Inside my test class I am testing like this:
public class RecipeListModelImpTest {

    @Mock Subscription subscription;
    @Mock RecipesAPI recipesAPI;
    @Mock RecipeListModelContract.RecipeGetAllListener recipeGetAllListener;
    @Mock List<Recipe> recipes;
    @Inject RecipeSchedulers recipeSchedulers;

    private RecipeListModelContract recipeListModel;

    @Before
    public void setup() {
        TestBusbyComponent testBusbyComponent = DaggerTestBusbyComponent.builder()
                .mockRecipeSchedulersModule(new MockRecipeSchedulersModule())
                .build();
        testBusbyComponent.inject(RecipeListModelImpTest.this);
        MockitoAnnotations.initMocks(RecipeListModelImpTest.this);
        recipeListModel = new RecipeListModelImp(recipesAPI, recipeSchedulers);
    }

    @Test(expected = NullPointerException.class)
    public void testShouldThrowExceptionOnNullParameter() {
        recipeListModel = new RecipeListModelImp(null, null);
    }

    @Test
    public void testRecipeListModelShouldNotBeNull() {
        assertNotNull(recipeListModel);
    }

    @Test
    public void testShouldGetRecipesFromAPI() {
        when(recipesAPI.getAllRecipes()).thenReturn(Observable.just(recipes));
        recipeListModel.getRecipesFromAPI(recipeGetAllListener);
        verify(recipesAPI, times(1)).getAllRecipes();
        verify(recipeGetAllListener, times(1)).onRecipeGetAllSuccess(recipes);
        verify(recipeGetAllListener, never()).onRecipeGetAllFailure(anyString());
    }

    @Test
    public void testShouldFailToGetRecipesFromAPI() {
        when(recipesAPI.getAllRecipes())
                .thenReturn(Observable.<List<Recipe>>error(
                        new Throwable(new RuntimeException("Failed to get recipes"))));
        recipeListModel.getRecipesFromAPI(recipeGetAllListener);
        verify(recipesAPI, times(1)).getAllRecipes();
        verify(recipeGetAllListener, times(1)).onRecipeGetAllFailure(anyString());
        verify(recipeGetAllListener, never()).onRecipeGetAllSuccess(recipes);
    }

    @Test
    public void testShouldShutdown() {
        when(subscription.isUnsubscribed()).thenReturn(false);
        final Field subscriptionField;
        try {
            subscriptionField = recipeListModel.getClass().getDeclaredField("subscription");
            subscriptionField.setAccessible(true);
            subscriptionField.set(recipeListModel, subscription);
        } catch (NoSuchFieldException e) {
            e.printStackTrace();
        } catch (IllegalAccessException e) {
            e.printStackTrace();
        }
        recipeListModel.shutdown();
        verify(subscription, times(1)).unsubscribe();
    }
}
However, the problem is that the Subscription in my model class is always null, so it will never enter the if block. Is there any way to test this with Mockito or spies?
Many thanks for any suggestions.
For testing the shutdown() method you need to set a mock Subscription into the recipeListModel class.
If there is no setter for subscription in recipeListModel (and no constructor parameter), you can set the mock object with reflection, like this:
@Test
public void testShouldShutdown() throws NoSuchFieldException, IllegalAccessException {
    Subscription subscription = mock(Subscription.class);
    when(subscription.isUnsubscribed()).thenReturn(false);
    // inject the mock into the private field via reflection
    Field subscriptionField = recipeListModel.getClass().getDeclaredField("subscription");
    subscriptionField.setAccessible(true);
    subscriptionField.set(recipeListModel, subscription);
    recipeListModel.shutdown();
    verify(subscription, times(1)).unsubscribe();
}
After your update: if you can't change the way the Subscription is created, you would have to mock the full chain of calls that creates it. I don't know your API, so this is just the idea:
Subscription subscription = mock(Subscription.class);
when(subscription.isUnsubscribed()).thenReturn(false);
// prepare a mock for every step of the chain that creates the Subscription
// for recipesAPI.getAllRecipes()
Observable<List<Recipe>> mockFor_getAllRecipes = mock(...);
when(recipesAPI.getAllRecipes()).thenReturn(mockFor_getAllRecipes);
// for .subscribeOn(recipeSchedulers.getBackgroundScheduler())
Observable<List<Recipe>> mockFor_subscribeOn = mock(...);
when(mockFor_getAllRecipes.subscribeOn(any())).thenReturn(mockFor_subscribeOn);
// for .observeOn(recipeSchedulers.getUIScheduler())
Observable<List<Recipe>> mockFor_observeOn = mock(...);
when(mockFor_subscribeOn.observeOn(any())).thenReturn(mockFor_observeOn);
// for .subscribe(...)
when(mockFor_observeOn.subscribe(any(Subscriber.class))).thenReturn(subscription);
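An alternative that avoids mocking the whole chain: return a real Observable that never completes, so getRecipesFromAPI() creates a real, still-active Subscription, then call shutdown() and read the field back to assert it was unsubscribed. A sketch assuming RxJava 1.x (as in the question) and that the injected RecipeSchedulers supplies synchronous schedulers such as Schedulers.immediate():

@Test
public void testShouldShutdownWithRealSubscription() throws Exception {
    // Observable.never() emits nothing, so the subscription created in
    // getRecipesFromAPI() stays active until shutdown() unsubscribes it
    when(recipesAPI.getAllRecipes()).thenReturn(Observable.<List<Recipe>>never());
    recipeListModel.getRecipesFromAPI(recipeGetAllListener);
    recipeListModel.shutdown();
    // read the private field back to verify the real subscription state
    Field subscriptionField = recipeListModel.getClass().getDeclaredField("subscription");
    subscriptionField.setAccessible(true);
    assertTrue(((Subscription) subscriptionField.get(recipeListModel)).isUnsubscribed());
}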

Spring Cloud App Starter, sftp source, recurse a directory for files

I am using the SFTP Source in Spring Cloud Dataflow, and it works for getting files defined in sftp:remote-dir:/home/someone/source. Now I have many subfolders under the remote-dir, and I want to recursively get all the files under this directory which match the pattern. I am trying to use filename-regex:, but so far it only works on one level. How do I recursively get the files I need?
The inbound channel adapter does not support recursion; use a custom source with the outbound gateway with an MGET command, with recursion (-R).
The doc was missing that option; it is fixed in the current docs.
I opened an issue to create a standard app starter.
EDIT
With the Java DSL...
@SpringBootApplication
@EnableBinding(Source.class)
public class So44710754Application {

    public static void main(String[] args) {
        SpringApplication.run(So44710754Application.class, args);
    }

    // should store in Redis or similar for persistence
    private final ConcurrentMap<String, Boolean> processed = new ConcurrentHashMap<>();

    @Bean
    public IntegrationFlow flow() {
        return IntegrationFlows.from(source(), e -> e.poller(Pollers.fixedDelay(30_000)))
                .handle(gateway())
                .split()
                .<File>filter(p -> this.processed.putIfAbsent(p.getAbsolutePath(), true) == null)
                .transform(Transformers.fileToByteArray())
                .channel(Source.OUTPUT)
                .get();
    }

    private MessageSource<String> source() {
        return () -> new GenericMessage<>("foo/*");
    }

    private AbstractRemoteFileOutboundGateway<LsEntry> gateway() {
        AbstractRemoteFileOutboundGateway<LsEntry> gateway = Sftp.outboundGateway(sessionFactory(), "mget", "payload")
                .localDirectory(new File("/tmp/foo"))
                .options(Option.RECURSIVE)
                .get();
        gateway.setFileExistsMode(FileExistsMode.IGNORE);
        return gateway;
    }

    private SessionFactory<LsEntry> sessionFactory() {
        DefaultSftpSessionFactory sf = new DefaultSftpSessionFactory();
        sf.setHost("10.0.0.3");
        sf.setUser("ftptest");
        sf.setPassword("ftptest");
        sf.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(sf);
    }
}
And with Java config...
@SpringBootApplication
@EnableBinding(Source.class)
public class So44710754Application {

    public static void main(String[] args) {
        SpringApplication.run(So44710754Application.class, args);
    }

    @InboundChannelAdapter(channel = "sftpGate", poller = @Poller(fixedDelay = "30000"))
    public String remoteDir() {
        return "foo/*";
    }

    @Bean
    @ServiceActivator(inputChannel = "sftpGate")
    public SftpOutboundGateway mgetGate() {
        SftpOutboundGateway sftpOutboundGateway = new SftpOutboundGateway(sessionFactory(), "mget", "payload");
        sftpOutboundGateway.setOutputChannelName("splitterChannel");
        sftpOutboundGateway.setFileExistsMode(FileExistsMode.IGNORE);
        sftpOutboundGateway.setLocalDirectory(new File("/tmp/foo"));
        sftpOutboundGateway.setOptions("-R");
        return sftpOutboundGateway;
    }

    @Bean
    @Splitter(inputChannel = "splitterChannel")
    public DefaultMessageSplitter splitter() {
        DefaultMessageSplitter splitter = new DefaultMessageSplitter();
        splitter.setOutputChannelName("filterChannel");
        return splitter;
    }

    // should store in Redis, Zookeeper, or similar for persistence
    private final ConcurrentMap<String, Boolean> processed = new ConcurrentHashMap<>();

    @Filter(inputChannel = "filterChannel", outputChannel = "toBytesChannel")
    public boolean filter(File payload) {
        return this.processed.putIfAbsent(payload.getAbsolutePath(), true) == null;
    }

    @Bean
    @Transformer(inputChannel = "toBytesChannel", outputChannel = Source.OUTPUT)
    public FileToByteArrayTransformer toBytes() {
        FileToByteArrayTransformer transformer = new FileToByteArrayTransformer();
        return transformer;
    }

    private SessionFactory<LsEntry> sessionFactory() {
        DefaultSftpSessionFactory sf = new DefaultSftpSessionFactory();
        sf.setHost("10.0.0.3");
        sf.setUser("ftptest");
        sf.setPassword("ftptest");
        sf.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(sf);
    }
}

@CacheEvict with key="#id" throws NullPointerException

I'm trying to use the Spring caching annotations @Cacheable and @CacheEvict together with the GuavaCacheManager.
I've created a test case with these two tests:
cachesById - verifies that two invocations of a method annotated with @Cacheable return the same object
evict - verifies that two different instances are returned if a method annotated with @CacheEvict is called in between those two invocations
Both work fine when I don't specify a key for @CacheEvict; however, when I do, I get the following exception:
java.lang.NullPointerException
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:210)
at com.google.common.cache.LocalCache$LocalManualCache.invalidate(LocalCache.java:4764)
at org.springframework.cache.guava.GuavaCache.evict(GuavaCache.java:135)
at org.springframework.cache.interceptor.AbstractCacheInvoker.doEvict(AbstractCacheInvoker.java:95)
at org.springframework.cache.interceptor.CacheAspectSupport.performCacheEvict(CacheAspectSupport.java:409)
at org.springframework.cache.interceptor.CacheAspectSupport.processCacheEvicts(CacheAspectSupport.java:392)
at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:362)
at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:299)
at org.springframework.cache.interceptor.CacheInterceptor.invoke(CacheInterceptor.java:61)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:653)
at com.myorg.caching.CacheTest$Repo$$EnhancerBySpringCGLIB$$eed50f3e.update(<generated>)
at com.myorg.caching.CacheTest.evict(CacheTest.java:50)
This can be reproduced by executing the below test.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(
        classes = { Repo.class, CacheTest.SpringConfig.class },
        loader = AnnotationConfigContextLoader.class)
public class CacheTest {

    private static final String CACHE_NAME = "cacheName";

    @Inject
    private Repo repo;

    @Test
    public void cachesById() {
        Entity aResult1 = repo.getEntity(1);
        Entity aResult2 = repo.getEntity(1);
        assertEquals(aResult1.getId(), aResult2.getId());
        assertSame(aResult1, aResult2);
    }

    @Test
    public void evict() {
        Entity aResult1 = repo.getEntity(1);
        repo.update(aResult1);
        Entity aResult2 = repo.getEntity(1);
        assertEquals(aResult1.getId(), aResult2.getId());
        assertNotSame(aResult1, aResult2);
    }

    /** Mock repository/entity classes below. */
    @Component
    public static class Repo {

        @Cacheable(value = CACHE_NAME, key = "#id")
        public Entity getEntity(int id) {
            return new Entity(id);
        }

        @CacheEvict(value = CACHE_NAME, key = "#id")
        public void update(Entity e) {
        }
    }

    public static class Entity {

        private int id;

        public Entity(int id) {
            super();
            this.id = id;
        }

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }
    }

    /** Guava CacheManager Spring configuration */
    @Configuration
    @EnableCaching
    public static class SpringConfig {

        @Bean
        public CacheManager cacheManager() {
            GuavaCacheManager manager = new GuavaCacheManager(CACHE_NAME);
            manager.setCacheBuilder(CacheBuilder.newBuilder().expireAfterWrite(
                    1, TimeUnit.MINUTES).recordStats());
            return manager;
        }
    }
}
However, the test passes if I change

@CacheEvict(value = CACHE_NAME, key = "#id")
public void update(Entity e) {

into:

@CacheEvict(value = CACHE_NAME)
public void update(Entity e) {

...but then I'm missing the point where I need to specify the cache key for Entity. Does anyone know what I'm missing?
Thanks!
You have to fix your component class from
@Component
public static class Repo {

    @Cacheable(value = CACHE_NAME, key = "#id")
    public Entity getEntity(int id) {
        return new Entity(id);
    }

    @CacheEvict(value = CACHE_NAME, key = "#id")
    public void update(Entity e) {
    }
}

to

@Component
public static class Repo {

    @Cacheable(value = CACHE_NAME, key = "#id")
    public Entity getEntity(int id) {
        return new Entity(id);
    }

    @CacheEvict(value = CACHE_NAME, key = "#e?.id")
    public void update(Entity e) {
    }
}
Why? In the getEntity method you cache an Entity object under its int id, so you have to supply the same int id in the @CacheEvict-annotated method. But update(Entity e) has no id parameter, so the expression #id evaluates to null, and Guava's cache rejects null keys; that is where the NullPointerException comes from. You don't have to change the method's signature: with SpEL you can "reach into" the entity and use its id field.
Hope I helped.
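As a side note: if update() is supposed to refresh the cached value rather than just drop it, @CachePut can do that. A hypothetical variant, assuming update(...) is changed to return the persisted entity:

@CachePut(value = CACHE_NAME, key = "#e.id")
public Entity update(Entity e) {
    // ... apply the changes ...
    return e; // the returned value replaces the cached entry
}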
