I am using Spring Data Cassandra, and I have a @Table entity as defined below.
@Table(CassandraConstants.NotificationThread.NAME)
public class Event implements Serializable {

    private static final long serialVersionUID = 1L;

    @PrimaryKey
    private EventKey primaryKey;

    @Column(value = CassandraConstants.Event.COL_COMPONENT_TYPE)
    private ComponentType componentType;
    ...
}
In my DAO code I am setting the enum value and then doing a save, but I get an error.
event.setComponentType(ComponentType.CONNECTOR);
....
this.eventDao.save(event);
But I see this error reported during the save:
Invalid value CONNECTOR of type unknown to the query builder...
Does Spring Data not handle the conversion of enums to a string data type for Cassandra?
Any pointers to what is failing here would be appreciated.
Are you using the official spring-data-cassandra project (repo at https://github.com/spring-projects/spring-data-cassandra) that's under the Spring Data umbrella of projects, or a different project?
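No resolution is shown in the thread, but if this is the official spring-data-cassandra project, one common approach is to register an explicit write converter so the enum is persisted as text. The sketch below is an assumption, not the asker's eventual fix; ComponentType is the enum from the question:

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

// Persists ComponentType as its name (text) instead of handing the raw enum
// to the driver's query builder. Register this converter through your Cassandra
// configuration's custom conversions; the exact hook differs between versions.
@WritingConverter
public class ComponentTypeWriteConverter implements Converter<ComponentType, String> {

    @Override
    public String convert(ComponentType source) {
        return source.name();
    }
}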
I have a Repository interface that has two implementations. One reads data from a locally stored CSV file while the other reads from Amazon DynamoDB. I would like to be able to switch which implementation I'm using based on an application property or a custom build profile. I would normally use a factory to retrieve the correct class at runtime, but I would like to do this with injection if possible.
I found a similar question for Spring Boot (Spring choose bean implementation at runtime) but couldn't find an equivalent that would work in Quarkus.
I also tried implementing a configuration class similar to what is found in the docs (https://quarkus.io/guides/cdi-reference#default_beans), but again didn't have much luck.
It feels like I'm missing something obvious so any pointers would be much appreciated.
Here is a simple example of my classes:
@ApplicationScoped
public class ExampleService {

    @Inject
    ExampleRepository repository;

    public List<Data> retrieveData() {
        return repository.retrieveData();
    }
}

public interface ExampleRepository {
    List<Data> retrieveData();
}

@ApplicationScoped
public class DynamoRepository implements ExampleRepository {

    @Override
    public List<Data> retrieveData() {
        // Get data from DynamoDB
    }
}

@ApplicationScoped
public class CsvRepository implements ExampleRepository {

    @Inject
    CsvBeanHandler csvBeanHandler;

    @Inject
    LocalFileReader fileReader;

    @Override
    public List<Data> retrieveData() {
        // Get data from CSV
    }
}
I currently also have the following in my application.yml:
com:
  example:
    application:
      storage-type: 'CSV' # OR AMAZON_DYNAMO_DB
It looks like they've added this directly to the documentation:
https://quarkus.io/guides/cdi-reference#declaratively-choose-beans-that-can-be-obtained-by-programmatic-lookup
I feel a bit guilty pasting this much, but it's the SO way.
I can add that it is NOT like a Guice 'binding'; BOTH classes will be instantiated, but only one will be injected. Also unlike Guice, you cannot inject the interface (or I did it wrong) - you have to do what's shown below, with Instance.
Personally I just use constructor injection and then drop the value of the Instance wrapper into a final field, so I'm not crying about the extra step. I do miss the power and explicit bindings possible with Modules ala Guice, but the simplicity here has its own value.
5.16. Declaratively Choose Beans That Can Be Obtained by Programmatic Lookup
It is sometimes useful to narrow down the set of beans that can be obtained by programmatic lookup via javax.enterprise.inject.Instance. Typically, a user needs to choose the appropriate implementation of an interface based on a runtime configuration property.

Imagine that we have two beans implementing the interface org.acme.Service. You can't inject the org.acme.Service directly unless your implementations declare a CDI qualifier. However, you can inject the Instance instead, then iterate over all implementations and choose the correct one manually. Alternatively, you can use the @LookupIfProperty and @LookupUnlessProperty annotations. @LookupIfProperty indicates that a bean should only be obtained if a runtime configuration property matches the provided value. @LookupUnlessProperty, on the other hand, indicates that a bean should only be obtained if a runtime configuration property does not match the provided value.
@LookupIfProperty Example

interface Service {
    String name();
}

@LookupIfProperty(name = "service.foo.enabled", stringValue = "true")
@ApplicationScoped
class ServiceFoo implements Service {

    public String name() {
        return "foo";
    }
}

@ApplicationScoped
class ServiceBar implements Service {

    public String name() {
        return "bar";
    }
}

@ApplicationScoped
class Client {

    @Inject
    Instance<Service> service;

    void printServiceName() {
        // This will print "bar" if the property "service.foo.enabled" is NOT set to "true"
        // If "service.foo.enabled" is set to "true" then service.get() would result in an AmbiguousResolutionException
        System.out.println(service.get().name());
    }
}
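A minimal sketch of the constructor-injection variant mentioned above (this is not part of the quoted docs; it assumes the same Service interface and @LookupIfProperty setup):

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Instance;
import javax.inject.Inject;

@ApplicationScoped
class Client {

    // Resolved once at construction time and dropped into a final field.
    // With @LookupIfProperty/@LookupUnlessProperty in place, only the
    // implementation matching the runtime configuration is obtainable,
    // so get() resolves unambiguously.
    private final Service service;

    @Inject
    Client(Instance<Service> service) {
        this.service = service.get();
    }

    void printServiceName() {
        System.out.println(service.name());
    }
}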
If your requirement is to bind the right implementation at startup time based on a configuration property, I suppose your problem may be resolved using the @Produces annotation:
public class ExampleRepositoryFactory {

    @ConfigProperty(name = "storage-type")
    String storageType;

    @Produces
    public ExampleRepository repositoryInstance() {
        return "CSV".equals(storageType) ? new CsvRepository() : new DynamoRepository();
    }
}
I have a class
@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class Store {
    private Double probability;
    private String store;
}
and a JSON file:
{"probability":"0.26","store":"abc/s3"}
{"probability":"0.57","store":"abc/s1"}
I try to read it as a Dataset and convert it to a Map. Reading it as a Dataset is successful: I am able to operate on it using Spark SQL commands and also able to view the Dataset using show(), etc.
Dataset<Store> ds = ss.read().json(path).as(Encoders.bean(Store.class));
Map<String, Double> storeMap = ds.collectAsList().stream()
.collect(Collectors.toMap(Store::getStore, Store::getProbability));
But converting to a Map fails with an error, on the ds.collectAsList() call itself:
No applicable constructor/method found for actual parameters
"org.apache.spark.unsafe.types.UTF8String";
candidates are:
"public static java.lang.Double java.lang.Double.valueOf(java.lang.String) throws java.lang.NumberFormatException",
"public static java.lang.Double java.lang.Double.valueOf(double)"
What am I doing wrong?
The problem is not in your code but in your JSON data. It should be:
{"probability":0.26,"store":"abc/s3"}
{"probability":0.57,"store":"abc/s1"}
Or change private Double probability; to private String probability;
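If changing the JSON is not an option, a third possibility (a sketch, not from the answer above; ss, path and Store are the objects from the question) is to cast the string column to a double before applying the bean encoder:

import static org.apache.spark.sql.functions.col;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;

// Cast the quoted "probability" values to doubles first, so the Double field of
// Store no longer receives a UTF8String when the rows are collected.
Dataset<Store> ds = ss.read().json(path)
        .withColumn("probability", col("probability").cast("double"))
        .as(Encoders.bean(Store.class));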
I have created a UDT named widgetData in CQL, for which I have a corresponding POJO class named widgetData. I want to use this in another domain POJO class as a List. What kind of annotation should be used to do so?
#Table("dashboardManagement")
public class Dashboard implements Serializable {
#Column("dashboardState")
#CassandraType(type = DataType.Name.UDT, userTypeName = "widgetData")
private List<widgetData> dashboardState;
....
The above code does not work.
Do I have to write a separate UserTypeResolver for this?
I realize this question is a little old, but I have made this work.
Basically, I had a user profile with an address UDT, and that UDT had its own POJOs/entity classes. The UDT address entity class used the @UserDefinedType annotation:

@UserDefinedType("address")
public class AddressEntity implements Serializable {

    private static final long serialVersionUID = 1817053316281666003L;

    @Column("mailto_name")
    private String mailtoName;
    private String street;
    private String street2;
    private String city;
...
The user entity utilized the Address UDT entity:
#Table("user")
public class UserEntity implements Serializable {
private static final long serialVersionUID = 4067531918643498429L;
#PrimaryKey("user_id")
private UUID userId;
#Column("user_email")
private String userEmail;
#Column("first_name")
private String firstName;
#Column("last_name")
private String lastName;
#Column("addresses")
private List<AddressEntity> addresses;
...
Then, it was a simple matter to map a user's address data to a UserEntity object (userE below) and save it with standard repository methods.
// save to DB
userRepo.save(userE);
You can find everything built to support the User services here: https://github.com/datastaxdevs/workshop-ecommerce-app/blob/main/backend/src/main/java/com/datastax/tutorials/service/user/
So I would say to have a look at the class for the widgetData object, make sure it's using the @UserDefinedType annotation, and mark the column using the @Column annotation in the Dashboard class (basically, get rid of the @CassandraType):

@Column("dashboardState")
private List<WidgetData> dashboardState;
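Applied to the question's classes, that would look roughly like the following (the fields inside WidgetData are placeholders, since the question does not show them):

@UserDefinedType("widgetData")
public class WidgetData implements Serializable {

    // Placeholder fields: map each field of the widgetData UDT here,
    // using @Column where the CQL name differs from the Java name.
    @Column("widget_id")
    private String widgetId;

    private String state;

    // getters and setters omitted
}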
I am unable to save child class data along with the parent class data using Astyanax entity persistence in Cassandra.
I created the child object with all the necessary data, but when I try to store that object, only values from the parent class are stored, not from the child object.
Here is sample code (not the real code):
@Entity
class Shape {

    @Id
    private String id;

    @Column
    private String name;
}

@Entity
class Square extends Shape {

    @Column
    private int width;
}
Now, to store it, I am using Astyanax's EntityManager.
Square s = new Square();
s.setName("square");
s.setWidth(100);
s.setId("1234");
EntityManager em = // initialization code
em.put(s);
After doing this, only "name" and "id" are stored in the database, not "width".
The EntityManager requires the type of the entity via the withEntityType() method. This type is used to build an EntityMapper via reflection which then determines the fields to serialize. There is nothing in the Entity persistence documentation or examples that says Astyanax supports polymorphism. This is not a bug, just a feature that doesn't exist. You will need a type-specific EntityManager for each subtype of your base class.
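A sketch of what that could look like (keyspace and CF_SHAPE are assumed to be an initialized Keyspace and the target ColumnFamily; the builder calls follow Astyanax's entity persistence examples, so verify them against your Astyanax version):

// One EntityManager per concrete subtype, so the mapper reflects over Square
// (including width) rather than only the Shape base class.
EntityManager<Square, String> squareManager = new DefaultEntityManager.Builder<Square, String>()
        .withEntityType(Square.class)
        .withKeyspace(keyspace)
        .withColumnFamily(CF_SHAPE)
        .build();

Square s = new Square();
s.setId("1234");
s.setName("square");
s.setWidth(100);
squareManager.put(s);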
I need to display a large table with about 1300 roles at one time. (I know I should use a data scroller, but my users want to see the whole table at once.) The table displays 4 columns. Two of those columns come from the object itself, while the other two come from objects referenced by the original object. I need to find the most efficient way to do this. I currently have this working, but when I reload the table it gives an out-of-memory error. I think it's caused by the large amount of redundant data in memory.
One idea is to create a view object that the repository fills in with only the needed fields.
Any other suggestions?
Here are the objects:
public class Database extends EntityObject {
private Long id;
private String name;
private String connectionString;
private String username;
private String password;
private String description;
// getter and setters omitted
}
public class Application extends EntityObject {
private Long id;
private String name;
private String fullName = "";
private String description;
private Database database;
private List<Role> roles = new ArrayList<Role>(0);
// getter and setters omitted
}
public class Role extends EntityObject {
private Long id;
private String name;
private String nameOnDatabase;
private Application application;
// getter and setters omitted
}
What I need displayed from the list of Roles is:
role.id, role.name, role.application.name, role.application.database.name
To optimize wisely, define what you are going to do with the data: view it, edit it, or both. Here are some common scenarios:
Retrieval using a lazy fetch type: mark the roles collection in Application with the FetchType.LAZY annotation.
Retrieval using a multiselect query: create a custom class (like a DTO) and populate it with data from the database using a multiselect/constructor query, similar to a database VIEW mapped as an entity; see the sketch below.
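For the second scenario, a minimal sketch using a JPQL constructor expression (the RoleView class and its package are assumptions, not part of the original code):

// A flat, read-only view holding exactly the four displayed values.
public class RoleView {
    private final Long id;
    private final String roleName;
    private final String applicationName;
    private final String databaseName;

    public RoleView(Long id, String roleName, String applicationName, String databaseName) {
        this.id = id;
        this.roleName = roleName;
        this.applicationName = applicationName;
        this.databaseName = databaseName;
    }
    // getters omitted
}

// Selects only the displayed columns instead of loading full
// Role/Application/Database graphs into memory.
List<RoleView> rows = entityManager.createQuery(
        "select new com.example.RoleView(r.id, r.name, a.name, d.name) "
      + "from Role r join r.application a join a.database d", RoleView.class)
      .getResultList();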
There are also other possibilities, such as Shared (L2) Entity Cache or Retrieval by Refresh.
Also check whether you are using the EntityManager correctly by reading: Am I supposed to call EntityManager.clear() often to avoid memory leaks?