Is there a way to create and use following column family with Astyanax:
CREATE TABLE mytest ( id text, subid text, clustering text, static_value text static, value text, PRIMARY KEY((id, subid), clustering));
If not, what are the best options for static columns?
The Astyanax Getting Started guide has a section on composite columns that shows how to ensure proper serialization by annotating the key fields with the @Component annotation and an ordinal:
// Annotated composite class
class SessionEvent {
    @Component(ordinal = 0) private String sessionId;
    @Component(ordinal = 1) private UUID timestamp;

    public SessionEvent() {
    }

    public int hashCode() { ... }
    public boolean equals(Object o) { ... }
    public int compareTo(Object o) { ... }
}
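For context, the getting-started examples then pair such an annotated class with an AnnotatedCompositeSerializer when declaring the column family. The sketch below is from memory, so treat the column family name and the exact helper classes as assumptions to verify against the Astyanax docs:
// Serializer generated from the @Component annotations on SessionEvent
AnnotatedCompositeSerializer<SessionEvent> eventSerializer =
    new AnnotatedCompositeSerializer<SessionEvent>(SessionEvent.class);

// Row key is a String, column names are SessionEvent composites
ColumnFamily<String, SessionEvent> CF_SESSION_EVENTS =
    new ColumnFamily<String, SessionEvent>("SessionEvents",
        StringSerializer.get(), eventSerializer);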
Otherwise, the Astyanax repo also has an example showing how to work directly with CQL3. To create your CF:
String CREATE_STATEMENT = "CREATE TABLE mytest ( id text, subid text, clustering text, static_value text static, value text, PRIMARY KEY((id, subid), clustering))";
try {
@SuppressWarnings("unused")
OperationResult<CqlResult<Integer, String>> result = keyspace
.prepareQuery(EMP_CF)
.withCql(CREATE_STATEMENT)
.execute();
} catch (ConnectionException e) {
logger.error("failed to create CF", e);
throw new RuntimeException("failed to create CF", e);
}
The CQL3 example linked above also contains methods that demonstrate reading and inserting.
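For example, an insert through the same CQL interface might look like the following sketch, which simply mirrors the pattern of the CREATE statement above (EMP_CF, keyspace, and logger are the same references as in that example; the literal values are placeholders):
String INSERT_STATEMENT =
    "INSERT INTO mytest (id, subid, clustering, static_value, value) "
    + "VALUES ('id1', 'sub1', 'c1', 'static1', 'value1')";

try {
    @SuppressWarnings("unused")
    OperationResult<CqlResult<Integer, String>> result = keyspace
        .prepareQuery(EMP_CF)
        .withCql(INSERT_STATEMENT)
        .execute();
} catch (ConnectionException e) {
    logger.error("failed to insert row", e);
    throw new RuntimeException("failed to insert row", e);
}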
Related
I want to persist an entity, but skip it if it already exists in the datastore. Assume the name field is part of the primary key and that p1 already exists in the datastore, so only p2 should be inserted; inserting p1 again produces a duplicate key exception.
@Entity
public class PersonEntity extends PanacheEntity {
String name;
public PersonEntity(String name){
this.name=name;
}
public static Uni<PersonEntity> findByName(String name) {
return find("name", name).firstResult();
}
}
@QuarkusTest
public class PersonResourceTest {
@Test
@ReactiveTransactional
void persistListOfPersons() {
List<PersonEntity> persons = List.of(new PersonEntity("p1"), new PersonEntity("p2"));
Predicate<PersonEntity> personExists = entity -> {
//How to consume Uni?
Uni<PersonEntity> entityUni = PersonEntity.findByName(entity.name);
//entityUni.onItem().ifNull().continueWith(???);
//include entity in filtered stream
//return true;
//exclude entity from filtered stream
return false;
};
List<PersonEntity> filteredPersons = persons.stream().filter(personExists).toList();
PersonEntity.persist(filteredPersons);
}
}
I can't produce a valid filter predicate. I need a boolean value somehow produced by the person query. But how?
This should serve as a minimal reproducible example.
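One possible direction, sketched here under assumptions rather than as a verified answer (it assumes Mutiny's standard Uni operators and the reactive Panache methods already used above): instead of turning the Uni into a boolean inside a blocking Predicate, chain the existence check and the conditional persist on the Uni itself.
// Sketch only: persist a candidate unless an entity with the same name exists.
// Assumes io.smallrye.mutiny.Uni and java.util.List are imported; it reuses the
// question's PersonEntity.findByName and the reactive Panache persist().
Uni<Void> persistIfAbsent(PersonEntity candidate) {
    return PersonEntity.findByName(candidate.name)
            // a null item means "not found", so switch to the persist operation
            .onItem().ifNull().switchTo(() -> candidate.persist().replaceWith(candidate))
            // found or freshly persisted, only completion matters here
            .replaceWithVoid();
}

Uni<Void> persistMissingPersons(List<PersonEntity> persons) {
    // chain the checks sequentially so they run one after another on the session
    Uni<Void> chain = Uni.createFrom().voidItem();
    for (PersonEntity person : persons) {
        chain = chain.chain(() -> persistIfAbsent(person));
    }
    return chain;
}
The test would then subscribe to (or return) the resulting Uni instead of filtering a plain java.util.stream, so the whole flow stays non-blocking.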
I have an Azure table into which I have inserted heterogeneous entities. After retrieving them, I want to convert them to a specific type using "as". I tried this, but it threw the following error:
Cannot convert type 'DynamicTableEntity' to 'TestingEntity' via a reference conversion, boxing conversion, unboxing conversion, wrapping conversion, or null type conversion.
Is there any way I can convert my entities to a particular type?
My code is as follows:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the table client.
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
CloudTable table = tableClient.GetTableReference("TestingWithTableDatetime");
// Create the table if it doesn't exist.
table.CreateIfNotExists();
TableQuery<DynamicTableEntity> entityQuery =
new TableQuery<DynamicTableEntity>();
IEnumerable<DynamicTableEntity> entities = table.ExecuteQuery(entityQuery);
foreach (var e in entities)
{
EntityProperty entityTypeProperty;
if (e.Properties.TryGetValue("EntityType", out entityTypeProperty))
{
if (entityTypeProperty.StringValue == "SampleEntity1")
{
// Cannot use "as" here; this is where the compile error above occurs
var TestingWithTableDatetime = e as SampleEntity1;
}
if (entityTypeProperty.StringValue == "SampleEntity2")
{
// Use entityTypeProperty, RowKey, PartitionKey, Etag, and Timestamp
}
if (entityTypeProperty.StringValue == "SampleEntity3")
{
// Use entityTypeProperty, RowKey, PartitionKey, Etag, and Timestamp
}
}
}
Class definition for Sample1
public class Sample1 : TableEntity
{
public Sample1(string pk, string rk)
{
this.PartitionKey = pk;
this.RowKey = rk;
EntityType = "MonitoringResources";
}
public string EntityType { get; set; }
public Sample1()
{
}
}
Things I have tried: I created a class called testing that inherits TableEntity, and then sample1 inherits from testing, as follows.
The testing class definition:
public class testing : TableEntity
{
public testing(string pk, string rk)
{
this.PartitionKey = pk;
this.RowKey = rk; //MetricKey
}
public string EntityType { get; set; }
public testing()
{
}
}
The modified sample1 class:
public class sample1 : testing
{
public sample1(string pk, string rk) : base(pk, rk)
{
EntityType = "sample1";
}
public sample1()
{
}
}
With this I didn't get any error, but when I convert the entity to sample1 using "as", it returns null.
Finally I ended up creating a helper.
public static class AzureManager
{
/// <summary>
/// Converts a dynamic table entity to .NET Object
/// </summary>
/// <typeparam name="TOutput">Desired Object Type</typeparam>
/// <param name="entity">Dynamic table Entity</param>
/// <returns>Output Object</returns>
public static TOutput ConvertTo<TOutput>(DynamicTableEntity entity)
{
return ConvertTo<TOutput>(entity.Properties, entity.PartitionKey, entity.RowKey);
}
/// <summary>
/// Convert a Dynamic Table Entity to A POCO .NET Object.
/// </summary>
/// <typeparam name="TOutput">Desired Object Types</typeparam>
/// <param name="properties">Dictionary of Table Entity</param>
/// <returns>.NET object</returns>
public static TOutput ConvertTo<TOutput>(IDictionary<string, EntityProperty> properties, string partitionKey, string rowKey)
{
var jobject = new JObject();
properties.Add("PartitionKey", new EntityProperty(partitionKey));
properties.Add("RowKey", new EntityProperty(rowKey));
foreach (var property in properties)
{
WriteToJObject(jobject, property);
}
return jobject.ToObject<TOutput>();
}
public static void WriteToJObject(JObject jObject, KeyValuePair<string, EntityProperty> property)
{
switch (property.Value.PropertyType)
{
case EdmType.Binary:
jObject.Add(property.Key, new JValue(property.Value.BinaryValue));
return;
case EdmType.Boolean:
jObject.Add(property.Key, new JValue(property.Value.BooleanValue));
return;
case EdmType.DateTime:
jObject.Add(property.Key, new JValue(property.Value.DateTime));
return;
case EdmType.Double:
jObject.Add(property.Key, new JValue(property.Value.DoubleValue));
return;
case EdmType.Guid:
jObject.Add(property.Key, new JValue(property.Value.GuidValue));
return;
case EdmType.Int32:
jObject.Add(property.Key, new JValue(property.Value.Int32Value));
return;
case EdmType.Int64:
jObject.Add(property.Key, new JValue(property.Value.Int64Value));
return;
case EdmType.String:
jObject.Add(property.Key, new JValue(property.Value.StringValue));
return;
default:
return;
}
}
}
The above works for me:
var obj = AzureManager.ConvertTo<Sample1>(e);
If you find any other way, please suggest it.
Here is an alternative and much simpler solution for you that is natively supported by Azure Storage SDK version > 8.0.0. You do not even need to write any transformation / conversion code :)
Have a look at:
TableEntity.Flatten method: https://msdn.microsoft.com/en-us/library/azure/mt775434.aspx
TableEntity.ConvertBack method: https://msdn.microsoft.com/en-us/library/azure/mt775432.aspx
These methods are provided by the SDK as static, standalone helpers. The Flatten method converts your entity to a flat dictionary of entity properties; you can then assign a partition key and row key, create a DynamicTableEntity from that flat dictionary, and write it to Azure Table Storage.
When you want to read the entity back, read it as a DynamicTableEntity and pass the property dictionary of the returned entity to the TableEntity.ConvertBack method. Just tell it, via its generic type parameter, which type you want the property dictionary converted into, and it will do the conversion for you.
I originally implemented these APIs as NuGet packages, and they are now integrated into the Azure Storage SDK. If you want to read a bit more about how they work, see the article I originally wrote about the NuGet packages here:
https://doguarslan.wordpress.com/2016/02/03/writing-complex-objects-to-azure-table-storage/
Is there any way I can convert my entities to a particular type?
We could use DynamicTableEntityConverter to do that.
Based on your code, we could use the following to convert the DynamicTableEntity to Sample1:
var TestingWithTableDatetime = DynamicTableEntityConverter.ConvertToPOCO<Sample1>(e);
Versions: Datastax Java driver 3.1.4, Cassandra 3.10
Consider the following table:
create table object_ta
(
objid bigint,
version_date timestamp,
objecttype ascii,
primary key (objid, version_date)
);
And a mapped class:
#Table(name = "object_ta")
public class ObjectTa
{
#Column(name = "objid")
private long objid;
#Column(name = "version_date")
private Instant versionDate;
#Column(name = "objecttype")
private String objectType;
public ObjectTa()
{
}
public ObjectTa(long objid)
{
this.objid = objid;
this.versionDate = Instant.now();
}
public long getObjId()
{
return objid;
}
public void setObjId(long objid)
{
this.objid = objid;
}
public Instant getVersionDate()
{
return versionDate;
}
public void setVersionDate(Instant versionDate)
{
this.versionDate = versionDate;
}
public String getObjectType()
{
return objectType;
}
public void setObjectType(String objectType)
{
this.objectType = objectType;
}
}
After creating a mapper for this class (mm is a MappingManager for the session on mykeyspace)
final Mapper<ObjectTa> mapper = mm.mapper(ObjectTa.class);
On calling
mapper.save(new ObjectTa(1));
I get
Query preparation failed: INSERT INTO mykeyspace.object_ta (objid,objid,version_date,objecttype) VALUES (?,?,?,?);:
com.datastax.driver.core.exceptions.InvalidQueryException: The column names contains duplicates
    at com.datastax.driver.core.Responses$Error.asException(Responses.java:136)
    at com.datastax.driver.core.SessionManager$4.apply(SessionManager.java:220)
    at com.datastax.driver.core.SessionManager$4.apply(SessionManager.java:196)
    at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:906)
    at com.google.common.util.concurrent.Futures$1$1.run(Futures.java:635)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
I am at a loss to understand why the duplicate objid is generated in the query.
Thank you in advance for pointers to the problem.
Clemens
I think it is because of the inconsistent use of case between the field name (objid) and the getters/setters (getObjId/setObjId). If you rename getObjId and setObjId to getObjid and setObjid respectively, I believe it will work.
In a future release, the driver mapper will allow the user to be more explicit about whether setters/getters are used (JAVA-1310) and what the naming conventions are (JAVA-1316).
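For illustration, the renamed accessors in ObjectTa would look like this (nothing else needs to change):
// Accessors named after the "objid" field, so the mapper derives the same
// property name as the @Column(name = "objid") mapping instead of "objId"
public long getObjid()
{
    return objid;
}

public void setObjid(long objid)
{
    this.objid = objid;
}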
I have a Cassandra table trans_by_date with columns origin, tran_date (and some other columns). When I try to run the code below, I get this error:
java.util.NoSuchElementException: Columns not found in table trans.trans_by_date : TRAN_DATE
The column does exist.
Any syntax gotcha?
JavaRDD<TransByDate> transDateRDD = javaFunctions(sc)
.cassandraTable("trans", "trans_by_date", CassandraJavaUtil.mapRowTo(TransByDate.class))
.select(CassandraJavaUtil.column("origin"), CassandraJavaUtil.column("TRAN_DATE").as("transdate"));
public static class TransByDate implements Serializable {
private String origin;
private Date transdate;
public String getOrigin() { return origin; }
public void setOrigin(String id) { this.origin = id; }
public Date getTransdate() { return transdate; }
public void setTransdate(Date trans_date) { this.transdate = trans_date; }
}
Thanks
If you change CassandraJavaUtil.column("TRAN_DATE") to CassandraJavaUtil.column("tran_date"), i.e. only use lower-case column names, your code should work.
It seems that the CassandraJavaUtil puts the column name into double quotes when creating the select query.
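In other words, the select from the question would become the following (only the column name changes):
// Same query as in the question, with the column referenced by its lower-cased
// name so the quoted identifier matches what Cassandra actually stores
JavaRDD<TransByDate> transDateRDD = javaFunctions(sc)
    .cassandraTable("trans", "trans_by_date", CassandraJavaUtil.mapRowTo(TransByDate.class))
    .select(CassandraJavaUtil.column("origin"), CassandraJavaUtil.column("tran_date").as("transdate"));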
See the following link for uppercase and lowercase handling in cassandra:
https://docs.datastax.com/en/cql/3.3/cql/cql_reference/ucase-lcase_r.html
POST EDITED - see edit below
I have a question about the Fluent NHibernate automapping used as part of the Sharp Architecture. Running one of the test cases generates a schema which I can use to create tables in my DB.
I'm developing a site with Posts, and Tags associated with these posts. I want a tag to be able to be associated with more than one post, and for each post to have 0 or more tags.
I wanting to achieve a DB schema of:
Post {Id, Title, SubmitTime, Content}
Tag {Id, Name}
PostTag {PostId, TagId}
Instead, I'm getting:
Post {Id, Title, SubmitTime, Content}
Tag {Id, Name, PostID (FK)}
I'm using Sharp Architecture, and my classes look as follows (more or less):
public class Post : Entity
{
[DomainSignature]
private DateTime _submittime;
[DomainSignature]
private String _posttitle;
private IList<Tag> _taglist;
private String _content;
public Post() { }
public Post(String postTitle)
{
_submittime = DateTime.Now;
_posttitle = postTitle;
this._taglist = new List<Tag>();
}
public virtual DateTime SubmitTime { get { return _submittime; } private set { _submittime = value; } }
public virtual string PostTitle { get { return _posttitle; } private set { _posttitle = value; } }
public virtual string Content { get { return _content; } set { _content = value; } }
public virtual IList<Tag> TagList { get { return _taglist; } set { _taglist = value; } }
}
public class Tag : Entity
{
[DomainSignature]
private String _name;
public Tag() { }
public Tag(String name)
{
this._name = name;
}
public virtual String Name
{
get { return _name; }
private set { _name = value; }
}
public virtual void EditTagName(String name)
{
this.Name = name;
}
}
I can see why it's gone for the DB schema it has, since there are times when an object can only exist as part of another. But a Tag can exist separately.
How would I go about achieving this? I'm quite new to MVC, NHibernate, Sharp Architecture, etc., so any help would be much appreciated!
EDIT:
OK, I have now adjusted my classes slightly. My issue was that I was expecting the intermediate table to be inferred. Instead, I realise that I have to create it.
So I now have (I've simplified the classes a bit for readability's sake):
class Post : Entity
{
[DomainSignature]
String Title
[DomainSignature]
DateTime SubmitTime
IList<PostTag> tagList
}
class Tag : Entity
{
[DomainSignature]
string name
}
class PostTag : Entity
{
[DomainSignature]
Post post
[DomainSignature]
Tag tag
}
This gives me the schema for the intermediate entity along with the usual Post and Tag tables:
PostTag{id, name, PostId(FK)}
The problem with the above is that it still does not include the foreign key for Tag. Also, should it really have an Id column, given that it is a join table? I would expect a composite key consisting of the PKs of both the Post and Tag tables.
I'm sure that by adding to the Tag class
IList<PostTag> postList
I will get another FK added to the PostTag schema, but I don't want to add the above, as the postList could be huge. I don't need it every time I bring a post into the system. I would have a separate query to calculate that sort of info.
Can anyone help me solve this last part? Thanks for your time.
OK, I'd been led to believe that modelling the composite class in the domain was the way forward, but I finally came across a bit of automapping override code which creates the composite table without me needing to create a class for it, which was what I was expecting in the first place:
public class PostMappingOverride : IAutoMappingOverride<Post>
{
    public void Override(AutoMapping<Post> map)
{
map.HasManyToMany(e => e.TagList)
.Inverse()
.Cascade.SaveUpdate();
}
}
This now gives me the schema I'm after (the following schema is not simplified):
create table Posts (
Id INT not null,
PublishTime DATETIME null,
SubmitTime DATETIME null,
PostTitle NVARCHAR(255) null,
Content NVARCHAR(255) null,
primary key (Id)
)
create table Posts_Tags (
PostFk INT not null,
TagFk INT not null
)
create table Tags (
Id INT not null,
Name NVARCHAR(255) null,
primary key (Id)
)
alter table Posts_Tags
add constraint FK864F92C27E2C4FCD
foreign key (TagFk)
references Tags
alter table Posts_Tags
add constraint FK864F92C2EC575AE6
foreign key (PostFk)
references Posts
I think what threw me is that I'd been looking for a one-to-many relationship, which is how I had been thinking of it, but here it is called HasManyToMany...