Failed to fetch next results: QUERY_STATE_NEXT failed: transaction ID: XXXXX: no ledger context - hyperledger-fabric

I am trying to fetch data from the blockchain using a rich query in chaincode. I have written around 250,000 records to the blockchain and am trying to fetch them with the query below. When I run the chaincode and check the peer logs, I get the following error.
Chaincode error in the peer logs:
Failed to fetch next results: QUERY_STATE_NEXT failed: transaction ID: XXXXX: no ledger context
Here is my code:
queryStringsa := fmt.Sprintf("{\"selector\":{\"$and\":[{\"savesID\":{\"$ne\":\"%s\"}},{\"bankID\":{\"$eq\":\"%s\"}},{\"ytdSavedFlag\":{\"$ne\":\"%s\"}},{\"saveMonthYear\":{\"$eq\":\"%s\"}}]},\"use_index\":[\"_design/indexSavesDataReportDoc\",\"indexSavesDataReportName\"]}", "null", bankidsave, "Yes", lastImportDatekey)
queryResultss11sa, errsav := getQueryResultForQueryString(stub, queryStringsa)

// getQueryResultForQueryString executes the passed-in rich query and returns
// the results as a JSON array of {Key, Record} objects.
func getQueryResultForQueryString(stub shim.ChaincodeStubInterface, queryString string) ([]byte, error) {
	_scbLogger.Infof("**********************************")
	_scbLogger.Infof("getQueryResultForQueryString queryString : " + queryString)
	_scbLogger.Infof("**********************************")

	resultsIterator, err := stub.GetQueryResult(queryString)
	if err != nil {
		_scbLogger.Error("Error Starting SCB-Efficiency Chaincode is " + err.Error())
		return nil, err
	}
	defer resultsIterator.Close()

	// buffer is a JSON array containing QueryRecords
	var buffer bytes.Buffer
	buffer.WriteString("[")
	bArrayMemberAlreadyWritten := false

	fmt.Println("resultsIterator : ", resultsIterator)
	for resultsIterator.HasNext() {
		queryResponse, err := resultsIterator.Next()
		//fmt.Println("queryResponse inside for next : ", queryResponse)
		if err != nil {
			fmt.Println("error in result iterator : ", err)
			return nil, err
		}
		// Add a comma before array members, suppress it for the first array member
		if bArrayMemberAlreadyWritten {
			buffer.WriteString(",")
		}
		buffer.WriteString("{\"Key\":")
		buffer.WriteString("\"")
		buffer.WriteString(queryResponse.Key)
		buffer.WriteString("\"")
		buffer.WriteString(", \"Record\":")
		// Record is a JSON object, so we write it as-is
		//fmt.Println("string(queryResponse.Value) : ", string(queryResponse.Value))
		buffer.WriteString(string(queryResponse.Value))
		buffer.WriteString("}")
		bArrayMemberAlreadyWritten = true
	}
	buffer.WriteString("]")

	//fmt.Printf("- getQueryResultForQueryString queryResult:\n%s\n", buffer.String())
	return buffer.Bytes(), nil
}
I have 5 sets of different queries in the same function. Sometimes a few of the queries return results, and sometimes none of them do; instead I get the error above.
When I run the same queries in CouchDB Fauxton I get results, and when I run the same function against a smaller number of records all the queries work without any errors.
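One thing that may be worth trying with a result set this large is paging through the query rather than draining a single long-lived iterator, since the failure only shows up at high record counts. The sketch below is only an illustration, not code from the original post: it assumes Fabric 1.3 or later (where the shim exposes GetQueryResultWithPagination), and the helper name fetchPagedResults and the page size of 1000 are arbitrary choices.

func fetchPagedResults(stub shim.ChaincodeStubInterface, queryString string) ([]byte, error) {
	const pageSize = int32(1000) // illustrative; tune to your data volume
	bookmark := ""
	var buffer bytes.Buffer
	buffer.WriteString("[")
	first := true
	for {
		resultsIterator, meta, err := stub.GetQueryResultWithPagination(queryString, pageSize, bookmark)
		if err != nil {
			return nil, err
		}
		for resultsIterator.HasNext() {
			queryResponse, err := resultsIterator.Next()
			if err != nil {
				resultsIterator.Close()
				return nil, err
			}
			if !first {
				buffer.WriteString(",")
			}
			buffer.WriteString("{\"Key\":\"" + queryResponse.Key + "\", \"Record\":")
			buffer.WriteString(string(queryResponse.Value))
			buffer.WriteString("}")
			first = false
		}
		resultsIterator.Close()
		// An empty bookmark (or an empty page) means nothing is left to fetch.
		if meta.FetchedRecordsCount == 0 || meta.Bookmark == "" {
			break
		}
		bookmark = meta.Bookmark
	}
	buffer.WriteString("]")
	return buffer.Bytes(), nil
}

Note that paginated rich queries are only allowed in read-only (query) transactions, so this only applies if the function is invoked as a query.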

Related

Notification feed following Flat feed isn't showing activities

I have a notification feed like NOTIFICATIONS:userID and a flat feed GLOBAL:domain.
The notification feed is set up to follow the flat feed, but when I push activities to the flat feed they do not come through to the notification feed. I can't get them to show up either through the React components or by making the API calls directly. Activities added directly to the notification feed come through fine, just not those from the flat feed.
Is there anything I might have missed when setting up the feeds to make this work? I'm not sure why it isn't working.
Here's the code used to call getstream:
// AddNotification writes a feed notification to the provided feed.
func (c *Client) AddNotification(feedID, actor string, n *feed.Notification) error {
	keys := map[string]bool{}
	feeds := make([]stream.Feed, 0)
	for _, s := range n.Streams {
		if s == feed.STREAM_NONE {
			continue
		}
		if _, ok := keys[s.String()]; ok {
			continue
		}
		f, err := c.getstream.FlatFeed(s.String(), feedID)
		if err != nil {
			return errors.Wrapf(err, "failed to get feed %s", feedID)
		}
		keys[s.String()] = true
		feeds = append(feeds, f)
	}

	extra, err := getExtraFromString(n.Content)
	if err != nil {
		return errors.Wrap(err, "failed to marshal extra content")
	}

	appliesAt, err := time.FromProtoTS(n.GetAppliesAt())
	if err != nil {
		return errors.Wrap(err, "failed to cast applies at time")
	}

	activity := stream.Activity{
		Actor:     actor,
		Verb:      n.GetVerb(),
		Object:    n.GetObject(),
		Extra:     extra,
		ForeignID: n.GetIdempotentKey(),
		Time:      stream.Time{Time: appliesAt},
	}

	log.WithFields(log.Fields{
		"activity": activity,
		"feeds":    keys,
	}).Debug("sending request to stream.io")

	if err = c.getstream.AddToMany(activity, feeds...); err != nil {
		return errors.Wrap(err, "error while feeding to stream.io")
	}
	return nil
}
Just to explain the code a bit: we have a feed.Notification type that lets you specify what we've called "streams", which are just types that represent the feed slugs.
In this case, I'm using the GLOBAL:domain feed, which the user's NOTIFICATIONS:userID feed is set up to follow.
From the batch add docs:
Activities added using this method are not propagated to followers. That is, any other Feeds that follow the Feed(s) listed in the API call will not receive the new Activity.
So if you're using batching (AddToMany), you need to list every feed you want the activity to appear in explicitly. The alternative is to add the activity to the feeds one at a time, so the follow relationships push it to followers.
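A minimal sketch of that non-batched path, assuming a stream-go2 style client like the one in the question (the AddActivity call and the abbreviated error handling are mine, not from the original code). Adding the activity to the GLOBAL:domain feed directly lets Stream fan it out to the notification feeds that follow it:

// Add to the followed flat feed directly so Stream's normal fan-out
// delivers the activity to feeds that follow it (e.g. NOTIFICATIONS:userID).
global, err := c.getstream.FlatFeed("GLOBAL", "domain")
if err != nil {
	return errors.Wrap(err, "failed to get GLOBAL feed")
}
if _, err := global.AddActivity(activity); err != nil {
	return errors.Wrap(err, "error while adding activity to GLOBAL feed")
}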

Reading Redis key-value which is JSON string using redigo

I am trying to read a Redis key-value pair in Go. The key is a string and the value is a JSON string. E.g. key:
discov_32161296
and value (a JSON string):
"{\"10283\":true,\"11064\":true,\"15123\":true,\"15447\":true,\"15926\":true,\"16530\":true,\"16537\":true,\"16799\":true,\"17088\":true,\"17249\":true,\"18501\":true,\"18529\":true,\"18601\":true,\"3044\":true,\"3687\":true,\"4926\":true,\"5483\":true,\"6\":true,\"6675\":true,\"8332\":true,\"8336\":true,\"8674\":true}"
I am getting the error below while reading it in Go:
redis.Values err redigo: unexpected type for Values, got type []uint8
Here's my code:
uIDDiscoveryOffer := fmt.Sprintf("%s_%d", "discov", uid)
opDataStr, err := redis.String(redis.Values(con.Do("GET", uIDDiscoveryOffer)))
if err != nil || err != redis.ErrNil {
	utils.Log1("readCacheTxnByUID-Disc-redis.Values-err", fmt.Sprint("redis.Values err : ", uIDDiscoveryOffer, " error: ", err.Error()))
} else {
	// Some logic
}
Redis GET returns the value of a single key. redis.Values() is for converting the result of a command that returns multiple items.
Since GET returns a single item, use redis.String() on its own; you don't need redis.Values() here:
opDataStr, err := redis.String(con.Do("GET", uIDDiscoveryOffer))
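Putting it together, a minimal sketch using the names from the question; the map[string]bool type is an assumption based on the sample value above, and encoding/json is assumed for decoding:

opDataStr, err := redis.String(con.Do("GET", uIDDiscoveryOffer))
if err == redis.ErrNil {
	// key does not exist
} else if err != nil {
	// a real error talking to Redis
} else {
	// the value is a JSON object of "id" -> bool, per the sample above
	offers := map[string]bool{}
	if err := json.Unmarshal([]byte(opDataStr), &offers); err != nil {
		// malformed JSON value
	}
	// ... some logic with offers
}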

Hyperledger Fabric golang chaincode not working as expected: stores data on the ledger manually but not when trying to store via a function call

I am trying to store fund transfer records on Hyperledger Fabric. I have written the chaincode in Go. It works fine when I add data in the initLedger function, but when I call it from another function like createTransfer (I will provide both pieces of code) it shows a successful transaction, yet when I retrieve the chain data the record does not appear.
The Transfer struct:
type Transfer struct {
	TransferID  string `json:"transferID"`
	FromAccount string `json:"fromAccount"`
	ToAccount   string `json:"toAccount"`
	Amount      string `json:"amount"`
}
This function writes data to the ledger; it works fine when I call it directly from the initLedger method:
func writeTransferToLedger(APIStub shim.ChaincodeStubInterface, transfers []Transfer) sc.Response {
	for i := 0; i < len(transfers); i++ {
		key := transfers[i].TransferID
		chkBytes, _ := APIStub.GetState(key)
		if chkBytes == nil {
			asBytes, _ := json.Marshal(transfers[i])
			err := APIStub.PutState(transfers[i].TransferID, asBytes)
			if err != nil {
				return shim.Error(err.Error())
			}
		} else {
			msg := "Transfer already exist" + key + " Failure---------------"
			return shim.Error(msg)
		}
	}
	return shim.Success([]byte("Write to Ledger"))
}
Calling the writeTransferToLedger method within the createTransfer function:
func (s *SmartContract) createTransfer(APIStub shim.ChaincodeStubInterface, args []string) sc.Response {
	if len(args) != 4 {
		return shim.Error("Incorrect Number of arguments for transfer func, Expecting 4")
	}
	transfers := []Transfer{{TransferID: args[0], FromAccount: args[1], ToAccount: args[2], Amount: args[3]}}
	writeTransferToLedger(APIStub, transfers)
	return shim.Success([]byte("stored:" + args[0] + args[1] + args[2] + args[3]))
}
When I call createTransfer from the Node SDK it executes successfully, but when I retrieve the data from the chaincode nothing is returned.
I expect it to work through createTransfer just as it does when initLedger calls writeTransferToLedger.
Inside the initLedger method I create the Transfer slice with the data below and call writeTransferToLedger:
transfer := []Transfer{
	{TransferID: "1233", FromAccount: "US_John_Doe_123", ToAccount: "UK_Alice_456", Amount: "200"},
	{TransferID: "231", FromAccount: "JPY_Alice_456", ToAccount: "UK_John_Doe", Amount: "3000"},
}
writeTransferToLedger(APIstub, transfer)
Thanks for your help. I have resolved the issue: I was calling the invoke function when trying to retrieve data from the ledger.
I have to query the ledger instead to get the transfer data.
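For reference, a minimal query-side sketch against the same struct and shim API as above; the function name queryTransfer and its one-argument contract are assumptions for illustration, not part of the original chaincode:

// queryTransfer returns the Transfer stored under the given TransferID, if any.
func (s *SmartContract) queryTransfer(APIStub shim.ChaincodeStubInterface, args []string) sc.Response {
	if len(args) != 1 {
		return shim.Error("Incorrect number of arguments, expecting 1 (TransferID)")
	}
	asBytes, err := APIStub.GetState(args[0])
	if err != nil {
		return shim.Error(err.Error())
	}
	if asBytes == nil {
		return shim.Error("Transfer not found: " + args[0])
	}
	return shim.Success(asBytes)
}

On the SDK side this should be sent as a query (evaluate) rather than as an invoke/submit, which was the original problem.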

Insert an mgo query []bson.M result into a file.txt as a string

I have to write the result of an mgo (MongoDB) aggregation query, converted in Go, into a file in order to get the ids of images.
var path = "/home/Medo/text.txt"

pipe := cc.Pipe([]bson.M{
	{"$unwind": "$images"},
	{"$group": bson.M{"_id": "null", "images": bson.M{"$push": "$images"}}},
	{"$project": bson.M{"_id": 0}},
})
response := []bson.M{}
errResponse := pipe.All(&response)
if errResponse != nil {
	fmt.Println("error Response: ", errResponse)
}
fmt.Println(response) // to print for making sure that it is working

data, err := bson.Marshal(&response)
if err != nil {
	fmt.Println("error insertion ", err)
}
s := string(data)
Below is the part where I create the file and write to it.
The problem is that when I get the query result into the text file, each value has a sequence number appended to it, for example:
id of images
23456678`0`
24578689`1`
23678654`2`
12890762`3`
76543890`4`
64744848`5`
So each value has a number appended at the end, and I can't figure out why. After getting the response from the query I convert the BSON to []byte and then to string, but I keep getting those sequence numbers at the end of each result.
I'd like to drop those 0 1 2 3 4 5.
var _, errExistFile = os.Stat(path)
if os.IsNotExist(errExistFile) {
	var file, errCreateFile = os.Create(path)
	if isError(errCreateFile) {
		return
	}
	defer file.Close()
}
fmt.Println("==> done creating file", path)

var file, errii = os.OpenFile(path, os.O_RDWR, 0644)
if isError(errii) {
	return
}
defer file.Close()

// write some text line-by-line to file
_, erri := file.WriteString(s)
if isError(erri) {
	return
}
erri = file.Sync()
if isError(erri) {
	return
}
fmt.Println("==> done writing to file")
You could declare a simple struct, e.g.:
type simple struct {
	ID    idtype `bson:"_id"` // idtype: whatever type your _id field actually uses
	Image int    `bson:"images"`
}
The function to put the image ids into the file would then be something like:
// open the file as before, then:
result := simple{}
iter := collection.Find(nil).Iter()
for iter.Next(&result) {
	file.WriteString(fmt.Sprintf("%d\n", result.Image))
}
iter.Close()
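A slightly more complete sketch of the same idea, assuming the mgo driver from the question; the bson.ObjectId type for _id, the dumpImageIDs name, and writing one id per line are illustrative assumptions rather than the original code:

type imageDoc struct {
	ID    bson.ObjectId `bson:"_id"`
	Image int           `bson:"images"`
}

// dumpImageIDs writes one image id per line, avoiding bson.Marshal on the
// whole []bson.M slice (which is what produces the 0, 1, 2, ... keys).
func dumpImageIDs(cc *mgo.Collection, path string) error {
	file, err := os.Create(path)
	if err != nil {
		return err
	}
	defer file.Close()

	var result imageDoc
	iter := cc.Find(nil).Iter()
	for iter.Next(&result) {
		if _, err := fmt.Fprintf(file, "%d\n", result.Image); err != nil {
			return err
		}
	}
	return iter.Close()
}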

How to search for a string in indexed Elasticsearch documents in golang?

I am writing a function in Go to search for a string in Elasticsearch documents which are indexed. I am using the Elasticsearch Go client elastic. For example, consider the object is a tweet:
type Tweet struct {
	User     string
	Message  string
	Retweets int
}
And the search function is
func SearchProject() error {
	// Search with a term query
	termQuery := elastic.NewTermQuery("user", "olivere")
	searchResult, err := client.Search().
		Index("twitter").   // search in index "twitter"
		Query(&termQuery).  // specify the query
		Sort("user", true). // sort by "user" field, ascending
		From(0).Size(10).   // take documents 0-9
		Pretty(true).       // pretty print request and response JSON
		Do()                // execute
	if err != nil {
		// Handle error
		panic(err)
		return err
	}

	// searchResult is of type SearchResult and returns hits, suggestions,
	// and all kinds of other information from Elasticsearch.
	fmt.Printf("Query took %d milliseconds\n", searchResult.TookInMillis)

	// Each is a convenience function that iterates over hits in a search result.
	// It makes sure you don't need to check for nil values in the response.
	// However, it ignores errors in serialization. If you want full control
	// over iterating the hits, see below.
	var ttyp Tweet
	for _, item := range searchResult.Each(reflect.TypeOf(ttyp)) {
		t := item.(Tweet)
		fmt.Printf("Tweet by %s: %s\n", t.User, t.Message)
	}

	// TotalHits is another convenience function that works even when something goes wrong.
	fmt.Printf("Found a total of %d tweets\n", searchResult.TotalHits())

	// Here's how you iterate through results with full control over each step.
	if searchResult.Hits != nil {
		fmt.Printf("Found a total of %d tweets\n", searchResult.Hits.TotalHits)
		// Iterate through results
		for _, hit := range searchResult.Hits.Hits {
			// hit.Index contains the name of the index

			// Deserialize hit.Source into a Tweet (could also be just a map[string]interface{}).
			var t Tweet
			err := json.Unmarshal(*hit.Source, &t)
			if err != nil {
				// Deserialization failed
			}

			// Work with tweet
			fmt.Printf("Tweet by %s: %s\n", t.User, t.Message)
		}
	} else {
		// No hits
		fmt.Print("Found no tweets\n")
	}
	return nil
}
This search prints tweets by the user 'olivere'. But if I search for 'olive', the search does not work. How do I search for a string which is part of User/Message/Retweets?
And the indexing function looks like this:
func IndexProject(p *objects.ElasticProject) error {
	// Index a tweet (using JSON serialization)
	tweet1 := `{"user" : "olivere", "message" : "It's a Raggy Waltz"}`
	put1, err := client.Index().
		Index("twitter").
		Type("tweet").
		Id("1").
		BodyJson(tweet1).
		Do()
	if err != nil {
		// Handle error
		panic(err)
		return err
	}
	fmt.Printf("Indexed tweet %s to index %s, type %s\n", put1.Id, put1.Index, put1.Type)
	return nil
}
Output:
Indexed tweet 1 to index twitter, type tweet
Got document 1 in version 1 from index twitter, type tweet
Query took 4 milliseconds
Tweet by olivere: It's a Raggy Waltz
Found a total of 1 tweets
Found a total of 1 tweets
Tweet by olivere: It's a Raggy Waltz
Version
Go 1.4.2
Elasticsearch-1.4.4
Elasticsearch Go Library
github.com/olivere/elastic
Could anyone help me with this? Thank you.
How you search and find data depends on your analyser - from your code it's likely that the standard analyser is being used (i.e. you haven't specified an alternative in your mapping).
The Standard Analyser will only index complete words. So to match "olive" against "olivere" you could either:
Change the search process:
e.g. switch from a term query to a Prefix query, or use a Query String query with a wildcard.
Change the index process:
If you want to find strings within larger strings, look at using nGrams or Edge nGrams in your analyser.
For example, a multi-match query with phrase_prefix matching across several fields:
multiQuery := elastic.NewMultiMatchQuery(
	term,
	"name", "address", "location", "email", "phone_number", "place", "postcode",
).Type("phrase_prefix")
