Do concurrent query and insert have any side effects in Android with ObjectBox? - multithreading

In my Android project I use ObjectBox as the database. If I insert with a lock but query without a lock, are there any side effects, such as crashes?
fun query(uniqueId: String = ""): MutableList<T> {
    if (box.store.isClosed) return mutableListOf()
    val query = box.query()
    withQueryBuilder(query, uniqueId)
    // start
    return query.build().find()
}
private fun putInner(entity: T): Long {
    synchronized(box.store) {
        if (box.store.isClosed) return -1
        if (entity.unique.isBlank()) {
            entity.unique = entity.providerUnique()
        }
        entity.timestamp = System.currentTimeMillis()
        return try {
            box.put(entity).let { id -> entity.id = id }
            entity.id
        } catch (ex: Exception) {
            -1
        }
    }
}
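For context, the same read and write could presumably also be expressed with ObjectBox's own transaction helpers (callInReadTx / callInTx on BoxStore) instead of an external synchronized block; a minimal, untested sketch, with the generic Box passed in explicitly:

// Sketch only: the same operations via ObjectBox's BoxStore transaction helpers.
// `box` is the same io.objectbox.Box<T> used in the snippets above.
fun <T> readAllInTx(box: io.objectbox.Box<T>): MutableList<T> =
    box.store.callInReadTx { box.query().build().find() }

fun <T> putInTx(box: io.objectbox.Box<T>, entity: T): Long =
    box.store.callInTx { box.put(entity) }

Whether this makes the explicit lock in putInner() unnecessary is exactly what the question is asking.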

Related

How do I get a function in another thread to start the next function in the main thread after it is done?

Here is what I'm trying to do:
A Switch is turned on, starting a service in another thread (works fine so far).
When this service is successful, it should then start another function within the main thread.
I don't mind whether the function is called directly by the service, or whether the service returns a "success" value to the main thread, which then starts the next function from there.
Here is what the important parts of the code look like:
Main thread:
class SendNotif : AppCompatActivity() {
    val context = this
    private lateinit var Switch: Switch

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Switch is initialized from the layout here (omitted)

        // Start LocationService when the switch is on
        Switch.setOnCheckedChangeListener { buttonView, isChecked ->
            if (isChecked) {
                Toast.makeText(context, "Starting LocationService", Toast.LENGTH_SHORT).show()
                Intent(applicationContext, LocationService::class.java).apply {
                    action = LocationService.ACTION_START
                    startService(this)
                }
            } else {
                Toast.makeText(context, "Stopping LocationService", Toast.LENGTH_SHORT).show()
                Intent(applicationContext, LocationService::class.java).apply {
                    action = LocationService.ACTION_STOP
                    startService(this)
                }
            }
        }
    }

    fun InitiateMessage() {
        // This is the function that is supposed to start after the LocationService
    }
}
This is the LocationService. After it succeeds, the function InitiateMessage() should start.
class LocationService : Service() {
    private val serviceScope = CoroutineScope(SupervisorJob() + Dispatchers.IO)
    private lateinit var locationClient: LocationClient
    var lat = 0.0F
    var long = 0.0F

    override fun onBind(p0: Intent?): IBinder? {
        return null
    }

    override fun onCreate() {
        super.onCreate()
        locationClient = DefaultLocationClient(
            applicationContext,
            LocationServices.getFusedLocationProviderClient(applicationContext)
        )
    }

    // Start or stop the service
    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        when (intent?.action) {
            ACTION_START -> start()
            ACTION_STOP -> stop()
        }
        return super.onStartCommand(intent, flags, startId)
    }

    private fun start() {
        // Starting notification
        val notification = NotificationCompat.Builder(this, "location")
            .setContentTitle("Tracking location...")
            .setContentText("Location: null")
            .setSmallIcon(R.drawable.ic_launcher_background)
            // Can't swipe this notification away
            .setOngoing(true)
        val notificationManager = getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
        // Starting the location updates
        locationClient
            // Every 10 seconds
            .getLocationUpdates(10000L)
            .catch { e -> e.printStackTrace() }
            .onEach { location ->
                lat = location.latitude.toString().toFloat() // .takeLast(3) // taking only the last 3 digits
                long = location.longitude.toString().toFloat() // .takeLast(3)
                val updatedNotification = notification.setContentText(
                    "Location: ($lat, $long)"
                )
                // notificationManager.notify(1, updatedNotification.build())
                // Geofence
                MyGeofence(lat, long)
            }
            .launchIn(serviceScope)
        // startForeground(1, notification.build())
    }

    private fun stop() {
        // Stopping the notification
        stopForeground(true)
        // Stopping the location service
        stopSelf()
    }

    override fun onDestroy() {
        super.onDestroy()
        serviceScope.cancel()
    }

    companion object {
        const val ACTION_START = "ACTION_START"
        const val ACTION_STOP = "ACTION_STOP"
    }

    fun MyGeofence(lat: Float, long: Float) {
        val context = this
        var db = DataBaseHandler(context)
        var data = db.readData()
        // Setting the accuracy of the geofence
        val acc = 2
        val safelat: Double = data.get(0).LocLat.toFloat().round(acc)
        val safelong = data.get(0).LocLong.toFloat().round(acc) // .take(acc).take(acc)
        val h = Handler(context.mainLooper)
        if (safelat == lat.toFloat().round(acc) && safelong == long.toFloat().round(acc)) {
            h.post(Runnable { Toast.makeText(context, "You have reached your safe refuge! " + lat.toFloat().round(acc) + " " + long.toFloat().round(acc), Toast.LENGTH_LONG).show() })
            // ToDo: Right hereafter the function InitiateMessage() should start
        } else {
            h.post(Runnable { Toast.makeText(context, "You are still in great danger! " + lat.toFloat().round(acc) + " " + long.toFloat().round(acc), Toast.LENGTH_LONG).show() })
        }
    }

    fun Float.round(decimals: Int): Double {
        var multiplier = 1.0
        repeat(decimals) { multiplier *= 10 }
        return round(this * multiplier) / multiplier
    }
}
So far, I tried it with a Looper, which did not work.
java.lang.RuntimeException: Can't create handler inside thread Thread[DefaultDispatcher-worker-1,5,main] that has not called Looper.prepare()
But I guess the far easier way would be a value returned by the service. How do I implement this, and how do I start the next function from that returned value?
I solved my problem with an observer and a companion object holding a MutableLiveData.
The companion object is placed inside the main-thread class (the Activity):
companion object {
    // var iamsafe: Boolean = false
    val iamsafe: MutableLiveData<Boolean> by lazy {
        MutableLiveData<Boolean>()
    }
}
The Observer is registered within onCreate:
val safeObserver = Observer<Boolean> { newState ->
    Toast.makeText(context, "Initiating message to my mate.", Toast.LENGTH_SHORT).show()
    InitiateMessage()
}
iamsafe.observe(this, safeObserver)
The companion object's value is changed from the second thread like this:
SendNotif.iamsafe.postValue(true)
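For completeness, a sketch of where that post goes on the service side, at the ToDo marked inside MyGeofence() above (Toast text shortened here):

// Inside LocationService.MyGeofence(), at the ToDo above:
if (safelat == lat.toFloat().round(acc) && safelong == long.toFloat().round(acc)) {
    h.post { Toast.makeText(context, "You have reached your safe refuge!", Toast.LENGTH_LONG).show() }
    // Signals the Observer registered in SendNotif.onCreate(), which then
    // calls InitiateMessage() on the main thread.
    SendNotif.iamsafe.postValue(true)
}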

Loading indicator does not hide if the API fails to retrieve data, although it hides if the API succeeds, in Android Paging library

I have a remote server from which I want to fetch 20 items (Job) per API call and show them in a RecyclerView using the Paging library.
For that, I want to show a loading indicator during the first API call, while the list of items is being fetched from the server. Everything is okay if the data is fetched successfully: the loading indicator becomes invisible when the data has loaded. The code is given below.
JobService.kt
@GET(Constants.API_JOB_LIST)
fun getJobPost(
    @Query("page") pageNumber: Int
): Observable<Response<JobResponse>>
JobResponse.kt
data class JobResponse(
    @SerializedName("status") val status: Int? = null,
    @SerializedName("message") val message: Any? = null,
    @SerializedName("data") val jobData: JobData? = null
)
JobData.kt
data class JobData(
    @SerializedName("jobs") val jobs: List<Job?>? = null,
    @SerializedName("total") val totalJob: Int? = null,
    @SerializedName("page") val currentPage: Int? = null,
    @SerializedName("showing") val currentlyShowing: Int? = null,
    @SerializedName("has_more") val hasMore: Boolean? = null
)
NetworkState.kt
sealed class NetworkState {
    data class Progress(val isLoading: Boolean) : NetworkState()
    data class Failure(val errorMessage: String?) : NetworkState()

    companion object {
        fun loading(isLoading: Boolean): NetworkState = Progress(isLoading)
        fun failure(errorMessage: String?): NetworkState = Failure(errorMessage)
    }
}
Event.kt
open class Event<out T>(private val content: T) {
    private var hasBeenHandled = false

    fun getContentIfNotHandled() = if (hasBeenHandled) {
        null
    } else {
        hasBeenHandled = true
        content
    }

    fun peekContent() = content
}
JobDataSource.kt
class JobDataSource(
    private val jobService: JobService,
    private val compositeDisposable: CompositeDisposable
) : PageKeyedDataSource<Int, Job>() {

    val paginationState: MutableLiveData<Event<NetworkState>> = MutableLiveData()
    val initialLoadingState: MutableLiveData<Event<NetworkState>> = MutableLiveData()
    val totalJob: MutableLiveData<Event<Int>> = MutableLiveData()

    companion object {
        private const val FIRST_PAGE = 1
    }

    override fun loadInitial(params: LoadInitialParams<Int>, callback: LoadInitialCallback<Int, Job>) {
        compositeDisposable += jobService.getJobPost(FIRST_PAGE)
            .performOnBackgroundOutputOnMain()
            .doOnSubscribe { initialLoadingState.postValue(Event(loading(true))) }
            .doOnTerminate { initialLoadingState.postValue(Event(loading(false))) }
            .subscribe({
                if (it.isSuccessful) {
                    val jobData = it.body()?.jobData
                    totalJob.postValue(Event(jobData?.totalJob!!))
                    jobData.jobs?.let { jobs -> callback.onResult(jobs, null, FIRST_PAGE + 1) }
                } else {
                    val error = Gson().fromJson(it.errorBody()?.charStream(), ApiError::class.java)
                    when (it.code()) {
                        CUSTOM_STATUS_CODE -> initialLoadingState.postValue(Event(failure(error.message!!)))
                        else -> initialLoadingState.postValue(Event(failure("Something went wrong")))
                    }
                }
            }, {
                if (it is IOException) {
                    initialLoadingState.postValue(Event(failure("Check Internet Connectivity")))
                } else {
                    initialLoadingState.postValue(Event(failure("Json Parsing error")))
                }
            })
    }

    override fun loadAfter(params: LoadParams<Int>, callback: LoadCallback<Int, Job>) {
        compositeDisposable += jobService.getJobPost(params.key)
            .performOnBackgroundOutputOnMain()
            .doOnSubscribe { if (params.key != 2) paginationState.postValue(Event(loading(true))) }
            .doOnTerminate { paginationState.postValue(Event(loading(false))) }
            .subscribe({
                if (it.isSuccessful) {
                    val jobData = it.body()?.jobData
                    totalJob.postValue(Event(jobData?.totalJob!!))
                    jobData.jobs?.let { jobs -> callback.onResult(jobs, if (jobData.hasMore!!) params.key + 1 else null) }
                } else {
                    val error = Gson().fromJson(it.errorBody()?.charStream(), ApiError::class.java)
                    when (it.code()) {
                        CUSTOM_STATUS_CODE -> initialLoadingState.postValue(Event(failure(error.message!!)))
                        else -> initialLoadingState.postValue(Event(failure("Something went wrong")))
                    }
                }
            }, {
                if (it is IOException) {
                    paginationState.postValue(Event(failure("Check Internet Connectivity")))
                } else {
                    paginationState.postValue(Event(failure("Json Parsing error")))
                }
            })
    }

    override fun loadBefore(params: LoadParams<Int>, callback: LoadCallback<Int, Job>) {}
}
JobDataSourceFactory.kt
class JobDataSourceFactory(
    private val jobService: JobService,
    private val compositeDisposable: CompositeDisposable
) : DataSource.Factory<Int, Job>() {

    val jobDataSourceLiveData = MutableLiveData<JobDataSource>()

    override fun create(): DataSource<Int, Job> {
        val jobDataSource = JobDataSource(jobService, compositeDisposable)
        jobDataSourceLiveData.postValue(jobDataSource)
        return jobDataSource
    }
}
JobBoardViewModel.kt
class JobBoardViewModel(
    private val jobService: JobService
) : BaseViewModel() {

    companion object {
        private const val PAGE_SIZE = 20
        private const val PREFETCH_DISTANCE = 20
    }

    private val jobDataSourceFactory: JobDataSourceFactory = JobDataSourceFactory(jobService, compositeDisposable)
    var jobList: LiveData<PagedList<Job>>

    init {
        val config = PagedList.Config.Builder()
            .setPageSize(PAGE_SIZE)
            .setInitialLoadSizeHint(PAGE_SIZE)
            .setPrefetchDistance(PREFETCH_DISTANCE)
            .setEnablePlaceholders(false)
            .build()
        jobList = LivePagedListBuilder(jobDataSourceFactory, config).build()
    }

    fun getPaginationState(): LiveData<Event<NetworkState>> = Transformations.switchMap<JobDataSource, Event<NetworkState>>(
        jobDataSourceFactory.jobDataSourceLiveData,
        JobDataSource::paginationState
    )

    fun getInitialLoadingState(): LiveData<Event<NetworkState>> = Transformations.switchMap<JobDataSource, Event<NetworkState>>(
        jobDataSourceFactory.jobDataSourceLiveData,
        JobDataSource::initialLoadingState
    )

    fun getTotalJob(): LiveData<Event<Int>> = Transformations.switchMap<JobDataSource, Event<Int>>(
        jobDataSourceFactory.jobDataSourceLiveData,
        JobDataSource::totalJob
    )
}
JobBoardFragment.kt
class JobBoardFragment : BaseFragment() {

    private val viewModel: JobBoardViewModel by lazy {
        getViewModel { JobBoardViewModel(ApiFactory.jobListApi) }
    }

    private val jobAdapter by lazy {
        JobAdapter {
            val bundle = Bundle()
            bundle.putInt(CLICKED_JOB_ID, it.jobId!!)
            navigateTo(R.id.jobBoard_to_jobView, R.id.home_navigation_fragment, bundle)
        }
    }

    override fun getLayoutResId() = R.layout.fragment_job_board

    override fun initWidget() {
        job_list_recycler_view.adapter = jobAdapter
        back_to_main_image_view.setOnClickListener { onBackPressed() }
    }

    override fun observeLiveData() {
        with(viewModel) {
            jobList.observe(this@JobBoardFragment, Observer {
                jobAdapter.submitList(it)
            })
            getInitialLoadingState().observe(this@JobBoardFragment, Observer {
                it.getContentIfNotHandled()?.let { state ->
                    when (state) {
                        is Progress -> {
                            if (state == loading(true)) {
                                network_loading_indicator.visible()
                            } else {
                                network_loading_indicator.visibilityGone()
                            }
                        }
                        is Failure -> context?.showToast(state.errorMessage.toString())
                    }
                }
            })
            getPaginationState().observe(this@JobBoardFragment, Observer {
                it.getContentIfNotHandled()?.let { state ->
                    when (state) {
                        is Progress -> {
                            if (state == loading(true)) {
                                pagination_loading_indicator.visible()
                            } else {
                                pagination_loading_indicator.visibilityGone()
                            }
                        }
                        is Failure -> context?.showToast(state.errorMessage.toString())
                    }
                }
            })
            getTotalJob().observe(this@JobBoardFragment, Observer {
                it.getContentIfNotHandled()?.let { state ->
                    job_board_text_view.visible()
                    with(profile_completed_image_view) {
                        visible()
                        text = state.toString()
                    }
                }
            })
        }
    }
}
But the problem is: if data fetching fails due to internet connectivity or any other server-related problem, the loading indicator does not become invisible; it keeps spinning even though I set the loading status to false, and the error message is shown. It seems .doOnTerminate { initialLoadingState.postValue(Event(loading(false))) } is not called if an error occurs. This is the first problem. Another problem is that loadInitial() and loadAfter() are called simultaneously on the first call, but I want only loadInitial() to be called at the beginning; loadAfter() should be called only after scrolling.
Try replacing all your LiveData postValue() calls with setValue(), or simply .value =.
The problem is that postValue() is meant for updating the value from a background thread so that observers on the main thread see it. In this case you are always changing the values from the main thread itself, so you should use .value = instead.
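As a concrete illustration of that suggestion (a sketch only, using initialLoadingState from the JobDataSource above, not tested against this code base):

// postValue() schedules the update to be applied on the main thread later:
initialLoadingState.postValue(Event(NetworkState.loading(false)))

// setValue() / .value = applies it immediately, but may only be called from the main thread:
initialLoadingState.value = Event(NetworkState.loading(false))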
Hope it's not too late.

How to access data from the database in Groovy using filters

Here I try to get the data from the uploadcdr table, but I cannot understand how the filters work. Please explain.
def private getFilteredUploadCDR(filters, GrailsParameterMap params) {
    params.max = params?.max?.toInteger() ?: pagination.max
    params.offset = params?.offset?.toInteger() ?: pagination.offset
    params.sort = params?.sort ?: pagination.sort
    params.order = params?.order ?: pagination.order
    return UploadCDRFileDTO.createCriteria().list(
        max: params.max,
        offset: params.offset
    ) {
        and {
            filters.each { filter ->
                log.debug("filter field ${filter.field}")
                if (filter.value) {
                    addToCriteria(filter.getRestrictions())
                }
            }
        }
        // apply sorting
        SortableCriteria.sort(params, delegate)
    }
}

C# PowerShell output reader iterator getting modified when pipeline closed and disposed

I'm calling a PowerShell script from C#. The script is pretty small and is "gps;$host.SetShouldExit(9)", which lists processes and then sends back an exit code to be captured by the PSHost object.
The problem I have is that when the pipeline has been stopped and disposed, the output reader PSHost collection still seems to be written to and keeps filling up. So when I try to copy it to my own output object, it craps out with an OutOfMemoryException when I iterate over it. Sometimes it throws a "Collection was modified" exception instead. Here is the code.
private void ProcessAndExecuteBlock(ScriptBlock Block)
{
    Collection<PSObject> PSCollection = new Collection<PSObject>();
    Collection<Object> PSErrorCollection = new Collection<Object>();
    Boolean Error = false;
    int ExitCode = 0;
    // Send for execution.
    ExecuteScript(Block.Script);
    // Process the wait handles.
    while (PExecutor.PLine.PipelineStateInfo.State == PipelineState.Running)
    {
        // Wait for either the error or the data wait handle.
        switch (WaitHandle.WaitAny(PExecutor.Hand))
        {
            // Data
            case 0:
                Collection<PSObject> data = PExecutor.PLine.Output.NonBlockingRead();
                if (data.Count > 0)
                {
                    for (int cnt = 0; cnt <= (data.Count - 1); cnt++)
                    {
                        PSCollection.Add(data[cnt]);
                    }
                }
                // Check to see if the pipeline has been closed.
                if (PExecutor.PLine.Output.EndOfPipeline)
                {
                    // Bring back the exit code.
                    ExitCode = RHost.ExitCode;
                }
                break;
            case 1:
                Collection<object> Errordata = PExecutor.PLine.Error.NonBlockingRead();
                if (Errordata.Count > 0)
                {
                    Error = true;
                    for (int count = 0; count <= (Errordata.Count - 1); count++)
                    {
                        PSErrorCollection.Add(Errordata[count]);
                    }
                }
                break;
        }
    }
    PExecutor.Stop();
    // Create the execution return block.
    ExecutionResults ER = new ExecutionResults(Block.RuleGuid, Block.SubRuleGuid, Block.MessageIdentfier);
    ER.ExitCode = ExitCode;
    // Add in the data results.
    lock (ReadSync)
    {
        if (PSCollection.Count > 0)
        {
            ER.DataAdd(PSCollection);
        }
    }
    // Add in the error data, if any.
    if (Error)
    {
        if (PSErrorCollection.Count > 0)
        {
            ER.ErrorAdd(PSErrorCollection);
        }
        else
        {
            ER.InError = true;
        }
    }
    // We have finished, so enqueue the block back.
    EnQueueOutput(ER);
}
And this is the PipelineExecutor class, which sets up the pipeline for execution.
public class PipelineExecutor
{
    private Pipeline pipeline;
    private WaitHandle[] Handles;

    public Pipeline PLine
    {
        get { return pipeline; }
    }

    public WaitHandle[] Hand
    {
        get { return Handles; }
    }

    public PipelineExecutor(Runspace runSpace, string command)
    {
        pipeline = runSpace.CreatePipeline(command);
        Handles = new WaitHandle[2];
        Handles[0] = pipeline.Output.WaitHandle;
        Handles[1] = pipeline.Error.WaitHandle;
    }

    public void Start()
    {
        if (pipeline.PipelineStateInfo.State == PipelineState.NotStarted)
        {
            pipeline.Input.Close();
            pipeline.InvokeAsync();
        }
    }

    public void Stop()
    {
        pipeline.StopAsync();
    }
}
And this is the DataAdd method, where the exception arises.
public void DataAdd(Collection<PSObject> Data)
{
    foreach (PSObject Ps in Data)
    {
        // Note: this adds back into the same collection that is being
        // iterated ("Data"), which is the aliasing bug identified in the
        // resolution below.
        Data.Add(Ps);
    }
}
I put a for loop around the Data.Add, and the collection filled up with 600k+ items, so it feels like the gps command is still running, but why? Any ideas?
Thanks in advance.
Found the problem. I named the resultant collection and the iterator the same, so as it was iterating it was adding to the collection, and back into the iterator, and so forth. Doh!

Groovy multithreading

I'm a newbie to Groovy/Grails.
How do I implement threading for this code? I have 2500 URLs, and checking each one was taking hours,
so I decided to implement multi-threading for this.
Here is my sample code:
def urls = [
    "http://www.wordpress.com",
    "http://67.192.103.225/QRA.Public/",
    "http://www.subaru.com",
    "http://baldwinfilter.com/products/start.html"
]
def up = urls.collect { ur ->
    try {
        def url = new URL(ur)
        def connection = url.openConnection()
        if (connection.responseCode == 200) {
            return true
        } else {
            return false
        }
    } catch (Exception e) {
        return false
    }
}
For this code I need to implement multi-threading. Could anyone please suggest the code?
Thanks in advance,
sri.
I would take a look at the Groovy Parallel Systems library. In particular I think that the Parallel collections section would be useful.
Looking at the docs, I believe that collectParallel is a direct drop-in replacement for collect (bearing in mind the obvious caveats about side-effects). The following works fine for me:
def urls = [
    "http://www.wordpress.com",
    "http://www.subaru.com",
    "http://baldwinfilter.com/products/start.html"
]
Parallelizer.doParallel {
    def up = urls.collectParallel { ur ->
        try {
            def url = new URL(ur)
            def connection = url.openConnection()
            if (connection.responseCode == 200) {
                return true
            } else {
                return false
            }
        } catch (Exception e) {
            return false
        }
    }
    println up
}
See the Groovy docs for an example of how to use an ExecutorService to do what you want.
You can use this to check the URL in a separate thread.
class URLReader implements Runnable {
    def valid
    def url

    URLReader(url) {
        this.url = url
    }

    void run() {
        try {
            def connection = url.toURL().openConnection()
            valid = (connection.responseCode == 200) as Boolean
        } catch (Exception e) {
            println e.message
            valid = Boolean.FALSE
        }
    }
}

def reader = new URLReader("http://www.google.com")
new Thread(reader).start()
while (reader.valid == null) {
    Thread.sleep(500)
}
println "valid: ${reader.valid}"
Notes: the valid attribute will be either null, Boolean.TRUE or Boolean.FALSE. You'll need to wait a while to give all the threads a chance to open their connections. Depending on the number of URLs you're checking, you will eventually hit a limit on the number of threads/connections you can realistically handle, so you should check the URLs in batches of an appropriate size.
I think this is a very simple way to achieve it:
import java.util.concurrent.*

// Thread number
THREADS = 100
pool = Executors.newFixedThreadPool(THREADS)
defer = { c -> pool.submit(c as Callable) }

def urls = [
    "http://www.wordpress.com",
    "http://www.subaru.com",
]
def getUrl = { url ->
    def connection = url.openConnection()
    if (connection.responseCode == 200) {
        return true
    } else {
        return false
    }
}
def up = urls.collect { ur ->
    try {
        def url = new URL(ur)
        defer { getUrl(url) }.get()
    } catch (Exception e) {
        return false
    }
}
println up
pool.shutdown()
This is how I implemented it:
class ValidateLinks extends Thread {
    def valid
    def url

    ValidateLinks(url) {
        this.url = url
    }

    void run() {
        try {
            def connection = url.toURL().openConnection()
            connection.setConnectTimeout(5000)
            valid = (connection.responseCode == 200) as Boolean
        } catch (Exception e) {
            println url + " - " + e.message
            valid = Boolean.FALSE
        }
    }
}

def threads = []
urls.each { ur ->
    def reader = new ValidateLinks(ur.site_url)
    reader.start()
    threads.add(reader)
}

while (threads.size() > 0) {
    for (int i = 0; i < threads.size(); i++) {
        def tr = threads.get(i)
        if (!tr.isAlive()) {
            println "URL: " + tr.url + " Valid: " + tr.valid
            threads.remove(i)
            i--
        }
    }
}
