How can I force Matomo to archive older data with a cronjob?

While accessing archived Matomo data I realized that part of the archive is missing (see image).
Some data is visible from about November 2020 to February 2021, but that data was generated through the web interface: Matomo archives reports on demand when they are displayed in the browser. This works, but it takes hours to archive even small parts.
An onsite employee did this during some of his lunch breaks.
The cronjob has been running since January 2022, and as the graph shows, reports have been archived from then on.
I've searched the web and found others with similar problems, but never a solution.
We currently run on an "old" stack: PHP 7.3, MySQL 5.7, and Matomo 3.9.1 on an Ubuntu 18.04 server.
Upgrading is not an option at the moment, and yes, I am aware of the security risks.
Has anyone experienced anything similar, or does anyone know a way to force the cronjob to archive "older" data in Matomo?
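One approach worth trying, sketched below: invalidate the missing date range so the cron archiver re-processes it on its next run. The console command names are from the Matomo 3.x CLI (verify them with "php console list" on your install); the install path, site ID, and URL are assumptions for illustration.

```shell
#!/bin/sh
# Sketch: force Matomo to rebuild old reports by invalidating them first,
# then letting the archiver re-process them. MATOMO, SITE, and the --url
# value are assumptions -- adjust them for your install.
MATOMO=/var/www/matomo        # assumed install path
SITE=1                        # assumed site ID

CMD_INVALIDATE="php $MATOMO/console core:invalidate-report-data --dates=2020-11-01,2021-02-28 --sites=$SITE"
CMD_ARCHIVE="php $MATOMO/console core:archive --url=https://example.org/matomo/"

# Printed instead of executed so the sketch can be reviewed first;
# drop the echos (or pipe to sh) to run the commands for real.
echo "$CMD_INVALIDATE"
echo "$CMD_ARCHIVE"
```

After the invalidation runs, the regular cron core:archive job should pick up the invalidated periods on its next pass, so the second command may not even be needed separately.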

Related

Change Azure VM Time (not timezone) running Windows Server 2022

I would like to change an Azure server's local time to a month ago. This is for testing purposes on a custom application. It's fine if it changes back after a reboot.
There are a number of similar posts with answers, but most either refer to changing the timezone, or are old(ish) posts whose answers are no longer valid. Not sure if Microsoft has made some changes.
What I have tried is stopping the Hyper-V Time Synchronization service as well as the Windows Time service, but to no avail. I also wrote a small C# program to try to do this, but with no success.
Anything else that might work?
Thank you DeepDave-MT. Posting your recommendation as an answer, as confirmed by Cameron Castillo, so it can help other community members.
Steps to change the time settings from the Local Server section of Server Manager:
1) Open the time and date settings in the Local Server section of Server Manager.
2) In the window that opens, make the required changes.

System crashes while using ClearCase 8.0.1.x/9.0.1.x (checking out files) on Windows 10 (1803)

After upgrading to Windows 10 version 1803, we are seeing the issues below while working with ClearCase 8.0.1.x/9.0.1.x:
Unable to check in/check out files.
Not able to create views.
Not able to add any file to source control.
The system hangs and crashes while performing any ClearCase operation.
There is no error message, but I have attached a screenshot for reference.
Please let us know if there is any known issue with Windows 10 version 1803 (perhaps a newly enabled security feature), or whether IBM has provided a fix for ClearCase.
We have tried 9.0.1.5 and the issue still persists.
This is what we got from the Windows event log:
The computer has rebooted from a bugcheck.
The bugcheck was:
0x000000c2 (0x0000000000000004, 0x00000000535be990, 0x000000000004efd3, 0xfffff803e01848b1)
This happens for most people who have upgraded to Windows version 1803 :( For people still on version 1709 it works perfectly fine.
Then I would recommend contacting IBM support: only they can update their ClearCase 9/Windows 10 compatibility matrix and confirm whether MVFS is supported on a more recent (1803) Windows 10 edition.
We are also facing the same problem, and I have raised a case with IBM. It is still not resolved. IBM said there are some limitations when running ClearCase with Windows 10 and Windows Server 2016.
We have tried all the options except disabling Secure Boot. If possible, please disable the Secure Boot option in Windows 10 and try to check in/check out code from ClearCase.
Note: it works for snapshot views, which means the issue is related to MVFS.
I'm seconding @VonC's recommendation to open a ticket with IBM. When you do, save a step and collect a clearbug2 and a kernel memory dump to send in as soon as the case is opened; it will save the turnaround time of us asking you for them. If the installed-programs list doesn't show installed security software (DLP, privilege-management software like Avecto, other endpoint security tools), please list those separately as well.
I would also love to know who at IBM told you there are "limitations" with Win10-1803.
There are a few issues with Windows 10 "version upgrades" breaking things, but they generally don't cause system crashes. Windows 10 upgrades are actually full OS installs that then (imperfectly) migrate application settings. Anything that uses custom network providers (ClearCase is one example) will find that the network providers will be broken or partially broken. Reinstalling is usually required. Again, that has not yet been reported as a cause of a BSOD.
If the upgrade/reinstall didn't fix view creation, please post a separate question on the view creation issue. There may be things we can do to the SMB 2 caches to allow view creation to work in cases where the view storage is not on the client host.
I noticed that the screenshot you posted shows a Terminal Services disconnect. Does the issue only occur over a Terminal Services client connection, or does it also happen in a local session?

SVF model derivative downloaded as an (almost) empty ZIP file (Autodesk Forge)

I have been a happy developer on Autodesk Forge in recent months and have been able to perform several tasks using the APIs.
However, I can't seem to download SVF model derivatives properly (derived either from IFC or RVT files). I've tried a direct curl command and the forge-apis NPM package, without success. Oddly, the download itself completes, but all I get is a ZIP file with empty directories ("geometry", "material", "scene") and (non-empty) "manifest.json" and "metadata.json" files.
I use a two-legged authentication process to generate the token (the files are on my Forge developer's account, not on A360 or BIM360). I am able to view the files with the 3D viewer, so the conversion from RVT or IFC to SVF works nicely.
I also tried the https://extract.autodesk.io model extractor, but this doesn't allow me to retrieve the derivative either ("Cannot GET /extracted/2836276-AC11-Institute-Var-2-IFCifc.zip").
Any idea? Thanks in advance.
Sorry for the inconvenience; the issue was that I assumed the latest version was set to 2.15, while it was still set to 2.10.
The code fix forcing version 2.15 has been pushed live.
This is a recently known issue; we have received several reports and have already passed them to the repository owner, who is unfortunately on vacation now.
I debugged the code and found that the issue is related to the newly added file "wgs.min.js", which was added in a commit of Jun 16, 2017. Because the project is still pinned to version 2.10, and there is no 2.10 build of "wgs.min.js" online when it is fetched from https://developer.api.autodesk.com/viewingservice/v1/viewers/wgs.min.js?v=2.10 (2.11 through 2.15 are there), packaging the ZIP file fails, and you get a 404 error when downloading the bubbles.
To get the bubble for now, I suggest you set up a local server from https://github.com/cyrillef/extract.autodesk.io using the version from March 24, 2017. I verified it, and it should work well. Or, if it's urgent, you can send me the source file and I can generate the bubbles for you. Let me know.
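A quick way to verify the missing-version theory is to probe which viewer versions actually serve wgs.min.js. This sketch only prints the curl probes so it can be reviewed without network access (the version list is an assumption; remove the "echo" to actually send them):

```shell
#!/bin/sh
# Sketch: build a probe command per viewer version; each probe would
# print the HTTP status code (200 = hosted, 404 = missing).
BASE=https://developer.api.autodesk.com/viewingservice/v1/viewers/wgs.min.js
for v in 2.10 2.11 2.12 2.13 2.14 2.15; do
  URL="$BASE?v=$v"
  echo "curl -s -o /dev/null -w '%{http_code}' '$URL'"
done
```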

Windows Azure (DotNetNuke module installation error)

I'm new to Windows Azure and have just signed up for a 3-month free trial. I've installed DotNetNuke 7.0.1. The problem I have is that every time I try to install a module on my DotNetNuke website, I get a SQL error message. Please help, as I don't know what the problem is.
My Windows Azure subscription is disabled because I created more than one database. I have now deleted all the other databases and am left with one. How do I reactivate my trial so I can complete my tests?
The problem with the modules you are trying to install is probably that they are not SQL Azure compatible. Check with the module developer/vendor that those modules are compatible with SQL Azure. If the problem is with the open source/non-core modules, some time ago I modified all of them to be SQL Azure compatible (check this link: http://dotnetnuke6.intelequia.com/Module-Test). Before installing any of them, make sure there is no newer version at CodePlex with the SQL Azure compatibility already fixed.
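One way to pre-screen a module before installing it is to scan its SQL scripts for constructs that early SQL Azure rejected, such as filegroup references. A minimal sketch, where the sample script and the pattern list are illustrative rather than exhaustive:

```shell
#!/bin/sh
# Sketch: scan a module's SQL script for filegroup clauses that SQL
# Azure did not support. The sample script below is made up for
# illustration -- point the grep at the module's real .sql files.
cat > /tmp/module_sample.sql <<'EOF'
CREATE TABLE MyModule_Items (
    ItemID int NOT NULL,
    Name nvarchar(200) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
EOF

# List the offending lines with line numbers; exit status 0 means at
# least one incompatible construct was found.
grep -n -e 'ON \[PRIMARY\]' -e 'TEXTIMAGE_ON' /tmp/module_sample.sql
```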
I think the problem is that the SQL Azure "billing" counter is calculated per day, so you should wait at least one day before creating a new database, or simply lift the trial limits by converting the subscription to a paid one.
---I WORK FOR POWERDNN---
Hi Anonymous,
While Azure does have some advantages, when it comes to running an app like DotNetNuke on Azure, it is really not a good business or technology decision (at least today). Right now Azure does not have parity with standard SQL Server technologies which is what DotNetNuke has been coded against for the past ten years.
I've already talked to more than a couple of associates who have tried to run their DNN website on Azure, and it has caused serious problems for them. Usually what happens is that a SQL script won't run to completion and leaves their database in an indeterminate state. The problems usually aren't apparent until a few weeks after trying out Azure, and then they have to decide to either roll back (and lose weeks of data) or spend hours figuring out which script didn't fully run and trying to piece things back together in an Azure-compatible way.
If you have never had to rewrite a vendor's SQL scripts, I'd highly encourage the experience. It is a lot of "fun" :-)
Always glad to help,
Tony V.

Liferay 5.2.3 to 6.0 upgrade steps (Windows + Tomcat + MySQL)

I have taken a backup of my live portal, and I want to upgrade it to the newer version on localhost and then move the upgraded version to my live site.
I followed the steps in the link http://www.liferay.com/community/wiki/-/wiki/Main/Upgrade+Instructions+from+5.2+to+6.0 and my steps are:
1) I set up the new Liferay 6.0 on my localhost
(successful installation of Liferay 6.0 with the built-in Hypersonic database).
2) I imported the MySQL database of the live system into the local MySQL server.
3) After stopping the Tomcat server, I changed the portal-ext.properties configuration file with the MySQL database details.
4) Then I ran start.bat. It shows the upgrade process running, but after a few minutes the window closes.
Can anyone tell me where I have made a mistake, or what I am missing during the upgrade? If anyone has some ideas, please share your steps.
Thanks in advance
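For reference, the MySQL settings changed in step 3 usually look like the following in portal-ext.properties (property names from Liferay's standard JDBC configuration; the host, database name, and credentials below are placeholders):

```properties
jdbc.default.driverClassName=com.mysql.jdbc.Driver
jdbc.default.url=jdbc:mysql://localhost/lportal?useUnicode=true&characterEncoding=UTF-8
jdbc.default.username=liferay
jdbc.default.password=secret
```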
You don't mention copying your document library / image gallery folder to the new installation as well. This typically (unless configured otherwise) sits in your "data" folder. Importing the MySQL database alone isn't enough, as it contains only the metadata for the document library and image gallery.
You'll have to take this into account for your backups too. Ideally, you'd read up on backups and set up a second 5.2.3 instance from your backup (and make sure it runs) before you start upgrading. I typically refuse to call something a backup until someone has demonstrated that it can be used to successfully restore in a completely new environment. This is your chance to finally test your backup procedure.
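The copy step described above can be sketched as a small script; the two paths are placeholders for your real 5.2.3 and 6.0 "data" folders, and the sample file stands in for real document library content:

```shell
#!/bin/sh
# Sketch: copy the old Liferay "data" folder (document library, image
# gallery) into the new installation. Throwaway /tmp paths are used so
# the sketch is safe to run -- swap them for your real install paths.
OLD=/tmp/liferay-5.2.3/data
NEW=/tmp/liferay-6.0/data

# Stand-in for existing doclib content in the old install.
mkdir -p "$OLD/document_library"
echo demo > "$OLD/document_library/sample.txt"

# Copy the whole data folder (-a preserves permissions and timestamps).
mkdir -p "$NEW"
cp -a "$OLD/." "$NEW/"
ls "$NEW/document_library"
```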
