Unable to initialize GDBusConnection with cellular modem on Linux

I am currently trying to access cellular modem data from within a C application on Linux using libmm-glib.
When I try to establish a GDBusConnection on a GFileIOStream created from a GFile for the path /dev/cdc-wdm0, the call to g_dbus_connection_new_sync() hangs indefinitely and produces the following error: _g_dbus_worker_do_read_cb: error determining bytes needed: Unable to determine message blob length - given blob is malformed. After this, the modem either becomes unavailable or is bumped to a different modem number (i.e. org/freedesktop/ModemManager/Modem/2 changes to 3).
I have also tried initializing a new GIOStream and passing that to mm_manager_scan_devices_sync(), following the instructions from this post, but this results in the following error:
g_dbus_connection_signal_subscribe: assertion 'sender == NULL || (g_dbus_is_name (sender) && (connection->flags & G_DBUS_CONNECTION_FLAGS_MESSAGE_BUS_CONNECTION))' failed
I have placed my code below, followed by the mmcli output with the modem information.
#include <libmm-glib.h>
#include <gio/gio.h>
//#include <gtk/gtk.h>
#include <stdio.h>
#include <stdbool.h>

GDBusConnection *pConnection;
MMManager *pManager;

int main (void)
{
    printf("begin\n");

    GFile *pFile = g_file_new_for_path("/dev/cdc-wdm0");
    GFileIOStream *pStream = g_file_open_readwrite(pFile, NULL, NULL);

    pConnection = g_dbus_connection_new_sync(G_IO_STREAM(pStream),
                                             NULL,  // guid, for server auth.
                                             G_DBUS_CONNECTION_FLAGS_MESSAGE_BUS_CONNECTION,
                                             NULL,  // observer
                                             NULL,  // cancellable
                                             NULL); // error

    pManager = mm_manager_new_sync(pConnection,
                                   G_DBUS_OBJECT_MANAGER_CLIENT_FLAGS_NONE,
                                   NULL,  // cancellable
                                   NULL); // error

    mm_manager_scan_devices_sync(pManager,
                                 NULL,  // cancellable
                                 NULL); // error

    printf("end\n");
    return 0;
}
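For comparison, here is a minimal sketch of the more common path, where the client talks to ModemManager over the system D-Bus bus (via g_bus_get_sync()) instead of opening the QMI port directly; it assumes ModemManager is running and only lists the modem objects it already manages:
#include <libmm-glib.h>
#include <gio/gio.h>
#include <stdio.h>

int main (void)
{
    GError *error = NULL;

    // Connect to the system bus, where ModemManager exports its objects.
    GDBusConnection *bus = g_bus_get_sync(G_BUS_TYPE_SYSTEM, NULL, &error);
    if (!bus) {
        fprintf(stderr, "bus error: %s\n", error->message);
        return 1;
    }

    // Create the ModemManager client on that connection.
    MMManager *manager = mm_manager_new_sync(bus,
                                             G_DBUS_OBJECT_MANAGER_CLIENT_FLAGS_NONE,
                                             NULL, &error);
    if (!manager) {
        fprintf(stderr, "manager error: %s\n", error->message);
        return 1;
    }

    // List the modems ModemManager already knows about.
    GList *modems = g_dbus_object_manager_get_objects(G_DBUS_OBJECT_MANAGER(manager));
    printf("found %u modem object(s)\n", g_list_length(modems));
    g_list_free_full(modems, g_object_unref);

    g_object_unref(manager);
    g_object_unref(bus);
    return 0;
}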
Below is the output from ModemManager's mmcli --modem=0:
----------------------------------
General | path: /org/freedesktop/ModemManager1/Modem/0
| device id:
----------------------------------
Hardware | manufacturer: Telit
| model: LE910C4-NF
| firmware revision: 25.21.660 1 [Mar 04 2021 12:00:00]
| carrier config: default
| h/w revision: 1.30
| supported: gsm-umts, lte
| current: gsm-umts, lte
| equipment id: 0
----------------------------------
System | device: /sys/devices/platform/soc#0/32c00000.bus/32e50000.usb/ci_hdrc.1/usb1/1-1/1-1.2
| drivers: qmi_wwan, option
| plugin: telit
| primary port: cdc-wdm0
| ports: cdc-wdm0 (qmi), ttyUSB0 (ignored), ttyUSB1 (gps),
| ttyUSB4 (ignored), wwan0 (net)
----------------------------------
Status | lock: sim-pin2
| unlock retries: sim-pin (3), sim-puk (10), sim-pin2 (10), sim-puk2 (10)
| state: connected
| power state: on
| access tech: lte
| signal quality: 75% (cached)
----------------------------------
Modes | supported: allowed: 3g; preferred: none
| allowed: 4g; preferred: none
| allowed: 3g, 4g; preferred: 4g
| allowed: 3g, 4g; preferred: 3g
| current: allowed: 3g, 4g; preferred: 4g
----------------------------------
Bands | supported: utran-4, utran-5, utran-2, eutran-2, eutran-4, eutran-5,
| eutran-12, eutran-13, eutran-14, eutran-66, eutran-71
| current: utran-4, utran-5, utran-2, eutran-2, eutran-4, eutran-5,
| eutran-12, eutran-13, eutran-14, eutran-66, eutran-71
----------------------------------
IP | supported: ipv4, ipv6, ipv4v6
----------------------------------
3GPP | imei: 3
| enabled locks: fixed-dialing
| operator id: 311480
| operator name: VZW
| registration: home
----------------------------------
3GPP EPS | initial bearer path: /org/freedesktop/ModemManager1/Bearer/0
| initial bearer apn: super
| initial bearer ip type: ipv4
----------------------------------
SIM | primary sim path: /org/freedesktop/ModemManager1/SIM/0
| sim slot paths: slot 1: /org/freedesktop/ModemManager1/SIM/0 (active)
| slot 2: none
----------------------------------
Bearer | paths: /org/freedesktop/ModemManager1/Bearer/2
| /org/freedesktop/ModemManager1/Bearer/1


Terraform: AWS Codepipeline multiple Codecommit sources

I am moving away from GitHub.com to CodeCommit. I have been leveraging Terraform's modular approach to import GitHub repos as modules for years, but CodeCommit is very different in that regard. I have seen people use SSH to clone the repos locally, but I have also noticed that CodePipeline can take multiple sources. I need a way to add multiple repos to my pipeline so I can replicate the modular GitHub approach offered by Terraform, and I want that code locally so I can execute it in a modular fashion.
I have googled for an example that shows how to use multiple CodeCommit sources in a pipeline, and I cannot find anything that clearly outlines how to do this in Terraform. Has anyone figured it out, or have examples they can point me to?
Looking into this, I found that it is not well documented anywhere, which is frustrating. Combining HashiCorp's vague description of the resource with AWS's multi-input example, I was finally able to come up with this Terraform:
"aws_codepipeline" "foo" {
name = "tf-test-pipeline"
role_arn = "codepipeline service role arn"
artifact_store {
location = "s3 bucket name, NOT THE ARN"
type = "S3"
}
stage {
name = "Source"
action {
name = "Source"
category = "Source"
owner = "AWS"
provider = "CodeCommit"
version = "1"
output_artifacts = ["src"]
configuration = {
RepositoryName = "vpc" //MUST BE the name of the your codecommit repo
BranchName = "master"
}
run_order = "1"
}
action {
name = "2ndSource" //you can make this any name
category = "Source"
owner = "AWS"
provider = "CodeCommit"
version = "1"
output_artifacts = ["src2"]
configuration = {
RepositoryName = "ec2"
BranchName = "master"
}
run_order = "2"
}
}
stage {
name = "Build"
action {
name = "Build"
category = "Build"
owner = "AWS"
provider = "CodeBuild"
input_artifacts = ["src","src2"] //pass through both repositories
version = "1"
configuration = {
ProjectName = "codebuild_project_name"
PrimarySource = "Source"
}
}
}
}
The trick here is to add the additional sources within one stage, not in separate stages. The example above shows two of them, but I have been able to add three without a problem.
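On the CodeBuild side, the primary input artifact is checked out into CODEBUILD_SRC_DIR and each secondary input artifact is exposed under CODEBUILD_SRC_DIR_<artifact name>, so with the artifact names above ("src" and "src2") a buildspec could reach the second repo roughly like this (a sketch; adjust the commands to your project):
version: 0.2
phases:
  build:
    commands:
      # Primary source ("src") is the working directory.
      - terraform init
      # Secondary source ("src2") is exposed through its own variable.
      - ls "$CODEBUILD_SRC_DIR_src2"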
Reference Links:
Hashicorp CodePipeline
https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/codepipeline#run_order
AWS Multiple Inputs Json Example:
https://docs.aws.amazon.com/codebuild/latest/userguide/sample-pipeline-multi-input-output.html
For those of you getting started for the first time, I recommend the link below; it's pretty comprehensive and walks you through the entire build process, including roles and policies:
https://medium.com/swlh/intro-to-aws-codecommit-codepipeline-and-codebuild-with-terraform-179f4310fe07
# _____ ____ _ _ _____ _____ ______
# / ____|/ __ \| | | | __ \ / ____| ____|
# | (___ | | | | | | | |__) | | | |__
# \___ \| | | | | | | _ /| | | __|
# ____) | |__| | |__| | | \ \| |____| |____
# |_____/ \____/ \____/|_| \_\\_____|______|
Stages:
  - Name: Source
    Actions:
      - ActionTypeId:
          Category: Source
          Owner: AWS
          Provider: CodeStarSourceConnection
          Version: "1"
        Configuration:
          ConnectionArn: !Ref CodeStarConnectionArn
          FullRepositoryId: !Ref BitBucketRepo
          BranchName: !Ref BitBucketRepoReleaseBranch
          OutputArtifactFormat: "CODE_ZIP"
          DetectChanges: true
        Name: SourceCode
        OutputArtifacts:
          - Name: !Sub ${SourceArtifactName}
        Namespace: SourceVariables1
        RunOrder: 1
      - ActionTypeId:
          Category: Source
          Owner: AWS
          Provider: CodeStarSourceConnection
          Version: "1"
        Configuration:
          ConnectionArn: !Ref CodeStarConnectionArn
          FullRepositoryId: !Ref PipelineBitBucketRepo
          BranchName: !Ref PipelineBitBucketRepoReleaseBranch
          OutputArtifactFormat: "CODE_ZIP"
          DetectChanges: true
        Name: PipelineDefinition
        OutputArtifacts:
          - Name: !Sub ${PipelineCodeArtifactName}
        Namespace: SourceVariables2
        RunOrder: 1
# _____ ______ _ ______ __ __ _ _ _______ _______ ______
# / ____| ____| | | ____| | \/ | | | |__ __|/\|__ __| ____|
# | (___ | |__ | | | |__ | \ / | | | | | | / \ | | | |__
# \___ \| __| | | | __| | |\/| | | | | | | / /\ \ | | | __|
# ____) | |____| |____| | | | | | |__| | | |/ ____ \| | | |____
# |_____/|______|______|_| |_| |_|\____/ |_/_/ \_\_| |______|
  - !If
    - ShouldUpatePipelineStackOnChange
    - Name: UpdatePipeline
      Actions:
        - Name: CreateChangeSet
          ActionTypeId:
            Category: Deploy
            Owner: AWS
            Provider: CloudFormation
            Version: "1"
          Configuration:
            ActionMode: CHANGE_SET_REPLACE
            StackName: !Ref AWS::StackName
            ChangeSetName: !Sub ${AWS::StackName}-ChangeSet
            TemplatePath: !Sub ${PipelineCodeArtifactName}::${PipelineTemplateName}
            Capabilities: CAPABILITY_NAMED_IAM
            RoleArn: !GetAtt PipelineStackCloudFormationExecutionRole.Arn
          InputArtifacts:
            - Name: !Sub ${PipelineCodeArtifactName}
          RunOrder: 1
        - Name: ExecuteChangeSet
          ActionTypeId:
            Category: Deploy
            Owner: AWS
            Provider: CloudFormation
            Version: "1"
          Configuration:
            ActionMode: CHANGE_SET_EXECUTE
            StackName: !Ref AWS::StackName
            ChangeSetName: !Sub ${AWS::StackName}-ChangeSet
            RoleArn: !GetAtt PipelineStackCloudFormationExecutionRole.Arn
          OutputArtifacts:
            - Name: !Sub ${AWS::StackName}ChangeSet
          RunOrder: 2
    - !Ref AWS::NoValue

New to web development, not sure how to fix ReactJS errors [closed]

This is a long page full of error messages, and I cannot get hold of the person teaching this class; it has been 2 weeks. I tried redoing it but I still get the same messages, and I do not know what to do to fix them. Here is the GitHub link: https://github.com/SadiaSanam/petshop
And these are the messages. How do I fix them? Some of the files they point to I cannot even find.
TypeError: path.split is not a function
get
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/src/utils/get.ts:6
3 | import isUndefined from './isUndefined';
4 |
5 | export default (obj: any = {}, path: string, defaultValue?: unknown) => {
> 6 | const result = compact(path.split(/[,[\].]+?/)).reduce(
7 | (result, key) => (isNullOrUndefined(result) ? result : result[key]),
8 | obj,
9 | );
View compiled
(anonymous function)
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/src/useForm.ts:967
964 |
965 | const register: UseFormRegister<TFieldValues> = React.useCallback(
966 | (name, options) => {
> 967 | const isInitialRegister = !get(fieldsRef.current, name);
| ^ 968 |
969 | set(fieldsRef.current, name, {
970 | _f: {
View compiled
Login
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/src/components/Login.js:88
85 | <div className='form-control'>
86 |
87 | <label htmlFor='email'>Email</label>
> 88 | <input type='email' name='email' id='email' ref={register( {required:true}) } />
| ^ 89 | { errors.email ? <span className='err'> email is required!</span> : null }
90 |
91 | <label htmlFor='password'>Password</label>
View compiled
renderWithHooks
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:14985
14982 | }
14983 | }
14984 |
> 14985 | var children = Component(props, secondArg); // Check if there was a render phase update
| ^ 14986 |
14987 | if (didScheduleRenderPhaseUpdateDuringThisPass) {
14988 | // Keep rendering in a loop for as long as render phase updates continue to
View compiled
mountIndeterminateComponent
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:17811
17808 |
17809 | setIsRendering(true);
17810 | ReactCurrentOwner$1.current = workInProgress;
> 17811 | value = renderWithHooks(null, workInProgress, Component, props, context, renderLanes);
| ^ 17812 | setIsRendering(false);
17813 | } // React DevTools reads this flag.
17814 |
View compiled
beginWork
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:19049
19046 | switch (workInProgress.tag) {
19047 | case IndeterminateComponent:
19048 | {
> 19049 | return mountIndeterminateComponent(current, workInProgress, workInProgress.type, renderLanes);
| ^ 19050 | }
19051 |
19052 | case LazyComponent:
View compiled
HTMLUnknownElement.callCallback
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:3945
3942 | function callCallback() {
3943 | didCall = true;
3944 | restoreAfterDispatch();
> 3945 | func.apply(context, funcArgs);
| ^ 3946 | didError = false;
3947 | } // Create a global error event handler. We use this to capture the value
3948 | // that was thrown. It's possible that this error handler will fire more
View compiled
invokeGuardedCallbackDev
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:3994
3991 | // errors, it will trigger our global error handler.
3992 |
3993 | evt.initEvent(evtType, false, false);
> 3994 | fakeNode.dispatchEvent(evt);
| ^ 3995 |
3996 | if (windowEventDescriptor) {
3997 | Object.defineProperty(window, 'event', windowEventDescriptor);
View compiled
invokeGuardedCallback
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:4056
4053 | function invokeGuardedCallback(name, func, context, a, b, c, d, e, f) {
4054 | hasError = false;
4055 | caughtError = null;
> 4056 | invokeGuardedCallbackImpl$1.apply(reporter, arguments);
4057 | }
4058 | /**
4059 | * Same as invokeGuardedCallback, but instead of returning an error, it stores
View compiled
beginWork$1
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:23964
23961 | } // Run beginWork again.
23962 |
23963 |
> 23964 | invokeGuardedCallback(null, beginWork, null, current, unitOfWork, lanes);
| ^ 23965 |
23966 | if (hasCaughtError()) {
23967 | var replayError = clearCaughtError(); // `invokeGuardedCallback` sometimes sets an expando `_suppressLogging`.
View compiled
performUnitOfWork
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:22776
22773 |
22774 | if ( (unitOfWork.mode & ProfileMode) !== NoMode) {
22775 | startProfilerTimer(unitOfWork);
> 22776 | next = beginWork$1(current, unitOfWork, subtreeRenderLanes);
| ^ 22777 | stopProfilerTimerIfRunningAndRecordDelta(unitOfWork, true);
22778 | } else {
22779 | next = beginWork$1(current, unitOfWork, subtreeRenderLanes);
View compiled
workLoopSync
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:22707
22704 | function workLoopSync() {
22705 | // Already timed out, so perform work without checking if we need to yield.
22706 | while (workInProgress !== null) {
> 22707 | performUnitOfWork(workInProgress);
22708 | }
22709 | }
22710 |
View compiled
renderRootSync
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:22670
22667 |
22668 | do {
22669 | try {
> 22670 | workLoopSync();
| ^ 22671 | break;
22672 | } catch (thrownValue) {
22673 | handleError(root, thrownValue);
View compiled
performSyncWorkOnRoot
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:22293
22290 | }
22291 | } else {
22292 | lanes = getNextLanes(root, NoLanes);
> 22293 | exitStatus = renderRootSync(root, lanes);
| ^ 22294 | }
22295 |
22296 | if (root.tag !== LegacyRoot && exitStatus === RootErrored) {
View compiled
(anonymous function)
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:11327
11324 | var callback = _queue[i];
11325 |
11326 | do {
> 11327 | callback = callback(_isSync2);
| ^ 11328 | } while (callback !== null);
11329 | }
11330 | });
View compiled
unstable_runWithPriority
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/scheduler/cjs/scheduler.development.js:468
465 | currentPriorityLevel = priorityLevel;
466 |
467 | try {
> 468 | return eventHandler();
| ^ 469 | } finally {
470 | currentPriorityLevel = previousPriorityLevel;
471 | }
View compiled
runWithPriority$1
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:11276
11273 |
11274 | function runWithPriority$1(reactPriorityLevel, fn) {
11275 | var priorityLevel = reactPriorityToSchedulerPriority(reactPriorityLevel);
> 11276 | return Scheduler_runWithPriority(priorityLevel, fn);
11277 | }
11278 | function scheduleCallback(reactPriorityLevel, callback, options) {
11279 | var priorityLevel = reactPriorityToSchedulerPriority(reactPriorityLevel);
View compiled
flushSyncCallbackQueueImpl
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:11322
11319 | try {
11320 | var _isSync2 = true;
11321 | var _queue = syncQueue;
> 11322 | runWithPriority$1(ImmediatePriority$1, function () {
| ^ 11323 | for (; i < _queue.length; i++) {
11324 | var callback = _queue[i];
11325 |
View compiled
flushSyncCallbackQueue
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:11309
11306 | Scheduler_cancelCallback(node);
11307 | }
11308 |
> 11309 | flushSyncCallbackQueueImpl();
11310 | }
11311 |
11312 | function flushSyncCallbackQueueImpl() {
View compiled
discreteUpdates$1
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:22420
22417 | if (executionContext === NoContext) {
22418 | // Flush the immediate callbacks that were scheduled during this batch
22419 | resetRenderTimer();
> 22420 | flushSyncCallbackQueue();
| ^ 22421 | }
22422 | }
22423 | }
View compiled
discreteUpdates
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:3756
3753 | isInsideEventHandler = true;
3754 |
3755 | try {
> 3756 | return discreteUpdatesImpl(fn, a, b, c, d);
| ^ 3757 | } finally {
3758 | isInsideEventHandler = prevIsInsideEventHandler;
3759 |
View compiled
dispatchDiscreteEvent
C:/Users/sadia/OneDrive/SheCodes/Full stack/app/petshop/node_modules/react-dom/cjs/react-dom.development.js:5889
5886 | flushDiscreteUpdatesIfNeeded(nativeEvent.timeStamp);
5887 | }
5888 |
> 5889 | discreteUpdates(dispatchEvent, domEventName, eventSystemFlags, container, nativeEvent);
5890 | }
5891 |
5892 | function dispatchUserBlockingUpdate(domEventName, eventSystemFlags, container, nativeEvent) {
View compiled
To use react-hook-form, there are a few fixes needed to make this work:
The input fields call the register function. This function takes 2 params:
register(field_name <- string, options <- object);
In your case, you need to call it like this:
<input type='email' name='email' id='email' ref={register("email", {required:true}) } />
<input type='password' name='password' id='password'
ref={register("password", {required:true, minLength:6, maxLength: 10} )} />
You're also reading the errors object the wrong way. This is how you should get it:
const { register, handleSubmit, formState: { errors }, reset } = useForm();
The last error I found after those fixes is about the way you attach the register call.
You are passing register to the ref property. According to the docs, you should spread the register call onto the input itself, and it will return all the needed props:
<input type='email' id='email' {...register("email", {required:true}) } />
The sources below explain the "why am I doing this?" in more depth =):
register():
https://react-hook-form.com/api/useform/register
errors:
https://react-hook-form.com/api/useformstate/errormessage
I'll add some tips here to help you find the solution to new errors:
Make a path to discover where to focus: when you have an error, you need to find exactly what is causing it. In your case, the console was pointing at a file that isn't even in your main folders (it was a dependency). In that situation, remove some code and see if the project works; if it does, you know the problem is somewhere in what you removed, and you can repeat the process while narrowing down the removed code.
Go to the official docs/demos and compare your code: I've never used react-hook-form, but a look at the docs helped me find the errors.
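Putting those pieces together, a minimal version of the Login component with the v7-style register calls could look like the sketch below (field names and messages follow the snippets above; the onSubmit handler is just a placeholder):
import React from 'react';
import { useForm } from 'react-hook-form';

export default function Login() {
  const { register, handleSubmit, formState: { errors } } = useForm();

  // Placeholder submit handler; replace it with the real login call.
  const onSubmit = (data) => console.log(data);

  return (
    <form onSubmit={handleSubmit(onSubmit)}>
      <div className='form-control'>
        <label htmlFor='email'>Email</label>
        <input type='email' id='email' {...register('email', { required: true })} />
        {errors.email ? <span className='err'>email is required!</span> : null}

        <label htmlFor='password'>Password</label>
        <input type='password' id='password'
               {...register('password', { required: true, minLength: 6, maxLength: 10 })} />
        {errors.password ? <span className='err'>password is required!</span> : null}

        <button type='submit'>Log in</button>
      </div>
    </form>
  );
}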

node-postgres: database "database_name" does not exist error

I am building a REST API with PERN (Postgres, Express, React, Node). I am trying to test my user registration route in Postman, and when I send the request I get this error: "database [my database name] does not exist".
I checked the Postgres server and I can clearly see that I own the database and that it exists. This is my connection, followed by the database listing:
const Pool = require("pg").Pool

const pool = new Pool({
  host: "localhost",
  user: "[myuser]",
  password: "[mypassword]",
  port: 5432,
  database: "rental"
})

module.exports = pool;
Name | Owner | Encoding | Collate | Ctype | Access privileges
----------------+----------------+----------+-------------+-------------+-----------------------
lucasleiberman | lucasleiberman | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
postgres | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
rental | lucasleiberman | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
rentalapp | lucasleiberman | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
template0 | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 | =c/postgres
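One quick way to confirm which server and database the app is actually connecting to is to run a query through the same pool at startup; a sketch, where "./connection" is a placeholder for wherever the pool module above lives:
const pool = require("./connection"); // placeholder path to the pool module above

pool.query("SELECT current_database() AS db, inet_server_addr() AS host, inet_server_port() AS port")
  .then(res => console.log(res.rows[0]))
  .catch(err => console.error("connection failed:", err.message));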

liquibase ignores already executed changeSets

I'm trying to start using Liquibase (3.5.5) with an existing database (on MySQL).
I've used the generateChangeLog command to generate a db.changelog.xml file.
C:/liquibase-3.5.5/liquibase.bat --driver=com.mysql.jdbc.Driver ^
--classpath=C:/Libraries/mysql-connector-java-5.1.37-bin.jar ^
--changeLogFile=db.changelog.xml ^
--url="jdbc:mysql://vbalder/izalerting" ^
--username=* ^
--password=* ^
generateChangeLog
result: Liquibase 'generateChangeLog' Successful
The generated db.changelog.xml file contains changeSets with author BGADEYNE (generated) and ids that are prefixed with 1533645947580-, e.g. 1533645947580-1.
I added logicalFilePath="db.changelog.xml" to the databaseChangeLog tag.
I then used the changelogSync command to create and fill the DATABASECHANGELOG and DATABASECHANGELOGLOCK tables. They do contain a row for each changeSet.
C:/liquibase-3.5.5/liquibase --driver=com.mysql.jdbc.Driver ^
--classpath=C:/Libraries/mysql-connector-java-5.1.37-bin.jar ^
--changeLogFile=db.changelog.xml ^
--url="jdbc:mysql://vbalder/izalerting" ^
--username=izalerting ^
--password=alfa ^
changelogSync
result: Liquibase 'changelogSync' Successful
Created a CDI component to execute the db.changelog.xml when the application starts.
Added maven dependency:
<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-cdi</artifactId>
    <version>3.5.5</version>
</dependency>
Added the CDI component:
@Dependent
public class LiquibaseProducer {

    @Resource(name="java:/izalerting")
    private DataSource dbConnection;

    @Produces @LiquibaseType
    public CDILiquibaseConfig createConfig() {
        CDILiquibaseConfig config = new CDILiquibaseConfig();
        config.setChangeLog("be/uzgent/iz/alerting/liquibase/db.changelog.xml");
        config.setContexts("non-legacy");
        return config;
    }

    @Produces @LiquibaseType
    public DataSource createDataSource() throws SQLException {
        return dbConnection;
    }

    @Produces @LiquibaseType
    public ResourceAccessor create() {
        return new ClassLoaderResourceAccessor(getClass().getClassLoader());
    }
}
When deploying the application to WildFly, I can see this:
2018-08-07 15:07:09,234 ERROR [stderr] (MSC service thread 1-4) INFO 8/7/18 3:07 PM: liquibase.integration.cdi.CDILiquibase: Booting Liquibase 3.5.4
2018-08-07 15:07:09,285 ERROR [stderr] (MSC service thread 1-4) INFO 8/7/18 3:07 PM: liquibase: Successfully acquired change log lock
2018-08-07 15:07:09,781 ERROR [stderr] (MSC service thread 1-4) INFO 8/7/18 3:07 PM: liquibase: Reading from PUBLIC.DATABASECHANGELOG
2018-08-07 15:07:09,814 ERROR [stderr] (MSC service thread 1-4) SEVERE 8/7/18 3:07 PM: liquibase: db.changelog.xml: db.changelog.xml::1533645947580-1::BGADEYNE (generated): Change Set db.changelog.xml::1533645947580-1::BGADEYNE (generated) failed. Error: Table "ALERTRESULT" already exists; SQL statement:
2018-08-07 15:07:09,815 ERROR [stderr] (MSC service thread 1-4) CREATE TABLE PUBLIC.alertresult (triggerid VARCHAR(255) NOT NULL, application VARCHAR(40) NOT NULL, resultid INT NOT NULL, subject VARCHAR(255), content CLOB, contenturl CLOB, executetime TIMESTAMP, html BOOLEAN DEFAULT TRUE NOT NULL, alertlevel VARCHAR(20) DEFAULT 'INFO' NOT NULL, closable BOOLEAN DEFAULT TRUE NOT NULL, screenwidth INT, screenheight INT) [42101-173] [Failed SQL: CREATE TABLE PUBLIC.alertresult (triggerid VARCHAR(255) NOT NULL, application VARCHAR(40) NOT NULL, resultid INT NOT NULL, subject VARCHAR(255), content CLOB, contenturl CLOB, executetime TIMESTAMP, html BOOLEAN DEFAULT TRUE NOT NULL, alertlevel VARCHAR(20) DEFAULT 'INFO' NOT NULL, closable BOOLEAN DEFAULT TRUE NOT NULL, screenwidth INT, screenheight INT)]
2018-08-07 15:07:09,816 ERROR [stderr] (MSC service thread 1-4) INFO 8/7/18 3:07 PM: liquibase: db.changelog.xml::1533645947580-1::BGADEYNE (generated): Successfully released change log lock
The DATABASECHANGELOG table contains a row for each changeSet.
+------------------+-----------------------+-------------------+-----------+
| # ID | AUTHOR | FILENAME | EXECTYPE |
+------------------+-----------------------+-------------------+-----------+
| 1533645947580-1 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-2 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-3 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-4 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-5 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-6 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-7 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-8 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-9 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-10 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-11 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-12 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-13 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-14 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-15 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-16 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-17 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-18 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-19 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
| 1533645947580-20 | BGADEYNE (generated) | db.changelog.xml | EXECUTED |
+------------------+-----------------------+-------------------+-----------+
Does anyone know what I'm doing wrong here?
Instead of
@Resource(name="java:/izalerting")
I needed to use
@Resource(lookup="java:/izalerting")
on WildFly 9.
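For completeness, the corrected injection point in the producer would look roughly like this (same JNDI name as above):
import javax.annotation.Resource;
import javax.enterprise.context.Dependent;
import javax.sql.DataSource;

@Dependent
public class LiquibaseProducer {

    // lookup= makes WildFly resolve the global JNDI binding java:/izalerting.
    @Resource(lookup = "java:/izalerting")
    private DataSource dbConnection;

    // ... producers as above ...
}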

Beaglebone black - Debian 4.1 - PRU - prussdrv_open() failed with -1

I am running the example from http://mythopoeic.org/BBB-PRU/pru-helloworld/example.c as root
and I receive the error
"prussdrv_open() failed with -1" during execution.
The BBB is running Debian with a 4.1 kernel.
These are the commands I used:
sudo cp EBB-PRU-Example-00A0.dtbo /lib/firmware
echo EBB-PRU-Example > /sys/devices/platform/bone_capemgr/slots
cat /sys/devices/platform/bone_capemgr/slots
0: PF---- -1
1: PF---- -1
2: PF---- -1
3: PF---- -1
4: P-O-L- 0 Override Board Name,00A0,Override Manuf,EBB-PRU-Example
modprobe uio_pruss
dmesg
[ 195.985512] bone_capemgr bone_capemgr: part_number 'EBB-PRU-Example', version 'N/A'
[ 195.994182] bone_capemgr bone_capemgr: slot #4: override
[ 195.999703] bone_capemgr bone_capemgr: Using override eeprom data at slot 4
[ 196.006752] bone_capemgr bone_capemgr: slot #4: 'Override Board Name,00A0,Override Manuf,EBB-PRU-Example'
[ 196.039095] pruss_uio 4a300000.pruss: No children
[ 196.057144] gpio-of-helper ocp:gpio_helper: ready
[ 196.070956] bone_capemgr bone_capemgr: slot #4: dtbo 'EBB-PRU-Example-00A0.dtbo' loaded; overlay id #0
and /boot/uEnv.txt has the HDMI disabled.
EBB-PRU-Example.dts:
/* Device Tree Overlay for enabling the pins that are used in Chapter 13
 * This overlay is based on the BB-PRU-01 overlay
 * Written by Derek Molloy for the book "Exploring BeagleBone: Tools and
 * Techniques for Building with Embedded Linux" by John Wiley & Sons, 2014
 * ISBN 9781118935125. Please see the file README.md in the repository root
 * directory for copyright and GNU GPLv3 license information.
 */
/dts-v1/;
/plugin/;

/ {
    compatible = "ti,beaglebone", "ti,beaglebone-black";
    part-number = "EBB-PRU-Example";
    version = "00A0";

    /* This overlay uses the following resources */
    exclusive-use = "P9.11", "P9.13", "P9.27", "P9.28", "pru0";

    fragment@0 {
        target = <&am33xx_pinmux>;
        __overlay__ {
            gpio_pins: pinmux_gpio_pins {        // The GPIO pins
                pinctrl-single,pins = <
                    0x070 0x07  // P9_11 MODE7 | OUTPUT | GPIO pull-down
                    0x074 0x27  // P9_13 MODE7 | INPUT  | GPIO pull-down
                >;
            };
            pru_pru_pins: pinmux_pru_pru_pins {  // The PRU pin modes
                pinctrl-single,pins = <
                    0x1a4 0x05  // P9_27 pr1_pru0_pru_r30_5, MODE5 | OUTPUT | PRU
                    0x19c 0x26  // P9_28 pr1_pru0_pru_r31_3, MODE6 | INPUT  | PRU
                >;
            };
        };
    };

    fragment@1 {    // Enable the PRUSS
        target = <&pruss>;
        __overlay__ {
            status = "okay";
            pinctrl-names = "default";
            pinctrl-0 = <&pru_pru_pins>;
        };
    };

    fragment@2 {    // Enable the GPIOs
        target = <&ocp>;
        __overlay__ {
            gpio_helper {
                compatible = "gpio-of-helper";
                status = "okay";
                pinctrl-names = "default";
                pinctrl-0 = <&gpio_pins>;
            };
        };
    };
};
There are two pre-compiled versions of the 4.1 kernel for the BBB in the repos: the "TI" version and the "Bone" version. The TI version uses a newer API for controlling the PRU, while the Bone version has the same API as the 3.8 kernel, so the prussdrv_open() function should work fine.
To install the 4.1 "bone" kernel, you can do:
cd /opt/scripts/tools
sudo ./update_kernel.sh --bone-rt-kernel --lts-4_1
More info: https://groups.google.com/forum/#!topic/beagleboard/cyM3f935wMA
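For reference, the failing call sits at the very start of the usual prussdrv sequence used by the am335x_pru_package examples; a rough sketch (the PRU number and binary name here are illustrative):
#include <stdio.h>
#include <prussdrv.h>
#include <pruss_intc_mapping.h>

#define PRU_NUM 0   // illustrative: PRU core 0

int main(void)
{
    tpruss_intc_initdata intc = PRUSS_INTC_INITDATA;

    prussdrv_init();

    // This is the call that returns -1 when the uio_pruss device nodes are missing.
    if (prussdrv_open(PRU_EVTOUT_0) != 0) {
        fprintf(stderr, "prussdrv_open() failed\n");
        return 1;
    }

    prussdrv_pruintc_init(&intc);
    prussdrv_exec_program(PRU_NUM, "./example.bin");  // illustrative binary name

    prussdrv_pru_wait_event(PRU_EVTOUT_0);
    prussdrv_pru_clear_event(PRU_EVTOUT_0, PRU0_ARM_INTERRUPT);

    prussdrv_pru_disable(PRU_NUM);
    prussdrv_exit();
    return 0;
}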
Hmmm, this looks similar to a previous issue. The problem was that the PRU was not enabled, and I quote:
echo BB-BONE-PRU-01 > /sys/devices/bone_capemgr.8/slots fixed it.
You could try the 4.1.5-ti-r10 version of the kernel; apparently pruss_uio does not work on some 4.1.x kernels.
Moreover, the dts file you are using was not working for me either (I don't know why). I used the following instead, and prussdrv_open() does not fail:
/dts-v1/;
/plugin/;

/ {
    compatible = "ti,beaglebone", "ti,beaglebone-black";

    /* identification */
    part-number = "BB-ENABLE-PRU";

    /* version */
    version = "00A0";

    fragment@1 {    // Enable the PRUSS
        target = <&pruss>;
        __overlay__ {
            status = "okay";
        };
    };
};
I found all this out in this thread: https://groups.google.com/forum/#!category-topic/beagleboard/VBNEoCbEHUQ
