jOOQ open source version - can no longer generate source from a schema?

Am I reading the pricing page correctly?
When I looked at "footnote 8", I thought it was telling me that the open source version will no longer generate code from a DB schema.
But the documentation's code generation section says the open source version supports it.
If I did read the pricing page wrong, what is "Java (to generate jOOQ code) [8]" trying to communicate?

That page lists the supported SQLDialect values. The thing you're referring to specifically is SQLDialect.JAVA, which can be used to translate SQL to "Java code" (i.e. jOOQ code). You can try it for free here: https://www.jooq.org/translate
E.g. with this input:
SELECT first_name, last_name
FROM actor
ORDER BY id DESC
And this meta data:
CREATE TABLE actor (
  id INT PRIMARY KEY,
  first_name TEXT,
  last_name TEXT
)
You get this output:
select(
    ACTOR.FIRST_NAME,
    ACTOR.LAST_NAME
)
.from(ACTOR)
.orderBy(ACTOR.ID.desc())
Nothing was removed from the jOOQ Open Source Edition. Something was added to the other editions.
See the original jOOQ 3.15 feature request here:
https://github.com/jOOQ/jOOQ/issues/6277
And known issues here:
https://github.com/jOOQ/jOOQ/issues/11911

Related

Why does PySide6.QtSql.QSqlTableModel not see one of the existing MS Access tables?

There are N tables in the DB with the following data types:
numeric, long text, date and time, bigint, boolean.
All of them open, except one.
I'm opening the database:
db = QSqlDatabase("QODBC")
db.setDatabaseName(r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\Users\...\file.accdb")
db.open(username, password)
I output the tables contained in the db
db.tables()
Output:
["messages", "table1", "table2", ..., "tableN"]
And I'm trying to open the "messages" table
model = QSqlTableModel(db=db)
model.setTable("messages")
model.select()
Output:
False
Then I checked which other tables are not opening
for i in db.tables():
model.setTable(i)
if model.select() == False:
print(i)
Output:
"messages"
This means that the problem is only with this table.
But the table opens fine directly in MS Access.
I have already tried opening it via the loop above. The table name shows up in db.tables(), but QSqlTableModel does not see the 'messages' table specifically.
I tried switching MS Access to the 2016 version, thinking that some data type from MS Access 2019 might conflict with the old driver. It didn't help.
I was thinking of downloading a newer driver, but I didn't find one. I tried digging into the registry... I didn't find anything either.
Please help
So, I figured out the problem by poking around. I was initially right about the driver conflicting with the bigint data type, but a few additional steps were missing.
Apparently, when you set the bigint data type and save, Access warns that older versions may not support the database because of it; if you save anyway, it seems to raise the file's minimum supported version (or something along those lines).
Splitting the data helped, but before that you need to change the bigint column to another data type: Database Tools -> Move Data -> Access Database.
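Incidentally, for anyone debugging the same symptom: QSqlTableModel can tell you why select() failed. Below is a minimal diagnostic sketch assuming the same QODBC setup as in the question (the DBQ path is the asker's placeholder, not a real path); it prints the driver's error text instead of a bare False:
from PySide6.QtCore import QCoreApplication
from PySide6.QtSql import QSqlDatabase, QSqlTableModel

# QtSql wants a (Core)Application instance before models are created
app = QCoreApplication([])

db = QSqlDatabase.addDatabase("QODBC")
db.setDatabaseName(r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\Users\...\file.accdb")
if not db.open():
    print("open failed:", db.lastError().text())

model = QSqlTableModel(db=db)
for table in db.tables():
    model.setTable(table)
    if not model.select():
        # lastError() usually names what the ODBC driver rejected,
        # e.g. an unsupported column type such as Large Number (bigint)
        print(table, "->", model.lastError().text())
The error text typically points at the part of the table the Access ODBC driver cannot map, which is how a type problem like the bigint column can show up without opening Access at all.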

Adding json data to json column is adding escape characters

I am using a Postgres database. I have a table called names with a column named 'info' of type json. I am adding:
{ "info": "security" , description : "Sednit update: Analysis of Zebrocy: The Sednit group \u2013 also known as APT28, Fancy Bear, Sofacy or STRONTIUM \u2013 is a group of attackers operating since 2004, if not earlier, and whose main objective is to steal confidential information from specific targets.\n\nToward the end of 2015, we started seeing a new component deployed by the group; a downloader for the main Sednit backdoor, Xagent. Kaspersky mentioned this component for the first time in 2017 in their APT trend report and recently wrote an article where they quickly described it under the name Zebrocy.\n\nThis new component is a family of malware, comprising downloaders and backdoors written in Delphi and AutoIt. These components play the same role in the Sednit ecosystem as Seduploader; that of first-stage malware."}
Here I am using Node.js, with Sequelize as the ORM. When I save it to the table, I see "\\n" for "\n" and "\\u" for "\u". Can anyone help me avoid these escape characters when saving to the table?
In your JSON, description is of type string, so the newline/enter is converted to \n. That's the default behaviour; otherwise you would not get the newline back when you fetch the data again.
And \u is a Unicode escape; you are probably saving a smiley or some special character, so it gets converted to such a sequence.
So there is no bug; this is how it works.
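A quick way to convince yourself is a round trip through any JSON serializer. Here is a minimal sketch using Python's json module, chosen only for brevity; JSON.stringify / JSON.parse in Node.js behave the same way (the description text is shortened from the question):
import json

record = {
    "info": "security",
    # the en dash and the blank line are real characters in this string
    "description": "The Sednit group \u2013 also known as APT28.\n\nToward the end of 2015, we started seeing a new component."
}

stored = json.dumps(record)
# the serialized text contains the escape sequences \u2013 and \n,
# which is exactly what you see when inspecting the json column
print(stored)

restored = json.loads(stored)
# parsing the stored value gives back the original characters, so nothing is lost
assert restored == record
The escapes only exist in the serialized text; reading the column back and parsing it restores the real newlines and special characters.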

Rocket Software SQL ODBC, Retrieve Wrong Filenames

I installed Rocket Software for accessing a UniData DB through SQL Server 2008. The idea is to write SQL procedures for populating SQL tables, but the problem is that I am retrieving wrong field names, e.g. with Select * from MyDb_Members I get field names like Member{Name and Phone{Number. In my UniData core these fields are named Member Name and Phone Number.
Do you know if there is a way to run SQL queries with those field names without getting SQL errors? It looks like SQL Server does not like that naming convention:
Select Member{Name from MyDb_Members
Error near '{'
Thanks for your help
Try formatting the query using the NATIVE keyword. I don't use UniData, but in UniVerse this works well. I have a lot of columns that contain periods, and those are illegal column names in standard SQL.
{ NATIVE "Select * from MyDb_Members" }

Can I import SAP tables that were exported by SE16?

I have exported the contents of a table with transaction SE16, by selecting all the entries and choosing Download, unconverted.
I'd like to import these entries into another system (where the same table exists and is active).
Furthermore, when I import, there's a possibility that the specific key already exists for a number of entries (old entries).
Other entries won't have a field with the same key present in the table where they're to be imported (new entries).
Is there a way to easily update my table in the second system with the file provided from the first system? If needed, I can export the data in the 3 other format types (Spreadsheet, Rich text format and HTML format). It seems to me though like the spreadsheet and rich text formats sometimes corrupt the data, and the html is far too verbose.
[EDIT]
As per popular demand: the table I'm trying to export / import is a Z table whose fields are all numeric, character, date or time fields (flat data types).
I'm trying to do it like this because the clients don't have any Basis resource to help them transport, and I would like to more or less automate the process of updating one of the tables in one system.
At the moment it's a business request to do it like this, but I'm open to suggestions (and the clients are open too).
Edit
OK, I doubt that what you describe in your comment exists out of the box, but you can easily write something like this:
Create a method (or function module, if that floats your boat) that accepts the following:
iv_table_name TYPE string and
iv_filename TYPE string
This would be the method:
method upload_table.
  data: lt_table type ref to data,
        lx_root  type ref to cx_root.
  field-symbols: <table> type standard table.

  try.
      create data lt_table type table of (iv_table_name).
      assign lt_table->* to <table>.

      call method cl_gui_frontend_services=>gui_upload
        exporting
          filename            = iv_filename
          has_field_separator = abap_true
        changing
          data_tab            = <table>
        exceptions
          others              = 4.
      if sy-subrc <> 0.
        " Some appropriate error handling
        " message id sy-msgid type 'I'
        "   number sy-msgno
        "   with sy-msgv1 sy-msgv2
        "        sy-msgv3 sy-msgv4.
        return.
      endif.

      " Dynamic MODIFY on the table named in iv_table_name
      modify (iv_table_name) from table <table>.
      " write: / sy-tabix, ' entries updated'.
    catch cx_root into lx_root.
      " lv_text = lx_root->get_text( ).
      " Some appropriate error handling
      return.
  endtry.
endmethod.
This would still require that you make sure that the exported file matches the table that you want to import. However cl_gui_frontend_services=>gui_upload should return sy-subrc > 0 in that case, so you can bail out before you corrupt any data.
Original Answer:
I'll assume that you want to update a z-table and not a SAP standard table.
You will probably have to format your datafile a little bit to make it tab or comma delimited.
You can then upload the data file using cl_gui_frontend_services=>gui_upload
Then, if you want to overwrite the existing data in the table, you can use:
modify zmydbtab from table it_importeddata.
If you do not want to overwrite existing entries, you can use:
insert zmydbtab from table it_importeddata.
You will get a return code of sy-subrc = 4 if any of the keys already exists, but any new entries will be inserted.
Note
There are many reasons why you would NOT do this for a SAP standard table. Most prominent is that there is almost always more to the data model than what we are aware of. Also, when creating transactional data there are often follow-on events or workflows that kick off; that will not happen if you're updating the database directly. As a rule of thumb, it is usually a bad idea to update SAP standard tables directly.
In that case try to find a BADI, or if that's not available, record a BDC and do the updates that way.
If the system landscape was setup correctly, your client would not need any kind of basis operations support whatsoever to perform the transports. So instead of re-inventing the wheel, I'd strongly suggest to catch up on what the CTS and TMS can do once they're setup with sensible settings.

Selecting rows referenced by another table in Yesod

Suppose I have the following Yesod (Database.Persist) database schema:
File
    path Text
Tag
    name Text
FileTag
    file FileId
    tag TagId
    UniqueFileTag file tag
What is the most convenient way in Yesod to select File records that are referenced by a given Tag record? Do I need to resort to custom SQL? I'm using PostgreSQL as the database backend.
You can use custom SQL to solve this problem; I don't think Persistent offers a different solution. It isn't a full ORM, since it has to support non-relational backends like MongoDB.
You can implement a basic join like this:
let tagFileStatement =
        Text.concat
            [ "SELECT ?? "
            , "FROM file, file_tag "
            , "WHERE file.id = file_tag.file "
            , "AND ? = file_tag.tag"
            ]
files <- runDB $ rawSql tagFileStatement
    [toPersistValue theTagIdThatYouWantToLookupFilesFor]
-- files :: [Entity File]
There is a module for handling one-to-many relationships, either as application-level joins or as a proper SQL join. It probably counts as the most poorly documented aspect of the entire Yesod project, unfortunately. Longer term, we're hoping to improve the state of more complex SQL queries, but we don't have any actual code to show for this right now.
There are a few threads on the Yesod mailing list covering how to use these; here's one thread that popped up: https://groups.google.com/forum/#!msg/yesodweb/a4EAvPS8wFA/ClPuS94TRFwJ%5B1-25%5D . We really need to either improve the Haddocks or write a wiki page detailing the behavior better.
