Is there a complete list of default accounts for PR15Q2.2, or an updated Generic Chart of Accounts file for version PR15Q2.2? - openbravo

I am working on creating a new chart of accounts file.
Started working with the coa file from here:
http://centralrepository.openbravo.com/openbravo/org.openbravo.forge.ui/ForgeModuleDetail/Generic-Chart-Of-Accounts
And using this as a guide:
http://wiki.openbravo.com/wiki/Creating_Accounts_Files
When I compare the file with the guide, there is a difference in the standard accounts.
These accounts are in the file but not in the guide on the web:
"CB_EXPENSE_ACCT"
"CB_RECEIPT_ACCT"
"CH_EXPENSE_ACCT"
And these are in the guide on the web but not in the Generic Chart of Accounts file:
P_COGS_RETURN_ACCT
P_REVENUE_RETURN_ACCT
T_CREDIT_TRANS_ACCT
T_DUE_TRANS_ACCT
Is there a complete list of default accounts for PR15Q2.2, or an updated Generic Chart of Accounts file for version PR15Q2.2?

I'd recommend using the testing spreadsheet from Openbravo. The information in the wiki is indeed confusing, but in the spreadsheet you can paste your own COA and then debug it until no errors are shown:
http://sourceforge.net/projects/openbravo/files/04-openbravo-accounting/Chart%20Of%20Accounts%20Test/
Hope it helps,
Regards
EDIT:
The testing spreadsheet lists the following default accounts:
A_ACCUMDEPRECIATION_ACCT
A_DEPRECIATION_ACCT
B_ASSET_ACCT
B_EXPENSE_ACCT
B_INTRANSIT_ACCT
B_REVALUATIONGAIN_ACCT
B_REVALUATIONLOSS_ACCT
C_RECEIVABLE_ACCT
CB_ASSET_ACCT
CB_CASHTRANSFER_ACCT
CB_DIFFERENCES_ACCT
* CB_EXPENSE_ACCT
* CB_RECEIPT_ACCT
* CH_EXPENSE_ACCT
CURRENCYBALANCING_ACCT
DEFAULT_ACCT
INCOMESUMMARY_ACCT
NOTINVOICEDRECEIPTS_ACCT
P_ASSET_ACCT
P_COGS_ACCT
P_EXPENSE_ACCT
P_REVENUE_ACCT
SUSPENSEBALANCING_ACCT
SUSPENSEERROR_ACCT
T_CREDIT_ACCT
T_DUE_ACCT
V_LIABILITY_ACCT
W_DIFFERENCES_ACCT
WRITEOFF_ACCT
As of Openbravo PR15Q1.5 (released on the 19th of June 2015 - http://wiki.openbravo.com/wiki/Release_Notes/3.0PR15Q1.5), a COA tested with this spreadsheet can be imported into the Openbravo Business Suite.
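If it is useful, here is a rough way to cross-check your own COA file against the list above before running it through the spreadsheet. This is only a sketch: the file name my_coa.csv, the comma delimiter and the DEFAULT_ACCOUNT column name are my assumptions, not the official Generic Chart of Accounts format, so adjust them to match your actual file.

# Sketch: report which of the default account keys listed above are missing from a COA CSV.
# Assumptions (adjust to your file): comma-delimited, header row present, and the keys
# live in a column called "DEFAULT_ACCOUNT" -- the file name and column name are hypothetical.
import csv

DEFAULT_ACCOUNTS = {
    "A_ACCUMDEPRECIATION_ACCT", "A_DEPRECIATION_ACCT", "B_ASSET_ACCT",
    "B_EXPENSE_ACCT", "B_INTRANSIT_ACCT", "B_REVALUATIONGAIN_ACCT",
    "B_REVALUATIONLOSS_ACCT", "C_RECEIVABLE_ACCT", "CB_ASSET_ACCT",
    "CB_CASHTRANSFER_ACCT", "CB_DIFFERENCES_ACCT", "CB_EXPENSE_ACCT",
    "CB_RECEIPT_ACCT", "CH_EXPENSE_ACCT", "CURRENCYBALANCING_ACCT",
    "DEFAULT_ACCT", "INCOMESUMMARY_ACCT", "NOTINVOICEDRECEIPTS_ACCT",
    "P_ASSET_ACCT", "P_COGS_ACCT", "P_EXPENSE_ACCT", "P_REVENUE_ACCT",
    "SUSPENSEBALANCING_ACCT", "SUSPENSEERROR_ACCT", "T_CREDIT_ACCT",
    "T_DUE_ACCT", "V_LIABILITY_ACCT", "W_DIFFERENCES_ACCT", "WRITEOFF_ACCT",
}

with open("my_coa.csv", newline="", encoding="utf-8") as f:
    used = {row.get("DEFAULT_ACCOUNT", "").strip() for row in csv.DictReader(f)}

missing = sorted(DEFAULT_ACCOUNTS - used)
print("Missing default accounts:", missing or "none")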

Related

Invoice2Data template not found

I am currently trying to parse some invoice data and came across this package on PyPI. It seems to be very handy for this task. There is one problem: I can't run it, due to a 'no template found for ' error. From the documentation [https://pypi.org/project/invoice2data/0.2.31/#description]
it becomes clear that you need to specify a template and then run the following command: invoice2data <invoice_file>.pdf. To specify a template you need to run the command invoice2data --template-folder <yourfolder>.
When executing these commands in my WSL (Linux virtual machine, needed to run the package) it keeps complaining. My template files are in the folder 'tpl' (2 custom YML files), and the invoice file is called invoice.pdf, see screenshots. I have attached the invoices too for clarity; please note these are all test files. My aim is to first make sure invoice2data works, and then make my own custom YML template following the tutorial.
Somehow invoice2data does not understand that it needs to pick up a template (I really don't care about which template at this stage) to execute the parsing. I have looked everywhere on Google; there are topics on this, but none offer me a solution. I hope somebody can help me out. Much appreciated, thanks a lot in advance
Jeffrey
Attached screenshots:
1. files and directories
2. YML template files
3. command-line execution giving the error
4a. invoice sample 1
4b. invoice sample 2
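One thing that may help with debugging: instead of (or in addition to) the command line, you can call the package from Python, so you can see how many templates were actually loaded from 'tpl'. This is only a sketch based on the Python API described on the same PyPI page (read_templates / extract_data); the folder name tpl and the file invoice.pdf are simply the ones from the question, and the exact calls may differ in your installed version.

# Sketch: load the custom templates from ./tpl and run them against invoice.pdf.
# read_templates / extract_data are the calls shown in the invoice2data docs;
# verify them against the version reported by `pip show invoice2data`.
from invoice2data import extract_data
from invoice2data.extract.loader import read_templates

templates = read_templates("tpl")           # the two custom YML templates
print("Loaded templates:", len(templates))  # 0 means the YML files are not being picked up

result = extract_data("invoice.pdf", templates=templates)
print(result)                               # an empty/False result means no template matched the PDF text

If the template count is 0, the templates are not being found at the path you pass via --template-folder; if it is 2 but the result is still empty, the templates load fine and it is the matching (typically the keywords defined in the YML) that fails against the text extracted from the PDF.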

Receiving Market Data into Excel

My summer project is developing an algorithmic trader that receives market data and trades based off indicators. I pull data from a company called Interactive Brokers using their TWS (Trader Workstation). I have downloaded their Excel API which uses DDE, but cannot get the Excel spreadsheet to properly connect with TWS.
In my Excel spreadsheet, I wrote this formula into a random cell, replacing 'sample123' with my TWS username. It is supposed to evaluate to 0 before I make other adjustments, but it evaluates to #REF!:
=Ssample123|tik!'id1?req?EUR_CASH_IDEALPRO_USD_~/'
Image of error received
Another issue is that it deletes two characters off my username, and I am unsure why.
Ex: username -> Sample123
Outcome -> Sample1
Any suggestions would be greatly appreciated! Thank you and have a nice day.
'sample123' in the example formula is the username that was used to log in to TWS. It has to be replaced in your formulas with the actual username you used to log in to TWS.
Also, to use the older legacy TWS DDE API you have to be running 32-bit TWS and have the setting "Enable DDE" checked in TWS Global Configuration.
Issues causing the #REF error
There is a newer 'DDE Socket Bridge' TWS API also available in the latest version of the API, which has additional functionality and is compatible with the 64-bit version of TWS.
DDE SocketBridge API
I've had the same problem. Just put twsserver instead of the username.

xlwings' RunPython fails on MAC

MAC 10.12.5
Excel 2011 for MAC
xlwings.__version__ '0.11.2'
xlwings.__path__
['/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/xlwings']
Trying to do a RunPython from Excel, it always fails with:
Compile Error
Cannot find file or library
Trying to narrow it down, running the VBA code found in Function GetConfigFilePath() As String, specifically:
> mymodule = Left(ThisWorkbook.Name, (InStrRev(ThisWorkbook.Name, ".", -1, vbTextCompare) - 1))
invariably gives me a:
Compile Error
Cannot find file or library
I see that the
~/Library/Containers/com.microsoft.Excel/Data/xlwings.conf
could not be found, but I have no idea how to get it there. Any lead, please?
UPDATE: v0.11.4 supports Mac Excel 2011 again, see: http://docs.xlwings.org/en/stable/whatsnew.html#v0-11-4-jul-23-2017
Make sure to check that it references the correct xlwings add-in: in the VBA editor, go to Tools > References and select xlwings. Unselect those that start with MISSING.... Make sure the correct project is selected while you do this.
original answer:
Mac Excel 2011 support hasn't quite caught up with the new add-in. The issue is that Excel 2011 doesn't show the ribbon, and so the config file will not be created automatically. We will work on improving the user experience, but for now you should be able to work around it like this:
Create an empty xlwings.conf in the following path (you'll need to create the folder if you don't have Excel 2016 installed): ~/Library/Containers/com.microsoft.Excel/Data (see the sketch after these steps for one way to do this)
Edit it following the instructions, if you want to set global settings that deviate from the defaults: http://docs.xlwings.org/en/stable/addin.html#config-file
Alternatively, skip the 2 steps above and include an xlwings.conf sheet as created automatically by xlwings quickstart <projectname>, see: http://docs.xlwings.org/en/stable/addin.html#workbook-settings (you need to use the add-in from >= 0.11.3, though, as there was a bug in the earlier versions).
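For the first step, a minimal sketch of creating that folder and the empty config file from Python (the path is exactly the one mentioned above; run it with the same Python you use with xlwings):

# Sketch: create the container folder and an empty xlwings.conf for Mac Excel 2011.
# The path is the one from the workaround above; the file contents can be filled in
# later following the config-file documentation linked in step 2.
from pathlib import Path

conf_dir = Path.home() / "Library/Containers/com.microsoft.Excel/Data"
conf_dir.mkdir(parents=True, exist_ok=True)   # only created if it does not exist yet
(conf_dir / "xlwings.conf").touch()
print("Created:", conf_dir / "xlwings.conf")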

Using Logic Apps to get specific files from all sub(sub)folders, load them to SQL-Azure

I'm quite new to Data Factory and Logic Apps (but I have been working with SSIS for many years).
I succeeded in loading a folder with 100 text files into SQL Azure with Data Factory, but the files themselves are untouched.
Now, another requirement is that I loop through the folders to get all files with a certain file extension.
In the end I should move (= copy & delete) all the files from the 'To_be_processed' folder to the 'Processed' folder.
I cannot find where to put 'wildcards' and such:
For example, get all files with file extensions .001, .002, .003, .004, .005, ... up to ... .996, .997, .998, .999 (a thousand files)
--> also searching in the subfolders.
Is it possible to call a Data Factory from within a Logic App? (although this seems unnecessary)
Please find some more detailed information in this screenshot:
Thanks in advance helping me out exploring this new technology!
Interesting situation.
I agree that using Logic Apps just for this additional layer of file handling seems unnecessary, but Azure Data Factory may currently be unable to deal with exactly what you need...
In terms of adding wildcards to your Azure Data Factory datasets, you have 3 attributes available within the JSON type properties block, as follows.
Folder Path - to specify the directory. Which can work with a partition by clause for a time slice start and end. Required.
File Name - to specify the file. Which again can work with a partition by clause for a time slice start and end. Not required.
File Filter - this is where wildcards can be used for single and multiple characters. (*) for multi and (?) for single. Not required.
More info here: https://learn.microsoft.com/en-us/azure/data-factory/data-factory-onprem-file-system-connector
I have to say that, separately, none of the above are ideal for what you require, and I've already fed back to Microsoft that we need a more flexible attribute that combines the 3 values above into 1, allowing wildcards in various places and a partition-by condition that works with more than just date-time values.
That said, try something like the below.
"typeProperties": {
"folderPath": "TO_BE_PROCESSED",
"fileFilter": "17-SKO-??-MD1.*" //looks like 2 middle values in image above
}
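Since the fileFilter wildcards are just the usual single-character (?) and multi-character (*) globbing, you can sanity-check a pattern against a local copy of the folder before wiring it into the dataset. A rough sketch: Python's fnmatch uses the same ? / * semantics (though ADF's matching may differ in edge cases), and the root path and pattern below are purely illustrative:

# Sketch: preview which files a wildcard pattern like the fileFilter above would pick up.
# "*.???" would catch the .001 ... .999 extensions from the question, but note that it
# also matches any other 3-character extension; root is a hypothetical local/UNC copy.
import fnmatch
import os

root = r"\\myshare\TO_BE_PROCESSED"
pattern = "*.???"

for dirpath, _dirs, files in os.walk(root):   # walks the subfolders as well
    for name in fnmatch.filter(files, pattern):
        print(os.path.join(dirpath, name))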
On a side note, there is already a Microsoft feedback item that has been raised for a file move activity, which is currently under review.
See here: https://feedback.azure.com/forums/270578-data-factory/suggestions/13427742-move-activity
Hope this helps
We have used a C# application which we call through 'App Services' -> WebJobs.
It is much easier to iterate through folders that way. To call SQL we used SQL bulk insert.

JIRA Groovy - Link issues from another project

We have a special Jira setup so that 1 Epic in our master project = 1 Jira project full of stories/bugs/etc. (governance/compliance wanted this).
Each issue type has a custom field called 'Ideal days'.
To get the total estimated days for each Epic, we have a custom function that sums the 'Ideal days' of all backlog issues linked to that Epic (we do this manually).
With Groovy Runner, can I auto-link all issues in a project (in the backlog) to an Epic? Thanks
If you want to link issues without cloning them, you should use IssueLinkManager.getLinkCollection().
The code should look something like this:
IssueLinkManager issueLinkManager = ComponentManager.getInstance().getIssueLinkManager()
issueLinkManager.getLinkCollection(someIssue, yourUser).each {
    it.getOutwardIssues("Some Link Name")   // someIssue and yourUser come from your own script context
}
There is a built-in script that clones an issue and links it. You should be able to link based on that code: https://jamieechlin.atlassian.net/wiki/display/GRV/Built-In+Scripts#Built-InScripts-Clonesanissueandlinks
