I am using SubSonic Active Record in a C# web application. I followed all the setup instructions and everything went well: I queried the database for some simple results and, presto, I got data. I then changed the namespace in the settings file, right-clicked the ActiveRecord file to regenerate it, and everything broke. Since then, the classes and namespaces generated in the ActiveRecord.cs file are not included in the project. It's as if the file and the code inside ActiveRecord.cs do not exist, or Visual Studio cannot process or recognize them. I inspected the file and all seems well.
I am using TFS 2008, if that might be the cause. Earlier I also noticed an issue with System.Data: when I typed "System." IntelliSense showed no Data namespace. That subsequently fixed itself somehow, and now I'm left unable to access the SubSonic-generated classes and namespace.
Any ideas?
Thanks, Hans
You have to make sure your project file is checked out, and probably the solution file too. TFS locks these files, and you can't add files to the project unless the mechanism (your .csproj file) is unlocked.
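If Visual Studio isn't checking them out for you, you can do it explicitly from a Visual Studio command prompt with tf.exe; the file names below are placeholders for your own solution and project files:

    tf checkout MyWebApp.sln MyWebApp.csproj

Once the project file is writable, regenerating ActiveRecord.cs should be able to update the project's file list again.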
In a BizTalk project, why do some XSD files have a hidden .xsd.cs file and some do not? What are these files used for, and why does modifying the XSD and rebuilding do nothing to update the .cs files?
For example: I have an XSD which is used to map messages to a SQL send/receive port and execute a stored procedure. If I change the stored procedure (say, change, delete, or add a parameter) and change the XSD to match, those changes aren't reflected when I deploy the orchestration unless I delete the .xsd.cs file. I CAN see the modified XSD in the Schemas tab of the BizTalk Administration Console, yet I will still receive a message routing/mapping error unless that .cs file is deleted and the orchestration redeployed. And by the way, after I delete it, the file never seems to regenerate, though that doesn't cause any issues either.
Every XSD in your solution should have a .cs file; if you aren't getting them, there is something wrong with your solution. They are the compiled version of the schema that gets deployed to the GAC. If the .cs file is not being changed after you recompile, then you again have an issue. Check whether you've accidentally checked the .cs files into source control so that they are now read-only (they should not be checked in).
When you modify the schema you need to update the version of the schema both in the BizTalk database and in the GAC; if you don't, you will get some strange results. Using the Deploy option from Visual Studio will do this for you automatically, but if you are deploying manually you will have to ensure that the assembly is both imported and GACed.
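For a manual deployment that means two steps, sketched below with illustrative names (MySchemas.dll and MyApp are placeholders; check the BTSTask documentation for the exact options your BizTalk version supports):

    gacutil /i MySchemas.dll
    BTSTask AddResource /ApplicationName:MyApp /Type:System.BizTalk:BizTalkAssembly /Overwrite /Source:MySchemas.dll

The first line installs the assembly into the GAC; the second imports it into the BizTalk application so the new schema version shows up in the Administration Console.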
.cs files are generated for every BizTalk artifact; that's how they become a .NET type.
All of this should be handled automatically by Visual Studio. If you are having problems that only deleting .cs files will solve, then there's something wrong with your VS setup.
Note, the .cs files should not be in source control. If they are, remove them.
However, the scenario you describe doesn't make sense: what you see in the BizTalk Administration Console comes from the .cs file.
Kofax Capture Version 9
I have an existing project and batch class that works, built previously by a Kofax engineer.
What I need to do is change the script in the project to use a new DB connection. This seemed simple enough.
Using Project Builder I copied the existing project, altered the script, and saved the project. Using Capture Administration I copied the existing batch class and then used Synchronize Kofax Transformation Project, pointing it to the new project. All of this seemed to work without error.
However, the script being executed is the original, not my altered one. Any guidance would be great.
Make sure you are creating a new batch after publishing your change. The batch class update function works in very limited scenarios, so I don't generally recommend it.
There are many ways that a database connection might be handled in script. Usually I would expect that a function at the project script level handles the connection and is called from any subclass (see the sketch below), but you might want to check the subclasses to make sure they aren't using locally defined connection strings.
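A minimal sketch of that pattern, in the WinWrap Basic dialect KTM scripts use; every name and the connection string here are invented for illustration:

    ' Project-level script: the single place that owns the connection string.
    Public Function Project_GetConnectionString() As String
       Project_GetConnectionString = "Provider=SQLOLEDB;Data Source=NewServer;Initial Catalog=NewDb;Integrated Security=SSPI"
    End Function

If every class-level script calls Project_GetConnectionString() instead of keeping its own copy, changing the project-level string is the whole job; a subclass that hard-codes its own string will quietly keep using the old database.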
Even if you are making a connection in script (which you've now changed), you might also be using product features that use databases. Open Project Settings and check the Databases tab.
If there are relational databases listed, simply change as needed.
If you are actually using "Remote Fuzzy" databases then these might be using Kofax Search and Matching Server which connects to a relational database to build the fuzzy db. In this case you would need to use KSMS Admin to change the connection on the KSMS server.
If you are using "Local Fuzzy" databases then the info is based on the content of a text file. You might have some external process (possibly Markview) that dumps this text file from a database.
I'm new to C++/COM. I have created an ATL COM project with a callback mechanism to send messages to the managed side. It has one IDL file (sample1.idl) which exposes a number of methods so that a managed environment can access it. Now I would like to add another IDL file (sample2.idl) to that project.
A .tlb is created for both sample1 and sample2, and the build succeeds. On browsing the .dll, however, I couldn't find anything related to sample2. I suspect that the .tlb generated from sample2.idl is not embedded in the .dll.
Can we have more than one IDL file in an ATL (COM) project?
The default for ATL, as with many native build environments, is to embed the type library as a resource in the DLL. You can see this in Visual Studio (retail edition required): use File + Open + File and select the DLL. Open the "TYPELIB" node and you'll see one type library with resource ID #1. This is the one that Visual Studio sees when you use Add Reference.
Almost any build tool that consumes type libraries will only ever look for that one resource; Visual Studio is no exception. It can also encode only one type library in its project files. You can perhaps make it work by selecting the 2nd .tlb file with the Add Reference dialog, albeit that you'll now very likely be exposed to more problems in your ATL project, like forgetting to register that 2nd type library in your .rgs file.
It's very hard to give proper advice without any hint of what that second IDL file might contain. Stay out of trouble by merging them, or by using the existing support in IDL to import other .idl files or type libraries; a sketch of the import approach follows.
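For example, something like this in the main IDL file keeps everything in one type library (the interface and library names are invented and the GUID is a placeholder):

    // sample1.idl -- the project's single library block
    import "oaidl.idl";
    import "ocidl.idl";
    import "sample2.idl";      // brings in the ISample2 definition

    [uuid(12345678-1234-1234-1234-123456789ABC), version(1.0)]
    library Sample1Lib
    {
        importlib("stdole2.tlb");
        interface ISample1;    // defined in this file
        interface ISample2;    // defined in sample2.idl, now emitted into this one .tlb
    };

MIDL then produces a single .tlb containing both interfaces, which is exactly what the embedded TYPELIB resource and Add Reference expect to find.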
I am making changes to a Web Forms application in Visual Studio 2012, and part of that change is to remove a reference to an old DLL. The DLL is responsible for handling authentication, and I have written a new class library to handle this.
My problem is that every time I build the website in the solution, it still generates this old DLL. I've commented out all references to it in the entire application, and it's not in the project dependencies of the solution.
If I exclude the unwanted DLL, it just generates a new one. I am completely baffled by this; it's something I've not encountered before, so I am not sure what else I can do.
I'm not sure what else to post, so if anyone can help, that would be great.
I had a similar problem (but with a persistent DB file being generated). I ended up having to go through the project's bin folder and edit things from there. Some files are 'hidden' from the project and are visible either by:
Make sure you are showing all files. There is a button at the top of
the Solution Explorer called "Show All Files". To see this button,
make sure that your project is selected in the solution explorer.
or by manually going through your project files.
EDIT
The DLL can sometimes be hiding in the Global Assembly Cache (GAC), where it can reside indefinitely. http://msdn.microsoft.com/en-us/library/zykhfde0.aspx explains how to remove it (if it is indeed hiding there).
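In short, from a Visual Studio command prompt (the assembly name here is a placeholder; list first to find the real name, and note that /u takes the assembly name without the .dll extension):

    gacutil /l OldAuthLibrary
    gacutil /u OldAuthLibrary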
Try removing the old project from your solution.
When I try to generate a javadoc, using the menu command Project\Generate Javadoc, the following warnings and error are produced for my custom classes in XPages:
javadoc: warning - No source files for package net.focul.utilties
javadoc: warning - No source files for package net.focul.workflow
javadoc: error - No public or protected classes found to document.
The packages are in the WebContent/WEB-INF/src folder, which is configured in the build path, and they are selectable in the Generate Javadoc wizard. The classes are public with public methods.
Javadocs are generated for all of the XPage and Custom Control classes if I select those.
You're experiencing this behavior because javadoc doesn't understand the Designer VFS (Virtual File System). It assumes that your project consists of a bunch of separate files in some folder structure on your local hard drive, not self-contained inside a single NSF. On the whole, the Designer VFS successfully tricks Eclipse into believing it's interacting with local files by intercepting read/write requests for project resources and importing/exporting DXL or CD records, etc. But apparently they haven't applied this sleight of hand to javadoc as well.
The Java source files corresponding to each XPage and Custom Control are processed successfully because, ironically, they are never stored in the NSF. During every project build, Designer discards any of these it has already generated and re-creates them based on the current contents of the various .xsp files. It then compiles those Java files into .class files, which are stored as design notes inside the NSF. At runtime, it's these files that are extracted from the VFS and executed... the source code no longer matters at this point, so there's no reason to ever bother including the .java files in the NSF, so they're just kept on the hard drive. One indication of this behavior is that the folder is named "Local" when viewed in Package Explorer / Navigator.
If you're using the built in (as of 8.5.3) version control integration (see this article for a great explanation of how to use this feature), you can tweak the Build Path to include the copy of the src folder stored in the on-disk project as a "linked source folder". This causes javadoc to consider the duplicate copies valid source files, and therefore includes them in the generated documentation. On the downside, it also causes Designer to consider them valid source files, which causes compilation errors due to the duplication. So this approach is only viable if you only need to generate the documentation on an infrequent basis, and can therefore break the Build Path temporarily just to run javadoc, then revert to the usual settings.
An alternative is to actually maintain your custom Java code this way on an ongoing basis: instead of creating the folder in WEB-INF inside the NSF, just create a folder on your hard drive that stores the source, then include that location as a linked source folder indefinitely. That way Designer can still find the source, but so can javadoc. NOTE: if you go this route, then you definitely need to use SCM. Because your source code no longer lives inside the NSF, providing the convenient container we're used to for getting the source code to other developers and ensuring inclusion in whatever backup schedule you use, the only place your source code now lives is on your local hard drive. So make sure you're regularly committing those files to Git / Subversion / Mercurial, etc., or, at the very least, storing them on some file server that is backed up regularly and, if applicable, accessible to all other members of the project team.
When you expand net.focul.utilties in Designer, you will see all the methods and properties. But when you click on one of the methods, you will see no source code.
So this is where javadoc fails to generate the documentation. I guess that the author of the application has not provided you with the source code. If you have the source somewhere, you can attach it, and then javadoc will be able to generate the documentation.
I ran into the same situation, and I have found the most straightforward method is to export the source to an external folder and then use regular Eclipse to generate the Javadoc. I'm not sure my process is any less of a hassle than Tim's suggestions, but to me it just feels less risky than trying to deal with the VFS vagaries.
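If you go that route, you don't even need Eclipse for the final step; plain command-line javadoc over the exported folder does the same job (the src and doc paths here are whatever you exported to):

    javadoc -d doc -sourcepath src -subpackages net.focul

The -subpackages option recurses into net.focul.utilties and net.focul.workflow, and the 'No source files for package' warnings go away as long as the .java files actually exist in that tree.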