NodaTime: updating the tz database manually

There is a section "Using a NodaZoneData file" on how to include and load the tz data from a NodaZoneData file.
The code section shows the following:
IDateTimeZoneProvider provider;
// Or use Assembly.GetManifestResourceStream for an embedded file
using (var stream = File.OpenRead("tzdb-2013h.nzd"))
{
    var source = TzdbDateTimeZoneSource.FromStream(stream);
    provider = new DateTimeZoneCache(source);
}
Console.WriteLine(provider.SourceVersionId);
How do you set the created provider as the default for the NodaTime library, so it is used everywhere?
NodaTime releases do not follow the tz data releases. Will this change in the future?
Is there another way to get an updated NodaTime library with updated tzdb data as a NuGet package?
Thanks

How do you set the created provider as the default for the NodaTime library, so it is used everywhere?
Very few things in Noda Time use any provider as the default. (Similarly we don't use the system time zone by default, and try to avoid implicitly using the current culture, other than for compatibility.) Looking at the current 2.0 source code, the only references are in ClockExtensions and DateTimeZoneProviders.Serialization (which is used by XML and binary serialization, and which can be set in application code).
If you want to have your own application-wide default, I suggest you create your own class with a static property exposing an IDateTimeZoneProvider, or maybe a singleton. Refer to that anywhere you'd otherwise refer to DateTimeZoneProviders.Tzdb.
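For example, here is a minimal sketch of such a holder class (the TimeZones class name, the file path and the fallback logic are assumptions for this example, not part of the Noda Time API):
using System.IO;
using NodaTime;
using NodaTime.TimeZones;

// Hypothetical application-wide holder; anything not shown in the
// question's snippet (class name, file path, fallback) is made up here.
public static class TimeZones
{
    public static IDateTimeZoneProvider Provider { get; } = LoadProvider();

    private static IDateTimeZoneProvider LoadProvider()
    {
        const string path = "tzdb-latest.nzd";
        if (!File.Exists(path))
        {
            // Fall back to the data embedded in NodaTime.dll.
            return DateTimeZoneProviders.Tzdb;
        }
        using (var stream = File.OpenRead(path))
        {
            var source = TzdbDateTimeZoneSource.FromStream(stream);
            return new DateTimeZoneCache(source);
        }
    }
}
Anywhere you'd otherwise write DateTimeZoneProviders.Tzdb, refer to TimeZones.Provider instead; if you use XML or binary serialization, you can also assign the same provider to DateTimeZoneProviders.Serialization at startup, as noted above.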
In terms of keeping things up to date:
A new nzd file is posted on the Noda Time web site very soon after each new release. You can detect that by fetching http://nodatime.org/tzdb/latest.txt, which contains a URL to the latest file (see the sketch after the list of options below for one way to fetch and load it).
There are options we're considering around how to release nuget packages:
Have a nuget package just containing the data (and a tiny amount of "bootstrapping" code probably), and removing the embedded data from NodaTime.dll
Have a nuget package containing all the TZDB-specific code, and removing the embedded data from NodaTime.dll
Just change our release practices so that a new NodaTime nuget package can be pushed at the same time as the new file
None of the options is simple to implement, and there's a lot of other Noda Time work to do (reimplementing the web site with docfx, scheduling benchmarks using BenchmarkDotNet and exposing that data on the web site, and of course finishing the 2.0 code base). We'll get to it, but don't hold your breath for it being Real Soon Now.
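To spell out the "fetch the latest file yourself" route mentioned above, here is a sketch that downloads latest.txt, follows the URL it contains, and builds a provider from the data. The class name and the use of HttpClient are my own choices for the example, not anything Noda Time prescribes:
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using NodaTime;
using NodaTime.TimeZones;

public static class TzdbDownloader
{
    // Sketch only: fetch the URL of the latest nzd file, download it,
    // and build a provider from the downloaded data.
    public static async Task<IDateTimeZoneProvider> LoadLatestAsync()
    {
        using (var client = new HttpClient())
        {
            var latestUrl = (await client.GetStringAsync("http://nodatime.org/tzdb/latest.txt")).Trim();
            var data = await client.GetByteArrayAsync(latestUrl);
            using (var stream = new MemoryStream(data))
            {
                var source = TzdbDateTimeZoneSource.FromStream(stream);
                return new DateTimeZoneCache(source);
            }
        }
    }
}
In practice you would probably cache the downloaded file on disk and only re-download when the version reported by latest.txt changes, but that's beyond this sketch.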

Related

Where to store API keys and other 'secrets' in a yesod app

I'm trying out a yesod application which I will eventually put up on github or similar.
I will use oauth2 with google, which means I have to provide an email and secret token, which I obviously do not want up on github.
What is a good place to store these in a yesod scaffolded application? I'm hoping to store them in a separate file, config/secret.yml for example, so I can put that into the ignore file of git/mercurial and never commit it.
But I can't find out how to include such a file, or whether such a file is already provided by yesod. config/settings.yml seemed possible, but there are entries there which I would like to keep in github.
So my question is: in a yesod scaffolded application, where can I store secret keys in a way that I can easily exclude them from version control systems?
There are many approaches to this, mostly depending on what flavor of devops/hosting you prefer. One option is to put a dummy value in the config file and override it with an environment variable at runtime (see: https://github.com/yesodweb/yesod/wiki/Configuration#overriding-configuration-values-with-environment-variables). You can also have an extra settings file for production that overrides the values in the default config file, which is how the test suite works. A completely different approach would be to use a system like Vault in production and query it for your secure credentials.
EDIT: To spell out one of the approaches:
Create a new YAML file with the settings you want to override, e.g. in config/production.yml:
copyright: This is a copyright notice for production
When you run the application, pass in a command line argument giving the location of the config file.

What is needed to make sure new versioning-enabled Core Data files go with the app?

I set up my working Core Data sqlite file for versioning. The versioning setup process created 3 files:
foo.sqlite
foo.sqlite-shm
foo.sqlite-wal
Since then, I can access the Core Data store programmatically (using MagicalRecord), but I can't read any data using either the Firefox add-in (SQLite Manager) or the app SQLiteManager. I'm concerned that when I send the updated app to the App Store, the additional files are not going to go and the app is going to crash.
What do I need to do to make sure new versioning-enabled sqlite files go with the app?
Those are not version-related files, they're SQLite log files. These files get created automatically when write-ahead logging is enabled. That's not the default in iOS 6, but it's possible if you use PRAGMA journal_mode=WAL;. It might or might not be the default in iOS 7 (I have no comment at this time).
I don't know why Firefox and SQLiteManager can't open the file. I speculate that they're both using an old version of SQLite (since WAL is only available as of SQLite 3.7.0). Regardless, they have nothing to do with whether the necessary files are available in your app. You can find out what's included in the app by just looking. The .app is just a directory, really, so take a look inside and see what's there.
If you are using lightweight migration (which is enabled by passing the right options when you open the store), Core Data takes care of upgrading the schema in-place.
The additional WAL and SHM files are not a result of lightweight migration, but are instead simply produced by SQLite in the "write ahead logging" mode that Core Data puts it into. (An oversimplification is that new data goes into the .wal file until enough accumulates and then it is moved to the .sqlite file.)
Yes, you definitely want to test lightweight migration using Ad Hoc builds; testing from Xcode is insufficient.
Mike Fikes

Trouble syncing file-based templates to database using MSM and config bootstrap

Had started my typical EE build (using a bootstrapped config) for a client when they announced they wanted an additional site using the MSM module (le sigh).
So I added the MSM module, commented out the $config['site_url'] and $config['cp_url'], and set those in index.php instead using $assign_to_config.
That's when I discovered this bug where MSM config file settings are not recognized, which is a pain but I can work around it. However, I noticed that when I created the secondary site, it wouldn't recognise my custom location for add-ons and so I had to add that to index.php as well to $assign_to_config['third_party_path'] = "../assets/third_party/";.
Then I discovered that when I create or modify a template file, it won't automatically sync and so I need to manually do that each time which is a real PITA.
Why would my templates not be syncing to the database? Is this related to the MSM config bug?
While I haven't tried bootstrapping the third party path yet, I've definitely been able to bootstrap the template path for MSM sites... What bootstrap method are you using?
Are your sites on subdomains or subfolders? I've only had experience with subfolders so perhaps that makes a difference (although it shouldn't).
Could you maybe walk through in a bit more detail what's happening? Your first site (site_id = 1) templates sync automatically from filesystem edits, but your second site does not? Yet if you go to CP > Design > Synchronize Templates, that works?
The $assign_to_config portion of MSM setup is definitely a weak spot when it comes to bootstrapping... I wonder if we need to work up an additional bootstrap for MSM+CP environments, where it looks at the CP cookie ($_COOKIE['exp_cp_last_site_id']) and sets values based on that.
It may be helpful if you let us know which bootstrap you are using. For example, if you look at this bootstrap the site_url and cp_url are set using the HTTP_HOST server variable, so this shouldn't clash with your MSM install (and multiple domains) at all.
Perhaps you could try using that bootstrap file instead, and see if it fixes your issue with template syncing?
Finally, if you're going to use the EE template manager, you don't really need to store templates as files. Conversely, if you want to save templates as files, it's probably much easier editing them using Sublime Text or another editor, rather than the clunky built-in editor (which is really only useful for small/simple changes).

How to generate a javadoc in XPages

When I try to generate a javadoc, using the menu command Project\Generate Javadoc, the following warnings and error are produced for my custom classes in XPages:
javadoc: warning - No source files for package net.focul.utilties
javadoc: warning - No source files for package net.focul.workflow
javadoc: error - No public or protected classes found to document.
The packages are in the WebContent/WEB-INF/src folder which is configured in the build path and are selectable in the Generate Javadoc wizard. The classes are public with public methods.
Javadocs are generated for all of the Xpage and Custom Control classes if I select these.
You're experiencing this behavior because javadoc doesn't understand the Designer VFS (Virtual File System). It assumes that your project consists of a bunch of separate files in some folder structure on your local hard drive, not self-contained inside a single NSF. On the whole, the Designer VFS successfully tricks Eclipse into believing it's interacting with local files by intercepting read/write requests for project resources and importing/exporting DXL or CD records, etc. But apparently they haven't applied this sleight of hand to javadoc as well.
The Java source files corresponding to each XPage and Custom Control are processed successfully because, ironically, they are never stored in the NSF. During every project build, Designer discards any of these it has already generated and re-creates them based on the current contents of the various .xsp files. It then compiles those Java files into .class files, which are stored as design notes inside the NSF. At runtime, it's these files that are extracted from the VFS and executed... the source code no longer matters at this point, so there's no reason to ever bother including the .java files in the NSF, so they're just kept on the hard drive. One indication of this behavior is that the folder is named "Local" when viewed in Package Explorer / Navigator.
If you're using the built in (as of 8.5.3) version control integration (see this article for a great explanation of how to use this feature), you can tweak the Build Path to include the copy of the src folder stored in the on-disk project as a "linked source folder". This causes javadoc to consider the duplicate copies valid source files, and therefore includes them in the generated documentation. On the downside, it also causes Designer to consider them valid source files, which causes compilation errors due to the duplication. So this approach is only viable if you only need to generate the documentation on an infrequent basis, and can therefore break the Build Path temporarily just to run javadoc, then revert to the usual settings.
An alternative is to actually maintain your custom Java code this way on an ongoing basis: instead of creating the folder in WEB-INF inside the NSF, just create a folder on your hard drive that stores the source, then include that location as a linked source folder indefinitely. That way Designer can still find the source, but so can javadoc. NOTE: if you go this route, then you definitely need to use SCM. Because your source code no longer lives inside the NSF, providing the convenient container we're used to for getting the source code to other developers and ensuring inclusion in whatever backup schedule you use, the only place your source code now lives is on your local hard drive. So make sure you're regularly committing those files to Git / Subversion / Mercurial, etc., or, at the very least, storing them on some file server that is backed up regularly and, if applicable, accessible to all other members of the project team.
When you expand the net.focul.utilties package in Designer, you will see all the methods and properties. But when you click on one of the methods, you will see no source code.
So this is where javadoc fails to generate the documentation. I guess that the author of the application has not provided you with the source code. If you have the source somewhere, you can attach this code and then javadoc will be able to generate the documentation.
I ran into the same situation and I have found the most straightforward method is to export the source to an external folder and then use regular Eclipse to generate the JavaDoc. Not sure my process is any less of a hassle than Tim's suggestions, but for me it just feels less risky than trying to deal with the VFS vagaries.

How do you handle code promotion in a Sharepoint environment?

In a typical enterprise scenario with in-house development, you might have dev, staging, and production environments. You might use SVN to contain ongoing development work in a trunk, with patches being stored in branches, and your released code going into appropriately named tags. Migrating binaries from one environment to the next may be as simple as copying them to middle-ware servers, GAC'ing things that need to be GAC'ed, etc. In coordination with new revisions of binaries, databases are updated, usually by adding stored procedures, views, and adding/adjusting table schema.
In a Sharepoint environment, you might use a similar version control scheme. Custom code (assemblies) ends up in features that get installed either manually or via various setup programs. However, some of what needs to be promoted from dev to staging, and then onto production might be database content that supports the custom code bits.
If you've managed an enterprise Sharepoint environment, please share thoughts on how you manage promotion of code and content changes between environments, while protecting your work and your users, and keeping your sanity.
I assume that when you talk about database content you are referring to the actual contents contained in a site or a list.
Probably the best way to do this is to use the stsadm import and export commands to export and import content from one environment to another. (Don't use backup/restore when going from one environment to another.)
For any file changes (assemblies, aspx) you can use Features and then keep track of the installers. You would install the feature and do an upgrade to push changes.
There's no easy way to sync the data...you can use stsadm import/export commands as John pointed out. But this may not be straight-forward, especially if the servers are configured differently.
There's also Data Sync Studio product (http://www.simego.net/DataSync_Studio.aspx) you can try.
Depending on what form the database content takes, I would keep the creation of it in code so it's all in one place (your Visual Studio project) and can also be managed via source control. Deployment of the content could either be via a console application or even better feature receiver.
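As an illustration of the feature receiver approach, here is a minimal sketch against the SharePoint server object model; the list name, the seeded item and the assumption of a web-scoped feature are made up for the example:
using Microsoft.SharePoint;

// Sketch of a web-scoped feature receiver that provisions supporting
// content when the feature is activated. The "AppSettings" list and the
// seeded item are placeholders for whatever content your code needs.
public class ContentProvisioningReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        var web = properties.Feature.Parent as SPWeb;
        if (web == null)
        {
            return;
        }

        // Find the supporting list if it already exists; otherwise create it.
        SPList list = null;
        foreach (SPList candidate in web.Lists)
        {
            if (candidate.Title == "AppSettings")
            {
                list = candidate;
                break;
            }
        }
        if (list == null)
        {
            var listId = web.Lists.Add("AppSettings", "Settings used by custom code", SPListTemplateType.GenericList);
            list = web.Lists[listId];
        }

        // Seed the list with a default record.
        var item = list.Items.Add();
        item["Title"] = "DefaultConfiguration";
        item.Update();
    }
}
Because the receiver runs in every environment where the feature is activated, the same source-controlled code creates the content in dev, staging and production, keeping content deployment in step with assembly deployment.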
You might also like to read this blog post and look at the tool mentioned there for another approach.
The best resource I can point you to is Eric's paper:
http://msdn.microsoft.com/en-us/library/bb428899.aspx
I was part of a team working to better the story around development of WSS and MOSS solutions with TFS, but I don't know where that stands.

Resources