Azure does not update files on a web server (Pyramid WSGI app)

I have a Pyramid application project. I store it in git and pull the branch to the server when I need to update. Until now I was working on Koding, but lately I decided to check out Azure and its developer benefits.
After I created an Ubuntu Server virtual machine (which is actually what runs under Koding), I downloaded my project using git pull, but forgot to switch to the branch I'm currently working on. So I checked it out, but the server still shows me the old page (as if I had never checked out the other branch). I then looked over SFTP, and the files show that they have been updated.
Why am I still seeing the old page?

Now I know the reason why! (At least I think so, but please correct me if I'm wrong.)
I noticed that there was a .pyc file for every .py file, and those are "compiled" (a bit of a simplification?) Python files, as I understand it. It seemed to me that they were not recompiled on app launch, but rather when setup.py was run; the edit dates suggest that.
So the reason I didn't see the changes I made in the code was that the HTTP server was using the old "compiled" files instead of the source files! Is that normal/expected behaviour? I don't know. There are many other questions now, but the main question was answered, so I'll mark this as the answer until someone gives a better one.
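If stale bytecode really is the culprit, one way to force a refresh is to delete the compiled files and restart the application so it re-imports everything from the current source. A minimal sketch, assuming a placeholder project path:

from pathlib import Path
import shutil

PROJECT_ROOT = Path("/path/to/your/pyramid/project")  # placeholder

# Remove loose .pyc files (Python 2 style, sitting next to the .py sources).
for pyc in list(PROJECT_ROOT.rglob("*.pyc")):
    pyc.unlink()

# Remove __pycache__ directories (Python 3 style).
for cache_dir in list(PROJECT_ROOT.rglob("__pycache__")):
    shutil.rmtree(cache_dir)

# Then restart the WSGI server (e.g. pserve) so the modules are
# re-imported and fresh bytecode is generated from the current source.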

Related

New developer process for Hybris

I am learning Hybris. I have successfully installed Hybris, and there are lots and lots of blogs out there that talk about getting the core Hybris install working with your own custom modules to make changes to, such as this one:
http://javainsimpleway.com/hybris-b2b-installation/
In the blog above the author creates a mystore. The question I have is this: once you have this all set up and have made changes to the mystore modules, and you want to get those changes onto a new developer's machine (or a production machine), how do you do it?
What I have tried, which does not work, is this:
zipped up the bin/custom/mystore, config/local.properties, and localextensions.xml
followed his steps 1 through 4
unzipped the files on the new machine
jumped down to step 12 where he does an ant clean all initialize
One difference between his process and mine is that I am adding some addons. It is my impression that all those changes happen within custom/mystore, but to be safe, between my steps 3 and 4 I reran ant addoninstall for all four addons.
With the process I have documented, SmartEdit was not working, and I found SAP's documentation about running ant npminstall because Hybris does not include npm-related third-party JavaScript libraries. This is blowing up when I go to run ant npminstall.
I really feel like I am trying to reinvent the wheel here. I would imagine what I am trying to do is very common to any Hybris team, but I cannot find documentation on how to do it. Does anyone know of a blog out there that talks about how to migrate the source from one machine to another?
there are lots and lots of blogs out there that talk about getting the core Hybris install working with your own custom modules to make changes to
Although they may be helpful, I would suggest you stick to the official Hybris documentation (e.g. https://help.sap.com/viewer/4c33bf189ab9409e84e589295c36d96e/1905/en-US/8acc8a5a86691014a20781b3f738213e.html), which is quite rich.
Once you have this all set up and have made changes to the mystore modules, and you want to get those changes onto a new developer's machine (or a production machine), how do you do it?
For production deployment, please go through https://wiki.hybris.com/display/hybrisALF/Ant+Production+for+Continuous+Integration
However, for simply copying things from one machine to another, the artefacts you have already copied to the target machine (after installing Hybris on it) are the right ones. If you are working in a team, you typically set up an SCM code repository (e.g. Git, SVN), which makes this easier.
It is my impression that all those changes happen within custom/mystore
This is a wrong impression. When you run addoninstall, it creates/updates the project.properties file in the addon, not in your custom/mystore. So, if the addon is part of the code repository (which is typically not the case unless it is a custom addon), anyone pulling your code onto their machine will automatically get the addon's project.properties and therefore will not need to run addoninstall on their machine; otherwise, they will need to run addoninstall themselves. A workaround is to copy the content of the addon's project.properties into local.properties (thus getting the changes onto the target machine when local.properties is copied over).
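If you go the workaround route, the copy step can be scripted. A minimal sketch with placeholder paths (adjust them to your installation layout); it only appends keys that are not already present in local.properties:

from pathlib import Path

# Placeholder paths -- adjust to your Hybris installation layout.
ADDON_PROPS = Path("hybris/bin/custom/someaddon/project.properties")
LOCAL_PROPS = Path("hybris/config/local.properties")

addon_lines = ADDON_PROPS.read_text(encoding="utf-8").splitlines()
existing = LOCAL_PROPS.read_text(encoding="utf-8")

with LOCAL_PROPS.open("a", encoding="utf-8") as out:
    out.write("\n# --- copied from {} ---\n".format(ADDON_PROPS))
    for line in addon_lines:
        stripped = line.strip()
        # Skip blanks and comments.
        if not stripped or stripped.startswith("#"):
            continue
        # Naive duplicate check on the property key.
        key = stripped.split("=", 1)[0].strip()
        if key and key + "=" not in existing:
            out.write(line + "\n")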
This is blowing up when I go to run ant npminstall.
Make sure to run ant npminstall as an admin user. Please check https://answers.sap.com/questions/12771768/smart-edit-unable-to-find-local-grunt.html for another option.

What to do if file content is the same and nothing else has changed?

I'm working in an environment where the files live on the Windows side (because I like to work with PhpStorm from there) and on the Linux side (because I have a virtual machine running CentOS 6.6, which is where the LAMP environment is). The PhpStorm project is a remote-files one. These are the steps I followed to create the project:
Clone the repo at Windows directory
Copy the files to Linux using WinSCP
Create the remote project using PhpStorm; this step copies all the files from Linux back to Windows.
I'm using SmartGit to manage my repos and do Git/SVN tasks (the easy way). But surprise: the files haven't been changed, yet SmartGit says they have. How is that possible if the only steps I took are the ones described above? Even when I open a file, SmartGit says that the content is the same. So, how do I avoid this behavior? How do I avoid committing all the files? If I have already made a commit, how do I undo it? It's not the first time I've run into this problem, but before, the repos were mine and I could afford to lose everything; now it's a serious project and I have to take care not to damage others' work. Any advice? Help? What can you do in this case?

Sync Android Studio projects across multiple workstations

I want to be able to work across multiple workstations, jumping from one to the other without having to worry about committing.
I have personal and work Windows desktops and a Mac OS X laptop. At the moment, I point my project to a cloud directory and have the local install of Android Studio pointing to a Gradle offline cache in another cloud directory. This keeps failing, as it tells me that the path to Gradle is invalid, which I understand, because Gradle is referenced at different locations on different machines (given the differing file systems on Mac OS X and Windows 7).
Edit: When I try to open the project, it brings up the "Import Project from Gradle" screen, which gives me the option to select "Use local gradle distribution" and choose the Gradle home directory. I pointed it to the cache directory, and it tells me:
Cannot Save Settings
Gradle location is incorrect.
Location:C:/Users/Username/.gradle
All my research (including the answers here and here) suggests that VCS is the way to go. However, I don't see this as a solution to my problem. I'm not looking for version control; I'm looking to transition seamlessly across workstations. Of course I will still use a version control system for saving a working version of my code or sharing it with other developers, but there has to be a better way when I simply want to keep all workstations in sync.
I come from web development, where I synchronised a local AMPPS environment across multiple computers without any issue. This meant I could transition between my personal desktop, laptop, and work desktop instantly. It frustrates me to have to remember to commit every time I move around. If I have to do this 20 times a day, and it takes about a minute each time, that's 20 minutes that could have been spent writing a couple of functions. And what if I forget to commit before I head to work, or home? That would be a day wasted, because I wouldn't actually have the current, up-to-date code...
So the question remains: is there a way to instantly synchronise Android Studio projects? How do I keep my whole code base (including Gradle) in sync?
OK, thanks to the comments above, which pointed me in the right direction.
Android Studio creates some local files that are specific to the machine you are on. Following this principle, to sync the "source" files (files that are specific to your application only), you must ignore all these local files. This is similar to what you would store on GitHub. I followed the answer to this question to apply the ignore rules.
Having ignored all the "local files", when I create a new project, the source files are synchronised across all my workstations. In order to establish a local version, I need to "import" the project first. Once it has been imported, "local files" will be created for that particular machine. From then on, I can "open" the project locally.
To summarise:
Set your sync to ignore files as per .gitignore, or refer to this question (a sketch of such a sync follows below).
Create a project on one of your workstations and save it in the cloud.
When you are ready to work on the project for the first time on another workstation, "import" the project.
Once the project has been imported, all local files should have been created.
From then on, use the "open" option to continue working on the project.
I hope this helps somebody else and saves hours of googling.
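For what it's worth, here is a minimal sketch of that "copy the source, skip the local files" idea, with placeholder paths and the usual Android Studio ignore patterns; in practice your cloud client plus ignore rules does the same job:

import shutil
from pathlib import Path

# Placeholder paths -- adjust to your project and cloud folder.
PROJECT_DIR = Path("~/AndroidStudioProjects/MyApp").expanduser()
CLOUD_DIR = Path("~/Dropbox/MyApp").expanduser()

# Machine-specific artefacts that should not travel between machines,
# mirroring a typical Android Studio .gitignore.
IGNORE = shutil.ignore_patterns(
    ".idea", "*.iml", ".gradle", "local.properties", "build", "captures", ".DS_Store"
)

# dirs_exist_ok requires Python 3.8+.
shutil.copytree(PROJECT_DIR, CLOUD_DIR, ignore=IGNORE, dirs_exist_ok=True)
print("Copied source files from {} to {}".format(PROJECT_DIR, CLOUD_DIR))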

About updating a node-webkit app

I want to set up auto-updates for my apps before I release. I'm a budding programmer, so when I looked into node-webkit-updater I was pretty confused; it seems under-documented to me. Can someone explain the overall update mechanism that it helps implement?
As an alternative to node-webkit-updater, I was thinking of creating my own update system. I kind of like how Apple handles extension updates, and I was thinking about replicating it. This would involve putting a JSON/XML manifest file on Amazon S3 along with the latest versions of the app for all platforms. The app checks the file at startup and replaces itself with the new version.
Does the latter sound plausible? Am I better off going with node-webkit-updater? If so, can someone explain it to me please? My app is a Mac + Windows project.
This is what we did (a rough sketch of the flow follows the steps):
The first script of the page checks a custom "manifest" (a .txt file) on the server, which contains some arbitrary text, e.g. a version number.
If this value differs from the local copy of the manifest, download a .zip file from the server. (The zip contains the latest nw.js website; you could have a separate one for each platform.)
Unzip it into a local directory (we use the 7za command-line utility).
Set window.location.href to the index.html in that local directory.
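For illustration only, here is roughly the same check-and-replace logic written as Python (the real thing runs as JavaScript inside the nw.js page); the URLs and paths are placeholders:

import urllib.request
import zipfile
from pathlib import Path

MANIFEST_URL = "https://example.com/app/version.txt"  # placeholder
BUNDLE_URL = "https://example.com/app/latest.zip"     # placeholder
LOCAL_DIR = Path("app_current")
LOCAL_MANIFEST = LOCAL_DIR / "version.txt"

def remote_version():
    with urllib.request.urlopen(MANIFEST_URL) as resp:
        return resp.read().decode("utf-8").strip()

def local_version():
    return LOCAL_MANIFEST.read_text().strip() if LOCAL_MANIFEST.exists() else ""

def update_if_needed():
    new_version = remote_version()
    if new_version == local_version():
        return  # up to date; just load the existing index.html
    # Download and unpack the new bundle over the local copy.
    zip_path, _ = urllib.request.urlretrieve(BUNDLE_URL, "latest.zip")
    with zipfile.ZipFile(zip_path) as bundle:
        bundle.extractall(LOCAL_DIR)
    LOCAL_MANIFEST.write_text(new_version)
    # The app would then point window.location.href at LOCAL_DIR/index.html.

update_if_needed()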
I know this is an old question, but here is the answer :)
https://www.npmjs.org/package/node-webkit-updater

ownCloud "Remove All Files" prompt

I have an ownCloud server and the ownCloud desktop client. What I want to do is to be able to delete things on the server side and have them automatically deleted from the PC. The problem is that when files are deleted from the server, the ownCloud client displays a "Remove All Files?" warning, with the choice to remove all files or to keep them. Is there a way to stop the prompt from coming up and automatically remove all files?
In version 2.2.3 (and maybe earlier), you can change the configuration file to disable the prompt.
See the code where the prompt is invoked and the code showing the configuration file property.
If you edit (on Windows) c:\Users\myuser\AppData\Owncloud\owncloud.cfg and add the following under the [General] section, you will no longer get the prompt.
promptDeleteAllFiles=false
The short answer: You cannot change this currently.
The long answer: The dialog was added as a safeguard because there were cases where you could lose all your files unintentionally, e.g. if your admin re-created your account and left it empty. The client would assume the files were gone and, since it could not know better, would replicate the removal locally. The code is still there today just to be safe.
If you are fearless, you can patch Folder::slotAboutToRemoveAllFiles(). Alternatively, you could open a bug report so we can solve this for everyone. What is your motivation to be able to do this without a prompt?
PS: The sources can be found on GitHub. URL and build instructions at http://doc.owncloud.org/desktop/1.5/building.html.
I have a script that processes the files someone drops into ownCloud and then moves them to their final storage place. However, this prompt stops the client from syncing until I manually log in to acknowledge it... I guess I will learn how to patch this. Dropbox doesn't do this; Google Drive doesn't do this. But since I can't use cloud services (compliance issues), I have to use this solution until I can build a new secure upload mechanism.
