I am trying to run a Kohana 3 project on a WAMP server, but I am getting the following error:
Kohana_Exception [ 0 ]: Directory APPPATH\cache must be writable
Previously this was working fine. Please help.
You must make your cache folder writable.
Bear in mind that in WAMP the PHP user is usually 'nobody', which doesn't have access to your folders, so you must set your folders to be writable.
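For example, on a Unix-like setup this would be (a sketch only; 0777 is fine for local development but too permissive for production):
$ chmod -R 0777 application/cache
On WAMP/Windows there is no chmod; instead, grant the web server user Modify rights via the folder's Properties > Security tab.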
Check whether the directory actually exists (application/cache); this folder ships empty with Kohana, and some version control systems and other software delete or ignore empty folders.
If it does not exist, create the folder and put a blank file in it (e.g. empty.txt). If the error persists, set the appropriate permissions.
Oh yeah, I read an answer from someone yesterday but I only figured it out today. What I did was this:
Go inside application/
Create a new folder cache/
Go inside cache/
Create an empty text file. (I created empty.txt)
That's it. Hope it helps.
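For reference, the same steps from a shell, run from the Kohana project root (a sketch; empty.txt is just a placeholder name):
$ mkdir -p application/cache
$ touch application/cache/empty.txt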
Scenario:
I want to create a file field, attach it to a content type, and store the uploaded file in the private file system.
I am using Drupal 8.6, PHP 7.1, and Linux.
My default file system is public, but for that specific file field I want to store uploads in a private folder. So I created a folder outside my document root (e.g. /var/www/private) and set that path in settings.php.
The purpose of this field is to allow certain logged-in users to download the file while restricting others.
I googled a lot; there are plenty of suggestions, but none of them worked or pinpointed the issue I am facing. I set the file directory to "documents/[date:custom:Y]-[date:custom:m]".
Now when I try to upload a file, I get an error saying the folder "documents/2018-09" can't be created, and the upload fails. It certainly looks like a permission issue; I gave the "private" folder "rwxrwxr-x" permissions, but it's not working. Strangely, it works on a Windows system.
Could anyone suggest how to fix this? What's going wrong?
Thanks in advance!
The permissions you defined may be fine. The problem may be that the private folder belongs neither to www-data nor to its group. Check the real owner of the folder and set it to www-data. This should do the trick.
What I actually found out is that the web server didn't have permission to create subdirectories in the private folder.
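For example, something like this should work (a sketch assuming a Debian/Ubuntu-style www-data user; on Red Hat systems the user is typically apache):
$ ls -ld /var/www/private
$ sudo chown -R www-data:www-data /var/www/private
$ sudo chmod -R u+rwX /var/www/private
The first command shows the current owner and group; the chown hands the folder to the web server user so it can create subdirectories like documents/2018-09, and u+rwX ensures the owner can write to and traverse every directory.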
I have been encountering the following issue and I'm going a bit crazy. I hope you can help me:
In the back office, I can see the media folders, but when I click on one, it doesn't show me the contents (files).
The strange thing is that this only happens in some folders, and there are no restrictions or rules applied on the media browser.
Some things to take into consideration:
MODX version 2.3 (I already upgraded, but it's still not working)
FTP folder permissions are all 777
The media source file paths are all correct
I cleared the cache and reset permissions
The error.log file isn't showing any visible error
My ACL is correct; it has administrator permissions
Please advise.
Many thanks!!
UPDATE:
The directories were corrupted. I created new directories and renamed the old ones.
Not sure if this will fix your particular issue, but I noticed in the media source setup you have a forward slash (/) before your paths as well as after. I think in general it's good practice to have the same number of forward slashes as directories listed. I usually include the / after the last directory, but DON'T include one before "assets".
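For instance, a media source path following that convention would look like this (illustrative values only):
assets/images/ (works: no leading slash, trailing slash kept)
/assets/images (avoid: leading slash, no trailing slash)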
I don't know if this is the "official" way to do things, but I have had many issues uploading and using the media browser if I don't set it up as above.
I want to freeze a folder on Red Hat so that nobody (not even root) can add files to it or change the files that already exist in it. I tried making the folder read-only, but that does not work; the root user can still add files as before. Please help me solve this problem.
Create a filesystem in a file (e.g. an ISO image) containing the files you want in the directory, then use a loopback mount to mount it read-only onto the directory.
Anybody who tries to modify the filesystem normally (including root) will get a "read-only filesystem" error.
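A minimal sketch, assuming the folder to freeze is /srv/frozen and that genisoimage (named mkisofs on some systems) is installed:
$ genisoimage -R -o /root/frozen.iso /srv/frozen
$ mount -o loop,ro /root/frozen.iso /srv/frozen
The first command snapshots the directory's current contents into an ISO image (-R preserves permissions and names via Rock Ridge); the second mounts that image read-only over the original directory.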
No. By design, in Linux, root ignores existing permissions on all entities. However, what you can do is encrypt files so that they can't be read and can't be modified by those who don't know the key. You can't prevent new files from being added, but with both encryption and decryption keys private, you can easily verify if any file is valid.
This also means you can't have either key on your computer!
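For example, with GnuPG you could sign files on a separate trusted machine and verify them anywhere; the key name and file names below are made up for illustration:
$ gpg --detach-sign --local-user archive-key report.pdf
$ gpg --verify report.pdf.sig report.pdf
The first command (run on the machine that holds the private key) produces report.pdf.sig; the second checks the file against its signature, so any modified file, or a new file without a valid signature, is detected.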
Just getting my head around the new Azure Web Sites feature and hitting my first obstacle. I'm deploying a PHP site which writes cache data to the file system, but the app is throwing an error because the folder it wants to write to does not have write permission. Is it possible to set permissions on folders, or is this a no-no?
I can probably work round this but would like to know if it's possible.
Folder permissions cannot be set/customized. This means whatever location your app writes to should be under your site root.
Your site can only write to locations under C:\DWASFiles\Sites\[siteName]\VirtualDirectory0 and to the %TEMP% folder.
Two caveats here:
Stuff can't be written directly under VirtualDirectory0; you have to create a subfolder there and place your files in that subfolder
The %TEMP% folder really is temporary! If your site instance goes down for any reason and is brought back up somewhere else then everything in your %TEMP% folder will be gone. Use it only for files that really are temporary.
Is the folder that the app is trying to write to under the site's folder?
It's my understanding that folder permissions cannot be set/changed. But I haven't seen anything from Microsoft that definitively says "yes" or "no" to that.
It should be possible using Web Deploy.
However, I don't think there is a way to do it without manually setting up the Web Deploy package, as described in this post: http://blogs.msdn.com/b/azureappgallery/archive/2013/04/03/set-file-folder-permissions-for-your-content-on-azure-website-using-web-deployable-package.aspx.
I am having frequent problems with my web hosting (it's shared).
I am not able to delete or change permissions for a particular directory. The response is:
Cannot delete. Directory may not be empty
I checked the permissions and they look OK. There are hundreds of files in this folder which I don't want.
I contacted support and they solved it, saying it was a permission issue, but it has reappeared. Any suggestions?
The server is Linux.
You can't rmdir a directory with files in it. You must first rm all files and subdirectories. Many times, the easiest solution is:
$ rm -rf old_directory
It's entirely possible that some of the files or subdirectories have permission limitations that might prevent them from being removed. Occasionally, this can be solved with:
$ chmod -R +w old_directory
But I suspect that's what your support people did earlier.
This could also be because your FTP client might not be showing hidden files (like cache files, or any other hidden files your application might create), and those hidden files are preventing you from deleting the directory. Though in your case I am not sure if this is the cause; it could be a permission issue with your hosting provider, e.g. the web server running as another user (like apache or www) combined with your directories having global write permissions.
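If you do have shell access, a quick way to check for hidden entries (a sketch, reusing the directory name from above):
$ ls -la old_directory
Dotfiles will show up in that listing even when the FTP client hides them.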
I assume that's a response from an FTP server?
Usually, a message from an FTP server means what it says. If it says the directory is not empty, there might be files in the directory that you cannot see, which may be one of the following:
Your PHP/JSP/ASP/whatever scripts may run under a different user account, creating files which you may not be able to see or delete
Does your hosting's web interface run under your FTP account? There might be conflicting permissions if you manage some files from the web interface and then later via FTP.
Hosting server or operating system files created unintentionally, e.g. from the hosting's web interface
If it comes from a script, write a one-time throwaway script that deletes the files and the directory, then upload and execute it.
And just to be sure: some FTP servers don't support direct directory deletion; you need to delete all the files first. Is that the case?