Mapbox locations that don't have edit available - rename

I’m trying to edit a few locations in Mapbox that don’t have the edit function available. These are mainly government tags. Some need to be moved to an accurate location, renamed, or deleted due to being redundant. How do I do this?

Related

NodaTime TimeZone Data files naming

It appears that the time zone database files used by NodaTime are named by year, with releases within the same year incrementing by a letter - e.g., "tzdb2019a.nzd" is current as I write this, the next release will be "tzdb2019b.nzd", and some of the previous versions may have been "tzdb2018a.nzd", "tzdb2018b.nzd", "tzdb2018c.nzd", etc.
However, I have not been able to find this naming convention formally documented anywhere, and assumptions make me nervous.
I expect the time zone data to change more often than my application is updated, so the application periodically checks for the latest data file at https://nodatime.org/tzdb/latest.txt and downloads a new file if the one in use is different. Eventually there will be several files available locally, and I want to be sure that I can sort them by name and reliably identify the most recent among those already downloaded.
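A minimal sketch of that update check in Python (the cache directory name is illustrative, and latest.txt is assumed to contain the URL of the current data file, as the flow above implies):

    import os
    import urllib.request

    LATEST_URL = "https://nodatime.org/tzdb/latest.txt"
    DATA_DIR = "tzdb-cache"  # illustrative local cache directory

    def update_tzdb():
        # latest.txt is assumed to hold the URL of the current .nzd file.
        with urllib.request.urlopen(LATEST_URL) as resp:
            latest = resp.read().decode("ascii").strip()
        name = latest.rsplit("/", 1)[-1]  # e.g. "tzdb2019a.nzd"
        path = os.path.join(DATA_DIR, name)
        if not os.path.exists(path):
            os.makedirs(DATA_DIR, exist_ok=True)
            urllib.request.urlretrieve(latest, path)
        return path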
That's what I anticipate, certainly. We use the versioning from the IANA time zone page, just with a tzdb prefix and a .nzd suffix. So far, that's been enough, and it has maintained the sort order.
It's possible that we might want to provide other files at some point, e.g. if there are no IANA changes for a long time (as if!) but the CLDR Windows mapping files change significantly. I don't have any concrete plans for what I'd do in that case, but I can imagine something like tzdb2019-2.nzd etc.
It's hard to suggest specific mitigations against this without knowing the exact reason for providing other files, but you could potentially only download files if they match a regex of tzdb\d{4}[a-z]+\.nzd.
I'd certainly communicate on the Noda Time discussion group before doing anything like this, so if you subscribe there you should get advance warning.
Another nasty possibility is that we might need more than 26 releases in a single calendar year... IANA says that would go 2020a...2020z, then 2020za...2020zz, etc. The above regex handles that situation, and it stays sortable in the normal way.
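Both points are easy to check mechanically; a short Python sketch using the suggested regex (with the dot escaped) over some example filenames:

    import re

    TZDB_RE = re.compile(r"tzdb\d{4}[a-z]+\.nzd")

    files = ["tzdb2018c.nzd", "tzdb2019a.nzd", "tzdb2020z.nzd",
             "tzdb2020za.nzd", "unrelated.txt"]
    releases = sorted(f for f in files if TZDB_RE.fullmatch(f))
    # Plain lexicographic order ranks the releases correctly, including
    # the multi-letter case: "tzdb2020z.nzd" < "tzdb2020za.nzd".
    print(releases[-1])  # -> tzdb2020za.nzd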
Another option I could provide is an XML or JSON format for "all releases" - so just like there's https://nodatime.org/tzdb/index.txt that just lists the files, I could provide https://nodatime.org/tzdb/index.json that lists the files and release dates. If you kept hold of that file along with the data, you'd always have more information. Let me know if that's of interest to you and I'll look into implementing it.
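If that hypothetical index.json existed, and assuming an entry format like {"file": ..., "released": ...} with ISO dates (purely my guess; nothing here is a published format or endpoint), consuming it might look like:

    import json
    import urllib.request

    # Hypothetical endpoint and format -- see the suggestion above.
    with urllib.request.urlopen("https://nodatime.org/tzdb/index.json") as resp:
        releases = json.load(resp)
    newest = max(releases, key=lambda r: r["released"])  # ISO dates sort as strings
    print(newest["file"], newest["released"])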

Are these tasks all doable in SharePoint 2010 [closed]

I'm not all that familiar with SharePoint. A client of ours asked if the following can be set up in SharePoint. I believe the following is all achievable; however, he had a few questions, which I've included at the bottom. Here's the description:
Client wants to catalog all of his images in SharePoint. These images are used for marketing, annual reports, etc. Here are some features they need:
We'll set up a subsite and make this guy an admin. He can edit a couple of group memberships to define who can have full access and who has read only.
Let him upload pictures…this is a photo library. Probably in a document library. He’ll need metatags, or custom fields. Description, expiration date, some others.
Give them some views grouping by some of this metadata. Like country.
Send out a weekly report of images nearing expiration.
When images have expired, delete them automatically
General search that will search all metatags and return hits
And here are his questions:
Couple of questions (not sure if these are possible):
- They would like to have a low quality image with a watermark over top of it for read only people. And they would have to click to ask for permission for the full version. The manager would get an email when this permission is asked for. Not sure what is the easiest way after that. Maybe the manager clicks something that will email the full image to that person. If this is doable, write up for me how it would work. So people with full permission see the full image, people with read only see the watermark version.
Is it possible to have it search by only one field, like country? Or give them the choice to do a general search for all.
In SharePoint, is it possible to show a thumbnail image in the list of pictures? So if they search and get 10 results, they see the thumbnail and they don't have to click on each one to even see a basic picture.
Are these all doable in sharepoint?
Thanks
Let him upload pictures…this is a photo library. Probably in a document library. He'll need metatags, or custom fields. Description, expiration date, some others.
Give them some views grouping by some of this metadata. Like country.
Send out a weekly report of images nearing expiration.
When images have expired, delete them automatically
General search that will search all metatags and return hits
Everything in the first section, SharePoint provides out of the box. The email may be the hardest part, but even then it is likely a simple timer job.
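The report logic itself is trivial; a generic sketch in plain Python (not SharePoint API code; the seven-day window and the shape of the items are illustrative):

    import datetime

    def expiring_soon(items, window_days=7):
        # items: iterable of (name, expiration_date) pairs.
        today = datetime.date.today()
        cutoff = today + datetime.timedelta(days=window_days)
        return [(name, exp) for name, exp in items if today <= exp <= cutoff]

    # The timer job would run this weekly and email the resulting list.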
a low quality image with a watermark over top of it for read only people
Showing different images based on user security may be tricky. Each item in a library can have its own security, but that is hard to maintain and can hurt performance, so I would recommend storing the images in two lists: one for the watermarked images and one for the full images. Linking the two is easy.
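Generating the watermarked copies would happen outside SharePoint; a sketch with Python and Pillow (the sizes, JPEG quality, and watermark text are illustrative):

    from PIL import Image, ImageDraw

    def make_preview(src_path, dest_path, text="PREVIEW ONLY"):
        img = Image.open(src_path).convert("RGB")
        img.thumbnail((640, 640))            # downscale in place, keeps aspect ratio
        ImageDraw.Draw(img).text((10, 10), text, fill=(255, 255, 255))
        img.save(dest_path, "JPEG", quality=40)  # deliberately low quality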
Is it possible to have it search by only one field, like country? Or give them the choice to do a general search for all.
Searching on one field and general search are also provided out of the box, and you can create custom pages with any kind of search you could need.
In SharePoint, is it possible to show a thumbnail image in the list of pictures? So if they search and get 10 results, they see the thumbnail and they don't have to click on each one to even see a basic picture.
I know the 2013 search provides a preview but I do not know about 2010.

Can a user have a private area on Perforce?

Can a user have a private area on a Perforce server?
This area would be used by the user to check in code/files for which the exact target branch is undecided.
Also, this area won't be shared, so the code would not be visible to any user other than the owner. Being invisible to other users is preferred, but not a must-have.
At our company, we have a Sandboxes portion of the depot, with individual users having directories underneath that. It doesn't use any permissions or anything technical from Perforce to enforce this, but it is generally understood that a user's sandbox area is just for that user, liable to have broken or partial code, and shouldn't be relied upon for anything (or even to exist in the same form from one day to the next).
We also have some sandboxes for specific projects that show up once in a while. They might make other users curious, depending on how the sandbox was named, but the same general idea applies - just with a few more users working in the area, so it's a little less likely to change arbitrarily.
One benefit of this is the ability to tell another user that they can pull some changes made in a sandbox, or look at them for ideas, if desired. You would lose that possibility with restrictive permissions.
Assuming you are the admin (or know the admin) of the Perforce server, you can use Perforce's protections table to accomplish that. With the help of the manual, what you need to do is the following (a sketch of the resulting entries appears after this list):
Set up the branches (if they don't exist yet)
Give the user all rights on his branch (list, read, write).
If you want to prohibit branching off of that location, specify the =branch right.
For all other users, you'd deny those rights (i.e. =read, =write).
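A minimal sketch of those protections entries, assuming a user bob and a depot path //depot/sandbox/bob/... (both hypothetical). Later lines override earlier ones, so the exclusions for everyone come first and bob's grant comes last:

    =read  user *   *  -//depot/sandbox/bob/...
    =write user *   *  -//depot/sandbox/bob/...
    write  user bob *   //depot/sandbox/bob/...

An =branch exclusion could be added in the same style if you also want to prohibit branching from that location.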
I want to add that this protections table can grow pretty big (depending on the number of Perforce users you have), and you need to be sure that the benefit is worth the hassle.
Sorry if I've misunderstood the question, but I think the functionality you require is called "Shelving".
In P4V, you can right-click on your pending changelist and select "Shelve". This has the effect of "checking in" in the sense that your code is safely held on the server, but it's unlike checking-in because your code doesn't go into any branch of your project.
Normally other users won't see the contents of your shelves, but, depending on permissions, it is possible for one user to browse another's workspace and see his shelved files. Even so, it will be clear to them that they're looking at shelved files.
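The command-line equivalents are straightforward (changelist number 1234 is illustrative):

    p4 shelve -c 1234     # park the files from pending changelist 1234 on the server
    p4 unshelve -s 1234   # restore the shelved files later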

Building a code asset library [closed]

I have been thinking about setting up some sort of library for all our internally developed software at my organisation. I would like to collect any ideas the good SO folk may have on this topic.
I figure, what is the point in instilling in developers the benefits of writing reusable code, if on the next project the first thing they do is file -> new, because they don't know what code is already out there to be reused?
As an added benefit, I think that just having a library like this would encourage developers to think more in terms of reusability when writing code.
I would like to keep this library as simple as possible, perhaps my only two requirements being:
Search facility
Usable for many types of components: assemblies, web services, etc
I see the basic information required on each asset/component to be (a concrete sketch follows the list):
Name & version
Description / purpose
Dependencies
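For concreteness, the per-asset record could be as small as this JSON (all names and values are made up for illustration):

    {
      "name": "Acme.Logging",
      "version": "2.1.0",
      "description": "Structured logging helpers shared across web projects",
      "dependencies": ["Acme.Core >= 1.4"]
    }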
Would you record any more information?
What would be the best platform for this, e.g. a wiki, forum, etc.?
What would make a software library like this successful vs unsuccessful?
All ideas are greatly appreciated.
Thanks
Edit:
Found these similar questions after posting:
How do you ensure code is reused correctly?
How do you foster the use of shared components in your organization?
It sounds like there is no central repository of code available at your organization. Depending on what you do, this could be because of compartmentalization of knowledge due to security restrictions, because external vendor code is included in some or all of the solutions, or because your company has not yet seen the benefits of getting people to reuse, refactor, and evangelize such a repository.
The solutions I have seen work at multiple corporations share common attributes and take a multi-pronged approach.
Buy-in at some level from management. Usually it's a CTO/CIO with whom the idea resonates; they'll say it's a good thing and won't give any money to fund it, but they won't stand in your way as long as they are aware that someone is going to champion the idea before starting to solicit code and consolidate it somewhere.
Some list of projects and the available collateral, in plain English. I've seen this on wikis, on SharePoint lists, and in text files within a source repository. All of them share the common attribute of some sort of front-end search server that allows full-text search over the descriptions of solutions.
Some common share or repository for the binaries and/or code. Oftentimes a large org has different authentication/authorization methods for many different environments, and it might not be practical (or logistically possible) to share a single source repository - don't get hung up on that aspect; just try to get to the point where there is a well-known share/directory/repository that works for your org.
Always make sure there is someone listed as a contact - no one ever takes code and runs it in production without at least talking to the previous owner of it - and if you don't have a person they can start asking questions of right away, they might just go ahead and hit file -> new.
Unsuccessful attributes I've seen?
Requiring N submissions per engineer per time period = lots of crap starts making its way in.
No method of rating/feedback. If there is no means to favorite/rate/give some indicator that allows the cream to rise to the top, you don't go back to search often, because you can't benefit from everyone else's slogging through the code that wasn't really very good.
Lack of a feedback/email link that sends questions about an asset directly to its author.
Lack of the ability to categorize organically. Every time there is some super-rigid, predetermined hierarchy or category list, everything ends up in "other". If you use tags or similar, you can avoid this.
Requiring a rigidly formatted design document before code is accepted. No one can ever agree on the "centralized" format of a design doc, and no one ever submits when this is required.
Just my thinking.

How to Prevent the Google Duplicate Content Problem | Multi-Site

I'm about to launch multi-domain affiliate sites which have one thing in common: content. Reading about the duplicate content problem and Google, I'm a little worried that the parent domain or sub-sites could get banned from the search engine for duplicated content.
If I have 100 sites with a similar look and feel and basically the same content with some minor element changes, how do I go about preventing a ban and getting these indexed correctly?
Should I just prevent sub-sites from being indexed completely with robots?
If so, how will people be able to find their site... I actually think the parent is the only one that should be indexed to avoid the problem, but I would love to hear other expert thoughts.
Google have recently released an update that allows you to include a link tag in the head of pages that use duplicated content, pointing to the original version. They're called canonical links, and they exist for the exact reason you mention: to be able to use duplicated content without penalisation.
For more information, look here:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
This doesn't mean that your sites with duplicated content will be ranked well for the duplicated content, but it does mean the original is "protected". For decent ranking on the duplicated sites, you will need to provide unique content.
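For reference, the tag is a single line in the head of each duplicate page, pointing at the version you want indexed (example.com is a placeholder):

    <link rel="canonical" href="http://www.example.com/original-article" />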
If I have 100 sites with a similar look and feel and basically the same content with some minor element changes, how do I go about preventing a ban and getting these indexed correctly?
Unfortunately for you, this is exactly what Google downgrades in its search listings, to make search results more relevant, and less rigged / gamed.
Fortunately for us (i.e. users of Google), their techniques generally work.
If you want 100s of sites to be properly ranked, you'll need to make sure they each have unique content.
You won't get banned straight away. You will have to be reported by a person.
I would suggest launching with the duplicate content and then iterating on it over time, creating unique content that is dispersed across your network. This will ensure that not all sites are spammy copies of each other, and will result in Google picking up the content as fresh.
I would say go ahead with it, but try to work in as much unique content as possible, especially where it matters most (page titles, headings, etc).
Even if the sites did get banned (more likely they would just have results omitted, but it is certainly possible they would be banned in your situation), you're basically at the same spot you would have been had you decided to "noindex" all the sites.
