The company I work for recently replaced some handheld scanners, and we need them to have the same functionality as the older ones. We used to use Intermec EasySet to create barcodes that performed the following actions:
Scan one of 20 barcodes (made in EasySet) that contain an address
Scan MANY MANY tracking barcodes on packages
Result: After scanning an address barcode, the scanner would set its current rule to append [tab] "." [tab] [Address] to ALL scans from then on, so that each scan would fill in the package info and then the destination address in the next field of their DB program. This would continue until another address barcode was scanned, at which point the scanner would scrap its rule and make a new one with the different address.
Now we have a Motorola (Symbol) DS9808.
It has rule programming, but not enough memory to store 20+ addresses. Using the Programming Manual, I can create a series of barcodes to scan one after another to set the rule I want; however, with the amount of traffic this scanner will see, that would be a huge waste of time.
QUESTION: Sorry it took so long to get to the real question. Does anyone know how to decode the barcodes in the Programming Manual so that I can create one barcode that erases all rules, then sets the default rule to be [data scanned] [tab] "." [tab] [FIXED ADDRESS]?
Do not hesitate to ask for clarification if any of this is not clear.
Thank you
It turns out the 123Scan software does have this functionality... it's just not a very user-friendly program. Somehow I didn't even see the button for this, hidden in the top right.
Thanks for the help everyone
I apologise if this question seems a bit vague, but I will try to explain what I require to the best of my ability.
I'm looking for a solution that involves either barcode or NFC scanners. These scanners will be assigned to a person, who will then need to tap/scan other tags to tell me where they are or what they're doing.
Ideally I need the following scans to gather the correct information:
A form of UserID (the colleagues I work with have access cards which have their NFC details on. There is no issue adding a barcode on the back of these that has the same detail)
A location scan (these barcodes/NFC tags would be planted on walls around the place so we can identify where they are).
These would need time stamps against them.
An example would be:
Username: Bob Marley | Location: Café | TimeStamp: 24/05/2022 11:36:23
OR
Username: Bob Marley | TimeStamp: 24/05/2022 11:36:23
Location: Café | TimeStamp: 24/05/2022 11:36:35
Does anyone know of a piece of hardware/software that would be able to collect this information? I really appreciate the support.
Thank you
Something like this would be ideal but with the capability of adding time stamps after each scan: Link
It sort of sounds like you are after a complete solution, rather than a way to interface with the scanners and collate the data into a specific format. I have used various barcode scanners and RFID readers, and they generally have one of two types of interface with the computer.
The more common type of interface, at least for the barcode scanners I have used, is for the data to be transferred as if it were being typed on a keyboard. This requires the software to look for keyboard input from the user. You could then validate the input, match a location scan to a user scan, and append the current date/time, as in the sketch below.
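For illustration, a minimal sketch of the keyboard-wedge approach as a console app. The "USER:" and "LOC:" prefixes are hypothetical; you would encode whatever actually suits your badges and wall tags:

    from datetime import datetime

    # Keyboard-wedge sketch: each scan arrives as one line of "typed"
    # input (most wedge scanners append an Enter keystroke).
    # The USER:/LOC: prefixes are assumptions, not a standard.
    current_user = None
    while True:
        scan = input().strip()
        stamp = datetime.now().strftime("%d/%m/%Y %H:%M:%S")
        if scan.startswith("USER:"):
            current_user = scan[len("USER:"):]
            print(f"Username: {current_user} | TimeStamp: {stamp}")
        elif scan.startswith("LOC:") and current_user:
            location = scan[len("LOC:"):]
            print(f"Username: {current_user} | Location: {location} | TimeStamp: {stamp}")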
The second interface I regularly come across is a serial connection, either a physical COM port or a virtual USB-COM port. The RFID scanners I have used fall into this category. One advantage I find here is that you can send a trigger command if you only want to scan after an external event has occurred.
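With a serial reader, the equivalent capture loop might look like this (a sketch using the pyserial package; the port name and baud rate are assumptions for your particular device):

    import serial  # pyserial
    from datetime import datetime

    # Serial sketch: "COM3" and 9600 baud are assumptions.
    # Some readers also accept a device-specific trigger command via
    # port.write() if you only want to scan on demand.
    with serial.Serial("COM3", 9600, timeout=5) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore").strip()
            if line:
                stamp = datetime.now().strftime("%d/%m/%Y %H:%M:%S")
                print(f"{line} | TimeStamp: {stamp}")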
This doesn't really answer your question, but maybe you can knock up some code along those lines to capture the two items of data and format them with the date stamp. Excel could also be configured to accept the keyboard entries and format them into cells.
I have recently observed an issue regarding my data in a column that I use to perform data validation on my spreadsheet.
There is nothing wrong with the formula, nor is there anything wrong with the use of data validation.
It should be looking for duplicate entries, which works quite fine.
The issue is that it no longer recognizes input made from a smartphone using the Excel app.
So what I did was retype the cell's text from my PC, and it worked perfectly.
Is there a way that I can continue using this technique (Data validation) without having to re-enter data from a PC in order for it to process?
Certainly! Yes, that is possible.
But... with all the possibilities in today's world, is your current strategy the one that is the best for you?
That is something I cannot answer for you.
That is something I cannot enumerate for you.
But... There is something that I can introduce to you.
PowerQuery
PowerQuery was a free add-in for Excel 2010 and 2013, and it has been baked directly into Excel for more than half a decade. So if you're using the mobile app, then you probably have a modern version of Excel with PowerQuery right at your fingertips.
Your first step is to determine how you want to make your data available for Excel to get. Go to the Data tab on the ribbon and review your options in the "Get External Data" group.
It doesn't matter if free data is your Creed and your most intimate moments are publicly available through your raw data feed. Or if paranoia is the reason why you constantly drive around the block scraping SSIDs before squirreling them away to SQL server for detailed analysis. Or if you're using a USB cable to transfer photos to your PC because your mom walked in on you without knocking and was so disgusted by what she saw on your desktop that you're banned from the family LAN... For life. None of that matters because Excel can connect to your data in so many ways that one of them will be perfect for you.
There is a sense of familiarity when importing your data into PowerQuery; it's not unlike following those timeless MS wizards. But nothing prepares you for the uncanny sensation of being dropped into the PowerQuery editor. It is simultaneously the same as Excel and different from Excel, and it may be the closest you ever come to visiting a parallel universe. Many of the same tools are available, but they behave just slightly differently. And in some cases, like the Text to Columns tool, it is light years ahead of Excel, and you will find yourself cursing at MS for not using it as a replacement for the old tool.
When you're done transforming your data, you'll have a tight, clean table. But the real prize is that you now have a fully automated pipeline from source to product.
I figured out that the phone user had included extra spaces when inputting the data.
So I used the TRIM() function, which takes care of the extra spaces between, before, or after each word, and that did the job.
So the root cause was additional spaces in the tested data that the validation did not recognize.
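For reference, TRIM() strips leading and trailing spaces and collapses runs of spaces between words to a single space. A rough Python equivalent, purely for illustration:

    def excel_trim(text: str) -> str:
        # Mimics Excel's TRIM(): strip leading/trailing whitespace and
        # collapse internal runs into single spaces.
        return " ".join(text.split())

    assert excel_trim("  Bob   Marley ") == "Bob Marley"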
I am trying to extract the state space from Amidar in order to hard-code an agent for some specific purposes. For example, I want the agent to go down whenever an enemy is 2 cells away, or to go up until it hits a wall and then go down again. However, I'm not quite sure how to extract the state space at a specific frame (or in general, for that matter), nor how to go about interpreting the output.
I have tried env.observation_space, but that just returns the frame size (i.e. Box(250,160,3)). Anyone have an idea of how to go about doing this?
Thank you.
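Not a definitive answer, but one common approach: Gym's Atari games also ship a "-ram" variant whose observation is the Atari 2600's 128 bytes of emulator RAM rather than pixels. The true game state (player position, enemy positions, etc.) lives somewhere in those bytes, though which byte means what has to be reverse-engineered per game. A minimal sketch, assuming the classic pre-0.26 gym API:

    import gym

    # The "-ram" variant returns the 128-byte emulator RAM as the observation.
    env = gym.make("Amidar-ram-v0")
    obs = env.reset()
    print(env.observation_space)  # e.g. Box(128,)

    done = False
    while not done:
        obs, reward, done, info = env.step(env.action_space.sample())
        # obs is a numpy array of 128 uint8s; diff successive observations
        # while moving to find the bytes that track the player and enemies.

If you want to stay with the pixel env, env.unwrapped.ale.getRAM() should expose the same bytes.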
I am working with a document where each row contains a description of a specific incident (fire incidents, where firefighters turn up and afterwards write a report).
The incidents/reports are written by several different people, so the language varies a lot, which makes it difficult to code for one specific context using a single word: ISNUMBER(SEARCH(substring;text))
Because even if the word is in the text piece, the context is not related to what I am trying to analyse.
I want to broaden my word search to be more flexible, by being able to "put" or "store" several different words/phrases in my "substring", to get closer to the specific context that I wish to analyse.
This way I can cover more data that is in fact related, but differs in how it is described in the individual incident reports.
I have tried to search for a solution myself, but am unsure on how to phrase this specific inquiry.
So far I have only been able to use the code piece above, which is a bit insufficient, when trying to comb through 2000 rows.
I hope that someone is able to help me!
Thank you
An example:
Store the following phrases: "stopped fire", "killed fire", "fire was put out" under the label "Killed fire",
so that when I search for "Killed fire", all of the above wordings are included in my search.
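In Excel itself you can get this effect by wrapping SEARCH over an array constant of phrases and ORing the results. Outside Excel, here is the same idea as a Python sketch, purely for illustration; the file name and column index are assumptions:

    import csv

    # Each label maps to the wordings that should count as a match for it.
    PHRASE_GROUPS = {
        "Killed fire": ["stopped fire", "killed fire", "fire was put out"],
    }

    def matches(label: str, text: str) -> bool:
        """True if any phrase in the group appears in the text."""
        text = text.lower()
        return any(phrase in text for phrase in PHRASE_GROUPS[label])

    # Scan a CSV export of the incident reports (path/column are assumed).
    with open("incidents.csv", newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if matches("Killed fire", row[0]):
                print(row[0])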
I have a column which is made up of addresses, as shown below.
Address
1 Reid Street, Manchester, M1 2DF
12 Borough Road, London, E12,2FH
15 Jones Street, Newcastle, Tyne & Wear, NE1 3DN
etc .. etc....
I want to split this into different columns to import into my SQL database. I have been trying to use FINDSTRING to separate on the commas, but am having trouble when some addresses have more "sections" than others. Any ideas what's the best way to go about this?
Many thanks
This is a requirements specification problem, not an implementation problem. The more you can afford to assume about the format of the addresses, the more detailed parsing you will be able to do; the other side of the same coin is that the less you assume about the structure of the addresses, the fewer incorrect parses you will be blamed for.
It is crucial to determine whether you will only need to process UK postal addresses, or whether worldwide addresses may occur.
Based on your examples, certain parts of the address seem to be always present, but please check this resource to determine whether they are really required in all UK postal addresses.
If you find a match between the depth of parsing that you need and the assumptions that you can safely make, you should be able to keep parsing by comma indexes (FINDSTRING): determine some components starting from the left and some starting from the right of the string, and keep all that remains as an unparsed body, as in the sketch below.
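To make that concrete, here is a minimal sketch of the left/right strategy, written in Python for readability rather than as SSIS code; the field names and the postcode pattern are assumptions:

    import re

    # The postcode pattern is a simplification of the real UK format.
    UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.IGNORECASE)

    def parse_address(raw: str) -> dict:
        parts = [p.strip() for p in raw.split(",") if p.strip()]
        result = {"Street": None, "Town": None, "Postcode": None, "Body": None}
        if parts:
            result["Street"] = parts.pop(0)       # from the left
        if parts and UK_POSTCODE.match(parts[-1]):
            result["Postcode"] = parts.pop()      # from the right
        if parts:
            # Next from the right; beware, this may grab a county
            # (e.g. "Tyne & Wear") when one is present.
            result["Town"] = parts.pop()
        result["Body"] = ", ".join(parts)         # keep the rest unparsed
        return result

    print(parse_address("1 Reid Street, Manchester, M1 2DF"))
    # {'Street': '1 Reid Street', 'Town': 'Manchester', 'Postcode': 'M1 2DF', 'Body': ''}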
It may also well happen that you will find that your current task is a mission impossible, especially in connection with international postal addresses. This is why most websites and other data collectors require the entry of postal address in an already parsed form by the user.
Excellent points raised by Hanika. Some of your parsing will depend on what your target destination looks like. As an ignorant yank, based on Hanika's link, I'd think your output would look something like
Addressee
Organisation
BuildingName
BuildingAddress
Locality
PostTown
Postcode
BasicsMet (boolean indicating whether minimum criteria for a good address has been met.)
In the US, just because an address could not be properly CASSed doesn't mean it couldn't be delivered; e.g., my grandparents-in-law live in a small enough town that specifying their name and city is sufficient for delivery, as the local postal officials know who they are. For bulk mailings, though, their address would not qualify for the bulk mailing rate and would default to first-class mailing. I assume a similar scenario exists for UK mail.
The general idea is for each row flowing through, you'll want to do your best to parse the data out into those buckets. The optimal solution for getting it "right" is to change the data entry method to validate and capture data into those discrete buckets. Since optimal never happens, it becomes your task to sort through the dross to find your gold.
Whilst you can write some fantastic expressions with FINDSTRING, I'd advise against it in this case, as the maintenance alone will drive you mad. Instead, add a Script Transformation and build your parsing logic in .NET (VB or C#). There will then be a cycle of running data through your transformation and having someone eyeball the results. If you find a new scenario, you go back and adjust your business rules. It's ugly, it's iterative, and it's prone to producing results that a human wouldn't have.
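As a rough illustration of the bucket-plus-flag idea (in Python for brevity rather than .NET; the field names follow the list above, and the minimum criteria are an assumption you would replace with real business rules):

    import re

    UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.IGNORECASE)

    def to_buckets(parsed: dict) -> dict:
        record = {
            "Addressee": parsed.get("Addressee"),
            "Organisation": parsed.get("Organisation"),
            "BuildingName": parsed.get("BuildingName"),
            "BuildingAddress": parsed.get("Street"),
            "Locality": parsed.get("Body") or None,
            "PostTown": parsed.get("Town"),
            "Postcode": parsed.get("Postcode"),
        }
        # BasicsMet: an assumed minimum -- a street line, a post town,
        # and a plausible postcode. Adjust as the business rules evolve.
        record["BasicsMet"] = bool(
            record["BuildingAddress"]
            and record["PostTown"]
            and record["Postcode"]
            and UK_POSTCODE.match(record["Postcode"])
        )
        return record

    print(to_buckets({"Street": "1 Reid Street", "Town": "Manchester", "Postcode": "M1 2DF"}))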
Alternatives to rolling your own address standardisation logic:
Buy it. Eventually your business needs outpace your ability to cope with constantly changing business rules. There are plenty of vendors out there, but I'm only familiar with US-based ones.
Upgrade to SQL Server 2012 to use DQS (Data Quality Services). You'll probably still need to buy a product to build out your knowledge base, but you could offload the business-rule-making task to a domain expert ("Hey you, you make peanuts an hour. Make sure all the addresses coming out of this look like addresses" was how they covered this at the beginning of one of my jobs).