This question already has an answer here:
How to represent an attribute's data type as an array of objects on class diagram?
(1 answer)
Closed 2 years ago.
I am new to UML diagrams, so this may be a very basic question, but I can't find the answer anywhere.
There is a class Classroom that holds a collection of lecture times. Should LectureTime be its own class, or should the times be an attribute of Classroom?
Feel free to critique notation.
Edit: I have already seen this post and it has not helped. I would like to know if LectureTime should be a separate class.
Option 1:
Option 2:
You create a new class if it has more than a single attribute and/or additional operations (beyond plain getters/setters), or if you plan to add them in a later phase.
In your case lectureTime is obviously a simple type, so the second variant is to be preferred, except as noted above.
However, instead of the round parentheses you should use square brackets, like validLectureTimes[] or validLectureTimes[0..*], which are equivalent.
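To make the trade-off concrete, here is a rough Swift sketch of the two options. The start/end attributes and the overlaps operation are assumptions for illustration, not taken from the question's diagrams:
import Foundation

// Option 1: LectureTime as its own class. Worth it once it carries
// several attributes or operations of its own.
class LectureTime {
    var start: Date
    var end: Date
    init(start: Date, end: Date) {
        self.start = start
        self.end = end
    }
    func overlaps(_ other: LectureTime) -> Bool {
        start < other.end && other.start < end
    }
}

class Classroom {
    var validLectureTimes: [LectureTime] = [] // association to LectureTime
}

// Option 2: lecture times as a simple multi-valued attribute
// (validLectureTimes[0..*] in the diagram). Fine while each time
// remains a simple value.
class ClassroomSimple {
    var validLectureTimes: [Date] = []
}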
If LectureTime is an Entity, then yes. I guess it is, because of the relation you've added.
This question already has answers here:
NSExpression custom variables inside expression
(1 answer)
Swift - Resolving a math operation in a string
(4 answers)
Closed 7 years ago.
I have been wondering for far too long now: is it possible in any way to convert a String containing an equation to an Int?
For example, this is what I tried:
let op = "+" // 5, +, and 8 would be generated Ints/operators
let equation = "\(5) \(op) \(8)" // "5 + 8"
if let answer = Int(equation) { // always fails: Int(_:) only parses a plain integer
    print("\(answer)")
}
I've been trying to find a way to do this across various languages with no luck. Any help is appreciated.
You're looking for an arithmetic expression evaluator, which isn't trivial to write, and there is no built-in function that simply does it. You can try to find an open-source implementation of a simple one.
EDIT: Have a look at the Arithmetic Evaluation Rosetta Code page, which is probably the simplest version of what you want. There isn't a Swift version (yet), but you should be able to write one yourself by looking at the other languages.
Another EDIT: Check out this Code Golf thread, which features some of the shortest arithmetic expression evaluators in several languages (no Swift yet).
Last EDIT: Also have a look at the Shunting-yard algorithm and generalised Operator-precedence parsing. I'm actually very tempted now to write a simple one in Swift...
Post last EDIT: There exists a Swift library for that :D. Also: downvotes are for non-useful answers (see the hover text), not for when there's a better answer (see the comments).
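For simple, well-formed arithmetic like the question's "5 + 8", one option is Foundation's NSExpression, which can evaluate such a string directly. A minimal sketch, with the caveat that NSExpression is not a general math parser and raises an Objective-C exception on malformed input, so don't feed it arbitrary user input:
import Foundation

let equation = "5 + 8"
let expression = NSExpression(format: equation)
// expressionValue(with:context:) returns Any?; the arithmetic result
// arrives as an NSNumber, which bridges to Int.
if let answer = expression.expressionValue(with: nil, context: nil) as? Int {
    print(answer) // 13
}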
This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
Are there any better methods to do permutation of string?
I know recursion and can write programs like Fibonacci numbers and tree traversals, so I think I understand it, but when it comes to this question specifically I get stuck.
Please guide me on how to calculate all possible permutations of a string.
Here are good examples of different permutation algorithms, including a recursive one: http://www.bearcave.com/random_hacks/permute.html
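To make the recursive idea concrete, here is a short Swift sketch of the classic approach: fix each character in front and recursively permute the rest. Note that duplicate characters produce duplicate permutations; dedupe with a Set if needed.
func permutations(of s: String) -> [String] {
    // Base case: a string of length 0 or 1 has exactly one permutation.
    if s.count <= 1 { return [s] }
    var result: [String] = []
    for (i, ch) in s.enumerated() {
        // Remove the chosen character, permute the remainder, and
        // prepend the chosen character to each sub-permutation.
        var rest = s
        rest.remove(at: rest.index(rest.startIndex, offsetBy: i))
        for tail in permutations(of: rest) {
            result.append(String(ch) + tail)
        }
    }
    return result
}

print(permutations(of: "abc")) // ["abc", "acb", "bac", "bca", "cab", "cba"]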
This question already has answers here:
Closed 13 years ago.
Possible Duplicate:
Open file in ML(SMLNJ)
I have a string value like this:
"[(1,2,3),(2,3),(6,8)]" -> string
but I want to have these values in int type like this:
[(1,2,3),(2,3),(6,8)] -> (int list) list
What should I do? Is there a function that can help me, or do I have to do it myself?
This problem is perfectly suited to parsing combinators, which you can steal from my friend Greg Morrisett at Harvard.
If you want to understand the underlying ideas, read Graham Hutton's paper Higher-Order Functions for Parsing. If you want to know how to implement I/O in Standard ML, consult the TextIO module in the Standard Basis Library. If you want someone to write the code for you, you may have reached the wrong web site.
You can use stringint from the String module in Poplog ML.
See for example http://wwwcgi.rdg.ac.uk:8081/cgi-bin/cgiwrap/wsi14/poplog/ml/help/string#Character%20Transformation%20Functions
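If you would rather write it yourself, the split-then-convert structure is simple. Here is a sketch written in Swift purely for illustration (the helper name parseNestedIntList is made up); in Standard ML the same shape falls out of String.tokens and Int.fromString.
import Foundation

// Hypothetical helper: "[(1,2,3),(2,3),(6,8)]" -> [[1,2,3],[2,3],[6,8]]
func parseNestedIntList(_ input: String) -> [[Int]]? {
    // Strip the outer brackets, then split on the "),(" tuple boundaries.
    let trimmed = input.trimmingCharacters(in: CharacterSet(charactersIn: "[]"))
    let groups = trimmed.components(separatedBy: "),(")
    var result: [[Int]] = []
    for group in groups {
        // Strip any remaining parentheses, then convert each number.
        let body = group.trimmingCharacters(in: CharacterSet(charactersIn: "()"))
        var numbers: [Int] = []
        for token in body.split(separator: ",") {
            guard let n = Int(token) else { return nil } // not a number
            numbers.append(n)
        }
        result.append(numbers)
    }
    return result
}

print(parseNestedIntList("[(1,2,3),(2,3),(6,8)]") ?? "parse error")
// [[1, 2, 3], [2, 3], [6, 8]]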
Closed. This question is not about programming or software development. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 2 months ago.
Locked. This question and its answers are locked because the question is off-topic but has historical significance. It is not currently accepting new answers or interactions.
I had never thought about it until recently, but I'm not sure why we call strings strings. I am a .NET programmer, but I believe the concept of strings exists in virtually every programming language.
Outside of programming, I don't believe I've heard the word string used to describe words or letters. A quick Google of 'define: string' yields a bunch of definitions that have nothing to do with letters, words, or anything of the nature associated with programming.
My guess is that, back in the day, strings were really just arrays of characters of a particular length, often with a delimiting character at the end. But I don't see a natural transition from 'character array' to 'string'.
Can someone offer some insight into why we call strings strings?
My assumption has always been that the programming term originated from the following definition of the word "string" (from Merriam-Webster):
(1): a series of things arranged in or as if in a line <a string of cars> <a string of names>
(2): a sequence of like items (as bits, characters, or words)
Since a string in programming is simply an ordered sequence of characters, referring to this as a "string of characters" (or simply "string") seems like the most probable origin.
From this reference:
The 1971 OED (p. 3097) quotes an 1891 Century Dictionary on a source in the Milwaukee Sentinel of 11 Jan. 1898 (section 3, p. 1) to the effect that this is a compositor's term. Printers would paste up the text that they had generated in a long strip of characters. (Presumably, they were paid by the foot, not by the word!) The quote says that it was not unusual for compositors to create more than 1500 (characters?) per hour.
From searching through the ACM bibliography, it seems the word string acquired its meaning in computer science during the 1960s. Early on, a string was a general kind of sequence or list, e.g. A command language for handling strings of symbols from 1958.
This article explicitly mentions "character strings" in 1964.
Unfortunately I can't access the full texts, which are behind a toll booth.
I had guessed that "string" was in use by mathematicians long before its adoption in programming languages. Turing machines effectively operate on strings. Turing may not have used the term, but it is used everywhere in automata textbooks, going back decades.
The earliest reference I could find was a fragment in Google books of a 1944 article "Recursively enumerable sets of positive integers and their decision problems" by logician Emil Post in Bulletin of the AMS. Fortunately, AMS provides online archives of complete articles free for download. Here is a link: http://www.ams.org/journals/bull/1944-50-05/S0002-9904-1944-08111-1/S0002-9904-1944-08111-1.pdf
I think there is little doubt that he is using "string" in the conventional sense used in computer science. P. 286: "For working purposes, we introduce the letter b, and consider 'strings' of 1's and b's such as 11b1bb1. An operation on such strings such as 'b1bP produces P1bb1' we term a normal operation. This particular normal operation is applicable only to strings starting with b1b, and the derived string is then obtained from the given string by first removing the initial b1b, and then tacking on 1bb1 at the end. Thus b1bb becomes b1bb1."
I suspect it's because string originally meant just a sequence of data values: "I'll just string these together" etc. These values didn't have to be characters. One very common use for this general concept happened to be a sequence of characters, and this took over as the general meaning of the word.
The earliest reference I could find in computing is from March 1963's METEOR: A LISP Interpreter for String Transformations by Daniel G. Bobrow at MIT's AI Labs.
However, definition 15d. in the Oxford English Dictionary is:
Computing A linear sequence of records or data.
... and with a first quotation from a 1956 Journal of the Association for Computing Machinery:
Areas are set aside for shuttling strings of control fields back and forth until a completely sorted sequence is obtained.
This use naturally follows on from definition 15c.:
Math., etc. A sequence of symbols or linguistic elements in a definite order.
... and first used in Clarence Irving Lewis and Cooper Harold Langford's Symbolic Logic (1932):
Propositions are not strings of marks, or series of sounds, except incidentally.
This in turn follows on from many other, much earlier definitions for things in a line.
The word was originally used to differentiate between a set of values whose particular order doesn't matter (for instance, a set of random measurement samples) and one whose meaning is preserved only when the order is preserved. Originally a string could be a sequence of any kind of value, but since in the post-mainframe era a string of characters is by far the most common kind, the fact that the values are characters became the default.
A string is a sequence of discrete objects (usually char).
Given that, I would probably venture a guess that it may have to do with a metaphor related to "string of pearls". Each bead on the string is a single character.
It's called a string because it's actually an array of char-type elements.
That being said, the characters are "strung together" via this array, which turns them into a "string".