My Tree definition is
data Tree = Leaf Integer | Node Tree Tree
This is a binary tree, with only values at the leaves.
I am given the following definition for balanced trees:
We say that a tree is balanced if the number of leaves in the left and right subtree of every node differs by at most one, with leaves themselves being trivially balanced.
I try to create a balanced tree as follows:
t :: Tree
t = Node (Node (Node (Leaf 1) (Leaf 2)) (Node (Leaf 3) (Leaf 4))) (Node (Node (Leaf 5) (Leaf 6)) (Node (Leaf 7) (Leaf 8)))
Can you please let me know if t above is a balanced tree with values only at the leaves?
Another question: how do I create another tree with values only at the leaves that is unbalanced as per the above definition?
Thanks
Can you please let me know if t above is a balanced tree with values only at the leaves?
I can, but I won't. However, I hope I can guide you through the process of writing a function that will determine whether a given tree is balanced.
The following is certainly not the most efficient way to do it (see the bottom for a hint about that), but it is a very modular way. It's also a good example of the "computation by transformation" approach that functional programming (and especially lazy functional programming) encourages. It seems pretty clear to me that the first question to ask is "how many leaves descend from each node?" There's no way for us to write down the answers directly in the tree, but we can make a new tree that has the answers:
data CountedTree = CLeaf Integer | CNode Integer CountedTree CountedTree
Each node of a CountedTree has an integer field indicating how many leaves descend from it.
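For example, the small tree Node (Node (Leaf 1) (Leaf 2)) (Leaf 3) would be annotated as something like this (assuming leaves keep their values and each CNode stores its leaf count):
CNode 3 (CNode 2 (CLeaf 1) (CLeaf 2)) (CLeaf 3)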
You should be able to write a function that reads off the total number of leaves from a CountedTree, whether it's a CLeaf or a CNode:
getSize :: CountedTree -> Integer
The next step is to determine whether a CountedTree is balanced. Here's a skeleton:
countedBalanced :: CountedTree -> Bool
countedBalanced (CLeaf _) = ?
countedBalanced (CNode _ left right)
  = ?? && ?? && abs (getSize left - getSize right) <= 1
I've left the first step for last: convert a Tree into a CountedTree:
countTree :: Tree -> CountedTree
And finally you can wrap it all up:
balanced :: Tree -> Bool
balanced t = ?? (?? t)
Now it turns out that you don't actually have to copy and annotate the tree to figure out whether or not it's balanced. You can do it much more directly. This is a much more efficient approach, but a somewhat less modular one. I'll give you the relevant types, and you can fill in the function.
-- The balance status of a tree. Either it's
-- unbalanced, or it's balanced and we store
-- its total number of leaves.
data Balance = Unbalanced | Balanced Integer
getBalance :: Tree -> Balance
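In case you want to check your own attempt afterwards, one possible shape for this function (a sketch, certainly not the only way to write it) is:
getBalance :: Tree -> Balance
getBalance (Leaf _) = Balanced 1
getBalance (Node l r) =
  case (getBalance l, getBalance r) of
    (Balanced m, Balanced n) | abs (m - n) <= 1 -> Balanced (m + n)
    _ -> Unbalanced
balanced then only needs to check whether the result is a Balanced value.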
If I understand correctly, modifying (insertion or deletion) a Binary Search Tree in Haskell requires copying the whole tree, practically making it O(n). Is there a way to implement it in O(log n), or would the compiler optimize the O(n) insertion down to O(log n) "under the hood"?
If I understand correctly, modifying (insertion or deletion) a Binary Search Tree in Haskell requires copying the whole tree, practically making it O(n).
You do not need to copy the entire tree. Indeed, let us work with a simple unbalanced binary search tree, like:
data Tree a = Node (Tree a) a (Tree a) | Empty deriving (Eq, Show)
then we can insert a value with:
insertIn :: Ord a => a -> Tree a -> Tree a
insertIn x = go
  where
    go Empty = Node Empty x Empty
    go n@(Node l v r)
      | x < v     = Node (go l) v r
      | x > v     = Node l v (go r)
      | otherwise = n
Here we reuse r when we construct Node (go l) v r, and we reuse l when we construct Node l v (go r). For each node we visit, we create a new node in which one of the two subtrees is taken unchanged from the original tree, so the new tree points to the same subtree objects as the original tree.
In this example, the number of new nodes thus scales as O(d), with d the depth of the tree. If the tree is fairly balanced, then insertion takes O(log n).
Of course you can improve the algorithm and define an AVL tree or a red-black tree by storing extra balancing information in the nodes; in that case you can guarantee O(log n) insertion time.
The fact that all data is immutable here helps to reuse parts of the tree: we know that l and r cannot change, so the two trees will share a large number of nodes, which reduces the memory needed if you want to keep both the original and the new tree.
If there is no reference to the old tree necessary, the garbage collector will eventually collect the "old" nodes that have been replaced by the new tree.
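A tiny sketch of what that sharing looks like in practice (the example values are mine):
example :: (Tree Int, Tree Int)
example =
  let t1 = insertIn 8 (insertIn 3 (insertIn 5 Empty))  -- root 5, children 3 and 8
      t2 = insertIn 4 t1                               -- rebuilds only the path 5 -> 3
  in (t1, t2)  -- t1 is unchanged; t2 reuses t1's right subtree (the node holding 8)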
I have just started learning Haskell and I am trying to write code that searches for a particular value in a binary tree and returns True if it is present and False otherwise.
This is what my tree structure looks like:
data Tree = Leaf Int | Node Tree Int Tree
I am not sure how to proceed with the function to traverse the tree and return the value. I did try BFS and DFS, but I am not sure how to return once I have found my value.
An example of how my function should look
search 5 (Node (Node (Leaf 1) 3 (Leaf 4)) 5 (Node (Leaf 6) 7 (Leaf 9)))
A binary search could be written as follows. The type can be more generic, since we only need the items to be orderable to store and search in a binary tree.
We visit each node and either return True or search in one of the child nodes.
An example Node:

    5
   / \
  3   7
Let's search for 7.
We first visit the root. Since 5 /= 7, we test a child node. Since 7 > 5, we search in the right node; 7 cannot appear in the left child (all values in the left child are guaranteed to be lower than 5).
If we reach a leaf, we just check if it contains the search term.
-- assuming a tree with values at both the nodes and the leaves, e.g.:
-- data BinaryTree a = Leaf a | Node (BinaryTree a) a (BinaryTree a)
search :: Ord a => a -> BinaryTree a -> Bool
search a (Leaf b) = compare a b == EQ
search a (Node left b right) =
  case compare a b of
    EQ -> True
    LT -> search a left
    GT -> search a right
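Using the example call from the question (and the assumed tree shape noted in the comment above):
search 5 (Node (Node (Leaf 1) 3 (Leaf 4)) 5 (Node (Leaf 6) 7 (Leaf 9)))  -- True
search 8 (Node (Node (Leaf 1) 3 (Leaf 4)) 5 (Node (Leaf 6) 7 (Leaf 9)))  -- False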
I am not sure how to proceed with the function to traverse through the tree and return the value.
From that sentence, I understand you would have no problem writing a traversal yourself, but that there is a mental leap you need to take to understand how Haskell works.
You see, you never return anything in Haskell. Returning is fundamentally an imperative statement. Haskell is a declarative language, which means that writing programs is done by stating facts. That nuance can be discomforting, especially if you've been introduced to programming through learning imperative languages like C, Java, JavaScript, etc. But once you truly understand it, you will see how much more expressive and easy declarative programming is.
Because of its strong mathematical roots, in Haskell facts are stated in the form of equations, i.e. expressions where the = sign literally means the left- and right-hand side are equal (whereas in an imperative language, it would probably mean that you assign a value to a variable -- that does not exist in Haskell).
The program @Haleemur Ali wrote is in 1:1 correspondence with how you would write search using math notation:
search(x, t) = { x == y         if t = Leaf y
               , true           if t = Node l y r and x == y
               , search(x, l)   if t = Node l y r and x < y
               , search(x, r)   if t = Node l y r and x > y
               }
Indeed many times, at least as a beginner, writing Haskell is just a matter of translation, from math notation to Haskell notation. Another interpretation of Haskell programs is as proofs of theorems. Your search is a theorem saying that "if you have a tree and an integer, you can always tell if the integer is somewhere inside the tree". That's what you are telling the compiler when you write a function signature:
search :: Int -> Tree -> Bool
The compiler will only be happy if you write a proof for that theorem ... you probably guessed that the algorithm above is the proof.
An interesting observation is that the algorithm is almost dictated by the shape of the data type. Imagine you wanted to sum all the values in a tree instead:
sum(t) = { x                    if t = Leaf x
         , x + sum(l) + sum(r)  if t = Node l x r
         }
Every time you want to write an algorithm over a binary tree, you will write something like the above. That is fairly mechanical and repetitive. What if later on you expand your program to deal with rose trees? Tries? You don't want to write the same algorithms and take the risk of making a mistake. One would try to come up with a function that walks down a tree and combines its values (using Haskell notation from now on):
walk :: (Int -> b) -> (b -> b -> b) -> Tree -> b
walk f g (Leaf x) = f x
walk f g (Node l x r) =
  let a = walk f g l
      b = walk f g r
  in g (g (f x) a) b
With this function alone, you can write all manners of traversals on trees:
sum t = walk id (+) t
search x t = walk (== x) (||) t
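A quick sanity check against the example tree from the question (results worked out by hand):
exampleTree :: Tree
exampleTree = Node (Node (Leaf 1) 3 (Leaf 4)) 5 (Node (Leaf 6) 7 (Leaf 9))

-- sum exampleTree      == 35
-- search 5 exampleTree == True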
walk is such a recurring pattern that it has been abstracted. All the data structures that expose the same pattern of recursion are said to be foldable, and the implementation is often so obvious that you can ask the compiler to write it for you, like so:
{-# LANGUAGE DeriveFoldable #-}
data Tree a = Leaf a | Node (Tree a) a (Tree a) deriving (Foldable)
There's even a definition of sum for any foldable data structure.
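For instance, with the derived instance above, the standard sum and elem work out of the box (a GHCi-style sketch):
sum (Node (Leaf 1) 2 (Leaf 3))       -- 6
elem 5 (Node (Leaf 1) 5 (Leaf 3))    -- True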
I'm trying to write a Haskell function that will return the maximum Int inside a binary tree of integers. My binary tree is defined as follows:
data Tree = Node Int Tree Tree | Leaf Int
deriving (Eq,Show)
The way I understand it, this declaration says that a value of the Tree data type can either be a single leaf Int, or a subtree containing two more trees.
So my maxInt function will look something like this (I think):
maxInt :: Tree -> Int --maxInt function receives Tree, returns int
maxInt --something to detect if the Tree received is empty
--if only one node, return that int
--look through all nodes, find largest
and so when my function is given something like
maxInt (Node 5 (Leaf 7) (Leaf 2)), the correct value for maxInt to return would be 7.
I'm new to Haskell and don't really know where to start with this problem. I would really appreciate some guidance. Thank you
Let me start it for you:
maxInt :: Tree -> Int
maxInt (Leaf x) = ?
maxInt (Node x l r) = ?
You may find it helpful to use the standard function max, which takes two arguments and returns their maximum:
max 3 17 = 17
To begin with, we have this datatype:
data Tree = Node Int Tree Tree | Leaf Int
deriving (Eq,Show)
That means, we have two constructors for things of type Tree: either we have a Leaf with a single Int value, or we have a Node which allows us to represent bigger trees in a recursive fashion.
So, for example we can have these trees:
Leaf 0
And more complex ones:
Node 3 (Leaf 0) (Leaf 4)
Recall that this tree representation has information both in the leaves and in the nodes, so our function will need to take that into account.
You guessed correctly the type of the function maxInt, so you are halfway through!
In order to define this function, given that we have a custom-defined datatype, we can be confident in using pattern matching.
Pattern matching is, putting it simply, a way to define our functions by equations where the left-hand side is one of the shapes of our datatype (either Leaf or Node, in our case) and the right-hand side is the resulting value. I'd recommend you learn more about pattern matching here: pattern matching in Haskell
Hence, we start our function by its type, as you correctly guessed:
maxInt :: Tree -> Int
As we have seen earlier, we will use pattern matching for this. What would be the first equation, that is, the first pattern-matching case for our function? The simplest tree we can have, given our datatype, is a Leaf value. So we start with:
maxInt (Leaf n) = n
Why n as a result? Because we don't have any other value than n in the tree and therefore it's the maximum.
What happens in a more complex case?
maxInt (Node n leftTree rightTree) = ...
Well... we can think of the maximum value for the tree (Node n leftTree rightTree) as the maximum among n, the maximum value of leftTree, and the maximum value of rightTree.
Would you be encouraged to write the second equation? I strongly recommend you first read the chapter of the book I just linked above. Also, you might want to read about recursion in Haskell.
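If you want to check your attempt afterwards, one possible second equation (a sketch using the standard max function mentioned in the other answer) is:
maxInt (Node n leftTree rightTree) = max n (max (maxInt leftTree) (maxInt rightTree))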
So, I have this Huffman tree, which is used for encoding strings. I have defined the function plant, but I am not sure whether my tree is tilting too much to one side. Here is my code:
data HuffTree
  = Leaf Char
  | HuffTree |*| HuffTree
  deriving (Eq, Show)
|*| is an infix constructor.
plant :: [(Char,Int)] -> HuffTree
plant [(x,y)] = (Leaf x)
plant ((x,y):xs) = plant xs |*| (Leaf x)
To me, it looks one-sided, and hence it doesn't really capture the encoding idea, since it is not a proper binary tree. How could I turn it into a regular binary tree?
You're not constructing the Huffman tree correctly. The process is supposed to go like this:
1. Turn all the source symbols into single-element Huffman trees.
2. Pair each tree up with its symbol's frequency into a big list of tree/frequency pairs.
3. If there is just one tree/frequency pair left, that tree is your Huffman tree.
4. Otherwise, remove the two tree/frequency pairs with the lowest frequency, combine the trees and add the frequencies to make a new tree/frequency pair, and add it back to the list.
5. Go to 3.
So I'd change it to plant :: [(HuffTree,Int)] -> HuffTree. In the second case I'd sort the elements, pluck off the first two, combine them, then call plant recursively. You might also want to switch to (Int,HuffTree) pairs so that you can use the default sort implementation. You'd also need to add Ord to your HuffTree deriving clause.
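For concreteness, here is one possible sketch of that process (the helper name and the (Int, HuffTree) pairing are mine; keying the sort on the frequency with sortOn means HuffTree does not even need an Ord instance):
import Data.List (sortOn)

-- `merge` stands for whatever combines two subtrees (the question's |*| constructor)
plantWith :: (t -> t -> t) -> [(Int, t)] -> t
plantWith _     [(_, tree)] = tree
plantWith merge pairs =
  case sortOn fst pairs of
    (f1, t1) : (f2, t2) : rest -> plantWith merge ((f1 + f2, merge t1 t2) : rest)
    _                          -> error "plantWith: needs at least one tree"
With the question's type this could be used roughly as plant cs = plantWith (|*|) [ (n, Leaf c) | (c, n) <- cs ], which also takes care of steps 1 and 2 above.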
I want to count how many elements in a Tree "respect" a certain rule.
For example:
For the data type:
data Tree = Leaf Int | Node Tree Int Tree
and the function signature:
nSatisfy :: (Int->Bool) -> Tree -> Int
for the input:
nSatisfy (>0) tree
it should return the number of values in the tree that are (>0).
Here's what i've tried:
nSatisfy :: (Int->Bool) -> Tree -> Int
nSatisfy condition Leaf x = if condition x then 1 else 0
nSatisfy condition (Node left x right)
|(if condition x then 1 else 0) + nSatisfy condition Tree
| otherwise = nSatisfy condition left || nSatisfy condition right
Any help?
UPDATE:
I found a much simpler way to do this:
nSatisfy :: (Int->Bool) -> Tree -> Int
nSatisfy n (Leaf x) = if n x then 1 else 0
nSatisfy n (Node left x right) = (if n x then 1 else 0) + (nSatisfy n left) + (nSatisfy n right)
That function is doing way too much at once: count, check a predicate and traverse a complex type.
I suggest to write a function
listFromTree :: Tree -> [Int]
and build your nSatisfy with listFromTree and the Prelude functions length and filter.
Edit: the OP found a working answer himself; here is my code anyway:
nSatisfy' p = length . filter p . listFromTree
listFromTree :: Tree -> [Int]
listFromTree (Leaf x) = [x]
listFromTree (Node left x right) = listFromTree left ++ [x] ++ listFromTree right
Functions that check something, i.e. functions of type a -> Bool, are usually called predicates and abbreviated p, as in filter. The name n usually denotes an integer, not a function.
There's nothing wrong with your updated version. Luis Casillas and Franky, however, are encouraging you to think about breaking up the ideas in your code into the smallest possible pieces. This is generally the best way to deal with programming problems, for several reasons:
The human brain can only think about so much at once. If you break up the problem into different pieces or layers and only think about one at a time, you have a much better chance of solving it correctly.
You will create functions that you can reuse to solve other problems, and ways of thinking that you can reuse to solve other problems.
You will be able to test each piece of the solution separately. In this case, the problem is simple enough to test the whole solution, but in most realistic cases, waiting until you have a complete solution before you start testing will lead you down the rabbit hole of "I know it's wrong, but I don't know where".
Once you've broken your problem down into little pieces, you are much more likely to find that other people have already solved those problems. Sometime soon, your Haskell study will lead you to polymorphic data structures and functions. By generalizing your Tree type a little, you will gain the ability to use library functions like toList, fmap, and sum, building your solution from solution pieces that other people have written for you.
Well, here's a hint. You can solve this problem much more easily if you split it into three parts:
A mapTree :: (Int -> Int) -> Tree -> Tree function that applies the supplied function to every Int in the tree.
A function that tests an individual Int and returns 1 if it satisfies your condition, 0 otherwise.
A sumTree :: Tree -> Int function that sums all the Ints in a tree.
Then you can put these three parts together to solve your problem fairly easily. And what's more, mapTree and sumTree will be useful for other problems.
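For reference, here is one way those pieces could fit together, as a sketch against the question's Tree type (the names follow the hint above):
mapTree :: (Int -> Int) -> Tree -> Tree
mapTree f (Leaf x)     = Leaf (f x)
mapTree f (Node l x r) = Node (mapTree f l) (f x) (mapTree f r)

sumTree :: Tree -> Int
sumTree (Leaf x)     = x
sumTree (Node l x r) = sumTree l + x + sumTree r

-- count the satisfying values by mapping each one to 1 or 0 and summing
nSatisfy :: (Int -> Bool) -> Tree -> Int
nSatisfy p = sumTree . mapTree (\x -> if p x then 1 else 0)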