Friday, March 28, 2014

Week 11: Sorting and Efficiency

This week's topic is the efficiency of sorting algorithms and the searching techniques that go with them. I have learned how several sorting algorithms work and which one is the best depending on the given situation.

         For example, say I have 7 cards, each with a different value (we'll use 1, 3, 4, 7, 9, J and Q), and you want me to arrange them from lowest to highest. Quick sort uses a pivot method: it compares the next card against the pivot to see if its value is less or more; if less, the card moves to the pivot's left, and if more, the card stays where it is and the next card gets examined. For instance, if the pivot is 7 and the card being examined is 3, the 3 goes to the left of the 7; if the card is 9, it remains where it is. You then have to repeat this on each side to make sure all the elements end up correctly arranged. With an order of O(n lg n) it can be quite efficient, and during my lab studies I learned that selection and quick sort work best when the list is already mostly sorted, because the pivot changes on every call and they don't have to switch and exchange every value that comes their way. There are many ways to sort using quick and merge sort; however, merge sort and the best case of insertion sort work better in the case where the lists were not sorted.
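To make the pivot idea concrete, here is a rough quick sort sketch I wrote myself (not the exact lecture version), treating J and Q as 11 and 12:

def quick_sort(lst):
    """ Return a new list with the items of lst from lowest to highest. """
    if len(lst) < 2:                                 # 0 or 1 cards is already sorted
        return lst
    pivot = lst[0]                                   # pick the first card as the pivot
    smaller = [x for x in lst[1:] if x < pivot]      # cards that move to the pivot's left
    larger = [x for x in lst[1:] if x >= pivot]      # cards that stay on the right
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

# quick_sort([7, 3, 9, 1, 4, 11, 12]) -> [1, 3, 4, 7, 9, 11, 12]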

       Here's why: merge sort splits lists up until each piece has only one element, and because the sorting treats single elements like lists it may take a while, but it keeps cutting down and merging everything until one final sorted list is made. Selection sort and bubble sort may not be a good idea because they have to look over all the remaining elements on every pass and compare each element against the others, so their order is O(n**2). A quick sort is generally faster, but it may not work well in some cases because it can take a while on much larger lists, ranging into the 1000s of elements. So these are examples of why, when lists are somewhat sorted or shuffled into a severely random order, different sorts must be tried to see which one is the most efficient and useful.

Anyways, that should be it for sorting and for writing the most efficient code possible while minimizing the work done.
An example as proof of my knowledge:
def merge(L1, L2) -> list:
    """ Merge two sorted lists L1 and L2 into one sorted list. """
    L = []           # accumulator for the merged result
    i1, i2 = 0, 0    # take an index into each of the two lists
    # note: every slice is a (linear) copy, so check the indices instead
    while i1 < len(L1) and i2 < len(L2):
        if L1[i1] < L2[i2]:
            L.append(L1[i1])
            i1 += 1
        else:
            L.append(L2[i2])
            i2 += 1
    return L + L1[i1:] + L2[i2:]
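
To show how merge gets used, here is a rough merge sort I sketched myself (not the exact lecture code), which splits the list in half until the pieces have one element each and then merges them back together:

def merge_sort(lst):
    """ Return a new sorted list with the same items as lst. """
    if len(lst) < 2:                    # base case: 0 or 1 elements is already sorted
        return lst
    middle = len(lst) // 2
    left = merge_sort(lst[:middle])     # sort the left half
    right = merge_sort(lst[middle:])    # sort the right half
    return merge(left, right)           # merge the two sorted halves using merge above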


Please check out my other SLOG posts from the past weeks; they contain pretty interesting information that I managed to pick up and practice on my own.

Saturday, March 22, 2014

Week 10: Sorting Lessons

Sorting is generally a simple way to make searching easier: a search can then run as a subroutine that goes through the elements in a known order. I find it easy to use the Timsort sorting algorithm and to try creating possible outcomes with it. I did find the labs and the exercise difficult, probably because I had trouble remembering how to build trees and sorting out my work.

Here is a general form of a search function on a binary search tree:
def bst_search(key, node):
    """ Return the node with this key in the tree rooted at node, or None. """
    if node is None:
        return None
    elif key < node.key:
        return bst_search(key, node.left)
    elif key > node.key:
        return bst_search(key, node.right)
    else:
        return node
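
To actually run that, a tree node just needs key, left and right attributes. Here is a minimal sketch I put together myself (the class name and layout are my own assumptions, not the lab's official class):

class BSTNode:
    """ A node in a binary search tree (hypothetical minimal version). """
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

# a small tree with 7 at the root, 3 on the left and 9 on the right
root = BSTNode(7, BSTNode(3), BSTNode(9))
print(bst_search(9, root).key)    # 9
print(bst_search(5, root))        # None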

Saturday, March 15, 2014

Week 9: BSTs and Expressive Lists

Today, I learned about searching algorithms in Binary Search Trees. It is important to keep track of certain elements if the trees consist of hundreds or even thousands of elements per tree. I am reviewing selection sort and the best and worst cases of insertion sort, and even trying to find possible ways of doing binary search. Assignment two, which involves regexes that return True or False for permutations, will require many huge lists and recursive structures to generate the desired results. So this is what I have learned in class, and I hope I learn more through my practice, even though it is not easy to do.

I am trying to put examples in my blog to show what I know, and I hope I can use this knowledge to improve my work. I do try to write this information down on paper and then put it into my computer; sometimes I need to change things, and it gets tiring and bothersome to get the code working.
Here is some attempted pseudocode to get the general idea of the sorting and searching algorithms used on lists and trees.
def selection(lst) -> list:
    """ Repeatedly select the smallest element from the unsorted part of the list
    and swap it into position i. The sorted left side stays unchanged as it grows,
    and after n - 1 swaps the last element is already in its sorted place.
    """

def insertion(lst) -> list:
    """ Take the element at position i and insert it into its spot among the first
    i elements, without touching the positions to its right. The problem is that
    the worst case (a reversed list) is very inefficient, needing on the order of
    n**2 comparisons, whereas the best case (an already sorted list) takes the
    shortest time.
    """

def searchlist(lst, number) -> bool:
    """ Take a sorted list and find the number in it; if the number is not there,
    return False. This works by splitting the given list at the middle and then
    recursively searching the left or the right half for the number.
    """
    if not lst:
        return False
    middle = len(lst) // 2
    if lst[middle] == number:
        return True
    elif number < lst[middle]:
        return searchlist(lst[:middle], number)        # search the left half
    else:
        return searchlist(lst[middle + 1:], number)    # search the right half
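
A quick check of how I would expect it to behave (assuming the list is already sorted):

print(searchlist([1, 3, 4, 7, 9, 11, 12], 7))    # True
print(searchlist([1, 3, 4, 7, 9, 11, 12], 5))    # False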

Saturday, March 8, 2014

Week 8 : Assignments and More Links! These are Evil!!!

Yes. These are evil...!!!   :c

The assignment seems simple to understand, but it is hard to write code that follows the instructions. It focuses more on designing Trees that inherit from a main class, manipulating the Trees to carry out certain tasks, and raising Exceptions in case something goes wrong.

Example code from the Week 7 CSC lab (this is a length function for a linked list):

def __len__(self):
    """ Return the number of elements in this linked list. """
    counter = 0                        # start an accumulator here
    temp_list = []                     # create an empty list to hold the items
    while not self.empty:
        temp_list.append(self.item)    # copy the front item into the new list
        self.decapitate()              # chop the front node off this list
        counter += 1                   # keep going until the list runs out
    while temp_list != []:
        self.prepend(temp_list[-1])    # rebuild this list from the saved items
        temp_list.pop(-1)
    return counter

My partner and I used this code as an example: given a list of numbers, we created a new list, appended items onto it, and cut the old list down element by element as the new list grew. I was able to show my understanding through practice. For each node there is a value, and I learned how to make a copy of a list by creating a new list, appending to it, and cutting down the old list until only the new list was left, then rebuilding the original and finally returning the number of elements in the list.
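
For context, the __len__ above assumes a linked list class with empty, item, decapitate and prepend. Here is my own rough sketch of what such a class could look like; it is only my guess, not the actual lab code:

class LinkedList:
    """ A simple front-loaded linked list (rough sketch, not the lab's version). """

    def __init__(self):
        self.empty = True     # no nodes yet
        self.item = None      # the value at the front
        self.rest = None      # the LinkedList that follows the front node

    def prepend(self, item):
        """ Add item to the front of this list. """
        rest = LinkedList()
        rest.empty, rest.item, rest.rest = self.empty, self.item, self.rest
        self.empty, self.item, self.rest = False, item, rest

    def decapitate(self):
        """ Remove the front node of this list. """
        self.empty, self.item, self.rest = (
            self.rest.empty, self.rest.item, self.rest.rest)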

This image was from the CS website and Wikipedia, using examples of Linked Lists.

Either way, this week was based more on Linked Structures and Trees with an arity of one, creating a series of links from node to node. I have been reading the computer science websites on how to construct Linked Lists, and I am currently focusing on the applications and uses for these Lists.

Saturday, March 1, 2014

Week 7: Linked Lists and Recursion

     After the week on Tree traversing, we learned how to use pre-order, in-order, and post-order to make a sequence of the leaves in a tree. This week we worked with lists of numbers and learned to prepend values in the class. Generally, with an arity of only 1, a continuous branching of lists ends up creating nodes with only 1 child each: a list of values. It also shows that empty lists are more than just empty lists; the head can refer to None, and prepending at the head preserves the id of the list.
    I worked on prepending and replacing old elements with new ones by creating a Queue class in the labs, and I had to refer to the Tree class ADT to understand the rules of Tree classes and their special methods like __eq__, __repr__ and __contains__. I suspect it is much the same as learning trees: there isn't much new application of recursion, but there is new information on Tree manipulation.
For example:

Recursive functions call themselves to solve smaller problems on the way to the big one:

def function(a):
    if a == 0:
        perform_first_action()    # a normal base case where a is 0 or 1: something simple handled first
    else:                         # a != 0
        function(a - 1)           # the call on the smaller problem is applied
This keeps going until all a actions have been completed.
This is like mathematical induction, the way mathematical and engineering concepts are proved.
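
As a concrete version of that pattern, here is a small factorial function I wrote as my own example (not from lecture):

def factorial(a):
    """ Return a! using the recursive pattern above. """
    if a == 0:
        return 1                         # base case: something simple first
    else:
        return a * factorial(a - 1)      # the answer for a - 1 is used to build the answer for a

# factorial(5) -> 120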

With Linked Lists, the classes construct an empty list with a value at the head, then grow more nodes one after another, continuing on until the end. It kinda looks like a growing centipede.

Example:      
 """ Beginning Node and Licked lists in this section. """
 
    def __init__(self, item=None, further=None):
        self.item = item
        self.further = next
 
    def __str__(self):
        return str(self.item)

This example shows that every node n is an instance of the Node class, and the list continues by using further to move on to the next node until all n nodes have been visited.
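
For instance, a tiny chain can be built and walked like this (my own usage sketch):

# build a chain 1 -> 2 -> 3
chain = Node(1, Node(2, Node(3)))

node = chain
while node is not None:
    print(node)           # uses __str__, so it prints 1, then 2, then 3
    node = node.further   # move on to the next node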

Sunday, February 9, 2014

Week 5: Tracing

     I usually try out a bunch of tests and see if they actually work, and most of the time they don't. I have continued studying tracing using the examples that the professor left online. I learned how methods can call back into functions when a certain condition holds, and I went through some examples to understand how code runs by tracing it backwards. It's very similar to binary representation: after inputting an integer, the function returns a string of binary digits, and the digits shift left as the number grows, with a new digit appearing whenever a power of 2 (2, 4, 8 and so on) is reached. This is just one example of what I've learned in the labs last week.
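
Here is a small recursive version of that binary conversion that I traced on my own (my guess at it, not the posted solution):

def to_binary(n):
    """ Return the binary representation of a non-negative integer n as a string. """
    if n < 2:
        return str(n)                        # base case: 0 or 1 is already one binary digit
    return to_binary(n // 2) + str(n % 2)    # digits for the rest of the number, then the last bit

# to_binary(6) -> '110'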

Thursday, January 30, 2014

Week 4: Mostly Recursion

    The turtle class and canvas were a great way to learn recursion in class. I got to try things out, trace easily, and learn what recursion allows you to do. I do believe that practice is a good way to learn programming; however, it takes more time to grasp difficult concepts, especially learning what to do on the assignment. I try to take some time on days without CSC class to learn some of the material ahead. The built-in functions that allow recursive functions and exception cases could have been useful in CSC108, but now I am glad to learn this so I can practice it with Java. I mostly learned functions where the statements take place in one line and the function calls itself to perform tasks.
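
For example, a one-line recursive function like this one (my own example, not from class) adds up an arbitrarily nested list of numbers:

def nested_sum(x):
    """ Return the sum of a number or an arbitrarily nested list of numbers. """
    return sum(nested_sum(item) for item in x) if isinstance(x, list) else x

# nested_sum([1, [2, [3, 4]], 5]) -> 15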