Selection sort works through a list from the beginning: it takes the first unsorted element and scans the rest of the list for anything smaller. Whenever a smaller element is encountered it becomes the new candidate, and the scan continues until the smallest element in the whole unsorted portion is known. That smallest element is then swapped to the front of the unsorted portion, and the process repeats on the leftmost unsorted part of the list until everything is sorted. Both the worst case and best case running times are n^2.
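The steps above can be sketched in Python (an illustrative implementation, not tied to any particular library):

```python
def selection_sort(items):
    """Sort a list in place by repeatedly selecting the smallest
    remaining element and swapping it to the front of the
    unsorted portion."""
    n = len(items)
    for i in range(n - 1):
        # Scan the unsorted suffix items[i:] for the smallest element.
        smallest = i
        for j in range(i + 1, n):
            if items[j] < items[smallest]:
                smallest = j
        # Swap it into position i, the front of the unsorted portion.
        items[i], items[smallest] = items[smallest], items[i]
    return items
```

The nested loops make the n^2 cost visible: the inner scan always runs to the end of the list, whether the input is sorted or not.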
Quick sort picks a random element of the list to serve as the pivot. Two scans then move inward, one from the first element and one from the last: the left scan looks for an element bigger than the pivot, and the right scan looks for an element smaller than the pivot. If a scan's check fails, it moves one element closer to the pivot and checks again. Once both scans have found such an element, the two elements switch places, and the scans continue inward with the next elements. When the scans meet, every element on the left side is smaller than the pivot and every element on the right side is bigger, so the pivot is in its correct sorted position in the list. Quick sort is then applied to the left and right parts of the list with respect to the pivot until every element is in its correct sorted position. The worst case running time is n^2 and the best case running time is nlog(n).
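This two-scans-from-the-ends scheme corresponds to a Hoare-style partition with a random pivot, which can be sketched in Python as follows (an illustrative version, with index bookkeeping chosen for clarity):

```python
import random

def quick_sort(items, lo=0, hi=None):
    """In-place quicksort: pick a random pivot, scan inward from
    both ends swapping out-of-place pairs, then recurse on the
    two sides of the partition."""
    if hi is None:
        hi = len(items) - 1
    if lo >= hi:
        return items  # zero or one element: already sorted
    pivot = items[random.randint(lo, hi)]
    i, j = lo, hi
    while i <= j:
        # Left scan: skip elements already smaller than the pivot.
        while items[i] < pivot:
            i += 1
        # Right scan: skip elements already bigger than the pivot.
        while items[j] > pivot:
            j -= 1
        if i <= j:
            # Both scans stopped on out-of-place elements: swap them.
            items[i], items[j] = items[j], items[i]
            i += 1
            j -= 1
    # Recurse on the left and right parts of the partition.
    quick_sort(items, lo, j)
    quick_sort(items, i, hi)
    return items
```

The random pivot is what makes the n^2 worst case unlikely in practice: on average each partition splits the list roughly in half, giving the nlog(n) behaviour.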
Merge sort works by breaking the list into 2 halves, and repeats this until each sub-list has 1 element in it. Then 2 sub-lists are merged by comparing their first elements: the smaller of the two is removed and put at the end of a new list, and the comparison repeats until one sub-list is empty, at which point whatever remains of the other sub-list is appended to the new list. The 2 sub-lists have now merged into one sorted list, and merging is applied to pairs of sub-lists until 1 list is left, which is the complete sorted list. Both the worst case and best case running times are nlog(n).
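The split-then-merge process can be sketched in Python like this (a simple recursive version that builds new lists rather than sorting in place):

```python
def merge_sort(items):
    """Recursively split the list in half, sort each half,
    then merge the two sorted halves into one sorted list."""
    if len(items) <= 1:
        return items  # a 1-element (or empty) list is already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge: repeatedly take the smaller front element of the two halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One half may have elements left over; they are already sorted.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Because the list is always halved and every element is touched once per level of merging, the nlog(n) cost holds for every input, sorted or not.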
In sorting algorithm terms for computer science, efficiency is what ranks algorithms from the best sorting algorithm to the worst, depending on the type of case, such as a list that is already sorted or one that is only partially sorted. When an algorithm is said to be efficient, it means the algorithm uses the least amount of time and effort to sort a given list. Out of the sorting algorithms talked about above, merge sort is the most consistently time efficient no matter what kind of list is being sorted. Whether the list is sorted or not, merge sort always partitions it into sub-lists and merges them back, so its nlog(n) running time holds even for a very large list. The least time efficient sorting algorithm talked about above is selection sort. Its best case, n^2, is as bad as quick sort's worst case: even given an already sorted or reversed list, it still has to scan the whole unsorted portion to find the smallest element once for every element in the list. Although merge sort is the most consistently time efficient and selection sort takes the least effort to write, quick sort is generally the best balance of time and effort, since it can be written in a few lines of code and, on average, is just as time efficient as merge sort. Out of the 3 sorting algorithms talked about above, the rank of general efficiency from best to worst is quick sort, merge sort, then selection sort.