::Welcome to my online journal where I record,
on a weekly basis, the progress, challenges,
and other interesting events that happen
to me in CSC236 this fall at UofT::



October 28, 2008

Week Eight

Monday,
---------- Oct. 27, 2008
After talking about the correctness of recursive programs last week, we moved to iterative programs. The tricky part in code with “for” and “while” loops is that you have to prove that the code eventually terminates, and that when it does, it satisfies the postcondition.
So we might need to find a loop invariant first: something that demonstrates how the loop is progressing towards the goal. Then, we need to find a decreasing sequence of natural numbers to prove that the loop actually terminates. Finally, we use the information gathered in those two parts to conclude that if the precondition holds, then the postcondition is satisfied when the loop exits.
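To make this concrete for myself, here is a little Python sketch (my own toy example, not code from lecture) of an iterative sum where the invariant and the decreasing sequence are checked with asserts on every iteration:

```python
def iter_sum(A):
    """Postcondition: returns A[0] + A[1] + ... + A[-1] (0 for an empty list)."""
    total = 0
    i = 0
    while i < len(A):
        # Loop invariant: total equals the sum of A[0:i]
        assert total == sum(A[:i])
        # Termination: len(A) - i is a decreasing sequence of natural numbers
        total += A[i]
        i += 1
    # Invariant plus the exit condition (i == len(A)) gives the postcondition.
    return total
```

The invariant says what the loop has accomplished so far; combining it with the exit condition is exactly the last step of the recipe above.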


Assignment 2,
---------- Oct. 27, 2008
WOW! That was one hard assignment. I have to admit that although I started reading and thinking about the assignment a week ago, I did not do any real work until yesterday. The result? I was typing until 9:50 pm today.
Question 4 was okay because it is similar to the one we did in lecture (I hope so). Question 3 was fine. The sequence of G(n) was easy, but I do not know why it took my partner and me a long time to figure it out. For question 2, I did exactly what we did in lecture: I found T(n), found its closed form, proved it holds for powers of 7, proved T(n) is non-decreasing, and finally squeezed non-powers of 7 between powers of 7. It was a long proof, but this is how I understood it in lecture, and I could not risk trying some other method 5 hours before the assignment was due.
Question 1, oh, what can I say? My partner and I used an approach which we thought was promising (we got T(n) = 3T(n-1) + other stuff). We wanted to push our formula further, but we had neither the time nor the nerve, so we used a method that involves summation. I guess I should drop by Professor Danny's office and discuss that first formula with him; I am sure he will figure something out.
In conclusion, I spent around 17 hours working on this assignment (not counting my partner's time), and I think this exceeds the weekly 8 hours for CSC236. I should have started earlier, and I should have given more credit to the difficulty of question 1. I really need to do well on the upcoming test because I think I will get a 60 or something on this assignment.


Wednesday,
---------- Oct. 29, 2008
I could not make it to class today, but I checked the lecture and course notes. Professor Danny did another example of iterative correctness, this time with a tricky termination argument.


Friday,
---------- Oct. 31, 2008
We concluded our discussion of program correctness by doing an example with nested loops. My first impression was "Oh my God... that looks nasty." But Professor Danny broke the nested loop into an inner and an outer loop, proved the inner one normally, and then the outer one. Pretty simple, huh? I just love the way Danny manages to break down the most complicated proofs into smaller, simpler, and familiar problems. Anyway, I do not think I would like to see a double-loop question on the exam.
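For my own notes, here is a toy nested loop in Python (my example, not the one from lecture) with the inner and outer invariants written as comments, the way the proof splits them:

```python
def count_inversions(A):
    """Counts pairs (i, j) with i < j and A[i] > A[j]."""
    count = 0
    for i in range(len(A)):
        # Outer invariant: count equals the number of inversions
        # whose first index is less than i.
        for j in range(i + 1, len(A)):
            # Inner invariant: all inversions with first index i and
            # second index less than j have already been counted.
            if A[i] > A[j]:
                count += 1
    return count
```

Proving the inner loop's invariant first, then using it as one "step" of the outer loop, mirrors how the nested proof was decomposed.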

October 20, 2008

Week Seven

Monday,
---------- Oct. 20, 2008

Today's lecture was presented by Nick Cheng from UTSC because Danny was at UTM doing some kind of survey regarding CSC207. We discussed the meaning of program correctness. One of the definitions we came up with was: “a program is correct if it produces the correct output given that the input was correct.”
From this definition, we started talking about pre- and postconditions. It is interesting how careful we have to be when defining them. For example, if you have a function that returns the index of the first occurrence of some x in a list L, you have to state in the postcondition that L remains unchanged. Otherwise, doing something like L[0] = x and then returning 0 would be perfectly valid.
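Here is how I picture it in Python (the function names are mine, not from lecture): the honest version, and a "cheating" version that satisfies a sloppy postcondition which forgot to say L is unchanged:

```python
def first_index(L, x):
    """Precondition: x is in L.
    Postcondition: returns the smallest i with L[i] == x; L is unchanged."""
    for i in range(len(L)):
        if L[i] == x:
            return i

def cheating_first_index(L, x):
    """Satisfies the weaker postcondition "returns an i with L[i] == x"
    by mutating the list, which the careful postcondition rules out."""
    L[0] = x
    return 0
```

Without the "L remains unchanged" clause, both functions would count as correct.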



Wednesday,
---------- Oct. 22, 2008
We proved the correctness of binary search. The proof was long but not difficult. There are several cases we need to trace: if the value at the middle (call it L[m]) is less than x and x is in L; if L[m] is less than x and x is not in L; if L[m] is greater than x and x is in L; and finally, if L[m] is greater than x and x is not in L.
Doing Question 4 of Assignment 2 was great practice because it required a very similar kind of proof.
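For reference, here is a Python sketch of binary search (my reconstruction, not necessarily the exact lecture version), with the invariant and the case split marked:

```python
def binary_search(L, x):
    """Precondition: L is sorted in non-decreasing order.
    Postcondition: returns an index m with L[m] == x, or -1 if x is not
    in L; L is unchanged."""
    lo, hi = 0, len(L) - 1
    while lo <= hi:
        # Invariant: if x is in L, then x is in L[lo:hi+1].
        # Termination: hi - lo strictly decreases each iteration.
        m = (lo + hi) // 2
        if L[m] < x:       # the "L[m] less than x" cases: drop the left half
            lo = m + 1
        elif L[m] > x:     # the "L[m] greater than x" cases: drop the right half
            hi = m - 1
        else:
            return m
    return -1
```

Whether x is in L or not only matters for which branch of the postcondition the invariant delivers at exit, which is exactly why the proof splits into those four cases.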


Friday,
---------- Oct. 24, 2008
We worked on gcd(n, m) and pow(n, m) as examples of divide-and-conquer algorithms.
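Sketches of the two in Python (my own versions, so the details may differ from what was done in class):

```python
def gcd(n, m):
    """Euclid's algorithm: gcd(n, m) == gcd(m, n mod m), base case m == 0."""
    if m == 0:
        return n
    return gcd(m, n % m)

def power(n, m):
    """Computes n**m by repeated squaring: about lg(m) multiplications
    instead of m - 1."""
    if m == 0:
        return 1
    half = power(n, m // 2)
    if m % 2 == 0:
        return half * half
    return half * half * n
```

In both, one recursive call on a much smaller instance does almost all the work, which is what makes the recurrences so pleasant.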

October 17, 2008

Week Six

Monday,
---------- Oct. 13, 2008
A holiday :)


Wednesday,
---------- Oct. 15, 2008
We worked on the time complexity of merge sort. I knew that its complexity is O(n lg n) from CSC165, but I could never figure out how to prove such a complicated thing. After we did the proof, I found it more lengthy than complicated.
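To convince myself, a quick Python check of the merge sort recurrence against its closed form for powers of 2 (the constants here are my simplifying assumption, not the lecture's):

```python
def T(n):
    """The merge sort recurrence T(n) = 2*T(n // 2) + n with T(1) = 1,
    assuming n is a power of 2."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2**k the closed form is T(n) = n*lg(n) + n = n*k + n.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n
```

Seeing the closed form hold numerically made the "prove it for powers of 2, then squeeze" structure of the proof feel less mysterious.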


Friday,

---------- Oct. 17, 2008
The master theorem... I need to go back to the course notes and read through it again, because I kind of lost track by the end of the lecture (not because it is that hard, but four hours of sleep is never enough for me).

October 10, 2008

Week Five

Monday,
---------- Oct. 6, 2008
Today, we talked about the closed forms of recurrences. We started with the Fibonacci sequence. Although the formula seemed so strange, Professor Danny explained why it is necessary. Writing recursive code for F(n) is not wise, especially if n is greater than 25. I remember Professor Paul Gries doing this example in CSC148 last year. When he called F(30), a continuous flow of integers kept appearing on the screen, just like a matrix. It was cool, but it taught me that even nice, useful things like the Fibonacci sequence can be lethal.
Moreover, using a loop instead of the closed form is not a good option if n is large, because computing F(n) iteratively takes time linear in n: you have to keep adding until you reach that large n.
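The three ways of computing F(n), side by side in Python (using the convention F(0) = 0, F(1) = 1, which may differ from the lecture's indexing):

```python
import math

def fib_naive(n):
    # Exponential time: the same subproblems get recomputed over and over.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_iter(n):
    # Linear time: one pass of additions.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_closed(n):
    # Closed form (Binet's formula): a constant number of arithmetic
    # operations, though floating point limits it to moderate n.
    phi = (1 + math.sqrt(5)) / 2
    return round(phi ** n / math.sqrt(5))
```

The naive version is the "lethal" one; the loop is the linear one; the closed form is the payoff of all the recurrence-solving work.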
The second example we did was H(n) and we used a similar approach to find the closed form.
Finally, we talked about defining sets by induction. The example of proving that the number of nodes in a binary tree is less than or equal to 2^(H(T) + 1) - 1 was really easy, but I did not get the part about how a proof of a property can follow the structure of the inductive definition itself.


Wednesday,
---------- Oct. 8, 2008
We continued with defining sets by induction and the example of the binary tree. We did a second example about matching parentheses; it looks easy, but I am so confused. What I did to prove that P("(s1)") holds was just prove that 1 + L(s1) >= 1 + R(s1). It is pretty simple and straightforward. According to Professor Danny's solution, we need to consider s' = s'' * s''' where s' is a prefix of s1, s'' is a prefix of "(", and s''' is a prefix of s''. Then, by the induction hypothesis, L(s') >= L(s'') + L(s''') >= R(s'') + R(s''').
I do not really get it: why should we break them up like that? I guess I have to take another look at the lecture slides and the course notes.
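Here is the property as I understand it, written out in Python (my own reading, so the predicate may not match the lecture's exactly): in a balanced string, every prefix must have at least as many left parentheses as right ones:

```python
def L_count(s):
    # number of "(" in s
    return s.count("(")

def R_count(s):
    # number of ")" in s
    return s.count(")")

def prefix_property(s):
    """P(s): every prefix s' of s satisfies L(s') >= R(s')."""
    return all(L_count(s[:i]) >= R_count(s[:i])
               for i in range(len(s) + 1))
```

The reason the proof breaks prefixes apart, I think, is that P quantifies over all prefixes of "(s1)", and a prefix can cut s1 in the middle, so the induction hypothesis has to be applied to that partial piece.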
Another thing that is bugging me: it has been more than 24 hours since the A1 marks were posted, and I still have not received my mark. I cannot use the online marking tool because my partner was the one who submitted the assignment. I cannot check cdf/students because the marks are not up there either, and my partner has not emailed me the mark :(


Friday,
---------- Oct. 10, 2008
We had our first test today. I did not have a problem studying induction because the examples in the lecture slides and course notes were thorough and easy to follow. However, it was not until 8:30 pm that I discovered that material from chapter 3 would be on the test as well. I did not know where to stop, so I studied everything, even Monday's and Wednesday's lectures, and went to the test praying that nothing outside induction would show up.
The test was neither easy nor hard, but I spent the whole 50 minutes writing. For question 2, we had to find a number k such that for all n > k, 3^n < …

October 06, 2008

Week Four

Monday,
---------- Sept. 29, 2008
All right, it is recursion time. Dealing with the running times of recursive functions is something I had been avoiding since last year. Of course, we did not start directly with the time complexity or efficiency of recursive code. Rather, we started with my favorite sequence ever, the Fibonacci sequence. We used complete induction to prove that the sum of the first n Fibonacci numbers is f(n+2) - 1.
I am happy we did this example because it corrected a misunderstanding I had. Although we are done with the induction chapter, in complete induction I still instinctively assume P(n-1) and then use P(n) to prove P(n+1) (I guess I mis-learned this point in MAT137). So when I did the scratch work and compared it to the professor's solution, I was a bit shocked by the silly mistakes I made, because P(n) is not assumed to be true; we have to prove it.
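A quick sanity check of the identity in Python (with f(1) = f(2) = 1; the indexing convention is my assumption):

```python
def fib(n):
    # iterative Fibonacci with f(0) = 0, f(1) = 1
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# The sum of the first n Fibonacci numbers equals f(n+2) - 1.
for n in range(1, 25):
    assert sum(fib(i) for i in range(1, n + 1)) == fib(n + 2) - 1
```

Checking small cases like this is no substitute for the induction proof, but it catches exactly the kind of off-by-one slip I made in my scratch work.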


Wednesday,
---------- Oct. 1, 2008
I could not attend today's lecture because I stayed awake from Tuesday, 10:30 am, to Wednesday, 11 am, working on a project for my architecture course.
I think they talked about something called unwinding, but I am not sure. I will have to read the course notes and the lecture slides.


Friday,
---------- Oct. 3, 2008
Unwinding turned out to be really interesting. You start with a recursively defined sequence, then keep applying the definition, recording the number of steps done so far, until you get to n = 0 or 1 or whatever base case you have. Unwinding is used for finding closed forms for recursive formulas.
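For example (a toy recurrence of my own, not one from lecture), unwinding T(n) = T(n-1) + c with T(0) = d, then checking the resulting closed form in Python:

```python
# Unwinding by hand:
#   T(n) = T(n-1) + c
#        = T(n-2) + c + c
#        = ...
#        = T(n-k) + k*c      (after k steps)
#        = T(0) + n*c        (base case reached at k = n)
#        = d + n*c

def T_rec(n, c=3, d=1):
    # the recurrence as written
    if n == 0:
        return d
    return T_rec(n - 1, c, d) + c

def T_closed(n, c=3, d=1):
    # the closed form obtained by unwinding
    return d + n * c

# the closed form agrees with the recursion
for n in range(30):
    assert T_rec(n) == T_closed(n)
```

The "recording the number of steps" part is the k in the hand-unwinding comment: spotting the pattern in k is what gives the closed form.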