22C111 Fall 2007 Programming Languages: Lectures Page
This page will show, by date, some of the topics and background sources for lecture material. Reading this page is not a substitute for attending class.
14 December. Review for final examination (which is Tuesday morning at 7:30am).
12 December. Introduce Formal Semantics of Programming Languages. Languages (even ones not used for programming) have several components: syntax rules, lexicons (dictionaries and ontologies), and semantics, which are higher-level rules for giving meaning to what is said using the language. What does a program "mean"? There are different answers to this question, including Denotational Semantics, Operational Semantics, and Axiomatic Semantics. The point of defining the semantics of a language is to gain some confidence that a program expressed in the language of your choice will actually be or do what the specification says.
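As one tiny illustration of the axiomatic style (an example of mine, not from the lecture), a Hoare triple states what must hold after a statement executes, given what held before:

```latex
% Hoare triple for an assignment: if x = n holds before,
% then x = n + 1 holds after.
\{\, x = n \,\} \quad x := x + 1 \quad \{\, x = n + 1 \,\}
```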
12 December. Continue with scripting languages. The idea of using scripting languages goes beyond simple scripts; it applies even at the level of large distributed systems (see BPEL as an example of a scripting language based on XML, to be used for Web Services).
10 December. Start some topics either from or related to Chapter 13 on Scripting Languages (PHP, Perl, Python, Tcl, Rexx, etc) and data definition languages (XML and similar). The first main ideas to consider are: metaprogramming, which often occurs when scripting languages are used to glue (see glue language) subsystems together, and macros (see macro (computer science)). One interesting question for you to think about: is it better to use macros or OO-inheritance to automate programming activities?
- 10 December. Finishing coverage of Chapter 12, looking at the characterization of send/receive by how they synchronize threads or processes (Section 12.4). See the first sentence of 12.4.4 on Remote Procedure Call: there are three principal forms of send (no-wait, synchronization, remote-invocation, see Fig 12.19) and two types of wait (implicit or explicit). Rendezvous refers to the combination of these where sender and receiver communicate and change each other's state in one operation.
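The send/receive vocabulary can be sketched with Haskell's concurrency library (a sketch of mine, not the textbook's notation): writeChan is a no-wait send, readChan is an implicit (blocking) wait, and takeMVar below plays the role of an explicit wait for a reply.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (newChan, readChan, writeChan)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- A no-wait send followed by an explicit wait for the receiver's reply.
noWaitDemo :: IO Int
noWaitDemo = do
  ch <- newChan
  writeChan ch (21 :: Int)        -- no-wait send: returns immediately
  reply <- newEmptyMVar
  _ <- forkIO $ do
         x <- readChan ch         -- receiver: implicit wait for a message
         putMVar reply (2 * x)
  takeMVar reply                  -- explicit wait for the answer
```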
- 7 December. In addition to continuing topics of the previous lecture, I say a little more about communication by message-passing (taken from Section 12.4.2). Section 12.4 has some nice examples, though the details of RPC and RMI are more than we need.
- 5 December. This lecture is just a fast review of some key terminology from Chapter 12 of the textbook (I'm not advising that you read all of the chapter, but it does introduce some terms that you should know about). Here are the terms I've identified: thread (computer science), coroutine, race condition, deadlock, dispatch loop (not the same as dynamic dispatch), blocking (scheduling), preemptive scheduling or preemption (computing), busy-wait, polling (computer science), synchronization (computer science), ready list, rendezvous. During the lecture, I illustrate some of the ideas using Java Concurrency Mechanisms, and maybe a little of Java's concurrency libraries. Java's model of sharing between threads has become very complex: you can get an idea of this from viewing this video.
3 December. This lecture will explain the concept of coroutines and its relation to concurrency.
30 November. The homework description has been improved and posted on this web site. As additional help, this lecture presents another example, finding the maximum palindrome length: erlang palindrome example.
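The Erlang solution isn't reproduced here, but the problem itself can be sketched sequentially in Haskell (a sketch of mine, with helper names of my own choosing):

```haskell
import Data.List (inits, tails)

-- True when a sequence reads the same in both directions.
isPal :: Eq a => [a] -> Bool
isPal s = s == reverse s

-- All contiguous substrings of a list.
substrings :: [a] -> [[a]]
substrings = concatMap inits . tails

-- Length of the longest palindromic substring (0 for an empty input).
maxPalLen :: Eq a => [a] -> Int
maxPalLen xs = maximum (0 : [length s | s <- substrings xs, isPal s])
```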
28 November. Covering the concurrency example (concurrentFac) on the erlang page, showing some syntax errors and the debugger module; also handed out draft of homework assignment.
27 November. A section on Debugging Concurrency has been added to the erlang page, which will be covered in the next lecture.
26 November. A short review of Erlang syntax, taken mostly from Chapter 1 of an Erlang Book. The lecture didn't get to the syntax for receive, so that will be continued.
12 November. Please read the erlang reading assignment to get started on Erlang. A great exercise (and possible homework) would be to program the Sieve of Eratosthenes so that each time a new prime number is generated, a new Erlang process is created for that prime number: then, as each new larger number K is generated (you'll need some process that simply generates 2,3,4,5, etc, as candidates to be primes), K is sent through the chain of existing processes. Each process in the chain is dedicated to "throwing away" any number it receives which is divisible by its prime, only sending on the numbers that can't be divided evenly. At the end of the chain, there's no place to send a candidate, so in that case a new process must be spawned (which is initialized to have the candidate as its prime). For testing, you should terminate for any candidate number larger than 300 or so, to limit output. Later, you can try larger limits.
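The same chain-of-filters idea can be sketched sequentially in Haskell with lazy lists (my sketch for intuition, not a template for the Erlang homework): each recursive call plays the role of one process in the chain, keeping its prime and forwarding only the candidates that survive.

```haskell
-- Each recursive call holds one prime and filters out its multiples,
-- mirroring the chain of Erlang processes described above.
sieve :: [Int] -> [Int]
sieve []       = []
sieve (p : xs) = p : sieve [x | x <- xs, x `mod` p /= 0]

-- All primes up to a limit (e.g. the 300 suggested for testing).
primesUpTo :: Int -> [Int]
primesUpTo n = sieve [2 .. n]
```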
9 November. Notes on erlang.
7 November. After finishing slides of previous lecture, I'll start with a Mini-Course on Erlang.
5 November. Starting coverage of concurrency (it's the topic of Chapter 12, but I will likely go outside the text in my own coverage of the topic). I presented some slides from two sources, one about how memory is used (stack, heap, program code) with processes and threads; the main feature to remember is that processes are "heavyweight", and make it difficult to share data, whereas with threads, memory is shared and creation is a lighter-weight procedure. The other slides I presented come from Concurrency Oriented Programming.
5 November. Proposed in class: A Quiz Friday (9 November) over Chapter 2 and 3 material.
2 November. A bit more coverage of Chapter 3. Please see notes on binding and scoping for more details.
- 29 October. Initial coverage of scoping (again, from Chapter 3).
- 26 October. Initial coverage of binding (starting Chapter 3).
- 24 October. More specific information on Chapter 2 reading: read Chapter 2 up to 2.2.2; then skip to 2.2.5; then skip the rest of the chapter. Pay attention to questions 1,2,3,4,6,7,8,9,12,16, and 19.
23 October no class, but here is an approximate, cumulative grade distribution, summing over all assignments, quizzes, and the midterm:
[flattened ASCII histogram of scores: marks spread over a scale from 50 to 140, with grade bands D, C, B, and A indicated along the scale]
- 22 October. Returned graded midterm examinations. Continue coverage of Chapter 2, reviewing that regular expressions generate regular languages. For each regular expression, a DFA can be constructed (Chapter 2 shows the classical progression of first making an NDFA, then transforming that into a DFA, and further optimizing the number of states in the DFA). A DFA can either generate a language or be used as a design for a "recognizer" of the language. During the lecture, I also sketched a proof of the theorem that any CFG can be transformed into an equivalent CFG where there are only two symbols on the right side of any derivation rule. Once there are only two symbols (which can be nonterminals), then it's a mechanical procedure to build a Prolog program that tells you whether a given sequence is in the language of the CFG or not.
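To make that last step concrete, here is a small Haskell recognizer (a toy example of mine, not from the chapter) for a hypothetical grammar already in the two-symbols-per-rule form -- S -> A T | A B, T -> S B, A -> 'a', B -> 'b', which generates a^n b^n -- using exactly the split-the-string search a generated Prolog program would perform:

```haskell
-- Nonterminals of the toy grammar S -> A T | A B, T -> S B, A -> 'a', B -> 'b'.
data NT = S | T | A | B

-- derives nt w: can nonterminal nt derive the string w?
derives :: NT -> String -> Bool
derives A "a" = True
derives B "b" = True
derives nt w =
  or [ derives x u && derives y v
     | (x, y) <- rules nt          -- try each two-symbol rule for nt
     , (u, v) <- splits w ]        -- and each way of splitting w in two
  where
    rules S = [(A, T), (A, B)]
    rules T = [(S, B)]
    rules _ = []
    splits s = [splitAt i s | i <- [1 .. length s - 1]]
```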
19 October. We start Chapter 2. The official course catalog for 22c111 says that one topic is syntax specification and informal semantic models, so we need to cover some syntax specification.
- 17 October. In class, a sample of exam questions was presented and there was a discussion of how to answer these questions; in the evening, we had the midterm exam.
15 October. Just a review of Prolog's search strategies and some examples; see the prolog page for some of the notes.
- 12 October. Finish covering Chapter 11. Add problems 7-8 on page 571 to the list of problems to solve for yourself; skip 11.2.6 through 11.4.3. One item of interest from the text we are skipping is problem 13 on page 583, which is answered in the small paragraph in 11.4.1:
Horn clauses cannot be used to express statements whose clausal form includes a disjunction with more than one nonnegated term
a(X) or b(X) :- c(X,Y), e(Y).
This is invalid syntax ("or" is not in the language). What this example is trying to say is that from knowing c(X,Y) for Y such that e(Y) is true, we can conclude that a(X) is true or b(X) is true -- but a(X) and b(X) don't have to both be true. There is no way to express this in Prolog. Another shortcoming of Prolog is that negation doesn't have its usual logical meaning. Negation, supported by the not( ) operator in Prolog, can be tricky to use, as shown on page 582. The term not(a(X)) is true in Prolog if a(X) is neither in the database of facts nor derivable from clauses -- essentially, not(a(X)) is true if Prolog's search fails when trying to find a(X). This is different from searching for a term like "!a(val)", that is, a term stating that a() is false for a particular value. Prolog doesn't have a way to express negative facts.
10 October. Continue covering Chapter 11, 11.2.3, 11.2.4, and 11.2.5. You should answer for yourself problems 1-6 on page 571. Also, some of the predicates listed on the prolog page are demonstrated in class.
8 October. Coverage of Chapter 11 starts, covering 11.1 and the introduction to 11.2, but not the details of 11.2.1. Also, some demonstration of prolog in class.
5 October. Answering questions about the fourth assignment and covering some advanced functional programming topics. Also, a bit more about the lambda calculus, and how binding and variable scope work in that notation.
- 3 October. Finish covering Chapter 10.
- The lecture looked in more detail at applicative/normal evaluation orders and lazy evaluation (with strict/non-strict functions).
- We don't look in detail at I/O issues, though some understanding of the limitations of functional programming languages on current hardware is helpful.
Note: you should read and answer, for yourself, problems 12, 14, 15, 17, 18, and 19 on page 551 by reading about these terms in the text (use the index if needed).
The lecture also started to discuss the fourth assignment, and how the Haskell example of LispEval illustrates evaluation of functional programming.
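As a small reminder of what non-strictness buys (a tiny example of mine, not from the text): under lazy evaluation an argument that is never used is never evaluated, so the diverging expression below is harmless, while applicative order would evaluate it first and fail.

```haskell
-- firstOf never touches its second argument: it is non-strict in it.
firstOf :: a -> b -> a
firstOf x _ = x

-- Under lazy evaluation the error is never forced.
lazyDemo :: Int
lazyDemo = firstOf 1 (error "never evaluated")
```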
Reading for the first week of October (with lectures going over some of the material): Chapter 10 of the textbook. Chapter 10 actually talks about "monads", the technique for I/O in Haskell. The chapter also covers a bit of Scheme, which has a more Lisp-like syntax than Haskell. Skip 10.3.6, the DFA example.
- 1 October. Start of covering Chapter 10. Some important concepts from the text:
The notion of side-effects (see Side effect (computer science)), which are available in Scheme via "set!" functions but actually break the rules of referential transparency. I/O in functional programming is inherently a side-effect (other efficiency-related cases are discussed at the very end of Chapter 10). Read the box on the bottom of page 534.
The notion of "context", an association list (see association list) and how it is used in evaluating expressions in functional languages. Context can include previous definitions and local definitions. The association list is, in some sense, managed like a stack during evaluation of definitions that have "let" or "where" clauses: the stack is pushed down to include more local terms in the context during function evaluation, then popped to roll back the context to its previous state after function evaluation finishes. Just how inner definitions override or inherit from outer scopes during binding (see binding (computer science)) is a design choice in a language's semantics. Scheme does it one way, Haskell another way, and Java has even more complicated rules.
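The stack-like treatment of the context can be sketched in Haskell with a plain association list (the names here are hypothetical, for illustration only): entering a let/where pushes new pairs on the front, so lookup finds the most local binding first, and "popping" is simply returning to the old list.

```haskell
-- A context maps names to values; the front of the list is most local.
type Context = [(String, Int)]

-- "Push" local bindings: they shadow anything further down the list.
extend :: [(String, Int)] -> Context -> Context
extend locals ctx = locals ++ ctx

-- Lookup finds the most local binding first.
valueOf :: String -> Context -> Maybe Int
valueOf = lookup
```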
I worked through the "compose" example on page 535, to show how the compose function, which is defined by a lambda expression, returns a new function.
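In Haskell notation (an analogue of the Scheme example, not the book's code), writing compose with an explicit lambda makes it visible that the result is a brand-new function value:

```haskell
-- compose returns a freshly built function, the lambda \x -> f (g x).
compose :: (b -> c) -> (a -> b) -> (a -> c)
compose f g = \x -> f (g x)
```

For example, compose applied to (+1) and (*2) yields a function that doubles its argument and then increments.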
I illustrated how (quote (+ 3 5)), also written as '(+ 3 5), is not evaluated in Scheme or Lisp, but treated as a constant, much like a string in Java or Haskell. But you can force (quote (+ 3 5)) to be evaluated, using the "eval" function (see eval), which invokes the Scheme or Lisp interpreter for a given input list.
We'll skip reading how eval and apply work, and also skip for now the presentation of denotational semantics.
Note: you should read and answer, for yourself, problems 4, 8, and 9 on page 539 by looking up these terms in the text (use the index).
- The discussion of different evaluation orders starting on page 539 is interesting. The different choices of evaluation order can have efficiency consequences.
- 28 September. The lecture revisited how LISP lists are represented as trees, and how such functions as equality and map (which is a higher order function) can be defined for trees.
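In Haskell (a sketch of mine; the lecture used Lisp notation), a Lisp-style list-of-lists is naturally a tree datatype: equality comes for free from deriving Eq, and the map analogue is an ordinary higher-order function over the tree.

```haskell
-- A Lisp s-expression: an atom, or a list of sub-expressions.
data SExpr = Atom String | List [SExpr]
  deriving (Eq, Show)

-- Apply a function to every atom, preserving the tree shape.
sexprMap :: (String -> String) -> SExpr -> SExpr
sexprMap f (Atom a)  = Atom (f a)
sexprMap f (List xs) = List (map (sexprMap f) xs)
```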
26 September. For the first part of the class, two problem solutions to the third assignment were presented, and the quiz took up the second part. Sample solutions are now added to the third assignment page.
- 24 September. A very brief look at the LISP language, its basic syntax and the striking feature that programs and input data both are treated as lists -- like (A B (C D E) () F G) which can be represented as trees. Evaluating expressions in a LISP program is a combination of reduction of known functions (arithmetic, simple built-in logic functions) and definition expansion with substitution for function arguments.
22 September. No class (Saturday), but check out this tattoo.
21 September. Further discussion of third assignment, showing some techniques, with more attention to foldl, working a new example added to functions and sequences. At the end of the lecture, reviewed datatypes and constructors, especially for trees.
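The foldl technique can be illustrated with a small example of my own (not the one worked in class): foldl threads an accumulator through the list, so no explicit recursion is written.

```haskell
-- Maximum of a non-empty list via foldl: the accumulator carries
-- the largest element seen so far.
maxOf :: [Int] -> Int
maxOf []       = error "maxOf: empty list"
maxOf (x : xs) = foldl (\acc y -> if y > acc then y else acc) x xs
```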
17 September. Review of Quiz questions, including more explanation of pattern matching in Haskell. Also, start of applying techniques of functions and sequences to second assignment problems, to show solutions that are not recursive.
14 September. More on functions and sequences, plus the first quiz.
12 September. I try to cover the material in functions and sequences. This is more difficult material about higher-order functions on sequences.
10 September. The lecture presented some miscellaneous syntax notes on Haskell, summarized in more haskell syntax.
7 September. This was a mostly interactive lecture, working through examples showing Haskell function definitions; the material was partly from the functional programming page.
5 September. I added a problem-solving page to the haskell notes page, and posted the second assignment.
31 August. I made good progress explaining how functions are defined in Haskell, see the haskell notes (in the notes, I explain better how lambda abstraction could be used, and mention the notions of "sections" and case definitions).
- 29 August. I'll start some presentations on the Haskell language (notes to be posted later).
- 27 August (first day). Some motivations for taking the course.
The topic Sapir–Whorf and programming languages is a controversial and interesting psychological motivation for the study of programming language concepts.
- Students of the programming languages course report that the main useful thing they get from the course is an improved ability to pick up (learn) new programming languages, which is useful because the world of computing changes quickly.