Wednesday, January 13, 2016

Languages Summary

Well, I guess if I'm going to get my thoughts together on last semester, I'd better do it now because the next one starts next week. Algorithms was the meatier course, so I'll save that for this weekend when I have more time to write. Plus, I might set fire to my crappy Languages text at any moment and I need to refer to it one last time to write this entry.

CS4250 - Programming Languages.

Course Outline (basically following the text, omitting chapters on Concurrency and Exception/Event handling):

  1. Review of significant languages
  2. Syntax and Semantics
  3. Lexical and Syntax Analysis
  4. Names, Bindings, and Scopes
  5. Data Types
  6. Expressions and Assignments
  7. Control Structures
  8. Subprograms
  9. Subprogram Implementation Details
  10. Abstract Data Types
  11. Functional Programming Languages
  12. Logic Programming Languages
  13. Object-Oriented Programming

Key Takeaways:

The commonality between languages far exceeds the differences. The C-based languages start looking downright identical after a while. Even Prolog and Lisp aren't as far off the beaten path as they first appear.

While this wasn't really stated in the course, it was obvious to me that the choice of a language should be based on the supporting tools and programming environment. Just about any modern language can get the job done, but there are some big gains to be had with good toolkits. In particular, when completing the programming assignments using freebie stuff, I had no trouble with the languages, but certainly wrestled with some of the environments. Since I'm not a Java guy, I haven't used Eclipse. As it's the popular choice for programming in such a major language, I assume it's pretty good. Everything I used for this class downright sucked next to Visual Studio.

I can't tell how serious Google is about Dart. If it really is going to become their language of choice AND they keep it entirely open source, look for it to turn into a serious contender. Google surely does know how to build tools, and there are thousands of very good programmers who will happily stay up all night improving on them on the chance it might get them in the door at the big G.

Logic programming is fun, even if there is no market for it.

Functional programming is not nearly as much fun, but there are times when it is absolutely the right tool for the job. I like the way F# slides seamlessly into a .net solution. You only have to write the functional part in the functional language. I'm not aware of any good way to shoehorn Lisp or Scheme into Java other than Silk, which might be a good answer - I've never really looked into it.

While I'm absolutely sold on C# (see note above about programming tools), I'm not a true object-oriented adherent. Classes have their place, but forcing everything into a class just results in a bunch of silly classes that are really just function libraries. At least C# lets you declare such classes as static (Java gets you to the same place with static methods and a private constructor), so you don't need to instantiate an actual object to get at the routine you want.
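The "function library" pattern is easy enough to sketch; here's a minimal Python stand-in (my own toy example, not anything from the class) for what a C# static class or a Java utility class ends up looking like:

```python
import math

# A "function library" class: no state, never instantiated. In C# this
# would be declared static; in Java, a final class with a private
# constructor and static methods.
class GeometryUtils:
    @staticmethod
    def circle_area(radius):
        return math.pi * radius ** 2

    @staticmethod
    def circle_circumference(radius):
        return 2 * math.pi * radius

# No object needed to get at the routine you want:
area = GeometryUtils.circle_area(2.0)
```

In Python, of course, a plain module of functions does the same job with less ceremony, which is rather the point about "silly classes."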

Unfortunately, I knew almost all of what I just wrote last August.

Stuff from the class that's probably worth remembering:
  • BNF and parse trees
  • Dynamic semantics, particularly axiomatic semantics. I still think there's a decent paper to be written on using the axiomatic semantics of design patterns to select the appropriate pattern based on the functional test cases. I don't know that I'll be the one to write it, but it could be a pretty significant work (even a dissertation) if done right.
  • Recursive descent versus bottom-up parsing
  • Binding time
  • Static vs Dynamic Scoping
  • Referencing environments
  • Lots of stuff on data types that I already knew and am in no danger of forgetting since I use it every day.
  • Lots of stuff on statements which I also already knew and use every day.
  • The one exception to the last statement is guarded commands, but I'm not sure there's much merit to remembering that. Nobody uses them in their pure sense. Co-routines and forking offer essentially the same thing in a better package. One might argue that Hadoop's map-reduce logic is essentially guarded commands. That one would not be me, but one might.
  • Parameter passing methods (value, value-result, reference, name). That last one isn't nearly as theoretical as the book or class would imply. It's basically how PL/SQL works.
  • Overloads and generic functions/methods
  • Closures, which are used a lot more than I realized. I knew that C# uses them when passing a lambda expression as a delegate, but it builds them in lots of other situations, too. I recognized this when the debugger threw a weird entry on the call stack a few weeks ago. I didn't recognize the code block being referenced, and after poking at it a bit I realized the compiler had taken an anonymous block and created a closure for it, even though that block wasn't referenced outside of its context. Not quite sure why the compiler thought that was necessary, but it was cool to dig through it and figure out what was really going on.
  • Co-routines and parallel structures.
  • Implementation details of stack and local variables, including static chains.
  • Nested subprograms and why they are a pain to implement.
  • Deep (dynamic chain) and shallow (activation record) access for dynamic scoping
  • The last third of the course (10-13 above) had lots of stuff that was interesting 30 years ago when I saw it for the first time. As none of it was new this go round and I use all of it regularly, I won't bother summarizing. That said, it was still the most interesting part of the course.
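To jog my own memory on the BNF and recursive-descent bullets, here's a minimal sketch (my own toy grammar, nothing from the text): one parse function per nonterminal, with each function consuming the tokens its rule describes.

```python
import re

# Toy grammar (EBNF-ish), one parse method per nonterminal:
#   expr   -> term { "+" term }
#   term   -> factor { "*" factor }
#   factor -> NUMBER | "(" expr ")"
def tokenize(src):
    return re.findall(r"\d+|[+*()]", src)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, tok=None):
        t = self.peek()
        if t is None or (tok is not None and t != tok):
            raise SyntaxError(f"expected {tok!r}, got {t!r}")
        self.pos += 1
        return t

    def expr(self):
        value = self.term()
        while self.peek() == "+":   # { "+" term }
            self.eat("+")
            value += self.term()
        return value

    def term(self):
        value = self.factor()
        while self.peek() == "*":   # { "*" factor }
            self.eat("*")
            value *= self.factor()
        return value

    def factor(self):
        if self.peek() == "(":      # "(" expr ")"
            self.eat("(")
            value = self.expr()
            self.eat(")")
            return value
        return int(self.eat())      # NUMBER

result = Parser(tokenize("2+3*(4+1)")).expr()  # evaluates to 17
```

Since the grammar puts "*" one level deeper than "+", precedence falls out of the call structure for free — no parse tree required for something this small.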
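The static-vs-dynamic-scoping and deep-access bullets fit in one sketch, too. Python is statically scoped, so this simulates dynamic scoping with an explicit stack of activation records and deep access (search the dynamic chain from newest record down) — again, my own illustration, not course code:

```python
# Simulated dynamic scoping: a stack of activation records.
dynamic_chain = []

def lookup(name):
    # Deep access: walk the dynamic chain from the most recent call down.
    for record in reversed(dynamic_chain):
        if name in record:
            return record[name]
    raise NameError(name)

def call(func, local_vars):
    dynamic_chain.append(local_vars)   # push an activation record
    try:
        return func()
    finally:
        dynamic_chain.pop()            # pop on return

def show_x():
    return lookup("x")  # resolved by whoever is on the call chain

def static_x():
    x = "static"        # ordinary Python: the lexical scope wins
    def inner():
        return x
    return inner()

# show_x has no local x; the dynamic chain finds the caller's x.
dynamic_result = call(lambda: call(show_x, {}), {"x": "dynamic"})
```

Shallow access would trade the chain walk for a per-name lookup table maintained on every call and return — faster reads, more bookkeeping.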
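For the parameter-passing bullet, call-by-name is the one worth a sketch. Python passes object references, so the by-name behavior has to be faked with a thunk (a zero-argument lambda) that re-evaluates the argument expression on every use — a toy example of mine, under that assumption:

```python
def call_by_value(x):
    return x + x                  # argument evaluated once, by the caller

def call_by_name(x_thunk):
    return x_thunk() + x_thunk()  # argument expression re-evaluated per use

counter = {"n": 0}

def next_n():
    counter["n"] += 1
    return counter["n"]

by_value = call_by_value(next_n())        # next_n runs once: 1 + 1 = 2
by_name = call_by_name(lambda: next_n())  # next_n runs twice: 2 + 3 = 5
```

The side effect makes the difference visible: by value, the argument is a number; by name, it's an unevaluated expression that the callee re-runs.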
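And since closures were the genuine surprise of the semester for me, a quick Python version of what I saw the C# compiler doing (my own example; the capture-the-variable behavior is the point):

```python
# A closure captures the variable, not a snapshot of its value --
# roughly what the C# compiler does when it lifts a captured local
# into a hidden class behind the scenes.
def make_counter():
    count = 0
    def bump():
        nonlocal count   # refer to the enclosing local, not a copy
        count += 1
        return count
    return bump          # count outlives make_counter's activation

counter_a = make_counter()
counter_b = make_counter()          # independent captured environment
first, second = counter_a(), counter_a()
other = counter_b()
```

Each call to make_counter gets its own captured count, which is exactly why the runtime has to heap-allocate the environment instead of leaving it on the stack.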

In summary, it was a class I shouldn't have taken, especially since I ended up switching my concentration to Stats. But there wasn't a good way to know that going in, and having such an easy class meant I was able to focus on Algorithms, which was certainly not easy. We'll get to that shortly.
