
In the spirit of some general discussions like this one, I'm opening this thread with the intention of gathering opinions on the open challenges and hot topics in research on programming languages. I hope that the discussion might even bring to the surface opinions regarding the future of research in programming languages.

I believe that this kind of discussion will help new student researchers interested in PL, like myself, as well as those who are already somewhat involved.

zpavlinovic
  • community wiki? – Suresh Venkat Jun 02 '13 at 06:14
  • I think it would really improve this question and those answering it if you quoted or summarised the text of the "Frontiers of TCS" question. The expected scope of answers to this question is unclear, while the other question is more precise about what it expects. – Vijay D Jun 04 '13 at 02:01
  • When I asked this question on Stack Overflow some time ago... I got downvotes and my question was closed! – Pankaj Sejwal Jun 05 '13 at 16:17

4 Answers


I think the overall goal of PL theory is to lower the cost of large-scale programming by improving programming languages and the technical ecosystem in which languages are used.

Here are some high-level, somewhat vague descriptions of PL research areas that have received sustained attention, and will probably continue to do so for a while.

  • Most programming language research has been done in the context of sequential computation, and by now we have arguably converged on a core of features that are available in most modern programming languages (e.g. higher-order functions, (partial) type inference, pattern matching, ADTs, parametric polymorphism) and are well understood. There is as yet no such consensus about programming language features for concurrent and parallel computation.

  • Related to the previous point, research on type systems has mostly concerned sequential computation. Can we generalise this work to find tractable and useful typing disciplines constraining concurrent and parallel computation?

  • As a special case of the previous point, the Curry-Howard correspondence relates structural proof theory and functional programming, leading to sustained technology transfer between computer science and (foundations of) mathematics, with e.g. homotopy type theory being an impressive example. There are many tantalising hints that it can be extended to (some forms of) concurrent and parallel computation.

  • Specification and verification of programs has matured a lot in recent years, e.g. with interactive proof assistants like Isabelle and Coq, but the technology is still far away from being usable at large scale in everyday programming. There is still much work to be done to improve this state of affairs.

  • Programming languages and verification technology for novel forms of computation. I'm thinking here in particular of quantum computation and biologically inspired computational mechanisms, see e.g. here.

  • Unification. There are many approaches to programming languages, types, verification, and one sometimes feels that there is a lot of overlap between them, and that there is some more abstract approach waiting to be discovered. In particular, biologically inspired computational mechanisms are likely to continue to overwhelm us.

One problem with PL research is that there are no clear-cut open problems like the P vs NP question, for which we could immediately say whether a proposed solution works.

Martin Berger
  • If I may add: quantum computing and quantum programming languages. Even if quantum computing is not happening yet, the study of how some programming concepts might be transferred to this model of computation is interesting if nothing else. Also programming in natural language, fuzzy programming, and even physical computation and physical programming (programming directly on matter, beyond the molecular level). – Nikos M. Jun 12 '14 at 18:48
  • @NikosM. I agree, QC is a big deal and heavily investigated. This paper shows a surprising connection between the foundations of quantum mechanics and programming language theory, unearthed only by abstraction. – Martin Berger Jun 12 '14 at 18:55
  • Nice, maybe a question could address these kinds of formal (or not formal) relations – Nikos M. Jun 12 '14 at 19:01

Let me list some assumptions that limit programming language research. These are hard to break away from because they feel like an essential part of what programming languages are about, or because exploring alternatives would "not be programming language design anymore". With each assumption I list its limiting effects.

  1. Programs are syntactic constructs.

    • Real programmers would never use iPads to construct source code. And even if they did, they could never be as efficient as with Emacs, Eclipse, NetBeans, XCode, etc.
    • Research on alternative ways of constructing programs is not programming language design, but either graphical user interface design, or education (cf. Scratch).
  2. A partially written program cannot be executed.

    • At the very least, a runtime error occurs when execution reaches a missing part.
    • What good could there be in running unfinished programs?
  3. Programs are about giving instructions to computers.

    • Programming language design has nothing to say about how to write and organize laws.
    • Bacteria do not write programs.
  4. Programming is like engineering and cannot be done by ordinary people.

    • Ordinary people do not know the syntax, the concepts, the tools, so they cannot possibly write programs.
    • Even if we try to make it possible for ordinary people to write programs, they will only be able to write trivial stuff.

I think I could go on.

Andrej Bauer
  • Is program synthesis not part of PL? I think a lot of researchers would disagree with parts 2 and 4. – James Koppel Jun 03 '13 at 03:57
  • @James: of course I agree with you that PL should be understood and applied broadly. My point is that you cannot get into POPL with a paper on how you programmed your iPad so your aunt can program it in trivial ways. It would be very badly "out of scope". But even something like "GUI for Agda" would be out of scope at top PL conferences. – Andrej Bauer Jun 03 '13 at 05:10
  • @AndrejBauer I'm not sure I fully agree that these are open problems in PL. It is generally agreed that one can program in ways other than textual. It's just that (with the exception of UI programming) the quality of non-textual programming has not been convincingly established. E.g. have you got credible empirical evidence that "your aunt can program [the iPad] in trivial ways"? – Martin Berger Jun 03 '13 at 07:07
  • @AndrejBauer Perhaps, but Sumit Gulwani's POPL '11 paper is basically on how your aunt can program Excel in trivial ways, so... – James Koppel Jun 03 '13 at 08:10
  • James: excellent, I shall inform my aunt. Martin: this is precisely the sort of thing I am talking about - non-textual programming has not been convincingly established because the PL community isn't taking it seriously, because it has not been convincingly established. But it seems quite obvious to me that humans were not made for typing words on screens. We're good at throwing stuff and picking blueberries. – Andrej Bauer Jun 03 '13 at 13:21
  • @AndrejBauer As a scientific argument, "It's quite obvious to me" is not beyond improvement. If you look at the history of writing systems, of which programming languages are but a recent example, their historical trajectory has been away from logographic writing. Maybe our ability to parse strings is more relevant than blueberries. Alphabetic writing has evolved over millennia, so it's unlikely that it contains massive, easily fixable bugs. That said, I'm happy to believe that we can do better than ASCII based linear strings. I think it'll be a while before we do. – Martin Berger Jun 03 '13 at 16:32
  • Yeah, we could try abstract syntax trees (with little blueberries at the leaves) on iPads and see how that goes. – Andrej Bauer Jun 03 '13 at 19:02
  • I don't understand what this answer is about. Is running unfinished programs a hot topic or a research challenge in PL research? – Sasho Nikolov Jun 04 '13 at 07:57
  • The point of my answer is to "think outside the box". To examine hidden assumptions in PL research and to see how they limit the potential PL research has got. – Andrej Bauer Jun 04 '13 at 09:20
  • @AndrejBauer Dynamically typed programming languages like Python, JavaScript and the Lisp family enable you to run unfinished programs. They are also extremely popular. In what sense don't they do what you are asking for? – Martin Berger Jun 04 '13 at 11:56
  • What's the semantics of "we hit an unfinished piece of code", other than "undefined" or "exception"? Can it be more descriptive than that? What if we equip the unfinished part with a specification, or at least a type, can we do better then? – Andrej Bauer Jun 04 '13 at 12:57
  • @AndrejBauer With dynamically typed programming languages you can 'poke around' in the execution state. But I think you mean something different: to substitute specifications (e.g. types or Hoare logic formulae) for missing parts. I see specifications as a form of two-version programming (with the specification language typically being much higher-level), so what happens is a switch from one programming language to another. If the specification is weak (e.g. $\mathsf{true}$) then not much of interest is going to happen. – Martin Berger Jun 04 '13 at 13:09
  • @AndrejBauer, I think limiting the scope to POPL is a mistake -- lots of this kind of work is done at OOPSLA, or at ICSE, or even at CHI. POPL isn't interested unless there's a novel formal approach, but POPL isn't the whole PL community. – Sam Tobin-Hochstadt Jun 11 '13 at 15:34
  • Who is limiting scope to POPL? What are you talking about? – Andrej Bauer Jun 11 '13 at 20:58
  • @AndrejBauer, is point two really a limitation on the PL design community? It seems this "limitation" has been broken in dependently typed languages like Matita where (typed) metavariables are first class entities, and also in context calculi where one can even abstract over unfinished bits of code. – Dominic Mulligan Jun 13 '13 at 09:39
  • @DominicMulligan: sure, these are all very welcome ideas. With my comments I am trying to change the perception of what programming is. So if the theoretical ideas can be put to good use in practice (by which I mean that "ordinary" programmers will use them in daily life), then we have won. – Andrej Bauer Jun 13 '13 at 10:18
  • @SamTobin-Hochstadt is right: you argued that "you cannot get into POPL with a paper on how you programmed your iPad so your aunt can program it in trivial ways". But limitation 1 is broken by research on modeling languages (in software engineering conferences) and projectional editing, and limitation 4 is broken by research on end-user programming. About limitation 2, I heard of work on "Prorogued Programming" published at Onward, an (admittedly forward-looking) OOPSLA track. All of this is in scope at "top PL conferences". Or see the PLDI work on type-based completion for Scala. – Blaisorblade Mar 12 '14 at 16:02
  • @AndrejBauer: in other words, what you say becomes true only once you replace "PL community" with "theoretically-minded PL community". To me, one reason is that things like human-computer interaction are not nearly understood enough to allow building a precise theory of them (what are the axioms for humans? What are the judgements?). On point 1, the other problem is that this point has been unsuccessfully and massively challenged for decades by "graphical editors", yet nobody has a good solution yet. – Blaisorblade Mar 12 '14 at 16:11
  • 1.) They were investigating this with Pharo, Smalltalk, Jupyter notebooks, and – aoeu256 Apr 29 '21 at 18:12

There is one problem I have been wondering about. I have no idea whether it qualifies as an open challenge.

Mathematical knowledge has been steadily growing with time. The theoretical foundations, concepts, notations, and proofs have evolved over the centuries. Mathematicians have managed this aggregation without necessarily checking its global consistency in a systematic and formal way at any point in time (though there have been attempts to do so).

We should expect programming languages and program libraries to aggregate and evolve similarly over time. What kind of tools could help manage the aggregation of programming results and libraries so as to keep them consistent and effectively usable by all, given that computers can be more formal and demanding regarding consistency? Do we have to redo the libraries for each new programming language? Why should we have to choose a language because it has the right libraries for the intended application, rather than for its intrinsic qualities as a programming medium?

On a different topic, you might find ideas in the following question: Are programming languages becoming more like natural languages? I realize that the idea may not appeal to many theoretical computer scientists, but it may still be useful for looking at different issues, or at the same issues from a different point of view. I am far from agreeing with many of the ideas that were posted, but that is what discussion is for.

babou
  • Consistency is over-rated. – Andrej Bauer Jun 03 '13 at 05:11
  • I can see there is not much agreement on this, however modest, suggestion. Still, it might be more helpful, at least to me, to have a few explanatory words as to why. In case I was unclear, I never meant to say that mathematics could be inconsistent, only that it was not (necessarily) aggregated with consistent means (historians would tell better). From a CS point of view, I may be wrong regarding the difficulty of aggregation (I never did any technical work on this), but I am only relating user experience and commonly heard point of view, indirectly detrimental to languages produced by TCS. – babou Jun 03 '13 at 08:15
  • Well, my remark is mostly about the fact that consistency is an all-or-nothing idea, whereas in reality most software is "mostly consistent". And yet we use it and find it useful. Why then are theoreticians obsessed with what seems to be a practically unattainable and too idealistic a concept? It would seem better to be able to quantify consistency in some less trivial way. – Andrej Bauer Jun 03 '13 at 13:15
  • @AndrejBauer - Thank you for replying. I am a bit surprised by your statement, as applied to what I wrote. Nothing there supports some form of absolute consistency, but only a wish for some workable approach that would make aggregation possible and meaningful in an evolving context. Mostly consistent as you say, might do. Finding what consistent should mean for the purpose was part of the idea, and I was not suggesting any answer, trivial or otherwise. I have never been an obsessed theoretician, and I do not see from your answer where we could be at odds. – babou Jun 03 '13 at 19:10
  • I think I was just ranting about "pure theoreticians", that's all. Please ignore me. – Andrej Bauer Jun 03 '13 at 20:46
  • Indeed, this is a good problem to think about! There is some work in this area at present. See, for instance, the blame calculus which attempts to integrate languages with different type systems (including typed and untyped languages). Microsoft's .NET platform is a good medium to do these things in, if the theoreticians can get around their inherent distaste for Microsoft platforms. – Uday Reddy Dec 01 '13 at 12:01

There has been a tremendous explosion of innovation in programming languages, on both the applied and theoretical sides, over the last century. Yet a case might be made that this was a singular, one-time event in the history of computing, similar to an "evolutionary explosion" (see also "why are there so many programming languages?" on cs.se), and that the future will therefore not be like the past in this respect. However, there are some identifiable long-range trends currently in play or under development.

  • Programming/software complexity, and ways of managing, minimizing, and mitigating it, is a topic that has always influenced language design, and it is possibly even more significant in the current age, when very large and complex software systems are quite common. It was a major aspect of the OOP design rationale, yet now we have highly complex OOP systems! Focused pondering of it has led to classics in the field such as The Mythical Man-Month by Brooks, which in many ways is still a very valid perspective, possibly even more relevant than when it was written.

  • Parallelism. There is a shift in hardware toward greater parallelism (e.g. multicore), and clock-speed increases are no longer sufficient to increase performance. This shift happened around the mid-2000s and is having a major influence on language research and design. Parallelism was always a topic, but it now has a new prominence and urgency, and there is widespread thinking, even consensus, that parallel programming is overly complicated and difficult, and that different theoretical approaches could alleviate some of this. A nice reference: The Landscape of Parallel Computing Research: A View from Berkeley.

  • Data mining / big data. These are influencing programming language design. New directions in database architecture are also rippling into programming languages.

  • Supercomputing has a significant impact on language design and also overlaps with parallelism and data mining / big data, e.g. with new programming models like MapReduce.

  • Visual/dataflow programming. There has been an increase in these types of "languages" (in a sense, visual programming is in many ways decoupling programming from "languages"). There is also strong cross-pollination with parallelism.

  • AI. This is more of a long-range wildcard, and it is not very clear right now how it will impact computer languages and programming, but the impact will probably be very substantial. In the past [in a different form] it led to entire languages such as Prolog. An early indication of how it can be applied with striking results is genetic algorithms / genetic programming.

A reference that might have some helpful ideas along the lines of "the future of programming languages" is Beyond Java by Tate. He ponders (albeit controversially) whether Java (arguably one of the most sophisticated/comprehensive programming languages in existence) is starting to show its age, and whether there are early signs of new languages and approaches emerging to fill its place in the long term.

vzn