I attended the ICSE 2016 conference in Austin, Texas. My main purpose was to present a paper on company-university collaborations at one of the ICSE workshops. I also attended the first day of the main conference (May 18), and I had many interesting conversations with ICSE conferees.
I was at the 3rd International Workshop on Software Engineering Research and Industrial Practice (SER&IP) -- which was small but interesting. There was an excellent keynote talk by James Herbsleb (Carnegie Mellon University, also an old Bell Labs research guy). Jim talked about "Socio-Technical Coordination", with some analysis of collaboration behavior extracted from GitHub data. He shared some interesting thoughts about how people share information in geographically distributed projects.
Lori Pollock (University of Delaware) also gave a great talk on her experiences doing "field studies" of software productivity tool use. Her presentation was practical: it outlined some effective practices for setting up industry studies and for "instrumenting" software development tools.
There were a number of other interesting talks in the workshop. For the full report, see the following link: workshop_serip2016_report.html.
Chaos Engineering is a set of testing approaches for finding bugs in the interactions between software components. Chaos techniques automatically exercise some of a system's failure-recovery functionality during testing, with the goal of improving software reliability.
There was a panel session on Wednesday afternoon, with representatives of Chaos Engineering efforts at Netflix, Yahoo, and Microsoft. They explained some of their successes and failures, and they had some really good ideas on how to get buy-in from developers and managers for this "very extreme" form of testing.
For my notes on this panel session, see the following link: chaos_panel_summary.html. For more information on Chaos Engineering, visit the Principles of Chaos website: principlesofchaos.org.
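The core idea -- deliberately injecting failures so that recovery code actually gets exercised during testing -- can be sketched in a few lines. This is my own minimal illustration, not code from any of the panelists' systems; the function names here are hypothetical:

```python
import random

def fetch_recommendations():
    """Stand-in for a call to a remote service."""
    return ["movie-a", "movie-b"]

def chaotic(func, failure_rate, rng):
    """Wrap a call so it sometimes fails, forcing the caller's recovery path to run."""
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise ConnectionError("injected fault")
        return func(*args, **kwargs)
    return wrapper

def recommendations_with_fallback(fetch):
    """Caller with failure-recovery logic: fall back to a static list on error."""
    try:
        return fetch()
    except ConnectionError:
        return ["fallback-popular"]

# Exercise both the happy path and the recovery path over many calls.
rng = random.Random(42)  # fixed seed for reproducibility
flaky_fetch = chaotic(fetch_recommendations, failure_rate=0.5, rng=rng)
results = [recommendations_with_fallback(flaky_fetch) for _ in range(100)]
fallbacks = sum(1 for r in results if r == ["fallback-popular"])
print(f"fallback path exercised {fallbacks} times out of 100")
```

Production chaos tools work at a different level (killing processes or instances, injecting network faults), but the test-time principle is the same: if the fallback path never runs, you don't know whether it works.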
Mary Shaw presented the main conference keynote talk: "Progress Toward an Engineering Discipline of Software." This was a "rerun" of her SATURN 2015 keynote. (For my notes on that talk, see notes on Mary Shaw 2015 SATURN keynote. See https://www.youtube.com/watch?v=S03bsjs2YnQ for the YouTube video of the SATURN 2015 talk.) The talk has been updated a bit since 2015 -- and I thought she made some strong points. To be considered "engineering", a field must have activities based on science or on well-codified best practice. Mary is a bit skeptical about whether software engineering is mature enough to have well-codified practice yet.
There were several good papers in the main research track on Wednesday afternoon. The most interesting was "On the Techniques We Create, the Tools We Build, and Their Misalignments" by Eric Rizzi, Sebastian Elbaum, and Matthew Dwyer from the University of Nebraska-Lincoln. The paper talks about the problems of doing software engineering research using tools that are built on other tools. The authors were building on a popular symbolic execution tool called KLEE -- and on reflection they found a few serious bugs in the tool, serious enough to invalidate some of the conclusions in some of their studies. The paper is a good *cautionary tale* for software engineering research, and it deserves significant attention in the research community.
I listened to the presentations in one of the "software engineering education and training" sessions, which were mostly about Agile and Scrum training in high school or university environments. The main point: it is really hard to get students to "pay attention to following the process" in any project-based course. They always want to "deliver the product on schedule" no matter the cost, and they often abandon the agile team process -- reverting to "hacking" when they are under schedule pressure. The speakers had a few ideas about getting experienced coaches into struggling teams -- identifying those teams by monitoring their progress as well as their "commit" behavior in the software change management system.
I picked up a couple of ideas from a conversation with Dewayne Perry (an old contact from Bell Labs days, now at the University of Texas). He is looking into some of the issues that make measurement of software performance difficult -- such as long-lived, memory-greedy background jobs (like Firefox).
I also spoke to Michael Hilton from Oregon State University, a PhD student working with Danny Dig. He is conducting "interviews" with people and projects that use Continuous Integration, trying to assemble a set of best practices. There may be multiple contexts for CI: developers usually focus on unit tests to get quick feedback on their work, while DevOps people may be drawn to higher-level integration and deployment tests. Michael has a new paper on his partial results, which should be submitted soon to one of the software engineering conferences.
ICSE 2017 will be held May 20-28, 2017, in Buenos Aires, Argentina. More information will be posted on the ICSE 2017 website.