After week 1, my result was bare-bones communication between the wrapper bear and the given executable, as I mentioned in the previous post. The whole trick of this project is to use JSON strings to communicate between the two entities; this way we ensure that any language able to parse JSON is a valid language to write a bear for coala. Last week I worked on making the wrapper send JSON strings containing the necessary information to the executable and on parsing its output into result objects. My goal for this week was to write tests and set up the CI to build the project properly.
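To give a rough idea of what such an exchange could look like, here is a minimal sketch in Python. The field names (`filename`, `file`, `settings`, `results`) are hypothetical illustrations, not the actual schema the wrapper uses:

```python
import json

# Hypothetical sketch of the wrapper <-> executable JSON exchange;
# the real field names used by the coala wrapper may differ.
def build_request(filename, lines, settings):
    """Build the JSON string the wrapper could send to the executable."""
    return json.dumps({
        "filename": filename,
        "file": lines,
        "settings": settings,
    })

def parse_response(raw):
    """Parse the executable's JSON output into plain result dicts."""
    return json.loads(raw).get("results", [])

request = build_request("foo.py", ["x = 1\n"], {"max_line_length": 80})
results = parse_response('{"results": [{"message": "Line too long", "line": 1}]}')
print(results[0]["message"])  # → Line too long
```

Since both sides only need `json.dumps`/`json.loads` (or their equivalents), the executable can be written in any language with a JSON parser.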
When I wrote my timeline for this summer I was pretty sure I would encounter difficulties, so I reserved some weeks especially for solving them. That way I was sure I would not delay my project. While developing the JSON communication I hit a minor (maybe not so minor) setback. It is a bit technical, but I will try to explain it without going into too much detail. Basically, coala can take settings for its bears (e.g. max_line_length) which are specific to each bear. A developer of a foreign language bear must be able to access such settings, obviously via the JSON string passed to their executable. The class returned by the decorator should include this functionality, but the settings have to be passed into the user's wrapper somehow. I managed to find a nifty way to do it by creating a dummy function. Without going into any more detail, the settings parsing works well, but there is a problem when optional settings are used. If an optional setting is not specified in the .coafile or on the command line, the default value should be used. In my case, however, the default value is not applied. To solve this I tried many simple fixes and workarounds, but I think I have to dig deeper into the coala core. I decided to postpone this to my buffer/problem-solving week, which is week 4. Fortunately the problem is not game-breaking, and the week 3 goal can be reached without any setbacks.
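For the curious, the dummy-function trick relies on the fact that coala discovers a bear's settings by inspecting a function's signature. The sketch below is my own simplified illustration of that idea, not the actual code; `create_settings_stub` is a hypothetical name:

```python
import inspect

# Hypothetical sketch: build a do-nothing function whose signature
# mirrors the wrapper's settings, so that signature inspection (the way
# coala discovers bear settings) picks them up.
def create_settings_stub(**defaults):
    params = [
        inspect.Parameter(name, inspect.Parameter.KEYWORD_ONLY, default=value)
        for name, value in defaults.items()
    ]

    def stub(**kwargs):
        pass  # never actually called; only its signature matters

    stub.__signature__ = inspect.Signature(params)
    return stub

stub = create_settings_stub(max_line_length=80)
print(inspect.signature(stub).parameters["max_line_length"].default)  # → 80
```

The bug described above would correspond to the default value (here `80`) getting lost somewhere between this signature and the JSON string handed to the executable.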
By testing your software you make sure that everything works as expected, but more importantly you ensure that future development will not break other modules. In coala, continuous integration (CI) is set up to enforce code coverage, so that all of the pushed code is relevant (no unused parts). To make sure that every line of code you write is executed, you also have to write appropriate test cases. This means that a pull request (PR) is never accepted in coala if it lacks proper testing (i.e. if coverage drops below 100%). In my case, I had to study how testing is done in coalib (the coala library), since I had only written tests for bears so far.
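As a flavour of what "every line executed" means in practice, here is a minimal, hypothetical unittest-style example (the function and test names are mine, not from coalib). Note that both branches of the parser are exercised, which is what keeps coverage at 100%:

```python
import json
import unittest

def parse_results(raw):
    """Parse the executable's JSON output; tolerate a missing key."""
    return json.loads(raw).get("results", [])

class ParseResultsTest(unittest.TestCase):
    def test_results_present(self):
        # Covers the normal path where the executable reported results.
        self.assertEqual(parse_results('{"results": [{"line": 1}]}'),
                         [{"line": 1}])

    def test_results_missing(self):
        # Covers the fallback path when the "results" key is absent.
        self.assertEqual(parse_results('{}'), [])
```

Such a file would be run with `python -m unittest` and counted by the coverage tooling in CI.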
Since I started, I haven't managed to merge any of my code because it lacked appropriate tests. By the end of this week, I expect to merge my code and bring coala foreign language bear support that is a little more than just bare bones. The following parts of the project will include utilities that let foreign language bear developers write as little Python code as possible, ideally none at all (though I don't see why they would want that; Python is awesome :D).