Sunday, November 18, 2007

implementing the clover sensor

i just finished an initial implementation of the hackystat ant-based clover sensor, which processes the xml clover report and sends the data to hackystat. as usual, creating an ant sensor is pretty easy, assuming the report the tool creates is in xml and makes sense.

here are some issues that i came across.

different attribute names
after writing the sensor, i noticed that the attribute names are different from emma's. for example, clover-generated Coverage data has statement_Covered and statement_Uncovered, while emma-generated Coverage data has line_Covered and line_Uncovered. in addition, because there are no required attributes (other than timestamps, type, etc.), the Coverage DPD needs to be "oblivious" to the different granularities. this all raises issues with DPD implementations: will the DPDs contain the last snapshot per tool? if a project uses multiple coverage tools, you might end up with a random sampling of last batches in the DPDs, each coming from a different tool. that would make it really hard to use the DPDs over time. it seems that the generality of everything going on is going to make it harder for the analysis writer, or even the user, to know what is happening, because they would have to know the details. i thought the whole idea of the abstraction hierarchy was to make "abstractions". let me throw this out as well: what if we wanted to combine emma's coverage values and clover's coverage values (they measure a different set)? right now there is no way to do that.
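just to show what "combining" could look like, here is a minimal sketch (mine, not hackystat code) of merging per-file coverage from both tools over the union of files. the record shapes are assumptions based only on the attribute names above: clover reports statement_Covered/statement_Uncovered, emma reports line_Covered/line_Uncovered.

```python
def combined_ratio(clover_records, emma_records):
    """merge per-file coverage from both tools over the union of files.

    if both tools report the same file, prefer the clover numbers --
    an arbitrary policy; the point is that *some* explicit merge policy
    is needed, which the current DPDs can't express.
    """
    merged = {}
    for path, rec in emma_records.items():
        merged[path] = (rec["line_Covered"], rec["line_Uncovered"])
    for path, rec in clover_records.items():
        # clover wins on overlap (assumed policy)
        merged[path] = (rec["statement_Covered"], rec["statement_Uncovered"])
    covered = sum(c for c, _ in merged.values())
    uncovered = sum(u for _, u in merged.values())
    total = covered + uncovered
    return covered / total if total else 0.0
```

with something like this, a file only emma saw still counts toward the project-wide ratio, instead of one tool's last batch silently shadowing the other's.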

issues with the clover ant task
it actually took longer to configure the clover tool with its ant tasks than it took to write the sensor. in fact, i wasn't able to configure the clover ant tasks the way i wanted, so technically i'm not done. for some reason i wasn't able to run clean before the clover-setup task. i don't get why that is, but i won't bother with it for now.

our test cases are junk
i need to work on those test cases. basically, i think following this will help.

clover can send filemetric data
clover's data looks like this:

<metrics classes="1" methods="18" coveredmethods="7"
conditionals="36" coveredconditionals="11"
statements="141" coveredstatements="77"
elements="195" coveredelements="95"
ncloc="270" loc="456" />

to me that looks like coverage data and filemetric data. so... would it be wrong to send both?
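to make that concrete, here is a quick sketch of how a sensor might split the &lt;metrics&gt; element above into coverage-flavored and size-flavored fields. the grouping and field names below are my own guess, not hackystat's actual SDT definitions.

```python
import xml.etree.ElementTree as ET

# the <metrics> element from the clover report, as shown above
CLOVER_METRICS = (
    '<metrics classes="1" methods="18" coveredmethods="7" '
    'conditionals="36" coveredconditionals="11" '
    'statements="141" coveredstatements="77" '
    'elements="195" coveredelements="95" '
    'ncloc="270" loc="456" />'
)

def split_metrics(xml_text):
    attrs = {k: int(v) for k, v in ET.fromstring(xml_text).attrib.items()}
    # the covered/uncovered pairs read like Coverage data...
    coverage = {
        "statement_Covered": attrs["coveredstatements"],
        "statement_Uncovered": attrs["statements"] - attrs["coveredstatements"],
        "method_Covered": attrs["coveredmethods"],
        "method_Uncovered": attrs["methods"] - attrs["coveredmethods"],
    }
    # ...while the raw size counts read like FileMetric data.
    filemetric = {
        "loc": attrs["loc"],
        "ncloc": attrs["ncloc"],
        "classes": attrs["classes"],
    }
    return coverage, filemetric
```

the same element really does carry both kinds of data, which is why sending two SDTs from one report doesn't seem crazy to me.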

2 comments:

austen.ito said...

It seems weird to use 2 different coverage tools and expect to compare the results from both.

If you really wanted to, you could filter the coverage data by tool, but, like you said, that may break the abstraction.

aaron said...

in v7, we used locc and sclc. we made sure that one of the tools always ran before the other so it didn't mess up the DPD. that was bogus, because sometimes the second tool would fail, or someone actually wanted locc data, and then the DPD was created with the wrong tool's data. the mixture of those tools was unavoidable as long as you had both. what really should have happened is that we merged the results as one.

and there is no reason not to have them. the system should be able to deal with multiple tools sending the same SDT. at this point, the abstractions are letting us down because they can't handle that use case.
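one way the DPD layer could handle that use case (a sketch of my assumption, not hackystat's actual design): keep the latest batch per tool instead of one global latest batch, then merge on demand.

```python
def latest_batch_per_tool(batches):
    """batches: list of dicts with 'tool', 'timestamp', and 'data' keys.

    returns {tool: data} using only each tool's most recent batch,
    so one tool's stale run can't shadow another tool's fresh one.
    """
    latest = {}
    for batch in batches:
        tool = batch["tool"]
        if tool not in latest or batch["timestamp"] > latest[tool]["timestamp"]:
            latest[tool] = batch
    return {tool: b["data"] for tool, b in latest.items()}
```

with a per-tool latest batch, the run ordering hack from v7 wouldn't matter: a failed or late sclc run just leaves its own slot stale instead of corrupting the whole DPD.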