12:00:47 <E-P> #startmeeting Mer QA meeting 28/06/2012
12:00:47 <MerBot> Meeting started Thu Jun 28 12:00:47 2012 UTC.  The chair is E-P. Information about MeetBot at http://wiki.merproject.org/wiki/Meetings.
12:00:47 <MerBot> Useful Commands: #action #agreed #help #info #idea #link #topic.
12:00:55 <Stskeeps> o/
12:00:59 <E-P> #topic Current status
12:01:21 <E-P> #info Tools and tests development process updated to wiki
12:01:29 <E-P> #link https://wiki.merproject.org/wiki/Quality/Development
12:01:51 <Stskeeps> #info will start work on doing Mer releases/builds in a way that can support automated QA and replicable releases, finally
12:01:52 <phaeron1> o/
12:02:54 <E-P> I would like to move the tools development process somewhere else in the wiki; right now it is under QA
12:03:16 <E-P> I haven't yet created the Category:About pages for QA tools
12:04:32 <E-P> lbt, phaeron1: any wishes on where we should have the tools development description? As its own page, or as part of the current Tools page?
12:04:54 <lbt> part of Tools I think
12:05:04 <E-P> I don't have anything else
12:06:09 <Stskeeps> my plan with releases is to start utilizing the copy-project patch within Mer CI quite soon, and to do snapshot builds quite actively
12:06:13 <Stskeeps> which we can run with test .ks files
12:06:38 <lbt> #info Release of OBS 2.3.1mer-1 is underway
12:06:53 <E-P> great
12:07:05 <Stskeeps> so far eat-host,eat-device is a nice combo
12:07:09 <Stskeeps> and hasn't failed me in demos
12:07:36 <Stskeeps> and testrunner-ui working fine
12:07:58 <E-P> nod
12:08:11 <Stskeeps> so QA is truly a selling point
12:08:32 <Stskeeps> i also use eat-device as a nice way of demonstrating it's easy to prototype on device
12:08:36 <Stskeeps> as you can scp/ssh with ease
12:08:47 <phaeron1> I would like to start using eat-host and eat-device, and testrunner, with my just-finished VM-based testing
12:09:34 <phaeron1> it's very simple for now but should allow for testrunner-based arbitrary testing of i586 VMs
12:09:41 <E-P> phaeron1: do you have a ks file for creating a test automation VM image?
12:10:03 <Stskeeps> i've been missing smoke testing a lot with the recent release, so being able to run even the simplest of tests would be good
12:11:08 <phaeron1> E-P: the VM that performs the tests, or the VM that will be tested? :)
12:11:39 <E-P> Stskeeps: we can have that as our main priority for now
12:11:57 <Stskeeps> even simple tests like "did this boot" would be good
12:11:58 <E-P> phaeron1: the VM that will be tested
12:12:05 <Stskeeps> as we have a new compiler and systemd coming in
12:12:31 <phaeron1> E-P: basically any i586 ks with ssh and eat-device added
12:12:43 <E-P> phaeron1: + the eth0 setup
12:13:20 <phaeron1> well, right now I am using a big kernel that can do DHCP, so it is not HA-dependent
12:13:32 <phaeron1> virtio-net + dhcp
12:13:39 <phaeron1> I am sure this will change
12:13:39 <E-P> ok
12:14:04 <phaeron1> but it was much simpler not to use the image's kernel, as it might well fail to boot in a VM
12:14:07 <phaeron1> not sure yet
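A minimal sketch of what such a test-image kickstart might look like, assuming openssh-server and eat-device are the packages to pull in; the repo URL, group name and sizes are placeholders, not an agreed definition, and networking is left to the VM-side kernel (virtio-net + built-in DHCP) as phaeron1 describes:

    # hypothetical test-vm.ks -- i586 image with ssh and eat-device for automated testing
    lang en_US.UTF-8
    timezone --utc UTC
    rootpw rootme
    part / --size 2000 --ondisk sda --fstype=ext3

    # placeholder repo URL -- point this at the Mer release under test
    repo --name=mer-core --baseurl=http://example.org/mer/latest/builds/i586/packages/

    %packages
    # assumed group/pattern name
    @Mer Core
    openssh-server
    eat-device
    %end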
12:15:36 <E-P> anything else to add?
12:15:59 <Stskeeps> i've added a few task bugs for performance work as well, debugging, etc
12:16:03 <Stskeeps> will be in next triage
12:16:08 <Stskeeps> powertop, sp-* tools, etc
12:17:20 <phaeron1> I would like comments on that HA thing
12:17:40 <Stskeeps> phaeron1: sounds good to me, tbh - if you can do it in a way that's equal to what is in mer kernel-headers..
12:17:47 <Stskeeps> then that's a winner in my book
12:18:41 <Stskeeps> ie, kernel sources
12:18:53 <Sage_> what is the reason for the kernel to do the DHCP? No possibility to configure connman or something else?
12:19:11 <Stskeeps> Sage_: connman's notoriously difficult to set up with a static IP at first, it seems
12:19:20 <Stskeeps> and the other upside is that ping comes up quite quickly
12:20:07 <Stskeeps> wasn't there something called netconsole too?
12:20:11 <lbt> yes
12:20:13 <Stskeeps> so we can have kernel messages from early on
12:20:27 <Sage_> ok, sounds good to me.
12:20:30 <phaeron1> Stskeeps: I use a recent 3.x kernel built on cobs; we can sync it with Mer Core of course. It needs various virtio options and the built-in DHCP thing
12:20:34 <lbt> http://www.mjmwired.net/kernel/Documentation/networking/netconsole.txt
12:21:38 <Stskeeps> we also need a new way of receiving syslog messages, i think
12:21:39 <E-P> that will be a useful tool for debugging
12:21:42 <Stskeeps> for testrunner*
12:21:55 <lbt> nc
12:22:00 <lbt> add it to tools
12:22:21 <Stskeeps> systemd journal is a new interesting player in this market
12:22:51 <lbt> uh huh
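For reference, a hedged sketch of how netconsole is usually wired up; the IPs, ports and MAC below are placeholders, and the exact setup for the test VMs would still need to be agreed:

    # on the device/VM kernel command line (syntax per Documentation/networking/netconsole.txt):
    #   netconsole=<local-port>@<local-ip>/<dev>,<remote-port>@<remote-ip>/<remote-mac>
    netconsole=6665@10.0.0.2/eth0,6666@10.0.0.1/00:11:22:33:44:55 ignore_loglevel

    # on the receiving host, listen for the UDP stream, e.g. with netcat:
    nc -u -l 6666          # some netcat variants want: nc -u -l -p 6666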
12:25:51 <E-P> ok, I think we can go on
12:26:44 <E-P> #topic Test packaging
12:26:54 <E-P> Let's start with the naming
12:27:15 <E-P> in MeeGo the convention was to use a -tests suffix on the packages
12:27:46 <iekku> i think it would be good to have something similar in Mer too
12:27:55 <E-P> yes, for test binaries/libraries
12:28:26 <Sage_> what does test binaries mean?
12:28:41 <lbt> I don't have a problem with that, but I think we need to take the mapping beyond "just use package names"
12:29:32 <E-P> Sage_: executables that are on the device/host that do the actual testing
12:29:40 <Sage_> e.g. there is a package, connman-test, which has test scripts for testing connman functionality.
12:29:57 <Sage_> should that be also called -tests ?
12:30:07 <E-P> yes
12:30:27 <Sage_> even if those actually have nothing to do with Mer QA, but are something released by upstream?
12:30:54 <E-P> lbt: yes, like we planned some time ago, the mapping is more than just package names
12:31:03 <lbt> Sage_: well, the actual package is created by Mer packaging
12:31:16 <lbt> they provide the tests, we decide how to package/use them
12:31:21 <Sage_> http://pastie.org/4165425
12:31:22 <E-P> Sage_: yes, just having consistent naming for packages that provide tests
12:31:28 <lbt> E-P: just making sure :)
12:31:54 <Sage_> lbt: yes, but I just want to know if -tests is for a certain kind of test files, or for any kind of tests written by anyone.
12:32:03 <lbt> excellent question
12:32:28 <lbt> what if we have upstream tests and our own tests in 2 source trees/packages
12:32:38 <E-P> I would say that a -tests package can have any kind of tests which are approved by the Mer QA team
12:32:52 <lbt> connman-tests-*
12:32:52 <iekku> +1 for E-P
12:33:22 <Sage_> what kind of tests do we have atm?
12:33:37 <Sage_> build-time tests are not packaged, so we can rule those out.
12:33:46 <lbt> (they will be)
12:34:03 <E-P> lbt: that is good question too
12:34:06 <lbt> (we have a goal to split build-time tests out)
12:34:08 <Sage_> lbt: well, in the src package
12:34:34 <Sage_> lbt: we have? Didn't know that.
12:34:38 <E-P> should we have bluez-unit-tests and bluez-dbus-tests?
12:34:53 <lbt> Sage_: yes, it's a goal. The idea is to run the tests on devices, not in build VMs
12:35:04 <E-P> yep
12:35:55 <Sage_> E-P: I would prefer PACKAGENAME-tests-* syntax
12:36:10 <Sage_> but that might be just me.
12:36:12 <E-P> Sage_: we don't have many tests, mainly from MeeGo and upstream
12:36:58 <lbt> and you can have PACKAGENAME-tests[-*], so the -* parts are allowed but not mandatory. PACKAGENAME-tests is made from the package source
12:36:59 <Sage_> we have tests that are run by testrunner, right? These tests require testrunner to be used. Then we have test scripts from upstream that the user can call in a shell but that do not require testrunner or anything.
12:37:33 <Sage_> I don't think we should package everything into one, as that would just mean that when a user wants shell scripts for testing, he would get extra dependencies that he doesn't need.
12:37:33 <lbt> the -* varieties can come from the main src (if you want to split them for some reason)
12:38:50 <E-P> Sage_: test packages do not require testrunner to be used; the idea was that the testrunner's test definition is separate from the test package
12:39:36 <E-P> test package contains scripts/executables for testing some feature or function
12:39:54 <yunta> would be cool if we had a common way to define "entry point" for a test package
12:39:57 * lbt cautions we're spending a lot of time on this ... PACKAGENAME-tests[-*]  seems like a good compromise to support all requirements?
12:40:09 <yunta> so automated systems know how to start the test....
12:40:28 <Stskeeps> yunta: well, /usr/share/tests/testsuitename/tests.xml?
12:40:41 <E-P> yunta: that is handled with test definition then
12:41:01 <yunta> so xml is obligatory in every test package?
12:41:03 <E-P> but it is not required to be in the test package, we can get back to this in mer channel
12:41:09 <lbt> yunta: no
12:41:15 <Sage_> lbt: I agree with PACKAGENAME-tests[-*]; however, should we define some common things for the [-*] part?
12:41:26 <yunta> fine, let's discuss it later
12:41:37 <E-P> good :)
12:41:57 <lbt> Sage_: I suggest we defer that until someone feels the need to make one? Then raise it here?
12:42:05 <Sage_> fair enough
12:42:15 <lbt> then we'll see why they want to do it.. and we'll have had more experience :)
12:42:32 <E-P> yep, good way to do that
12:42:46 <Sage_> so -test packages should be renamed to -tests
12:43:05 <E-P> yes
12:44:20 <E-P> #info test packages must use PACKAGENAME-tests[-*] naming
12:45:46 <E-P> #info [-*] suffixes will be defined later
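As a rough illustration of the agreed naming in spec terms (connman is used purely as an example; the tags and text are placeholders, not a proposed packaging change):

    # in connman.spec -- builds a binary package named connman-tests
    %package tests
    Summary: Test scripts and binaries for %{name}
    Group: Development/Tools
    Requires: %{name} = %{version}-%{release}

    %description tests
    Upstream and Mer QA tests for %{name}, packaged under the
    PACKAGENAME-tests naming convention.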
12:46:07 <Sage_> one question: does QA need to approve all content of -tests?
12:46:44 <Sage_> and are upstream-provided test scripts always considered approved?
12:48:03 <E-P> for the first one, yes. All tests should pass before they are accepted
12:48:32 <E-P> for the latter one, I think we have to check them case by case
12:49:31 <E-P> these are the guidelines at the moment, if we see that we have to change our process then we can change it
12:49:54 <lbt> is there a concept of "expected result: fail" ?
12:50:24 <lbt> and/or skipping a test
12:50:48 <E-P> if using testrunner, you can define an expected result
12:50:57 <E-P> and you can use a filter with testrunner to skip tests
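As a sketch only, from memory of the MeeGo-style test-definition format that testrunner consumes (attribute names should be checked against the actual test-definition DTD), an expected failure might look roughly like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <testdefinition version="1.0">
      <suite name="example-tests">
        <set name="smoke">
          <!-- steps are expected to exit 0 unless told otherwise -->
          <case name="daemon-starts" timeout="60">
            <step>pgrep connmand</step>
          </case>
          <!-- known failure: expect a non-zero exit instead of failing the run -->
          <case name="known-broken">
            <step expected_result="1">/opt/tests/example-tests/broken-case.sh</step>
          </case>
        </set>
      </suite>
    </testdefinition>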
12:51:07 <iekku> E-P, one thing... how about tests finding an actual fault in the code?
12:51:46 * iekku can't figure out why all tests need to pass to get approved
12:52:42 <E-P> iekku: we had many issues with failing test cases in Maemo; they never ended up passing
12:52:43 <Sage_> E-P: ok, then what if QA doesn't approve upstream test scripts that would still be useful? Where would those go? ;)
12:53:01 <Sage_> no need to answer, just a thought ;)
12:53:15 <lbt> Sage_: that's why I think they should still be in the package
12:53:25 <E-P> Sage_: heh, -tests-notapprove ;)
12:53:38 * Sage_ :headdesk:
12:53:54 <lbt> but the test-suite (ie T-R definition) should not run them or rely on them
12:54:09 <E-P> I think we have to discuss this in more detail
12:54:33 <iekku> E-P, and in MeeGo also, i think
12:54:33 <lbt> E-P: I think that we kinda permit anything into -tests packages
12:54:52 <lbt> but we manage carefully how we treat the results
12:55:14 <lbt> so if a new test appears, TR should ignore it until it's "approved"
12:55:26 <E-P> and what goes into release and package testing
12:55:40 <E-P> lbt: yes, good idea
12:56:00 <lbt> this lets Sage_ put anything into -tests
12:56:12 <lbt> it also lets QA decide what they rely on
12:56:13 <Sage_> example:
12:56:29 <E-P> yes with the test mapping
12:56:40 <Sage_> http://review.merproject.org/#change,624 <- might have additional test scripts, how would QA notice? :)
12:57:02 <Sage_> maybe better view: http://review.merproject.org/#patch,sidebyside,624,1,connman.spec
12:57:41 <Sage_> %{_libdir}/%{name}/test/*
12:57:49 <lbt> Sage_: I know it's hard.... but you may have to have the developer talk to QA... :D
12:58:40 <E-P> whoever is updating the package should inform QA if there are new tests
12:58:48 <Sage_> E-P: that won't work.
12:59:09 <lbt> E-P: *nod* ... ideally with a patch to the T-R definition
12:59:25 <lbt> Sage_: why?
12:59:32 <E-P> Sage_: any better solution?
13:00:02 <Sage_> approve all upstream test scripts
13:00:41 <lbt> that's not quite what was said
13:00:43 <Sage_> lbt: because that would actually mean that the developer would need to test all those scripts, then ask QA, and only after that make the request
13:00:58 <lbt> I didn't read it like that
13:01:28 <lbt> Sage_: when someone reviews 624 and sees extra tests, they should check whether they're now part of the testplan
13:01:57 <lbt> we all agree they should be?
13:02:02 <Sage_> lbt: the point is that you can't see from the review if there are extra tests without checking the tarball and comparing it to the old one.
13:02:24 <Sage_> in my opinion upstream tests are not part of testplan
13:02:25 <lbt> Sage_: yeah, unpleasant technicality though?
13:02:51 <lbt> upstream tests aren't part of the testplan? Why not?
13:02:56 <E-P> by default the upstream tests are not part of the test plan
13:03:08 <E-P> after the review we can add them
13:03:12 <lbt> ah
13:03:15 <E-P> test case review
13:03:17 <Stskeeps> :nod:
13:03:34 <iekku> sounds good
13:03:38 <E-P> that really depends on how the upstream tests are made
13:03:42 <lbt> OK... upstream tests should be part of the testplan
13:03:48 <Stskeeps> if possible.
13:03:51 <E-P> yes
13:03:53 <Stskeeps> it is not always.
13:03:59 <lbt> so, let me continue
13:04:15 <Sage_> http://pastie.org/4165567 <- include that ;)
13:04:20 <lbt> we all agree they *should* be ... subject to real world
13:04:25 <lbt> sec Sage_
13:04:52 <Sage_> :P
13:05:02 <lbt> so when the 1.1 -> 1.2 packaging is done, it would be nice to also send a patch to the testplan
13:05:05 <lbt> however
13:05:14 <lbt> a couple of tests may fail or not be runnable
13:05:25 <lbt> so they're marked as 'expected fail' or 'skip'
13:05:59 * Sage_ thinks we aim for the same thing but are talking about totally different perspective
13:06:02 <lbt> the reviewer may either reject until that submit is done, or log a bug that it needs doing?
13:07:01 <lbt> Sage_: what I'm saying is "package all new tests into -tests package and tell QA they're there"
13:07:19 <lbt> then QA decide how to change the test plan, if at all
13:07:27 <E-P> yes
13:07:30 <lbt> a polite developer may submit a testplan patch
13:07:33 <Sage_> ok, well why didn't you say so in the first place :P
13:07:37 <lbt> I tried
13:07:38 <E-P> :D
13:07:51 <Sage_> so -tests can have test scripts that aren't part of the plan
13:07:55 <lbt> yes
13:07:57 <E-P> yes
13:08:03 * Sage_ is fine
13:08:16 <E-P> the actual test plan is separate from the test package
13:08:29 <Sage_> where is that btw?
13:08:52 <lbt> *mumble*
13:09:11 * Sage_ is not sure about all the terms.
13:09:17 <E-P> Sage_: that is still under development :)
13:09:25 <Sage_> ah.
13:09:36 <E-P> we had a meeting about that looong time ago
13:09:51 <E-P> https://wiki.merproject.org/wiki/Quality/Test_processes
13:09:52 * Sage_ pours a bucket of water on top of lbt to keep his brain from burning.
13:10:05 <lbt> way too late :)
13:10:06 <E-P> anyway, we are now happy with the naming?
13:10:24 <lbt> seems so - feels much clearer now
13:10:25 * Sage_ thought we agreed on that already
13:10:29 <E-P> #info a test package can contain not-yet-approved tests
13:11:05 * Sage_ is confused by the terms "-tests" and "test package"
13:11:19 <E-P> #info meaning failing or not-yet-ready test cases
13:11:36 <E-P> Sage_: they are the same, but not the same as in MeeGo :)
13:11:50 <E-P> did I confuse you a bit more?
13:12:07 <Sage_> or test scripts that are provided by upstream and are not part of test cases.
13:12:41 <lbt> Sage_: test cases = test plan == which of the test scripts provided by upstream we care about
13:12:52 <lbt> and also our own tests
13:12:53 <E-P> those should be in the -tests package; even if they are not test cases now, we can use them as test cases later
13:13:02 <lbt> 100%
13:13:27 <E-P> I have to go soon, so let's continue
13:13:35 <Sage_> let's continue and talk about this more when someone rejects my submits on the matter ;)
13:13:52 <E-P> any wishes for the locations of the test executables and test data?
13:13:54 <lbt> yep
13:14:08 <E-P> Sage_: you mentioned a long time ago that we should use /opt/tests/<packagename>
13:14:22 <E-P> (or something like that)
13:14:38 <Sage_> can't recall exactly, but I think that was the one I preferred at the time at least.
13:15:09 <E-P> by location I mean where the -tests package installs its executable files
13:15:28 * Sage_ *smiles*
13:15:48 <Sage_> what about the upstream stuff? Should we keep those where upstream wants them, or move them? ;)
13:16:04 <Sage_> e.g. /usr/lib/connman/test/backtrace
13:16:41 <lbt> I'd use SHOULD rather than MUST :)
13:16:47 <E-P> I would say that not moving
13:17:02 <E-P> yes, should
13:18:00 <iekku> :)
13:18:08 <E-P> how about /usr/share/testdata/ for common test data?
13:18:59 <E-P> just as a guideline
13:20:03 <Sage_> /opt/tests/<packagename>/testdata/ ?
13:20:17 <E-P> and for a test package's own test-definition, /usr/share/<packagename>/
13:20:45 <Sage_> /opt/tests/<packagename>/test-definitions/ ?
13:21:18 * Sage_ is pondering why not put everything related to testing under the same dir?
13:21:43 <E-P> if a test package has test data that only the package itself uses, it should be under /opt/
13:21:58 <E-P> I meant more like common test data that many test packages are using
13:22:14 <Sage_> common test data?
13:22:21 <E-P> or should that be also in the /opt/tests/<test data package name>/
13:22:25 <lbt> video
13:22:34 <lbt> E-P: yes
13:22:35 <Sage_> ah
13:22:43 <Sage_> I would say similar to everyting else
13:22:51 <E-P> ok, fine for me
13:22:52 <aard_> E-P: the problem with /usr/share is that you can't put arch-dependent stuff there, and imo it does not belong in lib either -- putting everything in /opt/tests is not a good solution, but imo it's the best one
13:23:13 <Sage_> aard_: +1
13:23:28 <E-P> aard_: thanks for the clarification
13:23:35 <E-P> fine for me
13:24:21 <iekku> +1
13:24:32 <aard_> (an additional benefit of /opt is that we can easily do package checks: a test package with something not in /opt? reject. a non-test package with something in /opt? reject.)
13:24:56 <E-P> #info test executables should be installed to /opt/tests/<packagename>/
13:25:29 <E-P> #info common test data should be installed to /opt/tests/<test data package name>/{audio,video,image,etc}
13:25:48 <E-P> #info a test package's test-definition should be installed to /opt/tests/<packagename>/test-definition/
13:25:52 <Sage_> sounds like a macro to me %{_testdir} = /opt/tests/%{name}/
13:26:12 <Sage_> *testsdir
13:26:37 <E-P> not a bad idea, can you file a task bug about that?
13:27:15 <Sage_> sure; not sure though if %{name} can be used like that in a macro, but then it would be just /opt/tests/
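For illustration only, the macro idea could end up looking something like this; the macro name, file locations and spec lines are hypothetical until the task bug is handled, and whether %{name} expands as hoped inside the macro would need to be verified:

    # e.g. in an rpm macros file shipped by a Mer QA/tools package
    %_testsdir /opt/tests/%{name}

    # ...and used in a package's spec file
    %install
    install -D -m 755 tests/check-boot.sh %{buildroot}%{_testsdir}/check-boot.sh
    install -D -m 644 tests.xml %{buildroot}%{_testsdir}/test-definition/tests.xml

    %files tests
    %{_testsdir}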
13:27:33 <E-P> like I said, we can change everything later if we see that something is not working well
13:27:49 <E-P> anything to add?
13:28:41 <E-P> if not, thanks all
13:29:17 <iekku> thanks guys
13:29:24 <E-P> #endmeeting