couchdb-dev mailing list archives

From Paul Davis <paul.joseph.da...@gmail.com>
Subject Re: [RFC] On the Testing of CouchDB
Date Fri, 15 Dec 2017 18:13:27 GMT
I went ahead and added a `make elixir` command to the elixir-suite branch.

Of note, my earlier instructions that referenced the elixir_suite
directory are now slightly different, as I moved things to test/elixir
because Russell was being slow.

Current rundown is now:

$ # Get Elixir installed as per previous
$ # Build CouchDB as per previous
$ make elixir

The `make elixir` target is currently not integrated with the build
dependencies (i.e., you need to run `make` on your own first) and doesn't
yet have fancier things like running a single test. I also haven't made
it part of `make check` itself.

On Fri, Dec 15, 2017 at 11:45 AM, Paul Davis
<paul.joseph.davis@gmail.com> wrote:
> For `make check` it should be fairly straightforward to map the
> current approach to it. I could probably knock that out fairly quickly
> if you want me to give it a whirl.
>
> On Fri, Dec 15, 2017 at 11:42 AM, Russell Branca <chewbranca@apache.org> wrote:
>> Yeah, just to reiterate what Paul said: the Elixir dev experience is really
>> nice and easy to get rolling with. I had no prior hands-on experience with
>> Elixir and was still up and running in a few hours.
>>
>> RE Ben's question about diving in: please do! Just grab one of the unported
>> JS suites and go to town. I've just been cherry-picking things out of
>> Paul's branch and we can continue to do the same until we get this more
>> locked down. My goal with the porting is to keep chugging along and just
>> get it knocked out, as I really don't think it will be overly onerous to do
>> so. If anyone else wants to jump in, there are still a fair number of
>> tests to port; just take your pick.
>>
>> One other thing that needs work is figuring out how to hook all this into
>> `make check` and whatnot. I've mostly ignored that, as the suite just points
>> at a CouchDB instance and can be run directly, but we'll need to sort that
>> out at some point.
>>
>>
>> -Russell
>>
>> On Fri, Dec 15, 2017 at 9:03 AM Paul Davis <paul.joseph.davis@gmail.com>
>> wrote:
>>
>>> Hello everybody!
>>>
>>> I figured I should probably go ahead and chime in, seeing as I've also
>>> been playing around with porting some of the tests in my free time between
>>> ops shifts over the last couple of weeks.
>>>
>>> My first impression was that it was ridiculously easy to get involved.
>>> On OS X at least, `brew install elixir` was enough to get a working
>>> Elixir installed (however, if you use kerl or erln8 you'll have to
>>> build an Erlang 20.x VM to use the brew package). I went from not
>>> having Elixir installed to a full port of uuids.js, with the config tag
>>> logic, written in about two hours one night. So far the Elixir docs
>>> seem very well written and put together. I'd say the worst part of
>>> Elixir so far is that, knowing Erlang, I find myself searching for "How
>>> do I do this Erlang thing in Elixir?", which isn't as bad as it sounds.
>>> The Elixir libraries have certainly had a considerable amount of
>>> thought put into them to make them easy to use and remember. I find it
>>> to be a lot like my experience learning Python, in that I may have to
>>> Google something once and then it's muscle memory, as opposed to
>>> Erlang's standard library, where I'm constantly re-reading the lists
>>> manpage to remember argument orderings and whether I want the search
>>> or find variants, etc.
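>>>
>>> To make that concrete, here's a toy snippet (mine, not from the suite)
>>> of the sort of thing I mean; the Erlang stdlib is still right there
>>> from Elixir, but the Elixir versions are easier to remember:
>>>
>>> ```elixir
>>> # Looking up a key in a keyword list (a list of 2-tuples).
>>> pairs = [a: 1, b: 2]
>>>
>>> # Erlang style: which argument is the tuple index again?
>>> {:a, 1} = :lists.keyfind(:a, 1, pairs)
>>>
>>> # Elixir style: reads the way you'd guess.
>>> 1 = Keyword.get(pairs, :a)
>>> 2 = Keyword.fetch!(pairs, :b)
>>> ```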
>>>
>>> Which I guess is a long way of saying I'm rather liking the Elixir
>>> development experience so far.
>>>
>>> That said, I'm currently about halfway through porting the replication.js
>>> tests to Elixir. For the most part it's fairly straightforward. My
>>> current approach, as with the other modules, is to do a direct
>>> port. Once that's finished we'll want to break up that huge module
>>> into a series of modules that share a lot of the utility functions.
>>> One of the nice things about moving to Elixir is that it's got a
>>> full-on development story, rather than our current couchjs approach,
>>> which prevents sharing code easily between subsets of tests.
>>>
>>> For Ben's question on diving in, I'd do just that. I'd say leave a
>>> note here about which module(s) you're going to port so that we're
>>> not duplicating efforts, and then it's basically just a matter of
>>> getting Elixir installed. For that, here's a quick rundown on how I
>>> got things working:
>>>
>>> $ brew update
>>> $ brew install elixir
>>> $ # wait for all the things...
>>> $ iex # which fails because I have an Erlang VM older than 20.0 as a default
>>> $ erln8 --fetch
>>> $ erln8 --build --tag=OTP-20.1.6 --id=20.1.6
>>> $ # wait while erln8 does its thing
>>> $ git clone https://github.com/apache/couchdb
>>> $ cd couchdb
>>> $ ./configure --disable-docs --disable-fauxton --with-curl
>>> $ make
>>> $ git checkout -b elixir-suite-davisp origin/elixir-suite # but use your own name
>>> $ cd elixir_suite # Russell promises to move this to test/elixir eventually... :)
>>> $ mix deps.get
>>> $ # For the moment, in another terminal, run ./dev/run -a adm:pass
>>> $ mix test --trace
>>> $ # For development you can also run a single test file:
>>> $ mix test --trace test/module_i_am_working_on.exs
>>>
>>> For the time being, if you do any porting work, just let Russell
>>> know and he can pull the changes into the main elixir-suite
>>> branch. For the initial work it might get a bit messy, but we can
>>> always clean up after the fact if we decide this is a direction we'd
>>> like to go for real. To that end, I'd also make sure we do a
>>> single .js -> .exs port per commit to make any future cleanup
>>> work easier.
>>>
>>> Also, even if people don't feel like doing any actual porting work, I'd
>>> still be interested in hearing what it's like to run through their
>>> platform's equivalent of the above steps, and even just initial
>>> impressions from toying around with Elixir. My only experience
>>> with Elixir prior to this was reading through the quick
>>> start/tutorial pages a couple of times to get a feel for the syntax;
>>> I hadn't actually typed any Elixir into an editor till last week.
>>>
>>> And that's all I've got for now.
>>>
>>> On Thu, Dec 14, 2017 at 11:57 PM, Benjamin Anderson
>>> <banjiewen@apache.org> wrote:
>>> > Slick! This seems like it's coming together really nicely. Can't argue
>>> > with commits like "Prefer ?w=3 over hacky sleeps"[1] in any case.
>>> >
>>> >> I hope others have similar opinions after diving in!
>>> >
>>> > How should one dive in? Are you looking for others to help out with
>>> > the ports, or just thinking aspirationally about future regular
>>> > contributions to the test suite?
>>> >
>>> > --
>>> > b
>>> >
>>> > [1]: https://github.com/apache/couchdb/commit/5bce2d98a298c25b77d8dcda19deeedb494cc289
>>> >
>>> > On Thu, Dec 14, 2017 at 5:03 PM, Russell Branca <chewbranca@apache.org> wrote:
>>> >> Howdy folks!
>>> >>
>>> >> The testing of CouchDB is something that has seen focus and improvements
>>> >> over the last several years, for instance migrating the etap suite to
>>> >> eunit and updating the JS suite to run against clusters in 2.x. There are
>>> >> still improvements to be made, and that was one of the topics of the
>>> >> CouchDB dev summit earlier in the year [1].
>>> >>
>>> >> Before we go further, I want to clarify some nomenclature. I'm by no
>>> >> means going to try and define unit testing vs integration testing vs
>>> >> quantum phase shift testing; instead I want to focus on the distinction
>>> >> of where the testing takes place. Fundamentally, we have two places we
>>> >> test CouchDB: 1) at the Erlang VM level, where we make assertions against
>>> >> module functions or process states; and 2) at the HTTP level, where we
>>> >> test the behavior of CouchDB at the user-level API. This post focuses
>>> >> entirely on the latter; that's not to say the former doesn't also merit
>>> >> attention, just that the two are different enough that we can focus on
>>> >> them in isolation.
>>> >>
>>> >> So with that, let's chat about the current HTTP test suite in CouchDB.
>>> >> This is the "JS suite" I referred to above, a custom-built test suite
>>> >> written in JavaScript and executed in the aging SpiderMonkey. The JS
>>> >> suite has put in work for years, but it's showing its age and is a bit
>>> >> awkward to work with and improve. However, I think the biggest issue
>>> >> with the JS suite is that it's utilized far less than it should be, and
>>> >> folks seem to avoid extending it or adding additional tests to it.
>>> >> There's been discussion for years about replacing said suite, but the
>>> >> discussions invariably got blocked on the bike shed of whether to
>>> >> rewrite the suite in JavaScript or Python. This thread provides a third
>>> >> option, with code!
>>> >>
>>> >> I started hacking on a replacement for the JS suite, this time written
>>> >> in Elixir. Overall I'm quite impressed with how it's come along, and I
>>> >> have some good examples to show. This is basically an Elixir app with an
>>> >> HTTP client that runs a series of tests against the CouchDB HTTP API and
>>> >> makes assertions about the responses.
>>> >>
>>> >> You can find the current code in [2], and a comparison of the changes in
>>> >> [3]. The core HTTP client is only a handful of lines of code and works
>>> >> quite well [4]. The utility functions used across all tests are located
>>> >> in [5], and the tests themselves are in [6]. The existing test modules
>>> >> have a 1:1 correspondence with the associated JS suite test modules, and
>>> >> in general are as direct a port as possible.
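>>> >>
>>> >> To give a flavor of what "a handful of lines" means, here is a rough
>>> >> sketch of that style of thin wrapper. It is illustrative only (see [4]
>>> >> for the real client) and assumes HTTPoison and Poison as the HTTP and
>>> >> JSON libraries, which may not match what the branch actually uses:
>>> >>
>>> >> ```elixir
>>> >> # Illustrative sketch only; see [4] for the real module. Assumes the
>>> >> # httpoison and poison deps, plus a dev node from `./dev/run -a adm:pass`.
>>> >> defmodule CouchSketch do
>>> >>   @base_url "http://127.0.0.1:15984"
>>> >>   @auth [hackney: [basic_auth: {"adm", "pass"}]]
>>> >>
>>> >>   def get(path, opts \\ []), do: request(:get, path, opts)
>>> >>   def post(path, opts \\ []), do: request(:post, path, opts)
>>> >>   def put(path, opts \\ []), do: request(:put, path, opts)
>>> >>   def delete(path, opts \\ []), do: request(:delete, path, opts)
>>> >>
>>> >>   defp request(method, path, opts) do
>>> >>     body = opts |> Keyword.get(:body, "") |> encode_body()
>>> >>     params = opts |> Keyword.get(:query, %{}) |> Enum.into([])
>>> >>     headers = [{"Content-Type", "application/json"}]
>>> >>
>>> >>     resp =
>>> >>       HTTPoison.request!(method, @base_url <> path, body, headers,
>>> >>                          @auth ++ [params: params])
>>> >>
>>> >>     # Decode JSON so tests can write resp.body["id"], resp.body["rev"], etc.
>>> >>     %{resp | body: decode_body(resp.body)}
>>> >>   end
>>> >>
>>> >>   defp encode_body(""), do: ""
>>> >>   defp encode_body(body), do: Poison.encode!(body)
>>> >>
>>> >>   defp decode_body(body) do
>>> >>     case Poison.decode(body) do
>>> >>       {:ok, decoded} -> decoded
>>> >>       {:error, _} -> body
>>> >>     end
>>> >>   end
>>> >> end
>>> >> ```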
>>> >>
>>> >> The test modules ported in their entirety or most of the way are:
>>> >>
>>> >>   * all_docs.js
>>> >>   * basics.js
>>> >>   * config.js
>>> >>   * reduce.js
>>> >>   * rewrite.js
>>> >>   * uuids.js
>>> >>   * view_collation.js
>>> >>
>>> >> Paul has dived in and is responsible for a few of those test modules,
>>> >> and he's almost completed porting the replication.js suite as well. We
>>> >> started with the hard ones first, so for the most part the rest of the
>>> >> ports should be fairly smooth sailing.
>>> >>
>>> >> Here's an example of a very basic test:
>>> >>
>>> >> ```elixir
>>> >> defmodule WelcomeTest do
>>> >>   use CouchTestCase
>>> >>
>>> >>   test "Welcome endpoint" do
>>> >>     assert Couch.get("/").body["couchdb"] == "Welcome", "Should say welcome"
>>> >>   end
>>> >>
>>> >> end
>>> >>
>>> >> ```
>>> >>
>>> >>
>>> >> As you can see, the `Couch` client is very simple HTTP client with
>>> >> easy HTTP verb based methods. Let's look at a more complicated test
>>> >> for asserting we can create documents in a database:
>>> >>
>>> >>
>>> >> ```elixir
>>> >>
>>> >>   @tag :with_db
>>> >>   test "Create a document and save it to the database", context do
>>> >>     resp = Couch.post("/#{context[:db_name]}",
>>> >>                       [body: %{:_id => "0", :a => 1, :b => 1}])
>>> >>     assert resp.status_code == 201, "Should be 201 created"
>>> >>     assert resp.body["id"], "Id should be present"
>>> >>     assert resp.body["rev"], "Rev should be present"
>>> >>
>>> >>     resp2 = Couch.get("/#{context[:db_name]}/#{resp.body["id"]}")
>>> >>     assert resp2.body["_id"] == resp.body["id"], "Ids should match"
>>> >>     assert resp2.body["_rev"] == resp.body["rev"], "Revs should match"
>>> >>   end
>>> >>
>>> >> ```
>>> >>
>>> >>
>>> >> This is fairly straightforward code to POST a new doc, make assertions
>>> >> on the response, and then fetch the doc to make sure everything
>>> >> matches up. What I really wanted to highlight here is the `@tag
>>> >> :with_db` decorator. We can easily add custom "tags" to the tests to
>>> >> simplify setup and teardown. That `:with_db` tag does two things: it
>>> >> dynamically generates a random database name, and then takes care of
>>> >> setup/teardown for creating and deleting said database for that
>>> >> particular test. This is really useful and has been very nice to work
>>> >> with so far. We also have tag functionality in place for executing a
>>> >> test with a particular set of config options:
>>> >>
>>> >>
>>> >> ```elixir
>>> >>
>>> >>   @tag config: [
>>> >>     {"uuids", "algorithm", "utc_random"}
>>> >>   ]
>>> >>   test "utc_random uuids are roughly random" do
>>> >>     resp = Couch.get("/_uuids", query: %{:count => 1000})
>>> >>     assert resp.status_code == 200
>>> >>     uuids = resp.body["uuids"]
>>> >>
>>> >>     assert String.length(Enum.at(uuids, 1)) == 32
>>> >>
>>> >>     # Assert no collisions
>>> >>     assert length(Enum.uniq(uuids)) == length(uuids)
>>> >>
>>> >>     # Assert rough ordering of UUIDs
>>> >>     u1 = String.slice(Enum.at(uuids, 1), 0..13)
>>> >>     u2 = String.slice(Enum.at(uuids, -1), 0..13)
>>> >>     assert u1 < u2
>>> >>   end
>>> >> ```
>>> >>
>>> >>
>>> >> The tag system really simplifies a lot of the standard auxiliary
>>> >> actions needed to conduct tests.
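>>> >>
>>> >> For anyone curious how a tag like that can be implemented, here is a
>>> >> rough sketch of the idea using a plain ExUnit case template. The real
>>> >> wiring lives in the shared helpers in [5] and certainly differs in the
>>> >> details:
>>> >>
>>> >> ```elixir
>>> >> # Sketch of the :with_db idea, not the actual CouchTestCase from [5].
>>> >> defmodule WithDbCaseSketch do
>>> >>   use ExUnit.CaseTemplate
>>> >>
>>> >>   setup context do
>>> >>     if context[:with_db] do
>>> >>       # Random database name per test, created before and deleted after.
>>> >>       db_name = "test-db-#{System.unique_integer([:positive])}"
>>> >>       Couch.put("/#{db_name}")
>>> >>       on_exit(fn -> Couch.delete("/#{db_name}") end)
>>> >>       # Merged into the test context, hence context[:db_name] above.
>>> >>       {:ok, db_name: db_name}
>>> >>     else
>>> >>       :ok
>>> >>     end
>>> >>   end
>>> >> end
>>> >> ```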
>>> >>
>>> >>
>>> >> To test out the suite, you'll need to spin up the dev server in one
>>> >> window with:
>>> >>
>>> >>
>>> >> ```
>>> >>
>>> >> ./dev/run --admin=adm:pass
>>> >>
>>> >> ```
>>> >>
>>> >>
>>> >> and then in another window go into the relevant CouchDB src directory
>>> >> and run:
>>> >>
>>> >>
>>> >> ```
>>> >>
>>> >> cd ~/src/couchdb/elixir_suite/
>>> >>
>>> >> mix deps.get
>>> >>
>>> >> mix test --trace
>>> >>
>>> >> ```
>>> >>
>>> >>
>>> >> The `--trace` flag gives the nice line-item output per test, which I
>>> >> greatly prefer over a slew of periods. You can run an individual test
>>> >> file with `mix test --trace test/basics_test.exs`. I've pasted the
>>> >> output from running the basics suite at the bottom of this email so
>>> >> you can see what the real output looks like.
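>>> >>
>>> >> (The "Excluding tags: [pending: true]" line in that output comes from
>>> >> the test helper telling ExUnit to skip anything tagged :pending;
>>> >> roughly something like this, though see [5] for the actual helper:)
>>> >>
>>> >> ```elixir
>>> >> # Rough guess at the relevant bit of the test helper; see [5].
>>> >> ExUnit.configure(exclude: [pending: true])
>>> >> ExUnit.start()
>>> >> ```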
>>> >>
>>> >>
>>> >> Overall I'm quite impressed with the toolkit we've been able to put
>>> >> together in a short amount of time, and I propose we migrate fully to
>>> >> this test suite by porting all remaining JS suite tests and then
>>> >> removing the JS suite entirely. Given we've already ported most of the
>>> >> "hard suites", I think a full port is reasonable to do and just
>>> >> requires some leg work. Again, I'm impressed with how simple the
>>> >> tooling here is and how quickly we've been able to run with things; it
>>> >> turns out the Elixir dev experience is actually quite nice! I hope
>>> >> others have similar opinions after diving in! Let me know what you
>>> >> think.
>>> >>
>>> >>
>>> >>
>>> >> -Russell
>>> >>
>>> >>
>>> >>
>>> >> [1] https://github.com/janl/couchdb-next/issues/39
>>> >> [2] https://github.com/apache/couchdb/tree/elixir-suite
>>> >> [3] https://github.com/apache/couchdb/compare/elixir-suite
>>> >> [4] https://github.com/apache/couchdb/blob/elixir-suite/elixir_suite/lib/couch.ex
>>> >> [5] https://github.com/apache/couchdb/blob/elixir-suite/elixir_suite/test/test_helper.exs
>>> >> [6] https://github.com/apache/couchdb/tree/elixir-suite/elixir_suite/test
>>> >>
>>> >>
>>> >> vagrant@contrib-jessie:~/src/couchdb/elixir_suite$ mix test --trace test/basics_test.exs
>>> >> Excluding tags: [pending: true]
>>> >>
>>> >> BasicsTest
>>> >>   * test Session contains adm context (66.8ms)
>>> >>   * test Creating a new DB with slashes should return Location header
>>> >> (COUCHDB-411) (85.8ms)
>>> >>   * test oops, the doc id got lost in code nirwana (82.1ms)
>>> >>   * test Welcome endpoint (7.6ms)
>>> >>   * test POST doc with an _id field isn't overwritten by uuid (102.7ms)
>>> >>   * test On restart, a request for creating an already existing db can
>>> >> not override (skipped)
>>> >>   * test Creating a new DB should return location header (118.7ms)
>>> >>   * test _bulk_docs POST error when body not an object (95.0ms)
>>> >>   * test Empty database should have zero docs (161.0ms)
>>> >>   * test _all_docs POST error when multi-get is not a {'key': [...]}
>>> >> structure (104.3ms)
>>> >>   * test Regression test for COUCHDB-954 (skipped)
>>> >>   * test DELETE'ing a non-existent doc should 404 (100.0ms)
>>> >>   * test Revs info status is good (127.3ms)
>>> >>   * test PUT on existing DB should return 412 instead of 500 (97.6ms)
>>> >>   * test Database should be in _all_dbs (117.7ms)
>>> >>   * test Check for invalid document members (122.4ms)
>>> >>   * test Can create several documents (213.0ms)
>>> >>   * test Make sure you can do a seq=true option (99.1ms)
>>> >>   * test PUT doc has a Location header (skipped)
>>> >>   * test Create a document and save it to the database (116.3ms)
>>> >>   * test Created database has appropriate db info name (99.7ms)
>>> >>   * test PUT error when body not an object (89.5ms)
>>> >>   * test Simple map functions (473.0ms)
>>> >>   * test POST doc response has a Location header (117.1ms)
>>> >>
>>> >> CouchTestCase
>>> >>
>>> >>
>>> >> Finished in 3.3 seconds
>>> >> 24 tests, 0 failures, 3 skipped
>>> >>
>>> >> Randomized with seed 936284
>>>
