couchdb-user mailing list archives

From Chris Anderson <>
Subject Re: Proposal for digital signatures of documents
Date Mon, 09 Mar 2009 07:14:34 GMT
On Sun, Mar 8, 2009 at 8:42 PM, Chris Anderson <> wrote:
> On Sun, Mar 8, 2009 at 6:35 PM, Noah Slater <> wrote:
>> On Mon, Mar 09, 2009 at 12:56:04PM +1030, Antony Blakey wrote:
>>> On 09/03/2009, at 12:42 PM, Noah Slater wrote:
>>>> The choice of mailing list has no bearing on good practice.
>>> This is a list to discuss uses of CouchDB, rather than the core or
>>> direction of the product, which is dev@. If this were dev@ then the
>>> discussion would be in the context of official CouchDB support and/or
>>> incorporation. As this is user@, what is an RFC a prerequisite for
>>> (apart from your interest)?
>> I am speaking as a user of CouchDB, not a developer. If the community wants to
>> interest me in a canonicalisation of JSON then I want to see that the plan is to
>> push this through the IETF as an RFC. Otherwise, what's the point?
> As an application developer (not a database developer) I'm a bit of a
> loose cannon. As such I'd probably implement a halfway decent JSON
> canonicalization strategy (recursive key sort) that works for plain
> ascii. I'd get the edge cases wrong and not notice. I think this is
> probably the proper way to proceed.
> Inevitably, someone would come along and complain about the edge
> cases, and I would ignore them. After a while, either someone would
> write a freaking patch or I'd get fed up with complaints and ensure
> that all non 7bit chars were encoded with their \uXXXX equivalents.
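That halfway-decent strategy (recursive key sort, whitespace stripped, non-ASCII escaped as \uXXXX) can be sketched in a few lines of Python; this is an illustration of the idea, not anything CouchDB ships:

```python
import json

def canonical_json(value):
    """Serialize a JSON-compatible value deterministically:
    recursively sorted object keys, no insignificant whitespace,
    and every non-ASCII character escaped as \\uXXXX."""
    # sort_keys gives the recursive key sort, ensure_ascii does the
    # \uXXXX escaping, and separators strips optional whitespace.
    return json.dumps(value, sort_keys=True, ensure_ascii=True,
                      separators=(",", ":"))

# The same document with keys in different orders yields identical bytes:
a = canonical_json({"b": 1, "a": {"y": "é", "x": 2}})
b = canonical_json({"a": {"x": 2, "y": "é"}, "b": 1})
assert a == b
```

As the thread notes, this gets plain ASCII and key ordering right while punting on the harder edge cases (notably numbers, below).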

Reading back over Jens' proposal I see that he's addressed most of
these points - the sticky one I see is floating point numbers. I think
the most robust option for canonicalization for signing is to avoid
numbers altogether, encoding them as strings, so that intermediaries
don't lose precision.
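Avoiding JSON numbers entirely would look something like the following sketch, where every number is replaced by its string form before signing (the helper name is hypothetical, purely for illustration):

```python
def stringify_numbers(value):
    """Recursively replace every number with its string form so an
    intermediary's float parsing can't silently lose precision.
    (Illustrative helper, not part of any CouchDB API.)"""
    if isinstance(value, bool):  # bool is a subclass of int in Python
        return value
    if isinstance(value, (int, float)):
        return repr(value)
    if isinstance(value, list):
        return [stringify_numbers(v) for v in value]
    if isinstance(value, dict):
        return {k: stringify_numbers(v) for k, v in value.items()}
    return value

doc = {"price": 3.3333333333, "count": 12}
safe = stringify_numbers(doc)
# safe == {"price": "3.3333333333", "count": "12"}
```

The strings pass through any conforming JSON implementation byte-for-byte, at the cost of pushing number parsing onto the application.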


    To take
    just one minor point, RFC 4627 makes no promises
    whatsoever about the range of numbers that can be
    passed through JSON.  There isn't even any minimum range.  It
    seems as though a JSON implementation could reject all numbers
    other than 0 as too large and still conform!  This is stupid.
    We can PROBABLY rely on IEEE doubles; we almost certainly cannot
    expect to get large integers through JSON.

The requirement not to use JSON numbers might be too stringent. The
other option is to have the signature state which form of number it's
working with. Basically this would be a function from each number to a
canonical string representation.
E.g.: all numbers in this signature were converted to strings using
the printf format "%4.2f" or somesuch.

This would allow signers to specify the precision that must be
maintained for them to consider the document representative of what
they chose to sign. So you might end up signing a document that would
be valid if transport lost some precision, but invalid if it lost too
much precision.
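A sketch of that scheme, assuming the signer declares a printf format that both sides apply to floats before hashing (the function name and the use of SHA-256 are my assumptions, not anything specified in the thread):

```python
import hashlib
import json

def digest_with_format(doc, fmt="%4.2f"):
    """Hash a document after normalizing every float with the printf
    format the signer declared (hypothetical scheme from the thread)."""
    def norm(v):
        if isinstance(v, float):
            return fmt % v
        if isinstance(v, list):
            return [norm(x) for x in v]
        if isinstance(v, dict):
            return {k: norm(x) for k, x in v.items()}
        return v
    canon = json.dumps(norm(doc), sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canon.encode("utf-8")).hexdigest()

# Losing some precision in transit doesn't change the digest, because
# both sides round to the declared format before hashing:
sent = {"ratio": 3.3333333333}
received = {"ratio": 3.333}   # precision lost by an intermediary
assert digest_with_format(sent) == digest_with_format(received)
```

Losing more precision than the declared format tolerates would change the normalized bytes and invalidate the signature, which is exactly the trade-off described above.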

As long as it rounds to "3.33" I'm good. If the JSON makes it all the
way to the other end and still contains 3.3333333333, that may be
better, but it doesn't affect the signature.


Chris Anderson
