[rbldnsd] Re: rbldnsd 0.99 TTL issue
Mon, 6 Oct 2003 20:32:22 -0400 (EDT)
On Tue, 7 Oct 2003, Michael Tokarev wrote:
> Ok. It seems the attached patch will do the work "automagically".
> It keeps all RRs of the same type with the same TTL - the smallest
> one, but it does nothing if we have the same data with different
> TTLs - the first TTL "wins".
This is good.
> >>Note that TTL is a property of a dataset, and *last* value will be
> >>used. (For $SOA, *first* value will be used, and for $NS, *all*
> >>values will be used, so here's quite some inconsistency). Again,
> >>thoughts/suggestions for this problem are welcome... ;)
> > This makes sense; I don't recall running into this in the docs (which I
> > have read only in part thus far), but it is the only sensible default.
> Heh... Documentation is silent at this point... ;)
But clearly the first SOA should win (usually specified locally), and the
TTL should be per data source. As for the NS records, these should perhaps
be taken from the first dataset to list an SOA for the zone. This way one
can use a single generic dataset for the SOA and NS records, and be sure
that the remaining (third-party) datasets only specify data and TTLs.
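For illustration, the split proposed above might look something like this
(a sketch only; the zone and host names are made up, but the $SOA/$NS/$TTL
special entries are the ones rbldnsd datasets already use):

```
; generic.local -- first dataset listed for the zone;
; its $SOA (and, under the proposal above, its $NS) would win
$SOA 2048 bl.example.org. hostmaster.example.org 0 600 300 86400 300
$NS 2048 ns1.example.org ns2.example.org
$TTL 2048

; thirdparty.list -- a later, third-party dataset: data entries only,
; with its own $TTL applying to the records it contributes
$TTL 300
127.0.0.2 :127.0.0.2:listed by third party
```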
> DJB goes the simplest way here: he chooses arbitrary hardcoded TTL
> value which can't be changed at all. This is very simple and easy,
> but is impractical.
Indeed, I think your approach is better. It fits just right for SOA and
TTL, so the only non-obvious issue at this point is NS records.
> There's another very similar issue with A values. Suppose you want
> to combine data from different sources, giving each "source" its
> own A value, so it will be possible to do different things based on
> results of one query to this combined zone. How to force A values
> different from the original? Changing source file - one byte of
> several MBs - seems to be impractical too (you have to keep original
> file with original timestamp for rsync/whatever). A natural way
> may be to change A values using a trick similar to the one with TTL
> above - as suggested earlier, by adding some local files. But how
> to specify all the necessary logic? And what to do if original
> data ALREADY contains several "types" of listings, with different
> A records? Ugh.
Anyone who wants to go to this length can write scripts to update the
datasets. This level of complexity is not called for.
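(For reference, a script rewriting a per-source dataset could lean on the
default-value lines rbldnsd already understands in ip4set data -- a line
starting with ":" sets the A value and text for the entries that follow.
A rough sketch, with made-up addresses:

```
; sourceA.local -- entries answered with A 127.0.0.2
:127.0.0.2:listed by source A
192.0.2.1

; sourceB.local -- same style, different default A value
:127.0.0.3:listed by source B
198.51.100.0/24
```

One query against the combined zone then distinguishes the sources by the
returned A value, without touching the original multi-MB files.)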