lists at wildgooses
Mar 3, 2012, 10:21 AM
Post #14 of 19
Re: Licence compliance - capturing all source files used to make a build?
[In reply to]
>> But not all the patches are in the portage tree? Trivial example might
>> be the kernel where the ebuild is tiny and references an http location
>> for the patches?
> Then you would change the kernel ebuild in your snapshot, so that it
> becomes self-contained.
That's clearly not a practical suggestion: there are many ebuilds with
this behaviour, and "rewrite all your ebuilds" rather defeats the
benefit of using gentoo?
>> My understanding is that for a GPL licence one should provide a
>> copy of these patches in the "code dump", not just an http link?
>> Is that your understanding?
> I think your understanding is incomplete, and I recommend that you
> read through the license again.
?? Why all the cryptic hints rather than just stating the answer!
Under what circumstances do you claim it's not necessary to actually
supply the code for a patch applied to a GPL-licenced code base? I
think you are implying that it's satisfactory to "supply" code via a
twisted, nested chain of source locations, some of which may not be
under my control? As you hint, I would then risk servers outside my
control causing a compliance failure. However, this is all moot,
because my whole question is about accurately capturing all the
upstream source so that I can maintain my own cache.
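(For concreteness, the kind of cache I have in mind is roughly the
following - the paths are illustrative, and I'm assuming a stock
portage where DISTDIR controls where SRC_URI downloads land:)

```shell
# Illustrative sketch only - assumes a normal portage install;
# the cache path is my own invention.
# Point portage's download directory at a persistent cache, then
# fetch everything for the installed set without building it.
# Files listed in SRC_URI (including http-hosted patches that
# appear there) should land in the cache.
DISTDIR=/srv/distfiles-cache emerge --fetchonly --emptytree @world

# Snapshot the tree in effect at the same moment, so the ebuilds
# and the cached sources stay in step.
tar -cjf /srv/portage-snapshot.tar.bz2 -C /usr portage
```

Of course, this only captures what goes through SRC_URI; anything an
ebuild fetches by other means is exactly the gap I'm worried about.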
I'm not sure why the GPL seems to attract such special behaviour. In
every other industry one will usually provide both a legal licence and
a non-legal "summary of intent". For some reason open source advocates
seem to excel at leaping on any minor misunderstanding of their
licensing agreements, but then enjoy confounding the situation with
"nah, that's not it, but I can't give you any hints as to why I
*think* you are wrong...". Look, it's just a straightforward licence -
we don't need to be lawyers to have a stab at complying with it and
generally helping with understanding its nuances...
The big thing which annoys me is that one can comply with the letter
of the GPL with a big code dump that, let's be honest here, benefits
absolutely no one (what do you do with a lump of undocumented,
obfuscated, hacky code?). There are several open letters on the
internet discussing this, but what you are really looking for is
people getting involved with the *spirit* of the open source process
and sharing in a useful way, not just code dumping.
The piece we are discussing here is really the boring compliance
piece, which personally I think is the largely unhelpful,
last-chance-saloon kind of code dump. All the useful pieces of code I
try to push upstream. For sure the GPL provision at least means you
get the code even if *I* don't push it upstream and am uncooperative,
but really, for the vast majority of code, it's just boilerplate
reproduction of stuff you would get from upstream if you needed it
anyway...
>> So by implication it's not clear that catalyst does satisfy your GPL
>> requirements for distribution?
> I never say it did. I said that it helps with some things.
What "some things"? Previously I asked for help capturing the source
code tree and you implied that it would be correctly captured by
catalyst - however, it now seems to be becoming clear that catalyst
doesn't capture all the patches either? So we seem to be back at the
original question, and catalyst seems to have been a detour that
hasn't advanced us.
With that in mind, if you are using only catalyst, how do *you* make
sure you are GPL compliant and provide all patches/sources, etc.? (Not
a challenge - just genuinely trying to learn how others are doing
things.)
>> I suspect something more is probably happening, eg some of the linked
>> patches probably get included into the source download location and
>> probably you can pick them up there - however, there are now a LOT of
>> ways to fetching source and patches and it would be hard to be sure
>> of 100% coverage?
> Fourth time: Add bookkeeping into the epatch function.
No, it's not the "fourth time" - it was my idea in the original email!
However, patching portage is unsatisfactory in that it's fragile and
easily overwritten accidentally. By all means, if you have a less
fragile way of patching, e.g. some way to patch the eclass via an
overlay in /usr/local/portage, then I would be grateful for *that*
information.
You are just saying "do it", as if having the idea were the easy bit!
Actually the implementation seems hacky to me. Wrapping the patch
utility seems more robust, but it's still not ideal...
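(To spell out what I mean by wrapping the patch utility - this is only
a sketch, and the log directory, names, and hash-based filing scheme
are my own invention. The guts of such a shim are written here as a
function for clarity; in practice it would be a script installed ahead
of the real patch(1) in PATH, and it assumes the caller feeds the
patch on stdin, as epatch appears to.)

```shell
#!/bin/sh
# Sketch of a logging shim for patch(1). Everything here (paths,
# names, filing scheme) is illustrative, not an established convention.

PATCH_LOG_DIR="${PATCH_LOG_DIR:-/var/tmp/applied-patches}"  # where copies go
REAL_PATCH="${REAL_PATCH:-/usr/bin/patch}"                  # the real binary

patch_logged() {
    mkdir -p "$PATCH_LOG_DIR" || return 1
    tmp=$(mktemp) || return 1
    cat > "$tmp"                            # capture the patch from stdin
    sum=$(sha1sum "$tmp" | cut -c1-16)      # short content hash for the filename
    cp "$tmp" "$PATCH_LOG_DIR/$sum.patch"   # file the patch away for the code dump
    "$REAL_PATCH" "$@" < "$tmp"             # hand off to the real patch
    rc=$?
    rm -f "$tmp"
    return $rc
}
```

The attraction over hacking the eclass is that this survives portage
syncs untouched; the weakness is that it catches anything fed to
patch, relevant or not.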
> Downloading is irrelevant, especially since sometimes many more
> patches are downloaded than are actually applied.
I'm not sure I follow? My understanding is that we need to supply the
patches that are actually applied, not every patch that gets
downloaded - I think we are agreeing on that?
> It's the other way around:
> You provide a snapshot to catalyst, and catalyst builds kernel from
> that. You say what you want catalyst to build, and you create the
> You may end up doing more ebuild maintenance, but you likely want to
> do just that anyway, in order to keep track of what actually goes
> into your system.
Hmm, that's a very superficial description of what is done, but I can
infer some of what you mean.
You might be saying that you figure out every ebuild you need in your
solution, then manually patch them all to use source pulled from your
own server, plus sync all the sources from gentoo to your own server?
However, this seems like a desperate amount of work?
You might instead be saying that you just snapshot the gentoo portage
tree; however, I don't see how that helps you capture all the sources
and patches.
Can you please clarify how you generate your portage snapshot for
catalyst, and how you create your own offline snapshot of all sources
(including downloaded patches)? This, I think, is what I'm looking to
do.
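(As one concrete example of the sort of answer I'm fishing for:
portage's FETCHCOMMAND/RESUMECOMMAND variables in make.conf can be
pointed at a wrapper, so every URL portage downloads gets recorded.
The wrapper path and log location below are my own invention:)

```shell
# /etc/make.conf fragment - route all fetches through a logging wrapper.
# ${URI}, ${DISTDIR} and ${FILE} are expanded by portage itself.
FETCHCOMMAND="/usr/local/bin/fetch-and-log \${URI} \${DISTDIR}/\${FILE}"
RESUMECOMMAND="${FETCHCOMMAND}"

# /usr/local/bin/fetch-and-log - hypothetical wrapper, roughly:
#   echo "$1" >> /var/log/fetched-uris      # record what was fetched
#   exec wget -t 3 -T 60 -O "$2" "$1"       # then fetch it as portage would
```

Of course this logs everything downloaded, which per the above is a
superset of what is actually applied - but at least nothing would slip
through unrecorded.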