Re: [Metadata-Support] significant slowdown in XML Signature validation

  • From: Jeffrey Eaton <>
  • To: "" <>
  • Subject: Re: [Metadata-Support] significant slowdown in XML Signature validation
  • Date: Fri, 19 Feb 2016 03:19:10 +0000
  • Accept-language: en-US

> On Feb 18, 2016, at 7:24 PM, Cantor, Scott <> wrote:
>
> On 2/18/16, 7:06 PM, "Jeffrey Eaton" <> wrote:
>> The signature check passes (just as slowly as without the
>> EntityRoleWhiteList filter), and the filtering completes but doesn’t seem
>> to have much effect on the in-memory resident size (still taking ~250MB).
> Linux really doesn't release memory. Results on Windows would be different.
>> Loading that aggregate completes in 2 seconds, and the resulting shibd
>> process only takes up about 40 MB of RAM. That actually seems like a very
>> reasonable compromise.
> I'm not sure how that's a compromise, that's just the old non-global set of
> entities. Or am I not understanding? That would be a rollback, not a
> compromise.
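For anyone following along, the role filter mentioned above goes in the SP's
shibboleth2.xml, roughly like this (a sketch only; the URL, certificate file
name, and reload interval are placeholders for your own deployment):

```xml
<MetadataProvider type="XML"
    url="https://md.example.org/aggregate-metadata.xml"
    backingFilePath="aggregate-metadata.xml" reloadInterval="3600">
  <!-- Verify the XML Signature before trusting anything in the file. -->
  <MetadataFilter type="Signature" certificate="md-signing-cert.pem"/>
  <!-- Discard entities that carry none of the retained roles,
       i.e. keep only IdPs and attribute authorities. -->
  <MetadataFilter type="EntityRoleWhiteList">
    <RetainedRole>md:IDPSSODescriptor</RetainedRole>
    <RetainedRole>md:AttributeAuthorityDescriptor</RetainedRole>
  </MetadataFilter>
</MetadataProvider>
```

As noted above, the filter runs after the whole document is parsed and the
signature is checked, which is why it doesn't help the load time or peak
memory use.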

I believe that there is value in having a metadata file which contains the
non-global set of entities. An SP which is not published in the eduGAIN
metadata (and is therefore unknown to non-InCommon IDPs) has no need to load
those IDPs.

>> I’m still evaluating what we’re going to recommend to our SP operators,
>> but in the worst case, I can have them use the export aggregate, or my
>> split and re-signed IDP-only aggregate, with no further changes needed
>> from the InCommon side.
> I'm still not understanding the real problem, apart from the obvious trend
> that eventually it will be. I've run an SP that consumes more than this
> amount of metadata for a decade, and it's never been a concern.

Other than being a waste of resources, there’s probably no real concern.
That said, consider every SP out there that’s loading and parsing the full
metadata file, which contains a ton of SP metadata that is completely useless
to it, and potentially thousands of IDPs that aren’t useful to an SP that
isn’t part of the global federation. Why waste hundreds of megabytes of RAM
and the CPU cycles when it’s not substantially harder to publish multiple
metadata files?

I have at least one SP (not mine, but run by someone on my campus) which
runs on a machine configured with 512 MB of RAM. Chewing up half of that with
mostly useless metadata is rather unreasonable: they’re not going to be in
eduGAIN, and last week the shibd process took a small fraction of that and
worked fine for the InCommon set of IDPs. Now they’re drastically
under-resourced because shibd eats half of that memory, with no benefit to
them.

I’d like to see at least the InCommon metadata file split into SP-only and
IDP-only files, and ideally versions with and without the full eduGAIN set.
As I said before, in the worst case, I can just build something to republish
my own versions of the files for my campus if I need to do so.
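The split itself is mechanical: keep only the EntityDescriptors that carry an
IDPSSODescriptor role (the SP-only file is the mirror image). A minimal
sketch using Python's standard library is below; the function name is mine,
and a real republishing pipeline would of course also have to re-sign the
result (e.g. with xmlsectool) so SPs can validate it:

```python
# Sketch: produce an IdP-only SAML metadata aggregate by dropping every
# EntityDescriptor that has no IDPSSODescriptor role. Illustrative only;
# a production version must re-sign the output before publishing it.
import xml.etree.ElementTree as ET

MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"
ET.register_namespace("md", MD_NS)

def idp_only(aggregate_xml: str) -> str:
    """Return a copy of the aggregate containing only IdP entities."""
    root = ET.fromstring(aggregate_xml)
    for entity in list(root.findall(f"{{{MD_NS}}}EntityDescriptor")):
        # Entities without an IDPSSODescriptor (i.e. pure SPs) are removed.
        if entity.find(f"{{{MD_NS}}}IDPSSODescriptor") is None:
            root.remove(entity)
    return ET.tostring(root, encoding="unicode")
```

This ignores complications like EntityDescriptors nested inside child
EntitiesDescriptor groups, but it shows how little is involved in publishing
role-specific files.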


Archive powered by MHonArc 2.6.16.
