PRISM: NSA/FBI Internet data mining project

Mark Seiden mis at seiden.com
Fri Jun 7 23:43:59 UTC 2013


what a piece of crap this article is.

the guy doesn't understand what sniffing can and can't do.  obviously he doesn't understand peering or routing, and he doesn't understand what cdns are for.

he doesn't understand the EU Safe Harbor, saying it applies to govt entities, when it's purely about companies hosting data of EU citizens.

he quotes a source who suggests that the intel community might have privileged search access to facebook, which i don't believe.

he even says "company-owned equipment" might refer to the NSA, which i thought everybody calls the "agency" so to not confuse with the CIA.

and he suggests that these companies might have given up their "master decryption keys" (as he terms them) so that USG could decrypt SSL.

and the $20M cost per year, which would only pay for something the size of a portal or a web site, well, that's mysterious.

sheesh.

this is not journalism.


On Jun 7, 2013, at 3:54 PM, Paul Ferguson <fergdawgster at gmail.com> wrote:

> Also of interest:
> 
> http://www.guardian.co.uk/world/2013/jun/07/nsa-prism-records-surveillance-questions
> 
> - ferg
> 
> 
> On Fri, Jun 7, 2013 at 3:49 PM, Michael Hallgren <m.hallgren at free.fr> wrote:
> 
>> On 07/06/2013 19:10, Warren Bailey wrote:
>>> Five days ago anyone who would have talked about the government having this capability would have been issued another tin foil hat. We think we know the truth now, but why hasn't Echelon been brought up? I'm not calling anyone a liar, but isn't not speaking the truth the same thing?
>> 
>> 
>> ;-)
>> 
>> mh
>> 
>>> 
>>> 
>>> Sent from my Mobile Device.
>>> 
>>> 
>>> -------- Original message --------
>>> From: Matthew Petach <mpetach at netflight.com>
>>> Date: 06/07/2013 9:34 AM (GMT-08:00)
>>> To:
>>> Cc: NANOG <nanog at nanog.org>
>>> Subject: Re: PRISM: NSA/FBI Internet data mining project
>>> 
>>> 
>>> On Thu, Jun 6, 2013 at 5:04 PM, Matthew Petach <mpetach at netflight.com> wrote:
>>> 
>>>> 
>>>> On Thu, Jun 6, 2013 at 4:35 PM, Jay Ashworth <jra at baylink.com> wrote:
>>>> 
>>>>> Has fingers directly in servers of top Internet content companies,
>>>>> dates to 2007.  Happily, none of the companies listed are transport
>>>>> networks:
>>>>> 
>>>>> 
>>>>> http://www.washingtonpost.com/investigations/us-intelligence-mining-data-from-nine-us-internet-companies-in-broad-secret-program/2013/06/06/3a0c0da8-cebf-11e2-8845-d970ccb04497_story.html
>>>>> 
>>>>> Cheers,
>>>>> -- jra
>>>>> --
>>>>> Jay R. Ashworth                  Baylink                  jra at baylink.com
>>>>> Designer                     The Things I Think                     RFC 2100
>>>>> Ashworth & Associates     http://baylink.pitas.com        2000 Land Rover DII
>>>>> St Petersburg FL USA               #natog                     +1 727 647 1274
>>>>> 
>>>>> 
>>>> I've always just assumed that if it's in electronic form,
>>>> someone else is either reading it now, has already read
>>>> it, or will read it as soon as I walk away from the screen.
>>>> 
>>>> Much less stress in life that way.  ^_^
>>>> 
>>>> Matt
>>>> 
>>>> 
>>> When I posted this yesterday, I was speaking somewhat
>>> tongue-in-cheek, because we hadn't yet made a formal
>>> statement to the press.  Now that we've made our official
>>> reply, I can echo it, and note that whatever fluffed-up
>>> PowerPoint was passed around to the Washington Post,
>>> it does not reflect reality.  There are no optical taps in
>>> our datacenters funneling information out, there are no
>>> sooper-seekret backdoors in the software that funnel
>>> information to the government.  As our formal reply
>>> stated: "Yahoo does not provide the government with
>>> direct access to its servers, systems, or network."
>>> I believe the other major players supposedly listed
>>> in the document have released similar statements,
>>> all indicating a similar lack of super-cheap government
>>> listening capabilities.
>>> 
>>> Speaking just for myself (and if you quote me on this
>>> as speaking on anyone else's behalf, you're a complete
>>> fool): if the government was able to build infrastructure
>>> that could listen to all the traffic from a major provider
>>> for a fraction of what it costs them to handle that traffic
>>> in the first place, I'd be truly amazed--and I'd probably
>>> wonder why the company didn't outsource their infrastructure
>>> to the government, if they can build and run it so much
>>> more cheaply than the commercial providers.  ;P
>>> 7 companies were listed; if we assume the
>>> burden was split roughly evenly between them, that's
>>> 20M/7, about $2.85M per company per year to tap in,
>>> or about $238,000/month per company listed, to
>>> supposedly snoop on hundreds of gigs per second
>>> of data.  Two ways to handle it: tap in, and funnel
>>> copies of all traffic back to distant monitoring posts,
>>> or have local servers digesting and filtering, just
>>> extracting the few nuggets they want, and sending
>>> just those back.
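>>> 
>>> As a quick Python sketch (purely illustrative, assuming the even
>>> 7-way split of the reported budget described above):
>>> 
>>>     yearly_budget = 20_000_000                    # reported program cost per year
>>>     companies = 7                                 # companies listed, per above
>>>     per_company_year = yearly_budget / companies  # ~$2.86M per company per year
>>>     per_company_month = per_company_year / 12     # ~$238K per company per month
>>>     print(f"${per_company_year:,.0f}/yr, ${per_company_month:,.0f}/mo per company")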
>>> 
>>> Let's take the first case; doing optical taps, or other
>>> form of direct traffic mirroring, carrying it untouched
>>> offsite to process; that's going to mean the ability to
>>> siphon off hundreds of Gbps per datacenter and carry
>>> it offsite for $238k/month; let's figure a major player
>>> has data split across at least 3 datacenters, so about
>>> $75K/month per datacenter to carry say 300Gbps of
>>> traffic.  It's pretty clearly going to have to be DWDM
>>> on dark fiber at that traffic volume; most recent
>>> quotes I've seen for dark fiber put it at $325/mile
>>> for already-laid-in-ground (new builds are considerably
>>> more, of course).  If we figure the three datacenters
>>> are split around just the US, on average you're going
>>> to need to run about 1500 miles to reach their central
>>> listening post; that's $49K/month just to carry the
>>> bitstream, which leaves you just about $25K/month
>>> to run the servers to digest that data; at 5c/kWh, a
>>> typical server pulling 300 watts is gonna cost you $11/month
>>> to run; let's assume each server can process 2Gbps of
>>> traffic, constantly; 150 servers for the stream of 300Gbps
>>> means we're down to $22K for the rest of our support
>>> costs; figure two sysadmins getting paid $10k/month
>>> to run the servers (120k annual salary), and you've got
>>> just $2k for G&A overhead.
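>>> 
>>> The same arithmetic as a rough Python sketch (all figures are the
>>> approximations above; 720 hours to a month and the ~$49K/month
>>> long-haul number are taken as given, not re-derived):
>>> 
>>>     monthly_per_company = 20_000_000 / 7 / 12   # ~$238K/month, per the split above
>>>     per_datacenter = monthly_per_company / 3    # ~$79K/month (call it $75K)
>>>     fiber_transport = 49_000                    # haul ~300Gbps ~1500 miles offsite
>>>     servers = 300 // 2                          # 150 boxes at ~2Gbps each
>>>     kwh_per_server = 0.300 * 24 * 30            # ~216 kWh/month at 300 watts
>>>     power = servers * kwh_per_server * 0.05     # ~$1.6K/month at 5c/kWh
>>>     sysadmins = 2 * 10_000                      # $20K/month in salaries
>>>     leftover = per_datacenter - fiber_transport - power - sysadmins
>>>     print(f"left for G&A and everything else: ${leftover:,.0f}/month")
>>> 
>>> However you round the intermediate steps, you're left with only a
>>> few thousand dollars a month for everything else, which is the point.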
>>> 
>>> That's a heck of an efficient operation they'd have to be
>>> running to listen in on all the traffic for the supposed
>>> budget number claimed.
>>> 
>>> I'm late for work; I'll follow up with a run-through of the
>>> other model, doing on-site digestion and processing
>>> later, but I think you can see the point--it's not realistic
>>> to think they can handle the volumes of data being
>>> claimed at the price numbers listed.  If they could,
>>> the major providers would already be doing it for
>>> much cheaper than they are today.  I mean, the
>>> Utah datacenter they're building is costing them
>>> $2B; does anyone really think if they're
>>> overpaying that much for datacenter space, they
>>> could really snoop on provider traffic for only
>>> $238K/month?
>>> 
>>> More later--and remember, this is purely my own
>>> rampant speculation, I'm not speaking for anyone,
>>> on behalf of anyone, or even remotely authorized
>>> or acknowledged by any entity on this rambling,
>>> so please don't go quoting this anywhere else,
>>> it'll make you look foolish, and probably get me
>>> in trouble anyhow.  :(
>>> 
>>> Matt
>> 
>> 
> 
> 
> 
> --
> "Fergie", a.k.a. Paul Ferguson
> fergdawgster(at)gmail.com
> 




