TBR News January 14, 2014



The Voice of the White House



          Washington, D.C. January 11, 2014: “The Internet has proven to be the greatest source of information since lunatic Christians burnt down the Library of Alexandria. Anything being sought, be it an address or an in-depth analysis of the Dead Sea Scrolls, is there, and this is the main reason the famous Encyclopedia Britannica has gone out of business.


            At the same time, because it is open to one and all, the Internet is also a breeding ground for a legion of strange persons with a frantic desire to air their pet theses, themselves and their friends.


            We see earnest discussions about the 1963 assassination of President Kennedy, the Sinister Truth about Hurricane Katrina, and Tesla Death Rays used to bring down the buildings of the WTC, balanced with other information proving beyond a shadow of a doubt that Russian bombers were used. We also discover the evil plotting of the Illuminati, a group that has been long gone, or that the Rothschild banking house has taken over the whole world. And from one source, now long vanished, we discover that Houston was destroyed by a nuclear bomb set off by Jewish radicals, or that the Fukushima disaster was really caused by an Israeli submarine using German-made nuclear torpedoes!


            And please do not forget that President Obama was born on Mars and flown to this planet in a special helicopter, piloted by the Evil Xenu! Remember, scientists have proven this so it must be true!


            Yes, the Internet can entertain as well as inform.”




No NSA fears as tech hawks data-hungry devices


January 10, 2014

by Tony Romm



            LAS VEGAS — The most significant privacy debate in recent history is rattling an NSA-wary Washington, but you wouldn’t have known it here at one of the largest tech gatherings in the world.


            The International Consumer Electronics Show this week practically overflowed with gadgets that promise to improve daily tasks like driving and shopping — all by collecting consumers’ personal information. The flashy car systems, baby sensors, and smartphone-connected refrigerators and door locks amass truckloads of new data about users and their daily habits.

            That’s business-as-usual for Silicon Valley’s tech industry, where ever-more-granular details about consumers’ lives and preferences are seen as engines for innovation. But it provides a sharp contrast to the mood in the nation’s capital, where revelations about government surveillance are sparking heated political debate about the protections Americans should have from invasive snooping.


            While many of the major tech companies at CES have lobbied to limit National Security Agency surveillance, the industry as a whole hasn’t turned its gaze inward to the information it also collects. The NSA bombshells have been afterthoughts here at the Vegas show, where the focus instead is on eye-popping new tech toys.


            “I was a little surprised it hadn’t bled over to CES,” said Gary Shapiro, the head of the Consumer Electronics Association, about the privacy debates back in the Beltway.


            But Shapiro, like others here, argued there’s a difference between the government collecting data and companies collecting information on their customers. People understand that turning over personal information is the “cost of admission” for using new products and services like the kind on display at the electronics show, he said.


            And if CES indicated anything, it’s that companies are finding novel ways to amass that data.


            Entire zones of the CES show floor this year were dedicated to devices like fitness trackers that share your glucose level with doctors and in-home systems that alert you when your fridge is empty. And carmakers are developing vehicles that understand your driving style — and moods — on the road.


            The vision is that these tools can improve health, ease daily burdens like shopping and improve congestion on city streets. But the tradeoff is that users must reveal new aspects of their private lives — in ways they may not fully understand.


             Washington in many ways is playing catch-up on the privacy implications of these new commercial tools, and some at CES want to keep it that way.


            An entire panel devoted to the so-called “Internet of Things,” the wave of new digitally connected objects and appliances, argued against regulation of the emerging space. Robert McDowell, the former FCC commissioner now at the Hudson Institute, said government needs to “be very careful and allow markets to develop.” Another panel focusing on in-car computers echoed that same message, just days after a government watchdog faulted automakers over their practices in tracking drivers’ locations.


            For the most part, though, privacy issues — and concerns about surveillance — were notably absent from the show.


            When Cisco CEO John Chambers took the stage for a CES keynote, for example, he made no mention of the NSA, even after the company recently suggested it lost global business because of U.S. surveillance fears. Instead, Chambers talked up his vision for the “Internet of Everything,” describing $19 trillion up for grabs for companies that connect cities, homes and more.


            For now, it’s the tech industry’s biggest names — including Google and Microsoft — that have had the most vocal reaction to the leaks from Edward Snowden. The companies and others, fearful that the NSA revelations are eroding public trust and global business, have pressed the Obama administration to limit government surveillance practices — even as they bristle at attempts to combine the issue with commercial privacy reform.


            The NSA has been “hanging over all the privacy discussions on an international level, and even on a domestic level,” said FTC Commissioner Maureen Ohlhausen, an attendee in Las Vegas.

Asked, though, if the surveillance debate had fueled concerns about commercial privacy, she said: “I haven’t gotten that sense so much at CES.”



Massive Target breach could have lasting effects


January 10 2014

by Michelle Chapman 




AP Business Writer

NEW YORK (AP) — Fallout from Target’s pre-Christmas security breach is likely to affect the company’s sales and profits well into the new year.


The company disclosed on Friday that the massive data theft was significantly more extensive and affected millions more shoppers than the company reported in December. As a result of the breach, millions of Target customers have become vulnerable to identity theft, experts say.


The nation’s second-largest discounter said hackers stole personal information — including names, phone numbers and email and mailing addresses — from as many as 70 million customers as part of a data breach it discovered last month.


Target announced on Dec. 19 that some 40 million credit and debit card accounts had been affected by a data breach that happened between Nov. 27 and Dec. 15 — just as the holiday shopping season was getting into gear. As part of that announcement, the company said customers’ names, credit and debit card numbers, card expiration dates, debit-card PINs and the embedded code on the magnetic strip on the back of cards had been stolen.


According to new information gleaned from its investigation with the Secret Service and the Department of Justice, Target said Friday that criminals also took non-payment-card data for some 70 million shoppers, who could have made purchases at Target stores outside the late-November to mid-December timeframe.


Some overlap exists between the two data sets, the company said Friday. That means that more than 70 million people may have had their data stolen.


The latest developments come as Target said that just this week it was starting to see sales recover from the crisis. The company, however, cut its earnings outlook for the quarter that covers the holiday season and warned that sales would be down for the crucial period.


But with the latest news, some analysts believe the breach could be a financial drag on the company for several more quarters.


“Target is in a critical situation with consumers because its credibility and brand loyalty are being questioned,” said David E. Johnson, CEO of Strategic Vision, LLC, which specializes in crisis communications. “Right now, investors think Target can weather the storm. But the longer it gets worse, the worse it is for Target.”


Meanwhile, New York Attorney General Eric T. Schneiderman announced that his office is participating in an investigation into the security breach, calling the latest news “deeply troubling.”


Molly Snyder, a Target spokeswoman, told The Associated Press that she didn’t have new details to share about how the data breach was conducted.


“I know that it is frustrating for our guests to learn that this information was taken and we are truly sorry they are having to endure this,” said Gregg Steinhafel, Target chairman, president and CEO, in a statement.


The theft from Target’s databases could potentially be the largest data breach on record, surpassing an incident uncovered in 2007 that saw more than 90 million credit card accounts pilfered from TJX Cos. Inc.


Target investors have been largely unmoved by the company’s disclosures. Target’s stock, while volatile, has traded at about $63 since news of the breach leaked on Dec. 18. It slipped 80 cents, or about 1.3 percent, to $62.54 in trading Friday.


But some observers believe the stock could get battered if consumers stay away from Target stores. Several Wall Street analysts downgraded their earnings forecasts for the retailer on Friday.


Colleen McCarthy, 26, of Cleveland, Ohio, is among those who are avoiding Target. McCarthy used her Chase debit card at a local Target on the Friday after Thanksgiving and received a notice from Chase a few days after news of the breach first broke. The letter identified her as a potential victim of the Target breach but said, “don’t worry.” At the time, she was only somewhat concerned.


But Monday night McCarthy received a call from Chase, alerting her that someone tried to use her debit account twice in Michigan. The thief cleared $150, which caused her rent check to bounce. Chase restored the money to her account. “This has been a nightmare,” she said. “My rent check bounced. My debit card had to be canceled. And who’s to say what other people have access to my information?”


Target tried to woo scared shoppers back to stores on the last weekend before Christmas with a 10 percent discount on nearly everything in its stores. Target is also offering a year of free credit monitoring and identity theft protection to customers who shopped at its stores.


Still, some experts believe the company should do more. Johnson of Strategic Vision says Target needs to rebuild shoppers’ trust. He believes Target needs to air TV commercials assuring them that it’s safe to shop in its stores. It also should offer more incentives like deeper discounts to woo consumers, Johnson said.


Clearly, Target shoppers were scared off during the holiday season, when stores can make roughly 20 percent to 40 percent of their annual revenue.


The Minneapolis company also said that it now expects fourth-quarter sales at stores open at least a year to be down about 2.5 percent. It previously predicted those sales would be about flat.


This figure is a closely watched indicator of a retailer’s health.


Target cautioned that its fourth-quarter financials may include charges related to the data breach. The chain said the costs tied to the breach may have a material adverse effect on its quarterly results as well as future periods.


The company has 1,921 stores, with 1,797 locations in the U.S. and 124 in Canada.


More well-known U.S. retailers victims of cyber attacks – sources


January 12, 2014

by Jim Finkle and Mark Hosenball



BOSTON/WASHINGTON – Target Corp and Neiman Marcus are not the only U.S. retailers whose networks were breached over the holiday shopping season last year, according to sources familiar with attacks on other merchants that have yet to be publicly disclosed.


Smaller breaches at at least three other well-known U.S. retailers took place, conducted using techniques similar to those used against Target, according to the people familiar with the attacks. Similar breaches may also have occurred earlier last year.


The sources said that they involved retailers with outlets in malls, but declined to elaborate. They also said that while they suspect the perpetrators may be the same as those who launched the Target attack, they cannot be sure because they are still trying to find the culprits behind all of the security breaches.


Law enforcement sources have said they suspect the ringleaders are from Eastern Europe, which is where most big cyber crime cases have been hatched over the past decade.


Only one other well-known retailer, Neiman Marcus, has said that it, too, has been the victim of a cyber attack since Target’s December 19 disclosure that some 40 million payment card numbers had been stolen. On Friday, Target said the data breach was worse than initially thought.


An investigation found that hackers stole the personal information of at least 70 million customers, including names, mailing addresses, telephone numbers and email addresses. Neiman Marcus said it was not sure if the breach was related to the Target incident.


Most states have laws that require companies to contact customers when certain personal information is compromised. In many cases the task of notification falls on the credit card issuer.


Merchants are required to report breaches of personal information, including Social Security numbers. It was not immediately clear whether that was the case with the retailers attacked around the same time as Target.


The Secret Service and Department of Justice, which are investigating the Target breach, declined to comment on Saturday.




Target has not disclosed how the attackers managed to breach its network or siphon off some of its most sensitive data.


The sources who spoke to Reuters about the breaches said that investigators believe the attackers used similar techniques and pieces of malicious software to steal data from Target and other retailers.


One of the pieces of malware they used was something known as a RAM scraper, or memory-parsing software, which enables cyber criminals to grab encrypted data by capturing it when it travels through the live memory of a computer, where it appears in plain text, the sources said.
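The scraping step described above can be illustrated with a harmless sketch: once card data sits decrypted in a process’s memory, finding it is little more than a pattern scan plus the Luhn checksum that all payment card numbers satisfy. This is a minimal educational illustration, not code from any actual malware — the buffer contents, function names, and regex are assumptions of mine:

```python
import re

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_memory(buffer: bytes) -> list[str]:
    """Find 13-16 digit runs in a raw buffer that pass the Luhn check."""
    text = buffer.decode("latin-1", errors="replace")
    return [m for m in re.findall(r"(?<!\d)\d{13,16}(?!\d)", text)
            if luhn_valid(m)]

# A simulated slice of process memory holding a card number in plain text
# (4111111111111111 is a well-known Visa test number).
snapshot = b"\x00GET /checkout\x00PAN=4111111111111111\x00exp=0116\x00junk\x00"
print(scan_memory(snapshot))   # ['4111111111111111']
```

The point of the sketch is the one the article makes: encryption on disk or on the wire does not help while the number transits RAM in plain text, because a simple scan like this recovers it.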


While the technology has been around for many years, its use has increased in recent years as retailers have improved their security, making it more difficult for hackers to obtain credit card data using other approaches.


Visa Inc issued two alerts last year about a surge in cyber attacks on retailers that specifically warned about the threat from memory parsing malware.


The alerts, published in April and August, provided retailers with technical details on how the attacks were launched and advice on thwarting them.


A Visa spokeswoman declined comment on the reports, which did not identify specific victims.


It was not clear whether Target’s security team had implemented the measures that Visa had recommended to mitigate the risks of being attacked.


Yet a law enforcement source familiar with the breach said that even if the retailer had implemented those steps, the efforts may not have succeeded in stopping the attack.


That is because the attackers were more sophisticated than those behind the previous attacks described in the Visa alerts, according to the source, who asked not to be identified because they were not authorized to discuss the matter publicly.




Retailers are often reluctant to report breaches out of concern it could hurt their businesses. Target only acknowledged its 2013 attack after security blogger Brian Krebs reported the breach, prompting inquiries from journalists and investors.


Neiman Marcus said an outside forensics firm discovered evidence on January 1 that indicated the retailer had been the victim of a cyber attack. It disclosed the breach nine days later, after another inquiry from Krebs, who was following up on reports about a surge in fraudulent charges traced to the retailer.


Target and J.C. Penney Co Inc. waited more than two years to admit that they were victims in 2007 of notorious hacker Albert Gonzalez, who was accused of masterminding the theft and resale of millions of credit and ATM card numbers.


During his trial, the companies were represented by lawyers who did not identify their clients as Target and J.C. Penney.


Doug Johnson, vice president of risk management policy with the American Bankers Association, said banks and credit card firms like Visa are forbidden from naming merchants that have been breached, unless they disclose it themselves.


“It is really frustrating to the bank and also the customer,” Johnson said.


One of the sources who told Reuters about the recent rash of attacks said the memory parsing malware cited in the Visa reports was among the tools that the hackers had used, but said they used other techniques as well.


Target spokeswoman Molly Snyder said the retailer is not commenting on the company’s investigation of the breach.


“This continues to be an active and ongoing investigation. It would be inappropriate to discuss details at this point.”


Avivah Litan, a security analyst for Gartner, the Stamford, Connecticut-based information technology research firm, said she learned from a forensics investigator about a separate set of breaches, dating back no more than a few months before the November 28 Thanksgiving Day start of the holiday shopping season. She declined to provide his name.


“Target was not the only retailer who got hit, but they got hit the biggest,” Litan said.


Investigators believe that the earlier attacks on retailers, staged before late November, were mostly trial runs that helped the hackers perfect the new techniques they then used against Target, stealing payment cards at unprecedented speed, Litan said.


Chris Gray, director of the risk and compliance practice at Accuvant, a Denver, Colorado-based information security firm, said that sophisticated cyber crime groups do that because they have only one chance to get it right before victims catch on.


“You want to test it and make sure it works,” Gray said. “Then you push it out at the appropriate time and do as much damage as you can.”


(Reporting by Jim Finkle in Boston and Mark Hosenball in Washington; Editing by Grant McCool)




Gates, Obama and denying reality in the Middle East


January 8, 2014

by David Rohde 



            The talk about former Defense Secretary Bob Gates’ blistering new memoir “Duty” has focused on the description of President Barack Obama’s tense 2011 Situation Room meeting with his top military advisers. A frustrated Obama expresses doubts about General David Petraeus, then U.S. commander in Afghanistan, and questions whether the administration can do business with Afghan President Hamid Karzai.


“As I sat there,” Gates wrote, “I thought: The president doesn’t trust his commander, can’t stand Karzai, doesn’t believe in his own strategy and doesn’t consider the war to be his. For him, it’s all about getting out.”


Republicans quickly seized on these criticisms as proof Obama was a dithering commander in chief. Democrats, in turn, hailed Obama for standing up to the Pentagon brass.


            Yet the book — and the reactions to it — represents something far larger: a fundamental, post-Iraq and Afghanistan change in how Americans view the use of military force. Gates, joining Obama, liberal Democrats and libertarian Republicans, is arguing that Washington relies on military intervention far too often.


            “Today, too many ideologues call for U.S. force as the first option rather than a last resort,” Gates wrote in a short excerpt that ran in the Wall Street Journal. “On the left, we hear about the ‘responsibility to protect’ civilians to justify military intervention in Libya, Syria, Sudan and elsewhere. On the right, the failure to strike Syria or Iran is deemed an abdication of U.S. leadership.”


“There are limits to what even the strongest and greatest nation on Earth can do,” he added, “and not every outrage, act of aggression, oppression or crisis should elicit a U.S. military response.”


For all the talk about stepping back from the region, however, the administration’s Middle East priorities still match those of Republican and Democratic administrations for the last 50 years.


Consider Obama’s landmark U.N. speech in September, when he laid out his second-term aspirations. He stated that the United States would “use all elements of our power, including military force,” to secure four “core interests” in the region.

He vowed to “confront external aggression” against our allies, “ensure the free flow of energy,” dismantle terrorist networks that “threaten our people” and “not tolerate the development or use of weapons of mass destruction.”


Yet Sunday, when militants affiliated with al Qaeda seized control of parts of the Iraqi cities Ramadi and Fallujah, the White House offered a different scenario.

“It’s not in America’s interests to have troops in the middle of every conflict in the Middle East,” Benjamin Rhodes, Obama’s deputy national security adviser, said in an email to the New York Times. “Or to be permanently involved in open-ended wars in the Middle East.”


James Jeffrey, who served as U.S. ambassador to Iraq from 2010 to 2012, told me Tuesday that the real problem is that the White House tries to have it both ways politically — seeking to protect American economic interests even as it talks of withdrawal.


“They want everything,” Jeffrey said.


During a telephone interview, Jeffrey stated that if Obama wants to achieve the four Mideast goals that he laid out in his U.N. speech, he must maintain the credible threat of military force. This means air strikes and other limited efforts, not Iraq-style invasions. Jeffrey specifically criticized the administration for repeatedly suggesting that any U.S. force would lead to another Iraq.


“The sin of this administration is conflating any use of military force with that,” Jeffrey said, referring to Iraq.


In an email exchange with me Tuesday, Rhodes flatly rejected that criticism and insisted the administration has used “many different ways to advance U.S. interests.”


Rhodes noted that the United States uses force in the region, citing drone strikes against militants in Yemen. Washington provides military aid to Iraq, he said, as well as to other governments battling militants. And he said the administration uses diplomacy — referring to current efforts to reach a nuclear agreement with Iran.


            “It is dangerous and costly to simply revert, time and again, to the use of military force as the only way to advance our interests,” Rhodes added. “It has to be seen as one tool among many.”


Rhodes’ points about the administration’s actual policies are correct. But the White House rhetoric is inconsistent and contradictory.


The administration sounds a pacifist tone in the United States but has carried out covert drone strikes that have killed more than 2,000 people around the world. It talks of upholding international norms but raises the specter of “another Iraq” when it comes to using conventional military force.


The administration’s messaging on Syria has been particularly erratic. Obama first demanded President Bashar al-Assad’s ouster and threatened air strikes if the “red line” of WMD was crossed. He then backed down on both.


For better or worse, the world’s economy — and America’s — remains deeply entangled with the Middle East. Even if the United States becomes energy independent, oil from the region fuels China’s production of cheap consumer goods for Americans. It also supports European growth, which boosts U.S. companies’ profits.


If the Middle East descends into chaos and oil prices soar, the world economy — and America’s — would stall.


Obama’s U.N. speech was one of his best. He should stand by those four core American interests and, if needed, use limited force as a last resort to defend them.

Yes, the United States should mount fewer military interventions in the region. But that does not absolve Obama — and all of us — from facing difficult choices in the Middle East.


Americans do benefit from a world economic order based on cheap, reliable Middle Eastern oil. Pretending we don’t is a fantasy.


California Legislators Introduce Bill To Banish NSA

Bipartisan duo wants to cut NSA’s utilities, ban research at state schools and impose sanctions on contractors


January 7, 2014

by Steven Nelson

US News


            A bipartisan team of California state senators introduced legislation Monday that would prohibit the state and its localities from providing “material support” to the National Security Agency.


            If the bill becomes law, it would deny NSA facilities access to water and electricity from public utilities, impose sanctions on companies trying to fill the resulting void and outlaw NSA research partnerships with state universities.


             Companies with state contracts also would be banned from working with the NSA.


            “I agree with the NSA that the world is a dangerous place,” state Sen. Ted Lieu, the bill’s Democratic co-author, said in a statement. “That is why our founders enacted the Bill of Rights. They understood the grave dangers of an out-of-control federal government.”


            Lieu said the NSA’s surveillance programs pose “a clear and present danger to our liberties.”


            “The last time the federal government massively violated the U.S. Constitution,” he said, “over 100,000 innocent Americans were rounded up and interned.”


            State Sen. Joel Anderson, a Republican, is Lieu’s co-author. The California state senate has 40 members.


            “I support this bill because I support the Constitution, our Fourth Amendment rights and our freedoms to live in the United States of America,” Anderson said.


            The bill’s intent is largely symbolic. Universities might be affected, but the NSA does not currently operate a large data facility in the state.


            A similar bill was introduced in Arizona by state Sen. Kelli Ward, a Republican, in December. Ward described her bill as a preventive strike and a way “to back our neighbors [in Utah] up.”


            The OffNow coalition of advocacy groups is urging Utah lawmakers to pass their own version of the legislation to override the city of Bluffdale’s water contract with the NSA’s $1.5 billion Utah Data Center. No legislator has publicly announced they will sponsor the bill.


            The NSA is based in Fort Meade, Md. Its massive phone and Internet surveillance programs – secretly authorized for years by the Foreign Intelligence Surveillance Court – were revealed in June by whistle-blower Edward Snowden. A federal judge ruled Dec. 16 the bulk collection of phone records almost certainly violates the Fourth Amendment, but another judge disagreed. As court challenges pend, any substantial federal legislation curbing the NSA likely would be vetoed by President Barack Obama, a supporter of the NSA programs.


            The Arizona and California bills are based on model legislation drafted by the Tenth Amendment Center, which organized the OffNow coalition with the Bill of Rights Defense Committee.


             “Violations of our basic civil liberties impact us all – Democrats, Republicans and independents alike,” Mike Maharrey of the Tenth Amendment Center said. “For all of our political bickering, Americans rally around certain core principles enshrined in our Constitution. It’s fitting that Lieu and Anderson are standing together to defend these values.”


            The California bill would specifically ban the state and its political subdivisions from “[p]roviding material support, participation or assistance in any form to a federal agency that claims the power, by virtue of any federal law, rule, regulation or order, to collect electronic data or metadata of any person pursuant to any action not based on a warrant that particularly describes the person, place and thing to be searched or seized.”



Peak Oil Is Dead

Long Live Peak Oil!

by Michael T. Klare



Among the big energy stories of 2013, “peak oil” — the once-popular notion that worldwide oil production would soon reach a maximum level and begin an irreversible decline — was thoroughly discredited.  The explosive development of shale oil and other unconventional fuels in the United States helped put it in its grave.


As the year went on, the eulogies came in fast and furious. “Today, it is probably safe to say we have slayed ‘peak oil’ once and for all, thanks to the combination of new shale oil and gas production techniques,” declared Rob Wile, an energy and economics reporter for Business Insider.  Similar comments from energy experts were commonplace, prompting an R.I.P. headline at Time.com announcing, “Peak Oil is Dead.”


Not so fast, though.  The present round of eulogies brings to mind Mark Twain’s famous line: “The reports of my death have been greatly exaggerated.”  Before obits for peak oil theory pile up too high, let’s take a careful look at these assertions.  Fortunately, the International Energy Agency (IEA), the Paris-based research arm of the major industrialized powers, recently did just that — and the results were unexpected.  While not exactly reinstalling peak oil on its throne, it did make clear that much of the talk of a perpetual gusher of American shale oil is greatly exaggerated.  The exploitation of those shale reserves may delay the onset of peak oil for a year or so, the agency’s experts noted, but the long-term picture “has not changed much with the arrival of [shale oil].”


The IEA’s take on this subject is especially noteworthy because its assertion only a year earlier that the U.S. would overtake Saudi Arabia as the world’s number one oil producer sparked the “peak oil is dead” deluge in the first place.  Writing in the 2012 edition of its World Energy Outlook, the agency claimed not only that “the United States is projected to become the largest global oil producer” by around 2020, but also that with U.S. shale production and Canadian tar sands coming online, “North America becomes a net oil exporter around 2030.”


That November 2012 report highlighted the use of advanced production technologies — notably horizontal drilling and hydraulic fracturing (“fracking”) — to extract oil and natural gas from once inaccessible rock, especially shale.  It also covered the accelerating exploitation of Canada’s bitumen (tar sands or oil sands), another resource previously considered too forbidding to be economical to develop.  With the output of these and other “unconventional” fuels set to explode in the years ahead, the report then suggested, the long-awaited peak of world oil production could be pushed far into the future.


The release of the 2012 edition of World Energy Outlook triggered a global frenzy of speculative reporting, much of it announcing a new era of American energy abundance. “Saudi America” was the headline over one such hosanna in the Wall Street Journal.  Citing the new IEA study, that paper heralded a coming “U.S. energy boom” driven by “technological innovation and risk-taking funded by private capital.”  From then on, American energy analysts spoke rapturously of the capabilities of a set of new extractive technologies, especially fracking, to unlock oil and natural gas from hitherto inaccessible shale formations.  “This is a real energy revolution,” the Journal crowed.


But that was then. The most recent edition of World Energy Outlook, published this past November, was a lot more circumspect.  Yes, shale oil, tar sands, and other unconventional fuels will add to global supplies in the years ahead, and, yes, technology will help prolong the life of petroleum.  Nonetheless, it’s easy to forget that we are also witnessing the wholesale depletion of the world’s existing oil fields and so all these increases in shale output must be balanced against declines in conventional production.  Under ideal circumstances — high levels of investment, continuing technological progress, adequate demand and prices — it might be possible to avert an imminent peak in worldwide production, but as the latest IEA report makes clear, there is no guarantee whatsoever that this will occur.


Inching Toward the Peak


Before plunging deeper into the IEA’s assessment, let’s take a quick look at peak oil theory itself.


As developed in the 1950s by petroleum geologist M. King Hubbert, peak oil theory holds that any individual oil field (or oil-producing country) will experience a high rate of production growth during initial development, when drills are first inserted into an oil-bearing reservoir.  Later, growth will slow, as the most readily accessible resources have been drained and a greater reliance has to be placed on less productive deposits.  At this point — usually when about half the resources in the reservoir (or country) have been extracted — daily output reaches a maximum, or “peak,” level and then begins to subside.  Of course, the field or fields will continue to produce even after peaking, but ever more effort and expense will be required to extract what remains.  Eventually, the cost of production will exceed the proceeds from sales, and extraction will be terminated.
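Hubbert formalized this pattern with a logistic curve. A minimal statement of the model (the symbols here are illustrative shorthand, not drawn from the article):

```latex
% Cumulative extraction Q(t) follows a logistic curve rising toward
% the ultimately recoverable resource, URR:
Q(t) = \frac{\mathrm{URR}}{1 + e^{-k\,(t - t_m)}}

% Production is the derivative of cumulative extraction:
P(t) = \frac{dQ}{dt} = k\, Q(t)\left(1 - \frac{Q(t)}{\mathrm{URR}}\right)
```

Because the derivative of a logistic curve is largest at its midpoint, production $P(t)$ peaks exactly when $Q = \mathrm{URR}/2$ — which is why, in this model, the peak arrives when roughly half the recoverable resource has been extracted.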


For Hubbert and his followers, the rise and decline of oil fields is an inevitable consequence of natural forces: oil exists in pressurized underground reservoirs and so will be forced up to the surface when a drill is inserted into the ground.  However, once a significant share of the resources in that reservoir has been extracted, the field’s pressure will drop and artificial means — water, gas, or chemical insertion — will be needed to restore pressure and sustain production.  Sooner or later, such means become prohibitively expensive.


Peak oil theory also holds that what is true of an individual field or set of fields is true of the world as a whole.  Until about 2005, it did indeed appear that the globe was edging ever closer to a peak in daily oil output, as Hubbert’s followers had long predicted.  (He died in 1989.)  Several recent developments have, however, raised questions about the accuracy of the theory.  In particular, major private oil companies have taken to employing advanced technologies to increase the output of the reservoirs under their control, extending the lifetime of existing fields through the use of what’s called “enhanced oil recovery,” or EOR.  They’ve also used new methods to exploit fields once considered inaccessible in places like the Arctic and deep oceanic waters, thereby opening up the possibility of a most un-Hubbertian future.


In developing these new technologies, the privately owned “international oil companies” (IOCs) were seeking to overcome their principal handicap: most of the world’s “easy oil” — the stuff Hubbert focused on that comes gushing out of the ground whenever a drill is inserted — has already been consumed or is controlled by state-owned “national oil companies” (NOCs), including Saudi Aramco, the National Iranian Oil Company, and the Kuwait National Petroleum Company, among others.  According to the IEA, such state companies control about 80% of the world’s known petroleum reserves, leaving relatively little for the IOCs to exploit.


To increase output from the limited reserves still under their control — mostly located in North America, the Arctic, and adjacent waters — the private firms have been working hard to develop techniques to exploit “tough oil.”  In this, they have largely succeeded: they are now bringing new petroleum streams into the marketplace and, in doing so, have shaken the foundations of peak oil theory.


Those who say that “peak oil is dead” cite just this combination of factors.  By extending the lifetime of existing fields through EOR and adding entire new sources of oil, the global supply can be expanded indefinitely.  As a result, they claim, the world possesses a “relatively boundless supply” of oil (and natural gas).  This, for instance, was the way Barry Smitherman of the Texas Railroad Commission (which regulates that state’s oil industry) described the global situation at a recent meeting of the Society of Exploration Geophysicists.


Peak Technology


In place of peak oil, then, we have a new theory that as yet has no name but might be called techno-dynamism.  There is, this theory holds, no physical limit to the global supply of oil so long as the energy industry is prepared to, and allowed to, apply its technological wizardry to the task of finding and producing more of it.  Daniel Yergin, author of the industry classics, The Prize and The Quest, is a key proponent of this theory.  He recently summed up the situation this way: “Advances in technology take resources that were not physically accessible and turn them into recoverable reserves.”  As a result, he added, “estimates of the total global stock of oil keep growing.”


From this perspective, the world supply of petroleum is essentially boundless.  In addition to “conventional” oil — the sort that comes gushing out of the ground — the IEA identifies six other potential streams of petroleum liquids: natural gas liquids; tar sands and extra-heavy oil; kerogen oil (petroleum solids derived from shale that must be melted to become usable); shale oil; coal-to-liquids (CTL); and gas-to-liquids (GTL).  Together, these “unconventional” streams could theoretically add several trillion barrels of potentially recoverable petroleum to the global supply, conceivably extending the Oil Age hundreds of years into the future (and in the process, via climate change, turning the planet into an uninhabitable desert).


But just as peak oil had serious limitations, so, too, does techno-dynamism.  At its core is a belief that rising world oil demand will continue to drive the increasingly costly investments in new technologies required to exploit the remaining hard-to-get petroleum resources.  As suggested in the 2013 edition of the IEA’s World Energy Outlook, however, this belief should be treated with considerable skepticism.


Among the principal challenges to the theory are these:


1. Increasing Technology Costs: While the costs of developing a resource normally decline over time as industry gains experience with the technologies involved, Hubbert’s law of depletion doesn’t go away.  In other words, oil firms invariably develop the easiest “tough oil” resources first, leaving the toughest (and most costly) for later.  For example, the exploitation of Canada’s tar sands began with the strip-mining of deposits close to the surface.  Because those are becoming exhausted, however, energy firms are now going after deep-underground reserves using far costlier technologies.  Likewise, many of the most abundant shale oil deposits in North Dakota have now been depleted, requiring an increasing pace of drilling to maintain production levels.  As a result, the IEA reports, the cost of developing new petroleum resources will continually increase: up to $80 per barrel for oil obtained using advanced EOR techniques, $90 per barrel for tar sands and extra-heavy oil, $100 or more for kerogen and Arctic oil, and $110 for CTL and GTL.  The market may not be able to sustain prices this high, however, putting such investments in doubt.
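The IEA’s cost figures can be read as rough breakeven prices: a stream gets developed only if the market price covers its development cost. A sketch of that comparison, using only the per-barrel figures quoted above (the function name and structure are mine, for illustration):

```python
# IEA-cited development costs per barrel, as quoted in the text above.
BREAKEVEN_USD = {
    "advanced EOR": 80,
    "tar sands / extra-heavy": 90,
    "kerogen & Arctic": 100,
    "CTL & GTL": 110,
}

def viable_sources(market_price: float) -> list[str]:
    """Return the unconventional streams whose quoted development
    cost is covered at a given market price per barrel, cheapest first."""
    return [name
            for name, cost in sorted(BREAKEVEN_USD.items(), key=lambda kv: kv[1])
            if cost <= market_price]

# At $95 a barrel, only EOR oil and tar sands clear their breakeven:
print(viable_sources(95))
```

The point of the exercise is the one the IEA makes: the costliest streams (CTL, GTL, Arctic oil) only enter the supply picture if prices stay high enough to justify them.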


2. Growing Political and Environmental Risk: By definition, tough oil reserves are located in problematic areas.  For example, an estimated 13% of the world’s undiscovered oil lies in the Arctic, along with 30% of its untapped natural gas.  The environmental risks associated with their exploitation under the worst of weather conditions imaginable will quickly become more evident — and so, faced with the rising potential for catastrophic spills in a melting Arctic, expect a commensurate increase in political opposition to such drilling.  In fact, such drilling has already sparked protests in both Alaska and Russia, including the much-publicized September 2013 attempt by activists from Greenpeace to scale a Russian offshore oil platform — an action that led to their seizure and arrest by Russian commandos.  Similarly, expanded fracking operations have provoked a steady increase in anti-fracking activism.  In response to such protests and other factors, oil firms are being forced to adopt increasingly stringent environmental protections, pumping up the cost of production further.


3. Climate-Related Demand Reduction: The techno-optimist outlook assumes that oil demand will keep rising, prompting investors to provide the added funds needed to develop the technologies required.  However, as the effects of rampant climate change accelerate, more and more polities are likely to try to impose curbs of one sort or another on oil consumption, suppressing demand — and so discouraging investment.  This is already happening in the United States, where mandated increases in vehicle fuel-efficiency standards are expected to significantly reduce oil consumption.  Future “demand destruction” of this sort is bound to impose a downward pressure on oil prices, diminishing the inclination of investors to finance costly new development projects.


Combine these three factors, and it is possible to conceive of a “technology peak” not unlike the peak in oil output originally envisioned by M. King Hubbert.  Such a techno-peak is likely to occur when the “easy” sources of “tough” oil have been depleted, opponents of fracking and other objectionable forms of production have imposed strict (and costly) environmental regulations on drilling operations, and global demand has dropped below a level sufficient to justify investment in costly extractive operations.  At that point, global oil production will decline even if supplies are “boundless” and technology is still capable of unlocking more oil every year.


Peak Oil Reconsidered


Peak oil theory, as originally conceived by Hubbert and his followers, was largely governed by natural forces.  As we have seen, however, these can be overpowered by the application of increasingly sophisticated technology.  Reservoirs of energy once considered inaccessible can be brought into production, and others once deemed exhausted can be returned to production; rather than being finite, the world’s petroleum base now appears virtually inexhaustible.


Does this mean that global oil output will continue rising, year after year, without ever reaching a peak?  That appears unlikely.  What seems far more probable is that we will see a slow tapering of output over the next decade or two as costs of production rise and climate change — along with opposition to the path chosen by the energy giants — gains momentum.  Eventually, the forces tending to reduce supply will overpower those favoring higher output, and a peak in production will indeed result, even if not due to natural forces alone.


Such an outcome is, in fact, envisioned in one of three possible energy scenarios the IEA’s mainstream experts lay out in the latest edition of World Energy Outlook. The first assumes no change in government policies over the next 25 years and sees world oil supply rising from 87 to 110 million barrels per day by 2035; the second assumes some effort to curb carbon emissions and so projects output reaching “only” 101 million barrels per day by the end of the survey period.


It’s the third trajectory, the “450 Scenario,” that should raise eyebrows.  It assumes that momentum develops for a global drive to keep greenhouse gas emissions below 450 parts per million — the maximum level at which it might be possible to keep global average temperatures from rising more than 2 degrees Celsius and so triggering catastrophic climate effects.  As a result, it foresees a peak in global oil output occurring around 2020 at about 91 million barrels per day, with a decline to 78 million barrels by 2035.


It would be premature to suggest that the “450 Scenario” will be the immediate roadmap for humanity, since it’s clear enough that, for the moment, we are on a highway to hell that combines the IEA’s first two scenarios.  Bear in mind, moreover, that many scientists believe a global temperature increase of even 2 degrees Celsius would be enough to produce catastrophic climate effects.  But as the effects of climate change become more pronounced in our lives, count on one thing: the clamor for government action will grow more intense, and so eventually we’re likely to see some variation of the 450 Scenario take shape.  In the process, the world’s demand for oil will be sharply constricted, eliminating the incentive to invest in costly new production schemes.


The bottom line: global peak oil remains in our future, even if not purely for the reasons given by Hubbert and his followers.  With the gradual disappearance of “easy” oil, the major private firms are being forced to exploit increasingly tough, hard-to-reach reserves, thereby driving up the cost of production and potentially discouraging new investment at a time when climate change and environmental activism are on the rise. 


Peak oil is dead!  Long live peak oil!


             Michael T. Klare, a TomDispatch regular


The Flood Next Time

January 13, 2014

by Justin Gillis

New York Times


The little white shack at the water’s edge in Lower Manhattan is unobtrusive — so much so that the tourists strolling the promenade at Battery Park the other day did not give it a second glance.


Up close, though, the roof of the shed behind a Coast Guard building bristled with antennas and other gear. Though not much bigger than a closet, this facility is helping scientists confront one of the great environmental mysteries of the age.


The equipment inside is linked to probes in the water that keep track of the ebb and flow of the tides in New York Harbor, its readings beamed up to a satellite every six minutes.


While the gear today is of the latest type, some kind of tide gauge has been operating at the Battery since the 1850s, maintained by a government office originally founded by Thomas Jefferson. That long data record has become invaluable to scientists grappling with this question: How much has the ocean already risen, and how much more will it go up?


Scientists have spent decades examining all the factors that can influence the rise of the seas, and their research is finally leading to answers. And the more the scientists learn, the more they perceive an enormous problem.


Much of the population and economy of the country is concentrated on the East Coast, which the accumulating scientific evidence suggests will be a global hot spot for a rising sea level over the coming century.


The detective work has required scientists to grapple with the influence of ancient ice sheets, the meaning of islands that are sinking in the Chesapeake Bay, and even the effect of a giant meteor that slammed into the earth.


The work starts with the tides. Because of their importance to navigation, they have been measured for the better part of two centuries. While the record is not perfect, scientists say it leaves no doubt that the world’s oceans are rising. The best calculation suggests that from 1880 to 2009, the global average sea level rose a little over eight


That may not sound like much, but scientists say even the smallest increase causes the seawater to eat away more aggressively at the shoreline in calm weather, and leads to higher tidal surges during storms. The sea-level rise of decades past thus explains why coastal towns nearly everywhere are having to spend billions of dollars fighting erosion.


The evidence suggests that the sea-level rise has probably accelerated, to about a foot a century, and scientists think it will accelerate still more with the continued emission of large amounts of greenhouse gases into the air. The gases heat the planet and cause land ice to melt into the sea.


The official stance of the world’s climate scientists is that the global sea level could rise as much as three feet by the end of this century, if emissions continue at a rapid pace. But some scientific evidence supports even higher numbers, five feet and beyond in the worst case.


Scientists say the East Coast will be hit harder for many reasons, but among the most important is that even as the seawater rises, the land in this part of the world is sinking. And that goes back to the last ice age, which peaked some 20,000 years ago.


As a massive ice sheet, more than a mile thick, grew over what are now Canada and the northern reaches of the United States, the weight of it depressed the crust of the earth. Areas away from the ice sheet bulged upward in response, as though somebody had stepped on one edge of a balloon, causing the other side to pop up. Now that the ice sheet has melted, the ground that was directly beneath it is rising, and the peripheral bulge is falling.


Some degree of sinking is going on all the way from southern Maine to northern Florida, and it manifests itself as an apparent rising of the sea.


The sinking is fastest in the Chesapeake Bay region. Whole island communities that contained hundreds of residents in the 19th century have already disappeared. Holland Island, where the population peaked at nearly 400 people around 1910, had stores, a school, a baseball team and scores of homes. But as the water rose and the island eroded, the community had to be abandoned.


Eventually just a single, sturdy Victorian house, built in 1888, stood on a remaining spit of land, seeming at high tide to rise from the waters of the bay itself. A few years ago, a Washington Post reporter, David A. Fahrenthold, chronicled its collapse.


Perhaps the weirdest factor of all pertains to Norfolk, Va., and points nearby. What is now the Tidewater region of Virginia was slammed by a meteor about 35 million years ago — a collision so violent it may have killed nearly everything on the East Coast and sent tsunami waves crashing against the Blue Ridge Mountains. The meteor impact disturbed and weakened the sediments across a 50-mile zone. Norfolk is at the edge of that zone, and some scientists think the ancient cataclysm may be one reason it is sinking especially fast, though others doubt it is much of a factor.


Coastal flooding has already become such a severe problem that Norfolk is spending millions to raise streets and improve drainage. Truly protecting the city could cost as much as $1 billion, money that Norfolk officials say they do not have. Norfolk’s mayor, Paul Fraim, made headlines a couple of years ago by acknowledging that some areas might eventually have to be abandoned.


Up and down the Eastern Seaboard, municipal planners want to know: How bad are things going to get, and how fast?


One of the most ambitious attempts to take account of all known factors came just a few weeks ago from Kenneth G. Miller and Robert E. Kopp of Rutgers University, and a handful of their colleagues. Their calculations, centered on New Jersey, suggest this is not just some problem of the distant future.


People considering whether to buy or rebuild at the storm-damaged Jersey Shore, for instance, could be looking at nearly a foot of sea-level rise by the time they would pay off a 30-year mortgage, according to the Rutgers projections. That would make coastal flooding and further property damage considerably more likely than in the past.


And if the global sea level rises eight more inches by 2050, a moderate forecast, the Rutgers group foresees relative increases of 14 inches at bedrock locations like the Battery, and 15 inches along the New Jersey coastal plain, where the sediments are compressing. By 2100, they calculate, a global ocean rise of 28 inches would produce increases of 36 inches at the Battery and 39 inches on the coastal plain.
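The Rutgers figures decompose neatly: the relative rise at any site is the global ocean rise plus a local land-motion term, with subsidence adding to the apparent rise. A sketch of that bookkeeping, using only the numbers quoted above (the local offsets are back-computed from the article’s figures, not independently sourced):

```python
def relative_rise(global_rise_in: float, local_subsidence_in: float) -> float:
    """Apparent local sea-level rise (inches) = global ocean rise
    plus local land subsidence over the same period."""
    return global_rise_in + local_subsidence_in

# 2100 projection quoted above: 28 in. of global ocean rise.
battery = relative_rise(28, 8)         # bedrock site: land sinks ~8 in. -> 36 in.
coastal_plain = relative_rise(28, 11)  # compressing sediments: ~11 in. -> 39 in.
print(battery, coastal_plain)
```

The same arithmetic explains why the East Coast is a “hot spot”: even under an identical global forecast, sinking land turns a 28-inch ocean rise into a three-foot local problem.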


These numbers are profoundly threatening, and among the American public, the impulse toward denial is still strong. But in towns like Norfolk — where neighborhoods are already flooding repeatedly even in the absence of storms, and where some homes have become unsaleable — people are starting to pay attention.


“In the last couple or three years, there’s really been a change,” said William A. Stiles Jr., head of Wetlands Watch, a Norfolk environmental group. “What you get now is people saying, ‘I’m tired of driving through salt water on my way to work, and I need some solutions.’ ”



GPS and smartphone car technology raises questions about drivers’ privacy

• US transportation secretary speaks at Detroit auto show

• Statement about GPS tracking raised alarms at tech convention


by Dominic Rushe in Detroit

January 13, 2014



The increasing connectivity of cars, with GPS systems and other computers becoming more common, raises questions about drivers’ privacy, the US transportation secretary warned on Monday.


Speaking at the North American International Auto Show in Detroit, Anthony Foxx said a balance had to be struck between the convenience and safety of drivers and their expectations of privacy.


A senior executive at the Ford motor company was forced to hastily withdraw claims he made last week about the ability of the carmaker to track its drivers using new technology. At the Consumer Electronics Show in Las Vegas last week Ford’s Jim Farley, global vice-president of marketing and sales, initially claimed the GPS units installed in the company’s cars meant the company knew when drivers were speeding and where they were at the time.


“We know everyone who breaks the law, we know when you’re doing it. We have GPS in your car, so we know what you’re doing. By the way, we don’t supply that data to anyone,” he told attendees.


After his comments caused a furore Farley said he had “left the wrong impression”. He said in a follow-up statement: “We do not track our customers in their cars without their approval or their consent. The statement I made in my eyes was hypothetical and I want to clear this up.”


On Monday, Foxx told reporters in Detroit that new technologies being developed for cars raised potentially thorny issues. He said: “The technology that’s emerging raises questions, and we’re going to be responsive to those questions. But each technology is different, and each application of it is different, and we want to make sure that we’re striking the right balance between helping folks be safe but also making sure that their expectations of privacy are also weighed carefully.”


Sergio Marchionne, the chief executive officer of Chrysler and Fiat, said there were concerns about the data being collected at his company but that it was carefully protected.


“The more information I get about a particular vehicle and the way in which it performs the better quality car I can manufacture,” he said. “There is a large caveat to this which is the potential misuse of this information. We have been very, very wary,” he said. Marchionne said any data was collected in such a way that no individual’s information could be compromised “in any shape or form”.


Foxx’s comments came after the US Government Accountability Office (GAO) found inconsistencies in the way automakers handle data from car owners, raising fears of privacy breaches. The study looked at information collected by Chrysler, Ford, General Motors, Honda, Nissan and Toyota as well as navigation device-makers Garmin and TomTom and map and navigation app developers Google and Telenav.


“Without clear disclosures about the purposes, consumers may not be able to effectively judge whether the uses of their location data might violate their privacy,” the report noted. It also criticised the fact that drivers were often not able to delete their data and expressed concern that the data could be used in ways “consumers did not intend or may be vulnerable to unauthorized access”.


After the GAO report was released Minnesota’s senator Al Franken called on Congress to pass a location-privacy bill. “Companies providing in-car location services are taking their customers’ privacy seriously – but this report shows that Minnesotans and people across the country need much more information about how the data are being collected, what they’re being used for, and how they’re being shared with third parties,” Franken said in a statement.



Tech firms: We don’t want to keep Internet metadata for the government


January 13, 2014

by Ellen Nakashima

The Washington Post


Tech and telephone companies made clear to administration officials in a meeting Friday that they do not want to see bulk collection activities expanded to Internet data.


And they most certainly do not want to see Congress pass mandates to require them to hold that data so the government can collect it.


“Technology is increasingly moving away from phone calls to texting and chat and social media, so it’s hard to believe that governments around the world would not be interested in that data,” said one participant in the White House meeting, who like others interviewed spoke on condition of anonymity to describe the conversation. Administration participants included White House chief privacy officer Nicole Wong, who is a former Google deputy general counsel and Twitter legal director, and cybersecurity coordinator Michael Daniel.


The administration has acknowledged that it had run a program of Internet metadata collection — gathering times and dates and IP addresses of e-mails, for instance — but dropped it in 2011 when it did not prove operationally feasible.


“So the companies made very clear that they don’t want the government restarting that program,” said the participant. “But if you do, we don’t want mandatory retention for Internet metadata.”


The tech firms at the meeting, which included Google, Yahoo, Facebook, and Apple, among others, have sent a letter to President Obama and Congress calling for surveillance reform based on principles such as banning bulk collection of Internet communications.


Clearly, the companies are anxious about the impact that revelations by former National Security Agency contractor Edward Snowden have had on their business — in particular overseas. “There was a lot of discussion about the importance of giving the global citizenry comfort, sending messages that the U.S. government is making changes that protect the information of people from all over the world so that there’s more confidence in the companies that have a global customer base,” said the participant.


Also at the meeting were officials from AT&T, Verizon and Comcast. Their discussion focused on the NSA’s bulk collection of domestic phone toll logs, which has been a topic of controversy since the program was revealed last June through a leak from  Snowden. Obama is due to give a speech Friday in which he will announce his decision on the NSA program’s future.


Advocates of ending the program often argue that collection can be done without Congress passing legislation to force the phone companies to hold the data for longer periods. They often point to a Federal Communications Commission rule that says carriers must keep the billing records — numbers dialed, length and time of call — for 18 months.


The phone company officials explained to administration officials on Friday that carriers are increasingly turning to flat-rate plans, which means they do not keep this data for billing purposes. If a customer pays $100 a month for a set number of minutes, it does not matter whom she calls and where as long as she does not exceed the limit. Some plans offer unlimited calling.


And often companies may not keep toll records of the incoming calls because the data is not needed for billing purposes, beyond the name of the carrier that called their customer, which is needed for inter-carrier billing.


So the take-away, industry officials say, is that relying on the FCC rule won’t be enough if the NSA wants the carriers to keep comprehensive data sets. But even more important, they say, they do not want Congress imposing mandates to keep the data longer than they already do.


“We end up with all sorts of litigation risks, privacy risks, hacking vulnerabilities,” one executive told The Washington Post. “There is a huge cost involved in just protecting them. And, truthfully, we just don’t want to do it.”


Lawmakers who advocate the end of bulk collection also do not want data retention mandates. They argue that so far no intelligence official has offered instances where data held for periods longer than the companies would normally keep them has yielded “unique value” in thwarting a terrorist plot.


“We have yet to see any evidence that the bulk phone records collection program has provided any otherwise unobtainable evidence,” Sens. Ron Wyden (D-Ore.), Mark Udall (D-Colo.), and Martin Heinrich (D-N.M.) said in a letter to Obama last week.





D: Office of the Director


D0: Director’s Staff


D01: Director’s Operation Group (DOG)

D05: Director’s Secretariat

D07: Office of Protocol

D08: Homeland Security Support Office (HSSO)


D1: Office of the Inspector General (OIG)

D2: Office of the General Counsel (OGC)

D5: Corporate Assessments Office


D5T: Technology Test and Evaluation


D6: Office of Equal Employment Opportunity


D7: Central Security Service (CSS)


D709: CSS Staff and Resources

D7D: Cryptologic Doctrine Office

D7P: Office of Military Personnel

D7R: Director’s Reserve Forces Advisor


D8: Community ELINT Management Office (CEMO)


DA: Directorate of Acquisition/Senior Acquisition Executive (SAE)


DB: Corporate Strategy


DC: Director’s Chief of Staff


DC0: Support

DC3: Policy


DC31: Corporate Policy

DC32: Information Policy


DC321: Freedom of Information Act and Privacy Act (FOIA/PA)

DC322: Information Security and Records Management


DC3221: Information Security Policy

DC3223: Records Management Policy


DC323: Automated Declassification Services


DC33: Technology Security, Export, and Encryption Policy


DC4: Corporate Strategic Planning and Performance

DC6: External Relations & Communications

DC8: Corporate Management Services


DE: Unified Cryptologic Architecture Office (UCAO)

DF: Chief Financial Manager (CFM)

DK: Chief Information Officer (CIO)

DL: Legislative Affairs Office (LAO)

DP: Foreign Affairs Directorate (FAD)

DT: Office of the Chief Technical Officer (CTO)


E: Associate Directorate for Education and Training (ADET)


E1: Educational Services and Staff

E2: Educational Technology Integration

E3: Language

E4: Intelligence Analysis and Information Assurance

E5: Signals Analysis, Cryptanalysis, and Math

EL: Center for Leadership and Professional Development


F: Field sites


F1: Cryptologic Services Groups


F1C: ?


F1CA: Cryptologic Services Group USSTRATCOM

F1CD: Life Cycle Logistics


F1CD1: Technical Services Group


F1I: ?


F1I2: Joint Interagency Task Force South


F1T: ?


F1T1: Cryptologic Services Group USSOCOM


F1Z: Cryptologic Services Group CENTCOM


F1Z2: Deputy Chief, CSG CENTCOM


F2: NSA/CSS Europe and Africa


F20: ?


F204: Support to Military Operations for AFRICOM


F22: European Cryptologic Center (ECC)

F23: NCER Mons, Belgium


F3: ?


F4: ?


F411: Military Operations Branch


F5: ?



F6: Special Collection Service (SCS)


F666E: (SCS unit in the US embassy in Berlin?)



F7: ?

F74: Meade Operations Center (MOC)


F741: Deployments & Training Division

F74?: Special Operations Readiness Cell (SORC)




F77: ?


F77F: Menwith Hill


F79: ?


F79F: Misawa Security Operations Center (MSOC)


F91: ?


FC: NSA/CSS Colorado (NSAC)



FG: NSA/CSS Georgia (NSAG)


FGD: Director, Georgia

FGS: SID, Georgia


FGS3: Transnational Issues Group

FG32: ?


FG3223: Media Exploitation & Analysis




FHS: Signals Intelligence Department, Hawaii



FTS: Signals Intelligence Department, Texas


FTS2: Analysis and Production


FTS2F1: “Southern Arc”


FTS3: Data Acquisition


FTS32: Tailored Access Operations


FTS327: Requirements & targeting




I: Information Assurance Directorate (IAD)


I0: Chief of Staff


I01: Office of Policy


I2: Trusted Engineering Solutions


I209: Support Staff


I21: Architecture

I22: Engineering

I23: ?


I231: HAIPE Program Management Office (PMO)


I2N: National Nuclear Command Capabilities (N2C2) Mission


I3: Operations


I31: Current Operations

I33: Remote & Deployed Operations

I3?: Mission Integration Office

I3?: Technical Security Evaluations

I3?: Red Cell

I3?: Blue Cell

I3?: Advanced Adversary Network Penetration Cell

I3?: Joint Communications Security Monitoring

I4: Fusion, Analysis and Mitigation

IE: Engagement

IS: Strategy

IC: Cyber Integration

IV: Oversight and Compliance


K: National Security Operations Center (NSOC)


K?: (…) Mission Management (APSMM)

K?: (…) Mission Management (CRSMM)

K?: Counter-Terrorism Mission Management Center? (CTMMC)

K9: Capabilities and Sustaining Systems (CASS)

K92: Current Capabilities for Mission Management (C2M2)


L: Associate Directorate for Installations and Logistics (ADIL)


L0: I&L Staff

LF: Facilities Services


LF1: Space Management and Facilities Planning

LF2: ?

LF3: Operations, Maintenance and Utilities

LF4: ?

LF5: Program Management

LL: Logistics Services

LL1: Material Management

LL2: Transportation, Asset, and Disposition Services

LL3: Employee Morale Services


M: Associate Directorate for Human Resource Services (ADHRS)


MA: Office of Workforce Strategies

MB: Office of Recruitment and Staffing

MC: Office of Diversity Management (ODM)

MD: Office of Human Resource Program Management & Service

ME: Office of Occupational Health, Environmental & Safety Services (OHESS)

MG: Office of Global Human Resource Services

M2: Office of Military Personnel

M3: Office of Civilian Personnel

M4: ?

M43: Information Policy Division

MJ: ?

MJ1: HR operations/global personnel SA




Q: Associate Directorate for Security and Counterintelligence


Q0: Staff

Q05: Security Operations Center (SOC)

Q07: NSA Counterintelligence Center (NSACC)

Q09: Security Support Staff

Q1: Office of Physical Security

Q123: ?

Q2: Office of Personnel Security

Q223: Counterintelligence Awareness

Q5: Office of Security

Q509: Security Policy Staff

Q51: Physical Security Division

Q52: Field Security Division


Q56: Security Awareness

Q57: Polygraph

Q7: Counterintelligence

QJ: Joint Program Security Office


R: Research Associate Directorate (RD)


R1: Math & Research

R2: Trusted Systems Research

R3: Laboratory for Physical Sciences (LPS)

R4: Laboratory for Telecom Services (LTS)

R5: Language Study

R6: Computer Information and Science

RX: Special Access Research

RV: Oversight and Compliance




S: Signals Intelligence Directorate (SID)


S0: SID Staff

S01: Deputy for Integrated Planning

S012: ?

S0121: SID Communications

S02: Communications and Support Operations

S02L: ?

S02L1: SIGINT Policy


S11: Customer Gateway

S111: (Desk for coordinating RFIs and responses)

S12: Information Sharing Services Branch

S12?: Partnership Dissemination Cell (PDC)

S124: Staff Services Division

S17: Strategic Intelligence Issues

S1E: Electromagnetic Space Program Management Office

S1P: Plans & Exercises Division



S2: Analysis and Production (A&P)


S20: A&P Staff

S203A: Access Interface Portfolio

S2A: South Asia Product Line

…S2A4: Pakistan

S2A5: (South-Asia)

S2A51: S-A Language Analysis Branch

S2A52: S-A Reporting Branch

S2B: China, Korea, Taiwan Product Line

S2C: International Security Issues (ISI) Product Line

S2C32: European States Branch

…S2C41: Mexico Leadership Team

S2C42: Brazilian Leadership Team

S2D: Counter Foreign Intelligence Product Line

S2E: Middle East and Africa Product Line

S2F: International Crime and Narcotics Product Line

S2F1: (“Southern Arc”?)

S2G: Counter Proliferation (CP) Product Line

S2H: Russia Product Line

S2I: Counter-Terrorism (CT) Product Line

S2I3: ?

S2I35: ? (related to RC-135U?)

S2I4: Homeland Mission Center (HMC)

S2I42: Hezbollah Team

S2I43: NOM Team

S2I5: Advanced Analysis Division (AAD)

S2I?: Metadata Analysis Center (MAC)

S2IX: Special CT Operations

S2J: Weapons and Space Product Line

S2T: Current Threats




S3: Data Acquisition


S31: Cryptanalysis and Exploitation Services (CES)

S31091: Military Operations Branch

S31174: Office of Target Pursuit

S3132: Protocol Exploitation and Dissemination

S3161: Special Deployments Division

S31??: Technical Exploitation Center (TEC)

S32: Tailored Access Operations (TAO)

S321: Remote Operations Center (ROC)

S321?: Network Ops Center (NOC)

S321?: Oper. Readiness Division (ORD)

S321?: Interactive Ops Division (IOD)

S321?: Production Ops Division (POD)

S321?: Access Ops Division (AOD)

S322: Advanced Network Technology (ANT)

S3221: (persistence software)

S3222: (software implants)


S32221: ?

S32222: (routers, servers, etc.)

S3223: (hardware implants)

S3224: ?

S32241: ?

S32242: (GSM cell)

S32243: (radar retro-refl.)

S323: Data Network Technologies (DNT)

S324: Telecommunications Network Topologies (TNT)

S325: Mission Infrastructure Technologies (MIT)

S326: ?

S327: Requirements & Targeting (R&T)

S328: Access Technologies Operations (ATO)

S32?: Network Warfare Team (NWT)

S33: Global Access Operations (GAO)


S332: Terrestrial SIGINT

S33221: ?

S33223: Processing Systems Engineering and Integration Sector

S333: Overhead SIGINT

S333?: Overhead Collection Management Center (OCMC)

S33P: Portfolio Management Office (PMO)

S33P1: ?

S33P2: Technology Integration Division

S33P3: Tactical SIGINT Technology Office

S33?: CROSSHAIR Network Management Center (CNMC)

S34: Target Strategies and Mission Integration (TSMI)

S342: Collection Coordination and Strategies

S3421: ?

S3422: Geographical Regions

S3423: Technical Services

S343: Targeting and Mission Management

S344: Partnership and Enterprise Management

S35: Special Source Operations (SSO)

S351: ?

S352: ?

S3520: Office of Target Reconnaissance and Survey (OTRS)

S3521: Special Signal Collection unit (MUSKETEER)

S353: ?


S3533: ?

S35333: PRISM Collection Management

S35P: Portfolio Management Office

S35P2: Technical Integration Division

S35P3: Capabilities Integration Division

SSG: SIGDEV Strategy and Governance


Network Analysis Center



SE: SIGINT & Electronic Warfare

SV: Oversight and Compliance

SV4: Special FISA Oversight and Processing




T: Technology Directorate (TD)


TE: Enterprise Systems

TS: Information and Systems Security

TT: Independent Test and Evaluation

T1: Mission Capabilities

T132: SCISSORS team

T1?: Strategic SATCOM Security Engineering Office

T2: Business Capabilities

T3: Enterprise IT Services

T32: ?

T3212: Workflow, Standards and Support

…T3221: Transport Field Services (TFS)

T332: Global Enterprise Command Center (GECC)

T332?: Data and Network Operations

T332?: NSA Communications Center

T332?: NISIRT (contains CERT and CSIRT)

T334: National Signals Processing Center (NSPC)

T335: Deployable Communications Operations (DCO)

T33?: National Intelligence and Tactical Operations (NITO)

T5: High Performance Computing Center

T6: Ground Systems Program Office


V: NSA/CSS Threat Operations Center (NTOC)


V2: Office of Analysis

V3: NTOC Operations

V32: Defensive Network Operations

V33: ?

V34: Next Generation Wireless Exploitation Program


X: ?

X3: ?

X31: ?

X312: Planning & Management

X32: ?

? NSA/CSS Commercial Solutions Center (NCSC)





            In 2000, then-Director Michael Hayden reorganized much of NSA’s organizational structure. New senior officers were appointed, including a Chief Financial Manager, a Chief Information Officer (CIO), a Senior Acquisition Executive (SAE), and a Transformation Officer. Around the same time, many NSA divisions and units received new designations.


            Also in 2000, a Senior Leadership Team was formed, consisting of the Director (DIRNSA), the Deputy Director, and the Directors of the Signals Intelligence Directorate (SID), the Information Assurance Directorate (IAD), and the Technology Directorate (TD). The chiefs of NSA’s other main divisions joined the Senior Leadership Team as Associate Directors.
