The allegations that came with the Edward Snowden revelations of Microsoft’s cooperation with U.S. spy agencies are evidently still a problem for Redmond, if a blog item posted yesterday by security VP Matt Thomlinson is any indication. It seems the company has opened a second Transparency Center, this one in Brussels. The news comes eleven months after the announcement of the first such center on the company’s Redmond campus.
At the height of the media frenzy that developed around Snowden’s initial revelations, there were allegations that Microsoft had not only built back doors in its software for the NSA and other government agencies to use against foreign businesses and governments, but that it was cooperating with U.S. authorities in other ways as well. For example, one report indicated that the company was passing along details of unpatched security vulnerabilities in Windows to the NSA, effectively adding temporary tools to the spy agency’s cyber arsenal.
The Transparency Center concept was meant to allay fears that might cause foreign governments to consider options other than Microsoft (read: Linux and FOSS), by granting them unprecedented access to source code:
The Transparency Center initiative is a cornerstone of our long-standing Government Security Program (GSP), which offers participating governmental agencies the opportunity to review the source code of Microsoft products, access information on cybersecurity threats and vulnerabilities, and benefit from the expertise and insight of Microsoft security professionals. This extends to important security documentation about our Azure and Office365 cloud services.
“Benefit from the expertise and insight of Microsoft security professionals”? Isn’t that a bit like getting a tour of hen house security conducted by foxes?
As you might expect, these centers appear to be more opaque than transparent. The company is allowing access to 10 key products, but is doing so in an environment completely controlled by Microsoft. Access is available only within the walls of the security center. Diagnostic tools are available, but they’re tools supplied by Microsoft. Inspection of source code, and use of diagnostic tools, will almost certainly be entirely on Microsoft’s computers for reasons that should be obvious.
Is this any way to inspect source code? If you already have doubts about the company supplying an application (which you must, if you find the Transparency Center necessary), would you trust it to be honest with you in an environment it completely controls?
The place to inspect the source code of an application for intentional security vulnerabilities is at your own lab, or at the lab of a trusted independent security partner who is not part of the company or organization that’s developing and marketing the application. The code should be inspected on machines that are under your control, and it should be compiled after inspection, with the resulting binaries compared with the binary being offered by the organization marketing the application — as Flip Wilson’s Geraldine used to say, you want to make sure that “what you see is what you get.”
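That last comparison step can be sketched in a few lines of shell. This is a minimal, illustrative example (the file paths are hypothetical, and it assumes the vendor’s build is reproducible, which in practice takes real work — identical compiler, flags, and timestamps); here the two binaries are simulated with placeholder files so the logic can be shown end to end:

```shell
# Simulate a binary you compiled yourself from inspected source,
# and the binary the vendor shipped (placeholder contents for illustration).
printf 'binary-bits' > local_build    # hypothetical path: your own build
printf 'binary-bits' > vendor_copy    # hypothetical path: vendor's binary

# Hash both and keep only the digest field.
local_hash=$(sha256sum local_build | cut -d' ' -f1)
vendor_hash=$(sha256sum vendor_copy | cut -d' ' -f1)

# Identical digests mean "what you see is what you get";
# a mismatch means the shipped binary was not built from the code you read.
if [ "$local_hash" = "$vendor_hash" ]; then
    echo "binaries match"
else
    echo "MISMATCH: vendor binary differs from inspected source"
fi
```

The point of the sketch is that the check is only meaningful when the hashing and the build both happen on machines you control — which is precisely what a vendor-controlled viewing room rules out.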
This is, of course, how it’s done in the FOSS world, where we take free and open access to all FOSS applications’ source code for granted. Certainly, major tech companies that rely heavily on FOSS thoroughly vet all FOSS software before use in a production environment — meaning that the IBMs of this world can have much more confidence in Linux, Red Hat’s stack or anything from SUSE than they can ever have in Windows, MS Office or Oracle’s software.
Let the buyer beware. Trusting Microsoft in this case is very much akin to trusting…well, read my comment above about the fox-led security tour.
The Microsoft Transparency Centers are made from tinted glass.
I’ll leave you with my take on the security measures being taken around the new Microsoft Transparency Center in Brussels. Once safely inside the center, visitors will, no doubt, be treated to the infamous Cone of Silence:
We need you to help us make FOSS Force even better. If you enjoyed this article, please visit our IndieGoGo page and make a small contribution to our fundraising campaign. Every little bit helps.
Christine Hall has been a journalist since 1971. In 2001, she began writing a weekly consumer computer column and started covering Linux and FOSS in 2002 after making the switch to GNU/Linux. Follow her on Twitter: @BrideOfLinux
If the USA sells fighter jets to foreign countries, such as Saudi Arabia, should we not also have a secret kill switch or back door into the plane? Or would you want to divulge its existence, a la Snowden? Snowden is lucky that he was born in the ’80s, not the ’30s; where would he go then? To Stalin’s gulag?
@Richard, that argument is common and has merit. The other side of it is that it requires that the customer and anybody else who can get access to the item not be talented enough to break in and either use the feature or defeat it.
Some agencies want a key to all encrypted communications. This would include communications between businesses or branches of a business. They’ll argue that they can use it to protect us against terrorists, pedophiles, little green men, or whatever. Again, this assumes that people who are not supposed to have access can be kept out and that people who are supposed to have access can and will do only what they’re supposed to with the information.
There have already been accusations that US government agencies have provided information belonging to foreign companies to US businesses, allowing the US businesses to know what the competition thinks it is doing in secret. Allowing the agencies that want keys to have them assumes they are trustworthy in such matters. We know we can’t require it, so assuming it is all we have.
The assumption of trustworthiness has already been destroyed in great measure and will be difficult to rebuild, if it can be done. (How many times can I say “assume” before somebody reminds us how to break the word into three pieces?) One fear is that an agency, or maybe even a lone wolf, likes Ford better than Chrysler (or GM or Toyota or BMW or KIA), acquires information from that company, and shares it with Ford.
Setting aside unauthorized access by someone known to have a key, the mere fact that the software is known to have a key means that somehow, with enough effort and maybe enough supercomputer time, it might be possible to get in.
Best I can tell, the ONLY way to prevent undesired access is to avoid having ANY secret keys to the encryption mechanism. If there is a known back door into a piece of software, why would anyone with something to keep secret–or who thinks he MIGHT need to keep something secret in two or three years–want to buy software that can be accessed that way? If Microsoft Word has a back door, guess I’ll go buy something made in a country that doesn’t have that kind of requirement. Probably something open source, so somebody who knows more than I do can look and see what I’m running.
And then there is the easy way to get access, with or without a secret key:
@Uncle Ed… Great response. Or instead of a wrench, there’s always waterboarding.
I agree, Christine. It always amazes me as to just how gullible MS customers are. I guess there’s something to be said for being predictable.
We need open hardware and software, not backdoors in products stealing our privacy and selling us out to whomever happens to be working the data collection desk that day.
The NSA, the government, and all the mega-corps have zero business with any of my data.
Anyone who suggests that you should be willing to give up some privacy for a little security from terrorists, criminals, or the boogeyman has an agenda or is terminally stupid. Those who wish can already use encryption to send secret messages anonymously…via the U.S. postal system. That’s been possible for over a hundred years and yet suddenly encryption is a threat to security? No, it’s a threat to their control. They complain about encryption making their jobs harder, to which I say: So what. Not being able to randomly search everyone’s home or business on a whim also makes their jobs harder, but should we allow that? I just know some moron out there is saying yes.
100% open hardware with 100% open software: It is the only rational solution.
Christine – the Get Smart link you provided is only the opening sequence of Get Smart – not the Cone of Silence.
Having been through the entire series, I know about The Cone and have laughed over it (and many other parts) over the seasons.
Possibly looking for this link?
@Ken Roberts. The title sequence is exactly what I intended. The cone would come along after getting through security:
“I’ll leave you with my take on the security measures being taken around the new Microsoft Transparency Center in Brussels. Once safely inside the center, visitors will, no doubt, be treated to the infamous Cone of Silence.”
And I’m with you, Ken. There’s nothing that Buck Henry, Don Adams and the gang came up with on this show that wasn’t funny…and very true, in a twisted sort of way. 🙂
It amazes me that, with all that people find out and all that eventually gets revealed about the things MS has done or is doing to obtain information on their users, they are even still in existence! If we were talking about an automobile company that had betrayed the public trust by building faulty cars, I’m almost certain that company would be shut down; had this been a discussion about a food product that betrayed the public’s trust, the maker would have to pay billions in restitution to those who purchased it. But it would seem that in the Information Technology world all you have to do is give your software a new coat of paint, call it something else, add a few tweaks and a new interface, and you’ll do just fine! Given the amount of money they would save, I don’t understand why companies don’t stop using Windows…oh well, to each their own, I guess.