[mdlug] File transfer problem

Mark Montague markmont at umich.edu
Mon Jan 10 01:14:20 EST 2011


  On January 9, 2011 22:14, "Michael ORourke" <mrorourke at earthlink.net> 
wrote:
> There were several discussions last Friday at the office about which
> approach to take.  Our final decision was that it would be easier to give
> the content management team direct FTP access to the web repository (like
> they had before).

I'm glad you have a decision.  I'm replying just because I find these 
sorts of discussions really fun; everyone, please feel free to stop 
reading and delete this message at any point, especially if I go too far 
afield or seem too preachy/irritating.


> The biggest risk I see in having direct FTP access to the web repository,
> from our intranet, is that a disgruntled employee or compromised system
> could cause a lot of damage very easily.

Well, for the sake of argument, let's assume I'm a disgruntled member of 
your web content team and that you did NOT make the decision you did but 
instead set up the following system that you were originally asking about:

Members of the web content team do not have access to the web repository 
in the DMZ.  Instead, they FTP content to a machine on the management 
network.  Based on a file trigger and/or a schedule, this machine will 
periodically rsync updated web content to the web repository server in 
the DMZ.  Let's further assume that you're paranoid and are running 
rsync without the --delete option, so that files can only be added or 
changed; members of the web content team would have to contact you to 
manually remove a file from the web repository server in the DMZ.
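
To make that concrete, here is a minimal sketch of what the staging-side 
sync might look like (the paths, host name, and trigger file name are 
assumptions for illustration, not a recommendation of specific values):

    #!/bin/sh
    # Hypothetical staging-side sync, run from cron every few minutes.
    STAGING=/srv/webstage/content
    TRIGGER=/srv/webstage/.push-requested
    DMZ_HOST=webrepo.dmz.example.com

    # Only sync when the content team has dropped a trigger file.
    [ -f "$TRIGGER" ] || exit 0

    # No --delete, so files can be added or changed but never removed.
    rsync -av "$STAGING/" "$DMZ_HOST:/var/www/content/" && rm -f "$TRIGGER"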

As a disgruntled member of the web content team, there is still a lot I 
can do just by uploading files to the machine on the management network 
and then waiting for (or triggering) the rsync to happen.  Off the top 
of my head:

- I could replace the contents of each file with a single obscene word, 
effectively taking the site down until it could be restored from a 
backup.  For example, "find /web/content/dir -type f -exec /bin/sh -c 
'echo naughty > {}' \;"  Even if I did not know how to do this in 
seconds from the Unix shell, I could still do it by obsessively clicking 
in a GUI such as Adobe Dreamweaver or Windows Explorer.

- I could add words to some or all of the files that would not be 
rendered in users' browsers (e.g., by using the "display:none" CSS 
property, by setting the text color to match the background, or in one 
of a dozen other ways) but which would show up for search engines, with 
the intent of destroying the company's search engine rankings and/or 
causing embarrassment through malicious associations.

- I could add some obfuscated JavaScript to one or more .js or .html 
files that spies on legitimate users of the site and encrypts/encodes any 
sensitive information (particularly form data: order information, 
customer feedback, credit card numbers...) and sends it to a Twitter 
account or other destination from which it could be retrieved 
anonymously and decrypted/decoded.  Or, if I decide this is too risky or 
just not worth it, I could use the JavaScript I add to create a denial 
of service attack on some other web site with the intent of getting your 
company into legal and PR trouble.

- If your site uses active content (JSP, Python, PHP, etc.), I could add 
a tiny snippet of code -- made to look as innocent or as obfuscated as 
I can manage -- to do any number of evil things, 
including running arbitrary shell commands that get passed in via a 
secret query string parameter that only I, the disgruntled employee, 
know; or that will open a reverse shell when a certain User Agent string 
is seen.  Either one of these would give me the ability to run arbitrary 
Unix shell commands on the DMZ server as the user that the web server 
runs as (including the ability to read, delete, or modify anything that 
the web server can read, delete or modify -- including contents of any 
database used by the web site, even if you change the database password).


OK, that's enough bad stuff.  You get the idea.  The point is that there 
are a lot of very bad things the web content team can do that do not 
depend on them being able to directly access the web repository server 
via FTP.

You could get fancy and add tests to your script to catch as many of 
these as possible.  If you get lucky, one of your checks would catch a 
disgruntled employee and prevent the changes from getting to the web 
content server in the DMZ.  Of course, this isn't a great use of 
resources, since hopefully none of your employees would ever do any of 
these things, and if an employee does become disgruntled, they could very 
easily wind up doing something you did not anticipate, and you'd be out 
of luck.
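
If you did go that route anyway, the checks could be fairly simple 
things bolted onto the staging-side sync script.  A rough sketch (the 
patterns and thresholds are assumptions for illustration, not a real 
ruleset):

    #!/bin/sh
    # Hypothetical pre-sync sanity checks on the staging copy.
    STAGING=/srv/webstage/content

    # Flag mass defacement: files that have shrunk to almost nothing.
    if find "$STAGING" -type f -size -10c | grep -q .; then
        echo "Suspiciously small files found; sync aborted" >&2
        exit 1
    fi

    # Flag obvious red flags such as eval() or text hidden via CSS.
    if grep -rIlE 'eval\(|display:none' "$STAGING" >/dev/null; then
        echo "Suspicious patterns found; sync aborted" >&2
        exit 1
    fi

As noted above, though, a determined insider will simply do something 
these checks don't look for.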

What I would suggest is sticking with letting the web content team FTP 
files directly to the web content server in the DMZ.  Screen candidates 
for trustworthiness during the hiring process, treat employees as well 
as possible, keep good backups both on-site and off-site and regularly 
test restoring from backup.  If, after all this, you still consider the 
web content team a sufficiently likely and valid threat source to 
justify investing resources to protect against them (which, depending on 
your business situation, you may), then I'd suggest implementing Two 
Person Integrity ( http://en.wikipedia.org/wiki/Two_person_integrity ) 
so that ANY web content change has to be signed off on by two people 
(each of whom is required to independently review the change) before 
going live.  Such a solution could involve any of the following:

- switch to a web content management system that supports change 
management with multi-person, multi-level approvals.  (While this option 
is likely to be both radical and expensive, such a CMS could bring other 
features and benefits to your company.)

- implement a more sophisticated version of the setup you originally 
proposed, but have the script perform the rsync only if it finds trigger 
files in two separate user accounts, each writable only by its owner. 
Keep a record of which two user accounts authorized each rsync.  (A 
rough sketch of this option follows the list.)
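
As a rough illustration of that second option -- the account names, 
paths, and trigger-file convention are all assumptions, not something 
your environment necessarily has:

    #!/bin/sh
    # Hypothetical two-person-integrity check before syncing to the DMZ.
    STAGING=/srv/webstage/content
    T1=/home/approver1/.approve-push
    T2=/home/approver2/.approve-push

    approved() {
        # Trigger file must exist and not be writable by group or other.
        [ -f "$1" ] && [ -z "$(find "$1" -perm /022)" ]
    }

    if approved "$T1" && approved "$T2"; then
        logger "web push approved by $(stat -c '%U' "$T1") and $(stat -c '%U' "$T2")"
        rsync -av "$STAGING/" webrepo.dmz.example.com:/var/www/content/
        rm -f "$T1" "$T2"
    fi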


As for compromised systems...

The FTP server on your web content server in the DMZ should only be 
accessible from your intranet.  The goal of much of the rest of your 
security measures is to keep your intranet secure.  Hopefully all of 
your intranet machines are fully patched, users don't have 
administrative rights on intranet machines (and so can't install 
software), and users don't visit malicious or ad-laden sites from 
intranet machines (as malicious ads on otherwise legitimate sites are 
becoming increasingly common).  Hopefully your firewall is blocking 
non-work-related traffic and you have an Intrusion Detection System in 
place.
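
On the web repository server itself, that first restriction might look 
something like the following, assuming iptables and an illustrative 
intranet subnet of 10.0.0.0/16 (both assumptions; passive-mode FTP also 
needs its data ports or the FTP connection-tracking helper handled):

    # Hypothetical host firewall rules on the DMZ web repository server:
    # accept FTP control connections only from the intranet, drop the rest.
    iptables -A INPUT -p tcp --dport 21 -s 10.0.0.0/16 -j ACCEPT
    iptables -A INPUT -p tcp --dport 21 -j DROP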

But let's assume that even with all this an intranet system does get 
compromised, as could happen, and you're allowing the web content team 
to FTP directly to the web repository server in the DMZ.  Your best way 
to protect the web content server in the DMZ in this case (which I'll 
assume is also always kept fully patched, properly configured, 
well-managed, and has verified-good recent backups) is to require 
multi-factor authentication, as sniffed/keylogged passwords are likely 
to be your largest vulnerability (much more likely, IMO, than a remotely 
exploitable unpatched vulnerability in the FTP daemon itself).  This may 
require you to switch to an FTP server that supports multi-factor 
authentication, or better yet, get rid of FTP entirely (the protocol is 
insecure) and switch to sftp, which is part of SSH.  Lots of options 
for two-factor authentication are available, but make sure you choose 
one that includes something each user physically possesses -- tokens 
such as YubiKey or SecurID are one possibility, and one-time passwords 
SMS'd to the user's mobile phone at time of authentication are another. 
This way, an attacker who has compromised an intranet system won't be 
able to access the web repository server using a stolen password, and 
would be reduced to attacking the web repository server with no more 
access than they would have from anywhere on the Internet (except for 
the FTP or SSH service, which they'd have to exploit at a 
pre-authentication stage in order to have any success).  They could 
still modify files on the intranet machines in hopes that those files 
would later be uploaded to the web repository server, which takes us 
back to using change review and change approval processes as defenses.
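
If you do go the sftp route, OpenSSH can also confine the content team 
to file transfer only.  A minimal sshd_config sketch -- the group name 
and chroot path are assumptions, and the chroot directory must be owned 
by root and not writable by anyone else:

    # Hypothetical sshd_config fragment: members of the "webcontent"
    # group get sftp only, chrooted under /var/www, with no shell access.
    # Uploads would go into a writable subdirectory such as /var/www/content.
    Match Group webcontent
        ChrootDirectory /var/www
        ForceCommand internal-sftp
        AllowTcpForwarding no
        X11Forwarding no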

Hopefully a lot of what I've discussed here would be overkill for most 
situations.  I think the key to good security is deciding which threats 
are the most likely and/or important, and properly defending against 
them; this involves a lot of risk and cost analysis.  And, as local 
security consultant Jon Oberheide, CTO of Duo Security, puts it, "Beware 
of security theater.  Inconvenience != security."

I hope that this reply is useful, or at least interesting, if not to you 
then maybe to others on the list.

--
   Mark Montague
   mark at catseye.org



