Overview of Web Security Methods - HTML

The previous sections covered specific risks and solutions. This section covers preventative solutions—things that you can do on an ongoing basis to ensure you stay on top of security issues.

Drafting a comprehensive security policy
The first step to security is understanding and enforcing a strict and comprehensive policy. Start with a list of what you absolutely need your server to be able to do and pare down the list to the bare essentials, keeping track of any connections to the outside world that your server will require.

Once you know the requirements for your system, document what software you will need to meet those requirements and what additional security issues each piece of software could create (what ports will be exposed, and so forth).
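As a rough illustration of auditing that exposure, the following Python sketch checks which ports on a host accept connections, so you can confirm that only the services you documented are actually reachable. The host name and port list are hypothetical; substitute your own values.

# check_ports.py: confirm which documented ports are actually reachable
import socket

HOST = "www.example.com"          # hypothetical server to audit
PORTS = [21, 22, 25, 80, 443]     # ports you expect (or do not expect) to be open

for port in PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(2)                      # avoid hanging on filtered ports
        result = sock.connect_ex((HOST, port))  # 0 means the connection succeeded
        status = "open" if result == 0 else "closed or filtered"
        print("Port", port, ":", status)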

Decide what user accounts you will require and what permissions are necessary for each. Most operating systems provide default accounts and permission settings for common server software; these defaults have been tested and should be used whenever possible.

This process simply creates your “to do” list of security concerns. Next, you must document actual policies and procedures, which is the most important part of the process. For any questions you have about documenting specific policies and procedures, I suggest seeking advice from experts such as the following:

  • CERT (www.cert.org) is one of the largest, best-organized, and most experienced security organizations.
  • The SANS (SysAdmin, Audit, Network, Security) Institute (www.sans.org) is another highly respected security community.

Excluding search engines
Excluding certain files and directories from search engine crawlers can help keep your system secure by keeping potentially hazardous files out of search indexes, where attackers could otherwise find them with a simple query on a search engine such as Google. Keep in mind that robots.txt is itself publicly readable, so it should complement, not replace, proper access controls.

Most search engines look for a file named robots.txt when indexing files on a site. You should place this file at the server’s root; it contains instructions for search engines. The robots.txt file follows this format:

User-agent: agent_name
Disallow: file_or_directory_spec
Disallow: ...

You can use the name of the agent you want to disallow or an asterisk (*) for all agents. You can specify as many Disallow lines as necessary, each specifying a different directory or specific file. Note that if you specify a directory, all subdirectories of that directory are also disallowed. A typical robots.txt file might resemble the following:

User-agent: *
Disallow: /tmp
Disallow: /images
Disallow: /cgi-bin
Disallow: /private.html

More information on robots.txt and other methods for directing search engines can be found on The Web Robot Pages (www.robotstxt.org).
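If you want to confirm how a well-behaved crawler will interpret your rules, Python’s standard urllib.robotparser module provides a quick check. The sketch below assumes the hypothetical site and rules shown above; adjust the URLs to match your own server.

# robots_check.py: verify how compliant crawlers will read your robots.txt
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # hypothetical site
rp.read()                                          # fetch and parse the file

# can_fetch(agent, url) returns True if the agent may crawl the given URL
print(rp.can_fetch("*", "https://www.example.com/index.html"))    # expected: True
print(rp.can_fetch("*", "https://www.example.com/cgi-bin/test"))  # expected: False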

Using secure servers
Secure servers offer another layer of security via encrypted data streams between the server and the client. This layer, typically provided by Secure Sockets Layer (SSL) or its successor, Transport Layer Security (TLS), can be implemented in many Internet-enabled applications, such as Web servers and e-mail servers.

Secure servers protect against eavesdroppers: attackers who intercept the communication between the user and the server to obtain login information, personal data, credit card numbers, and so on. Different servers implement SSL in different ways, but in each case you will need a certificate for use with your server. A certificate is an electronic document, signed by a trusted authority, attesting that the owner of the certificate is who they claim to be. It is a means of providing ID to users of your site and saying, “you can trust me because I am who I say I am.”

There are quite a few certificate authorities, some more trusted than others. You can even sign your own certificates, though a self-signed certificate does little to convince end users to trust you.
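To see certificate verification from the client’s point of view, the following Python sketch uses the standard ssl module to connect to a server and reject any certificate that cannot be traced back to a trusted authority. The host name is hypothetical; a self-signed certificate would cause the handshake below to fail.

# cert_check.py: connect over SSL/TLS and verify the server's certificate
import socket
import ssl

HOST = "www.example.com"   # hypothetical secure server
PORT = 443

# create_default_context() loads the system's trusted certificate authorities
# and enables host-name checking, so untrusted certificates are rejected.
context = ssl.create_default_context()

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        cert = tls_sock.getpeercert()
        print("Protocol:   ", tls_sock.version())
        print("Issued to:  ", cert.get("subject"))
        print("Issued by:  ", cert.get("issuer"))
        print("Valid until:", cert.get("notAfter"))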

