
Protecting Your Website Against Hacking

Today, in the age of affiliate marketing, more and more people are creating their own sites to market products. Having your own website will certainly help you attract more customers to purchase your products.

However, you will also attract hackers looking to play around with your site if you have not taken enough care to protect it. Attacks are possible against sites built in almost every programming language in use today. With a little care, you can protect your site and your online identity.

The most common type of attack is cross-site scripting, also called XSS. Cross-site scripting can be carried out in different ways: DOM-based, stored, or reflected.

Instead of looking into what these hacks are, it’s best to understand how you can protect your sites from such hacks.

The best way to protect your site from such attacks is to validate every input to your site. All sources of input, such as page headers, cookies, query strings, hidden form fields, and the visible form fields used to gather data from users, should be validated.
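
As an illustration, a request-validation layer can reject suspicious values before they reach any page logic. The sketch below assumes a Node.js site using the Express framework; the length limit and the character pattern are illustrative choices, not fixed rules.

import express from "express";

const app = express();

// Reject any query-string value that is not a plain string, is longer
// than expected, or contains characters outside a conservative allow-list.
app.use((req, res, next) => {
  for (const value of Object.values(req.query)) {
    if (
      typeof value !== "string" ||
      value.length > 100 ||
      !/^[\w\s@.\-]*$/.test(value)
    ) {
      return res.status(400).send("Invalid input");
    }
  }
  next();
});

// ...routes and app.listen(...) follow as usual.

The same idea applies to cookies and headers: decide what a legitimate value looks like and refuse anything that does not match.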

Many site owners use web forms to collect subscriber email addresses. Such inputs should be validated against the expected type and length, and any value submitted through a web form should be HTML-encoded before it is displayed, so that unwanted script elements are neutralized.
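
A minimal sketch of this idea in TypeScript (the function names are hypothetical): the email value is checked against an expected type and length, then HTML-encoded before it is ever written back into a page.

// Replace the characters that have special meaning in HTML.
function htmlEncode(input: string): string {
  return input
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

// Accept a subscription email only if it has the expected type and length,
// and return it in an HTML-safe form; otherwise return null.
function acceptSubscriptionEmail(raw: unknown): string | null {
  if (typeof raw !== "string" || raw.length === 0 || raw.length > 254) {
    return null;
  }
  return htmlEncode(raw.trim());
}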

The best way to validate input to the site is to check it against what should be allowed rather than against what should not be allowed.
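
The difference is easy to see in code. A block-list tries to enumerate bad input and is easy to bypass, while an allow-list states exactly what is acceptable. The patterns below are illustrative assumptions, not a complete rule set.

// Block-list (fragile): only rejects one known-bad pattern.
function blocklistCheck(input: string): boolean {
  return !/<script/i.test(input); // bypassed by <img onerror=...> and many others
}

// Allow-list (preferred): accepts only the characters and length you expect.
function allowlistCheck(input: string): boolean {
  return /^[A-Za-z0-9 _.\-]{1,50}$/.test(input);
}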

The second most common hacking technique is Google hacking. Today, most search engines provide webmasters with a range of tools to track and analyze their site rankings.

Google has become the most important search engine, and it seems to top the list for both website owners and hackers. Google hacking refers to the techniques used to gain access to unauthorized information through advanced search queries.

Google hacking involves searching sites using special characters, logical operators, and search operators such as cache:, filetype:, link:, site:, intitle:, and inurl:.

Many webmasters keep critical data on their servers so that they can access it from anywhere. Though such documents are not linked from any public page, it is still easy to gain access to them.

Unless excluded in the robots.txt file, all the documents on a particular site are indexed by search engine spiders. Such documents then become available to the public via search engine queries.

Advanced queries such as ext:doc or filetype:doc will find all the Word (.doc) files available on a server. Similarly, site:xyz.com private will search for all instances of the word "private" on the site xyz.com.

To protect yourself from such attacks, take the necessary precautions, such as avoiding the storage of critical or sensitive data on the server altogether.

If keeping such data on the server is unavoidable, use the robots.txt file to prevent those documents or folders from being indexed. For example:

User-agent: *
Disallow: /documents

These instructions tell every search engine spider not to crawl the contents of the "documents" folder. Similarly, the meta tag "meta name='SPIDERNAME' content='NOINDEX'" can be used on individual HTML pages if you do not want that page to be indexed (NOARCHIVE only prevents a cached copy from being stored). Here you need to put the correct spider name of the search engine you want to block, or use "robots" to address all of them.
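
For example, the following tags are commonly used for this purpose (shown here as a sketch; check your target search engine's documentation for the exact spider name it honors):

<meta name="robots" content="noindex">      tells all spiders not to index the page
<meta name="googlebot" content="noindex">   addresses Google's spider only
<meta name="robots" content="noarchive">    allows indexing but blocks the cached copy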

Lastly, you should also check whether your web server allows directory listing. Directory listing lets anyone see the contents of a directory simply by typing the website address followed by an existing folder name.

If you type http://domainname.com/somefoldername/ and can see the contents of the directory, you should immediately contact your web host and have directory listing disabled for your site.
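
If you manage the server configuration yourself and it runs Apache (an assumption; other web servers have their own equivalents), directory listing can usually be switched off with a single directive in the site configuration or an .htaccess file:

# Turn off automatic directory index pages for this site.
Options -Indexes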

Though it is virtually impossible for a normal website owner to block every hacking attempt, it is possible to minimize them by taking these basic precautions.
